For comprehensive and current results, perform a real-time search at Science.gov.

1

Optimal randomized scheduling by replacement

In the replacement scheduling problem, a system is composed of n processors drawn from a pool of p. The processors can become faulty while in operation and faulty processors never recover. A report is issued whenever a fault occurs. This report states only the existence of a fault but does not indicate its location. Based on this report, the scheduler can reconfigure the system and choose another set of n processors. The system operates satisfactorily as long as, upon report of a fault, the scheduler chooses n non-faulty processors. We provide a randomized protocol maximizing the expected number of faults the system can sustain before the occurrence of a crash. The optimality of the protocol is established by considering a closely related dual optimization problem. The game-theoretic technical difficulties that we solve in this paper are very general and are encountered whenever one proves the optimality of a randomized algorithm in parallel and distributed computation.

Saias, I.

1996-05-01

2

Temporal tracking on cyclic-interval reinforcement schedules.

Pigeons were exposed to four cycles per session of a schedule in which the duration of successive interreinforcement intervals differed by t-sec. A cycle was composed of seven increasing and seven decreasing intervals, from 2t to 8t sec in length. In Exp. 1, postreinforcement pause tracked interval duration on five cyclic schedules, with values of t ranging from 2 to 40 sec. Tracking was better at shorter t values, and when discriminative stimuli signalled increasing and decreasing parts of the cycle. Pooled data for the whole experiment showed postreinforcement pause to bear a power function relationship to interval length, with a smaller exponent than the comparable function for fixed-interval schedules. Tests in a second experiment showed that pigeons trained on an arithmetic progression could also track schedules in which successive intervals followed either a logarithmic or a geometric progression, although tracking was more stable in the logarithmic case. PMID:16811558

Innis, N K; Staddon, J E

1971-11-01

3

A simple BASIC program to generate values for variable-interval schedules of reinforcement.

A BASIC program to generate values for variable-interval (VI) schedules of reinforcement is presented. A VI schedule should provide access to reinforcement with a constant probability over a time horizon. If the values in a VI schedule are calculated from an arithmetic progression, the probability of reinforcement is positively correlated with the time since the last reinforcer was delivered. Fleshler and Hoffman (1962) developed an iterative equation to calculate VI schedule values so that the probability of reinforcement remains constant. This easy-to-use program generates VI schedule values according to the Fleshler and Hoffman equation, randomizes the values, and saves the values in ASCII to a disk file. PMID:16795763
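
The Fleshler and Hoffman (1962) progression referred to above has a well-known closed form: the nth of N intervals with mean T is t_n = T[1 + ln N + (N - n) ln(N - n) - (N - n + 1) ln(N - n + 1)], with 0 ln 0 taken as 0. A minimal Python sketch of the same workflow (generate, randomize, return), offered as an illustration rather than the original BASIC program:

```python
import math
import random

def fleshler_hoffman(n_intervals, mean_interval):
    """Generate VI schedule values via the Fleshler & Hoffman (1962)
    progression, which keeps the probability of reinforcement roughly
    constant over time (the convention 0*ln(0) = 0 is used)."""
    N, T = n_intervals, float(mean_interval)

    def xlnx(x):
        return x * math.log(x) if x > 0 else 0.0

    values = [
        T * (1 + math.log(N) + xlnx(N - n) - xlnx(N - n + 1))
        for n in range(1, N + 1)
    ]
    random.shuffle(values)  # present the intervals in random order
    return values

vals = fleshler_hoffman(15, 30.0)  # e.g., a VI 30-s schedule with 15 intervals
assert abs(sum(vals) / len(vals) - 30.0) < 1e-6
```

Because the sum telescopes, the generated values average exactly T, so a VI 30-s schedule really does have a 30-s mean interval.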

Hantula, D A

1991-01-01

4

Randomized online scheduling on two uniform machines

We study the problem of online scheduling on two uniform machines with speeds 1 and s > 1. A φ ≈ 1.61803-competitive deterministic algorithm was already known. We present the first randomized results for this problem: we show that randomization does not help for speeds s ≥ 2, but does help for all s < 2. We present a

Leah Epstein; John Noga; Steven S. Seiden; Jiří Sgall; Gerhard J. Woeginger

1999-01-01

5

On 2-Player Randomized Mechanisms for Scheduling

In this paper, we study randomized truthful mechanisms for scheduling unrelated machines. We focus on the case of scheduling two machines, which is also the focus of many previous works [12,13,6,4]. For this problem, [13] gave the current best mechanism with an approximation ratio of 1.5963 and [14] proved a lower bound of 1.5. In this work, we introduce a

Pinyan Lu

2009-01-01

6

Drug discrimination under two concurrent fixed-interval fixed-interval schedules.

Pigeons were trained to discriminate 5.0 mg/kg pentobarbital from saline under a two-key concurrent fixed-interval (FI) 100-s FI 200-s schedule of food presentation, and later under a concurrent FI 40-s FI 80-s schedule, in which the FI component with the shorter time requirement reinforced responding on one key after drug administration (pentobarbital-biased key) and on the other key after saline administration (saline-biased key). After responding stabilized under the concurrent FI 100-s FI 200-s schedule, pigeons earned an average of 66% (after pentobarbital) to 68% (after saline) of their reinforcers for responding under the FI 100-s component of the concurrent schedule. These birds made an average of 70% of their responses on both the pentobarbital-biased key after the training dose of pentobarbital and the saline-biased key after saline. After responding stabilized under the concurrent FI 40-s FI 80-s schedule, pigeons earned an average of 67% of their reinforcers for responding under the FI 40-s component after both saline and the training dose of pentobarbital. These birds made an average of 75% of their responses on the pentobarbital-biased key after the training dose of pentobarbital, but only 55% of their responses on the saline-biased key after saline. In test sessions preceded by doses of pentobarbital, chlordiazepoxide, ethanol, phencyclidine, or methamphetamine, the dose-response curves were similar under these two concurrent schedules. Pentobarbital, chlordiazepoxide, and ethanol produced dose-dependent increases in responding on the pentobarbital-biased key as the doses increased. For some birds, at the highest doses of these drugs, the dose-response curve turned over. Increasing doses of phencyclidine produced increased responding on the pentobarbital-biased key in some, but not all, birds. After methamphetamine, responding was largely confined to the saline-biased key.
These data show that pigeons can perform drug discriminations under concurrent schedules in which the reinforcement frequency under the schedule components differs only by a factor of two, and that when other drugs are substituted for the training drugs they produce dose-response curves similar to the curves produced by these drugs under other concurrent interval schedules. PMID:10966096

McMillan, D E; Li, M

2000-01-01

7

A quantitative analysis of the responding maintained by interval schedules of reinforcement.

Interval schedules of reinforcement maintained pigeons' key-pecking in six experiments. Each schedule was specified in terms of mean interval, which determined the maximum rate of reinforcement possible, and distribution of intervals, which ranged from many-valued (variable-interval) to single-valued (fixed-interval). In Exp. 1, the relative durations of a sequence of intervals from an arithmetic progression were held constant while the mean interval was varied. Rate of responding was a monotonically increasing, negatively accelerated function of rate of reinforcement over a range from 8.4 to 300 reinforcements per hour. The rate of responding also increased as time passed within the individual intervals of a given schedule. In Exp. 2 and 3, several variable-interval schedules made up of different sequences of intervals were examined. In each schedule, the rate of responding at a particular time within an interval was shown to depend at least in part on the local rate of reinforcement at that time, derived from a measure of the probability of reinforcement at that time and the proximity of potential reinforcements at other times. The functional relationship between rate of responding and rate of reinforcement at different times within the intervals of a single schedule was similar to that obtained across different schedules in Exp. 1. Experiments 4, 5, and 6 examined fixed-interval and two-valued (mixed fixed-interval fixed-interval) schedules, and demonstrated that reinforcement at one time in an interval had substantial effects on responding maintained at other times. It was concluded that the rate of responding maintained by a given interval schedule depends not on the overall rate of reinforcement provided but rather on the summation of different local effects of reinforcement at different times within intervals. PMID:5672248

Catania, A C; Reynolds, G S

1968-05-01

8

A quantitative analysis of the responding maintained by interval schedules of reinforcement

Interval schedules of reinforcement maintained pigeons' key-pecking in six experiments. Each schedule was specified in terms of mean interval, which determined the maximum rate of reinforcement possible, and distribution of intervals, which ranged from many-valued (variable-interval) to single-valued (fixed-interval). In Exp. 1, the relative durations of a sequence of intervals from an arithmetic progression were held constant while the mean interval was varied. Rate of responding was a monotonically increasing, negatively accelerated function of rate of reinforcement over a range from 8.4 to 300 reinforcements per hour. The rate of responding also increased as time passed within the individual intervals of a given schedule. In Exp. 2 and 3, several variable-interval schedules made up of different sequences of intervals were examined. In each schedule, the rate of responding at a particular time within an interval was shown to depend at least in part on the local rate of reinforcement at that time, derived from a measure of the probability of reinforcement at that time and the proximity of potential reinforcements at other times. The functional relationship between rate of responding and rate of reinforcement at different times within the intervals of a single schedule was similar to that obtained across different schedules in Exp. 1. Experiments 4, 5, and 6 examined fixed-interval and two-valued (mixed fixed-interval fixed-interval) schedules, and demonstrated that reinforcement at one time in an interval had substantial effects on responding maintained at other times. It was concluded that the rate of responding maintained by a given interval schedule depends not on the overall rate of reinforcement provided but rather on the summation of different local effects of reinforcement at different times within intervals. PMID:5672248

Catania, A. Charles; Reynolds, G. S.

1968-01-01

9

Daily administration of cocaine often results in the development of tolerance to its effects on responding maintained by fixed-ratio schedules. Such effects have been observed to be greater when the ratio value is small, whereas less or no tolerance has been observed at large ratio values. Similar schedule-parameter-dependent tolerance, however, has not been observed with fixed-interval schedules arranging comparable interreinforcement intervals. This experiment examined the possibility that differences in rate and temporal patterning between the two types of schedule are responsible for the differences in observed patterns of tolerance. Five pigeons were trained to key peck on a three-component multiple (tandem fixed-interval fixed-ratio) schedule. The interval values were 10, 30, and 120 s; the tandem ratio was held constant at five responses. Performance appeared more like that observed under fixed-ratio schedules than fixed-interval schedules. Effects of various doses of cocaine given weekly were then determined for each pigeon. A dose that reduced responding was administered prior to each session for 50 days. A reassessment of effects of the range of doses revealed tolerance. The degree of tolerance was similar across components of the multiple schedule. Next, the saline vehicle was administered prior to each session for 50 days to assess the persistence of tolerance. Tolerance diminished in all subjects. Overall, the results suggested that schedule-parameter-dependent tolerance does not depend on the temporal pattern of responding engendered by fixed-ratio schedules. PMID:15693524

Pinkston, Jonathan W; Branch, Marc N

2004-01-01

10

Dynamic equilibrium on a cyclic-interval schedule with a ramp

Five human subjects pressed a panel for money on a cyclic-interval schedule that arranged recurring periods of linearly increasing reinforcement rates (ramps). Response rate versus time functions for all subjects showed recurring periods of linearly increasing response rates. The responding of four of the five subjects was in phase with the reinforcement input. The remaining subject showed a two-minute phase shift. These results suggest that organisms may act like simple amplifiers on cyclic-interval schedules, that is, the form of the input signal is not changed by the organism, but is returned with amplification. By analogy with the variable-interval case, the controlling variable on cyclic-interval schedules with rate ramps may be the constant reinforcement acceleration that is arranged by the schedule. PMID:16812234

McDowell, J. J; Sulzen, Helena M.

1981-01-01

11

Confidence intervals for negative binomial random variables of high dispersion.

We consider the problem of constructing confidence intervals for the mean of a Negative Binomial random variable based upon sampled data. When the sample size is large, it is a common practice to rely upon a Normal distribution approximation to construct these intervals. However, we demonstrate that the sample mean of highly dispersed Negative Binomials exhibits a slow convergence in distribution to the Normal as a function of the sample size. As a result, standard techniques (such as the Normal approximation and bootstrap) will construct confidence intervals for the mean that are typically too narrow and significantly undercover at small sample sizes or high dispersions. To address this problem, we propose techniques based upon Bernstein's inequality or the Gamma and Chi Square distributions as alternatives to the standard methods. We investigate the impact of imposing a heuristic assumption of boundedness on the data as a means of improving the Bernstein method. Furthermore, we propose a ratio statistic relating the Negative Binomial's parameters that can be used to ascertain the applicability of the Chi Square method and to provide guidelines on evaluating the length of all proposed methods. We compare the proposed methods to the standard techniques in a variety of simulation experiments and consider data arising in the serial analysis of gene expression and traffic flow in a communications network. PMID:21969971
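
The Bernstein-inequality alternative mentioned above can be sketched in a few lines. The function names and the choice of simulated data below are illustrative assumptions; the paper's exact estimators may differ. The bound B plays the role of the heuristic boundedness assumption the abstract describes:

```python
import math
import random

def normal_ci(xs, z=1.96):
    """Standard normal-approximation CI for the mean (the method the
    abstract argues undercovers for highly dispersed counts)."""
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / (n - 1)
    half = z * math.sqrt(var / n)
    return m - half, m + half

def bernstein_ci(xs, bound, alpha=0.05):
    """CI for the mean from Bernstein's inequality, assuming 0 <= x <= bound.
    Solves 2*exp(-n t^2 / (2 v + (2/3) bound t)) = alpha for the half-width t."""
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / n
    L = math.log(2 / alpha)
    t = bound * L / (3 * n) + math.sqrt((bound * L / (3 * n)) ** 2 + 2 * v * L / n)
    return m - t, m + t

random.seed(1)
# heavy-tailed continuous stand-in for overdispersed (negative binomial) counts
xs = [random.gammavariate(0.3, 10.0) for _ in range(30)]
lo_n, hi_n = normal_ci(xs)
lo_b, hi_b = bernstein_ci(xs, bound=max(xs))
assert hi_b - lo_b > hi_n - lo_n  # the Bernstein interval is wider (more conservative)
```

The wider Bernstein interval is exactly the trade-off the abstract describes: concentration-inequality intervals sacrifice tightness to avoid the undercoverage of the normal approximation at small n or high dispersion.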

Shilane, David; Evans, Steven N; Hubbard, Alan E

2010-01-01

12

Effects of concurrent schedules on human fixed-interval performance

Young adults performed a lever-pressing task for money on two schedules of reinforcement: concurrent fixed-interval 1-min—differential-reinforcement-of-low-rate 20-sec, and concurrent fixed-interval 1-min—fixed-ratio 100 responses. All subjects were trained on both schedules. Fixed-interval performance concurrent with the differential reinforcement procedure was characterized by high constant rates with no post-reinforcement pauses. Fixed-interval performance concurrent with fixed ratio was characterized by low rates and lengthy post-reinforcement pauses. These results differ from those obtained in prior studies on the effects of conditioning history upon subsequent fixed-interval performance. The prior work, using non-concurrent procedures, had shown that fixed-interval performance following differential reinforcement of low rates was characterized by post-reinforcement pauses and low rates, while fixed-interval performance following fixed ratio exhibited high constant rates and no post-reinforcement pause. The present results suggest that alternative concurrent contingencies are another major determinant of human fixed-interval performance. PMID:16811607

Poppen, Roger

1972-01-01

13

Fixed-interval and fixed-ratio reinforcement schedules with human subjects

Operant laboratory studies were conducted as part of the regular activities of a psychiatric research ward. This report includes only some early data obtained from the ward staff, not the patients. A multiple schedule having alternating fixed-ratio and fixed-interval components permitted observations of acquisition and maintenance of behavior at low schedule values, transition to and final performance at greater schedule values, and behavioral changes after a limited-hold contingency was added to the fixed-interval. Prior to the added limited-hold, subjects used watches to time the interval, and usually responded only once before obtaining each fixed-interval reinforcement. Short limited-hold values eliminated clock watching and increased fixed-interval responding. Subjects communicated freely with each other, and it was clear that their performances were controlled both by the contingencies and by instructions. Just as clearly, the instructions themselves were controlled by the contingencies. It was concluded that the kinds of verbal control that were responsible for “nonstandard” fixed-interval performances did not require the postulation of any new behavioral principles. PMID:22477562

Stoddard, Lawrence T.; Sidman, Murray; Brady, Joseph V.

1988-01-01

14

Information Processing with Length Intervals of Stationary Random Point Processes

NASA Astrophysics Data System (ADS)

For some typical noiseless binary channels operating with signals at low power level, the limits of the information transmission are computed. This is performed for random point processes on a subset

Bendjaballah, Cherif

2011-04-01

15

ERIC Educational Resources Information Center

Compared the effects of fixed-interval and on-demand administration of analgesic medications in chronic pain patients. A fixed-interval analgesic schedule was found more effective than an on-demand schedule in reducing subjective pain and elevating mood. No differences were found between the two conditions on measures of physical activity.…

Berntzen, Dagfinn; Gotestam, K. Gunnar

1987-01-01

16

Clock control of human performance on avoidance and fixed-interval schedules

The avoidance and fixed-interval performances of human subjects were studied in two experiments. Addition of time-correlated stimuli (added clock) improved behavioral efficiency, since response rates decreased without decreases in reinforcement rates. Response-dependent display of the clock maintained a second, observing response and reductions in clock duration weakened such observing behavior. Generally, the reinforcing properties of the clock were more apparent with the avoidance than with the fixed-interval schedule, a finding attributed to temporal cues already provided by delivery of the fixed-interval reinforcers. Reduced rates of the main response when the clock was dependent on an observing response were more than offset by rates of the observing response in the majority of subjects. Thus, the results do not support an interpretation of the reinforcing properties of added clocks simply in terms of work reduction. PMID:16811938

Baron, Alan; Galizio, Mark

1976-01-01

17

NASA Astrophysics Data System (ADS)

Wavelength selection is a critical step for producing better prediction performance when applied to spectral data. Considering the fact that vibrational and rotational spectra have continuous spectral bands, we propose a novel method of wavelength interval selection based on random frog, called interval random frog (iRF). To obtain all the possible continuous intervals, the spectra are first divided into intervals by a moving window of fixed width across the whole spectrum. These overlapping intervals are ranked by applying random frog coupled with PLS, and the optimal ones are chosen. This method has been applied to two near-infrared spectral datasets, showing higher efficiency in wavelength interval selection than other methods. The source code of iRF can be freely downloaded for academic research at the website: http://code.google.com/p/multivariate-calibration/downloads/list.

Yun, Yong-Huan; Li, Hong-Dong; Wood, Leslie R. E.; Fan, Wei; Wang, Jia-Jun; Cao, Dong-Sheng; Xu, Qing-Song; Liang, Yi-Zeng

2013-07-01

18

ERIC Educational Resources Information Center

The relationship between positive and negative reinforcement and the symmetry of Thorndike's law of effect are unresolved issues in operant psychology. Here we show that, for a given pattern of responding on variable interval (VI) schedules with the same programmed rate of food rewards (positive reinforcement VI) or electric shocks (negative…

Mallpress, Dave E. W.; Fawcett, Tim W.; McNamara, John M.; Houston, Alasdair I.

2012-01-01

19

True random number generator based on discretized encoding of the time interval between photons.

We propose an approach to generate true random number sequences based on the discretized encoding of the time interval between photons. The method is simple and efficient, and can produce a highly random sequence several times longer than that of other methods based on threshold or parity selection, without the need for hashing. A proof-of-principle experiment has been performed, showing that the system could be easily integrated and applied to quantum cryptography and other fields. PMID:23456008
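
One plausible reading of the discretized-encoding idea can be simulated with classical statistics: inter-photon intervals are exponentially distributed, so timing them with a clock much faster than the photon rate and keeping low-order bits of the tick count yields nearly unbiased bits. This is an illustration under stated assumptions, not the authors' exact encoding (which they note needs no hashing):

```python
import random

def interval_bits(rate_hz, clock_hz, n_bits, seed=42):
    """Simulate exponentially distributed inter-photon intervals, discretize
    each with a fast clock, and keep the least-significant bit of the tick
    count. With clock_hz >> rate_hz the LSB is nearly unbiased."""
    rng = random.Random(seed)
    bits = []
    for _ in range(n_bits):
        dt = rng.expovariate(rate_hz)   # time between photon detections
        ticks = int(dt * clock_hz)      # discretized interval encoding
        bits.append(ticks & 1)          # low-order bit of the encoding
    return bits

bits = interval_bits(rate_hz=1e4, clock_hz=1e9, n_bits=10000)
assert 4700 < sum(bits) < 5300  # zeros and ones are roughly balanced
```

Keeping several low-order bits per interval, rather than one bit per threshold comparison, is what lets interval encoding produce sequences several times longer than threshold or parity selection.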

Li, Shen; Wang, Long; Wu, Ling-An; Ma, Hai-Qiang; Zhai, Guang-Jie

2013-01-01

20

The research on job shop on-line scheduling with random release-date

For job shop on-line scheduling in which the release dates of jobs are stochastically distributed, a hybrid scheduling strategy that integrates an interval rolling mechanism with key-event-driven rescheduling is introduced. A satisfactory solution can be obtained through iterative optimization by embedding neighborhood search into the constraint-satisfaction solving procedure. The simulation and computational results verified that the

Jing Yin; Baojiang Chen

2009-01-01

21

Confidence intervals for a difference between lognormal means in cluster randomization trials.

Cluster randomization trials, in which intact social units are randomized to different interventions, have become popular in the last 25 years. Outcomes from these trials in many cases are positively skewed, following approximately lognormal distributions. When inference is focused on the difference between treatment arm arithmetic means, existing confidence interval procedures either make restricting assumptions or are complex to implement. We approach this problem by assuming log-transformed outcomes from each treatment arm follow a one-way random effects model. The treatment arm means are functions of multiple parameters for which separate confidence intervals are readily available, suggesting that the method of variance estimates recovery may be applied to obtain closed-form confidence intervals. A simulation study showed that this simple approach performs well in small sample sizes in terms of empirical coverage, relatively balanced tail errors, and interval widths as compared to existing methods. The methods are illustrated using data arising from a cluster randomization trial investigating a critical pathway for the treatment of community acquired pneumonia. PMID:25267552
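
The method of variance estimates recovery (MOVER) step described above can be sketched for a lognormal mean: the log-scale mean is mu + sigma^2/2, separate CIs for mu and sigma^2/2 are combined by MOVER, and the result is exponentiated. The sketch below collapses the one-way random effects setup to a single sample, uses z rather than t critical values, and approximates chi-square quantiles by the Wilson-Hilferty formula; all three are simplifications, and the paper's exact procedure (including its handling of clustering) will differ:

```python
import math
import random
from statistics import NormalDist

def chi2_quantile(p, k):
    """Wilson-Hilferty approximation to the chi-square p-quantile with k df."""
    z = NormalDist().inv_cdf(p)
    return k * (1 - 2 / (9 * k) + z * math.sqrt(2 / (9 * k))) ** 3

def lognormal_mean_ci(xs, alpha=0.05):
    """MOVER-style CI for a lognormal mean: combine separate CIs for mu
    and sigma^2/2 on the log scale, then exponentiate."""
    ys = [math.log(x) for x in xs]
    n = len(ys)
    m = sum(ys) / n
    s2 = sum((y - m) ** 2 for y in ys) / (n - 1)
    z = NormalDist().inv_cdf(1 - alpha / 2)
    l1, u1 = m - z * math.sqrt(s2 / n), m + z * math.sqrt(s2 / n)
    # chi-square interval for sigma^2/2 (lower limit uses the upper quantile)
    l2 = (n - 1) * s2 / chi2_quantile(1 - alpha / 2, n - 1) / 2
    u2 = (n - 1) * s2 / chi2_quantile(alpha / 2, n - 1) / 2
    est = m + s2 / 2
    L = est - math.sqrt((m - l1) ** 2 + (s2 / 2 - l2) ** 2)
    U = est + math.sqrt((u1 - m) ** 2 + (u2 - s2 / 2) ** 2)
    return math.exp(L), math.exp(U)

rng = random.Random(7)
xs = [math.exp(rng.gauss(1.0, 0.8)) for _ in range(40)]
lo, hi = lognormal_mean_ci(xs)
assert 0 < lo < hi
```

The appeal of MOVER here is exactly what the abstract notes: each component interval is closed-form, so the combined interval is too.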

Poirier, Julia; Zou, Gy; Koval, John

2014-09-29

22

Tennis League Rules

1. The Intramural Sports office will provide the schedule by random pick … "Advantage Set" or "Tie-Break Set" scoring as long as both players agree beforehand. 3. Participants … beforehand; after the hour is up, the person winning at that time is the winner. 8. Report scores and any …

van den Berg, Jur

23

In many medical problems that collect multiple observations per subject, the time to an event is often of interest. Sometimes, the occurrence of the event can be recorded at regular intervals leading to interval-censored data. It is further desirable to obtain the most parsimonious model in order to increase predictive power and to obtain ease of interpretation. Variable selection and often random effects selection in case of clustered data become crucial in such applications. We propose a Bayesian method for random effects selection in mixed effects accelerated failure time (AFT) models. The proposed method relies on the Cholesky decomposition on the random effects covariance matrix and the parameter-expansion method for the selection of random effects. The Dirichlet prior is used to model the uncertainty in the random effects. The error distribution for the accelerated failure time model has been specified using a Gaussian mixture to allow flexible error density and prediction of the survival and hazard functions. We demonstrate the model using extensive simulations and the Signal Tandmobiel Study. PMID:24123191

Harun, Nusrat; Cai, Bo

2014-03-15

24

Zipf's law, the central limit theorem, and the random division of the unit interval

NASA Astrophysics Data System (ADS)

It is shown that a version of Mandelbrot's monkey-at-the-typewriter model of Zipf's inverse power law is directly related to two classical areas in probability theory: the central limit theorem and the ``broken stick'' problem, i.e., the random division of the unit interval. The connection to the central limit theorem is proved using a theorem on randomly indexed sums of random variables [A. Gut, Stopped Random Walks: Limit Theorems and Applications (Springer, New York, 1987)]. This reveals an underlying log-normal structure of pseudoword probabilities with an inverse power upper tail that clarifies a point of confusion in Mandelbrot's work. An explicit asymptotic formula for the slope of the log-linear rank-size law in the upper tail of this distribution is also obtained. This formula relates to known asymptotic results concerning the random division of the unit interval that imply a slope value approaching -1 under quite general conditions. The role of size-biased sampling in obscuring the bottom part of the distribution is explained and connections to related work are noted.
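
The "broken stick" random division discussed above is easy to simulate, and the classical result that the expected largest of N pieces is H_N/N (with H_N the Nth harmonic number) gives a quick numerical check. A minimal sketch, illustrative only:

```python
import random

def broken_stick(n_pieces, rng):
    """Divide the unit interval at n_pieces - 1 uniform random points
    and return the resulting piece lengths (spacings)."""
    cuts = sorted(rng.random() for _ in range(n_pieces - 1))
    pts = [0.0] + cuts + [1.0]
    return [b - a for a, b in zip(pts, pts[1:])]

rng = random.Random(0)
N, trials = 20, 2000
avg_max = sum(max(broken_stick(N, rng)) for _ in range(trials)) / trials
harmonic = sum(1.0 / i for i in range(1, N + 1))
# classical result: E[largest spacing] = H_N / N
assert abs(avg_max - harmonic / N) < 0.01
```

Sorting the simulated pieces by size and plotting log size against log rank is the rank-size view in which, as the abstract notes, the upper-tail slope approaches -1 under quite general conditions.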

Perline, Richard

1996-07-01

25

Key pecking by three pigeons was maintained under a multiple fixed-interval fixed-ratio schedule of food presentation. The fixed-interval value remained at 3 minutes, while the fixed-ratio size was increased systematically in 30-response increments from 30 to either 120 (two pigeons) or 150 (one pigeon). At least two lower fixed-ratio values were also redetermined. The effects of ethanol (0.5 to 2.5 g/kg) were assessed at each of the different schedule parameters. Both overall and running response rates under the fixed-ratio schedule decreased with increases in the size of the fixed-ratio schedule; pause duration under the fixed-ratio schedule was directly related to increases in fixed-ratio size. Overall and running rates of responding under the fixed-interval schedule changed little with increases in the size of the fixed-ratio schedule. Despite the relative invariance of fixed-interval responding across the different fixed-ratio values, the effects of ethanol on responding under the fixed-interval schedule differed depending on the size of the fixed-ratio schedule. Greater increases occurred in both overall and in lower local rates of responding under the fixed-interval schedule when the fixed-ratio value was 120 or 150. The effects of ethanol on responding under the fixed-ratio schedule also depended on the size of the fixed ratio. Increases in responding under the fixed-ratio schedule were typically greater at the higher fixed-ratio values where response rates were lower. When the effects of ethanol were redetermined at the lower fixed-ratio parameter values, rates and patterns of responding were comparable to those obtained initially. However, the dose-effect curves for responding under both fixed-ratio and fixed-interval schedules were shifted up and to the right of those determined during the ascending series. The effects of ethanol can depend on rate of responding, behavioral history, and the context in which behavior occurs. PMID:7411015

Barrett, J E; Stanley, J A

1980-01-01

26

Project Scheduling Problem for Software Development with Random Fuzzy Activity Duration Times

This paper presents a new method that describes activity duration times, which can be modeled as random fuzzy variables, to solve the software project scheduling problem. It addresses the weakness of the present classic models, such as PERT and CPM, in solving project scheduling for software development due to the concurrent, iterative, and evolutionary characteristics of

Wei Huang; Lixin Ding; Bin Wen; Buqing Cao

2009-01-01

27

Randomized Online Scheduling on Two Uniform Machines

Leah Epstein; John Noga; Steve Seiden

… online scheduling of independent jobs with the objective to minimize the makespan (i.e., the length of the schedule, or equivalently the last job completion time). An input to this problem consists of a sequence

Sgall, Jiri

28

A methodology for adaptive scheduling of radar intervals based on a cost-function methodology

NASA Astrophysics Data System (ADS)

In this note we introduce the idea of adaptive scheduling based on a cost function methodology. As the warfare environment becomes more complex, individual sensor resources are stretched, and the usage of the sensors has grown. In a multi-ship multi-platform environment, one has the potential to share information across platforms. This would dramatically increase the strategic and tactical picture available to mission planners and commanders at all force levels. In order to accomplish this mission, the sensors must all be coordinated so adaptability and multi-force tasking can be accomplished with netted sensors. Adaptive sensor management expands group capabilities by freeing up resources such as dwells/energy management. Savings arise by effective usage of tracking resources by revisiting threats with radar resources only when needed. This can be done by introducing analytic cost functions of the revisit time that enable one to minimize revisit time while maintaining error within acceptable bounds.

Gray, John E.; Smith-Carroll, Amy S.; Zaffram, Christopher

2004-07-01

29

Response Strength in Extreme Multiple Schedules

ERIC Educational Resources Information Center

Four pigeons were trained in a series of two-component multiple schedules. Reinforcers were scheduled with random-interval schedules. The ratio of arranged reinforcer rates in the two components was varied over 4 log units, a much wider range than previously studied. When performance appeared stable, prefeeding tests were conducted to assess…

McLean, Anthony P.; Grace, Randolph C.; Nevin, John A.

2012-01-01

30

Inner Random Restart Genetic Algorithm for Practical Delivery Schedule Optimization

NASA Astrophysics Data System (ADS)

A delivery route optimization that improves the efficiency of real-time delivery or a distribution network requires solving Traveling Salesman Problems (TSPs) of several tens to hundreds (but fewer than two thousand) cities within interactive response time (less than about 3 seconds) and with expert-level accuracy (less than about 3% error rate). Further, to make things more difficult, the optimization is subject to the special requirements or preferences of various delivery sites, persons, or societies. To meet these requirements, an Inner Random Restart Genetic Algorithm (Irr-GA) is proposed and developed. This method combines meta-heuristics, such as random restart, with a GA having different types of simple heuristics. These simple heuristics are the 2-opt and NI (Nearest Insertion) methods, each applied as a gene operation. The proposed method is hierarchically structured, integrating meta-heuristics and heuristics, both of which are multiple but simple. The method is designed so that field experts as well as field engineers can easily understand it, making the solution easy to customize and extend according to customers' needs or tastes. Comparison based on the experimental results showed that the method meets the above requirements better than other methods, judging not only by optimality but also by simplicity, flexibility, and expandability for practical use.
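
The 2-opt heuristic named above as one of the gene operations is the standard segment-reversal local search for the TSP. A minimal sketch, assuming Euclidean city coordinates and not attempting to reproduce the authors' implementation:

```python
import math
import random

def tour_length(tour, pts):
    """Total length of a closed tour over the given city coordinates."""
    return sum(
        math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
        for i in range(len(tour))
    )

def two_opt(tour, pts):
    """Repeatedly reverse tour segments while doing so shortens the tour
    (the 2-opt move applied as a gene operation inside Irr-GA)."""
    improved = True
    tour = tour[:]
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(cand, pts) < tour_length(tour, pts) - 1e-12:
                    tour, improved = cand, True
    return tour

rng = random.Random(3)
pts = [(rng.random(), rng.random()) for _ in range(25)]
start = list(range(25))
rng.shuffle(start)
opt = two_opt(start, pts)
assert tour_length(opt, pts) <= tour_length(start, pts)
```

Restarting this local search from many random tours, and recombining good tours with GA operators, is the "inner random restart" idea: cheap local moves provide the accuracy, and the meta-heuristic layer escapes the local optima that 2-opt alone gets stuck in.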

Sakurai, Yoshitaka; Takada, Kouhei; Onoyama, Takashi; Tsukamoto, Natsuki; Tsuruta, Setsuo

32

Service-Oriented Node Scheduling Scheme for Wireless Sensor Networks Using Markov Random Field Model

Future wireless sensor networks are expected to provide various sensing services, and energy efficiency is one of the most important criteria. A node scheduling strategy aims to increase network lifetime by selecting a set of sensor nodes to provide the required sensing services in a periodic manner. In this paper, we are concerned with the service-oriented node scheduling problem of providing multiple sensing services while maximizing the network lifetime. We first introduce how to model the data correlation for different services using a Markov Random Field (MRF) model. Second, we formulate the service-oriented node scheduling issue as three different problems, namely, the multi-service data denoising problem, which aims at minimizing the noise level of sensed data; the representative node selection problem, concerned with selecting a number of active nodes while determining the services they provide; and the multi-service node scheduling problem, which aims at maximizing the network lifetime. Third, we propose a Multi-service Data Denoising (MDD) algorithm, a novel multi-service Representative node Selection and service Determination (RSD) algorithm, and a novel MRF-based Multi-service Node Scheduling (MMNS) scheme to solve these three problems, respectively. Finally, extensive experiments demonstrate that the proposed scheme efficiently extends the network lifetime. PMID:25384005

Cheng, Hongju; Su, Zhihuang; Lloret, Jaime; Chen, Guolong

2014-01-01

33

Interval Estimation of Random Effects in Proportional Hazards Models with Frailties

Semi-parametric frailty models are widely used to analyze clustered survival data. In this paper, we propose the use of the hierarchical likelihood interval for individual frailties of the clusters, not the parameters of the frailty distribution. We study the relationship between hierarchical likelihood, empirical Bayesian, and fully Bayesian intervals for frailties. We show that our proposed interval can be interpreted as a frequentist confidence interval and Bayesian credible interval under a uniform prior. We also propose an adjustment of the proposed interval to avoid null intervals. Simulation studies show that the proposed interval preserves the nominal confidence level. The procedure is illustrated using data from a multicenter lung cancer clinical trial. PMID:23361438

Ha, Il Do; Vaida, Florin; Lee, Youngjo

2014-01-01

34

Background The inter-patient classification schema and the Association for the Advancement of Medical Instrumentation (AAMI) standards are important to the construction and evaluation of automated heartbeat classification systems. The majority of previously proposed methods that take the above two aspects into consideration use the same features and classification method to classify different classes of heartbeats. The performance of the classification system is often unsatisfactory with respect to the ventricular ectopic beat (VEB) and supraventricular ectopic beat (SVEB). Methods Based on the different characteristics of VEB and SVEB, a novel hierarchical heartbeat classification system was constructed. This was done in order to improve the classification performance of these two classes of heartbeats by using different features and classification methods. First, random projection and support vector machine (SVM) ensemble were used to detect VEB. Then, the ratio of the RR interval was compared to a predetermined threshold to detect SVEB. The optimal parameters for the classification models were selected on the training set and used in the independent testing set to assess the final performance of the classification system. Meanwhile, the effect of different lead configurations on the classification results was evaluated. Results Results showed that the performance of this classification system was notably superior to that of other methods. The VEB detection sensitivity was 93.9% with a positive predictive value of 90.9%, and the SVEB detection sensitivity was 91.1% with a positive predictive value of 42.2%. In addition, this classification process was relatively fast. Conclusions A hierarchical heartbeat classification system was proposed based on the inter-patient data division to detect VEB and SVEB. It demonstrated better classification performance than existing methods. 
It can be regarded as a promising system for detecting VEB and SVEB of unknown patients in clinical practice. PMID:24981916

2014-01-01

35

Frequently, studies are conducted in a real clinic setting. When the outcome of interest is collected longitudinally over a specified period of time, this design can lead to unequally spaced intervals and varying numbers of assessments. In our study, these features were embedded in a randomized, factorial design in which interventions to improve blood pressure control were delivered to both patients and providers. We examine the effect of the intervention and compare methods of estimation of both fixed effects and variance components in the multilevel generalized linear mixed model. Methods of comparison include penalized quasi-likelihood (PQL), adaptive quadrature, and Bayesian Monte Carlo methods. We also investigate the implications of reducing the data and analysis to baseline and final measurements. In the full analysis, the PQL fixed-effects estimates were closest to zero and confidence intervals were generally narrower than those of the other methods. The adaptive quadrature and Bayesian fixed-effects estimates were similar, but the Bayesian credible intervals were consistently wider. Variance component estimation was markedly different across methods, particularly for the patient-level random effects. In the baseline and final measurement analysis, we found that estimates and corresponding confidence intervals for the adaptive quadrature and Bayesian methods were very similar. However, the time effect was diminished and other factors also failed to reach statistical significance, most likely due to decreased power. When analyzing data from this type of design, we recommend using either adaptive quadrature or Bayesian methods to fit a multilevel generalized linear mixed model including all available measurements. PMID:18825655

Olsen, Maren K; DeLong, Elizabeth R; Oddone, Eugene Z; Bosworth, Hayden B

2008-12-20

36

NASA Technical Reports Server (NTRS)

NASA's Air Traffic Management Demonstration-1 (ATD-1) is a multi-year effort to demonstrate high-throughput, fuel-efficient arrivals at a major U.S. airport using NASA-developed scheduling automation, controller decision-support tools, and ADS-B-enabled Flight-Deck Interval Management (FIM) avionics. First-year accomplishments include the development of a concept of operations for managing scheduled arrivals flying Optimized Profile Descents with equipped aircraft conducting FIM operations, and the integration of laboratory prototypes of the core ATD-1 technologies. Following each integration phase, a human-in-the-loop simulation was conducted to evaluate and refine controller tools, procedures, and clearance phraseology. From a ground-side perspective, the results indicate the concept is viable and the operations are safe and acceptable. Additional training is required for smooth operations that yield notable benefits, particularly in the areas of FIM operations and clearance phraseology.

Callantine, Todd J.; Cabrall, Christopher; Kupfer, Michael; Omar, Faisal G.; Prevot, Thomas

2012-01-01

37

Randomized Online Scheduling on Two Uniform Machines

Leah Epstein; John Noga; Steve Seiden

Online scheduling of independent jobs with the objective to minimize the makespan (i.e., the length of the schedule, or equivalently the last job completion time). An input to this problem consists of a sequence

Epstein, Leah

38

NASA Technical Reports Server (NTRS)

We report a new method for the noninvasive characterization of the frequency response of the autonomic nervous system (ANS) in mediating fluctuations in heart rate (HR). The approach entails computation of the transfer function magnitude and phase between instantaneous lung volume and HR. Broad band fluctuations in lung volume were initiated when subjects breathed on cue to a sequence of beeps spaced randomly in time. We studied 10 subjects in both supine and standing positions. The transfer function, averaged among all the subjects, showed systematic differences between the two postures, reflecting the differing frequency responses of the sympathetic and parasympathetic divisions of the ANS.

Chen, M. H.; Berger, R. D.; Saul, J. P.; Stevenson, K.; Cohen, R. J.

1987-01-01

39

Confidence Intervals for Random Forests: The Jackknife and the Infinitesimal Jackknife

We study the variability of predictions made by bagged learners and random forests, and show how to estimate standard errors for these methods. Our work builds on variance estimates for bagging proposed by Efron (1992, 2013) that are based on the jackknife and the infinitesimal jackknife (IJ). In practice, bagged predictors are computed using a finite number B of bootstrap replicates, and working with a large B can be computationally expensive. Direct applications of jackknife and IJ estimators to bagging require B = Θ(n^1.5) bootstrap replicates to converge, where n is the size of the training set. We propose improved versions that only require B = Θ(n) replicates. Moreover, we show that the IJ estimator requires 1.7 times fewer bootstrap replicates than the jackknife to achieve a given accuracy. Finally, we study the sampling distributions of the jackknife and IJ variance estimates themselves. We illustrate our findings with multiple experiments and simulation studies.
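The IJ variance estimate for bagging referred to above has the form V_IJ = Σ_i Cov_b(N_bi, t_b)², where N_bi counts how often observation i appears in bootstrap replicate b and t_b is the b-th base prediction. A minimal sketch follows, using a deliberately trivial base learner (the bootstrap-sample mean rather than a tree) so it stays self-contained; it is an illustration of the formula, not the authors' random-forest implementation.

```python
import random

def bagged_mean_with_ij(data, B=2000, seed=0):
    # Bag a trivial base learner (the sample mean) and estimate the
    # variance of the bagged prediction with the infinitesimal jackknife:
    #   V_IJ = sum_i Cov_b(N_bi, t_b)^2
    # where N_bi counts appearances of observation i in bootstrap b
    # and t_b is the b-th base prediction.
    rng = random.Random(seed)
    n = len(data)
    counts = []   # N_bi for each bootstrap replicate
    preds = []    # t_b for each bootstrap replicate
    for _ in range(B):
        idx = [rng.randrange(n) for _ in range(n)]
        c = [0] * n
        for i in idx:
            c[i] += 1
        counts.append(c)
        preds.append(sum(data[i] for i in idx) / n)
    t_bar = sum(preds) / B
    v_ij = 0.0
    for i in range(n):
        n_bar = sum(c[i] for c in counts) / B
        cov = sum((c[i] - n_bar) * (t - t_bar)
                  for c, t in zip(counts, preds)) / B
        v_ij += cov * cov
    return t_bar, v_ij
```

For the mean learner, V_IJ should approach the familiar Var(x)/n; with a finite B the estimate carries the Monte Carlo bias the abstract discusses.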

Wager, Stefan; Hastie, Trevor; Efron, Bradley

2014-01-01

40

Background A Food and Drug Administration (FDA) safety communication in August 2011 warned that citalopram was associated with a dose dependent risk of QT prolongation and recommended dose restriction in patients over the age of 60 but did not provide data for this age group. Methods CitAD was a randomized, double-masked, placebo-controlled, multicenter clinical trial for agitation in Alzheimer's disease (AD). Participants were assigned to citalopram (target dose of 30 mg/day) or placebo in a 1:1 ratio. 186 people, 181 of whom were over the age of 60, having probable AD with clinically significant agitation were recruited from September 2009 to January 2013. After the FDA safety communication about citalopram, ECG was added to the required study procedures before enrollment and repeated at week 3 to monitor change in QTc interval. Forty-eight participants were enrolled after enhanced monitoring began. Results Citalopram treatment was associated with a larger increase in QTc interval than placebo (difference in week 3 QTc adjusting for baseline QTc: 18.1 ms [95% CI: 6.1, 30.1]; p = 0.004). More participants in the citalopram group had an increase ≥30 ms from baseline to week 3 (7 in citalopram versus 1 in placebo; Fisher's exact p = 0.046), but only slightly more in the citalopram group met a gender-specific threshold for prolonged QTc (450 ms for males; 470 ms for females) at any point during follow-up (3 in citalopram versus 1 in placebo, Fisher's exact p = 0.611). One of the citalopram participants who developed prolonged QTc also displayed ventricular bigeminy. No participants in either group had a cardiovascular-related death. Conclusion Citalopram at 30 mg/day was associated with improvement in agitation in patients with AD but was also associated with QT prolongation. Trial Registration ClinicalTrials.gov NCT00898807 PMID:24914549

Drye, Lea T.; Spragg, David; Devanand, D. P.; Frangakis, Constantine; Marano, Christopher; Meinert, Curtis L.; Mintzer, Jacobo E.; Munro, Cynthia A.; Pelton, Gregory; Pollock, Bruce G.; Porsteinsson, Anton P.; Rabins, Peter V.; Rosenberg, Paul B.; Schneider, Lon S.; Shade, David M.; Weintraub, Daniel; Yesavage, Jerome; Lyketsos, Constantine G.

2014-01-01

41

Background Meta-regression is becoming increasingly used to model study level covariate effects. However this type of statistical analysis presents many difficulties and challenges. Here two methods for calculating confidence intervals for the magnitude of the residual between-study variance in random effects meta-regression models are developed. A further suggestion for calculating credible intervals using informative prior distributions for the residual between-study variance is presented. Methods Two recently proposed and, under the assumptions of the random effects model, exact methods for constructing confidence intervals for the between-study variance in random effects meta-analyses are extended to the meta-regression setting. The use of Generalised Cochran heterogeneity statistics is extended to the meta-regression setting and a Newton-Raphson procedure is developed to implement the Q profile method for meta-analysis and meta-regression. WinBUGS is used to implement informative priors for the residual between-study variance in the context of Bayesian meta-regressions. Results Results are obtained for two contrasting examples, where the first example involves a binary covariate and the second involves a continuous covariate. Intervals for the residual between-study variance are wide for both examples. Conclusions Statistical methods, and R computer software, are available to compute exact confidence intervals for the residual between-study variance under the random effects model for meta-regression. These frequentist methods are almost as easily implemented as their established counterparts for meta-analysis. Bayesian meta-regressions are also easily performed by analysts who are comfortable using WinBUGS. Estimates of the residual between-study variance in random effects meta-regressions should be routinely reported and accompanied by some measure of their uncertainty. Confidence and/or credible intervals are well-suited to this purpose. PMID:25196829

2014-01-01

42

ERIC Educational Resources Information Center

Introduces a computer package written for Mathematica, the purpose of which is to perform a number of difficult iterative functions with respect to the squared multiple correlation coefficient under the fixed and random models. These functions include computation of the confidence interval upper and lower bounds, power calculation, calculation of…

Mendoza, Jorge L.; Stafford, Karen L.

2001-01-01

43

ERIC Educational Resources Information Center

An experiment with adult humans investigated the effects of response-contingent money loss (response-cost punishment) on monetary-reinforced responding. A yoked-control procedure was used to separate the effects on responding of the response-cost contingency from the effects of reduced reinforcement density. Eight adults pressed buttons for money…

Pietras, Cynthia J.; Brandt, Andrew E.; Searcy, Gabriel D.

2010-01-01

44

Background Although, in Western countries, oxaliplatin-based regimens have been established as a gold standard treatment for patients with stage III or high risk stage II colon cancer after curative resection, in Japan fluorouracil-based regimens have been widely accepted and recommended in the guidelines for adjuvant settings in patients with stage III colon cancer. S-1, an oral preparation evolved from uracil and tegafur, has equivalent efficacy to uracil and tegafur/leucovorin for treating patients with advanced colorectal cancer and might be a suitable regimen in an adjuvant setting. However, the completion rate of the standard six-week cycle of the S-1 regimen is poor and the establishment of an optimal treatment schedule is critical. Therefore, we will conduct a multicenter randomized phase II trial to compare six-week and three-week cycles to establish the optimal schedule of S-1 adjuvant therapy for patients with stage III colon cancer after curative resection. Methods/Design The study is an open-label, multicenter randomized phase II trial. The primary endpoint of this study is three-year disease-free survival rate. Secondary endpoints are the completion rate of the treatment, relative dose intensity, overall survival, disease-free survival, and incidence of adverse events. The sample size was 200, determined with a significance level of 0.20, power of 0.80, and non-inferiority margin of a 10% absolute difference in the primary endpoint. Discussion Although S-1 has not been approved yet as a standard treatment of colon cancer in an adjuvant setting, it is a promising option. Moreover, in Japan S-1 is a standard treatment for patients with stage II/III gastric cancer after curative resection and a promising option for patients with colorectal liver metastases in an adjuvant setting. 
However, a six-week cycle of treatment is not considered to be the best schedule, and some clinicians use a modified schedule, such as a three-week cycle to keep a sufficient dose intensity with few adverse events. Therefore, it will be useful to determine whether a three-week cycle has an equal or greater efficacy and tolerance to side-effects compared with the standard six-week cycle schedule, and thus may be the most suitable treatment schedule for S-1 treatment. Trial registration The University Hospital Medical Information Network (UMIN) Clinical Trials Registry UMIN000006750. PMID:23320901

2013-01-01

45

Background In research clinic settings, overweight adults undertaking HIIT (high intensity interval training) improve their fitness as effectively as those undertaking conventional walking programs but can do so within a shorter time spent exercising. We undertook a randomized controlled feasibility (pilot) study aimed at extending HIIT into a real world setting by recruiting overweight/obese, inactive adults into a group based activity program, held in a community park. Methods Participants were allocated into one of three groups. The two interventions, aerobic interval training and maximal volitional interval training, were compared with an active control group undertaking walking based exercise. Supervised group sessions (36 per intervention) were held outdoors. Cardiorespiratory fitness was measured using VO2max (maximal oxygen uptake, results expressed in ml/min/kg), before and after the 12 week interventions. Results On ITT (intention to treat) analyses, baseline (N = 49) and exit (N = 39) VO2max was 25.3±4.5 and 25.3±3.9, respectively. Participant allocation and baseline/exit VO2max by group was as follows: Aerobic interval training N = 16, 24.2±4.8/25.6±4.8; maximal volitional interval training N = 16, 25.0±2.8/25.2±3.4; walking N = 17, 26.5±5.3/25.2±3.6. The post intervention change in VO2max was +1.01 in the aerobic interval training, −0.06 in the maximal volitional interval training and −1.03 in the walking subgroups. The aerobic interval training subgroup increased VO2max compared to walking (p = 0.03). The actual (observed, rather than prescribed) time spent exercising (minutes per week, ITT analysis) was 74 for aerobic interval training, 45 for maximal volitional interval training and 116 for walking (p = 0.001). On descriptive analysis, the walking subgroup had the fewest adverse events.
Conclusions In contrast to earlier studies, the improvement in cardiorespiratory fitness in a cohort of overweight/obese participants undertaking aerobic interval training in a real world setting was modest. The most likely reason for this finding relates to reduced adherence to the exercise program, when moving beyond the research clinic setting. Trial Registration ACTR.org.au ACTRN12610000295044 PMID:24454698

Lunt, Helen; Draper, Nick; Marshall, Helen C.; Logan, Florence J.; Hamlin, Michael J.; Shearman, Jeremy P.; Cotter, James D.; Kimber, Nicholas E.; Blackwell, Gavin; Frampton, Christopher M. A.

2014-01-01

46

Scheduling Randomly-Deployed Heterogeneous Video Sensor Nodes for Reduced Intrusion Detection Time

NASA Astrophysics Data System (ADS)

This paper proposes to use video sensor nodes to provide an efficient intrusion detection system. We use a scheduling mechanism that takes into account the criticality of the surveillance application and present a performance study of various cover set construction strategies that take into account cameras with heterogeneous angle of view and those with very small angle of view. We show by simulation how a dynamic criticality management scheme can provide fast event detection for mission-critical surveillance applications by increasing the network lifetime and providing low stealth time of intrusions.
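Cover-set construction of the kind mentioned above can be sketched with a simple greedy rule. The inputs (a map from each node to the set of targets its camera covers) and the tie-breaking are assumptions for illustration; the paper's strategies for heterogeneous angles of view are richer than this.

```python
def greedy_cover_sets(coverage, targets):
    # Partition sensor nodes into disjoint cover sets, each covering all
    # targets; the more disjoint cover sets exist, the longer the network
    # can run by activating one cover set at a time (duty cycling).
    # coverage: dict node -> set of targets that node's camera covers
    unused = set(coverage)
    cover_sets = []
    while True:
        chosen, covered = [], set()
        while covered != targets:
            # Greedy set cover: pick the unused node covering the most
            # still-uncovered targets.
            best = max(unused - set(chosen),
                       key=lambda n: len(coverage[n] - covered),
                       default=None)
            if best is None or not (coverage[best] - covered):
                return cover_sets  # cannot complete another cover set
            chosen.append(best)
            covered |= coverage[best]
        cover_sets.append(chosen)
        unused -= set(chosen)
```

A criticality-aware scheduler, as in the paper, would then decide how long and how often each cover set stays active.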

Pham, Congduc

47

We examined the impact of self-controlled knowledge of results on the acquisition, retention, and transfer of anticipation timing skill as a function of random and blocked practice schedules. Forty-eight undergraduate students were divided into experimental groups that practiced under varying combinations of random or blocked as well as self-controlled or yoked practice conditions. Anticipation timing performance (5, 13, and 21 mph) was recorded during acquisition and during a short term no-feedback retention test. A transfer test, administered 24 h after the retention test, consisted of two novel anticipation timing speeds (9, 17 mph). Absolute error (AE) and variable error (VE) of timing served as the dependent measures. All participants improved their accuracy and consistency across acquisition blocks; however, those who practiced under blocked rather than random conditions had greater accuracy (lower AE) regardless of feedback delivery. During retention and transfer, those who practiced under random conditions showed greater consistency (lower VE) compared to their blocked counterparts. Finally, participants who controlled their feedback schedule were more accurate (lower AE) and less variable (lower VE) during transfer compared to yoked participants, regardless of practice scheduling. Our findings indicate that practicing under a random schedule improves retention and transfer consistency, while self-control of feedback is advantageous to both the accuracy and consistency with which anticipation timing skill transfers to novel task demands. The combination of these learning manipulations, however, does not improve skill retention or transfer above and beyond their orthogonal effects. PMID:23233843

Ali, Asif; Fawver, Bradley; Kim, Jingu; Fairbrother, Jeffrey; Janelle, Christopher M.

2012-01-01

48

Is walking a random walk? Evidence for long-range correlations in stride interval of human gait

Division on Aging, Harvard Medical School; and Department of Biomedical Engineering, Boston University

… -range correlations, or 3) long-range correlations with power-law scaling. To test these possibilities, the stride … hundreds of steps; the stride interval at any time depended on the stride interval at remote previous
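One generic diagnostic for the third possibility (long-range correlations with power-law scaling) is the aggregated-variance method: for a series with Hurst exponent H, the variance of block means scales as m^(2H−2), so a log-log slope near −1 indicates an uncorrelated series (H = 0.5) while shallower slopes indicate persistence. The sketch below is an illustration of this idea under those assumptions, not necessarily the analysis used in the study.

```python
import math
import random

def aggregated_variance_slope(x, block_sizes=(4, 8, 16, 32, 64)):
    # Aggregated-variance method: compute the variance of block means
    # for several block sizes m, then fit the slope of log(var) vs
    # log(m).  Slope -1 => no long-range correlation; slope > -1 =>
    # persistent, power-law correlated series.
    logs_m, logs_v = [], []
    for m in block_sizes:
        k = len(x) // m
        means = [sum(x[i * m:(i + 1) * m]) / m for i in range(k)]
        mu = sum(means) / k
        var = sum((v - mu) ** 2 for v in means) / (k - 1)
        logs_m.append(math.log(m))
        logs_v.append(math.log(var))
    # Least-squares slope of log(var) against log(m).
    n = len(block_sizes)
    mx, my = sum(logs_m) / n, sum(logs_v) / n
    return sum((a - mx) * (b - my) for a, b in zip(logs_m, logs_v)) / \
           sum((a - mx) ** 2 for a in logs_m)
```

Applied to stride-interval data, a slope substantially above −1 would be consistent with the long-range correlations the study reports.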

49

The Stata command xtreg estimates the random-effects linear regression model, for which the random effects are assumed to be normally distributed with zero mean and non-negative variance, s^2_{i,t}. Testing homogeneity across units is equivalent to testing the null hypothesis H_0: s^2_{i,t} = 0, which is a value on the boundary of the parameter space. The command xtreg provides the upper-tail

Matteo Bottai; Nicola Orsini

2004-01-01

50

NASA Astrophysics Data System (ADS)

Intervals are a new, higher-level primitive for parallel programming with which programmers directly construct the program schedule. Programs using intervals can be statically analyzed to ensure that they do not deadlock or contain data races. In this paper, we demonstrate the flexibility of intervals by showing how to use them to emulate common parallel control-flow constructs like barriers and signals, as well as higher-level patterns such as bounded-buffer producer-consumer. We have implemented intervals as a publicly available library for Java and Scala.

Matsakis, Nicholas D.; Gross, Thomas R.

51

This book has seven chapters. In chapter one we give the basics needed to make this book a self contained one. Chapter two introduces the notions of interval semigroups and interval semifields, which are algebraically analysed. Chapter three introduces special types of interval semirings like matrix interval semirings and interval polynomial semirings. Chapter four for the first time introduces the notions of group interval semirings, semigroup interval semirings, loop interval semirings and groupoid interval semirings, and these structures are studied. Interval neutrosophic semirings are introduced in chapter five. Applications of these structures are given in chapter six. The final chapter suggests around 120 problems for the reader.

W. B. Vasantha Kandasamy; Florentin Smarandache

2011-06-02

53

ERIC Educational Resources Information Center

Schedules of reinforcement typically produce reliable patterns of behaviour, and one factor that can cause deviations from these normally reliable patterns is schizotypy. Low scorers on the unusual experiences subscale of the Oxford-Liverpool Inventory of Feelings and Experiences performed as expected on a yoked random-ratio (RR), random-interval…

Randell, Jordan; Searle, Rob; Reed, Phil

2012-01-01

54

This paper discusses automated scheduling as it applies to complex domains such as factories, transportation, and communications systems. The window-constrained-packing problem is introduced as an ideal model of the scheduling trade-offs. Specific algorithms are compared in terms of simplicity, speed, and accuracy. In particular, dispatch, look-ahead, and genetic algorithms are statistically compared on randomly generated job sets. The conclusion
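A dispatch algorithm in this setting applies a greedy priority rule job by job with no look-ahead. A minimal sketch for a single machine with release/deadline windows follows; the job model and the earliest-due-date rule are assumptions for illustration, not the paper's exact formulation.

```python
def dispatch_schedule(jobs):
    # Greedy dispatch rule for window-constrained scheduling on one
    # machine: sort by deadline (earliest due date first) and place each
    # job at the earliest feasible time inside its window; jobs that no
    # longer fit are dropped.
    # jobs: list of (name, duration, release, deadline)
    schedule, t = [], 0
    for name, dur, release, deadline in sorted(jobs, key=lambda j: j[3]):
        start = max(t, release)
        if start + dur <= deadline:
            schedule.append((name, start, start + dur))
            t = start + dur
    return schedule
```

Look-ahead and genetic algorithms trade this rule's speed for better packing: they explore orderings instead of committing to one greedy sequence.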

William J. Wolfe; David Wood; Steve Sorensen

1996-01-01

55

Background High-intensity interval training has been shown to be a time-efficient way to induce physiological adaptations similar to those of traditional endurance training. Creatine supplementation may enhance high-intensity interval training, leading to even greater physiological adaptations. The purpose of this study was to determine the effects of high-intensity interval training (HIIT) and creatine supplementation on cardiorespiratory fitness and endurance performance (maximal oxygen consumption (VO2PEAK), time-to-exhaustion (VO2PEAKTTE), ventilatory threshold (VT), and total work done (TWD)) in college-aged men. Methods Forty-three recreationally active men completed a graded exercise test to determine VO2PEAK, VO2PEAKTTE, and VT. In addition, participants completed a time to exhaustion (TTE) ride at 110% of the maximum workload reached during the graded exercise test to determine TWD (TTE (sec) × W = J). Following testing, participants were randomly assigned to one of three groups: creatine (creatine citrate) (Cr; n = 16), placebo (PL; n = 17), or control (n = 10) groups. The Cr and PL groups completed four weeks of HIIT prior to post-testing. Results Significant improvements in VO2PEAK and VO2PEAKTTE occurred in both training groups. Only the Cr group significantly improved VT (16% vs. 10% improvement in PL). No changes occurred in TWD in any group. Conclusion In conclusion, HIIT is an effective and time-efficient way to improve maximal endurance performance. The addition of Cr improved VT, but did not increase TWD. Therefore, 10 g of Cr per day for five days per week for four weeks does not seem to further augment maximal oxygen consumption, greater than HIIT alone; however, Cr supplementation may improve submaximal exercise performance. PMID:19909536

2009-01-01

56

Much of the research on operations scheduling problems has either ignored setup times or assumed that setup times on each machine are independent of the job sequence. Furthermore, most scheduling problems that have been discussed in the literature are under the assumption that machines are continuously available. Nevertheless, in most real-life industries a machine can be unavailable for many reasons,

M. Zandieh; M. Gholami

2009-01-01

57

Background The target delivery channel of RTS,S candidate malaria vaccines in malaria-endemic countries in Africa is the World Health Organisation Expanded Program on Immunization. As an Adjuvant System, age de-escalation and schedule selection step, this study assessed 3 schedules of RTS,S/AS01E and RTS,S/AS02D in infants and young children 5–17 months of age in Ghana. Methodology A Phase II, partially-blind randomized controlled study (blind to vaccine, not to schedule), of 19 months duration was conducted in two (2) centres in Ghana between August 2006 and May 2008. Subjects were allocated randomly (1:1:1:1:1:1) to one of six study groups at each study site, each defining which vaccine should be given and by which schedule (0,1-, 0,1,2- or 0,1,7-months). For the 0,1,2-month schedule participants received RTS,S/AS01E or rabies vaccine at one center and RTS,S/AS01E or RTS,S/AS02D at the other. For the other schedules at both study sites, they received RTS,S/AS01E or RTS,S/AS02D. The primary outcome measure was the occurrence of serious adverse events until 10 months post dose 1. Results The number of serious adverse events reported across groups was balanced. One child had a simple febrile convulsion, which evolved favourably without sequelae, considered to be related to RTS,S/AS01E vaccination. Low grade reactions occurred slightly more frequently in recipients of RTS,S/AS than rabies vaccines; grade 3 reactions were infrequent. Less local reactogenicity occurred with RTS,S/AS01E than RTS,S/AS02D. Both candidate vaccines were highly immunogenic for anti-circumsporozoite and anti-Hepatitis B Virus surface antigen antibodies. Recipients of RTS,S/AS01E compared to RTS,S/AS02D had higher peak anti-circumsporozoite antibody responses for all 3 schedules. Three dose schedules were more immunogenic than 2 dose schedules. Area under the curve analyses for anti-circumsporozoite antibodies were comparable between the 0,1,2- and 0,1,7-month RTS,S/AS01E schedules.
Conclusions Both candidate malaria vaccines were well tolerated. Anti-circumsporozoite responses were greater with RTS,S/AS01E than RTS,S/AS02D and when 3 rather than 2 doses were given. This study supports the selection of RTS,S/AS01E and a 3 dose schedule for further development in children and infants. Trial Registration ClinicalTrials.gov NCT00360230 PMID:19806184

Owusu-Agyei, Seth; Ansong, Daniel; Asante, Kwaku; Kwarteng Owusu, Sandra; Owusu, Ruth; Wireko Brobby, Naana Ayiwa; Dosoo, David; Osei Akoto, Alex; Osei-Kwakye, Kingsley; Adjei, Emmanuel Asafo; Boahen, Kwadwo Owusu; Sylverken, Justice; Adjei, George; Sambian, David; Apanga, Stephen; Kayan, Kingsley; Vekemans, Johan; Ofori-Anyinam, Opokua; Leach, Amanda; Lievens, Marc; Demoitie, Marie-Ange; Dubois, Marie-Claude; Cohen, Joe; Ballou, W. Ripley; Savarese, Barbara; Chandramohan, Daniel; Gyapong, John Owusu; Milligan, Paul; Antwi, Sampson; Agbenyega, Tsiri; Greenwood, Brian; Evans, Jennifer

2009-01-01

58

Background Coronary artery disease (CAD) is accountable for more than 7 million deaths each year according to the World Health Organization (WHO). In a European population 80% of patients diagnosed with CAD are overweight and 31% are obese. Physical inactivity and overweight are major risk factors in CAD, thus central strategies in secondary prevention are increased physical activity and weight loss. Methods/Design In a randomized controlled trial 70 participants with stable CAD, age 45–75, body mass index 28–40 kg/m2 and no diabetes are randomized (1:1) to 12 weeks of intensive exercise or weight loss, each followed by a 40-week follow-up. The exercise protocol consists of supervised aerobic interval training (AIT) at 85-90% of VO2peak 3 times weekly for 12 weeks followed by supervised AIT twice weekly for 40 weeks. In the weight loss arm dieticians instruct the participants in a low energy diet (800–1000 kcal/day) for 12 weeks, followed by 40 weeks of weight maintenance combined with supervised AIT twice weekly. The primary endpoint of the study is change in coronary flow reserve after the first 12 weeks’ intervention. Secondary endpoints include cardiovascular, metabolic, inflammatory and anthropometric measures. Discussion The study will compare the short- and long-term effects of a protocol consisting of AIT alone or a rapid weight loss followed by AIT. Additionally, it will provide new insight into the mechanisms behind the benefits of exercise and weight loss. We wish to contribute to the creation of effective secondary prevention and sustainable rehabilitation strategies in the large population of overweight and obese patients diagnosed with CAD. Trial registration ClinicalTrials.gov: NCT01724567 PMID:24252596

2013-01-01

59

Purpose The dose-limiting toxicity of oxaliplatin (l-OHP) is neurotoxicity, which is characterized by an acute neuropathy and a clinically distinct chronic neuropathy. This randomized study evaluated whether prolonged l-OHP infusion over the conventional l-OHP schedule was useful in reducing acute, and possibly chronic, l-OHP-induced neurotoxicity in colon and gastric cancer patients receiving an l-OHP-based regimen as adjuvant chemotherapy.

Methods Sixty-four patients

Roberto Petrioli; Alessandra Pascucci; Edoardo Francini; Stefania Marsili; Angela Sciandivasci; Rossana Tassi; Serenella Civitelli; Gabriello Tanzini; Marco Lorenzi; Guido Francini

2008-01-01

60

NSDL National Science Digital Library

This applet, created by Ivo Dinov of the University of California at Berkeley, introduces the concept of confidence intervals. Select an alpha level, sample size, and the number of experiments, and click "Play." For each sample, the applet will show the data points as blue dots and the confidence interval as a red, vertical line. The true population mean is shown as a horizontal purple line, and green ovals indicate which intervals do not contain the true mean.
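The coverage behaviour this applet demonstrates can be sketched in a few lines of Python. The z-value (1.96 for a 95% interval), sample size, and experiment count below are illustrative choices, and `ci_coverage` is a hypothetical helper, not part of the applet:

```python
import random
import statistics

def ci_coverage(n_experiments=2000, sample_size=30, z=1.96, mu=0.0, sigma=1.0, seed=1):
    """Draw repeated samples and count how often the z-based interval covers mu."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_experiments):
        sample = [rng.gauss(mu, sigma) for _ in range(sample_size)]
        m = statistics.fmean(sample)
        half = z * sigma / sample_size ** 0.5   # known-sigma half-width
        if m - half <= mu <= m + half:          # does the interval contain the true mean?
            hits += 1
    return hits / n_experiments

print(ci_coverage())  # close to the nominal 0.95
```

Each simulated interval here plays the role of one red vertical line in the applet; the ones that miss `mu` correspond to the green-oval intervals.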

Dinov, Ivo

2009-01-14

61

Due to the reduced physical activity of patients who have undergone total hip arthroplasty (THA), there are no home-based exercise training regimens for preventing muscle atrophy and aerobic capacity impairment in these patients. We examined whether interval walking training (IWT) could prevent these issues. Twenty-eight female patients (≥60 years of age) who had undergone THA more than 2 months prior were randomly divided into IWT (n = 14) and control (CNT, n = 14) groups. The IWT subjects trained at a target of 60 min of fast walking at >70% peak aerobic capacity for walking (VO2peak) per wk for 12 wk, while those in the CNT maintained their previous sedentary life during the same period. We measured the energy expenditure of daily physical activity, except during sleeping and bathing, every minute and every day during the intervention. We also measured the isometric knee extension (FEXT) and flexion (FFLX) forces, VO2peak, and the anaerobic threshold during graded cycling exercise (VO2AT) before and after the intervention. All subjects, except for one in IWT, completed the protocol. FFLX increased by 23% on the operated side (P = 0.003) and 14% on the non-operated side in IWT (P = 0.006), while it only increased on the operated side in CNT (P = 0.03). The VO2peak and VO2AT in IWT increased by 8% (P = 0.08) and 13% (P = 0.002), respectively, and these changes were significantly higher in the IWT than in the CNT group (both P < 0.05). In conclusion, IWT might be an effective home-based training regimen for preventing the muscle atrophy caused by reduced daily physical activity in THA patients. Trial Registration UMIN-CTR UMIN000013172 PMID:25268505

Morishima, Yutaka; Mizushima, Takashi; Yamauchi, Katsuya; Morikawa, Mayuko; Masuki, Shizue; Nose, Hiroshi

2014-01-01

62

This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method. PMID:24550708

Xu, Jiuping

2014-01-01

63

Interactive Applet: Confidence Intervals

NSDL National Science Digital Library

This applet from McGraw-Hill Higher Education is designed to help students visualize the concept of confidence intervals by allowing the user to select sample size, standard deviation, and confidence level, and then test random samples. It accompanies "Statistical Techniques in Business and Economics," but does not require use of the text.

2011-10-03

64

NSDL National Science Digital Library

This site, presented by the Department of Statistics at Yale University, gives an explanation of, a definition for, and an example of confidence intervals. It covers topics including inference about the population mean and z and t critical values. The page nicely blends graphs and visuals, along with text, to provide a thorough presentation.

Lacey, Michelle

65

NASA Technical Reports Server (NTRS)

This paper provides an analytical formulation to predict scheduling success for a class of problems frequently referred to as activity scheduling. Space Network communications scheduling is an example of activity scheduling. The principal assumption is that the activity start times are randomly distributed over the available time in the time line. The formulation makes it possible to estimate how much of the demand can be scheduled as a function of the demand, number of resources, activity duration, and activity flexibility. The paper includes computed results for a variety of resource and demand conditions. The results demonstrate that even with highly flexible activities, it is difficult to schedule demand greater than 60 percent of resources without the use of optimization and conflict resolution capabilities in the scheduling system.
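The paper's principal assumption (activity start times randomly distributed over the timeline) lends itself to a small Monte Carlo sketch of schedulable demand. The greedy first-fit rule and all parameter values here are simplifying assumptions for illustration, not the paper's analytical formulation:

```python
import random

def scheduled_fraction(n_activities, n_resources, duration, horizon, seed=0):
    """Greedy first-fit: each activity requests a uniformly random start time and
    is scheduled if any resource is free for [start, start + duration)."""
    rng = random.Random(seed)
    busy = [[] for _ in range(n_resources)]   # occupied intervals per resource
    placed = 0
    for _ in range(n_activities):
        start = rng.uniform(0, horizon - duration)
        for intervals in busy:
            # free iff the new interval overlaps nothing already on this resource
            if all(start + duration <= s or e <= start for s, e in intervals):
                intervals.append((start, start + duration))
                placed += 1
                break
    return placed / n_activities

# demand here is 150 * 10 / (3 * 1000) = 50% of total resource time
print(scheduled_fraction(150, 3, 10.0, 1000.0))
```

Sweeping the demand upward in such a simulation shows the saturation effect the paper describes: without conflict resolution, the scheduled fraction falls off well before resources are fully loaded.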

Messing, Fredric

1993-01-01

66

Background and objectives: Interbirth intervals (IBIs) mediate a trade-off between child number and child survival. Life history theory predicts that the evolutionarily optimal IBI differs for different individuals whose fitness is affected by how closely a mother spaces her children. The objective of the article is to clarify these conflicts and explore their implications for public health. Methodology: Simple models of inclusive fitness and kin conflict address the evolution of human birth-spacing. Results: Genes of infants generally favor longer intervals than genes of mothers, and infant genes of paternal origin generally favor longer IBIs than genes of maternal origin. Conclusions and implications: The colonization of maternal bodies by offspring cells (fetal microchimerism) raises the possibility that cells of older offspring could extend IBIs by interfering with the implantation of subsequent embryos. PMID:24480612

Haig, David

2014-01-01

67

The aim of this phase II trial was to estimate the objective response rate (ORR) of two different schedules of ixabepilone [weekly or every 3 weeks (Q3W)] combined with bevacizumab, relative to a reference arm of weekly paclitaxel and bevacizumab. Patients with human epidermal growth factor receptor 2-normal, chemotherapy-naïve metastatic breast cancer (MBC) were randomized 3:3:2 to ixabepilone 16 mg/m² weekly plus bevacizumab 10 mg/kg Q2W (Arm A: n = 46); ixabepilone 40 mg/m² Q3W (reduced to 32 mg/m² after four cycles of treatment) plus bevacizumab 15 mg/kg Q3W (Arm B: n = 45); or paclitaxel 90 mg/m² weekly plus bevacizumab 10 mg/kg intravenous infusion Q2W (Arm C: n = 32). Of 123 randomized patients, 122 were treated. All were followed for ≥19 months; 5% of patients remained on study treatment at the time of this analysis. Grade 3 or 4 neutropenia was more common in Arm B (60%) than Arms A (16%) or C (22%); other adverse events were similar. The investigator-assessed ORR was 48, 71, and 63% for Arms A, B, and C, respectively. Median progression-free survival (randomized patients) was 9.6 months in Arm A, 11.9 months in Arm B, and 13.5 months in Arm C. In conclusion, ixabepilone Q3W plus bevacizumab has clinical activity as first-line therapy for MBC relative to paclitaxel plus bevacizumab, but with significantly greater risk of grade 3 or 4 neutropenia. In addition, these data suggest that weekly dosing of ixabepilone may be less active than Q3W dosing, but with less neutropenia. PMID:23649189

Rugo, Hope S; Campone, Mario; Amadori, Dino; Aldrighetti, Daniela; Conte, Pierfranco; Wardley, Andrew; Villanueva, Cristian; Melisko, Michelle; McHenry, M Brent; Liu, David; Lee, Francis; Pivot, Xavier

2013-06-01

68

This randomized, partially-blind study (ClinicalTrials.gov registration number NCT00541970) evaluated the immunogenicity and safety of 2-dose (2D) schedules of the HPV-16/18 AS04-adjuvanted vaccine. Results to month (M) 24 have been reported previously and we now report data to M48 focusing on the licensed vaccine formulation (20 µg each of HPV-16 and -18 antigens) administered at M0,6 compared with the standard 3-dose (3D) schedule (M0,1,6). Healthy females (age stratified: 9-14, 15-19, 20-25 years) were randomized to receive 2D at M0,6 (n = 240) or 3D at M0,1,6 (n = 239). In the according-to-protocol immunogenicity cohort, all initially seronegative subjects seroconverted for HPV-16 and -18 antibodies and remained seropositive up to M48. For both HPV-16 and -18, geometric mean antibody titer (GMT) ratios (3D schedule in women aged 15-25 years divided by 2D schedule in girls aged 9-14 years) at M36 and M48 were close to 1, as they were at M7 when non-inferiority was demonstrated. The kinetics of HPV-16, -18, -31, and -45 antibody responses were similar for both groups and HPV-16 and -18 GMTs were substantially higher than natural infection titers. The vaccine had a clinically acceptable safety profile in both groups. In summary, antibody responses to a 2D M0,6 schedule of the licensed vaccine formulation in girls aged 9-14 years appeared comparable to the standard 3D schedule in women aged 15-25 years up to 4 years after first vaccination. A 2D schedule could facilitate implementation of HPV vaccination programs and improve vaccine coverage and series completion rates. PMID:24576907

Romanowski, Barbara; Schwarz, Tino F; Ferguson, Linda M; Ferguson, Murdo; Peters, Klaus; Dionne, Marc; Schulze, Karin; Ramjattan, Brian; Hillemanns, Peter; Behre, Ulrich; Suryakiran, Pemmaraju; Thomas, Florence; Struyf, Frank

2014-05-01

69

Pitch strength of regular-interval click trains with different length “runs” of regular intervals

Click trains were generated with first- and second-order statistics following Kaernbach and Demany [J. Acoust. Soc. Am. 104, 2298–2306 (1998)]. First-order intervals are between successive clicks, while second-order intervals are those between every other click. Click trains were generated with a repeating alternation of fixed and random intervals which produce a pitch at the reciprocal of the duration of the fixed interval. The intervals were then randomly shuffled and compared to the unshuffled, alternating click trains in pitch-strength comparison experiments. In almost all comparisons for the first-order interval stimuli, the shuffled-interval click trains had a stronger pitch strength than the unshuffled-interval click trains. The shuffled-interval click trains only produced stronger pitches for second-order interval stimuli when the click trains were unfiltered. Several experimental conditions and an analysis of runs of regular and random intervals in these click trains suggest that the auditory system is sensitive to runs of regular intervals in a stimulus that contains a mix of regular and random intervals. These results indicate that fine-structure regularity plays a more important role in pitch perception than randomness, and that the long-term autocorrelation function or spectra of these click trains are not good predictors of pitch strength. PMID:15957774

Yost, William A.; Mapes-Riordan, Dan; Shofner, William; Dye, Raymond; Sheft, Stanley

2009-01-01

70

Scheduling semiconductor wafer fabrication

The impact that scheduling can have on the performance of semi-conductor wafer fabrication facilities is assessed. The performance measure considered is the mean throughput time (sometimes called cycle time, turnaround time or manufacturing interval) for a lot of wafers. A variety of input control and sequencing rules are evaluated using a simulation model of a representative, but fictitious, semiconductor wafer

LAWRENCE M. WEIN

1988-01-01

71

The aim of this paper is to deal with resource-constrained multiple project scheduling problems (rc-mPSP) under a fuzzy random\\u000a environment by a hybrid genetic algorithm with fuzzy logic controller (flc-hGA), to a large-scale water conservancy and hydropower\\u000a construction project in the southwest region of China, whose main project is a dam embankment. The objective functions in\\u000a this paper are to

Jiuping Xu; Zhe Zhang

72

Purpose The training response of an intensified period of high-intensity exercise is not clear. Therefore, we compared the cardiovascular adaptations of completing 24 high-intensity aerobic interval training sessions carried out for either three or eight weeks, respectively. Methods Twenty-one healthy subjects (23.0±2.1 years, 10 females) completed 24 high-intensity training sessions throughout a time-period of either eight weeks (moderate frequency, MF) or three weeks (high frequency, HF) followed by a detraining period of nine weeks without any training. In both groups, maximal oxygen uptake (VO2max) was evaluated before training, at the 9th and 17th session and four days after the final 24th training session. In the detraining phase VO2max was evaluated after 12 days and thereafter every second week for eight weeks. Left ventricular echocardiography, carbon monoxide lung diffusion transfer factor, brachial artery flow-mediated dilatation and vastus lateralis maximal citrate synthase activity were tested before and after training. Results The cardiovascular adaptation after HF training was delayed compared to training with MF. Four days after ending training the HF group showed no improvement (+3.0%, p = 0.126), whereas the MF group reached their highest VO2max with a 10.7% improvement (p<0.001; group difference p = 0.035). The HF group reached their highest VO2max (6.1% increase, p = 0.026) twelve days into the detraining period, compared to a concomitant reduction to 7.9% of VO2max (p<0.001) above baseline in the MF group (group difference p = 0.609). Conclusion Both HF and MF training of high-intensity aerobic exercise improve VO2max. The cardiovascular adaptation following a HF programme of high-intensity exercise is however delayed compared to MF training. Trial Registration ClinicalTrials.gov NCT00733941. PMID:24516645

Hatle, Håvard; Støbakk, Per Kristian; Mølmen, Harald Edvard; Brønstad, Eivind; Tjønna, Arnt Erik; Steinshamn, Sigurd; Skogvoll, Eirik; Wisløff, Ulrik; Ingul, Charlotte Björk; Rognmo, Øivind

2014-01-01

73

Clairvoyant scheduling of random walks

Lecture slides by Péter Gács (Boston University), April 25, 2008, on the clairvoyant demon problem: two random walks X and Y move on the nodes 0, 1, 2, 3, 4, and in each instant either X or Y moves. A demon who knows both (infinite) walks completely decides which walk moves at each instant, e.g. Y = 0012111443..., where the repetitions are the demon's insertions.

Gacs, Peter

74

A hybrid genetic algorithm for the job shop scheduling problem

This paper presents a hybrid genetic algorithm for the Job Shop Scheduling problem. The chromosome representation of the problem is based on random keys. The schedules are constructed using a priority rule in which the priorities are defined by the genetic algorithm. Schedules are constructed using a procedure that generates parameterized active schedules. After a schedule is obtained a local
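The random-keys encoding the abstract mentions can be illustrated with a minimal sketch: each operation gets one gene in [0, 1), and sorting by key yields the priority order used to construct a schedule. The function names below are hypothetical, and the paper's parameterized active-schedule builder and local search are omitted:

```python
import random

def random_key_chromosome(n_ops, rng):
    """One random key in [0, 1) per operation; the keys encode priorities."""
    return [rng.random() for _ in range(n_ops)]

def decode_priorities(chromosome):
    """Higher key = higher priority: return operation indices sorted by key, descending."""
    return sorted(range(len(chromosome)), key=lambda i: -chromosome[i])

chrom = random_key_chromosome(5, random.Random(42))
print(decode_priorities(chrom))  # a permutation of 0..4
```

The appeal of random keys is that ordinary crossover and mutation on the real-valued genes always decode to a valid priority ordering, so no repair step is needed.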

José Fernando Gonçalves; Jorge José De Magalhães Mendes; Maurício G. C. Resende

2005-01-01

75

CONFIDENCE INTERVALS UNDER ORDER RESTRICTIONS

In this paper, we consider the problem of constructing confidence intervals (CIs) for G independent normal population means subject to linear ordering constraints. For this problem, CIs based on asymptotic distributions, likelihood ratio tests and bootstraps do not have good properties particularly when some of the population means are close to each other. We propose a new method based on defining intermediate random variables that are related to the original observations and using the CIs of the means of these intermediate random variables to restrict the original CIs from the separate groups. The coverage rates of the intervals are shown to exceed, but be close to, the nominal level for two groups, when the ratio of the variances is assumed known. Simulation studies show that the proposed CIs have coverage rates close to nominal levels with reduced average widths. Data on half-lives of an antibiotic are analyzed to illustrate the method. PMID:24505210

Park, Yongseok; Kalbfleisch, John D.; Taylor, Jeremy M.G.

2013-01-01

76

A phase II trial in advanced renal cell carcinoma (RCC) found no benefit in efficacy or safety between patients receiving oral sunitinib 50 mg/day for 4 weeks followed by 2 weeks off treatment (Schedule 4/2) and those receiving 37.5 mg continuous daily sunitinib. We hypothesized that fatigue would have a more variable “on-off” effect with the 4/2 schedule. A total of 292 patients completed two fatigue-related items on Days 1 and 29 of each treatment cycle. Mean absolute slopes were compared across treatments. A planned analysis of item “I feel fatigued” demonstrated that the mean absolute slope was greater in Schedule 4/2 compared to continuous dosing (0.042 vs. 0.032, P = 0.003), and analysis based on the change from Day 1 to Day 29 (0.52 vs. 0.21, P = 0.002) and, separately, Day 29 to the next Day 1 (−0.38 vs. −0.05, P < 0.001) showed the changes to be significantly larger in Schedule 4/2 than continuous dosing. “I have a lack of energy” showed a similar pattern graphically, however, the planned analysis was not statistically significant based on the absolute slopes but was when Day 1 to Day 29 and Day 29 to Day 1 changes were analyzed separately. The 4/2 arm was associated with a greater degree of variability in fatigue reflecting a possible “on-off” effect whereby patients receiving the 4/2 schedule reported less fatigue at the beginning of each cycle compared to Day 29. The findings can inform care for individuals with advanced RCC receiving intermittent dosing of sunitinib. PMID:25044922

Cella, David; Jensen, Sally E; Hahn, Elizabeth A; Beaumont, Jennifer L; Korytowsky, Beata; Bhattacharyya, Helen; Motzer, Robert

2014-01-01

77

Lottery Scheduling: Flexible Proportional-Share Resource Management

This paper presents lottery scheduling, a novel randomized resource allocation mechanism. Lottery scheduling provides efficient, responsive control over the relative execution rates of computations. Such control is beyond the capabilities of conventional schedulers, and is desirable in systems that service requests of varying importance, such as databases, media-based applications, and networks. Lottery scheduling also supports modular resource management by enabling
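The core draw of lottery scheduling is simple to sketch: each client holds some number of tickets, and the scheduler picks the holder of a uniformly random ticket, so long-run CPU shares track ticket ratios. The client names and ticket counts below are illustrative, not from the paper:

```python
import random

def lottery_pick(tickets, rng):
    """Return the client holding the winning ticket; each client wins with
    probability proportional to its ticket count."""
    total = sum(tickets.values())
    winner = rng.randrange(total)        # uniformly random winning ticket number
    for client, count in tickets.items():
        if winner < count:
            return client
        winner -= count                  # skip past this client's ticket range

rng = random.Random(0)
clients = {"db": 75, "media": 20, "batch": 5}   # relative shares, summing to 100
wins = {c: 0 for c in clients}
for _ in range(10_000):
    wins[lottery_pick(clients, rng)] += 1
print(wins)  # counts roughly proportional to 75 : 20 : 5
```

Because each drawing is independent, a client's share can be changed at any time just by adjusting its ticket count, which is what gives the mechanism its responsive, modular character.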

Carl A. Waldspurger; William E. Weihl

1994-01-01

78

Topological Constraints in Periodic Train Scheduling

It is well known that many scheduling problems can be modeled as constraint optimization problems. The scheduling of train services can be considered as a problem subject to a number of constraints describing railway infrastructure, required train services and reasonable time-intervals for waiting and transits. Railway optimization problems are known to be hard problems and a good solution or

M. A. Salido; M. Abril; F. Barber; L. Ingolotti; P. Tormos; A. Lova

79

Nearly optimal perfectly-periodic schedules

We consider the problem of scheduling a set of jobs on a single shared resource using time-multiplexing. A perfectly-periodic schedule is one where resource time is divided into equal-size “time-slot” quanta, and each job gets a time slot precisely every fixed interval of time (the period of the job). Periodic schedules are advantageous in distributed settings with synchronized clocks,

Amotz Bar-Noy; Aviv Nisgav; Boaz Patt-Shamir

2001-01-01

80

Schedulers for Larger Classes of Pinwheel Instances

The pinwheel is a hard-real-time scheduling problem for scheduling satellite ground stations to service a number of satellites without data loss. Given a multiset of positive integers (instance) A = {a1, ..., an}, the problem is to find an infinite sequence (schedule) of symbols from {1, 2, ..., n} such that there is at least one symbol i within any interval of ai symbols (slots). Not all instances A
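Although finding a pinwheel schedule is hard, verifying a candidate is straightforward: for each symbol i, check that every window of ai slots contains it. A small checker, assuming the schedule repeats a finite cycle (0-indexed symbols for convenience):

```python
def satisfies_pinwheel(instance, cycle):
    """Check a cyclic schedule against a pinwheel instance.

    instance: list where symbol i must appear in every window of instance[i] slots.
    cycle: finite repeating schedule; entries are symbols in range(len(instance))."""
    n = len(cycle)
    for sym, a in enumerate(instance):
        # examine every length-a window over the infinitely repeated cycle
        for start in range(n):
            window = [cycle[(start + k) % n] for k in range(a)]
            if sym not in window:
                return False
    return True

# density 1/2 + 1/4 + 1/8 <= 1; the cycle 0,1,0,2 schedules A = {2, 4, 8}
print(satisfies_pinwheel([2, 4, 8], [0, 1, 0, 2]))  # True
```

The modulo indexing makes windows wrap around the cycle, so the check covers the infinite schedule, not just one period.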

Mee Yee Chan; Francis Y. L. Chin

1993-01-01

81

Technology Transfer Automated Retrieval System (TEKTRAN)

Irrigation scheduling is critical to the proper management and operation of microirrigation systems since they are typically designed to apply small irrigation volumes frequently. Irrigation scheduling can be based on a soil water balance or by direct measurement of soil or plant water status. The...

82

Structuring a schedule - whether by Critical Path Method (CPM) or Precedence Charting System (PCS) - involves estimating the duration of one or more activities and arranging them in the most logical sequence. Given the start date, the completion date is relatively simple to determine. What, then, is so complicated about the process? It is complicated by the people involved - the people who make the schedules and the people who attempt to follow them. Schedules are an essential part of project management and construction contract administration. Much of the material available pertains to the mechanics of schedules, the types of logic networks, and the ways that data can be generated and presented. This paper sheds light on other facets of the subject - the statistical and philosophical fundamentals involved in scheduling.

Kleck, W

1982-04-01

83

In this book we use only special types of intervals and introduce the notion of different types of interval linear algebras and interval vector spaces, using intervals of the form [0, a] where a is from Zn, Z+ ∪ {0}, Q+ ∪ {0}, or R+ ∪ {0}. A systematic development is made, starting from set interval vector spaces to group interval vector spaces. Vector spaces are taken as interval polynomials or interval matrices or just intervals over suitable sets or semigroups or groups. The main feature of this book is that the authors have given over 350 examples. This book has six chapters. Chapter one is introductory in nature. Chapter two introduces the notion of set interval linear algebras of types one and two. Set fuzzy interval linear algebras and their properties are discussed in chapter three. Chapter four introduces several types of interval linear bialgebras and bivector spaces and studies them. Possible applications are given in chapter five. Chapter six suggests nearly 110 problems of all levels.

W. B. Vasantha Kandasamy; Florentin Smarandache

2010-12-08

84

Preemptive scheduling with position costs

Scheduling algorithm, preemption, primal-dual algorithm, dynamic ... have produced efficient lower bounds used in a branch-and-bound method [13] ... moving units of Ji* between intervals ... we define the marginal transfer cost mi of Ji.

85

Intervals in evolutionary algorithms for global optimization

Optimization is of central concern to a number of disciplines. Interval Arithmetic methods for global optimization provide us with (guaranteed) verified results. These methods are mainly restricted to the classes of objective functions that are twice differentiable and use a simple strategy of eliminating and splitting larger regions of the search space in the global optimization process. An efficient approach that combines the efficient strategy of Interval Global Optimization Methods with the robustness of Evolutionary Algorithms is proposed. In the proposed approach, search begins with randomly created interval vectors with interval widths equal to the whole domain. Before the beginning of the evolutionary process, the fitness of these interval parameter vectors is defined by evaluating the objective function at the center of the initial interval vectors. In the subsequent evolutionary process the local optimization process returns an estimate of the bounds of the objective function over the interval vectors. Though these bounds may not be correct at the beginning, due to large interval widths and complicated function properties, the process of reducing interval widths over time and a selection approach similar to simulated annealing help in estimating reasonably correct bounds as the population evolves. The interval parameter vectors at these estimated bounds (local optima) are then subjected to crossover and mutation operators. This evolutionary process continues for a predetermined number of generations in the search for the global optimum.

Patil, R.B.

1995-05-01

86

Food variety increases consumption and the rate of instrumental behavior that is reinforced by food in humans and animals. The present experiment investigated the relationship between the variety effect and habituation to food by testing the role of the interval between successive food presentations on responding in an operant food-seeking task. Habituation to food was expected at short, but not long, interfood intervals. The effects of variety on food's long-term reinforcing value were also tested. Four groups of rats were trained to lever-press on different random-interval (RI) schedules of reinforcement to earn 45-mg food pellets. Half the rats in each group received an unpredictable mix of grain and sucrose pellets, while the other half consistently received sucrose pellets. Response rate began at a high rate and then decreased within each 30-min session for groups that received short inter-pellet intervals (i.e., RI 3-s and RI 6-s reinforcement schedules) but not in groups that received longer inter-pellet intervals (i.e., RI 12-s and RI 24-s). A variety effect in the form of higher responding in the mix group than the sucrose-only group was also only evident at the shorter intervals. Habituation and variety effects were also most evident with the short intervals when we controlled for the number of reinforcers earned, suggesting that they were not merely due to rapid satiation. The variety effect also appeared quickly when groups trained with longer inter-pellet intervals (RI 12-s and RI 24-s) were transitioned to shorter intervals (RI 3-s and RI 6-s). There was no effect of variety on resistance to extinction or on resistance to the response-suppressing effects of pre-session feeding. The results more clearly link this version of the variety effect to the short-term effect of variety on food habituation. PMID:25261732
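A random-interval (RI) schedule of the kind used in this experiment can be sketched in discrete time: each second the schedule arms a reinforcer with a fixed probability, and the next response collects it, which is why shorter mean intervals yield more reinforcers at the same response rate. The response probability and session length below are illustrative assumptions, not the experiment's parameters:

```python
import random

def simulate_ri(mean_interval_s, p_response_per_s, session_s, seed=0):
    """Discrete-time sketch of a random-interval (RI) schedule: each second the
    schedule arms a reinforcer with probability 1/mean_interval_s; once armed,
    it stays available until a response collects it."""
    rng = random.Random(seed)
    armed = False
    earned = 0
    for _ in range(session_s):
        if not armed and rng.random() < 1.0 / mean_interval_s:
            armed = True                 # reinforcer set up by the schedule
        if armed and rng.random() < p_response_per_s:
            earned += 1                  # first response after arming is reinforced
            armed = False
    return earned

# RI 3-s vs RI 24-s over a 30-min (1800-s) session at the same response probability
print(simulate_ri(3, 0.5, 1800), simulate_ri(24, 0.5, 1800))
```

The constant arming probability is what makes reinforcer availability unpredictable moment to moment while holding the programmed mean interval fixed.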

Thrailkill, Eric A; Epstein, Leonard H; Bouton, Mark E

2015-01-01

87

Pro: Should we move to more frequent haemodialysis schedules?

Most end-stage renal disease patients on maintenance haemodialysis follow the typical schedule of three sessions per week, and thus remain outside dialysis for two short intervals (~2 days in duration) and for a longer interval (~3 days) at the end of each week. This pattern was historically enforced more due to calendar logistics and less due to factors related to health and disease. Therefore, it is long hypothesized that the intermittent nature of haemodialysis and the consequent shifts and fluctuations in volume status and metabolic parameters during the dialysis-free periods may predispose patients to several complications. Recent large-scale observational studies in haemodialysis patients link the first week-day (including the last hours of the long interval and the subsequent dialysis session) with increased risk of cardiovascular morbidity and mortality. Previous observational studies support that enhanced-frequency home haemodialysis is associated with reduced risk of all-cause mortality, while randomized studies suggest that short-daily or alternate-day in-centre haemodialysis offers improvements in left ventricular hypertrophy, blood pressure, phosphorous homeostasis and other intermediate end points when compared with conventional thrice-weekly in-centre haemodialysis. This article summarizes available evidence relating long inter-dialytic intervals with elevated cardiovascular risk, potential mechanisms for this association and the main benefits of more frequent dialytic modalities. PMID:25538158

Georgianos, Panagiotis I; Sarafidis, Pantelis A

2015-01-01

88

We study the dynamics of the European Air Transport Network by using a multiplex network formalism. We will consider the set of flights of each airline as an interdependent network and we analyze the resilience of the system against random flight failures in the passenger's rescheduling problem. A comparison between the single-plex approach and the corresponding multiplex one is presented illustrating that the multiplexity strongly affects the robustness of the European Air Network.

Cardillo, Alessio; Gómez-Gardeñes, Jesús; Romance, Miguel; del Amo, Alejandro J García; Boccaletti, Stefano

2012-01-01

89

Reducing barriers to timely MR imaging scheduling.

Scheduling a magnetic resonance (MR) imaging study at the authors' large health system in 2011 required considerable preparation before an appointment time was given to a patient. Difficulties in promptly scheduling appointments resulted from the varying time required for examinations, depending on the requested protocol, availability of appropriate MR imaging equipment, examination timing, prior insurance authorization verification, and proper patient screening. These factors contributed to a backlog of patients to schedule that regularly exceeded 300. A multidisciplinary process-improvement team was assembled to improve the turnaround time for scheduling an outpatient MR imaging examination (the interval between the time when the order was received and the time when the patient was informed about the MR imaging appointment). Process improvements targeted by the team included protocol turnaround time, schedule standardization, schedule intervals, examination timing, service standards, and scheduling redesign. Using lean methods and multiple plan-do-check-act cycles, the time to schedule an outpatient MR imaging examination improved from 117 hours to 33 hours, a 72% reduction, during the 9-month study period in 2011-2012. The number of patients in the scheduling queue was reduced by 90%. Overall MR imaging examinations within the specific patient population studied increased from 773 patient studies during the first month of intervention to 1444 studies the following month and averaged over 1279 patient studies per month throughout the study. PMID:25384302

Wessman, Brooke V; Moriarity, Andrew K; Ametlli, Vanda; Kastan, David J

2014-01-01

90

Probabilistic Loop Scheduling Considering Communication Overhead?

To improve productivity, the probabilistic nature of such tasks needs to be properly considered during scheduling: the timing of such tasks can be represented by random variables. A communication cost issue is also considered as another factor.

Feitelson, Dror

91

Influences on Cocaine Tolerance Assessed under a Multiple Conjunctive Schedule of Reinforcement

ERIC Educational Resources Information Center

Under multiple schedules of reinforcement, previous research has generally observed tolerance to the rate-decreasing effects of cocaine that has been dependent on schedule-parameter size in the context of fixed-ratio (FR) schedules, but not under the context of fixed-interval (FI) schedules of reinforcement. The current experiment examined the…

Yoon, Jin Ho; Branch, Marc N.

2009-01-01

92

Dynamic Slack Reclamation with Procrastination Scheduling in Real-Time Embedded Systems

Slack reclamation techniques are presented under procrastination scheduling to minimize the static and dynamic energy consumption, extending idle intervals through slack reclamation. While using the entire slack for either slowdown…

Gupta, Rajesh

93

Interval polynomial positivity

NASA Technical Reports Server (NTRS)

It is shown that a univariate interval polynomial is globally positive if and only if two extreme polynomials are globally positive. It is shown that the global positivity property of a bivariate interval polynomial is completely determined by four extreme bivariate polynomials. The cardinality of the determining set for k-variate interval polynomials is 2^k. One of many possible generalizations, where vertex implication for global positivity holds, is made by considering the parameter space to be the set dual of a boxed domain.
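The vertex idea can be illustrated numerically. At any fixed x, p(x) is linear in each coefficient, so its extreme values over a coefficient box are attained at vertex polynomials. The sketch below brute-forces all 2^k vertices and checks positivity on a sample grid; it is a crude illustration with hypothetical names, not the paper's sharp two- or four-vertex criterion (and grid sampling is not a proof of global positivity).

```python
from itertools import product

def vertex_polynomials(coeff_intervals):
    """All vertex polynomials: every choice of an endpoint per coefficient interval."""
    return [list(choice) for choice in product(*coeff_intervals)]

def positive_on_grid(coeffs, grid):
    """Crude check: p(x) > 0 at every sampled point (not a proof of positivity)."""
    return all(sum(c * x**i for i, c in enumerate(coeffs)) > 0 for x in grid)

grid = [i / 10 - 5 for i in range(101)]            # sample x in [-5, 5]
intervals = [(1.0, 2.0), (-0.5, 0.5), (1.0, 1.5)]  # [c0] + [c1] x + [c2] x^2
vertices = vertex_polynomials(intervals)
print(len(vertices))                               # 2**3 = 8 vertex polynomials
print(all(positive_on_grid(v, grid) for v in vertices))
```

If every vertex polynomial is positive at a grid point, every polynomial in the interval family is positive there, since the minimum over the coefficient box is attained at a vertex.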

Bose, N. K.; Kim, K. D.

1989-01-01

94

Uninstructed human responding: sensitivity to ratio and interval contingencies1

College students' presses on a telegraph key were occasionally reinforced by light onsets in the presence of which button presses (consummatory responses) produced points later exchangeable for money. One student's key presses were reinforced according to a variable-ratio schedule; key presses of another student in a separate room were reinforced according to a variable-interval schedule yoked to the interreinforcement intervals produced by the first student. Instructions described the operation of the reinforcement button, but did not mention the telegraph key; instead, key pressing was established by shaping. Performances were comparable to those of infrahuman organisms: variable-ratio key-pressing rates were higher than yoked variable-interval rates. With some yoked pairs, schedule effects occurred so rapidly that rate reversals produced by schedule reversals were demonstrable within one session. But sensitivity to these contingencies was not reliably obtained with other pairs for whom an experimenter demonstrated key pressing or for whom the reinforcer included automatic point deliveries instead of points produced by button presses. A second experiment with uninstructed responding demonstrated sensitivity to fixed-interval contingencies. These findings clarify prior failures to demonstrate human sensitivity to schedule contingencies: human responding is maximally sensitive to these contingencies when instructions are minimized and the reinforcer requires a consummatory response. PMID:16812006

Matthews, Byron A.; Shimoff, Eliot; Catania, A. Charles; Sagvolden, Terje

1977-01-01

95

Analysis is presented of a prospective randomized study involving 365 patients with histologically proven unresectable non-oat-cell carcinoma of the lung treated with definitive radiotherapy. The patients were randomized to one of four treatment regimens: 4000 rad split course, or 4000, 5000, or 6000-rad continuous courses in five fractions per week. Ninety to 100 patients were accessioned to each group. The one-year survival rate is 50% and the two-year survival rate, 25%. The patients treated with the split course have the lowest survival rate in comparison with the other groups. The complete and partial local regression of tumor was 49% in patients treated with 4000 rad and 55% in the groups treated with 5000 and 6000 rad. For patients who achieved complete regression of the tumor following irradiation, the two-year survival rate is 40%, in contrast to 20% for those with partial regression, and no survivors among the patients with stable or progressive disease. The incidence of intrathoracic recurrence was 33% for patients treated with 6000 rad, 39% for those receiving 5000 rad, and 44 to 49% for those treated with a 4000-rad split or continuous course. At present, the data strongly suggest that patients treated with 5000 or 6000 rad have a better response, tumor control, and survival rate than those receiving lower doses. Patients with high performance status or with tumors in earlier stages have a two-year survival rate of approx. 40%, in comparison with 20% for other patients. The various irradiation regimens have been well tolerated, with complications being slightly higher in the 4000-rad split course group and in the 6000-rad continuous course group. The most frequent complications have been pneumonitis, pulmonary fibrosis, and dysphagia due to transient esophagitis. Further investigation will be necessary before the optimal management of patients with bronchogenic carcinoma by irradiation is established.

Perez, C.A.; Stanley, K.; Rubin, P.; Kramer, S.; Brady, L.; Perez-Tamayo, R.; Brown, G.S.; Concannon, J.; Rotman, M.; Seydel, H.G.

1980-06-01

96

Traditional neural networks like multi-layered perceptrons (MLP) use example patterns, i.e., pairs of real-valued observation vectors (x, y), to approximate a function f(x) = y. To determine the parameters of the approximation, a special version of the gradient descent method called back-propagation is widely used. In many situations, observations of the input and output variables are not precise; instead, we usually have intervals of possible values. The imprecision could be due to the limited accuracy of the measuring instrument or could reflect genuine uncertainty in the observed variables. In such situations, input and output data consist of mixed data types: intervals and precise numbers. Function approximation in interval domains is considered in this paper. We discuss a modification of the classical back-propagation learning algorithm to interval domains. Results are presented with simple examples demonstrating a few properties of nonlinear interval mapping, such as noise resistance and finding sets of solutions to the function approximation problem.

Patil, R.B.

1995-05-01

97

NSDL National Science Digital Library

This resource consists of a Java applet and descriptive text. The applet illustrates a random walk on a discrete time interval from 0 to n. The time n and the probability of a step to the right can be varied. The random variables of interest are the final position, the maximum position, and the time of the last zero.
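The applet's three random variables are easy to reproduce in a short simulation. A minimal Python sketch (the ±1 step model, names, and parameters are illustrative assumptions about the applet's behavior):

```python
import random

def walk_stats(n, p, rng):
    """One simple random walk of n +/-1 steps starting at 0.
    Returns (final position, maximum position, time of the last zero)."""
    pos, max_pos, last_zero = 0, 0, 0
    for t in range(1, n + 1):
        pos += 1 if rng.random() < p else -1
        max_pos = max(max_pos, pos)
        if pos == 0:
            last_zero = t
    return pos, max_pos, last_zero

rng = random.Random(42)
samples = [walk_stats(100, 0.5, rng) for _ in range(1000)]
mean_final = sum(s[0] for s in samples) / len(samples)
print(round(mean_final, 2))  # near 0 when p = 0.5
```

Varying `p` away from 0.5 shifts the distribution of the final position, which is exactly the experiment the applet invites.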

Siegrist, Kyle

98

Sampling SIM: Confidence Intervals

NSDL National Science Digital Library

Created by authors Beth Chance, Bob delmas and Joan Garfield, this web page provides instructions for teaching confidence intervals using Sampling SIM software. It includes information regarding prerequisite knowledge, common misconceptions, and objectives, as well as links to an activity and a pre/post-test. This resource will provide a nice assessment of the skills obtained during this lesson.

Chance, Beth; Delmas, Bob; Garfield, Joan

2009-03-16

99

Overconfidence in Interval Estimates

ERIC Educational Resources Information Center

Judges were asked to make numerical estimates (e.g., "In what year was the first flight of a hot air balloon?"). Judges provided high and low estimates such that they were X% sure that the correct answer lay between them. They exhibited substantial overconfidence: The correct answer fell inside their intervals much less than X% of the time. This…

Soll, Jack B.; Klayman, Joshua

2004-01-01

100

Support vector interval regression networks for interval regression analysis

In this paper, the support vector interval regression networks (SVIRNs) are proposed for interval regression analysis. The SVIRNs consist of two radial basis function networks. One network identifies the upper side of data intervals, and the other network identifies the lower side of data intervals. Because the support vector regression (SVR) approach is equivalent to solving a linear constrained…

Jin-tsong Jeng; Chen-chia Chuang; Shun-feng Su

2003-01-01

101

Ant colony system for job shop scheduling with time windows

Scheduling for a job shop production system is an integral aspect of production management. Scheduling operations must minimize stock, waste, and idle time and ensure on-time delivery of goods in a time window problem. In this study, due date is considered as an interval instead of a time point. This study addresses scheduling with a time window of job shop…

Rong-Hwa Huang; Chang-Lin Yang

2008-01-01

102

Childhood Immunization Schedule

Instant Childhood Immunization Schedule. Based on the Immunization Schedule for Children 0 through 6 Years of Age; see Disclaimer for additional details.

103

A molar theory of reinforcement schedules

Behavior of subjects exposed to concurrent and individual interval and ratio schedules of reinforcement may be described in terms of a set of expressions relating the value of responses to their durations, a feedback equation relating reinforcement to response duration, and the assumption that subjects allocate their time among various responses so as to maximize value. PMID:16812114

Rachlin, Howard

1978-01-01

104

Iterative refinement scheduling

NASA Technical Reports Server (NTRS)

We present a heuristics-based approach to deep space mission scheduling which is modeled on the approach used by expert human schedulers in producing schedules for planetary encounters. New chronological evaluation techniques are used to focus the search by using information gained during the scheduling process to locate, classify, and resolve regions of conflict. Our approach is based on the assumption that during the construction of a schedule there exist several disjunct temporal regions where the demand for one resource type or a single temporal constraint dominates (bottleneck regions). If the scheduler can identify these regions and classify them based on their dominant constraint, then the scheduler can select the scheduling heuristic.

Biefeld, Eric

1992-01-01

105

Clairvoyant scheduling of random walks Peter Gacs

[Figure 1: The clairvoyant demon problem. X and Y are "tokens" performing independent random walks on the same graph, here the complete graph K5; a "demon" decides at every step which of the two tokens moves (X: GO, Y: WAIT).] The clairvoyant demon problem arose first in distributed computing. The original problem was to find a leader…

Gacs, Peter

106

Models of Ratio Schedule Performance

Predictions of P. R. Killeen's (1994) mathematical principles of reinforcement were tested for responding on ratio reinforcement schedules. The type of response key, the number of sessions per condition, and first vs. second half of a session had negligible effects on responding. Longer reinforcer durations and larger grain types engendered more responding, affecting primarily the parameter a (specific activation). Key pecking was faster than treadle pressing, affecting primarily the parameter δ (response time). Longer intertrial intervals led to higher overall response rates, shorter postreinforcement pauses, and higher running rates, and ruled out some competing explanations. The treadle data required a distinction between the energetic requirements and rate-limiting properties of extended responses. The theory was extended to predict pause durations and run rates on ratio schedules. PMID:9206029

Bizo, Lewis A.; Killeen, Peter R.

2008-01-01

107

Shaping of interval timing behavior by the peak procedure and response temporal distribution

Two experiments examined interval timing behavior in mice and rats. In the first experiment, subjects were mice, and in the second one they were rats. The animals were trained using the peak procedure, a variant of a discrete-trial fixed interval (DT-FI) schedule. On food trials, the first lever press response more than 30 s after tone…

Yuuki Nishimura; Erika Takahashi; Shogo Sakata

2003-01-01

108

Protocols for distributive scheduling

NASA Technical Reports Server (NTRS)

The increasing complexity of space operations and the inclusion of interorganizational and international groups in the planning and control of space missions lead to requirements for greater communication, coordination, and cooperation among mission schedulers. These schedulers must jointly allocate scarce shared resources among the various operational and mission oriented activities while adhering to all constraints. This scheduling environment is complicated by such factors as the presence of varying perspectives and conflicting objectives among the schedulers, the need for different schedulers to work in parallel, and limited communication among schedulers. Smooth interaction among schedulers requires the use of protocols that govern such issues as resource sharing, authority to update the schedule, and communication of updates. This paper addresses the development and characteristics of such protocols and their use in a distributed scheduling environment that incorporates computer-aided scheduling tools. An example problem is drawn from the domain of space shuttle mission planning.

Richards, Stephen F.; Fox, Barry

1993-01-01

109

NASA Technical Reports Server (NTRS)

A mathematical model for job scheduling in a specified context is presented. The model uses both linear programming and combinatorial methods. While designed with a view toward optimization of scheduling of facility and plant operations at the Deep Space Communications Complex, the context is sufficiently general to be widely applicable. The general scheduling problem including options for scheduling objectives is discussed and fundamental parameters identified. Mathematical algorithms for partitioning problems germane to scheduling are presented.

Phillips, K.

1976-01-01

110

On-line schedulers for pinwheel tasks using the time-driven approach

Pinwheel scheduling algorithms can be used to produce distance-constrained real-time system schedules where the temporal distance between any two consecutive completions of a task must be less than a pre-defined time interval. A pinwheel schedule can be generated off-line and executed cyclically. Such an approach provides a good predictability and allows for off-line schedule optimization. However, the static approach is
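The distance constraint itself is easy to state in code: in the cyclically repeated frame, consecutive completions of a task must never be further apart than its bound. A small illustrative checker (the slot-based model and names are assumptions for the sketch, not the paper's scheduler):

```python
def satisfies_distance_constraints(frame, constraints, slot_len=1):
    """Check that, when the frame repeats cyclically, the temporal distance
    between consecutive completions of each task is at most its bound.

    frame       : list of task names, one per time slot
    constraints : {task: max allowed distance, in time units}
    """
    doubled = frame + frame  # repeating the frame captures wrap-around gaps
    for task, bound in constraints.items():
        slots = [i for i, t in enumerate(doubled) if t == task]
        if not slots:
            return False     # a task that never runs violates any distance bound
        gaps = [b - a for a, b in zip(slots, slots[1:])]
        if any(g * slot_len > bound for g in gaps):
            return False
    return True

# Tasks A (distance <= 2) and B (distance <= 4) in a repeating 4-slot frame.
print(satisfies_distance_constraints(["A", "B", "A", "B"], {"A": 2, "B": 4}))  # True
print(satisfies_distance_constraints(["A", "A", "B", "B"], {"A": 2, "B": 4}))  # False
```

An off-line pinwheel scheduler searches for a frame that passes this check; the paper's time-driven approach builds such schedules on-line instead.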

Chih-wen Hsueh; Kwei-jay Lin

1998-01-01

111

Lottery and Stride Scheduling: Flexible Proportional-Share Resource Management

This thesis presents flexible abstractions for specifying resource management policies, together with efficient mechanisms for implementing those abstractions. Several novel scheduling techniques are introduced, including both randomized and deterministic algorithms that provide proportional-share control over resource consumption rates. Such control is beyond the capabilities of conventional schedulers, and is desirable across a broad spectrum of systems that service clients of varying...
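The randomized proportional-share idea (lottery scheduling) can be sketched in a few lines: each client holds tickets, and each quantum a winning ticket is drawn uniformly at random. This is a minimal illustration of the mechanism, not the thesis's implementation:

```python
import random

def lottery_schedule(tickets, quanta, rng):
    """Pick a winner per quantum with probability proportional to ticket count."""
    clients = list(tickets)
    counts = [tickets[c] for c in clients]
    total = sum(counts)
    usage = {c: 0 for c in clients}
    for _ in range(quanta):
        draw = rng.randrange(total)        # winning ticket number
        for c, n in zip(clients, counts):  # find the ticket's owner
            if draw < n:
                usage[c] += 1
                break
            draw -= n
    return usage

rng = random.Random(0)
usage = lottery_schedule({"A": 3, "B": 1}, 10_000, rng)
print(usage["A"] / (usage["A"] + usage["B"]))   # close to 0.75
```

With a 3:1 ticket ratio, client A receives close to 75% of quanta in expectation; stride scheduling achieves the same shares deterministically.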

Carl A. Waldspurger

1995-01-01

112

Federal Register 2010, 2011, 2012, 2013

...Collection; Comment Request for Form 1065, Schedule C, Schedule D, Schedule K-1, Schedule L, Schedule M-1, Schedule M-2, and Schedule M-3 AGENCY: Internal Revenue Service (IRS), Treasury. ACTION: Notice and request for...

2010-07-22

113

A Synthesized Heuristic Task Scheduling Algorithm

Aiming at static task scheduling problems in heterogeneous environments, a heuristic task scheduling algorithm named HCPPEFT is proposed. In the task prioritizing phase, the algorithm uses three levels of priority to choose tasks: critical tasks have the highest priority, then tasks with a longer path to the exit task are selected, and finally tasks with fewer predecessors are scheduled. In the resource selection phase, task duplication is used to reduce the inter-resource communication cost; in addition, forecasting the impact of an assignment on all children of the current task permits better decisions in selecting resources. The proposed algorithm is compared with the STDH, PEFT, and HEFT algorithms on randomly generated graphs and sets of task graphs. The experimental results show that the new algorithm achieves better scheduling performance. PMID:25254244

Dai, Yanyan; Zhang, Xiangli

2014-01-01

114

Green's relationship, A_SI = P(C)_2I, which equates the area, A_SI, under the receiver operating characteristic (ROC) curve in the single-interval forced-choice (SIFC) task with the proportion correct, P(C)_2I, in the two-interval forced-choice (2IFC) task, is rederived using the cross-correlation functions of the SIFC evidence distributions. The relationship is generalized to include discrete random variables and unidimensional decision axes that do not need to…
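The equality A_SI = P(C)_2I can be checked by simulation in the equal-variance Gaussian case: the Mann-Whitney estimate of the ROC area and the 2IFC proportion correct under a "pick the larger evidence" rule estimate the same probability. A sketch under those assumptions (the distributions and parameters are illustrative, not from the paper):

```python
import random

rng = random.Random(7)
n = 20_000
noise  = [rng.gauss(0.0, 1.0) for _ in range(n)]   # evidence on noise trials
signal = [rng.gauss(1.0, 1.0) for _ in range(n)]   # evidence on signal trials

# Empirical SIFC ROC area via the Mann-Whitney view of the AUC: the
# probability that a random signal observation exceeds a random noise one.
pairs = 100_000
auc = sum(signal[rng.randrange(n)] > noise[rng.randrange(n)]
          for _ in range(pairs)) / pairs

# 2IFC proportion correct: each trial presents one noise and one signal
# interval; the observer picks the interval with the larger evidence value
# and is correct when the signal interval wins.
trials = 100_000
pc_2ifc = sum(rng.gauss(1.0, 1.0) > rng.gauss(0.0, 1.0)
              for _ in range(trials)) / trials

print(round(auc, 2), round(pc_2ifc, 2))   # the two estimates agree closely
```

Both estimates converge to P(signal evidence > noise evidence), which is exactly what Green's relationship asserts.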

Judi A. Lapsley Miller; Brian K. Scurfield; Vit Drga; Susan J. Galvin; John Whitmore

2002-01-01

115

Dedicated heterogeneous node scheduling including backfill scheduling

A method and system for backfill scheduling of jobs on dedicated heterogeneous nodes in a multi-node computing environment. Heterogeneous nodes are grouped into homogeneous node sub-pools. For each sub-pool, a free node schedule (FNS) is created to chart the free nodes over time. For each prioritized job, the FNS of the sub-pools having nodes usable by that job is used to determine the earliest time range (ETR) capable of running the job. Once determined for a particular job, the job is scheduled to run in that ETR. If the ETR determined for a lower priority job (LPJ) has a start time earlier than that of a higher priority job (HPJ), the LPJ is scheduled in that ETR only if it would not disturb the anticipated start times of any HPJ previously scheduled for a future time. Thus, efficient utilization and throughput of such computing environments may be increased by utilizing resources that would otherwise remain idle.
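The ETR search and the backfill condition can be sketched against a single reservation. This toy version assumes one homogeneous pool, unit time slots, and one reserved HPJ start; the patent text handles many sub-pools and multiple HPJ reservations, so all names here are illustrative:

```python
def earliest_time_range(fns, nodes_needed, duration):
    """Earliest start slot in the free-node schedule with enough nodes
    for the whole duration; fns[t] = free nodes in slot t."""
    for start in range(len(fns) - duration + 1):
        if all(fns[t] >= nodes_needed for t in range(start, start + duration)):
            return start
    return None

def try_backfill(fns, job, reserved_start):
    """Schedule a lower-priority job early only if it finishes before the
    higher-priority job's reserved start, so that start is not disturbed."""
    nodes, duration = job
    start = earliest_time_range(fns, nodes, duration)
    if start is None or start + duration > reserved_start:
        return None
    for t in range(start, start + duration):
        fns[t] -= nodes          # consume the free nodes for the job's slots
    return start

fns = [4, 4, 1, 1, 4, 4, 4, 4]   # free nodes per slot; HPJ reserved at slot 4
print(try_backfill(fns, (2, 2), reserved_start=4))  # → 0: fits before the HPJ
print(try_backfill(fns, (2, 3), reserved_start=4))  # → None: would overrun slot 4
```

The second job is rejected because its earliest feasible window would push past the reserved start, which is precisely the "do not disturb the HPJ" test.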

Wood, Robert R. (Livermore, CA); Eckert, Philip D. (Livermore, CA); Hommes, Gregg (Pleasanton, CA)

2006-07-25

116

Reinforcement learning in scheduling

NASA Technical Reports Server (NTRS)

The goal of this research is to apply reinforcement learning methods to real-world problems like scheduling. In this preliminary paper, we show that learning to solve scheduling problems such as the Space Shuttle Payload Processing and the Automatic Guided Vehicle (AGV) scheduling can be usefully studied in the reinforcement learning framework. We discuss some of the special challenges posed by the scheduling domain to these methods and propose some possible solutions we plan to implement.

Dietterich, Tom G.; Ok, Dokyeong; Zhang, Wei; Tadepalli, Prasad

1994-01-01

117

Scheduling Nonconsumable Resources

NASA Technical Reports Server (NTRS)

The user's manual describes the computer program SWITCH, which schedules the use of resources by appliances that are switched on and off and consume resources while on. It plans schedules according to predetermined goals and revises them when new goals are imposed. The program works by depth-first search with strict chronological backtracking, evaluating alternatives as necessary and sometimes interacting with the user.

Porta, Harry J.

1990-01-01

118

Alternative congestion pricing schedules

It is important that alternative congestion pricing schedules are compared with respect to a broad spectrum of their effects. This paper uses an equilibrium simulation model of peak-period commuting along an urban highway to compare quantitatively both positive and normative effects of six congestion pricing schedules. Three of these schedules have been investigated qualitatively in the literature; the others are

Xuehao Chu

1999-01-01

119

course schedule grading scheme

60-538: Information Retrieval (September 4, 2014). Outline: (1) what is IR, (2) course schedule, (3) grading scheme. Topics include news search, image and video search, (micro-)blog search, product search, federated search, and social search.

Lu, Jianguo

120

Scheduling a set of tasks on a set of machines so as to yield an efficient schedule is a basic problem in computer science and operations research. Most of the research on this problem incorporates the potentially unrealistic assumption that communication between the different machines is instantaneous. In this paper we remove this assumption and study the problem of network scheduling, where each job…

Phillips, Cynthia (Sandia, NM); Stein, Clifford

1994-01-01

121

Explorations in statistics: confidence intervals

NSDL National Science Digital Library

Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This third installment of Explorations in Statistics investigates confidence intervals. A confidence interval is a range that we expect, with some level of confidence, to include the true value of a population parameter such as the mean. A confidence interval provides the same statistical information as the P value from a hypothesis test, but it circumvents the drawbacks of that hypothesis test. Even more important, a confidence interval focuses our attention on the scientific importance of some experimental result.
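The coverage interpretation of a confidence interval is easy to explore by simulation: repeatedly sample, build the interval, and count how often it contains the true mean. A minimal sketch using the normal approximation (the sample sizes and parameters are arbitrary choices, not from the article):

```python
import random, math

def mean_ci(sample, z=1.96):
    """Approximate 95% confidence interval for the mean (normal approximation)."""
    n = len(sample)
    m = sum(sample) / n
    var = sum((x - m) ** 2 for x in sample) / (n - 1)   # sample variance
    half = z * math.sqrt(var / n)
    return m - half, m + half

rng = random.Random(3)
true_mean, covered, runs = 10.0, 0, 2000
for _ in range(runs):
    sample = [rng.gauss(true_mean, 2.0) for _ in range(50)]
    lo, hi = mean_ci(sample)
    covered += lo <= true_mean <= hi
print(covered / runs)   # close to the nominal 0.95
```

Each individual interval either does or does not contain the true mean; "95% confidence" is a statement about the long-run fraction that do, which the simulation makes concrete.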

Douglas Curran-Everett (University of Colorado Denver)

2009-06-01

122

Modal Intervals Revisited Part 1: A Generalized Interval Natural Extension

AE-extensions of the elementary operations are first built, leading to a generalized interval arithmetic. This arithmetic is proved to coincide with the well-known Kaucher arithmetic. Then the natural AE-extensions are constructed. In particular, a new mean-value extension to generalized intervals…

Université Paris-Sud XI

123

Effect Sizes, Confidence Intervals, and Confidence Intervals for Effect Sizes

ERIC Educational Resources Information Center

The present article provides a primer on (a) effect sizes, (b) confidence intervals, and (c) confidence intervals for effect sizes. Additionally, various admonitions for reformed statistical practice are presented. For example, a very important implication of the realization that there are dozens of effect size statistics is that "authors must…

Thompson, Bruce

2007-01-01

124

A Dynamic Scheduling Method of Earth-Observing Satellites by Employing Rolling Horizon Strategy

Focused on the dynamic scheduling problem for earth-observing satellites (EOS), an integer programming model is constructed after analyzing the main constraints. The rolling horizon (RH) strategy is proposed according to the independent arriving time and deadline of the imaging tasks. This strategy is designed with a mixed triggering mode composed of periodical triggering and event triggering, and the scheduling horizon is decomposed into a series of static scheduling intervals. By optimizing the scheduling schemes in each interval, the dynamic scheduling of EOS is realized. We also propose three dynamic scheduling algorithms by the combination of the RH strategy and various heuristic algorithms. Finally, the scheduling results of different algorithms are compared and the presented methods in this paper are demonstrated to be efficient by extensive experiments. PMID:23690742

Dishan, Qiu; Chuan, He; Jin, Liu; Manhao, Ma

2013-01-01

125

Online and offline preemptive two-machine job shop scheduling

Online and offline preemptive two-machine job shop scheduling. Tracy Kimbrel, IBM T.J. Watson. Online and offline algorithms for special cases of preemptive job shop scheduling to minimize makespan are asymptotically tight. Keywords: online algorithms, job shop scheduling, preemption, randomized algorithms.

Saia, Jared

126

A distributed scheduling algorithm for heterogeneous real-time systems

NASA Technical Reports Server (NTRS)

Much of the previous work on load balancing and scheduling in distributed environments was concerned with homogeneous systems and homogeneous loads. Several of the results indicated that random policies are as effective as other more complex load allocation policies. The effects of heterogeneity on scheduling algorithms for hard real time systems is examined. A distributed scheduler specifically to handle heterogeneities in both nodes and node traffic is proposed. The performance of the algorithm is measured in terms of the percentage of jobs discarded. While a random task allocation is very sensitive to heterogeneities, the algorithm is shown to be robust to such non-uniformities in system components and load.

Zeineldine, Osman; El-Toweissy, Mohamed; Mukkamala, Ravi

1991-01-01

127

Mele et al. (1986) reported that exposure to Aroclor 1248 (A1248) in rhesus monkeys produced an increased rate of responding under a fixed-interval (FI) schedule of reinforcement in which 25% of the scheduled reinforcers were omitted. The purpose of this work was to determine whether...

128

NASA Technical Reports Server (NTRS)

The DSN (Deep Space Network) Scheduling Engine targets all space missions that use DSN services. It allows clients to issue scheduling, conflict identification, conflict resolution, and status requests in XML over a Java Message Service interface. The scheduling requests may include new requirements that represent a set of tracks to be scheduled under some constraints. This program uses a heuristic local search to schedule a variety of schedule requirements, and is being infused into the Service Scheduling Assembly, a mixed-initiative scheduling application. The engine resolves conflicting schedules of resource allocation according to a range of existing and possible requirement specifications, including optional antennas; start of track and track duration ranges; periodic tracks; locks on track start, duration, and allocated antenna; MSPA (multiple spacecraft per aperture); arraying/VLBI (very long baseline interferometry)/delta DOR (differential one-way ranging); continuous tracks; segmented tracks; gap-to-track ratio; and override or block-out of requirements. The scheduling models now include conflict identification for SOA (start of activity), BOT (beginning of track), RFI (radio frequency interference), and equipment constraints. This software will search through all possible allocations while providing a best-effort solution at any time. The engine reschedules to accommodate individual emergency tracks in 0.2 second, and emergency antenna downtime in 0.2 second. The software handles doubling of one mission's track requests over one week (to 42 total) in 2.7 seconds. Further tests will be performed in the context of actual schedules.

Clement, Bradley; Johnston, Mark; Wax, Allan; Chouinard, Caroline

2008-01-01

129

A Comparison of Techniques for Scheduling Earth-Observing Satellites

NASA Technical Reports Server (NTRS)

Scheduling observations by coordinated fleets of Earth Observing Satellites (EOS) involves large search spaces, complex constraints and poorly understood bottlenecks, conditions where evolutionary and related algorithms are often effective. However, there are many such algorithms and the best one to use is not clear. Here we compare multiple variants of the genetic algorithm: stochastic hill climbing, simulated annealing, squeaky wheel optimization and iterated sampling on ten realistically-sized EOS scheduling problems. Schedules are represented by a permutation (non-temporal ordering) of the observation requests. A simple deterministic scheduler assigns times and resources to each observation request in the order indicated by the permutation, discarding those that violate the constraints created by previously scheduled observations. Simulated annealing performed best, and random mutation outperformed a more 'intelligent' mutator. Furthermore, the best mutator, by a small margin, was a novel approach we call temperature-dependent random sampling, which makes large changes in the early stages of evolution and smaller changes towards the end of search.
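The permutation representation plus deterministic scheduler can be sketched compactly. The toy below uses single-resource time windows and a swap-mutation simulated annealer; it is a simplified stand-in for the paper's EOS model, with invented names, request data, and annealing parameters:

```python
import random, math

def greedy_schedule(order, requests):
    """Assign each request its window in permutation order; drop any request
    that overlaps an already scheduled window (the deterministic scheduler)."""
    taken, scheduled = [], []
    for i in order:
        start, end = requests[i]
        if all(end <= s or start >= e for s, e in taken):
            taken.append((start, end))
            scheduled.append(i)
    return scheduled

def anneal(requests, steps=2000, t0=2.0, rng=random.Random(0)):
    """Simulated annealing over permutations with a random swap mutation,
    maximizing the number of requests the greedy scheduler can place."""
    n = len(requests)
    perm = list(range(n))
    best = score = len(greedy_schedule(perm, requests))
    best_perm = perm[:]
    for k in range(steps):
        temp = t0 * (1 - k / steps) + 1e-9        # linear cooling schedule
        i, j = rng.randrange(n), rng.randrange(n)
        cand = perm[:]
        cand[i], cand[j] = cand[j], cand[i]
        s = len(greedy_schedule(cand, requests))
        if s >= score or rng.random() < math.exp((s - score) / temp):
            perm, score = cand, s
            if s > best:
                best, best_perm = s, cand[:]
    return best, best_perm

rng = random.Random(5)
requests = [(a, a + rng.randrange(1, 5))
            for a in (rng.randrange(0, 20) for _ in range(15))]
print(anneal(requests)[0])
```

Because the scheduler is deterministic, all of the search effort goes into the ordering, which is exactly the design the paper compares across hill climbing, annealing, and the other variants.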

Globus, Al; Crawford, James; Lohn, Jason; Pryor, Anna

2004-01-01

130

Explorations in Statistics: Confidence Intervals

ERIC Educational Resources Information Center

Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This third installment of "Explorations in Statistics" investigates confidence intervals. A confidence interval is a range that we expect, with some level of confidence, to include the true value of a population parameter…

Curran-Everett, Douglas

2009-01-01

131

Teaching Confidence Intervals Using Simulation

ERIC Educational Resources Information Center

Confidence intervals are difficult to teach, in part because most students appear to believe they understand how to interpret them intuitively. They rarely do. To help them abandon their misconception and achieve understanding, we have developed a simulation tool that encourages experimentation with multiple confidence intervals derived from the…

Hagtvedt, Reidar; Jones, Gregory Todd; Jones, Kari

2008-01-01

132

Automatic Error Analysis Using Intervals

ERIC Educational Resources Information Center

A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
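Interval arithmetic of this kind is straightforward to implement: each operation returns an interval that encloses every possible result of the operands. A minimal sketch of the idea (a toy class, not the INTLAB toolbox referenced in the article):

```python
class Interval:
    """Closed interval [lo, hi] with the basic arithmetic used in automatic
    error analysis: each result encloses every value the operands can take."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def __mul__(self, other):
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))
    def __repr__(self):
        return f"[{self.lo:g}, {self.hi:g}]"

# Measurements x = 2.0 +/- 0.1 and y = 3.0 +/- 0.2, pushed through x*y + x.
x, y = Interval(1.9, 2.1), Interval(2.8, 3.2)
print(x * y + x)   # → [7.22, 8.82], enclosing the true value for any x, y in range
```

Note that naive interval evaluation can overestimate: `x - x` yields [-0.2, 0.2] rather than [0, 0] because the two operands are treated as independent (the classic dependency problem), which is one reason the article compares interval results against standard error propagation.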

Rothwell, E. J.; Cloud, M. J.

2012-01-01

133

Integrated resource scheduling in a distributed scheduling environment

NASA Technical Reports Server (NTRS)

The Space Station era presents a highly-complex multi-mission planning and scheduling environment exercised over a highly distributed system. In order to automate the scheduling process, customers require a mechanism for communicating their scheduling requirements to NASA. A request language that a remotely-located customer can use to specify his scheduling requirements to a NASA scheduler, thus automating the customer-scheduler interface, is described. This notation, Flexible Envelope-Request Notation (FERN), allows the user to completely specify his scheduling requirements such as resource usage, temporal constraints, and scheduling preferences and options. The FERN also contains mechanisms for representing schedule and resource availability information, which are used in the inter-scheduler inconsistency resolution process. Additionally, a scheduler is described that can accept these requests, process them, generate schedules, and return schedule and resource availability information to the requester. The Request-Oriented Scheduling Engine (ROSE) was designed to function either as an independent scheduler or as a scheduling element in a network of schedulers. When used in a network of schedulers, each ROSE communicates schedule and resource usage information to other schedulers via the FERN notation, enabling inconsistencies to be resolved between schedulers. Individual ROSE schedules are created by viewing the problem as a constraint satisfaction problem with a heuristically guided search strategy.

Zoch, David; Hall, Gardiner

1988-01-01

134

Introduction Nonattendance at scheduled outpatient appointments for primary care is a major health care problem worldwide. Our aim was to estimate the prevalence of nonattendance at scheduled appointments for outpatients seeking primary care, to identify associated factors and build a model that predicts nonattendance at scheduled appointments. Methods A cohort study of adult patients, who had a scheduled outpatient appointment for primary care, was conducted between January 2010 and July 2011, at the Italian Hospital of Buenos Aires. We evaluated the history and characteristics of these patients, and their scheduling and attendance at appointments. Patients were divided into two groups: those who attended their scheduled appointments, and those who did not. We estimated the odds ratios (OR) and corresponding 95% confidence intervals (95% CI), and generated a predictive model for nonattendance, with logistic regression, using factors associated with lack of attendance, and those considered clinically relevant. Alternative models were compared using Akaike’s Information Criterion. A generation cohort and a validation cohort were assigned randomly. Results Of 113,716 appointments included in the study, 25,687 were missed (22.7%; 95% CI: 22.34%–22.83%). We found a statistically significant association between nonattendance and age (OR: 0.99; 95% CI: 0.99–0.99), number of issues in the personal health record (OR: 0.98; 95% CI: 0.98–0.99), time between the request for and date of appointment (OR: 1; 95% CI: 1–1), history of nonattendance (OR: 1.07; 95% CI: 1.07–1.07), appointment scheduled later than 4 pm (OR: 1.30; 95% CI: 1.24–1.35), and specific days of the week (OR: 1.00; 95% CI: 1.06–1.1). The predictive model for nonattendance included characteristics of the patient requesting the appointment, the appointment request, and the actual appointment date. 
The area under the receiver operating characteristic curve of the predictive model in the generation cohort was 0.892 (95% CI: 0.890–0.894). Conclusion Evidence related to patient characteristics, and the identification of appointments with a higher likelihood of nonattendance, should promote guided strategies to reduce the rate of nonattendance, as well as to future research on this topic. The use of predictive models could further guide management strategies to reduce the rate of nonattendance. PMID:24235820
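For a single binary factor, odds ratios with Wald-type 95% confidence intervals of the kind reported above can be computed directly from a 2x2 table. A hedged sketch (the study itself used multivariable logistic regression; the counts below are invented for illustration):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed & missed, b = exposed & attended,
    c = unexposed & missed, d = unexposed & attended."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

The interval is built on the log scale, where the sampling distribution of the estimate is approximately normal, and then exponentiated back.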

Giunta, Diego; Briatore, Agustina; Baum, Analía; Luna, Daniel; Waisman, Gabriel; de Quiros, Fernán Gonzalez Bernaldo

2013-01-01

135

Confidence Interval for the Estimated Mean of a Population

NSDL National Science Digital Library

Given a sample of N values of X randomly drawn from a normally distributed population, this program will calculate the .95 and .99 confidence intervals (CI) for the estimated mean of the population. Data entry is simple; author Richard Lowry has allowed users to import data directly from a spreadsheet. This is a great resource for easy statistical calculations.
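A minimal sketch of the same calculation, using the normal approximation (Lowry's calculator presumably uses the t distribution, which widens the interval slightly for small N):

```python
import statistics

def mean_ci(data, level=0.95):
    """Normal-approximation confidence interval for the population mean."""
    n = len(data)
    m = statistics.mean(data)
    sem = statistics.stdev(data) / n ** 0.5            # standard error of mean
    z = statistics.NormalDist().inv_cdf(0.5 + level / 2)
    return m - z * sem, m + z * sem
```

Calling `mean_ci(data, 0.99)` widens the interval relative to the default 0.95, since a higher confidence level demands a larger critical value.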

Lowry, Richard

2009-01-14

136

Preemptive scheduling with release times, deadlines, and due times

Given n jobs, each of which has a release time, a deadline, and a processing requirement, the problem of determining whether there exists a preemptive schedule on m uniform machines which completes each job in the time interval between its release time and its deadline is examined. An O(m^2 n^4 + n^5) algorithm is presented which uses a generalisation of network flow techniques to construct such a schedule whenever one exists. This algorithm is then used with search techniques to find a schedule which minimises maximum lateness. 10 references.

Martel, C.

1982-07-01

137

Expert systems tools for Hubble Space Telescope observation scheduling

NASA Technical Reports Server (NTRS)

The utility of expert systems techniques for the Hubble Space Telescope (HST) planning and scheduling is discussed and a plan for development of expert system tools which will augment the existing ground system is described. Additional capabilities provided by these tools will include graphics-oriented plan evaluation, long-range analysis of the observation pool, analysis of optimal scheduling time intervals, constructing sequences of spacecraft activities which minimize operational overhead, and optimization of linkages between observations. Initial prototyping of a scheduler used the Automated Reasoning Tool running on a LISP workstation.

Miller, Glenn; Rosenthal, Don; Cohen, William; Johnston, Mark

1987-01-01

138

Knowledge based tools for Hubble Space Telescope planning and scheduling: Constraints and strategies

NASA Technical Reports Server (NTRS)

The Hubble Space Telescope (HST) presents an especially challenging scheduling problem since a year's observing program encompasses tens of thousands of exposures facing numerous coupled constraints. Recent progress in the development of planning and scheduling tools is discussed which augment the existing HST ground system. General methods for representing activities, constraints, and constraint satisfaction, and time segmentation were implemented in a scheduling testbed. The testbed permits planners to evaluate optimal scheduling time intervals, calculate resource usage, and to generate long and medium range plans. Graphical displays of activities, constraints, and plans are an important feature of the system. High-level scheduling strategies using rule based and neural net approaches were implemented.

Miller, Glenn; Johnston, Mark; Vick, Shon; Sponsler, Jeff; Lindenmayer, Kelly

1988-01-01

139

NASA Technical Reports Server (NTRS)

The Air Force Space Command schedules telemetry, tracking and control activities across the Air Force Satellite Control Network. The Range Scheduling Aid (RSA) is a rapid prototype combining a user-friendly, portable, graphical interface with a sophisticated object-oriented database. The RSA has been a rapid prototyping effort whose purpose is to elucidate and define suitable technology for enhancing the performance of the range schedulers. Designing a system that assists schedulers in their task, drawing on both their current techniques and enhancements enabled by an electronic environment, has created a continuously developing model that will serve as a standard for future range scheduling systems. The RSA system is easy to use, easily ported between platforms, and fast, and it provides a set of tools that substantially increase the scheduler's productivity.

Halbfinger, Eliezer M.; Smith, Barry D.

1991-01-01

140

Examining the efficiency of schedules of distributed retrieval practice.

Given that students typically have a sizeable amount of course material to learn but a finite amount of study time, evaluating the efficiency of study schedules is important. We explored the efficiency of various schedules of distributed retrieval plus restudy. Across two experiments, 227 undergraduates were asked to learn Swahili-English vocabulary word pairs. In conventional schedule groups, all items were presented for 3 practice trials after initial study (as in most previous research). In dropout schedule groups, the number of practice trials allocated to each item varied, in that practice with a given item was discontinued after criterion performance had been reached. A dropout schedule led to levels of performance similar to those for conventional schedules (but in fewer trials), and it was particularly effective for learning initially incorrect items. However, the efficiency of the various schedules depended critically on the interval between presentations of an item. Results suggest that dropout can be a more efficient learning schedule for students than can conventional schedules of practice. PMID:18265608

Pyc, Mary A; Rawson, Katherine A

2007-12-01

141

Interactive dynamic aircraft scheduling

The introduction of recent advances in computer technology to improve aircraft scheduling is investigated. Incorporating interactive graphics, modern database manipulation techniques, and decision support algorithms, the computer ...

Deckwitz, Thomas Anthony

1984-01-01

142

Advice Complexity and Barely Random Algorithms

NASA Astrophysics Data System (ADS)

Recently, a new measurement - the advice complexity - was introduced for measuring the information content of online problems. The aim is to measure the bitwise information that online algorithms lack, causing them to perform worse than offline algorithms. Among a large number of problems, a well-known scheduling problem, job shop scheduling with unit length tasks, and the paging problem were analyzed within this model. We observe some connections between advice complexity and randomization. Our special focus goes to barely random algorithms, i.e., randomized algorithms that use only a constant number of random bits, regardless of the input size. We adapt the results on advice complexity to obtain efficient barely random algorithms for both the job shop scheduling and the paging problem.

Komm, Dennis; Královič, Richard

143

Reference intervals: the way forward.

New facts have recently enhanced interest in the topic of reference intervals. In particular, the International Organization for Standardization standard 15189, requesting that 'biological reference intervals shall be periodically reviewed', and the directive of the European Union on in vitro diagnostic medical devices asking manufacturers to provide detailed information on reference intervals, have renewed interest in the subject. This review presents an update on the topic, discussing the theoretical aspects and the most critical issues. The basic approach to the definition of reference intervals proposed in the original International Federation of Clinical Chemistry documents still remains valid. The use of data mining to obtain reference data from existing databases has severe limitations. New statistical approaches to discard outliers and to compute reference limits have been recommended. On the other hand, perspectives opened by the improvement in standardization through the implementation of the concept of traceability suggest new models to define 'common' reference intervals that can be transferred and adopted by different clinical laboratories in order to decrease the proliferation of different reference intervals not always justified by differences in population characteristics or in analytical methodology. PMID:19103955
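As a toy illustration of the statistical side, the sketch below computes nonparametric reference limits: a simple Tukey fence stands in for the formal outlier tests, and the central 95% of the remaining values gives the limits. All choices here are illustrative, not the IFCC procedure.

```python
import statistics

def reference_interval(values):
    """Nonparametric reference limits from a reference sample:
    drop gross outliers with a 3*IQR Tukey fence (a crude stand-in for the
    recommended outlier tests), then take the 2.5th and 97.5th percentiles
    of the retained values."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    kept = [v for v in values if q1 - 3 * iqr <= v <= q3 + 3 * iqr]
    cuts = statistics.quantiles(kept, n=40)   # cut points at 2.5%, 5%, ..., 97.5%
    return cuts[0], cuts[-1]
```

Real guidelines additionally require a sufficiently large, well-characterized reference population and partitioning by factors such as age and sex where justified.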

Ceriotti, Ferruccio; Hinzmann, Rolf; Panteghini, Mauro

2009-01-01

144

Interval timing is a key element of foraging theory, models of predator avoidance, and competitive interactions. Although interval timing is well documented in vertebrate species, it is virtually unstudied in invertebrates. In the present experiment, we used free-flying honey bees (Apis mellifera ligustica) as a model for timing behaviors. Subjects were trained to enter a hole in an automated artificial flower to receive a nectar reinforcer (i.e. reward). Responses were continuously reinforced prior to exposure to either a fixed interval (FI) 15-sec, FI 30-sec, FI 60-sec, or FI 120-sec reinforcement schedule. We measured response rate and post-reinforcement pause within each fixed interval trial between reinforcers. Honey bees responded at higher frequencies earlier in the fixed interval suggesting subject responding did not come under traditional forms of temporal control. Response rates were lower during FI conditions compared to performance on continuous reinforcement schedules, and responding was more resistant to extinction when previously reinforced on FI schedules. However, no “scalloped” or “break-and-run” patterns of group or individual responses reinforced on FI schedules were observed; no traditional evidence of temporal control was found. Finally, longer FI schedules eventually caused all subjects to cease returning to the operant chamber indicating subjects did not tolerate the longer FI schedules. PMID:24983960

Craig, David Philip Arthur; Varnon, Christopher A.; Sokolowski, Michel B. C.; Wells, Harrington; Abramson, Charles I.

2014-01-01

145

Dynamic slack reclamation with procrastination scheduling in real-time embedded systems

Leakage energy consumption is an increasing concern in current and future CMOS technology generations. Procrastination scheduling, where task execution can be delayed to maximize the duration of idle intervals, has been proposed to minimize leakage energy drain. We address dynamic slack reclamation techniques under procrastination scheduling to minimize the static and dynamic energy consumption. In addition to dynamic task slowdown,

Ravindra Jejurikar; Rajesh K. Gupta

2005-01-01

146

Performance of two species of quail on basic reinforcement schedules1

Two bobwhite quail (Colinus virginianus) and two Japanese quail (Coturnix coturnix japonica) were trained to peck a response key to obtain food. In general, performance on fixed ratio 20 and variable- and fixed-interval 60-sec schedules was comparable to the response patterns of other species under these schedules. PMID:5645877

Cloar, F. Thomas; Melvin, Kenneth B.

1968-01-01

147

ERIC Educational Resources Information Center

Dissociating motoric and motivational effects of pharmacological manipulations on operant behavior is a substantial challenge. To address this problem, we applied a response-bout analysis to data from rats trained to lever press for sucrose on variable-interval (VI) schedules of reinforcement. Motoric, motivational, and schedule factors (effort…

Brackney, Ryan J.; Cheung, Timothy H. C.; Neisewander, Janet L.; Sanabria, Federico

2011-01-01

148

Stochastic interval analysis of natural frequency and mode shape of structures with uncertainties

NASA Astrophysics Data System (ADS)

In this paper, natural frequencies and mode shapes of structures with mixed random and interval parameters are investigated by using a hybrid stochastic and interval approach. Expressions for the mean value and variance of natural frequencies and mode shapes are derived by using perturbation method and random interval moment method. The bounds of these probabilistic characteristics are then determined by interval arithmetic. Two examples are given first to illustrate the feasibility of the presented method and the results are verified by Monte Carlo Simulations. The presented approach is also applicable to solve pure random and pure interval problems. This capability is demonstrated in the third and fourth examples through the comparisons with the peer research outcomes.

Wang, Chen; Gao, Wei; Song, Chongmin; Zhang, Nong

2014-04-01

149

Interval regression by tolerance analysis approach

In interval linear regression analysis, we are given crisp or interval data and we are to determine appropriate interval regression parameters. There are various methods for interval regression; many of them possess the property that while some of the resulting interval regression parameters are very wide, the other parameters are crisp. This drawback is the main limiting factor for such

Milan Hladík; Michal Černý

150

NCI-Frederick PHL - Scheduling

Project scheduling will depend on the type, scope and size of the project. Detailed information

151

Software Pipelining: An Effective Scheduling Technique for VLIW Machines

Abstract: This paper shows that software pipelining is an effective and viable scheduling technique for VLIW processors. In software pipelining, iterations of a loop in the source program are continuously initiated at constant intervals, before the preceding iterations complete. The advantage of software pipelining is that optimal performance can be achieved with compact object code.
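The overlap of iterations can be illustrated with a small simulation: a three-stage loop body (load, multiply, store) in which, once the pipeline fills, stage 1 of iteration t runs in the same "cycle" as stage 2 of iteration t-1 and stage 3 of iteration t-2. This is an illustrative model of the idea, not VLIW code generation.

```python
def pipelined_double(b):
    """Software-pipelined version of `for i: out[i] = b[i] * 2`, split into
    three stages. After a two-cycle prologue, the kernel initiates a new
    iteration every cycle (initiation interval = 1) with three iterations
    in flight; the last two cycles drain the pipeline (epilogue)."""
    n = len(b)
    out = [None] * n
    loaded, computed = {}, {}
    for t in range(n + 2):                            # kernel + drain cycles
        if 0 <= t - 2 < n:
            out[t - 2] = computed.pop(t - 2)          # stage 3: store iter t-2
        if 0 <= t - 1 < n:
            computed[t - 1] = loaded.pop(t - 1) * 2   # stage 2: multiply iter t-1
        if t < n:
            loaded[t] = b[t]                          # stage 1: load iter t
    return out
```

A real compiler emits exactly this prologue/kernel/epilogue structure as machine code, which is why the object code stays compact compared with full loop unrolling.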

Monica S. Lam

1988-01-01

152

High resolution time interval meter

Method and apparatus are provided for measuring the time interval between two events to a higher resolution than is reliably available from conventional circuits and components. An internal clock pulse is provided at a frequency compatible with conventional component operating frequencies for reliable operation. Lumped constant delay circuits are provided for generating outputs at delay intervals corresponding to the desired high resolution. An initiation START pulse is input to generate first high resolution data. A termination STOP pulse is input to generate second high resolution data. Internal counters count at the low frequency internal clock pulse rate between the START and STOP pulses. The first and second high resolution data are logically combined to directly provide high resolution data to one counter and correct the count in the low resolution counter to obtain a high resolution time interval measurement.
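The combination of coarse counting and fine interpolation reduces to simple arithmetic. In the hypothetical convention below, each "fine" value is the measured delay from the event to the next internal clock edge; real designs differ in sign conventions and calibration.

```python
def interval_ns(coarse_count, start_fine_ns, stop_fine_ns, clock_period_ns=100.0):
    """Combine a low-frequency coarse count with high-resolution delay-line
    data. The coarse counter counts whole clock periods between the edges
    that latch START and STOP; the fine values (event-to-next-edge delays)
    correct for where each event fell within its clock period."""
    return coarse_count * clock_period_ns + (start_fine_ns - stop_fine_ns)
```

With a 10 MHz clock (100 ns period), five coarse counts, and fine values of 30 ns at START and 70 ns at STOP, the interval is 500 + 30 - 70 = 460 ns: resolution well below one clock period, without running any counter faster than the internal clock.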

Martin, A.D.

1986-05-09

153

Meta-Analytic Interval Estimation for Standardized and Unstandardized Mean Differences

ERIC Educational Resources Information Center

The fixed-effects (FE) meta-analytic confidence intervals for unstandardized and standardized mean differences are based on an unrealistic assumption of effect-size homogeneity and perform poorly when this assumption is violated. The random-effects (RE) meta-analytic confidence intervals are based on an unrealistic assumption that the selected…

Bonett, Douglas G.

2009-01-01

154

Approximation Schemes for Scheduling with Availability Constraints

NASA Astrophysics Data System (ADS)

We investigate the problems of scheduling n weighted jobs to m identical machines with availability constraints. We consider two different models of availability constraints: the preventive model where the unavailability is due to preventive machine maintenance, and the fixed job model where the unavailability is due to a priori assignment of some of the n jobs to certain machines at certain times. Both models have applications such as turnaround scheduling or overlay computing. In both models, the objective is to minimize the total weighted completion time. We assume that m is a constant, and the jobs are non-resumable. For the preventive model, it has been shown that there is no approximation algorithm if all machines have unavailable intervals, even when w_i = p_i for all jobs. In this paper, we assume there is one machine permanently available and the processing time of each job is equal to its weight for all jobs. We develop the first PTAS when there is a constant number of unavailable intervals. One main feature of our algorithm is that the classification of large and small jobs is with respect to each individual interval, thus not fixed. This classification allows us (1) to enumerate the assignments of large jobs efficiently, and (2) to move small jobs around without increasing the objective value too much, and thus derive our PTAS. Then we show that there is no FPTAS in this case unless P = NP.

Fu, Bin; Huo, Yumei; Zhao, Hairong

155

NASA Schedule Management Handbook

NASA Technical Reports Server (NTRS)

The purpose of schedule management is to provide the framework for time-phasing, resource planning, coordination, and communicating the necessary tasks within a work effort. The intent is to improve schedule management by providing recommended concepts, processes, and techniques used within the Agency and private industry. The intended function of this handbook is two-fold: first, to provide guidance for meeting the scheduling requirements contained in NPR 7120.5, NASA Space Flight Program and Project Management Requirements, NPR 7120.7, NASA Information Technology and Institutional Infrastructure Program and Project Requirements, NPR 7120.8, NASA Research and Technology Program and Project Management Requirements, and NPD 1000.5, Policy for NASA Acquisition. The second function is to describe the schedule management approach and the recommended best practices for carrying out this project control function. With regard to the above project management requirements documents, it should be noted that those space flight projects previously established and approved under the guidance of prior versions of NPR 7120.5 will continue to comply with those requirements until project completion has been achieved. This handbook will be updated as needed, to enhance efficient and effective schedule management across the Agency. It is acknowledged that most, if not all, external organizations participating in NASA programs/projects will have their own internal schedule management documents. Issues that arise from conflicting schedule guidance will be resolved on a case-by-case basis as contracts and partnering relationships are established. It is also acknowledged and understood that all projects are not the same and may require different levels of schedule visibility, scrutiny and control. Project type, value, and complexity are factors that typically dictate which schedule management practices should be employed.

2011-01-01

156

In a discrete-trials procedure, pigeons chose between a fixed-ratio 81 schedule and a progressive-ratio schedule by making a single peck at the key correlated with one or the other of these schedules. The response requirement on the progressive-ratio schedule began at 1 and increased by 10 each time the progressive-ratio schedule was chosen. Each time the fixed-ratio schedule was chosen, the requirement on the progressive-ratio schedule was reset to 1 response. In conditions where there was no intertrial interval, subjects chose the progressive-ratio schedule for an average of about five consecutive trials (during which the response requirement increased to 41), and then chose the fixed-ratio schedule. This ratio was larger than that predicted by an optimality analysis that assumes that subjects respond in a pattern that minimizes the response-reinforcer ratio or one that assumes that subjects respond in a pattern that maximizes the overall rate of reinforcement. In conditions with a 25-s or 50-s intertrial interval, subjects chose the progressive-ratio schedule for an average of about eight consecutive trials before choosing the fixed-ratio schedule. This change in performance with the addition of an intertrial interval was also not predicted by an optimality analysis. On the other hand, the results were consistent with the theory that choice is determined by the delays to the reinforcers delivered on the present trial and on subsequent trials. PMID:3681185
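One version of the optimality analysis mentioned above is easy to reproduce: minimize mean responses per reinforcer over the number k of consecutive progressive-ratio choices before a fixed-ratio reset. With the schedule parameters from this study (FR 81; PR starting at 1, step 10), this simple criterion puts the optimum at k = 3, below the roughly five consecutive choices the pigeons actually made. A sketch, assuming this response-minimization criterion (the paper also considers rate maximization, not shown):

```python
def responses_per_reinforcer(k, fixed=81, start=1, step=10):
    """Mean responses per reinforcer if the subject takes the progressive
    schedule k times (requirements start, start+step, ...) and then makes
    one fixed-ratio choice that resets it: k + 1 reinforcers in all."""
    progressive = sum(start + step * i for i in range(k))
    return (progressive + fixed) / (k + 1)

# exhaustively check switch points; the cost curve is U-shaped in k
best_k = min(range(1, 20), key=responses_per_reinforcer)
```

At k = 3 the mean cost is (1 + 11 + 21 + 81) / 4 = 28.5 responses per reinforcer, versus 29 at k = 4 and 31 at k = 2, so the model predicts switching well before the requirement reaches the 41 responses the birds tolerated.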

Mazur, J E; Vaughan, W

1987-01-01

157

A centrally acting antitussive agent dextromethorphan (DM) was tested to determine its possible interaction with naloxone in rats responding under a fixed-interval schedule of positive reinforcement. A sugar-sweetened milk reward was used as a positive reinforcer. Under the same experimental conditions the effects of morphine alone and in combination with naloxone were also determined. Low dose DM (10 mg/kg)

Tambay Taşkin

1986-01-01

158

Solar flare interval distribution for probabilistic flare forecasting method

NASA Astrophysics Data System (ADS)

Determining the more likely model for solar flare interval distributions is an essential step in probabilistic solar flare forecasting in space weather research. In this presentation we discuss statistical models for the solar flare interval distribution in individual active regions. We first discuss some problems with a conventional procedure to derive probability density functions from any data set and propose a new procedure, which uses the maximum likelihood method and the Akaike Information Criterion (AIC) to objectively compare competing probability density functions. Previous studies on the solar flare interval distribution in individual active regions only dealt with constant or time-dependent Poisson process models, and no other models were discussed. We examine three models (exponential, lognormal, and inverse Gaussian) as competing probability density functions. We analyzed solar flare data in 55 active regions listed in the GOES soft X-ray flare catalog for the years 1981 to 2005 by using the new procedure. We found that the lognormal and inverse Gaussian models are more likely than the exponential model for the solar flare interval distribution in individual active regions. The results suggest that solar flares do not occur randomly in time; rather, solar flare intervals appear to be regulated by solar flare mechanisms. We briefly mention a probabilistic solar flare forecasting method as an application of solar flare interval distribution analysis.
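The model-comparison step can be sketched with maximum likelihood fits and AIC on synthetic waiting times. Everything below is illustrative (invented lognormal data, and only two of the three candidate models); it shows the mechanics, not the paper's results.

```python
import math, random

def aic_exponential(x):
    """AIC for an exponential fit: MLE rate is 1/mean, one parameter."""
    lam = 1 / (sum(x) / len(x))
    loglik = len(x) * math.log(lam) - lam * sum(x)
    return 2 * 1 - 2 * loglik

def aic_lognormal(x):
    """AIC for a lognormal fit: MLEs are the mean and (biased) variance
    of log-intervals, two parameters."""
    logs = [math.log(v) for v in x]
    n = len(x)
    mu = sum(logs) / n
    var = sum((l - mu) ** 2 for l in logs) / n
    loglik = (-n / 2 * math.log(2 * math.pi * var)
              - sum(logs)
              - sum((l - mu) ** 2 for l in logs) / (2 * var))
    return 2 * 2 - 2 * loglik

# synthetic flare waiting times drawn from a lognormal, for illustration
rng = random.Random(42)
waits = [rng.lognormvariate(2.0, 0.5) for _ in range(500)]
```

Lower AIC indicates the better-supported model after penalizing parameter count; on lognormal data the exponential (Poisson-process) model should lose, mirroring the paper's qualitative finding.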

Kubo, Yuki

159

Interval Estimation (Topic 16): Linear Regression, Sample Proportions, Interpretation of the Confidence Interval


Watkins, Joseph C.

160

Operating Systems: CPU Scheduling

NSDL National Science Digital Library

This Web site from the University of Bridgeport provides the context and explanation for CPU scheduling mechanisms. It includes a rationale for context switching along with pseudocode, and it explains the difference between non-preemptive and preemptive scheduling. It offers visuals that could be incorporated into lectures analyzing various scheduling algorithms including first in first out (FIFO), round-robin, priority-based, shortest job first (SJF), and multilevel feedback queues. It includes an assignment as well as an online slideshow presentation. This site is applicable for technical/community college or university level computer science or engineering courses.
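A tiny worked example of the kind such lectures typically walk through: with all jobs arriving at time 0 and running to completion, average waiting time under FIFO depends on arrival order, while shortest-job-first provably minimizes it. The burst times here are invented.

```python
def avg_wait(bursts):
    """Average waiting time when jobs (all arriving at t = 0) run
    non-preemptively to completion in the given order."""
    wait = total = 0
    for b in bursts:
        total += wait      # this job waited for everything before it
        wait += b
    return total / len(bursts)

bursts = [6, 2, 4]                 # CPU bursts in ms, hypothetical
fifo = avg_wait(bursts)            # first in, first out: 0, 6, 8 -> 14/3
sjf = avg_wait(sorted(bursts))     # shortest job first:  0, 2, 6 -> 8/3
```

Round-robin and multilevel feedback queues trade some of SJF's efficiency for responsiveness, which is the tension the site's visuals are built around.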

2009-08-10

161

Are accidents scheduled? [Safety management problems]

NASA Technical Reports Server (NTRS)

Two major sets of safety problems associated with project scheduling are examined. The first set involves problems resulting from the improper scheduling of the safety tasks. The second involves problems which result from inadequate attention to scheduling of those project tasks which lead to tests and operations and includes condensed schedules, modified schedules, schedule workarounds, eliminated portions of the schedules and strung out schedules.

Childs, C.

1976-01-01

162

Chapter 14 Work Schedule 14.01 Alternative Work Schedules

Chapter 14 Work Schedule 14.01 Alternative Work Schedules · Policy: The University of Wisconsin-Madison supports the use of alternative work schedules by full-time or part-time classified employees. Employees may participate in an alternative work schedule experiment unless the employee is specifically excluded.

Sheridan, Jennifer

163

High resolution time interval counter

NASA Technical Reports Server (NTRS)

In recent years, we have developed two types of high resolution, multi-channel time interval counters. In the NIST two-way time transfer MODEM application, the counter is designed for operating primarily in the interrupt-driven mode, with 3 start channels and 3 stop channels. The intended start and stop signals are 1 PPS, although other frequencies can also be applied to start and stop the count. The time interval counters used in the NIST Frequency Measurement and Analysis System are implemented with 7 start channels and 7 stop channels. Four of the 7 start channels are devoted to the frequencies of 1 MHz, 5 MHz or 10 MHz, while triggering signals to all other start and stop channels can range from 1 PPS to 100 kHz. Time interval interpolation plays a key role in achieving the high resolution time interval measurements for both counters. With a 10 MHz time base, both counters demonstrate a single-shot resolution of better than 40 ps, and a stability of better than 5 × 10^-12 (σ_x(τ), after a self test of 1000 seconds). The maximum rate of time interval measurements (with no dead time) is 1.0 kHz for the counter used in the MODEM application and is 2.0 kHz for the counter used in the Frequency Measurement and Analysis System. The counters are implemented as plug-in units for an AT-compatible personal computer. This configuration provides an efficient way of using a computer not only to control and operate the counters, but also to store and process measured data.

Zhang, Victor S.; Davis, Dick D.; Lombardi, Michael A.

1995-01-01

164

Autonomous Byte Stream Randomizer

NASA Technical Reports Server (NTRS)

Net-centric networking environments are often faced with limited resources and must utilize bandwidth as efficiently as possible. In networking environments that span wide areas, the data transmission has to be efficient without any redundant or exuberant metadata. The Autonomous Byte Stream Randomizer software provides an extra level of security on top of existing data encryption methods. Randomizing the data's byte stream adds an extra layer to existing data protection methods, thus making it harder for an attacker to decrypt protected data. Based on a generated cryptographically secure random seed, a random sequence of numbers is used to intelligently and efficiently swap the organization of bytes in data using the unbiased and memory-efficient in-place Fisher-Yates shuffle method. Swapping bytes and reorganizing the crucial structure of the byte data renders the data file unreadable and leaves the data in a deconstructed state. This deconstruction adds an extra level of security requiring the byte stream to be reconstructed with the random seed in order to be readable. Once the data byte stream has been randomized, the software enables the data to be distributed to N nodes in an environment. Each piece of the data in randomized and distributed form is a separate entity unreadable in its own right, but when combined with all N pieces, is able to be reconstructed back into one. Reconstruction requires possession of the key used for randomizing the bytes, leading to the generation of the same cryptographically secure random sequence of numbers used to randomize the data. This software is a cornerstone capability possessing the ability to generate the same cryptographically secure sequence on different machines and time intervals, thus allowing this software to be used more heavily in net-centric environments where data transfer bandwidth is limited.
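The randomize/restore round trip can be sketched as follows. This is a hypothetical reconstruction: Python's ordinary PRNG stands in for the cryptographically secure generator the software derives from its seed, and only the shuffle/unshuffle step is shown (no encryption or distribution to N nodes).

```python
import random

def swap_indices(n, seed):
    """Regenerate the Fisher-Yates swap sequence deterministically from
    the seed; both parties can reproduce it given the same key."""
    rng = random.Random(seed)
    return [(i, rng.randrange(i + 1)) for i in range(n - 1, 0, -1)]

def randomize(data, seed):
    buf = bytearray(data)
    for i, j in swap_indices(len(buf), seed):
        buf[i], buf[j] = buf[j], buf[i]      # in-place Fisher-Yates shuffle
    return bytes(buf)

def restore(data, seed):
    buf = bytearray(data)
    for i, j in reversed(swap_indices(len(buf), seed)):
        buf[i], buf[j] = buf[j], buf[i]      # replay the swaps in reverse
    return bytes(buf)
```

Because each swap is its own inverse, replaying the seed-derived swap sequence in reverse order exactly undoes the shuffle, which is why possession of the seed is both necessary and sufficient for reconstruction.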

Paloulian, George K.; Woo, Simon S.; Chow, Edward T.

2013-01-01

165

Demos: Scheduling Hourly Workers

NSDL National Science Digital Library

In a world of just-in-time production and flexible scheduling, a number of scholars and policy analysts are beginning to examine these specific business practices. In March 2011, the Demos organization published a paper as part of their series with the magazine "The American Prospect" that looks into the world of what is called "workplace flexibility." The 22-page paper is by Nancy K. Cauthen, and it looks at how scheduling flexibility might actually be very problematic for low-wage workers. The paper posits that most low-wage workers would probably benefit from "more predictability and stability within fluid schedules," as they need more advance notice to plan for child care and transportation. The resulting schedule changes may in fact also cause "tremendous chaos and stress" for these workers' children as well. Visitors will find much to think about in this paper, including its concluding remarks, which offer a set of timely policy recommendations.

Cauthen, Nancy K.

166

Immunization Schedules for Adults

... Immunize? Vaccines: The Basics Vaccines Home Share Compartir Immunization Schedules for Adults in Easy-to-read Formats ... never outgrow the need for vaccines. The specific immunizations you need as an adult are determined by ...

167

NSDL National Science Digital Library

This online television schedule provides listings of NASA's televised programming, including mission coverage, educational shows, and historical programs. A link is provided to a list of organizations that transmit NASA television to the World Wide Web.

168

A method for interference mitigation in space communications scheduling

NASA Technical Reports Server (NTRS)

Increases in the number of user spacecraft and data rates supported by NASA's Tracking and Data Relay Satellite System (TDRSS) in the S and Ku bands could result in communications conflicts due to mutual interference. A method to mitigate interference while minimizing unnecessary scheduling restrictions on both TDRSS network and user resources, based on consideration of all relevant communications parameters, was developed. The steps of this method calculate required separation angles at TDRS and produce interference intervals, which can be used in the production of schedules free of unacceptable interference. The method can also be used as a basis for analysis, evaluation, and optimization of user schedules with respect to communications performance. Described here are the proposed method and its potential application to scheduling in space communications. Test cases relative to planned missions, including the Earth Observing System, the Space Station Manned Base, and the Space Shuttle are discussed.

Wong, Yen F.; Rash, James L.

1991-01-01

169

Polynomial optimization techniques for activity scheduling. Optimization based prototype scheduler

NASA Technical Reports Server (NTRS)

Polynomial optimization techniques for activity scheduling (optimization based prototype scheduler) are presented in the form of the viewgraphs. The following subject areas are covered: agenda; need and viability of polynomial time techniques for SNC (Space Network Control); an intrinsic characteristic of SN scheduling problem; expected characteristics of the schedule; optimization based scheduling approach; single resource algorithms; decomposition of multiple resource problems; prototype capabilities, characteristics, and test results; computational characteristics; some features of prototyped algorithms; and some related GSFC references.

Reddy, Surender

1991-01-01

170

NASA Technical Reports Server (NTRS)

Resupply scheduling for the Space Station presents some formidable logistics problems. One of the most basic problems is assigning supplies to a series of shuttle resupply missions. A prototype logistics expert system which constructs resupply schedules was developed. This prototype is able to reconstruct feasible resupply plans. In addition, analysts can use the system to evaluate the impact of adding, deleting or modifying launches, cargo space, experiments, etc.

Tanner, Steve; Hughes, Angi; Byrd, Jim

1987-01-01

171

Packet Scheduling with Fragmentation

We investigate a scheduling problem in a TDMA en- vironment where packets may be fragmented. Our model of the problem is derived from a scheduling problem present in data over CATV networks, where a slotted TDMA channel is used to carry both real-time and best-effort traffic. Packets of real-time flows have high priority and are allocated in fixed, periodically lo-

Nir Naaman; Raphael Rom

2002-01-01

172

NSDL National Science Digital Library

The Research Randomizer, made available by the Social Psychology Network (SPN), is a program (JavaScript) that assists researchers and students "who want an easy way to perform random sampling or assign participants to experimental conditions." A tutorial at the site works through example situations and explains how to use the Research Randomizer. The Research Randomizer is an excellent tool for researchers desiring simulations of random sampling in experimental situations.
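Random assignment of participants to experimental conditions, of the kind the Research Randomizer performs, can be sketched as follows. This is a generic illustration, not SPN's JavaScript code; the function name and balancing strategy are assumptions.

```python
import random

def assign_to_conditions(participants, conditions, seed=None):
    """Randomly assign each participant to one experimental condition,
    keeping group sizes as balanced as possible."""
    rng = random.Random(seed)
    shuffled = participants[:]
    rng.shuffle(shuffled)
    # Deal the shuffled participants round-robin into condition groups.
    groups = {c: [] for c in conditions}
    for i, p in enumerate(shuffled):
        groups[conditions[i % len(conditions)]].append(p)
    return groups
```

Shuffling first and then dealing round-robin guarantees group sizes differ by at most one while every balanced assignment remains equally likely.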

173

Optimal External Memory Interval Management

Fig. 2.1. A node in the base tree. Interval s is stored in L2, R4, and either M2,4 or U...

Arge, Lars; Vitter, Jeffrey Scott

2003-01-01

174

A calculator for confidence intervals

NASA Astrophysics Data System (ADS)

A calculator program has been written to give confidence intervals on branching ratios for rare decay modes (or similar quantities) calculated from the number of events observed, the acceptance factor, the background estimate and the associated errors. Results from different experiments (or different channels from the same experiment) can be combined. The calculator is available in http://www.slac.stanford.edu/~barlow/limits.html.

Barlow, Roger

2002-12-01

175

Fourier Analysis of Musical Intervals

NASA Astrophysics Data System (ADS)

Use of a microphone attached to a computer to capture musical sounds and software to display their waveforms and harmonic spectra has become somewhat commonplace. A recent article in The Physics Teacher aptly demonstrated the use of MacScope in just such a manner as a way to teach Fourier analysis. A logical continuation of this project is to use MacScope not just to analyze the Fourier composition of musical tones but also musical intervals.
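The spectrum of a musical interval can be illustrated without MacScope: a naive DFT of a synthesized two-tone signal shows the two fundamentals directly. The signal parameters below (a 3:2 "perfect fifth" at 200 Hz and 300 Hz, 2 kHz sample rate) are invented for the example; real work would use an FFT.

```python
import math

def dft_magnitudes(samples):
    """Naive DFT magnitude spectrum (O(n^2); illustration only)."""
    n = len(samples)
    mags = []
    for k in range(n // 2):
        re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        im = -sum(s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        mags.append(math.hypot(re, im))
    return mags

# A perfect fifth: two tones with a 3:2 frequency ratio (200 Hz and 300 Hz).
rate, n = 2000, 400    # 0.2 s of audio at 2 kHz -> bin width rate/n = 5 Hz
signal = [math.sin(2 * math.pi * 200 * t / rate) +
          math.sin(2 * math.pi * 300 * t / rate) for t in range(n)]
spectrum = dft_magnitudes(signal)
# Spectral peaks fall at bins 40 (200 Hz) and 60 (300 Hz).
```

Because the window holds a whole number of cycles of both tones, the spectrum has two clean peaks whose bin ratio is exactly 3:2, the defining ratio of the interval.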

LoPresto, Michael C.

2008-11-01

176

A Calculator for Confidence Intervals

A calculator program has been written to give confidence intervals on branching ratios for rare decay modes (or similar quantities) calculated from the number of events observed, the acceptance factor, the background estimate and the associated errors. Results from different experiments (or different channels from the same experiment) can be combined. The calculator is available in http://www.slac.stanford.edu/~barlow/limits.html

Roger Barlow

2002-03-01

177

Automated Scheduling Via Artificial Intelligence

NASA Technical Reports Server (NTRS)

Artificial-intelligence software that automates scheduling developed in Operations Mission Planner (OMP) research project. Software used in both generation of new schedules and modification of existing schedules in view of changes in tasks and/or available resources. Approach based on iterative refinement. Although project focused upon scheduling of operations of scientific instruments and other equipment aboard spacecraft, also applicable to such terrestrial problems as scheduling production in factory.

Biefeld, Eric W.; Cooper, Lynne P.

1991-01-01

178

Analysis of sequencing and scheduling methods for arrival traffic

NASA Technical Reports Server (NTRS)

The air traffic control subsystem that performs scheduling is discussed. The function of the scheduling algorithms is to plan automatically the most efficient landing order and to assign optimally spaced landing times to all arrivals. Several important scheduling algorithms are described and the statistical performance of the scheduling algorithms is examined. Scheduling brings order to an arrival sequence for aircraft. First-come-first-served scheduling (FCFS) establishes a fair order, based on estimated times of arrival, and determines proper separations. Because of the randomness of the traffic, gaps will remain in the scheduled sequence of aircraft. These gaps are filled, or partially filled, by time-advancing the leading aircraft after a gap while still preserving the FCFS order. Tightly scheduled groups of aircraft remain with a mix of heavy and large aircraft. Separation requirements differ for different types of aircraft trailing each other. Advantage is taken of this fact through mild reordering of the traffic, thus shortening the groups and reducing average delays. Actual delays for different samples with the same statistical parameters vary widely, especially for heavy traffic.
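The FCFS step with type-dependent separations and time-advance described above can be sketched as follows. The data structures and the `max_advance` parameter are illustrative assumptions, not the authors' implementation, and the reordering stage is omitted.

```python
def fcfs_schedule(etas, trailing_sep, max_advance=0.0):
    """First-come-first-served landing scheduler.
    etas: list of (aircraft_id, eta, weight_class), sorted by estimated arrival.
    trailing_sep[(lead_class, trail_class)]: required time separation.
    max_advance: how far an aircraft may be moved earlier to fill a gap."""
    schedule = []
    prev_time, prev_class = None, None
    for ac, eta, cls in etas:
        if prev_time is None:
            t = eta
        else:
            earliest = prev_time + trailing_sep[(prev_class, cls)]
            # Time-advance: pull the aircraft up to max_advance earlier than
            # its ETA to partially fill a gap, while preserving FCFS order.
            t = max(earliest, eta - max_advance)
        schedule.append((ac, t))
        prev_time, prev_class = t, cls
    return schedule
```

Note how the separation depends on the ordered pair of weight classes, which is exactly the asymmetry the mild-reordering step in the paper exploits to shorten tightly scheduled groups.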

Neuman, Frank; Erzberger, Heinz

1990-01-01

179

A Model of Alcohol Drinking under an Intermittent Access Schedule Using Group-Housed Mice

Here, we describe a new model of voluntary alcohol drinking by group-housed mice. The model employs sensor-equipped cages that track the behaviors of the individual animals via implanted radio chips. After the animals were allowed intermittent access to alcohol (three 24 h intervals every week) for 4 weeks, the proportions of licks directed toward bottles containing alcohol were 50.9% and 39.6% for the male and female mice, respectively. We used three approaches (i.e., quinine adulteration, a progressive ratio schedule and a schedule involving a risk of punishment) to test for symptoms of compulsive alcohol drinking. The addition of 0.01% quinine to the alcohol solution did not significantly affect intake, but 0.03% quinine induced a greater than 5-fold reduction in the number of licks on the alcohol bottles. When the animals were required to perform increasing numbers of instrumental responses to obtain access to the bottle with alcohol (i.e., a progressive ratio schedule), they frequently reached a maximum of 21 responses irrespective of the available reward. Although the mice rarely achieved higher response criteria, the number of attempts was ~10 times greater in the case of alcohol than water. We have developed an approach for mapping social interactions among animals that is based on analysis of the sequences of entries into the cage corners. This approach allowed us to identify the mice that followed other animals in non-random fashions. Approximately half of the mice displayed at least one interaction of this type. We have not yet found a clear correlation between imitative behavior and relative alcohol preference. In conclusion, the model we describe avoids the limitations associated with testing isolated animals and reliably leads to stable alcohol drinking. Therefore, this model may be well suited to screening for the effects of genetic mutations or pharmacological treatments on alcohol-induced behaviors. PMID:24804807

Smutek, Magdalena; Turbasa, Mateusz; Sikora, Magdalena; Piechota, Marcin; Zajdel, Joanna; Przewlocki, Ryszard; Parkitna, Jan Rodriguez

2014-01-01

180

Steps Toward Optimal Competitive Scheduling

NASA Technical Reports Server (NTRS)

This paper is concerned with the problem of allocating a unit-capacity resource to multiple users within a pre-defined time period. The resource is indivisible, so that at most one user can use it at each time instance. However, different users may use it at different times. The users have independent, selfish preferences for when and for how long they are allocated this resource. Thus, they value different resource access durations differently, and they value different time slots differently. We seek an optimal allocation schedule for this resource. This problem arises in many institutional settings where, e.g., different departments, agencies, or personnel compete for a single resource. We are particularly motivated by the problem of scheduling NASA's Deep Space Satellite Network (DSN) among different users within NASA. Access to DSN is needed for transmitting data from various space missions to Earth. Each mission has different needs for DSN time, depending on satellite and planetary orbits. Typically, the DSN is over-subscribed, in that not all missions will be allocated as much time as they want. This leads to various inefficiencies: missions spend much time and resource lobbying for their time, often exaggerating their needs. NASA, on the other hand, would like to make optimal use of this resource, ensuring that the good for NASA is maximized. This raises the thorny problem of how to measure the utility to NASA of each allocation. In the typical case, it is difficult for the central agency, NASA in our case, to assess the value of each interval to each user; this is really only known to the users who understand their needs. Thus, our problem is more precisely formulated as follows: find an allocation schedule for the resource that maximizes the sum of users' preferences, when the preference values are private information of the users. We bypass this problem by making the assumption that one can assign money to customers. This assumption is reasonable; a committee is usually in charge of deciding the priority of each mission competing for access to the DSN within a time period while scheduling. Instead, we can assume that the committee assigns a budget to each mission.

Frank, Jeremy; Crawford, James; Khatib, Lina; Brafman, Ronen

2006-01-01

181

NASA Technical Reports Server (NTRS)

The GERRY scheduling system developed by NASA Ames with assistance from the Lockheed Space Operations Company, and the Lockheed Artificial Intelligence Center, uses a method called constraint-based iterative repair. Using this technique, one encodes both hard rules and preference criteria into data structures called constraints. GERRY repeatedly attempts to improve schedules by seeking repairs for violated constraints. The system provides a general scheduling framework which is being tested on two NASA applications. The larger of the two is the Space Shuttle Ground Processing problem which entails the scheduling of all the inspection, repair, and maintenance tasks required to prepare the orbiter for flight. The other application involves power allocation for the NASA Ames wind tunnels. Here the system will be used to schedule wind tunnel tests with the goal of minimizing power costs. In this paper, we describe the GERRY system and its application to the Space Shuttle problem. We also speculate as to how the system would be used for manufacturing, transportation, and military problems.

Zweben, Monte

1993-01-01

182

To make a decision, we need to compare the values of quantities. In many practical situations, we know the values with interval uncertainty. In such situations, we need to compare intervals. Allen’s algebra describes all possible relations between intervals on the real line, and ordering relations between such intervals are well studied. In this paper, we extend this description to intervals in an arbitrary partially ordered set (poset). In particular, we explicitly describe ordering relations between intervals that generalize relation between points. As auxiliary results, we provide a logical interpretation of the relation between intervals, and extend the results about interval graphs to intervals over posets.
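For the special case of closed numeric intervals on the real line, the Allen relations referred to above can be computed directly. This sketch covers only the real-line case; the paper's generalization to intervals over arbitrary posets is not captured here, and the relation names follow common usage.

```python
def allen_relation(a, b):
    """Classify the Allen relation between closed intervals a=(a0,a1), b=(b0,b1)
    on the real line: one of the 13 relations (7 names plus inverses)."""
    a0, a1 = a
    b0, b1 = b
    if a1 < b0:
        return "before"
    if b1 < a0:
        return "after"
    if a1 == b0:
        return "meets"
    if b1 == a0:
        return "met-by"
    if a0 == b0 and a1 == b1:
        return "equals"
    if a0 == b0:
        return "starts" if a1 < b1 else "started-by"
    if a1 == b1:
        return "finishes" if a0 > b0 else "finished-by"
    if b0 < a0 and a1 < b1:
        return "during"
    if a0 < b0 and b1 < a1:
        return "contains"
    return "overlaps" if a0 < b0 else "overlapped-by"
```

The thirteen cases are mutually exclusive and exhaustive for any pair of non-degenerate intervals, which is what makes Allen's algebra a complete classification on the line.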

Zapata, Francisco; Kreinovich, Vladik; Joslyn, Cliff A.; Hogan, Emilie A.

2013-08-01

183

Outline: CPU Scheduling Mechanisms

NSDL National Science Digital Library

This PowerPoint slide show from an instructor with the Department of Computer Science at Florida State University offers in-depth lecture notes on CPU scheduling mechanisms. Most notably, this presentation includes diagrams and logical flowcharts of CPU processes, which make the material easier to understand for instructors or students engaged in technical/community college or university level computer science or engineering courses. It provides an overview of scheduling criteria and an explanation and comparison of first-come-first-served, shortest-job-next, priority and non-preemptive priority, deadline, real-time, and round-robin CPU scheduling. Mathematical models and formulas are included along with efficiency graphs. The presentation also covers process thread context, context switching, and multilevel queues.
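Round-robin, one of the policies the slides compare, can be simulated in a few lines. A minimal sketch, assuming all processes arrive at time zero and ignoring context-switch overhead:

```python
from collections import deque

def round_robin(bursts, quantum):
    """Simulate round-robin CPU scheduling.
    bursts: mapping pid -> total CPU burst time (all arrive at t=0).
    Returns the completion time of each process."""
    remaining = dict(bursts)
    ready = deque(bursts)              # FIFO ready queue
    clock, finish = 0, {}
    while ready:
        pid = ready.popleft()
        run = min(quantum, remaining[pid])
        clock += run
        remaining[pid] -= run
        if remaining[pid] == 0:
            finish[pid] = clock        # process is done
        else:
            ready.append(pid)          # preempted: back of the queue
    return finish
```

Shrinking the quantum makes the simulation approach processor sharing; growing it past the largest burst degenerates to first-come-first-served, the trade-off the slides' efficiency graphs illustrate.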

Liu, Xiuwen

2009-08-05

184

Scheduling with genetic algorithms

NASA Technical Reports Server (NTRS)

In many domains, scheduling a sequence of jobs is an important function contributing to the overall efficiency of the operation. At Boeing, we develop schedules for many different domains, including assembly of military and commercial aircraft, weapons systems, and space vehicles. Boeing is under contract to develop scheduling systems for the Space Station Payload Planning System (PPS) and Payload Operations and Integration Center (POIC). These applications require that we respect certain sequencing restrictions among the jobs to be scheduled while at the same time assigning resources to the jobs. We call this general problem scheduling and resource allocation. Genetic algorithms (GAs) offer a search method that uses a population of solutions and benefits from intrinsic parallelism to search the problem space rapidly, producing near-optimal solutions. Good intermediate solutions are probabilistically recombined to produce better offspring (based upon some application-specific measure of solution fitness, e.g., minimum flowtime, or schedule completeness). Also, at any point in the search, any intermediate solution can be accepted as a final solution; allowing the search to proceed longer usually produces a better solution, while terminating the search at virtually any time may yield an acceptable solution. Many processes are constrained by restrictions of sequence among the individual jobs: for a specific job, other jobs must be completed beforehand. While there are obviously many other constraints on processes, it is these on which we focused for this research: how to allocate crews to jobs while satisfying job precedence requirements as well as personnel, tooling, and fixture (or, more generally, resource) requirements.
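A toy version of this GA approach, using a repair step to enforce job precedence, can be sketched as follows. All names and parameters are illustrative, and the crew, tooling, and fixture constraints of the actual PPS/POIC systems are omitted; fitness here is total flowtime.

```python
import random

def repair(order, preds):
    """Stable repair: emit jobs in 'order', but only once all of a job's
    predecessors have been emitted. Assumes the precedence graph is a DAG."""
    done, out, pending = set(), [], list(order)
    while pending:
        for j in pending:
            if preds.get(j, set()) <= done:
                out.append(j)
                done.add(j)
                pending.remove(j)
                break
    return out

def ga_sequence(jobs, preds, durations, generations=100, pop_size=20, seed=0):
    """Tiny GA for precedence-constrained sequencing, minimizing total flowtime.
    Individuals are job orders; invalid orders are repaired, not discarded."""
    rng = random.Random(seed)

    def flowtime(order):
        t = total = 0
        for j in order:
            t += durations[j]
            total += t
        return total

    pop = []
    for _ in range(pop_size):
        perm = jobs[:]
        rng.shuffle(perm)
        pop.append(repair(perm, preds))
    for _ in range(generations):
        pop.sort(key=flowtime)
        survivors = pop[:pop_size // 2]          # keep the fitter half
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(jobs))    # one-point order crossover
            child = a[:cut] + [j for j in b if j not in a[:cut]]
            children.append(repair(child, preds))
        pop = survivors + children
    return min(pop, key=flowtime)
```

The repair step illustrates the "intermediate solutions are always acceptable" property from the abstract: every individual in every generation is a feasible schedule, so the search can be stopped at any time.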

Fennel, Theron R.; Underbrink, A. J., Jr.; Williams, George P. W., Jr.

1994-01-01

185

Adaptive Sensor Activity Scheduling in Distributed Sensor Networks: A Statistical Mechanics Approach

This paper presents adaptive sensor activity scheduling (A-SAS) in distributed sensor networks to enable detection and dynamic footprint tracking of spatial-temporal events. The sensor network is modeled as a Markov random field on a graph, where concepts

Ray, Asok

186

The problem of obtaining an optimal matching and scheduling of interdependent tasks in distributed heterogeneous computing (DHC) environments is well known to be NP-hard. In a DHC system, task execution time is dependent on the machine to which it is assigned, and task precedence constraints are represented by a directed acyclic graph. Recent research in evolutionary techniques has shown that genetic algorithms usually obtain more efficient schedules than other known algorithms. We propose a non-evolutionary random scheduling (RS) algorithm for efficient matching and scheduling of interdependent tasks in a DHC system. RS is a succession of randomized task orderings and a heuristic mapping from task order to schedule. Randomized task ordering is effectively a topological sort where the outcome may be any possible task order for which the task precedence constraints are maintained. A detailed comparison to existing evolutionary techniques (GA and PSGA) shows the proposed algorithm is less complex than evolutionary techniques, computes schedules in less time, requires less memory, and has fewer tuning parameters. Simulation results show that the average schedules produced by RS are approximately as efficient as PSGA schedules for all cases studied and clearly more efficient than PSGA for certain cases. The standard formulation for the scheduling problem addressed in this paper is Rm|prec|Cmax.
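The randomized task ordering at the heart of RS, a topological sort that can emit any precedence-respecting order with nonzero probability, can be sketched as follows (a generic illustration; the paper's heuristic mapping from order to machine schedule is not shown):

```python
import random

def random_topological_order(preds, seed=None):
    """Randomized topological sort: at each step, pick uniformly among the
    tasks whose predecessors have all been scheduled. Assumes a DAG, given
    as a mapping task -> set of predecessor tasks."""
    rng = random.Random(seed)
    preds = {t: set(p) for t, p in preds.items()}   # work on a copy
    order = []
    while preds:
        ready = [t for t, p in preds.items() if not p]  # no pending predecessors
        t = rng.choice(ready)
        order.append(t)
        del preds[t]
        for p in preds.values():
            p.discard(t)                            # t is now scheduled
    return order
```

Because the choice among ready tasks is uniform at every step, repeated calls sample the space of valid orders, which is exactly what a succession of randomized orderings needs.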

Wayne F. Boyer; Gurdeep S. Hura

2005-09-01

187

The scheduling limits for a typical long year, taking into account technical stops, machine development, and special physics runs, are presented. An attempt is then made to outline a ten-year post-LS1 schedule taking into account the disparate requirements outlined in the previous talks in this session. The demands on the planned long shutdowns and the impact of these demands on their proposed length will be discussed. The option of using ion running as a pre-shutdown cool-down period will be addressed.

Lamont, M

2014-01-01

188

NASA Astrophysics Data System (ADS)

The new schedule for Space Shuttle missions and expendable launch vehicles (ELV's) calls for a 7-month delay in sending up the Hubble Space Telescope. NASA was forced to put off launching the telescope until February 1990 to keep the Magellan and Galileo missions within their narrow launch windows. The first post-Challenger shuttle launch is now scheduled for late this month. Discovery's most recent delays were due to a hydrogen leak discovered July 29 that has still not been corrected and an engine valve malfunction during an August 4 test fire.

Wainger, Lisa A.

189

Dynamic airline scheduling and robust airline schedule de-peaking

Demand stochasticity is a major challenge for the airlines in their quest to produce profit maximizing schedules. Even with an optimized schedule, many flights have empty seats at departure, while others suffer a lack of ...

Jiang, Hai, 1979-

2006-01-01

190

NASA Astrophysics Data System (ADS)

Advanced Planning and Scheduling (APS) has been widely recognized as a promising method for solving real production planning and scheduling problems. Based on the proposal of a real-time job shop scheduling mechanism under an APS environment, which adopts the Lagrangean relaxation method as the optimization logic, the present paper describes a feasibility study of this mechanism by evaluating its calculation speed and re-scheduling quality. Numerical experiments have been carried out for various models having different scales, as well as different densities and strengths of random events, such as the arrival of new jobs or changes to the due dates for existing jobs. The results of experiments show that the proposed scheduling mechanism has the potential to satisfy the real-time scheduling requirements, not only in terms of calculation speed and solution quality, but also with respect to predictability of the calculation load. Finally, an improvement to the Lagrangean relaxation method is proposed to improve re-scheduling quality.

Shin, Kaikou; Kuroda, Mitsuru; Natsuyama, Kouichi

191

Minimizing the number of tardy jobs for the single machine scheduling problem: MIP-based lower and

We consider the problem of scheduling n jobs on a single machine. A fixed processing time and an execution interval are associated with each job. Preemption is not allowed. The objective is to find a feasible job sequence that minimizes the number of tardy jobs.

Briand, Cyril

192

Purpose: To determine the effect of dose and fractionation schedule of prophylactic cranial irradiation (PCI) on the incidence of chronic neurotoxicity (CNt) and changes in quality of life for selected patients with limited-disease small-cell lung cancer (LD SCLC). Methods and Materials: Patients with LD SCLC who achieved a complete response after chemotherapy and thoracic irradiation were eligible for randomization to undergo PCI to a total dose of 25 Gy in 10 daily fractions (Arm 1) vs. the experimental cohort of 36 Gy. Those receiving 36 Gy underwent a secondary randomization between 18 daily fractions (Arm 2) and 24 twice-daily fractions (Arm 3). Enrolled patients participated in baseline and follow-up neuropsychological test batteries along with quality-of-life assessments. Results: A total of 265 patients were accrued, with 131 in Arm 1, 67 in Arm 2, and 66 in Arm 3 being eligible. A total of 112 patients (42.2%) were alive, with a median follow-up of 25.3 months. There were no significant baseline differences among groups regarding quality-of-life measures and one of the neuropsychological tests, namely the Hopkins Verbal Learning Test. However, at 12 months after PCI there was a significant increase in the occurrence of CNt in the 36-Gy cohort (p = 0.02). Logistic regression analysis revealed increasing age to be the most significant predictor of CNt (p = 0.005). Conclusions: Because of the increased risk of developing CNt in study patients with 36 Gy, a total PCI dose of 25 Gy remains the standard of care for patients with LD SCLC attaining a complete response to initial chemoradiation.

Wolfson, Aaron H., E-mail: awolfson@med.miami.edu [University of Miami Miller School of Medicine, Miami, FL (United States); Bae, Kyounghwa [Department of Statistics, Radiation Therapy Oncology Group, Philadelphia, PA (United States); Komaki, Ritsuko; Meyers, Christina [M. D. Anderson Cancer Center, Houston, TX (United States); Movsas, Benjamin [Henry Ford Health System, Detroit, MI (United States); Le Pechoux, Cecile [Institute Gustave Roussy, Villejuif (France); Werner-Wasik, Maria [Thomas Jefferson Medical College, Philadelphia, PA (United States); Videtic, Gregory M.M. [Cleveland Clinic, Cleveland, OH (United States); Garces, Yolanda I. [Mayo Clinic, Rochester, MN (United States); Choy, Hak [University of Texas Southwestern, Dallas, TX (United States)

2011-09-01

193

Automated Long - Term Scheduling for the SOFIA Airborne Observatory

NASA Technical Reports Server (NTRS)

The NASA Stratospheric Observatory for Infrared Astronomy (SOFIA) is a joint US/German project to develop and operate a gyro-stabilized 2.5-meter telescope in a Boeing 747SP. SOFIA's first science observations were made in December 2010. During 2011, SOFIA accomplished 30 flights in the "Early Science" program as well as a deployment to Germany. The new observing period, known as Cycle 1, is scheduled to begin in 2012. It includes 46 science flights grouped in four multi-week observing campaigns spread through a 13-month span. Automation of the flight scheduling process offers a major challenge to the SOFIA mission operations: first, because it is needed to mitigate SOFIA's relatively high cost per unit observing time compared to space-borne missions; second, because automated scheduling techniques available for ground-based and space-based telescopes are inappropriate for an airborne observatory. Although serious attempts have been made in the past to solve part of the problem, until recently mission operations staff were still manually scheduling flights. We present in this paper a new automated solution for generating SOFIA long-term schedules that will be used in operations from the Cycle 1 observing period. We describe the constraints that should be satisfied to solve the SOFIA scheduling problem in the context of real operations. We establish key formulas required to efficiently calculate the aircraft course over ground when evaluating flight schedules. We describe the foundations of the SOFIA long-term scheduler, the constraint representation, and the random search based algorithm that generates observation and instrument schedules. Finally, we report on how the new long-term scheduler has been used in operations to date.
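One ingredient of any course-over-ground calculation is the initial great-circle bearing between two points. The standard spherical-trigonometry sketch below is not taken from the paper; it ignores wind, which a real flight planner must account for.

```python
import math

def initial_course(lat1, lon1, lat2, lon2):
    """Initial great-circle course from point 1 to point 2, in degrees
    clockwise from true north. Latitudes/longitudes are in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0   # normalize to [0, 360)
```

The course changes continuously along a great circle (except along the equator or a meridian), which is one reason evaluating an airborne observatory's schedule is harder than pointing a fixed telescope.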

Civeit, Thomas

2013-01-01

194

Ongoing behavior predicts perceptual report of interval duration

The ability to estimate the passage of time is essential for adaptive behavior in complex environments. Yet, it is not known how the brain encodes time over the durations necessary to explain animal behavior. Under temporally structured reinforcement schedules, animals tend to develop temporally structured behavior, and interval timing has been suggested to be accomplished by learning sequences of behavioral states. If this is true, trial to trial fluctuations in behavioral sequences should be predictive of fluctuations in time estimation. We trained rodents in an duration categorization task while continuously monitoring their behavior with a high speed camera. Animals developed highly reproducible behavioral sequences during the interval being timed. Moreover, those sequences were often predictive of perceptual report from early in the trial, providing support to the idea that animals may use learned behavioral patterns to estimate the duration of time intervals. To better resolve the issue, we propose that continuous and simultaneous behavioral and neural monitoring will enable identification of neural activity related to time perception that is not explained by ongoing behavior. PMID:24672473

Gouvêa, Thiago S.; Monteiro, Tiago; Soares, Sofia; Atallah, Bassam V.; Paton, Joseph J.

2014-01-01

195

Flexibility for Vocational Education through Computer Scheduling. Quarterly Report.

ERIC Educational Resources Information Center

This progress report of a 2-year project (ending April 30, 1968) offers a random sampling of course schedule configurations and specific course performance criteria submitted to the Stanford project staff for evaluation and comment, and a brief statement of the project's data collection and data evaluation objectives. The project seeks to…

Allen, Dwight W.

196

Scheduling techniques in the Request Oriented Scheduling Engine (ROSE)

NASA Technical Reports Server (NTRS)

Scheduling techniques in the ROSE are presented in the form of the viewgraphs. The following subject areas are covered: agenda; ROSE summary and history; NCC-ROSE task goals; accomplishments; ROSE timeline manager; scheduling concerns; current and ROSE approaches; initial scheduling; BFSSE overview and example; and summary.

Zoch, David R.

1991-01-01

197

Vassal: Loadable Scheduler Support for Multi-Policy Scheduling

This paper presents Vassal, a system that enables applications to dynamically load and unload CPU scheduling policies into the operating system kernel, allowing multiple policies to be in effect simultaneously. With Vassal, applications can utilize scheduling algorithms tailored to their specific needs and general-purpose operating systems can support a wide variety of special-purpose scheduling policies without implementing each of them

George M. Candea; Michael B. Jones

1998-01-01

198

We consider a scheduling problem concerning a set of jobs {J1, J2,..., Jn} in which the job Ji requests Ci units of computation time every Ti units of time periodically. These jobs are to be executed by a timeshared single-processor computing system. It is assumed that the requests for each job arrive at the system at the beginning of the
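The abstract above describes the classic periodic-task model, for which Liu and Layland's utilization-based feasibility tests are the standard starting point. A minimal sketch of those tests, with illustrative (Ci, Ti) values that are not taken from this paper:

```python
# Utilization-based schedulability checks for periodic tasks (C_i, T_i).
# Task values below are illustrative, not from the cited paper.

def utilization(tasks):
    """Total processor utilization U = sum(C_i / T_i)."""
    return sum(c / t for c, t in tasks)

def edf_schedulable(tasks):
    """Earliest-deadline-first schedules any periodic task set with U <= 1."""
    return utilization(tasks) <= 1.0

def rm_schedulable(tasks):
    """Sufficient (not necessary) rate-monotonic test: U <= n(2^(1/n) - 1)."""
    n = len(tasks)
    return utilization(tasks) <= n * (2 ** (1 / n) - 1)

tasks = [(1, 4), (2, 6), (1, 8)]   # (C_i, T_i) pairs
print(utilization(tasks))          # 0.7083...
print(edf_schedulable(tasks))      # True
print(rm_schedulable(tasks))       # True: 0.708 <= 3*(2^(1/3)-1) ~ 0.7798
```

The EDF bound is tight, while the rate-monotonic bound is only sufficient; a task set failing the RM test may still be schedulable.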

C. L. Liu; Jane W.-S. Liu; Arthur L. Liestman

1982-01-01

199

Daytona Beach Activities Schedule

Highlights · Daytona Beach · Activities Schedule · Birthdays · Manners TheELIWeekly Daytona Beach A day of fun in the sun! WHAT: Come for a day of surf, sun, and sand at world famous Daytona Beach. We will drive to the beach in the morning and spend the day sunning, swimming, and having fun! After the beach

Pilyugin, Sergei S.

200

Daytona Beach Activities Schedule

Highlights · Daytona Beach · Activities Schedule · Birthdays · Manners TheELIWeekly Daytona Beach A day of fun in the sun! WHAT: Come for a day of surf, sun, and sand at world famous Daytona Beach. We will drive to the beach in the morning and spend the day sunning, swimming, and having fun! WHEN: Saturday

Pilyugin, Sergei S.

201

CMS multicore scheduling strategy

NASA Astrophysics Data System (ADS)

In the next years, processor architectures based on much larger numbers of cores will most likely be the model to continue "Moore's Law" style throughput gains. This not only results in many more jobs running the LHC Run 1 era monolithic applications in parallel, but the memory requirements of these processes also push the worker-node architectures to the limit. One solution is parallelizing the application itself, through forking and memory sharing or through threaded frameworks. CMS is following all of these approaches and has a comprehensive strategy to schedule multicore jobs on the GRID based on the glideinWMS submission infrastructure. The main component of the scheduling strategy, a pilot-based model with dynamic partitioning of resources that allows the transition to multicore or whole-node scheduling without disallowing the use of single-core jobs, is described. This contribution also presents the experiences made with the proposed multicore scheduling schema and gives an outlook of further developments working towards the restart of the LHC in 2015.

Pérez-Calero Yzquierdo, Antonio; Hernández, Jose; Holzman, Burt; Majewski, Krista; McCrea, Alison; Cms Collaboration

2014-06-01

202

Scheduling Nonlinear Computational Loads

Computation time at each node is nonlinear in the size of the assigned load. Optimal load allocation and speedup for simultaneous load distribution with a quadratic nonlinearity are obtained using simple equations. An iterative…

Robertazzi, Thomas G.

203

…as an index of the aversive stimulation at rich-lean transitions and as an analog measure of aberrant behavior in clinical settings. Experiment 1 investigated the effects of arranging rich-lean transitions between variable-ratio (VR) schedules or random...

Brewer, Adam Thornton

2012-08-31

204

Finding Nested Common Intervals Efficiently Guillaume Blin

Guillaume Blin, David Faye, Jens Stoye. Abstract: In this paper, we study the problem of efficiently finding gene clusters formalized by nested common intervals. We reduce this complexity by providing a quadratic time algorithm computing…

Boyer, Edmond

205

Common Intervals of Trees Steffen Heber

Steffen Heber, Dept. of Computer Science, N.C. State University. Keywords: problems, algorithms, labelled trees. Introduction: In this paper we consider the problem of finding common intervals of trees, a generalization of the concept of common intervals in permutations. For a permutation…

Savage, Carla D.

206

HyperStat Online: Confidence Intervals

NSDL National Science Digital Library

This resource gives a thorough definition of confidence intervals. It shows the user how to compute confidence intervals and how to interpret them, and goes into detail on how to construct a confidence interval for the difference between means, correlations, and proportions. It also gives a detailed explanation of Pearson's correlation and includes exercises for the user.

Lane, David M.

2009-10-16

207

47 CFR 52.35 - Porting Intervals.

Code of Federal Regulations, 2010 CFR

...2010-10-01 2010-10-01 false Porting Intervals. 52.35 Section 52.35 Telecommunication...Number Portability § 52.35 Porting Intervals. (a) All telecommunications...as described in the 2009 LNP Porting Interval Order, must comply with this...

2010-10-01

208

Efficient periodic scheduling by trees

In a perfectly periodic schedule, time is divided into time slots, and each client gets a time slot precisely every predefined number of time slots. The input to a schedule design algorithm is a frequency request for each client, and its task is to construct a perfectly periodic schedule that matches the requests as "closely" as possible.

Amotz Bar-noy; Boaz Patt-shamir; Vladimir Dreizin

2002-01-01

209

Efficient algorithms for periodic scheduling

In a perfectly periodic schedule, time is divided into time slots, and each client gets a slot precisely every predefined number of time slots. The input to a schedule design algorithm is a frequency request for each client, and its task is to construct a perfectly periodic schedule that matches the requests as "closely" as possible. The quality of the
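A minimal sketch of one way such a schedule can be built, restricted to the special case where all requested periods are powers of two and the total bandwidth satisfies Σ 1/pᵢ ≤ 1 (the case the tree-based constructions in this literature reduce to). The greedy offset assignment below is illustrative, not the authors' algorithm:

```python
def perfectly_periodic_schedule(periods):
    """Assign each client i a slot exactly every periods[i] slots.
    Assumes power-of-two periods with total bandwidth sum(1/p) <= 1."""
    assert all(p > 0 and p & (p - 1) == 0 for p in periods), "need powers of two"
    assert sum(1.0 / p for p in periods) <= 1.0, "bandwidth must not exceed 1"
    length = max(periods)
    schedule = [None] * length          # one full cycle of the schedule
    # Place clients with shorter periods first (standard tree-embedding order).
    for client in sorted(range(len(periods)), key=lambda i: periods[i]):
        p = periods[client]
        for offset in range(p):
            if all(schedule[s] is None for s in range(offset, length, p)):
                for s in range(offset, length, p):
                    schedule[s] = client
                break
        else:
            raise RuntimeError("no free offset (cannot happen under the assumptions)")
    return schedule

print(perfectly_periodic_schedule([2, 4, 4]))  # [0, 1, 0, 2]
```

Each client i then recurs with spacing exactly periods[i] when the returned cycle is repeated forever, which is the "perfectly periodic" property.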

Amotz Bar-noy; Vladimir Dreizin; Boaz Patt-shamir

2004-01-01

210

Flexible Scheduling: Making the Transition

ERIC Educational Resources Information Center

Citing literature that supports the benefits of flexible scheduling on student achievement, the author exhorts readers to campaign for flexible scheduling in their library media centers. She suggests tips drawn from the work of Graziano (2002), McGregor (2006) and Stripling (1997) for making a smooth transition from fixed to flexible scheduling:…

Creighton, Peggy Milam

2008-01-01

211

Computer Systems II CPU Scheduling

Computer Systems II: CPU Scheduling. Review of process states; OS queuing model. When does a scheduler take decisions? When a process: 1. switches from running to waiting state; 2. switches from running to ready state; 3. switches from waiting to ready; 4. terminates. Scheduling under 1…

212

Surprise Benefits of Arena Scheduling

ERIC Educational Resources Information Center

One of the most challenging tasks a principal must accomplish every year is the construction of the master schedule. Free from the magnetic scheduling boards and wall charts of yesteryear, principals now have technological tools--such as programs that offer schools solutions for their scheduling needs--that can save time and enable them to work…

Surloff, Andrew

2008-01-01

213

State-based scheduling: An architecture for telescope observation scheduling

NASA Technical Reports Server (NTRS)

The applicability of constraint-based scheduling, a methodology previously developed and validated in the domain of factory scheduling, is extended to problem domains that require attendance to a wider range of state-dependent constraints. The problem of constructing and maintaining a short-term observation schedule for the Hubble Space Telescope (HST), which typifies this type of domain is the focus of interest. The nature of the constraints encountered in the HST domain is examined, system requirements are discussed with respect to utilization of a constraint-based scheduling methodology in such domains, and a general framework for state-based scheduling is presented.

Muscettola, Nicola; Smith, Stephen F.

1989-01-01

214

DYNAS3: a dynamic scheduling system for job scheduling

NASA Astrophysics Data System (ADS)

This paper is concerned with the development of a dynamic scheduling system for job scheduling, named DYNAS3. Though many scheduling criteria for job scheduling have been proposed, most of them are impractical for use in the low-volume/high-variety manufacturing environment. DYNAS3, developed using client/server computing technology, targets job scheduling in exactly this environment. The system provides a practical facility for job scheduling that takes into account the influence of many factors, such as machine setup times, cell changes, replacement machines, and load balancing between machines. Numerical testing examples are taken from a real manufacturing factory in Japan, and high-quality results are generated efficiently.

Zhang, YaoXue; Li, GuangJie; Di, Shuo; Cheng, Hua; Cheng, KangFu

1995-08-01

215

RVLS Confidence Interval on a Proportion

NSDL National Science Digital Library

This applet allows you to explore the validity of confidence intervals on a proportion with various values for sample size (N) and population proportion (Pi). After you specify N, Pi, the level of confidence, and the number of simulations you wish to perform, the applet samples data according to your specification and computes a confidence interval for each simulation. The proportion of simulations for which the confidence interval contains Pi is recorded. If the method for constructing confidence intervals is valid, then about 95% of the 95% confidence intervals should contain Pi.
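The coverage experiment the applet runs can be reproduced in a few lines. The sketch below assumes the textbook Wald interval, which may differ from the interval the applet actually constructs:

```python
import math
import random

def wald_interval(successes, n, z=1.96):
    """Nominal 95% Wald interval for a proportion (an assumed choice of
    method; the applet's construction may differ)."""
    p_hat = successes / n
    half = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - half, p_hat + half

def coverage(pi, n, sims=10_000, seed=1):
    """Fraction of simulated intervals that contain the true proportion Pi."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        successes = sum(rng.random() < pi for _ in range(n))
        lo, hi = wald_interval(successes, n)
        hits += lo <= pi <= hi
    return hits / sims

print(coverage(0.5, 100))  # close to 0.95 for moderate n and Pi near 0.5
```

As the abstract notes, a valid 95% method should cover Pi in about 95% of simulations; trying small n or extreme Pi shows where that breaks down.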

Lane, David M.

2009-01-06

216

Schedule Verification and Synthesis for Embedded Real-Time Components

In this paper we address the problems of schedule synthesis and timing verification for component-based architectures in embedded systems. We consider a component to be a set of tasks with response times that lie within specified intervals. When a set of components is deployed to implement a desired functionality, we want to guarantee that the components can achieve the timing

Purandar Bhaduri

217

On the Hardness of Subset Sum Problem from Different Intervals

NASA Astrophysics Data System (ADS)

The subset sum problem, which is often called as the knapsack problem, is known as an NP-hard problem, and there are several cryptosystems based on the problem. Assuming an oracle for shortest vector problem of lattice, the low-density attack algorithm by Lagarias and Odlyzko and its variants solve the subset sum problem efficiently, when the “density” of the given problem is smaller than some threshold. When we define the density in the context of knapsack-type cryptosystems, weights are usually assumed to be chosen uniformly at random from the same interval. In this paper, we focus on general subset sum problems, where this assumption may not hold. We assume that weights are chosen from different intervals, and make analysis of the effect on the success probability of above algorithms both theoretically and experimentally. Possible application of our result in the context of knapsack cryptosystems is the security analysis when we reduce the data size of public keys.
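For illustration, the "density" in this line of work (due to Lagarias and Odlyzko) is d = n / log₂(maxᵢ aᵢ), and the low-density attacks apply when d is below a threshold. A small sketch comparing weights drawn from a single interval with weights drawn from different intervals; the interval bounds are arbitrary choices for illustration, not values from the paper:

```python
import math
import random

def density(weights):
    """Lagarias-Odlyzko density of a subset sum instance: n / log2(max weight)."""
    return len(weights) / math.log2(max(weights))

rng = random.Random(0)
n = 50
# Uniform weights from a single interval (the usual knapsack-cryptosystem setting)...
same = [rng.randrange(1, 2 ** 100) for _ in range(n)]
# ...versus weights drawn from different intervals, as studied in the paper.
mixed = [rng.randrange(1, 2 ** rng.choice([60, 80, 100])) for _ in range(n)]
print(density(same), density(mixed))
```

With n = 50 and 100-bit weights the density is about 0.5, below the classical attack threshold; the paper analyzes how drawing weights from different intervals changes the attacks' success probability.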

Kogure, Jun; Kunihiro, Noboru; Yamamoto, Hirosuke

218

Transportation Baseline Schedule

The “1999 National Transportation Program - Transportation Baseline Report” presents data that form a baseline to enable analysis and planning for future Department of Energy (DOE) Environmental Management (EM) waste/material transportation. The companion “1999 Transportation ‘Barriers’ Analysis” analyzes the data and identifies existing and potential problems that may prevent or delay transportation activities based on the data presented. The “1999 Transportation Baseline Schedule” (this report) uses the same data to provide an overview of the transportation activities of DOE EM waste/materials. This report can be used to identify areas where stakeholder interface is needed, and to communicate to stakeholders the quantity/schedule of shipments going through their area. Potential bottlenecks in the transportation system can be identified; the number of packages needed, and the capacity needed at receiving facilities can be planned. This report offers a visualization of baseline DOE EM transportation activities for the 11 major sites and the “Geologic Repository Disposal” site (GRD).

Fawcett, Ricky Lee; John, Mark Earl

2000-01-01

219

Dynamic multicommodity flow schedules

NASA Astrophysics Data System (ADS)

Some new results in the scheduling of dynamic multicommodity flows in data communication networks are presented. A new performance measure for effective delivery of backlogged data to their destinations is defined and the solution to the resulting delivery problem is obtained through a sequential linear optimization methodology. Properties of an optimal dynamic multicommodity flow schedule are studied in detail, taking advantage where possible of the linear programming formulation. The special case of the delivery problem in a single destination network also is analyzed. Application of the results to stochastic delivery problems in which the data inputs to the network are modelled as Poisson processes is addressed, and a new dynamic data communication network analysis is presented. Finally, the delivery problem on networks with capacitated links and with traversal delays is considered and some new results obtained.

Feit, A.

1981-12-01

220

INTERVAL_ARITHMETIC: A Fortran 90 Module for an Interval Data Type

R. Baker Kearfott. …ones are machine-specific, obsolete and unsupported, for languages other than Fortran, or commercial. The Fortran 90 module INTERVAL_ARITHMETIC provides a portable interval data type in Fortran…

Kearfott, R. Baker

221

Interval estimation for statistical control

In order to develop meaningful statistical control limits based upon the normal distribution, the mean and the standard deviation of a population must be known. When an accurate approximation of the standard deviation is not known, reasonable statistical control limits can be calculated using the Student's t-distribution. The Student's t-value for n-1 degrees of freedom (n is the sample size) and the confidence levels that correspond to 2σ and 3σ (95.45% and 99.73%, respectively) is used to determine control limits from a relatively small data set. The control limits with this method are obtained by multiplying the calculated standard deviation of the data set by the appropriate t-value for the specified confidence level and degrees of freedom. While the t-distribution method is needed primarily for small sample sizes (n < 30), it approaches the normal distribution with increasing n, and hence can be used for a sample of any size.
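A minimal sketch of the computation described above. The t critical value is passed in by hand (from tables or scipy.stats.t.ppf) rather than computed, and the 2.5 used below is only a rough value for 7 degrees of freedom near the 2σ level; the data are hypothetical:

```python
import statistics

def control_limits(data, t_value):
    """Control limits = mean +/- t * s, where t is the Student's t critical
    value for n-1 degrees of freedom at the chosen confidence level.
    Note the abstract multiplies s itself, not s/sqrt(n)."""
    m = statistics.mean(data)
    s = statistics.stdev(data)       # sample standard deviation (n-1 divisor)
    return m - t_value * s, m + t_value * s

data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3]   # hypothetical readings
# Rough t-value for df=7 near the 2-sigma (95.45%) level, for illustration only
lo, hi = control_limits(data, 2.5)
print(round(lo, 3), round(hi, 3))   # ~9.5 and ~10.5
```

Here the sample mean is 10.0 and the sample standard deviation 0.2, so the illustrative limits are 10.0 ± 2.5 × 0.2.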

ABDURRAHAM, N.M.

2002-08-22

222

Schedule for Interdisciplinary Oceanography

NSDL National Science Digital Library

This class schedule, submitted by Dr. Robert Stewart of Texas A & M University, lays out the topics of discussion by day of class. One of the first days is devoted to a problem-based learning module in which the students download information about the 12/26/04 tsunami from the web and form opinions about what happened and why. The next day is devoted to the relationships between earthquakes, geologic setting, and tsunamis, as well as early warning systems.

Robert H Stewart

223

[Figure 2.1: STAPL overview — an adaptive framework comprising a scheduler, performance monitor, ARMI communication library, pContainers, dependence patterns, PARAGRAPH, and executor, over OpenMP, MPI, and native backends.] Related work: there are a large number…

Sharma, Shishir

2013-05-07

224

Methods of sample size planning are developed from the accuracy in parameter estimation approach in the multiple regression context in order to obtain a sufficiently narrow confidence interval for the population squared multiple correlation coefficient when regressors are random. Approximate and exact methods are developed that provide the necessary sample size so that the expected width of the confidence interval will be

Ken Kelley

2008-01-01

225

Customer Appeasement Scheduling

Almost all of the process scheduling algorithms used in modern operating systems (OS) have their roots in the classical scheduling paradigms developed during the 1970's. But modern computers have different types of software loads and user demands. We think it is important to run what the user wants at the current moment. A user can be a human sitting in front of a desktop machine, or another machine sending a request to a server through a network connection. The OS should become intelligent enough to distinguish between different processes and allocate resources, including CPU, to the processes that need them most. In this work, as a first step to make the OS aware of the current state of the system, we consider process dependencies and interprocess communications. We are developing a model, which considers the need to satisfy interactive users and other possible remote users or customers, by making scheduling decisions based on process dependencies and interproce...

Nikseresht, Mohammad R; Maheshwari, Anil

2010-01-01

226

Automated Platform Management System Scheduling

NASA Technical Reports Server (NTRS)

The Platform Management System was established to coordinate the operation of platform systems and instruments. The management functions are split between ground and space components. Since platforms are to be out of contact with the ground more than the manned base, the on-board functions are required to be more autonomous than those of the manned base. Under this concept, automated replanning and rescheduling, including on-board real-time schedule maintenance and schedule repair, are required to effectively and efficiently meet Space Station Freedom mission goals. In a FY88 study, we developed several promising alternatives for automated platform planning and scheduling. We recommended both a specific alternative and a phased approach to automated platform resource scheduling. Our recommended alternative was based upon use of exactly the same scheduling engine in both ground and space components of the platform management system. Our phased approach recommendation was based upon evolutionary development of the platform. In the past year, we developed platform scheduler requirements and implemented a rapid prototype of a baseline platform scheduler. Presently we are rehosting this platform scheduler rapid prototype and integrating the scheduler prototype into two Goddard Space Flight Center testbeds, as the ground scheduler in the Scheduling Concepts, Architectures, and Networks Testbed and as the on-board scheduler in the Platform Management System Testbed. Using these testbeds, we will investigate rescheduling issues, evaluate operational performance and enhance the platform scheduler prototype to demonstrate our evolutionary approach to automated platform scheduling. The work described in this paper was performed prior to Space Station Freedom rephasing, transfer of platform responsibility to Code E, and other recently discussed changes. We neither speculate on these changes nor attempt to predict the impact of the final decisions. 
As a consequence, some of our work and results may be outdated when this paper is published.

Hull, Larry G.

1990-01-01

227

Energy-optimal schedules of real-time jobs with hard deadlines

Algorithm 3 Optimize jobs
1: function Optimize(job Ji)
2:   if Stack is empty then
3:     Schedule Ji in the interval [0, di] at frequency fi = xi/(di − 0)
4:     Push(Ji) to Stack
5:   else
6:     JPREVIOUS ← Stack.Pop()
7:     si = dPREVIOUS
8:     if xi/(di − si) > fPREVIOUS then
9:       JPREVIOUS = Merge(Ji, JPREVIOUS)
10:      Optimize(JPREVIOUS)
11:    else
12:      Push(JPREVIOUS) to Stack
13:      Schedule Ji in the interval [si, di] at frequency fi = xi/(di − si)
14:      Push(Ji) to Stack
Algorithm 4 Merge jobs
1: function Merge(Ji, JPREVIOUS)
2: Create…
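The Optimize/Merge scheme excerpted above can be read as a stack-based "pool adjacent violators" pass over jobs sorted by deadline. A runnable sketch under assumptions the excerpt does not fully pin down (all jobs released at time 0; Merge sums the work of the two jobs and keeps the later deadline — the Merge body is truncated in the excerpt):

```python
def optimal_frequencies(jobs):
    """jobs: (work x_i, deadline d_i) pairs sorted by increasing deadline,
    all assumed released at time 0. Returns (start, end, frequency) segments
    with non-increasing frequency, which minimizes energy for any convex
    power-vs-frequency curve."""
    stack = []  # segments stored as [total_work, start_time, deadline]
    for x, d in jobs:
        seg = [x, stack[-1][2] if stack else 0, d]
        # Merge (pop) while this segment would need a higher frequency
        # than the segment below it on the stack.
        while stack and seg[0] / (seg[2] - seg[1]) > stack[-1][0] / (stack[-1][2] - stack[-1][1]):
            prev = stack.pop()
            seg = [prev[0] + seg[0], prev[1], seg[2]]
        stack.append(seg)
    return [(s, d, x / (d - s)) for x, s, d in stack]

print(optimal_frequencies([(4, 2), (1, 4)]))  # [(0, 2, 2.0), (2, 4, 0.5)]
print(optimal_frequencies([(1, 2), (5, 4)]))  # merged into one segment: [(0, 4, 1.5)]
```

In the second example the later job's required frequency (5/2) exceeds the earlier segment's (1/2), so the two jobs are merged and run together at the averaged frequency 6/4.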

George, John Vijoe

2005-11-01

228

1993 Wholesale Power and Transmission Rate Schedules.

Bonneville Power Administration 1993 Wholesale Power Rate Schedules and General Rate Schedule Provisions and 1993 Transmission Rate Schedules and General Transmission Rate Schedule Provisions, contained herein, were approved on an interim basis effective October 1, 1993. These rate schedules and provisions were approved by the Federal Energy Regulatory Commission, United States Department of Energy, in September 1993. These rate schedules and provisions supersede the Administration's Wholesale Power Rate Schedules and General Rate Schedule Provisions and Transmission Rate Schedules and General Transmission Rate Schedule Provisions effective October 1, 1991.

US Bonneville Power Administration

1993-10-01

229

Record-breaking earthquake intervals in a global catalogue and an aftershock sequence

NASA Astrophysics Data System (ADS)

For the purposes of this study, an interval is the elapsed time between two earthquakes in a designated region; the minimum magnitude for the earthquakes is prescribed. A record-breaking interval is one that is longer (or shorter) than all preceding intervals; a starting time must be specified. We consider global earthquakes with magnitudes greater than 5.5 and show that the record-breaking intervals are well estimated by a Poissonian (random) theory. We also consider the aftershocks of the 2004 Parkfield earthquake and show that the record-breaking intervals are approximated by very different statistics. In both cases, we calculate the number of record-breaking intervals (nrb) and the record-breaking interval durations Δtrb as a function of "natural time", the number of elapsed events. We also calculate the ratio of record-breaking long intervals to record-breaking short intervals as a function of time, r(t), which is suggested to be sensitive to trends in noisy time series data. Our data indicate a possible precursory signal to large earthquakes that is consistent with accelerated moment release (AMR) theory.
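Counting record-breaking intervals in "natural time" as described can be sketched as follows. The Poisson comparison uses the classical result that the expected number of records among n i.i.d. intervals is the harmonic number, roughly ln n + γ; the simulated catalog is synthetic, not earthquake data:

```python
import math
import random

def record_breaking_counts(event_times):
    """Count record-breaking long and short inter-event intervals,
    scanning in 'natural time' (number of elapsed events)."""
    intervals = [b - a for a, b in zip(event_times, event_times[1:])]
    longest = shortest = intervals[0]
    n_long = n_short = 1          # the first interval is a record by definition
    for dt in intervals[1:]:
        if dt > longest:
            longest, n_long = dt, n_long + 1
        elif dt < shortest:
            shortest, n_short = dt, n_short + 1
    return n_long, n_short

# For a Poisson (random) process, the expected number of records among
# n intervals grows like ln(n) + Euler-Mascheroni constant (~0.5772).
rng = random.Random(42)
times = [0.0]
for _ in range(10_000):
    times.append(times[-1] + rng.expovariate(1.0))
n_long, n_short = record_breaking_counts(times)
print(n_long, n_short, round(math.log(10_000) + 0.5772, 1))
```

Departures of the observed counts (or of the long/short ratio r(t)) from this logarithmic growth are what the abstract interprets as non-Poissonian structure.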

Yoder, M. R.; Turcotte, D. L.; Rundle, J. B.

2010-03-01

230

Probability Distribution for Flowing Interval Spacing

The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term "flowing interval spacing" as opposed to fracture spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but does not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled "Probability Distribution for Flowing Interval Spacing" (CRWMS M&O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) "Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses" (CRWMS M&O 1999a) and (2) "Incorporation of Heterogeneity in SZ Flow and Transport Analyses" (CRWMS M&O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data.
Because each flowing interval probably has more than one fracture contributing to it, the true flowing interval spacing could be less than the spacing determined in this analysis. Therefore, in terms of repository performance, the results of this analysis may underestimate the effect of matrix diffusion processes in SZ transport models. In summary, performance analysis will be conservative if the flowing interval spacing determined by this study is used in the simulation of mass transport in the saturated zone instead of the fracture spacing.
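A minimal sketch of the midpoint-spacing definition used above, with hypothetical flowmeter-survey picks rather than data from this analysis:

```python
def flowing_interval_spacings(intervals):
    """Spacing between midpoints of consecutive flowing intervals,
    given (top_depth, bottom_depth) pairs along a borehole."""
    midpoints = [(top + bottom) / 2 for top, bottom in intervals]
    midpoints.sort()
    return [b - a for a, b in zip(midpoints, midpoints[1:])]

# Hypothetical flowing-interval picks (depths in metres), for illustration only
intervals = [(100, 110), (160, 170), (240, 260)]
print(flowing_interval_spacings(intervals))  # [60.0, 85.0]
```

Pooling such spacings across boreholes gives the sample to which a probability distribution would then be fitted.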

S. Kuzio

2001-05-16

231

Implementation of an adaptive, constant thermodynamic speed simulated annealing schedule

NASA Astrophysics Data System (ADS)

We present a new implementation of simulated annealing. It calculates its own cooling schedule from thermodynamic information computed en route. Our schedule proceeds at constant "thermodynamic speed" to minimize the entropy produced along the way. It is based on finite time thermodynamics and is, in some sense, optimal. We use ensembles over the entire configuration space to compute thermodynamic averages. Ensembles provide good statistics for the thermodynamic process, as well as direct, instantaneous measures of the nature of the configuration space via Hamming distances. We compare our implementation with some others on a random graph problem. It is found to produce superior results.

Ruppeiner, George

1988-09-01

232

A Novel Hybrid Algorithm for Task Graph Scheduling

One of the important problems in multiprocessor systems is task graph scheduling, which is NP-hard. Both learning automata and genetic algorithms are search tools used for solving many NP-hard problems. In this paper, a new hybrid method based on genetic algorithms and learning automata is proposed. The proposed algorithm begins with an initial population of randomly generated chromosomes and, after some stages, each chromosome maps to an automaton. Experimental results show the superiority of the proposed algorithm over current approaches.

Nezhad, Vahid Majid; Efimov, Evgueni

2011-01-01

233

Scheduling manufacturing systems with work-in-process inventory control: Reentrant systems

In this paper, we propose a procedure for production flow control in reentrant manufacturing systems. The system under study consists of N machines and produces M product types simultaneously. Each part goes through the system following a predefined process and may visit a machine many times. All machines are subject to random failures and need random repair times. The scheduling objectives are

S. X. Bai; S. B. Gershwin

1996-01-01

234

Interval Estimation for a Binomial Proportion

We revisit the problem of interval estimation of a binomial proportion. The erratic behavior of the coverage probability of the standard Wald confidence interval has previously been remarked on in the literature (Blyth and Still, Agresti and Coull, Santner and others). We begin by showing that the chaotic coverage properties of the Wald interval are far more persistent than is
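The erratic coverage the authors describe is easy to exhibit: the exact coverage of the nominal 95% Wald interval can be computed by summing binomial probabilities, as in this sketch (the parameter values are illustrative):

```python
import math

def wald_coverage(p, n, z=1.96):
    """Exact coverage of the nominal 95% Wald interval, computed by summing
    the binomial pmf over success counts x whose interval contains p."""
    cover = 0.0
    for x in range(n + 1):
        p_hat = x / n
        half = z * math.sqrt(p_hat * (1 - p_hat) / n)
        if p_hat - half <= p <= p_hat + half:
            cover += math.comb(n, x) * p ** x * (1 - p) ** (n - x)
    return cover

# Coverage oscillates erratically with n instead of staying near 0.95:
for n in (10, 25, 40, 97, 98):
    print(n, round(wald_coverage(0.5, n), 3))
```

For example, at p = 0.5 and n = 10 the "95%" interval covers only about 89% of the time, and the coverage does not improve monotonically as n grows.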

Lawrence D. Brown; T. Tony Cai; Anirban DasGupta

2001-01-01

235

2007 Wholesale Power Rate Schedules : 2007 General Rate Schedule Provisions.

This schedule is available for the contract purchase of Firm Power to be used within the Pacific Northwest (PNW). Priority Firm (PF) Power may be purchased by public bodies, cooperatives, and Federal agencies for resale to ultimate consumers, for direct consumption, and for Construction, Test and Start-Up, and Station Service. Rates in this schedule are in effect beginning October 1, 2006, and apply to purchases under requirements Firm Power sales contracts for a three-year period. The Slice Product is only available for public bodies and cooperatives who have signed Slice contracts for the FY 2002-2011 period. Utilities participating in the Residential Exchange Program (REP) under Section 5(c) of the Northwest Power Act may purchase Priority Firm Power pursuant to the Residential Exchange Program. Rates under contracts that contain charges that escalate based on BPA's Priority Firm Power rates shall be based on the three-year rates listed in this rate schedule in addition to applicable transmission charges. This rate schedule supersedes the PF-02 rate schedule, which went into effect October 1, 2001. Sales under the PF-07 rate schedule are subject to BPA's 2007 General Rate Schedule Provisions (2007 GRSPs). Products available under this rate schedule are defined in the 2007 GRSPs. For sales under this rate schedule, bills shall be rendered and payments due pursuant to BPA's 2007 GRSPs and billing process.

United States. Bonneville Power Administration.

2006-11-01

236

Random root movements in weightlessness

NASA Technical Reports Server (NTRS)

The dynamics of root growth was studied in weightlessness. In the absence of the gravitropic reference direction during weightlessness, root movements could be controlled by spontaneous growth processes, without any corrective growth induced by the gravitropic system. If truly random in nature, the bending behavior should follow so-called 'random walk' mathematics during weightlessness. Predictions from this hypothesis were critically tested. In a Spacelab ESA-experiment, denoted RANDOM and carried out during the IML-2 Shuttle flight in July 1994, the growth of garden cress (Lepidium sativum) roots was followed by time lapse photography at 1-h intervals. The growth pattern was recorded for about 20 h. Root growth was significantly smaller in weightlessness than under gravity (control) conditions. It was found that the roots performed spontaneous movements in weightlessness. The average direction of deviation of the plants consistently stayed equal to zero, despite these spontaneous movements. The average squared deviation increased linearly with time as predicted theoretically (but only for 8-10 h). Autocorrelation calculations showed that bendings of the roots, as determined from the 1-h photographs, were uncorrelated after about a 2-h interval. It is concluded that random processes play an important role in root growth. Predictions from a random walk hypothesis as to the growth dynamics could explain parts of the growth patterns recorded. This test of the hypothesis required microgravity conditions as provided for in a space experiment.

Johnsson, A.; Karlsson, C.; Iversen, T. H.; Chapman, D. K.

1996-01-01

237

Characteristics of aversive thresholds measured by a titration schedule

Three variables were studied for their single and joint effects on aversive thresholds obtained by a titration schedule with electric shock as the aversive stimulus. One variable was the interval between the periodic increments in shock amplitude. Another was the size of this increment. The third was the size of the decrement produced when the subject responded. Both monkeys and humans tolerated more shock at the shorter increment-to-increment intervals than at the longer ones, confirming rat data previously obtained. Reducing the decremental step size also led to an increase in the amount of shock tolerated by monkeys. In addition, simply reducing both incremental and decremental step size without changing the interval between increments led monkeys to tolerate more shock. The attempt to explain these data was based on the changes that such manipulations might produce in the amount of avoidance behavior elicited. PMID:14055954

Weiss, Bernard; Laties, Victor G.

1963-01-01

238

Energy-Saving Service Scheduling for Low-End Cyber-Physical Systems

Energy consumption and timing requirements are two key factors affecting the performance of mission-critical cyber-physical systems. Little work deals with scheduling a set of time-sensitive services in a finite time interval. We consider a server serving N users in the time interval [0, T]. Each user demands that its service be completed by a strict deadline. Based on a convex power-speed relationship, ...

Wei Jiang; Guangze Xiong; Xuyang Ding

2008-01-01

239

Motor and executive control in repetitive timing of brief intervals.

We investigated the causal role of executive control functions in the production of brief time intervals by means of a concurrent task paradigm. To isolate the influence of executive functions on timing from motor coordination effects, we dissociated executive load from the number of effectors used in the dual task situation. In 3 experiments, participants produced isochronous intervals ranging from 524 to 2,000 ms with either the left or the right hand. The concurrent task consisted of the production of either a pseudorandom (high cognitive load) or a simple repeated (low cognitive load) spatial sequence of key presses, while also maintaining a regular temporal sequence. This task was performed with either a single hand (unimanual) or with both hands simultaneously (bimanual). Interference in terms of increased timing variability caused by the concurrent task was observed only in the bimanual condition. We verified that motor coordination in bimanual tasks alone could not account for the interference. Timing interference only appeared when (a) more than 1 effector was involved and (b) there were simultaneous task demands that recruited executive functions. Task interference was not seen if only 1 of these 2 conditions was met. Thus, our results suggest that executive functions are not directly involved in motor timing, but can indirectly affect timing performance when they are required to schedule complex motor coordination. PMID:22731995

Holm, Linus; Ullén, Fredrik; Madison, Guy

2013-04-01

240

NSDL National Science Digital Library

This site offers an explanation of how and why round-robin CPU scheduling works, from the Department of Computer Science at the University of Illinois at Urbana-Champaign. It includes a concise overview and mathematical analysis, an online Java-based interactive simulation of round-robin scheduling, and the Java source code along with an explanation of how the simulation was created and how it works. The site is suitable for technical/community college or university computer science and engineering courses, whether for instructors' lecture notes, in-class demonstrations, or student assignments experimenting with the simulation.
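For readers without access to the Java applet, round-robin's core mechanic (a FIFO ready queue served in fixed time quanta) can be sketched in a few lines. This is our own minimal illustration, not the site's simulation:

```python
from collections import deque

def round_robin(bursts, quantum):
    """Simulate round-robin scheduling for jobs that all arrive at t=0.
    bursts maps job name -> CPU burst time; returns job -> completion time."""
    queue = deque(bursts.items())
    t = 0
    done = {}
    while queue:
        job, remaining = queue.popleft()
        run = min(quantum, remaining)   # run for at most one quantum
        t += run
        if remaining - run > 0:
            queue.append((job, remaining - run))  # back of the queue
        else:
            done[job] = t
    return done
```

For example, `round_robin({"P1": 24, "P2": 3, "P3": 3}, quantum=4)` lets the short jobs finish early (at t=7 and t=10) while the long job cycles until t=30.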

Kon, Fabio

2009-08-14

241

Trajectory scheduled missile autopilot design

An alternate approach to the design of gain-scheduled controllers is given. Standard gain-scheduling typically relies on the instantaneous value of the scheduling variable to update the control gains. This approach has the drawbacks of being limited to slow transitions between operating points and requiring a number of possibly tedious point-by-point designs. The present approach is to have the controller gains

Jeff S. Shamma; James R. Cloutier

1992-01-01

242

NASA Astrophysics Data System (ADS)

This paper combines a verified interval optimization method with the FEM for designing structures; the combination is denominated the Hybrid Interval Genetic Algorithm (HIGA). The algorithm dispenses with explicitly formulated equations and interval analysis while determining the optimum interval parameters. Furthermore, it can also maximize the design scope. In this paper, this algorithm is applied to both a truss and a frame structure. The interval optimizations include the static and dynamic responses of these structures. The results show that the algorithm, which combines the IGA with the FEM, can determine the feasible interval design parameters of structures with allowable objective errors.

Shiau, Ting-Nung; Kang, Chung-Hao; Liu, De-Shin; Hsu, Wei-Chun

243

Randomized Scheduling Algorithm for Data Aggregation in Wireless Sensor Networks

Bridges, tunnels and water supply systems are among the structures monitored by wireless sensor networks. In this context, one of the most challenging applications of WSNs is to transmit the acoustic emission signals and the vibration measurements detected in bridges; in this application, nodes forward their data to their next hop soon after the event is detected. This work ...

Leung, Kin K.

244

COMPASS: An Ada based scheduler

NASA Technical Reports Server (NTRS)

COMPASS is a generic scheduling system developed by McDonnell Douglas and funded by the Software Technology Branch of NASA Johnson Space Center. The motivation behind COMPASS is to illustrate scheduling technology and provide a basis from which custom scheduling systems can be built. COMPASS was written in Ada to promote readability and to conform to DOD standards. COMPASS has some unique characteristics that distinguish it from commercial products. This paper discusses these characteristics and uses them to illustrate some differences between scheduling tools.

Mcmahon, Mary Beth; Culbert, Chris

1992-01-01

245

Patient safety considerations concerning the scheduling of emergency-off system tests.

Emergency-off systems (EOS) are essential to the safe operation of medical accelerators and other high-risk equipment. To assure reliable functioning, some states require weekly tests; others permit monthly, tri-monthly or even six-monthly tests, while some do not specify test intervals. We investigate the relative safety of the various test schedules by computing the fraction of time during which a nonfunctional state of the EOS may remain undetected. Special attention is given to the effect of flexibility (i.e., to regulations that specify the number of tests that have to be done in any given time interval, but allow a range within the interval during which a test can be done). Compared to strict test intervals, a schedule that provides flexibility increases risk only marginally. Performing tests on any arbitrary day of the week when weekly tests are required increases the time span during which a nonfunctionality goes undetected by only 17%, compared to an exact one-week schedule. The same ratio applies for monthly tests. For a three-month schedule, the relative risk increases by only 2% if tests are done on an arbitrarily chosen day during each due-month, compared to tests done on an exact three-month schedule. The most irregular time intervals possible in a three-calendar-month schedule increase the relative risk by 11%. For the six-month and the 12-month schedules the ratio of risks is even smaller. The relative risk is virtually independent of the mean time between failures of the EOS, but the absolute risk decreases in proportion to the mean time between failures. Adherence to strict, resource-intensive test intervals provides little extra safety compared to flexible intervals that require the same number of tests per year. Regulations should be changed to permit the practicality offered by flexible test schedules. An additional increase in patient safety could be achieved by stricter regulations concerning the reliability of emergency-stop (e-stop) systems. PMID:24710445
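The paper's risk measure (the expected time a latent EOS failure goes undetected) can be approximated with a small Monte Carlo sketch. This is our simplified model with uniformly distributed failure times, not the authors' analysis; the percentages quoted above come from their paper:

```python
import random

def mean_undetected_time(test_times, horizon, trials, rng):
    """Monte Carlo estimate: a failure occurs at a uniform random time in
    [0, horizon); return the mean delay until the next scheduled test
    (or the end of the horizon, as a simplification) detects it."""
    ts = sorted(test_times)
    total = 0.0
    for _ in range(trials):
        f = rng.uniform(0.0, horizon)
        nxt = next((t for t in ts if t >= f), horizon)
        total += nxt - f
    return total / trials

rng = random.Random(2)
horizon = 364.0  # days: 52 exact weeks
strict = [7.0 * (k + 1) for k in range(52)]                      # every 7th day
flexible = [7.0 * k + rng.uniform(0.0, 7.0) for k in range(52)]  # any day that week

m_strict = mean_undetected_time(strict, horizon, 20000, rng)
m_flex = mean_undetected_time(flexible, horizon, 20000, rng)
```

A strict weekly schedule gives a mean undetected time of about half the interval (3.5 days); letting each test float within its week increases it, in line with the paper's qualitative finding that flexibility adds only a modest amount of risk.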

Brezovich, Ivan A; Popple, Richard A

2014-01-01

246

SINGLE-INTERVAL GAS PERMEABILITY ESTIMATION

Single-interval, steady-state gas permeability testing requires estimation of pressure at a screened interval, which in turn requires measurement of friction factors as a function of mass flow rate. Friction factors can be obtained by injecting air through a length of pipe...

247

Interval Arithmetic Kalman Filtering

Abstract: The problem of robust estimation is addressed. A filter (BDF) is proposed which combines interval arithmetic with statistical Kalman filter estimation; the noise terms v_t are zero-mean and temporally uncorrelated. The EKF determines the estimate of the system state ...

Roberts, Stephen

248

Interval arithmetic in power flow analysis

The authors describe a method for taking uncertainty into account during power flow solutions with uncertain input data. The method is based on interval arithmetic which takes into consideration the uncertainty of the nodal information, and is able to provide strict bounds for the solutions to the problem: all possible solutions are included within the bounds given by interval arithmetic.

Zian Wang; Fernando L. Alvarado

1992-01-01

249

Interval regression analysis using support vector networks

Support vector machines (SVMs) have been very successful in pattern classification and function estimation problems for crisp data. In this paper, the v-support vector interval regression network (v-SVIRN) is proposed to evaluate interval linear and nonlinear regression models for crisp input and output data. As it is difficult to select an appropriate value of the insensitive tube width in ε-support ...

Pei-yi Hao

2009-01-01

250

Multiple Regression: The Confidence Interval of rho

NSDL National Science Digital Library

This page will calculate the 0.95 and 0.99 confidence intervals for rho, based on the Fisher r-to-z transformation. To perform the calculations, enter the values of r and n in the designated places, then click the "calculate" button. Note that the confidence interval of rho is symmetrical around the observed r only with large values of n.
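The calculation this page performs can be reproduced directly: z = atanh(r) is approximately normal with standard error 1/√(n − 3), and the z-interval is mapped back to the r scale with tanh. A minimal sketch (the function name is ours):

```python
import math

def rho_confidence_interval(r, n, z_crit=1.96):
    """CI for rho via the Fisher r-to-z transformation.
    z = atanh(r) is ~normal with standard error 1/sqrt(n - 3);
    the z-interval endpoints are mapped back through tanh."""
    z = math.atanh(r)
    se = 1.0 / math.sqrt(n - 3)
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)
```

Using z_crit = 2.58 gives the 0.99 interval. The back-transformed interval is symmetric in z but not in r, which is why, as the page notes, it is nearly symmetric around the observed r only for large n.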

Lowry, Richard

2009-01-12

251

Interval Arithmetic Implementations using Floating Point Arithmetic

This paper presents some algorithms implementing interval arithmetic using floating point arithmetic. The algorithms apply to almost any digital computer supporting normalized floating point arithmetic and provide better performance than conventional interval arithmetic program libraries. For reasons of generality and machine independence, algorithms are presented in a high-level language. They are intended to be used as an implementation guide for
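The core idea, computing with ordinary floating point but rounding interval endpoints outward so the true result is always enclosed, can be sketched as follows. This is a simplified illustration using one-ulp widening via Python 3.9+'s math.nextafter, not the paper's algorithms:

```python
import math

def widen(lo, hi):
    """Outward rounding: nudge each endpoint one ulp outward so the
    exact real result is enclosed despite floating point rounding."""
    return math.nextafter(lo, -math.inf), math.nextafter(hi, math.inf)

def iadd(a, b):
    """Interval addition [a0,a1] + [b0,b1] with outward rounding."""
    return widen(a[0] + b[0], a[1] + b[1])

def imul(a, b):
    """Interval multiplication: take min/max over endpoint products."""
    products = [x * y for x in a for y in b]
    return widen(min(products), max(products))
```

Widening by a full ulp on every operation is cruder than directed rounding modes but needs only normalized floating point arithmetic, which is the paper's setting.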

Michael Clemmesen

1984-01-01


253

CPU Scheduling (OS lecture notes, March 29, 2008)

Example 1: FCFS scheduling. Suppose that the processes arrive in the order P1, P2, P3; from the Gantt chart for this schedule, the waiting times are P1 = 0, P2 = 24, P3 = 27. Example 2: FCFS scheduling. Suppose that the processes arrive in the order P2, P3, P1. The Gantt chart ...
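The waiting times quoted in these lecture notes (P1 = 0, P2 = 24, P3 = 27) imply burst times of 24, 3 and 3; under first-come-first-served, each process waits for the total burst time of everything ahead of it. A short sketch reproducing the arithmetic (function name is ours):

```python
def fcfs_waiting_times(order, bursts):
    """First-come-first-served: a process's waiting time is the sum of
    the burst times of all processes ahead of it in arrival order."""
    t, wait = 0, {}
    for p in order:
        wait[p] = t
        t += bursts[p]
    return wait
```

Reordering the arrivals as in Example 2 (P2, P3, P1) drops the waiting times to 0, 3 and 6, which is the notes' point about how arrival order dominates FCFS performance.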

Adam, Salah

254

Pfair Scheduling of Generalized Pinwheel Task Systems

The scheduling of generalized pinwheel task systems is considered. It is shown that pinwheel scheduling is closely related to the fair scheduling of periodic task systems. This relationship is exploited to obtain new scheduling algorithms for generalized pinwheel task systems. When compared to traditional pinwheel scheduling algorithms, these new algorithms are both more efficient from a run-time complexity point of

Sanjoy K. Baruah; Shun-shii Lin

1998-01-01

255

Statler College Scheduling Guidelines General Procedures

Statler College Scheduling Guidelines: General Procedures. Basic event information is needed. To schedule an event, go to eventscheduling.wvu.edu and use MyID login, send email to statler-scheduling@mail.wvu.edu, or visit a department scheduler. Statler College location FAQs: http://www.statler.wvu.edu/scheduling

Mohaghegh, Shahab

256

Biomathematics and Interval Analysis: A Prosperous Marriage

NASA Astrophysics Data System (ADS)

In this survey paper we focus our attention on dynamical bio-systems involving uncertainties and the use of interval methods for the modelling study of such systems. The kind of envisioned uncertain systems are those described by a dynamical model with parameters bounded in intervals. We point out a fruitful symbiosis between dynamical modelling in biology and computational methods of interval analysis. Both fields are presently in a stage of rapid development and can benefit from each other. We point to recent studies in the field of interval arithmetic from a new perspective: the midpoint-radius arithmetic, which explores the properties of error bounds and approximate numbers. The midpoint-radius approach provides a bridge between interval methods and the "uncertain but bounded" approach used for model estimation and identification. We briefly discuss certain recently obtained algebraic properties of errors and approximate numbers.
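The midpoint-radius representation mentioned above stores an interval as a pair (m, r), denoting m ± r. Midpoints and radii add componentwise, and a standard enclosing radius for the product is |m1|·r2 + |m2|·r1 + r1·r2. A small sketch of these two operations (our illustration, ignoring rounding error for clarity):

```python
def mr_add(a, b):
    """Midpoint-radius sum: midpoints add, radii add."""
    return (a[0] + b[0], a[1] + b[1])

def mr_mul(a, b):
    """Midpoint-radius product with the standard enclosing radius
    |m1|*r2 + |m2|*r1 + r1*r2 (exact real arithmetic assumed)."""
    m1, r1 = a
    m2, r2 = b
    return (m1 * m2, abs(m1) * r2 + abs(m2) * r1 + r1 * r2)
```

For example, (2 ± 1) · (3 ± 1) gives 6 ± 6, i.e. [0, 12], which encloses the exact endpoint-arithmetic product [1, 3] × [2, 4] = [2, 12]; the slight overestimation is the usual price of the midpoint-radius form.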

Markov, S. M.

2010-11-01

257

77 FR 75075 - Schedules of Controlled Substances: Placement of Lorcaserin Into Schedule IV

Federal Register 2010, 2011, 2012, 2013

...produced by zolpidem (Schedule IV) and ketamine (Schedule III) HHS described a human...comparator drugs zolpidem (Schedule IV) and ketamine (Schedule III) produced significant...Lorcaserin, as well as zolpidem and ketamine, significantly increased reports...

2012-12-19

258

Distributed Opportunistic Scheduling With Two-Level Probing

Distributed opportunistic scheduling (DOS) is studied for wireless ad-hoc networks in which many links contend for the channel using random access before data transmissions. Simply put, DOS involves a process of joint channel probing and distributed scheduling for ad-hoc (peer-to-peer) communications. Since, in practice, link conditions are estimated with noisy observations, the transmission rate has to be backed off from the estimated rate to avoid transmission outages. Then, a natural question to ask is whether it is worthwhile for the link with successful contention to perform further channel probing to mitigate estimation errors, at the cost of additional probing. Thus motivated, this work investigates DOS with two-level channel probing by optimizing the tradeoff between the throughput gain from more accurate rate estimation and the resulting additional delay. Capitalizing on optimal stopping theory with incomplete information, we show that the optimal scheduling policy is threshold-based and is character...

S., Chandrashekhar Thejaswi P; Pun, Man-On; Poor, H Vincent; Zheng, Dong

2009-01-01

259

Technology Transfer Automated Retrieval System (TEKTRAN)

We investigate the use of confidence intervals and standard error intervals to draw conclusions regarding tests of hypotheses about normal population means. Mathematical expressions and algebraic manipulations are given, and computer simulations are performed to assess the usefulness of confidence ...

260

Slide outline: Background; Informal Problem Statement; QC Confidence Intervals; Bounds; Illustrations; Women's Health Initiative.

Stark, Philip B.

261

Despite numerous published studies describing adjuvant chemotherapy for canine appendicular osteosarcoma, there is no consensus as to the optimal chemotherapy protocol. The purpose of this study was to determine whether either of two protocols would be associated with longer disease-free interval (DFI) in dogs with appendicular osteosarcoma following amputation. Dogs with histologically confirmed appendicular osteosarcoma that were free of gross metastases and underwent amputation were eligible for enrollment. Dogs were randomized to receive either six doses of carboplatin or three doses each of carboplatin and doxorubicin on an alternating schedule. Fifty dogs were included. Dogs receiving carboplatin alone had a significantly longer DFI (425 versus 135 days) than dogs receiving alternating carboplatin and doxorubicin (P = 0.04). Toxicity was similar between groups. These results suggest that six doses of carboplatin may be associated with superior DFI when compared to six total doses of carboplatin and doxorubicin. PMID:24118677

Skorupski, K A; Uhl, J M; Szivek, A; Allstadt Frazier, S D; Rebhun, R B; Rodriguez, C O

2013-10-01

262

The purpose of this study was to evaluate the role of interferon as initial and maintenance therapy in patients with newly diagnosed follicular lymphoma. Between 1984 and 1994, 204 patients with newly diagnosed Stage III or Stage IV follicular lymphoma were randomized to receive either Chlorambucil (CB), 10 mg daily for 6 weeks, followed by a 2-week interval, with 3 subsequent 2-week treatment periods at the same dose, separated by 2-week intervals, or CB given concurrently with interferon (IFN). IFN was given at a dose of 3 × 10^6 units thrice weekly, subcutaneously, throughout the 18-week treatment period. Responding patients were subsequently randomized to receive maintenance IFN at the dose and schedule described above, or to expectant management. The overall response rate was 161/204 (78%), complete remission being achieved in 24% of patients. Neither the addition of IFN to the initial treatment, nor the use of maintenance IFN influenced response rate, remission duration or survival. This study was undertaken to determine whether IFN, given in combination with, and then subsequent to, CB would alter the clinical course of patients with follicular lymphoma. Disappointingly, this objective was not achieved, no advantage having been demonstrated for the addition of IFN. © 2001 Cancer Research Campaign http://www.bjcancer.com PMID:11437398

Rohatiner, A; Radford, J; Deakin, D; Earl, H; Love, S B; Price, O; Wilson, A; Lister, T A

2001-01-01

263

Practical Scheffe-type credibility intervals for variables of a groundwater model

Simultaneous Scheffe-type credibility intervals (the Bayesian version of confidence intervals) for variables of a groundwater flow model calibrated using a Bayesian maximum a posteriori procedure were derived by Cooley [1993b]. It was assumed that variances reflecting the expected differences between observed and model-computed quantities used to calibrate the model are known, whereas they would often be unknown for an actual model. In this study the variances are regarded as unknown, and variance variability from observation to observation is approximated by grouping the data so that each group is characterized by a uniform variance. The credibility intervals are calculated from the posterior distribution, which was developed by considering each group variance to be a random variable about which nothing is known a priori, then eliminating it by integration. Numerical experiments using two test problems illustrate some characteristics of the credibility intervals. Nonlinearity of the statistical model greatly affected some of the credibility intervals, indicating that credibility intervals computed using the standard linear model approximation may often be inadequate to characterize uncertainty for actual field problems. The parameter characterizing the probability level for the credibility intervals was, however, accurately computed using a linear model approximation, as compared with values calculated using second-order and fully nonlinear formulations. This allows the credibility intervals to be computed very efficiently.

Cooley, R.L.

1999-01-01

264

ERIC Educational Resources Information Center

Monte Carlo methods were used to examine techniques for constructing confidence intervals around multivariate effect sizes. Using interval inversion and bootstrapping methods, confidence intervals were constructed around the standard estimate of Mahalanobis distance (D^2), two bias-adjusted estimates of D^2, and Huberty's…

Hess, Melinda R.; Hogarty, Kristine Y.; Ferron, John M.; Kromrey, Jeffrey D.

2007-01-01

265

ERIC Educational Resources Information Center

This paper discusses the impact of sampling error on the construction of confidence intervals around effect sizes. Sampling error affects the location and precision of confidence intervals. Meta-analytic resampling demonstrates that confidence intervals can haphazardly bounce around the true population parameter. Special software with graphical…

Du, Yunfei

266

Block Schedule: Breaking the Barriers.

ERIC Educational Resources Information Center

As of 1996, Chaparral High School in Las Vegas, Nevada, was in the fourth year of a radical restructuring effort. The school changed from a 6-period day, composed of 51-minute periods, to an alternating-day schedule composed of three 102-minute periods per day. This report describes how the school developed and implemented the new schedule. Faculty…

West, Mike

267

Nurse Scheduling Using Harmony Search

In this research an adaptation of the Harmony Search Algorithm (HSA) for the Nurse Scheduling Problem (NSP) is presented. Nurse scheduling is the task of assigning shifts to nurses for the duties they have to carry out. The difficulty of handling this problem is due to the high number of constraints to be satisfied. Thus, we are proposing an adaptation ...

Mohammed A. Awadallah; Ahamad Tajudin Khader; Mohammed Azmi Al-Betar; Asaju La'aro Bolaji

2011-01-01

268

Web-based irrigation scheduler

Technology Transfer Automated Retrieval System (TEKTRAN)

Currently, few web-based irrigation scheduling tools are available for the humid growing environments of the Mid-South. Common irrigation scheduling systems rely on soil or weather data to estimate crop water use, and are more commonly calibrated for dry growing environments. Increasing use of water...

269

Block Scheduling: Pathways to Success.

ERIC Educational Resources Information Center

Block scheduling and intensive or extended-time scheduling have been appearing in high schools across the United States. This guidebook is designed to help each school find the best route to achieve this organizational change. The book's premise is that structural change in schools is not sufficient to create meaningful educational improvement.…

Adams, Don C.; Salvaterra, Mary E.

270

Hopfield networks and scheduling problems

In this paper we present a neural generator method that uses a neural network to generate initial search points for a discrete heuristic. We demonstrate the method for the subset-sum problem (SSP), and consider the SSP to be typical of the sub-problems that a scheduling algorithm must solve while en route to solving an entire scheduling problem. The neural generator ...
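For context, the subset-sum sub-problem mentioned here also has a standard exact dynamic program; the sketch below shows that baseline, not the paper's neural generator:

```python
def subset_sum(values, target):
    """Classic subset-sum decision DP: track every sum reachable by
    some subset of the values, capped at the target."""
    reachable = {0}
    for v in values:
        reachable |= {s + v for s in reachable if s + v <= target}
    return target in reachable
```

The DP runs in O(n · target) time, which is why heuristic approaches remain attractive when the target values are large.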

William J. Wolfe

1995-01-01

271

UPDATING THE ARKANSAS IRRIGATION SCHEDULER

Technology Transfer Automated Retrieval System (TEKTRAN)

The complexity of scheduling irrigation can be greatly reduced by the use of publicly available computer programs. However, irrigation scheduling is more complicated in humid regions than arid locations, due to factors such as cloudy weather, rainfall, and temperature swings caused by the movement o...

272

Paired Gang Scheduling

Paired Gang Scheduling. Yair Wiseman, School of CS & Engineering, Hebrew University; Hebrew University, Jerusalem, Israel; feit@cs.huji.ac.il. Abstract: Conventional gang scheduling has the drawback that alternative processes cannot be run independently of their own gangs. To alleviate this problem we suggest ...

Feitelson, Dror

273

Production Scheduling and Genetic Algorithms

This treatise deals with the applicability of genetic algorithms to the area of production scheduling. To begin with, an introduction to the principles of genetic algorithms is given. After having outlined a standard genetic algorithm, first approaches to the traveling salesman problem are explained. On this basis, a survey on several approaches to different production scheduling problems is given.

Michael Neubauer; Gesamthochschule Essen

1995-01-01

274

CONFIDENCE INTERVALS FOR POPULATION RANKS IN THE PRESENCE OF TIES AND NEAR TIES

Confidence Intervals for Population Ranks in the Presence of Ties and Near Ties. Minge Xie, Kesar ... The problem of unknown ties or "near ties" among the populations to be ranked is addressed. Given random samples from k populations, ... theory and non-standard bootstrap inference for population ranks which allow ties and near ties ...

Xie, Minge

275

Sample Size for Confidence Interval of Covariate-Adjusted Mean Difference

ERIC Educational Resources Information Center

This article provides a way to determine adequate sample size for the confidence interval of covariate-adjusted mean difference in randomized experiments. The standard error of adjusted mean difference depends on covariate variance and balance, which are two unknown quantities at the stage of planning sample size. If covariate observations are…

Liu, Xiaofeng Steven

2010-01-01

276

CHROMIUM SUPPLEMENTATION SHORTENS QTC INTERVAL DURATION IN PATIENTS WITH TYPE 2 DIABETES

Technology Transfer Automated Retrieval System (TEKTRAN)

We investigated the potential effects of chromium supplementation on QTc interval duration in patients with type 2 diabetes. Sixty patients with type 2 diabetes mellitus were randomly assigned to two groups. Group A received 1000 µg of chromium as chromium picolinate (CrPic) daily for 3 months, fo...

277

CARMENES instrument control system and operational scheduler

NASA Astrophysics Data System (ADS)

The main goal of the CARMENES instrument is to perform high-accuracy measurements of stellar radial velocities (1m/s) with long-term stability. CARMENES will be installed in 2015 at the 3.5 m telescope in the Calar Alto Observatory (Spain) and it will be equipped with two spectrographs covering from the visible to the near-infrared. It will make use of its near-IR capabilities to observe late-type stars, whose peak of the spectral energy distribution falls in the relevant wavelength interval. The technology needed to develop this instrument represents a challenge at all levels. We present two software packages that play a key role in the control layer for an efficient operation of the instrument: the Instrument Control System (ICS) and the Operational Scheduler. The coordination and management of CARMENES is handled by the ICS, which is responsible for carrying out the operations of the different subsystems providing a tool to operate the instrument in an integrated manner from low to high user interaction level. The ICS interacts with the following subsystems: the near-IR and visible channels, composed by the detectors and exposure meters; the calibration units; the environment sensors; the front-end electronics; the acquisition and guiding module; the interfaces with telescope and dome; and, finally, the software subsystems for operational scheduling of tasks, data processing, and data archiving. We describe the ICS software design, which implements the CARMENES operational design and is planned to be integrated in the instrument by the end of 2014. The CARMENES operational scheduler is the second key element in the control layer described in this contribution. It is the main actor in the translation of the survey strategy into a detailed schedule for the achievement of the optimization goals. 
The scheduler is based on Artificial Intelligence techniques and computes the survey planning by combining the static constraints that are known a priori (i.e., target visibility, sky background, required time sampling coverage) and the dynamic change of the system conditions (i.e., weather, system conditions). Off-line and on-line strategies are integrated into a single tool for a suitable transfer of the target prioritization made by the science team to the real-time schedule that will be used by the instrument operators. A suitable solution will be expected to increase the efficiency of telescope operations, which will represent an important benefit in terms of scientific return and operational costs. We present the operational scheduling tool designed for CARMENES, which is based on two algorithms combining a global and a local search: Genetic Algorithms and Hill Climbing astronomy-based heuristics, respectively. The algorithm explores a large amount of potential solutions from the vast search space and is able to identify the most efficient ones. A planning solution is considered efficient when it optimizes the objectives defined, which, in our case, are related to the reduction of the time that the telescope is not in use and the maximization of the scientific return, measured in terms of the time coverage of each target in the survey. We present the results obtained using different test cases.

Garcia-Piquer, Alvaro; Guàrdia, Josep; Colomé, Josep; Ribas, Ignasi; Gesa, Lluis; Morales, Juan Carlos; Pérez-Calpena, Ana; Seifert, Walter; Quirrenbach, Andreas; Amado, Pedro J.; Caballero, José A.; Reiners, Ansgar

2014-07-01

278

NASA Astrophysics Data System (ADS)

We study the effects of randomness on competitions based on an elementary random process in which there is a finite probability that a weaker team upsets a stronger team. We apply this model to sports leagues and sports tournaments, and compare the theoretical results with empirical data. Our model shows that single-elimination tournaments are efficient but unfair: the number of games is proportional to the number of teams N, but the probability that the weakest team wins decays only algebraically with N. In contrast, leagues, where every team plays every other team, are fair but inefficient: the top √N teams remain in contention for the championship, while the probability that the weakest team becomes champion is exponentially small. We also propose a gradual elimination schedule that consists of a preliminary round and a championship round. Initially, teams play a small number of preliminary games, and subsequently, a few teams qualify for the championship round. This algorithm is fair and efficient: the best team wins with a high probability and the number of games scales as N^(9/5), whereas traditional leagues require N^3 games to fairly determine a champion.
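The elementary upset process described here is easy to simulate for the single-elimination case. The sketch below is our own toy version of the model (teams seeded 0 for strongest through N−1, field size a power of two), not the authors' code:

```python
import random

def single_elimination(n_teams, upset_prob, rng):
    """One single-elimination bracket; n_teams must be a power of two.
    In each game the weaker (higher-numbered) team wins with
    probability upset_prob. Returns the champion's seed."""
    teams = list(range(n_teams))
    while len(teams) > 1:
        nxt = []
        for a, b in zip(teams[::2], teams[1::2]):
            weaker, stronger = max(a, b), min(a, b)
            nxt.append(weaker if rng.random() < upset_prob else stronger)
        teams = nxt
    return teams[0]
```

With 8 teams and an upset probability of 0.25, the best team must survive three games, so it wins with probability 0.75^3 ≈ 0.42: the tournament needs only N − 1 games, but less than half the time does the strongest team actually win.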

Ben-Naim, E.; Hengartner, N. W.; Redner, S.; Vazquez, F.

2013-05-01

279

CALCULATION OF NONLINEAR CONFIDENCE AND PREDICTION INTERVALS FOR GROUND-WATER FLOW MODELS.

A method is derived to efficiently compute nonlinear confidence and prediction intervals on any function of parameters derived as output from a mathematical model of a physical system. The method is applied to the problem of obtaining confidence and prediction intervals for manually-calibrated ground-water flow models. To obtain confidence and prediction intervals resulting from uncertainties in parameters, the calibrated model and information on extreme ranges and ordering of the model parameters within one or more independent groups are required. If random errors in the dependent variable are present in addition to uncertainties in parameters, then calculation of prediction intervals also requires information on the extreme range of error expected. A simple Monte Carlo method is used to compute the quantiles necessary to establish probability levels for the confidence and prediction intervals. Application of the method to a hypothetical example showed that inclusion of random errors in the dependent variable in addition to uncertainties in parameters can considerably widen the prediction intervals.
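The Monte Carlo step described here (draw parameters, propagate them through the model's output function, and read off empirical quantiles) can be sketched generically. The toy linear "model" below is ours, not the ground-water code:

```python
import random

def monte_carlo_interval(output_fn, sampler, draws, level, rng):
    """Propagate sampled parameters through the model output function
    and take empirical quantiles as an approximate `level` interval."""
    outs = sorted(output_fn(sampler(rng)) for _ in range(draws))
    lo_i = int((1.0 - level) / 2.0 * draws)
    hi_i = int((1.0 + level) / 2.0 * draws) - 1
    return outs[lo_i], outs[hi_i]

# Toy model: output = 2*p + 1 with parameter p ~ N(0, 1)
rng = random.Random(3)
lo, hi = monte_carlo_interval(lambda p: 2.0 * p + 1.0,
                              lambda r: r.gauss(0.0, 1.0),
                              20000, 0.95, rng)
```

Adding random errors in the dependent variable on top of the parameter draws widens the resulting interval, which is the prediction-interval effect the abstract reports.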

Cooley, Richard L.; Vecchia, Aldo V.

1987-01-01

280

PRE1( )/PRE2( ): course pairs specifying precedence requirements to be used in building schedule options.

Beatty, Thomas Melvin

1975-01-01

281

Pinwheel Scheduling: Achievable Densities

Abstract. A pinwheel schedule for a vector v = (v_1, v_2, ..., v_n) of positive integers 2 ≤ v_1 ≤ v_2 ≤ ... ≤ v_n is an infinite symbol sequence {S_j : j ∈ Z} with each symbol drawn from [n] = {1, 2, ..., n}
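The defining requirement (stated here in its standard form, since the snippet is truncated) is that symbol i occur in every window of v_i consecutive slots. For a periodic candidate schedule this is mechanically checkable; a small sketch:

```python
def is_pinwheel(schedule, v):
    """Check that repeating `schedule` forever is a valid pinwheel
    schedule for v = (v_1, ..., v_n): every window of v_i consecutive
    slots must contain symbol i (symbols are 1..n)."""
    period = len(schedule)
    doubled = schedule * 2      # two periods cover all window positions
    for i, vi in enumerate(v, start=1):
        if i not in schedule:
            return False
        if vi >= period:
            continue            # any window this long covers a full period
        for start in range(period):
            if i not in doubled[start:start + vi]:
                return False
    return True
```

For example, [1, 2, 1, 3] repeated forever satisfies v = (2, 4, 4), whose density 1/2 + 1/4 + 1/4 reaches the upper bound of 1.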

Peter C. Fishburn; J. C. Lagarias

2002-01-01

282

Astronaut Office Scheduling System Software

NASA Technical Reports Server (NTRS)

AOSS is a highly efficient scheduling application that uses various tools to schedule astronauts' weekly appointment information. This program represents an integration of many technologies into a single application to facilitate schedule sharing and management. It is a Windows-based application developed in Visual Basic. Because the NASA standard office automation load environment is Microsoft-based, Visual Basic provides AOSS developers with the ability to interact with Windows collaboration components by accessing object models from applications like Outlook and Excel. This also gives developers the ability to create customizable components that perform specialized scheduling and reporting tasks inside the application. With this capability, AOSS can perform various asynchronous tasks, such as gathering, sending, and managing astronauts' schedule information directly to their Outlook calendars at any time.

Brown, Estevancio

2010-01-01

283

A Framework for Scheduling Professional Sports Leagues

A constrained sports scheduling problem is modeled from the requirements of various professional sports leagues. We define a sports scheduling problem, introduce the necessary terminology and detail

Bonomo, Flavia

284

Spatial instruction scheduling for raw machines

Instruction scheduling on software exposed architectures, such as Raw, must be performed in both time and space. The complexity and variance of application scheduling regions dictates that the space-time scheduling task ...

Swenson, Shane Michael, 1979-

2002-01-01

285

40 CFR 141.702 - Sampling schedules.

Code of Federal Regulations, 2010 CFR

...2010-07-01 2010-07-01 false Sampling schedules. 141.702 Section 141...Monitoring Requirements § 141.702 Sampling schedules. (a) Systems required...monitoring under § 141.701 must submit a sampling schedule that specifies the...

2010-07-01

286

Code of Federal Regulations, 2010 CFR

...2010-10-01 false Scheduling. 665.21 Section...Operations § 665.21 Scheduling. (a) To schedule...manufacturer shall contact the operator of FTA's...testing program. Contact information and procedures are available...

2010-10-01

287

Fast transfer of crossmodal time interval training.

Sub-second time perception is essential for many important sensory and perceptual tasks including speech perception, motion perception, motor coordination, and crossmodal interaction. This study investigates to what extent the ability to discriminate sub-second time intervals acquired in one sensory modality can be transferred to another modality. To this end, we used perceptual classification of visual Ternus display (Ternus in Psychol Forsch 7:81-136, 1926) to implicitly measure participants' interval perception in pre- and posttests and implemented an intra- or crossmodal sub-second interval discrimination training protocol in between the tests. The Ternus display elicited either an "element motion" or a "group motion" percept, depending on the inter-stimulus interval between the two visual frames. The training protocol required participants to explicitly compare the interval length between a pair of visual, auditory, or tactile stimuli with a standard interval or to implicitly perceive the length of visual, auditory, or tactile intervals by completing a non-temporal task (discrimination of auditory pitch or tactile intensity). Results showed that after fast explicit training of interval discrimination (about 15 min), participants improved their ability to categorize the visual apparent motion in Ternus displays, although the training benefits were mild for visual timing training. However, the benefits were absent for implicit interval training protocols. This finding suggests that the timing ability in one modality can be rapidly acquired and used to improve timing-related performance in another modality and that there may exist a central clock for sub-second temporal processing, although modality-specific perceptual properties may constrain the functioning of this clock. PMID:24570386

Chen, Lihan; Zhou, Xiaolin

2014-06-01

288

NSDL National Science Digital Library

Created by authors Mike Lestik, Scott Plous and Geoffrey Urbaniak of the Social Psychology Network, this "Research Randomizer" is a free service offered to "assist researchers and students who want an easy way to perform random sampling or assign participants to experimental conditions." Since its creation in 1997, this resource has been used 7.5 million times for a wide variety of tests. The authors offer a simple form to fill in to get started. This is a great resource for those interested in statistical testing or who simply need assistance performing a specific test.
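The kind of random assignment the service automates can be sketched locally in a few lines (an illustrative stand-in, not the Research Randomizer's own algorithm):

```python
import random

def assign_conditions(participants, n_conditions, seed=None):
    """Shuffle participants, then deal them round-robin into conditions,
    giving balanced groups and an unbiased random assignment."""
    rng = random.Random(seed)
    pool = list(participants)
    rng.shuffle(pool)
    groups = [[] for _ in range(n_conditions)]
    for idx, person in enumerate(pool):
        groups[idx % n_conditions].append(person)
    return groups
```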

Lestik, Mike; Plous, Scott; Urbaniak, Geoffrey

2009-02-26

289

Adaptive sensor scheduling for target tracking in wireless sensor network

NASA Astrophysics Data System (ADS)

Target tracking is an essential capability for Wireless Sensor Networks (WSNs) and serves as a canonical problem for collaborative signal and information processing, in which sensor resources are managed dynamically and distributed sensor measurements are processed efficiently. In existing work on target tracking in WSNs, such as the information-driven sensor query (IDSQ) approach, the tasking sensors are scheduled with a uniform sampling interval, ignoring changes in the target dynamics and in the estimation accuracy obtained. This paper proposes an adaptive sensor scheduling strategy that jointly selects the tasking sensor and determines the sampling interval according to the predicted tracking accuracy and tracking cost. The sensors are scheduled in two tracking modes: a fast tracking mode when the predicted tracking accuracy is not satisfactory, and a tracking maintenance mode when it is. The approach employs an Extended Kalman Filter (EKF) based estimation technique to predict the tracking accuracy, and adopts a linear energy model to predict the energy consumption. Simulation results demonstrate that, compared to the non-adaptive approach, the proposed approach achieves significant savings in energy consumption without degrading the tracking accuracy.

Xiao, Wendong; Wu, Jian Kang; Xie, Lihua

2005-08-01

290

Interval arithmetic in power flow analysis

The power flow is the fundamental tool for the study of power systems. The data for this problem are subject to uncertainty. This paper uses interval arithmetic to solve the power flow problem. Interval arithmetic takes into consideration the uncertainty of the nodal information, and is able to provide strict bounds for the solution to the problem: all possible solutions are included within the bounds given by interval arithmetic. Results are compared with those obtainable by Monte Carlo simulations and by the use of stochastic power flows.
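The strict-bounds guarantee rests on elementary endpoint rules: each arithmetic operation returns bounds containing every possible result of operating on members of the input intervals. A minimal sketch using tuples as intervals (not a power-flow solver):

```python
def iadd(a, b):
    """Interval addition: [a_lo + b_lo, a_hi + b_hi]."""
    return (a[0] + b[0], a[1] + b[1])

def imul(a, b):
    """Interval multiplication: the four endpoint products bound the result."""
    p = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(p), max(p))

# Two uncertain nodal injections, say P1 in [1, 2] and P2 in [3, 4] p.u.,
# give a total injection guaranteed to lie in [4, 6]:
total = iadd((1, 2), (3, 4))
```

Every true value lies inside the computed bounds by construction, which is what lets an interval power flow enclose all cases that Monte Carlo runs can only sample.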

Wang, Z.; Alvarado, F.L. (Dept. of Electrical and Computer Engineering, Univ. of Wisconsin-Madison, WI (US))

1992-08-01

291

Evaluation of Fixed Momentary DRO Schedules under Signaled and Unsignaled Arrangements

ERIC Educational Resources Information Center

Fixed momentary schedules of differential reinforcement of other behavior (FM DRO) generally have been ineffective as treatment for problem behavior. Because most early research on FM DRO included presentation of a signal at the end of the DRO interval, it is unclear whether the limited effects of FM DRO were due to (a) the momentary response…

Hammond, Jennifer L.; Iwata, Brian A.; Fritz, Jennifer N.; Dempsey, Carrie M.

2011-01-01

292

The Time Between: Continuously-Defined Accessibility Functions for Schedule-Based Transportation

Accessibility is traditionally considered to be a property of a point or region in space, and to be invariant over time (or at least over some computationally convenient time interval). However, a location's accessibility can vary

Levinson, David M.

293

Genetic algorithm and the application for job shop group scheduling

NASA Astrophysics Data System (ADS)

Genetic algorithm (GA) is a heuristic and random search technique mimicking nature. This paper first presents the basic principle of GA, the definition and the function of the genetic operators, and the principal character of GA. On the basis of these, the paper proposes using GA as a new solution method of the job-shop group scheduling problem, discusses the coded representation method of the feasible solution, and the particular limitation to the genetic operators.
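A toy version of the approach — permutation-coded individuals, tournament selection, order crossover and swap mutation — can be sketched on a single-machine total-completion-time objective. The objective and operators here are illustrative assumptions; the paper's group-scheduling encoding and constrained operators are more involved:

```python
import random

def total_completion_time(order, ptimes):
    """Objective: sum of job completion times on a single machine."""
    t = total = 0
    for j in order:
        t += ptimes[j]
        total += t
    return total

def genetic_schedule(ptimes, pop_size=30, gens=100, pmut=0.2, seed=3):
    """Toy GA over job permutations: tournament selection, order
    crossover (OX), swap mutation, best-ever individual returned."""
    rng = random.Random(seed)
    n = len(ptimes)
    def fitness(ind):
        return total_completion_time(ind, ptimes)
    def crossover(p1, p2):
        # OX: keep a slice of p1, fill the rest in p2's order
        a, b = sorted(rng.sample(range(n), 2))
        middle = p1[a:b]
        rest = [j for j in p2 if j not in middle]
        return rest[:a] + middle + rest[a:]
    population = [rng.sample(range(n), n) for _ in range(pop_size)]
    best = min(population, key=fitness)
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            p1 = min(rng.sample(population, 3), key=fitness)
            p2 = min(rng.sample(population, 3), key=fitness)
            child = crossover(p1, p2)
            if rng.random() < pmut:        # swap mutation
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        population = nxt
        best = min(population + [best], key=fitness)
    return best
```

On this objective the optimum is the shortest-processing-time order, so the GA's result can be checked against it.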

Mao, Jianzhong; Wu, Zhiming

1995-08-01

294

The authors present a new technique for the design of approximation algorithms that can be viewed as a generalization of randomized rounding. They derive new or improved approximation guarantees for a class of generalized congestion problems such as multicast congestion, multiple TSP etc. Their main mathematical tool is a structural decomposition theorem related to the integrality gap of a relaxation.

CARR,ROBERT D.; VEMPALA,SANTOSH

2000-01-25

295

Approximate Confidence Intervals for Effect Sizes.

ERIC Educational Resources Information Center

Investigated the approximate confidence intervals for effect sizes developed by K. Bird (2002) and proposed a more accurate method developed through simulation studies. The average coverage probability for the new method was 0.959. (SLD)

Algina, James; Keselman, H. J.

2003-01-01

296

Qualitative, semiqualitative and interval algebras, and their

are generated. This tendency towards overabstraction has prompted research into ways of making qualitative … Semiqualitative modelling is a generalisation of qualitative modelling which

Parsons, Simon

297

Optimal dynamic interval management in external memory

The authors present a space- and I/O-optimal external-memory data structure for answering stabbing queries on a set of dynamically maintained intervals. The data structure settles an open problem in databases and I/O algorithms by providing the first optimal external-memory solution to the dynamic interval management problem, which is a special case of 2-dimensional range searching and a central problem for

Lars Arge; J. S. Vitter

1996-01-01

298

Interval Order Representation via Shortest Paths

Our goal in this paper is to illustrate how the representation theorems for finite interval orders and semiorders can be seen as special instances of existence results for potentials in digraphs. This viewpoint yields short proofs of the representation theorems and provides a framework for certain types of additional constraints on the intervals. We also use it to obtain a

Garth Isaak

299

Recurrence interval analysis of trading volumes

NASA Astrophysics Data System (ADS)

We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q. The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.
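The basic quantity of the analysis — the gap between successive exceedances of the threshold q — is straightforward to extract from a series (toy data below; the empirical work of course uses real volume records):

```python
def recurrence_intervals(series, q):
    """Gaps (in samples) between successive values exceeding threshold q."""
    hits = [t for t, x in enumerate(series) if x > q]
    return [b - a for a, b in zip(hits, hits[1:])]

vols = [5, 1, 9, 2, 2, 7, 1, 8]
# values > 6 occur at t = 2, 5, 7, so the recurrence intervals are 3 and 2
```

The tail statistics and memory effects reported above are then read off the empirical distribution of these intervals at various thresholds.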

Ren, Fei; Zhou, Wei-Xing

2010-06-01

300

We tested the accuracy of 95% individual prediction intervals for hydraulic heads, streamflow gains, and effective transmissivities computed by groundwater models of two Danish aquifers. To compute the intervals, we assumed that each predicted value can be written as the sum of a computed dependent variable and a random error. Testing was accomplished by using a cross-validation method and by using new field measurements of hydraulic heads and transmissivities that were not used to develop or calibrate the models. The tested null hypotheses are that the coverage probability of the prediction intervals is not significantly smaller than the assumed probability (95%) and that each tail probability is not significantly different from the assumed probability (2.5%). In all cases tested, these hypotheses were accepted at the 5% level of significance. We therefore conclude that for the groundwater models of two real aquifers the individual prediction intervals appear to be accurate.
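The quantity under test reduces to an empirical coverage fraction, compared against the binomial distribution implied by the nominal level; a sketch (the paper's exact test construction may differ):

```python
from math import comb

def coverage(observed, intervals):
    """Fraction of new measurements falling inside their prediction intervals."""
    hits = sum(lo <= y <= hi for y, (lo, hi) in zip(observed, intervals))
    return hits / len(observed)

def binom_tail(n, k, p=0.95):
    """P(at most k of n intervals cover), under nominal coverage p;
    a small tail probability flags significantly low coverage."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k + 1))
```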

Christensen, S.; Cooley, R.L.

1999-01-01

301

Research Ship Information and Schedules

NSDL National Science Digital Library

This unusual resource for Research Ship Information and Schedules is maintained by the University of Delaware's Ocean Information Center (OCEANIC). Complete with a searchable database of schedules and characteristics of deep-water scientific research vessels, this site is an excellent example of the range of eclectic and specialized information available on the Web. Information on facilities, research capabilities, layouts, schedules, and much more is organized by country and ship name; for US research vessels, information is also listed by agency and institution. Other features include links to a staggering array of related research ship information sites.

1999-01-01

302

An Extended Deterministic Dendritic Cell Algorithm for Dynamic Job Shop Scheduling

NASA Astrophysics Data System (ADS)

The problem of job shop scheduling in a dynamic environment where random perturbation exists in the system is studied. In this paper, an extended deterministic Dendritic Cell Algorithm (dDCA) is proposed to solve such a dynamic Job Shop Scheduling Problem (JSSP) where unexpected events occurred randomly. This algorithm is designed based on dDCA and makes improvements by considering all types of signals and the magnitude of the output values. To evaluate this algorithm, ten benchmark problems are chosen and different kinds of disturbances are injected randomly. The results show that the algorithm performs competitively as it is capable of triggering the rescheduling process optimally with much less run time for deciding the rescheduling action. As such, the proposed algorithm is able to minimize the rescheduling times under the defined objective and to keep the scheduling process stable and efficient.

Qiu, X. N.; Lau, H. Y. K.

303

A practical scheduling algorithm for Shuttle-based astronomy missions

NASA Technical Reports Server (NTRS)

In the Astro mission series (initial flight planned for March, 1986), the Shuttle will be used as a dedicated stellar astronomy observatory. A modified Spacelab pallet is to be used for the Astro payload, which will consist of three ultraviolet (UV) telescopes and a wide field camera mounted together on a single gimbal mount called the Inertial Pointing System (IPS). Three flights of 7-10 days duration are to be made with the same payload at intervals of 8-9 months. Previous experience has shown that changes in design requirements are inevitable, and the evolution of operational concepts will effect changes in scheduling algorithm software. For these reasons, the design goals of the Astron algorithm and its family of auxiliary software modules have been related to functional modularity, constraint flexibility, user friendliness, and 'light' input requirements. Attention is given to hardware characteristics, environmental constraints, the basic criteria function, 'Cinderella' logic, counters and constraints, and scheduling trends.

Guffin, O. T.; Roberts, B. H.; Williamson, P. L.

1985-01-01

304

A Bayesian two-stage phase I-II design is proposed for optimizing administration schedule and dose of an experimental agent based on the times to response and toxicity in the case where schedules are non-nested and qualitatively different. Sequentially adaptive decisions are based on the joint utility of the two event times. A utility function is constructed by partitioning the two-dimensional positive real quadrant of possible event time pairs into rectangles, eliciting a numerical utility for each rectangle, and fitting a smooth parametric function to the elicited values. We assume that each event time follows a gamma distribution with shape and scale parameters both modeled as functions of schedule and dose. A copula is assumed to obtain a bivariate distribution. To ensure an ethical trial, adaptive safety and efficacy acceptability conditions are imposed on the (schedule, dose) regimes. In stage 1 of the design, patients are randomized fairly among schedules and, within each schedule, a dose is chosen using a hybrid algorithm that either maximizes posterior mean utility or randomizes among acceptable doses. In stage 2, fair randomization among schedules is replaced by the hybrid algorithm. A modified version of this algorithm is used for nested schedules. Extensions of the model and utility function to accommodate death or discontinuation of follow up are described. The method is illustrated by an autologous stem cell transplantation trial in multiple myeloma, including a simulation study. PMID:23957592

Thall, Peter F; Nguyen, Hoang Q; Braun, Thomas M; Qazilbash, Muzaffar H

2013-09-01

305

Federal Register 2010, 2011, 2012, 2013

...Schedule D, Schedule K-1, and Schedule M-3 AGENCY: Internal Revenue Service (IRS...and Losses and Built-In Gains, Schedule M-3 (Form 1120S), Net Income (Loss...622-6665, or through the Internet at Allan.M.Hopkins@irs.gov. [[Page...

2012-10-23

306

Reclaimer Scheduling: Complexity and Algorithms

Feb 18, 2014 ... its contents at one of three dump stations. The coal is ... and scheduling their reclaiming, the limited stockyard space, stacking rates, reclaiming rates, and ...... In Control and Automation (ICCA), 2013 10th IEEE International.

2014-02-18

307

Interchange scheduling: Discipline or disorder?

Electric power transmission access is being opened. The world of interchange scheduling is now a mixture of both old and new players, with changing rules, and uncertain risks to electric system reliability. Reliable electric system operation requires rigorous discipline in executing interchange schedules. Can disorder be avoided in this rapidly changing environment? How will the reliability of the interconnected electric systems in the US be impacted by these changes?

Day, L.R.

1996-10-01

308

Rotation scheduling: a loop pipelining algorithm

We consider the resource-constrained scheduling of loops with interiteration dependencies. A loop is modeled as a data flow graph (DFG), where edges are labeled with the number of iterations between dependencies. We design a novel and flexible technique, called rotation scheduling, for scheduling cyclic DFGs using loop pipelining. The rotation technique repeatedly transforms a schedule to a more

Liang-Fang Chao; Andrea S. LaPaugh; Edwin Hsing-Mean Sha

1993-01-01

309

Sunspot Time Series: Passive and Active Intervals

NASA Astrophysics Data System (ADS)

Solar activity slowly and irregularly decreases from the first spotless day (FSD) in the declining phase of the old sunspot cycle and systematically, but also in an irregular way, increases to the new cycle maximum after the last spotless day (LSD). The time interval between the first and the last spotless day can be called the passive interval (PI), while the time interval from the last spotless day to the first one after the new cycle maximum is the related active interval (AI). Minima of solar cycles are inside PIs, while maxima are inside AIs. In this article, we study the properties of passive and active intervals to determine the relation between them. We have found that some properties of PIs, and related AIs, differ significantly between two group of solar cycles; this has allowed us to classify Cycles 8 - 15 as passive cycles, and Cycles 17 - 23 as active ones. We conclude that the solar activity in the PI declining phase (a descending phase of the previous cycle) determines the strength of the approaching maximum in the case of active cycles, while the activity of the PI rising phase (a phase of the ongoing cycle early growth) determines the strength of passive cycles. This can have implications for solar dynamo models. Our approach indicates the important role of solar activity during the declining and the rising phases of the solar-cycle minimum.

Zięba, S.; Nieckarz, Z.

2014-07-01

310

NASA Astrophysics Data System (ADS)

We study the statistics of return intervals between large heartbeat intervals (above a certain threshold Q) in 24 h records obtained from healthy subjects. We find that both the linear and the nonlinear long-term memory inherent in the heartbeat intervals lead to power-laws in the probability density function PQ(r) of the return intervals. As a consequence, the probability WQ(t; Δt) that at least one large heartbeat interval will occur within the next Δt heartbeat intervals, with an increasing elapsed number of intervals t after the last large heartbeat interval, follows a power-law. Based on these results, we suggest a method of obtaining a priori information about the occurrence of the next large heartbeat interval, and thus to predict it. We show explicitly that the proposed method, which exploits long-term memory, is superior to the conventional precursory pattern recognition technique, which focuses solely on short-term memory. We believe that our results can be straightforwardly extended to obtain more reliable predictions in other physiological signals like blood pressure, as well as in other complex records exhibiting multifractal behaviour, e.g. turbulent flow, precipitation, river flows and network traffic.

Bogachev, Mikhail I.; Kireenkov, Igor S.; Nifontov, Eugene M.; Bunde, Armin

2009-06-01

311

Diversity ALOHA--A Random Access Scheme for Satellite Communications

A generalization of the slotted ALOHA random access scheme is considered in which a user transmits multiple copies of the same packet. The multiple copies can be either transmitted simultaneously on different frequency channels (frequency diversity) or they may be transmitted on a single high-speed channel but spaced apart by random time intervals (time diversity). In frequency diversity, two schemes
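A crude simulation of the frequency-diversity variant illustrates the trade-off. The channel model here (each user picks distinct random channels; a replica succeeds only if no other user touches its channel) is an assumption of this sketch, not the paper's exact analysis:

```python
import random

def success_prob(n_channels, n_users, copies, trials=20000, seed=4):
    """Estimate the probability that a tagged user's packet gets through
    under frequency-diversity ALOHA: each of n_users sends `copies`
    replicas into distinct random channels, and the tagged packet
    succeeds if any replica lands in a channel no other user uses."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        tagged = set(rng.sample(range(n_channels), copies))
        others = set()
        for _ in range(n_users - 1):
            others.update(rng.sample(range(n_channels), copies))
        if tagged - others:        # at least one collision-free replica
            wins += 1
    return wins / trials
```

More copies help the tagged packet survive — until the extra replicas themselves congest the channels, which is the tension diversity ALOHA schemes must balance.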

G. Choudhury; S. S. Rappaport

1983-01-01

312

Multiple Comparisons with the Best Using Common Random Numbers

Nelson and Matejcik (1995) studied two-stage MCB procedures using common random numbers among the populations. … a single-stage simulation using common random numbers to construct simultaneous confidence intervals

Nakayama, Marvin K.

314

NASA Technical Reports Server (NTRS)

Scheduling has become an increasingly important element in today's society and workplace. Within the NASA environment, scheduling is one of the most frequently performed and challenging functions. Towards meeting NASA's scheduling needs, a research version of a generic expert scheduling system architecture and toolkit has been developed. This final report describes the development and testing of GUESS (Generically Used Expert Scheduling System).

Liebowitz, Jay; Krishnamurthy, Vijaya; Rodens, Ira; Houston, Chapman; Liebowitz, Alisa; Baek, Seung; Radko, Joe; Zeide, Janet

1996-01-01

315

Scheduling: Proportional Share In this note, we'll examine a different type of scheduler known

8 Scheduling: Proportional Share. In this note, we'll examine a different type of scheduler known as a proportional-share scheduler, also sometimes referred to as a fair-share scheduler. An early example of proportional-share scheduling is found in research by Waldspurger and Weihl [WW94]
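Lottery scheduling, the Waldspurger–Weihl randomized realization of proportional share, fits in a few lines (a didactic sketch, not their implementation):

```python
import random

def lottery_pick(tickets, rng):
    """Pick the next job to run: draw a winning ticket uniformly at
    random; each job wins in proportion to the tickets it holds."""
    total = sum(tickets.values())
    winner = rng.randrange(total)
    for job, count in tickets.items():
        if winner < count:
            return job
        winner -= count

def simulate(tickets, slices=10000, seed=5):
    """Count how many time slices each job receives."""
    rng = random.Random(seed)
    runs = {job: 0 for job in tickets}
    for _ in range(slices):
        runs[lottery_pick(tickets, rng)] += 1
    return runs

# A job holding 75 of 100 tickets should receive about 75% of the slices.
```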

Sheridan, Jennifer

316

OBJECTIVE The requirement to inject current basal insulin analogs at a fixed time each day may complicate adherence and compromise glycemic control. This trial evaluated the efficacy and safety of varying the daily injection time of insulin degludec (IDeg), an ultra-long-acting basal insulin. RESEARCH DESIGN AND METHODS This 26-week, open-label, treat-to-target trial enrolled adults (≥18 years) with type 2 diabetes who were either insulin naïve and receiving oral antidiabetic drugs (OADs) (HbA1c = 7–11%) or previously on basal insulin ± OAD(s) (HbA1c = 7–10%). Participants were randomized to 1) once-daily (OD) IDeg in a prespecified dosing schedule, creating 8–40-h intervals between injections (IDeg OD Flex; n = 229); 2) once-daily IDeg at the main evening meal (IDeg OD; n = 228); or 3) once-daily insulin glargine at the same time each day (IGlar OD; n = 230). The primary outcome was noninferiority of IDeg OD Flex to IGlar OD in HbA1c reduction after 26 weeks. RESULTS After 26 weeks, IDeg OD Flex, IDeg OD, and IGlar OD improved HbA1c by 1.28, 1.07, and 1.26% points, respectively (estimated treatment difference [IDeg OD Flex − IGlar OD]: 0.04% points [–0.12 to 0.20], confirming noninferiority). No statistically significant differences in overall or nocturnal hypoglycemia were found between IDeg OD Flex and IGlar OD. Comparable glycemic control and rates of hypoglycemia were seen with IDeg OD Flex and IDeg OD. Adverse event profiles were similar across groups. CONCLUSIONS The use of extreme dosing intervals of 8–40 h demonstrates that the daily injection time of IDeg can be varied without compromising glycemic control or safety. PMID:23340894

Meneghini, Luigi; Atkin, Stephen L.; Gough, Stephen C.L.; Raz, Itamar; Blonde, Lawrence; Shestakova, Marina; Bain, Stephen; Johansen, Thue; Begtrup, Kamilla; Birkeland, Kåre I.

2013-01-01

317

Detecting independent and recurrent copy number aberrations using interval graphs

Motivation: Somatic copy number aberrations (SCNAs) are frequent in cancer genomes, but many of these are random, passenger events. A common strategy to distinguish functional aberrations from passengers is to identify those aberrations that are recurrent across multiple samples. However, the extensive variability in the length and position of SCNAs makes the problem of identifying recurrent aberrations notoriously difficult. Results: We introduce a combinatorial approach to the problem of identifying independent and recurrent SCNAs, focusing on the key challenge of separating the overlaps in aberrations across individuals into independent events. We derive independent and recurrent SCNAs as maximal cliques in an interval graph constructed from overlaps between aberrations. We efficiently enumerate all such cliques, and derive a dynamic programming algorithm to find an optimal selection of non-overlapping cliques, resulting in a very fast algorithm, which we call RAIG (Recurrent Aberrations from Interval Graphs). We show that RAIG outperforms other methods on simulated data and also performs well on data from three cancer types from The Cancer Genome Atlas (TCGA). In contrast to existing approaches that employ various heuristics to select independent aberrations, RAIG optimizes a well-defined objective function. We show that this allows RAIG to identify rare aberrations that are likely functional, but are obscured by overlaps with larger passenger aberrations. Availability: http://compbio.cs.brown.edu/software. Contact: braphael@brown.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24931984
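The combinatorial core is the Helly property of intervals: a clique of an interval graph is exactly a set of intervals sharing a common point, so every maximal clique appears as the covering set of some left endpoint. A small O(n²) sketch of the enumeration (RAIG's actual algorithm is faster and adds the optimal clique-selection step):

```python
def maximal_cliques(intervals):
    """Enumerate maximal cliques of the interval graph of closed
    intervals (start, end): collect the set of intervals covering
    each left endpoint, then keep the inclusion-maximal sets."""
    cliques = []
    for s, _ in intervals:
        covering = frozenset(i for i, (a, b) in enumerate(intervals)
                             if a <= s <= b)
        if covering not in cliques:
            cliques.append(covering)
    # drop candidate sets strictly contained in another
    return [c for c in cliques if not any(c < d for d in cliques)]
```

For intervals [(0, 3), (1, 5), (4, 7), (6, 9)] the maximal cliques are {0, 1}, {1, 2} and {2, 3}, matching the pairwise overlaps.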

Wu, Hsin-Ta; Hajirasouliha, Iman; Raphael, Benjamin J.

2014-01-01

318

Exact intervals and tests for median when one sample value possibly an outlier

NASA Technical Reports Server (NTRS)

Available are independent observations (continuous data) that are believed to be a random sample. Desired are distribution-free confidence intervals and significance tests for the population median. However, there is the possibility that either the smallest or the largest observation is an outlier. Then, use of a procedure for rejection of an outlying observation might seem appropriate. Such a procedure would consider that two alternative situations are possible and would select one of them. Either (1) the n observations are truly a random sample, or (2) an outlier exists and its removal leaves a random sample of size n-1. For either situation, confidence intervals and tests are desired for the median of the population yielding the random sample. Unfortunately, satisfactory rejection procedures of a distribution-free nature do not seem to be available. Moreover, all rejection procedures impose undesirable conditional effects on the observations, and also, can select the wrong one of the two above situations. It is found that two-sided intervals and tests based on two symmetrically located order statistics (not the largest and smallest) of the n observations have this property.
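The interval construction the paper relies on is classical: for a continuous distribution, the number of observations below the median is Binomial(n, 1/2), so the k-th smallest and k-th largest order statistics bracket the median with an exactly computable confidence level. A sketch (choosing k ≥ 2 avoids the possibly outlying extremes, as the paper suggests):

```python
from math import comb

def median_ci(sample, k):
    """Distribution-free confidence interval for the population median
    from the k-th smallest and k-th largest order statistics, with its
    exact confidence level from the Binomial(n, 1/2) distribution."""
    xs = sorted(sample)
    n = len(xs)
    # P(fewer than k observations fall below the median)
    tail = sum(comb(n, j) for j in range(k)) / 2 ** n
    confidence = 1.0 - 2.0 * tail
    return (xs[k - 1], xs[n - k]), confidence
```

For n = 10 and k = 2 the interval runs from the 2nd smallest to the 2nd largest observation, with confidence 1 − 2·(11/1024) ≈ 0.979.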

Keller, G. J.; Walsh, J. E.

1973-01-01

319

Scheduling: A guide for program managers

NASA Technical Reports Server (NTRS)

The following topics are discussed concerning scheduling: (1) milestone scheduling; (2) network scheduling; (3) program evaluation and review technique; (4) critical path method; (5) developing a network; (6) converting an ugly duckling to a swan; (7) network scheduling problem; (8) (9) network scheduling when resources are limited; (10) multi-program considerations; (11) influence on program performance; (12) line-of-balance technique; (13) time management; (14) recapitulation; and (15) analysis.

1994-01-01

320

Artificial intelligence approaches to astronomical observation scheduling

NASA Technical Reports Server (NTRS)

Automated scheduling will play an increasing role in future ground- and space-based observatory operations. Due to the complexity of the problem, artificial intelligence technology currently offers the greatest potential for the development of scheduling tools with sufficient power and flexibility to handle realistic scheduling situations. Summarized here are the main features of the observatory scheduling problem, how artificial intelligence (AI) techniques can be applied, and recent progress in AI scheduling for Hubble Space Telescope.

Johnston, Mark D.; Miller, Glenn

1988-01-01

321

NSDL National Science Digital Library

Created by Kyle Siegrist of the University of Alabama-Huntsville, this is an online, interactive lesson on random samples. The author provides examples, exercises, and applets concerning many different topics. Some of these include: sample mean, law of large numbers, sample variance, partial sums, central limit theorem, special properties of normal samples, order statistics, and sample covariance and correlation. Additionally, the author provides links to external resources for students wanting to engage further with the topic. This is simply one of seventeen lessons. They are all easily accessible as the author has formatted his site much like an online textbook.

Siegrist, Kyle

322

Wind-hydrothermal power system dispatching has received intensive attention in recent years because it can help develop reasonable plans for scheduling power generation efficiently. However, future data such as wind power output and power load cannot be accurately predicted, and the complex multiobjective scheduling model is nonlinear; obtaining an accurate solution to such a problem is therefore very difficult. This paper presents an interval programming model with a two-step optimization algorithm to solve the multiobjective dispatching problem. First, we represented the future data as interval numbers and simplified the objective function to a linear programming problem, searching for feasible preliminary solutions to construct the Pareto set. Then the simulated annealing method was used to search for the optimal solution of the initial model. Thorough experimental results suggest that the proposed method performed reasonably well in terms of both operating efficiency and precision. PMID:24895663

Jihong, Qu

2014-01-01

323

46 CFR 176.675 - Extension of examination intervals.

Code of Federal Regulations, 2010 CFR

...2010-10-01 false Extension of examination intervals. 176.675 Section 176.675 Shipping ...Examinations § 176.675 Extension of examination intervals. The intervals between drydock examinations and internal...

2010-10-01

324

29 CFR 452.65 - Interval between nominations and election.

Code of Federal Regulations, 2010 CFR

...2010-07-01 2010-07-01 false Interval between nominations and election...Nominations for Office § 452.65 Interval between nominations and election. The Act specifies no time interval between nominations and election....

2010-07-01

325

46 CFR 61.20-21 - Extension of examination interval.

Code of Federal Regulations, 2010 CFR

...2010-10-01 false Extension of examination interval. 61.20-21 Section 61.20-21...20-21 Extension of examination interval. The Commandant CG-543 may authorize extensions of the interval between tailshaft examinations....

2010-10-01

326

46 CFR 61.20-17 - Examination intervals.

Code of Federal Regulations, 2010 CFR

... 2010-10-01 false Examination intervals. 61.20-17 Section 61.20-17...Equipment § 61.20-17 Examination intervals. (a) A lubricant that demonstrates...purposes of the tailshaft examination interval. (b) Except as provided in...

2010-10-01

327

46 CFR 115.675 - Extension of examination intervals.

Code of Federal Regulations, 2010 CFR

...2010-10-01 false Extension of examination intervals. 115.675 Section 115.675 Shipping ...Examinations § 115.675 Extension of examination intervals. The intervals between drydock examinations and internal...

2010-10-01

328

Six possums (Trichosurus vulpecula) responded under dependent concurrent variable-interval variable-interval schedules of reinforcement. Over 15 conditions, barley-carob was one reinforcer, with the other reinforcer consisting of Coco Pops, coconut, or a barley-carob mixture with 0%, 2%, 4%, or 6% salt added to the barley. The schedules were both variable-interval 40 s. As has been found with other species, behavior on the concurrent schedules was biased by the type of feed, with the 6% salt and the coconut giving the greatest biases towards the barley-carob mixture. The schedules were varied over 17 conditions using the barley-carob mixture alone or the barley-carob mixture versus the mixture with 4% or 6% salt. Both the contingency-discriminability model (Davison & Jenkins, 1985) and the generalized matching law described the data from the three sets of conditions equally well. Both gave similar measures of bias; however, some of the parameter values found with the contingency-discriminability model were uninterpretable. Thus, any argument for this model based on the interpretability of the parameter values becomes weak. It is worth retaining the generalized matching law as a descriptor of such data. PMID:12908759

Bron, Angela; Sumpter, Catherine E; Foster, T Mary; Temple, William

2003-01-01

329

A compact adaptive optics system with 3mm narrow-interval deformable mirror

NASA Astrophysics Data System (ADS)

A 3 mm narrow-interval deformable mirror (DM) with a tip-tilt stage has been developed for astronomical instruments. Benefiting from its compact design, the adaptive optics system can be built with a simple structure and smaller optical elements. First, a 37-element prototype mirror was developed for our 1.8-meter telescope; its actuator interval is 3 mm, maximum tilt ±10', and maximum deformation ±2 μm. Based on this mirror, a simple adaptive optics system was set up and its performance, especially its closed-loop correction ability, was tested in the laboratory. This adaptive optics subsystem is scheduled to be mounted at one folded Cassegrain focus of the 1.8-meter telescope this year, and a comparison test of star-compensation observation using this compact system and a conventional adaptive optics system will be carried out at the same time.

Li, Min; Xue, Lixia; Xian, Hao; Rao, Xuejun; Wei, Kai; Zhang, Xiaojun; Chen, Shanqiu; Zhang, Ang; Chen, Donghong; Rao, ChangHui; Zhang, Yudong

2014-08-01

330

78 FR 26701 - Schedules of Controlled Substances: Placement of Lorcaserin Into Schedule IV

Federal Register 2010, 2011, 2012, 2013

...controlled in Schedule II or Schedule III. 5- HT 2A receptors mediate hallucinogenic properties of other drugs, such as lysergic acid diethylamide (LSD). DEA Response: DEA believes that placement in Schedule IV of the CSA will help restrict...

2013-05-08

331

An Interval Classifier for Database Mining Applications

We are given a large population database that contains information about population instances. The population is known to comprise m groups, but the population instances are not labeled with the group identification. Also given is a population sample (much smaller than the population but representative of it) in which the group labels of the instances are known. We present an interval classifier (IC) which

Rakesh Agrawal; Sakti P. Ghosh; Tomasz Imielinski; Balakrishna R. Iyer; Arun N. Swami

1992-01-01

332

Actions and Events in Interval Temporal Logic

We present a representation of events and action based on interval temporal logic that is significantly more expressive and more natural than most previous AI approaches. The representation is motivated by work in natural language semantics and discourse, temporal logic, and AI planning and plan recognition. The formal basis of the representation is presented in detail, from the axiomatization of
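Allen's thirteen basic interval relations, the vocabulary such representations build on, can be classified directly from endpoints (a sketch assuming closed intervals as (start, end) tuples; the naming follows the standard literature, not this paper's notation):

```python
def allen_relation(a, b):
    """Return which of Allen's 13 interval relations holds between a and b."""
    (a1, a2), (b1, b2) = a, b
    if a2 < b1:
        return "before"
    if b2 < a1:
        return "after"
    if a2 == b1:
        return "meets"
    if b2 == a1:
        return "met-by"
    if (a1, a2) == (b1, b2):
        return "equals"
    if a1 == b1:                      # same start, different end
        return "starts" if a2 < b2 else "started-by"
    if a2 == b2:                      # same end, different start
        return "finishes" if a1 > b1 else "finished-by"
    if b1 < a1 and a2 < b2:
        return "during"
    if a1 < b1 and b2 < a2:
        return "contains"
    return "overlaps" if a1 < b1 else "overlapped-by"
```

Each pair of intervals satisfies exactly one relation, which is what makes the algebra usable for reasoning about actions and events.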

James F. Allen; George Ferguson

1994-01-01

333

Interval regression analysis by quadratic programming approach

When we use linear programming in possibilistic regression analysis, some coefficients tend to become crisp because of the characteristic of linear programming. On the other hand, a quadratic programming approach gives more diverse spread coefficients than a linear programming one. Therefore, to overcome the crisp characteristic of linear programming, we propose interval regression analysis based on a quadratic programming approach.

Hideo Tanaka; Haekwan Lee

1998-01-01

334

ConcepTest: EQ Recurrence Interval

NSDL National Science Digital Library

Plate A moves 3 cm/yr past plate B. A typical earthquake along the plate boundary has 6 meters of slip. What is the recurrence interval of the earthquake? a. 0.2 years b. 2 years c. 200 years d. 2000 years
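The intended arithmetic, slip per event divided by the slip accumulation rate, can be checked directly (a worked sketch of the ConcepTest computation):

```python
slip_per_event_m = 6.0       # meters of slip in a typical earthquake
plate_rate_m_per_yr = 0.03   # plate A moves 3 cm/yr past plate B
recurrence_yr = slip_per_event_m / plate_rate_m_per_yr
# 6 m / 0.03 m/yr = 200 years, i.e. choice (c)
```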

335

Interval Computations BIBLIOGRAPHY OF SOVIET WORKS ON

methods for solving the initial value problem for systems of ordinary differential equations. Preprint 2 of ordinary differential equations. Computer Center, Siberian Branch of the USSR Academy of Sciences of interval solving for a class of initial value problems for systems of ordinary differential equations". Inf

Kearfott, R. Baker

336

Interval Analysis and Reliability in Robotics

Interval Analysis and Reliability in Robotics. J-P. Merlet, INRIA, Sophia-Antipolis, France. Abstract: A robot is typical of systems that are inherently subject to uncertainties although they should be highly reliable (e.g., a robot used in surgical applications). The sources of uncertainties

Paris-Sud XI, Université de

337

Confidence Trick: The Interpretation of Confidence Intervals

ERIC Educational Resources Information Center

The frequent misinterpretation of the nature of confidence intervals by students has been well documented. This article examines the problem as an aspect of the learning of mathematical definitions and considers the tension between parroting mathematically rigorous, but essentially uninternalized, statements on the one hand and expressing…

Foster, Colin

2014-01-01

338

Toward Using Confidence Intervals to Compare Correlations

ERIC Educational Resources Information Center

Confidence intervals are widely accepted as a preferred way to present study results. They encompass significance tests and provide an estimate of the magnitude of the effect. However, comparisons of correlations still rely heavily on significance testing. The persistence of this practice is caused primarily by the lack of simple yet accurate…

Zou, Guang Yong

2007-01-01

339

Coefficient Omega Bootstrap Confidence Intervals: Nonnormal Distributions

ERIC Educational Resources Information Center

The performance of the normal theory bootstrap (NTB), the percentile bootstrap (PB), and the bias-corrected and accelerated (BCa) bootstrap confidence intervals (CIs) for coefficient omega was assessed through a Monte Carlo simulation under conditions not previously investigated. Of particular interests were nonnormal Likert-type and binary items.…

Padilla, Miguel A.; Divers, Jasmin

2013-01-01

340

Coefficient Alpha Bootstrap Confidence Interval under Nonnormality

ERIC Educational Resources Information Center

Three different bootstrap methods for estimating confidence intervals (CIs) for coefficient alpha were investigated. In addition, the bootstrap methods were compared with the most promising coefficient alpha CI estimation methods reported in the literature. The CI methods were assessed through a Monte Carlo simulation utilizing conditions…

Padilla, Miguel A.; Divers, Jasmin; Newton, Matthew

2012-01-01

341

Interval Analysis for Unknown Dependencies and Genetic

Interactions and Market Power. Final Project Report, Power Systems Engineering Research Center, a National Science Research Center. Interval Analysis for Unknown Dependencies and Genetic Algorithm Emulation of Markets; Preface: The Power Systems Engineering Research Center sponsored the research project titled "Market

342

MEETING DATA QUALITY OBJECTIVES WITH INTERVAL INFORMATION

Immunoassay test kits are promising technologies for measuring analytes under field conditions. Frequently, these field-test kits report the analyte concentrations as falling in an interval between minimum and maximum values. Many project managers use field-test kits only for scr...

343

Interval coding. II. Dendrite-dependent mechanisms.

The rich temporal structure of neural spike trains provides multiple dimensions to code dynamic stimuli. Popular examples are spike trains from sensory cells where bursts and isolated spikes can serve distinct coding roles. In contrast to analyses of neural coding, the cellular mechanics of burst mechanisms are typically elucidated from the neural response to static input. Bridging the mechanics of bursting with coding of dynamic stimuli is an important step in establishing theories of neural coding. Electrosensory lateral line lobe (ELL) pyramidal neurons respond to static inputs with a complex dendrite-dependent burst mechanism. Here we show that in response to dynamic broadband stimuli, these bursts lack some of the electrophysiological characteristics observed in response to static inputs. A simple leaky integrate-and-fire (LIF)-style model with a dendrite-dependent depolarizing afterpotential (DAP) is sufficient to match both the output statistics and coding performance of experimental spike trains. We use this model to investigate a simplification of interval coding where the burst interspike interval (ISI) codes for the scale of a canonical upstroke rather than a multidimensional stimulus feature. Using this stimulus reduction, we compute a quantization of the burst ISIs and the upstroke scale to show that the mutual information rate of the interval code is maximized at a moderate DAP amplitude. The combination of a reduced description of ELL pyramidal cell bursting and a simplification of the interval code increases the generality of ELL burst codes to other sensory modalities. PMID:17409177

Doiron, Brent; Oswald, Anne-Marie M; Maler, Leonard

2007-04-01

344

1996 Wholesale Power and Transmission Rate Schedules.

Bonneville Power Administration's (BPA) 1996 Wholesale Power Rate Schedules, 1996 Ancillary Products and Services Rate Schedule, 1996 Transmission Rate Schedules, and General Rate Schedule Provisions, contained herein, were approved on an interim basis effective October 1, 1996. These rate schedules and provisions were approved by the Federal Energy Regulatory Commission (FERC), United States Department of Energy, in September 1996 (Docket Nos. EF96-2011-000 and EF96-2021-000). These rate schedules and General Rate Schedule Provisions were approved on a final basis by the FERC July 30, 1997, in Dept. of Energy--Bonneville Power Administration, Docket Nos. EF96-2011-000 and EF96-2021-000. Except as noted elsewhere, these 1996 rate schedules and provisions supersede BPA's Wholesale Power Rate Schedules and General Rate Schedule Provisions, and Transmission Rate Schedules and General Transmission Rate Schedule Provisions, effective October 1, 1995. These rate schedules and general rate schedule provisions include all errata.

United States. Bonneville Power Administration.

1996-10-01

345

Randomness + Determinism = Why Random Processes Could Be

: pure randomness is unable to focus on efficient moves, and a deterministic process alone would lack a reliable source of randomness, and civilizations using these rituals in their decision process could thus Randomness + Determinism = Progresses: Why Random Processes Could Be Favored by Evolution. Nicolas

Fondements et Applications, UniversitÃ© Paris 7

346

A Mixed Integer Linear Program for Airport Departure Scheduling

NASA Technical Reports Server (NTRS)

Aircraft departing from an airport are subject to numerous constraints while scheduling departure times. These constraints include wake-separation constraints for successive departures, miles-in-trail separation for aircraft bound for the same departure fixes, and time-window or prioritization constraints for individual flights. Besides these, emissions as well as increased fuel consumption due to inefficient scheduling need to be included. Addressing all the above constraints in a single framework while allowing for resequencing of the aircraft using runway queues is critical to the implementation of the Next Generation Air Transport System (NextGen) concepts. Prior work on airport departure scheduling has addressed some of the above. However, existing methods use pre-determined runway queues, and schedule aircraft from these departure queues. The source of such pre-determined queues is not explicit, and could potentially be a subjective controller input. Determining runway queues and scheduling within the same framework would potentially result in better scheduling. This paper presents a mixed integer linear program (MILP) for the departure-scheduling problem. The program takes as input the incoming sequence of aircraft for departure from a runway, along with their earliest departure times and an optional prioritization scheme based on time-window of departure for each aircraft. The program then assigns these aircraft to the available departure queues and schedules departure times, explicitly considering wake separation and departure fix restrictions to minimize total delay for all aircraft. The approach is generalized and can be used in a variety of situations, and allows for aircraft prioritization based on operational as well as environmental considerations. We present the MILP in the paper, along with benefits over the first-come-first-serve (FCFS) scheme for numerous randomized problems based on real-world settings. 
The MILP results in substantially reduced delays as compared to FCFS, and the magnitude of the savings depends on the queue and departure fix structure. The MILP assumes deterministic aircraft arrival times at the runway queues. However, due to taxi-time uncertainty, aircraft might arrive either earlier or later than these deterministic times. Thus, to incorporate this uncertainty, we present a method for using the MILP with "overlap discounted rolling planning horizon". The approach is based on valuing near-term decision results more than future ones. We develop a model of taxi-time uncertainty based on real-world data, and then compare the baseline FCFS delays with delays using the above MILP in a simple rolling-horizon method and in the overlap discounted scheme.
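The paper's MILP cannot be reconstructed from the abstract, but the benefit of resequencing over FCFS under wake-separation constraints can be illustrated with a brute-force toy (all numbers hypothetical; a real formulation would use a MILP solver and the paper's full constraint set):

```python
from itertools import permutations

earliest = [0, 5, 10]          # earliest runway times (s), hypothetical
heavy = [True, False, False]   # aircraft 0 is a heavy, requiring more wake separation

def sep_after(i):
    """Wake separation (s) required behind aircraft i (toy values)."""
    return 120 if heavy[i] else 60

def total_delay(order):
    """Total delay incurred if aircraft depart in the given order."""
    t, prev, delay = 0, None, 0
    for i in order:
        ready = earliest[i] if prev is None else max(earliest[i], t + sep_after(prev))
        delay += ready - earliest[i]
        t, prev = ready, i
    return delay

fcfs = total_delay((0, 1, 2))                               # first-come-first-serve
best = min(total_delay(p) for p in permutations(range(3)))  # optimal resequencing
```

Here FCFS incurs 285 s of total delay, while departing the two lighter aircraft first cuts it to 180 s, the kind of saving the MILP obtains at scale.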

Gupta, Gautam; Jung, Yoon Chul

2009-01-01

347

User requirements for a patient scheduling system

NASA Technical Reports Server (NTRS)

A rehabilitation institute's needs and wants from a scheduling system were established by (1) studying the existing scheduling system and the variables that affect patient scheduling, (2) conducting a human-factors study to establish the human interfaces that affect patients' meeting prescribed therapy schedules, and (3) developing and administering a questionnaire to the staff which pertains to the various interface problems in order to identify staff requirements to minimize scheduling problems and other factors that may limit the effectiveness of any new scheduling system.

Zimmerman, W.

1979-01-01

348

NSDL National Science Digital Library

With hectic schedules serving as the order of the day, many persons may find this application more than a bit handy. Schedule World is a free calendaring and scheduling client that utilizes open standards. The application allows users to schedule and receive meetings to and from Microsoft Exchange or Outlook, among other programs. Additionally, the application provides TV-listings for 13 countries and global weather schedules as well. Schedule World 1.3.8 is compatible with all systems running Mac OS X.

349

Hubble Systems Optimize Hospital Schedules

NASA Technical Reports Server (NTRS)

Don Rosenthal, a former Ames Research Center computer scientist who helped design the Hubble Space Telescope's scheduling software, co-founded Allocade Inc. of Menlo Park, California, in 2004. Allocade's OnCue software helps hospitals reclaim unused capacity and optimize constantly changing schedules for imaging procedures. After starting to use the software, one medical center soon reported noticeable improvements in efficiency, including a 12 percent increase in procedure volume, 35 percent reduction in staff overtime, and significant reductions in backlog and technician phone time. Allocade now offers versions for outpatient and inpatient magnetic resonance imaging (MRI), ultrasound, interventional radiology, nuclear medicine, Positron Emission Tomography (PET), radiography, radiography-fluoroscopy, and mammography.

2009-01-01

350

Integrated production and maintenance scheduling

[Table-of-contents excerpt: Levels of Processing Times and Rate Modifiers; Experiment 1: comparing methods ER1, ER2, ER3 with PC on total CPU time and number of schedules explored; Experiment 2: percentage of replications in which the optimal solution was found with methods ER1, ER2, ER3, G1, G2, plus comparisons of total number of schedules explored and total CPU time for ER1, ER2, ER3, PC.]

Whitaker, Laura Oakes

2012-06-07

351

A planning and scheduling lexicon

NASA Technical Reports Server (NTRS)

A lexicon related to mission planning and scheduling for spacecraft is presented. Planning and scheduling work is known as sequencing. Sequencing is a multistage process of merging requests from both the science and engineering arenas to accomplish the objectives defined in the requests. The multistage process begins with the creation of science and engineering goals, continues through their integration into the sequence, and eventually concludes with command execution onboard the spacecraft. The objective of this publication is to introduce some formalism into the field of spacecraft sequencing-system technology. This formalism will make it possible for researchers and potential customers to communicate about system requirements and capabilities in a common language.

Cruz, Jennifer W.; Eggemeyer, William C.

1989-01-01

352

Background Human circadian rhythms are regulated by the interplay between circadian genes and environmental stimuli. The influence of altered sleep/wake schedules or light on human circadian gene expression patterns is not well characterized. Methods Twenty-one participants were asked to keep to their usual sleep schedules and two blood samples were drawn at the end of the first week for each subject based upon estimated time of dim light melatonin onset (DLMO); the first sample was obtained one and a half hours before the estimated DLMO and the second three hours later, at one and a half hours after the estimated DLMO. During the second week, participants were randomized into two groups, one that received a one hour “blue” light (λmax = 470 nm) exposure in the morning and one that received a comparable morning “dim” light exposure. Two blood samples were obtained at the same clock times as the previous week at the end of the second week. Results We measured the expression of 10 circadian genes in response to sleep/wake schedule advancement and morning “blue” light stimulation in the peripheral blood of 21 young adults during a two week field study. We found that nine of the 10 circadian genes showed significant expression changes from the first to the second week for participants in both the “blue” and “dim” light groups, likely reflecting significant advances in circadian time. Conclusions This wholesale change in circadian clock gene expression may reflect significant advances in circadian time (i.e., advance in DLMO) from the first to the second week resulting from the advanced, daily personal light exposures. PMID:23562184

Zhu, Yong; Fu, Alan; Hoffman, Aaron E.; Figueiro, Mariana G.; Carskadon, Mary A.; Sharkey, Katherine M.; Rea, Mark S.

2013-01-01

353

Sensitivity of time allocation to concurrent-schedule reinforcement

Four pigeons were trained on concurrent variable-interval schedules programmed on a center response key, with access to those schedules controlled by responses on left or right side keys. Two procedures were used. In one, the pigeon was given limited access, in that each side-key response produced 3-s access to a center-key schedule, and in the other procedure, access was unlimited. Data were analyzed using the generalized matching law. Comparison of sensitivities to reinforcement of interchangeover time for both procedures showed them to be of similar magnitude. Response sensitivities were also similar in magnitude for both procedures. From the limited-access procedure a second time measure that was available, switched-in time, was relatively uncontaminated by time spent emitting behavior other than key pecking. Sensitivities to reinforcement for the switched-in time measure were always smaller than interchangeover-time sensitivities for either procedure, and were approximately equal to response sensitivities for the limited-access procedure. Two other access times (5 and 7.5 s) were studied to validate the choice of 3 s as the main access time. These results indicate that when time spent emitting other behavior is excluded from interchangeover time, time and response sensitivities will be approximately equal. PMID:16812427
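The generalized matching law used in these analyses fits log behavior ratios against log reinforcer ratios; a minimal least-squares sketch (function name illustrative, not the authors' code):

```python
import math

def fit_generalized_matching(behavior_ratios, reinforcer_ratios):
    """Fit log(B1/B2) = a * log(R1/R2) + log c by ordinary least squares;
    returns (sensitivity a, log bias log c)."""
    xs = [math.log10(r) for r in reinforcer_ratios]
    ys = [math.log10(b) for b in behavior_ratios]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx
```

Perfect matching returns sensitivity 1 and log bias 0; undermatching shows up as a < 1, and a consistent preference (e.g., for one feed type) as a nonzero intercept.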

Aldiss, Matthew; Davison, Michael

1985-01-01

354

Colour cues facilitate learning flower refill schedules in wild hummingbirds.

Free-living hummingbirds can learn the refill schedules of individual experimental flowers but little is known about what information they use to do this. Colour cues, in particular, may be important to hummingbirds when learning about rewarded flower properties. We investigated, therefore, whether colour cues facilitated the learning of flower refill schedules in wild, free-living rufous hummingbirds (Selasphorus rufus). In the Cued condition, we presented birds with an array of six flowers, three of one colour, each of which were refilled 10min after being emptied by the bird and three of a different colour, which were refilled 20min after being emptied. In the Uncued condition we presented birds with six flowers of the same colour, three of which were refilled after 10min and three of which were refilled after 20min as for the birds in the Cued condition. In the second part of the experiment, we moved the array 2m and changed the shape of the array. Across both phases, birds in the Cued condition learned to discriminate between 10 and 20-min flowers more quickly than did the birds in the Uncued condition. The Cued birds were also better at discriminating between the two distinct refill intervals. Colour cues can, therefore, facilitate learning the refill schedules of experimental flowers in these birds. This article is part of a Special Issue entitled: Cognition in the wild. PMID:25234604

Samuels, Michael; Hurly, T Andrew; Healy, Susan D

2014-11-01

355

Low-bias high-speed quantum random number generator via shaped optical pulses

Low-bias high-speed quantum random number generator via shaped optical pulses Michael A. Wayne generator (QRNG) based on the digitized time interval between random photon arrivals. By tailoring, secure quantum random number generation at rates exceeding 110 Mbit/s. ©2010 Optical Society of America

Kwiat, Paul

356

CONTOURING RANDOMLY SPACED DATA

NASA Technical Reports Server (NTRS)

This program prepares contour plots of three-dimensional randomly spaced data. The contouring techniques use a triangulation procedure developed by Dr. C. L. Lawson of the Jet Propulsion Laboratory which allows the contouring of randomly spaced input data without first fitting the data into a rectangular grid. The program also allows contour points to be fitted with a smooth curve using an interpolating spline under tension. The input data points to be contoured are read from a magnetic tape or disk file with one record for each data point. Each record contains the X and Y coordinates, value to be contoured, and an alternate contour value (if applicable). The contour data is then partitioned by the program to reduce core storage requirements. Output consists of the contour plots and user messages. Several output options are available to the user such as: controlling which value in the data record is to be contoured, whether contours are drawn by polygonal lines or by a spline under tension (smooth curves), and controlling the contour level labels which may be suppressed if desired. The program can handle up to 56,000 data points and provide for up to 20 contour intervals for a multiple number of parameters. This program was written in FORTRAN IV for implementation on a CDC 6600 computer using CALCOMP plotting capabilities. The field length required is dependent upon the number of data points to be contoured. The program requires 42K octal storage locations plus the larger of: 24 times the maximum number of points in each data partition (defaults to maximum of 1000 data points in each partition with 20 percent overlap) or 2K plus four times the total number of points to be plotted. This program was developed in 1975.

Hamm, R. W.

1994-01-01

357

Constraint-based Attribute and Interval Planning

NASA Technical Reports Server (NTRS)

In this paper we describe Constraint-based Attribute and Interval Planning (CAIP), a paradigm for representing and reasoning about plans. The paradigm enables the description of planning domains with time, resources, concurrent activities, mutual exclusions among sets of activities, disjunctive preconditions and conditional effects. We provide a theoretical foundation for the paradigm, based on temporal intervals and attributes. We then show how the plans are naturally expressed by networks of constraints, and show that the process of planning maps directly to dynamic constraint reasoning. In addition, we define compatibilities, a compact mechanism for describing planning domains. We describe how this framework can incorporate the use of constraint reasoning technology to improve planning. Finally, we describe EUROPA, an implementation of the CAIP framework.

Jonsson, Ari; Frank, Jeremy

2013-01-01

358

Fast interval-valued statistical interconnect modeling and reduction

Correlated interval representations of range uncertainty offer an attractive solution for approximating computations on statistical quantities. The key idea is to use finite intervals to approximate the essential mass of a pdf as it moves through numerical operators; the resulting compact interval-valued solution can be easily interpreted as a statistical distribution and efficiently sampled. This paper describes improved interval-valued algorithms
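The primitive underlying such interval-valued computation is ordinary interval arithmetic; a minimal sketch (this is the textbook operation set, not the paper's correlated-interval algorithm, which tightens bounds by tracking dependencies):

```python
class Interval:
    """Closed interval [lo, hi] with conservative arithmetic: each result
    encloses every value obtainable from operands drawn from the inputs."""

    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # endpoints of the product lie among the four endpoint products
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"
```

Multiplying [1, 2] by [-3, 4] yields [-6, 8]; pushing such intervals through a numerical operator approximates how the essential mass of a pdf moves through it.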

James D. Z. Ma; Rob A. Rutenbar

2005-01-01

359

Purpose Suboptimal bowel preparation can result in decreased neoplasia detection, shortened surveillance intervals, and increased costs. We assessed bowel preparation recommendations and the relationship to self-reported proportion of suboptimal bowel preparations in practice; and evaluated the impact of suboptimal bowel preparation on colonoscopy surveillance practices. A random sample of a national organization of gastroenterologists in the U.S. was surveyed. Methods Demographic and practice characteristics, bowel preparation regimens, and proportion of suboptimal bowel preparations in practice were ascertained. Recommended follow-up colonoscopy intervals were evaluated for optimal and suboptimal bowel preparation and select clinical scenarios. Results We identified 6,777 physicians, of whom 1,354 were randomly selected; 999 were eligible, and 288 completed the survey. Higher proportion of suboptimal bowel preparations/week (≥10 %) was associated with hospital/university practice, teaching hospital affiliation, >25 % Medicaid insured patients, recommendation of PEG alone and sulfate-free. Those reporting >25 % Medicare and privately insured patients, split dose recommendation, and use of MoviPrep® were associated with a <10 % suboptimal bowel preparations/week. Shorter surveillance intervals for three clinical scenarios were reported for suboptimal preparations and were shortest among participants in the Northeast who more often recommended early follow-up for normal findings and small adenomas. Those who recommended 4-L PEG alone more often advised <1 year surveillance interval for a large adenoma. Conclusions Our study demonstrates significantly shortened surveillance interval recommendations for suboptimal bowel preparation and that these interval recommendations vary regionally in the United States. Findings suggest an interrelationship between dietary restriction, purgative type, and practice and patient characteristics that warrant additional research. 
PMID:22885884

Basch, Corey H.; Lebwohl, Benjamin; Basch, Charles E.; Kastrinos, Fay; Insel, Beverly J.; Neugut, Alfred I.

2013-01-01

360

Determining Earthquake Recurrence Intervals from Trench Logs

NSDL National Science Digital Library

Trench logs of the San Andreas Fault at Pallett Creek, CA are the data base for a lab or homework assignment that teaches about relative dating, radiometric dating, fault recurrence intervals and the reasons for uncertainty in predicting geologic phenomena. Students are given a trench log that includes several fault strands and dated stratigraphic horizons. They estimate the times of faulting based on bracketing ages of faulted and unfaulted strata. They compile a table with the faulting events from the trench log and additional events recognized in nearby trenches, then calculate maximum, minimum and average earthquake recurrence intervals for the San Andreas Fault in this area. They conclude by making their own prediction for the timing of the next earthquake. While basically an exercise in determining relative ages of geologic horizons and events, this assignment includes radiometric dates, recurrence intervals, and an obvious societal significance that has been well received by students. With minor modifications, this exercise has been used successfully with elementary school students through university undergraduate geology majors. Less experienced students can work in groups, with each group determining the age of a single fault strand; combining the results from different groups and calculating recurrence intervals can then be done as a class activity. University students in an introductory geology course for non-majors can add their data from the trench log to an existing table with other faulting events already provided. The exercise can be made more challenging for advanced students by using logs from several different trenches, requiring students to design the table themselves, and giving students the uncertainties for the radiometric dates rather than simple ages for the strata. Most students, at all levels, are initially frustrated by their inability to determine an exact date of faulting from the available data.
They gain a new appreciation for the task of the geoscientist who attempts to relate geologic phenomena to the human, rather than geologic, time scale.
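The recurrence-interval arithmetic the exercise asks for is simple to sketch. The event dates below are hypothetical placeholders, not the actual Pallett Creek ages:

```python
# Hypothetical faulting-event dates (years CE); NOT the Pallett Creek data.
event_dates = [1857, 1812, 1680, 1550, 1350]

dates = sorted(event_dates)
# Recurrence intervals are the gaps between successive dated events.
intervals = [b - a for a, b in zip(dates, dates[1:])]

print("intervals:", intervals)
print("min:", min(intervals), "max:", max(intervals))
mean_interval = sum(intervals) / len(intervals)
print("mean:", mean_interval)

# A naive forecast adds the mean interval to the most recent event,
# which is exactly why the uncertainties matter so much in practice.
print("naive next-event estimate:", dates[-1] + mean_interval)
```

With these placeholder dates the intervals are 200, 130, 132, and 45 years, illustrating how widely the maximum, minimum, and average can differ.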

Cashman, Patricia

361

On the Interval Number of Special Graphs

On the Interval Number of Special Graphs. József Balogh, Pascal Ochem, and András Pluhár. ...then t ≥ ⌈(e(G) + 1)/v(G)⌉. Thus the equality is attained, for example, by d-regular, triangle-free graphs... is rich if it contains a triangle TH such that, after deleting the edges of TH, the remaining graph has at most

Ochem, Pascal

362

Interval Forecasting of Crude Oil Price

Uncertainty is the main obstacle in predicting crude oil price. Although there are various models and computational methods for crude oil price forecasting in the literature, most of them do not effectively predict the variability of crude oil price due to uncertainty. Very recently, Hu and He [2] reported using an ILS (Interval Least Squares) approach to forecast the stock market

Shanying Xu; Xi Chen; Ai Han

2008-01-01

363

Local complexity functions of interval exchange transformations

NASA Astrophysics Data System (ADS)

We studied numerically the local complexity functions of interval exchange transformations. We have shown that they grow linearly in time, as does the ε-complexity function. Moreover, we found that they also depend linearly on ε, where ε is the Lebesgue measure of a set of initial points. This allows us to hypothesize that the dimension of the measure related to the ε-complexity function could be determined by studying the dependence of the local complexity functions on ε.

Afraimovich, V.; Rechtman, R.

2009-04-01

364

Kriging Prediction Intervals Based on Semiparametric Bootstrap

Kriging is a widely used method for prediction, which, given observations of a (spatial) process, yields the best linear unbiased predictor of the process at a new location. The construction of corresponding prediction intervals typically relies on Gaussian assumptions. Here we show that the distribution of kriging predictors for non-Gaussian processes may be far from Gaussian, even asymptotically. This emphasizes

Lina Schelin; Sara Sjöstedt-de Luna

2010-01-01

365

A Dynamic Programming Algorithm for Robust Runway Scheduling Bala Chandran and Hamsa Balakrishnan

sequences, the uncertainty in the system could result in the aircraft violating important safety constraints throughput and the probability that random deviations of aircraft from the schedule violate system successive aircraft, earliest and latest times for each aircraft, precedence constraints among aircraft

Gummadi, Ramakrishna

366

Time-Utility Function-Driven Switched Ethernet: Packet Scheduling Algorithm, Implementation,

...real-time applications due to the randomness in Ethernet's CSMA/CD protocol, the Ethernet is still attractive for real-time... Ethernet efforts such as TDMA [1], token-passing techniques [2], [3], Virtual Time Protocols [4], [5], [6]...

Ravindran, Binoy

367

In this paper, a HGA (hybrid genetic algorithm) is proposed for permutation flowshop scheduling problems (PFSP) with total flowtime minimization, which are known to be NP-hard. One of the chromosomes in the initial population is constructed by a suitable heuristic and the others are yielded randomly. An artificial chromosome is generated by a weighted simple mining gene structure, with which

Yi Zhang; Xiaoping Li; Qian Wang

2009-01-01

368

LOW-DELAY SCHEDULING FOR GRASSMANNIAN BEAMFORMING WITH A SINR Claude Simon and Geert Leus

division multiple access (SDMA) [1]. A popular low-complexity joint-beamforming-and-scheduling algorithm is opportunistic SDMA (OSDMA) [2]. OSDMA uses a random set of orthogonal beamforming vectors. The users calculate... An algorithm that switches from TDMA to SDMA based on statistical assumptions has been proposed in [6].

Langendoen, Koen

369

An Experimental Study of Scheduling and Duration of "Tier 2" First-Grade Reading Intervention

ERIC Educational Resources Information Center

This study compared the effects on reading outcomes of delivering supplemental, small-group intervention to first-grade students at risk for reading difficulties randomly assigned to one of three different treatment schedules: extended (4 sessions per week, 16 weeks; n = 66), concentrated (4 sessions per week, 8 weeks; n = 64), or distributed (2…

Denton, Carolyn A.; Cirino, Paul T.; Barth, Amy E.; Romain, Melissa; Vaughn, Sharon; Wexler, Jade; Francis, David J.; Fletcher, Jack M.

2011-01-01

370

A note on exact algorithms for the identical parallel machine scheduling problem

A recent paper by E. Mokotoff presents an exact algorithm for the classical P||Cmax scheduling problem, evaluating its average performance through computational experiments on a series of randomly generated test problems. It is shown that, on the same types of instances, an exact algorithm proposed 10 years ago by the authors of the present note outperforms the new algorithm

Mauro Dell’Amico; Silvano Martello

2005-01-01

371

Sampling intervals verification in pharmacokinetics studies.

Regulatory agencies do not specify how to plan the sampling intervals in pharmacokinetics (PK) studies. Each interval between successive sampling points forms one of the fractions of the area under the curve (AUC). The aim of this study is to propose a method for the qualitative evaluation of PK studies on the basis of an analysis of the values of the partial AUC fields. For the pharmacokinetic analysis, average concentrations of the high-variability drug itraconazole were used before (BO) and after (AO) sampling-interval optimization. PK calculations were performed using Phoenix™ WinNonlin 6.3® (Certara L.P.) and the in-house software Biokinetica 4.0. An arithmetic formula and an acceptance limit (AL%) were established, below which the mean of partial fields (MAF) value in a PK study can be considered optimal. For MAF, the CV% value was 125.35 before optimization and 46.51 after. For the BO data, the AUC fractions of several partial fields exceeded the AL% value; for the AO data, no AUC fraction exceeded the established AL%. The paper proposes an empirical method of quality assessment based on the percentages of the AUC fractions. This method can be used in the quality assessment of PK studies. PMID:24327523
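The partial-AUC bookkeeping the method relies on can be sketched with the trapezoidal rule. The concentration-time data below are hypothetical, and the study's specific AL% formula is not reproduced here:

```python
import statistics

# Hypothetical concentration-time data (h, ng/mL); not the itraconazole
# data analyzed in the study.
times = [0, 0.5, 1, 2, 4, 8, 12, 24]
conc  = [0, 40, 110, 150, 120, 60, 30, 5]

# Trapezoidal partial AUC for each sampling interval ("partial field").
partials = [(t2 - t1) * (c1 + c2) / 2
            for t1, t2, c1, c2 in zip(times, times[1:], conc, conc[1:])]
auc = sum(partials)

# Each field expressed as a percentage of the total AUC.
fractions = [100 * p / auc for p in partials]

# Dispersion of the partial fields; the study compares this kind of CV%
# before (BO) and after (AO) optimizing the sampling intervals.
cv = 100 * statistics.stdev(partials) / statistics.mean(partials)
print(fractions, round(cv, 1))
```

Intervals whose fraction of the AUC is disproportionately large or small flag sampling points that the optimization would move.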

Grabowski, T; Jaroszewski, J J; Sasinowska-Motyl, M; Bujalska-Zadrożny, M

2014-08-01

372

Block Scheduling: Three Years Later.

ERIC Educational Resources Information Center

This is a followup study of teacher perceptions regarding block scheduling. The original study was done in 1996 at a small city high school in a predominantly rural county in Ohio. At that time, lack of communication was found to be the central theme in the resistance that emerged. This paper is based on data from written responses to open-ended…

Corley, Edward L.

373

Scheduled Castes and Land Deprivation

The paper is more in the form of setting broad contours for a comprehensive buildup. It is divided into four sections. The first section discusses the landlessness or near-landlessness of the majority of Scheduled Caste households and the benefits of land assigned to them as part of a redistributive land reform strategy. The second section analyses the impact of redistributive land

D. Narasimha Reddy

374

Scheduling Algebra Rob van Glabbeek

for this process is: a; (b || c; d) || e. To visualize a schedule, the so-called GANTT chart [Cla35] is used... of the action (see fig. 2). Fig. 2: GANTT diagram. It has already been observed [BEP96] that in an efficient

Pratt, Vaughan

376

Online Scheduling with Bounded Migration

Consider the classical online scheduling problem where jobs that arrive one by one are assigned to identical parallel machines with the objective of minimizing the makespan. We generalize this problem by allowing the current assignment to be changed whenever a new job arrives, subject to the constraint that the total size of moved jobs is bounded by
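For context, the makespan objective can be illustrated with classic greedy list scheduling on identical machines. This baseline sketch assigns each arriving job without any migration; it is an illustration of the problem setting, not the bounded-migration algorithm of the paper:

```python
import heapq

def list_schedule(jobs, m):
    """Greedy online assignment: each arriving job goes to the least
    loaded of m identical machines (no migration). Returns the makespan,
    i.e. the maximum machine load after all jobs are placed."""
    loads = [0.0] * m
    heapq.heapify(loads)
    for size in jobs:
        least = heapq.heappop(loads)      # machine with the smallest load
        heapq.heappush(loads, least + size)
    return max(loads)

print(list_schedule([2, 3, 4, 6, 2, 2], 2))  # → 10.0
```

Allowing a bounded amount of reassignment when a job arrives, as the paper does, can only improve on this greedy baseline.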

Peter Sanders; Naveen Sivadasan; Martin Skutella

2004-01-01

377

Flexible Schedules and Shift Work.

ERIC Educational Resources Information Center

Flexible work hours have gained prominence, as more than 25 million workers (27.6% of all full-time workers) can now vary their schedules. However, there has been little change since the mid-1980s in the proportion who work a shift other than a regular daytime shift. (JOW)

Beers, Thomas M.

2000-01-01

378

NCI Shady Grove Shuttle Schedule

NCI Shady Grove Shuttle Service Schedule Live Shuttle Route: http://ncishadygroveshuttle.com/map Effective March 2014 Shady Grove Metro Shuttle NCI Express Shuttle NCI SG SG Metro -- 6:00 AM -- 6:15 AM 6:15 AM 6:30 AM 6:30 AM 6:45 AM 6:45 AM

379

Highly efficient gang scheduling implementation

A new and more highly efficient gang scheduling implementation technique is the basis for this paper. Network preemption, in which network interface contexts are saved and restored, has already been proposed to enable parallel applications to perform efficient user-level communication. This network preemption technique can also be used for detecting the global state, such as deadlock, of a parallel program execution. A

Atsushi Hori; Hiroshi Tezuka; Yutaka Ishikawa

1998-01-01

380

Packing Schemes for Gang Scheduling

Jobs that do not require all processors in the system can be packed together for gang scheduling. We examine accounting traces from several parallel computers to show that indeed many jobs have small sizes and can be packed together. We then formulate a number of such packing algorithms, and evaluate their effectiveness using simulations based on our workload study. The results are that two
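One elementary packing scheme of this kind is plain first fit: place each gang-scheduled job into the first time slice with enough free processors. This is a sketch of the general idea, not necessarily one of the specific algorithms evaluated in the paper:

```python
def first_fit_pack(job_sizes, processors):
    """First-fit packing of gang-scheduled jobs into time slices.
    Each slice holds jobs whose processor demands sum to at most
    `processors`. Fewer slices means less time-slicing overhead."""
    slices = []  # each slice: [remaining capacity, list of job sizes]
    for size in job_sizes:
        for s in slices:
            if s[0] >= size:          # fits in an existing slice
                s[0] -= size
                s[1].append(size)
                break
        else:                         # no slice fits: open a new one
            slices.append([processors - size, [size]])
    return [s[1] for s in slices]

print(first_fit_pack([8, 4, 4, 2, 6, 8], 16))  # → [[8, 4, 4], [2, 6, 8]]
```

Here six jobs on a 16-processor machine pack into two full time slices, so every job runs half the time instead of one sixth.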

Dror G. Feitelson

1996-01-01

381

Emeriti Hiking Group 2012 Schedule

Emeriti Hiking Group 2012 Schedule Dedicated to Health, Happiness, and Camaraderie Friday January 6-no hike Friday December 7 Hike and Wassail Barbara West 801-484-2996 For further information please call the Ohlsens at 801-364-2159 On hike day Cell Phone is 801-598-1237

Provancher, William

382

Emeriti Hiking Group 2013 Schedule

Emeriti Hiking Group 2013 Schedule Friday January 11 Spruces Snowshoe Stroll Bill Ohlsen, 801-273-9821 Friday Nov, 22 Jordan River S. with shuttle + Joyce Foster 801-278-6727 Friday Dec. 6 Hike and Wassail Barbara West 801-484-2996 For further information please call the Ohlsens at 801-364-2159 On hike day Cell

383

42 CFR 412.610 - Assessment schedule.

Code of Federal Regulations, 2010 CFR

...facility hospital stay is counted as day one of the patient assessment schedule. (c) Assessment schedules and references dates...inpatient rehabilitation facility must complete a patient assessment instrument upon the Medicare Part A...

2010-10-01

384

42 CFR 412.610 - Assessment schedule.

Code of Federal Regulations, 2011 CFR

...facility hospital stay is counted as day one of the patient assessment schedule. (c) Assessment schedules and references dates...inpatient rehabilitation facility must complete a patient assessment instrument upon the Medicare Part A...

2011-10-01

385

29 CFR 1952.273 - Developmental schedule.

Code of Federal Regulations, 2010 CFR

...2010-07-01 false Developmental schedule. 1952.273 Section 1952.273...Vermont § 1952.273 Developmental schedule. (a) Introduction and enactment...and Employees by January 1974; (f) Appointment of advisory committees for safety...

2010-07-01

386

29 CFR 1952.273 - Developmental schedule.

Code of Federal Regulations, 2011 CFR

...2011-07-01 false Developmental schedule. 1952.273 Section 1952.273...Vermont § 1952.273 Developmental schedule. (a) Introduction and enactment...and Employees by January 1974; (f) Appointment of advisory committees for safety...

2011-07-01

387

Human aspects of scheduling : a case study

This work presents a look at real-life production-floor scheduling, comparing and contrasting it to both normative OR theory and Cognitive Psychology theory. Relevant literature in OR, scheduling and psychology is reviewed, ...

Boasson, Yishai, 1973-

2006-01-01

388

Immunization Schedules for Preteens and Teens

Immunization Schedules for Preteens and Teens in Easy-to-read Formats The recommended immunization schedules list the age or age range when ...

389

Immunization Schedules for Infants and Children

Immunization Schedules for Infants and Children in Easy-to- ... The recommended immunization schedule is designed to protect infants and children ...

390

Thread Scheduling in Embedded Runtime Systems.

This study explores the design space of thread schedulers on resource-constrained embedded run-time systems. The current state-of-the-art priority-based scheduler relies heavily on developers' deliberate…

Dai, Linge

2014-01-01

391

Optimal scheduling of fighter aircraft maintenance

The effective scheduling of fighter aircraft maintenance in the Air Force is crucial to overall mission accomplishment. An effective maintenance scheduling policy maximizes the use of maintenance resources and aircraft ...

Cho, Philip Y

2011-01-01

392

Scheduling from the perspective of the application

Metacomputing is the aggregation of distributed and high-performance resources on coordinated networks. With careful scheduling, resource-intensive applications can be implemented efficiently on metacomputing systems at the sizes of interest to developers and users. In this paper we focus on the problem of scheduling applications on metacomputing systems. We introduce the concept of application-centric scheduling in which everything about the system is evaluated in terms of its impact on the application. Application-centric scheduling is used by virtually all metacomputer programmers to achieve performance on metacomputing systems. We describe two successful metacomputing applications to illustrate this approach, and describe AppLeS scheduling agents which generalize the application-centric scheduling approach. Finally, we show preliminary results which compare AppLeS-derived schedules with conventional strip and blocked schedules for a two-dimensional Jacobi code.

Berman, F.; Wolski, R. [Univ. of California, San Diego, La Jolla, CA (United States)

1996-12-31

393

75 FR 7411 - Schedule of Water Charges

Federal Register 2010, 2011, 2012, 2013

...BASIN COMMISSION 18 CFR Part 410 Schedule of Water Charges AGENCY: Delaware River Basin Commission...Manual--Part III--Basin Regulations--Water Supply Charges to revise the schedule of water charges. DATES: The Commission will hold...

2010-02-19

394

Scheduled Bright Light for Treatment of Insomnia in Older Adults

Objectives To determine if bright light can improve sleep in older individuals with insomnia. Design Single-blind, placebo-controlled, twelve-week, parallel-group randomized design comparing four treatment groups representing a factorial combination of two lighting conditions and two times of light administration. Setting At-home light treatment, eight office therapy sessions. Participants Thirty-six females, fifteen males (63.6 ± 7.1 years) meeting primary insomnia criteria, recruited from the community. Interventions A 12-week program of sleep hygiene and exposure either to bright (~4000 lux) or dim light (~65 lux) scheduled daily in the morning or evening for 45 minutes. Measurements and Results Within-group changes were observed for subjective sleep measures (sleep logs, questionnaires) after morning or evening bright light, but were not significantly different from those observed after exposure to scheduled dim light. Objective sleep changes (actigraphy, polysomnography) after treatment were not significantly different between bright and dim light groups. Scheduled light exposure was able to shift circadian phase predictably, but was unrelated to changes in objective or subjective sleep measures. A polymorphism in CLOCK predicted morningness, but did not moderate the effects of light on sleep. The phase angle between the circadian system (melatonin midpoint) and sleep (darkness) was able to predict the magnitude of phase delays, but not phase advances, engendered by bright light. Conclusion Except for one subjective measure, scheduled morning or evening bright light effects were not different from those of scheduled dim light. Thus, support was not found for bright light treatment of older individuals with primary insomnia. PMID:19187411

Friedman, Leah; Zeitzer, Jamie M.; Kushida, Clete; Zhdanova, Irina; Noda, Art; Lee, Tina; Schneider, Bret; Guilleminault, Christian; Sheikh, Javaid; Yesavage, Jerome A.

2009-01-01

395

ERIC Educational Resources Information Center

The effects of variable-interval (VI) and fixed-ratio (FR) schedules of reinforcement for work-related behavior and an organizer for the work materials (behavioral prosthesis) were evaluated with 3 adults with severe or profound mental retardation. The participants had been recommended for study because of high rates of off-task and aberrant…

Saunders, Richard R.; McEntee, Julie E.; Saunders, Muriel D.

2005-01-01

396

A minimal fuzzy gain scheduling speed controller with torque compensation for brushless DC (BLDC) motors is proposed in this paper. Because the speed measurement of BLDC motors is based on Hall sensors, which are built in originally for commutation purposes, the intervals between speed feedbacks are variable. The system response of this variable-sampling, low-resolution position sensor structure is different

Chung-Wen Hung; Jhih-Han Chen; Hsuan T. Chang

2011-01-01

397

Application of Interval and Fuzzy Analysis in Economic Modeling

In this chapter, a new approach to solving interval and fuzzy equations based on the generalized procedure of interval extension called the "interval extended zero" method is proposed. Central to the proposed approach is the treatment of "interval zero" as an interval centered around 0. It is shown that such a proposition is not of a heuristic nature, but is a direct

Ludmila Dymowa

398

Objective We examined the effects of an advanced sleep/wake schedule and morning short wavelength (blue) light in 25 adults (mean age±SD = 21.8±3 years; 13 women) with late sleep schedules and subclinical features of delayed sleep phase syndrome (DSPD). Methods After a baseline week, participants kept individualized, fixed, advanced 7.5-hour sleep schedules for 6 days. Participants were randomly assigned to groups to receive “blue” (470 nm, ~225 lux, n=12) or “dim” (< 1 lux, n=13) light for one hour after waking each day. Head-worn “Daysimeters” measured light exposure; actigraphs and sleep diaries confirmed schedule compliance. Salivary dim light melatonin onset (DLMO), self-reported sleep, and mood were examined with 2×2 ANOVA. Results After 6 days, both groups showed significant circadian phase advances, but morning blue-light was not associated with larger phase shifts than dim-light exposure. The average DLMO advances (mean±SD) were 1.5±1.1 hours in the dim light group and 1.4±0.7 hours in the blue light group. Conclusions Adherence to a fixed advanced sleep/wake schedule resulted in significant circadian phase shifts in young adults with subclinical DSPD with or without morning blue light exposure. Light/dark exposures associated with fixed early sleep schedules are sufficient to advance circadian phase in young adults. PMID:21704557

Sharkey, Katherine M.; Carskadon, Mary A.; Figueiro, Mariana G.; Zhu, Yong; Rea, Mark S.

2011-01-01

399

Real-time scheduling using minimum search

NASA Technical Reports Server (NTRS)

In this paper we consider a simple model of real-time scheduling. We present a real-time scheduling system called RTS which is based on Korf's Minimin algorithm. Experimental results show that the schedule quality initially improves with the amount of look-ahead search and then tapers off quickly. So it appears that reasonably good schedules can be produced with a relatively shallow search.

Tadepalli, Prasad; Joshi, Varad

1992-01-01

400

NSDL National Science Digital Library

The RandomWalk1D program simulates a random walk in one dimension for steps of unit length and equal time intervals. The default number of steps is N = 16 and the probability of going right or left at any step is the same (the probability p of going to the right for a single step is 0.5). RandomWalk1D is part of a suite of Open Source Physics programs that model aspects of Statistical and Thermal Physics (STP). The program is distributed as a ready-to-run (compiled) Java archive. Double clicking the stp RandomWalk1D.jar file will run the program if Java is installed on your computer. Additional programs can be found by searching ComPADRE for Open Source Physics, STP, or Statistical and Thermal Physics.
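The simulation described above is straightforward to reproduce in outline. This Monte Carlo sketch uses the stated defaults (N = 16 unit steps, p = 0.5) but is an independent illustration, not the STP Java program itself:

```python
import random

def random_walk_1d(n_steps=16, p_right=0.5, trials=10000, seed=1):
    """Monte Carlo estimate of a 1D random walk with unit steps and
    equal time intervals; p_right is the probability of stepping right.
    Returns the sample mean displacement and mean squared displacement."""
    rng = random.Random(seed)
    positions = []
    for _ in range(trials):
        x = 0
        for _ in range(n_steps):
            x += 1 if rng.random() < p_right else -1
        positions.append(x)
    mean = sum(positions) / trials
    msd = sum(x * x for x in positions) / trials
    return mean, msd

mean, msd = random_walk_1d()
# For p = 0.5 the theory gives <x> = 0 and <x^2> = N = 16;
# the sample estimates should land close to these for 10,000 trials.
```

Varying `p_right` away from 0.5 shifts the mean displacement to N(2p - 1), which is the biased-walk case the program also supports.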

Gould, Harvey; Tobochnik, Jan; Christian, Wolfgang; Cox, Anne

2008-10-10

401

Affectionate Writing Reduces Total Cholesterol: Two Randomized, Controlled Trials

ERIC Educational Resources Information Center

In two 5-week trials, healthy college students were randomly assigned either to experimental or control groups. Participants in the experimental groups wrote about their affection for significant friends, relatives, and/or romantic partners for 20 minutes on three separate occasions; on the same schedule, those in the control groups wrote about…

Floyd, Kory; Mikkelson, Alan C.; Hesse, Colin; Pauley, Perry M.

2007-01-01

402

process of scheduled arrivals. Appointment schedules that balance the waiting time at the facility and... Keywords: Scheduling; Queuing Theory; Stochastic Methods. 1 Introduction. Developing appointment schedules for service... a possibly highly variable demand and, as a consequence, low utilization and high waiting time (the time

Boucherie, Richard J.

403

1 of 2 Scheduling-Guidelines.docx Statler College Scheduling Guidelines

Statler College Scheduling Guidelines effective August 27, 2013 with the above information o Online at http://eventscheduling.wvu.edu and use MyID login o Send email to statler-scheduling@mail.wvu.edu o Visit a department scheduler · Statler College location FAQs http://www.statler

Mohaghegh, Shahab

404

Improving First-Come-First-Serve Job Scheduling by Gang Scheduling ?

Improving First-Come-First-Serve Job Scheduling by Gang Scheduling. Uwe Schwiegelshohn and Ramin... and emphasizes the notion of fairness. Severe fragmentation is prevented by using gang scheduling, which is only... of gang scheduling to be used in the management system of the IBM RS/6000 SP2, a widely available

Feitelson, Dror

405

An Integrated Approach to Parallel Scheduling Using Gang-Scheduling, Backfilling and Migration

An Integrated Approach to Parallel Scheduling Using Gang-Scheduling, Backfilling and Migration... The first technique we analyze is backfilling, the second is gang-scheduling, and the third is migration. The main... that, under certain conditions, a strategy that combines backfilling, gang-scheduling, and migration

Feitelson, Dror

406

Schedule Worksheet -Table of Contents Subject Description

Schedule Worksheet - Table of Contents Subject Description AAE AAE-Aero & Astro Engineering AGEC-Mechanical Engineering MGMT MGMT-Management MSE MSE-Materials Engineering THTR-Theatre VCS VCS-Veterinary Clinical Sci VM VM-Veterinary Medicine

Pittendrigh, Barry

408

Distributed link scheduling with constant overhead

This paper proposes a new class of simple, distributed algorithms for scheduling in wireless networks. The algorithms generate new schedules in a distributed manner via simple local changes to existing schedules. The class is parameterized by integers k ≥ 1. We show that algorithm k of our class achieves k/(k+2) of the capacity region, for every k
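The guarantee quoted in the abstract is easy to tabulate; this trivial sketch shows how the achievable fraction k/(k+2) of the capacity region grows with the parameter k:

```python
def capacity_fraction(k: int) -> float:
    """Fraction of the capacity region achieved by algorithm k,
    per the k/(k+2) guarantee stated in the abstract."""
    return k / (k + 2)

for k in (1, 2, 3, 10):
    print(k, capacity_fraction(k))
# k = 1 gives 1/3; larger k trades more local rescheduling work
# for a larger guaranteed fraction, approaching 1 as k grows.
```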

Sujay Sanghavi; Loc Bui; R. Srikant

2007-01-01

409

Real-time scheduling of freight railroads

This article presents a model for the optimization of freight trains schedules that is intended to be used as part of a real-time control system. The goal of this model is to provide a link between strategic schedules, which might be decided every month, and line dispatching or CAD models which need to know the scheduled arrival and departure times

Patrick T. Harker

1995-01-01

410

Gain scheduling using the Youla parameterization

Gain scheduling controllers are considered in this paper. The gain scheduling problem where the scheduling parameter vector θ cannot be measured directly, but needs to be estimated, is considered. An estimate of θ is derived by using the Youla parameterization. The use of the Youla parameterization in connection with the estimation of θ gives a direct validation method for

Henrik Niemann; J. Stroustrup

1999-01-01

411

Scheduling Multiprocessor Tasks with Genetic Algorithms

In the multiprocessor scheduling problem, a given program is to be scheduled in a given multiprocessor system such that the program's execution time is minimized. This problem being very hard to solve exactly, many heuristic methods for finding a suboptimal schedule exist. We propose a new combined approach, where a genetic algorithm is improved with the introduction of some knowledge

Ricardo C. Corrêa; Afonso Ferreira; Pascal Rebreyend

1999-01-01

412

Early Departures Schedules The Rice Guide

Early Departures Schedules The Rice Guide to Houston Airports on METRO Buses http://www.ridemetro.org METRO info: 713 635 4000 METRO Police: 713 224 COPS (2677) Schedules for all METRO services are available at http://www.ridemetro.org/. Paper schedules are available at the METRO Ridestore next

413

CSC 1600: Chapter 5 CPU Scheduling

CSC 1600: Chapter 5, CPU Scheduling. Review of Process States. OS Queuing Model: CPU, ready queue. When does a scheduler take decisions? When a process: 1. Switches from running to waiting state. 2. Switches from running to ready state. 3. Switches from waiting to ready. 4. Terminates. Scheduling under 1

414

Improved Utilization and Responsiveness with Gang Scheduling

Improved Utilization and Responsiveness with Gang Scheduling. Dror G. Feitelson and Morris A. Jette. ...is solved by using gang scheduling, because then the impact of each decision is limited to its time slice... overall system utilization and responsiveness. Empirical evidence from using gang scheduling on a Cray T

Feitelson, Dror

415

Scheduling Aircraft Landings The Dynamic Case

Scheduling Aircraft Landings: The Dynamic Case. Master Thesis, April 2007. Supervisor: Jens Clausen. Abstract: This Master Thesis is about solving the aircraft landing problem dynamically. Given an original landing schedule for the incoming aircraft, this schedule is rescheduled whenever an aircraft

416

A visual calendar for scheduling group meetings

Scheduling group meetings requires access to participants' calendars, typically located in scattered pockets or desks. Placing participants' calendars on-line and using a rule-based scheduler to find a time slot would alleviate the problem to some extent, but it often is difficult to trust the results, because correct scheduling rules are elusive, varying with the participants and the agenda of a

David Beard; Murugappan Palaniappan; Alan Humm; David Banks; Anil Nair; Yen-Ping Shan

1990-01-01

417

Learning User Preferences in Distributed Calendar Scheduling

Within the field of software agents, there has been increasing interest in automating the process of calendar scheduling in recent years. Calendar (or meeting) scheduling is an example of a timetabling domain that is most naturally formulated and solved as a continuous, distributed problem. Fundamentally, it involves reconciliation of a given user's scheduling preferences with those of others that

Oh Jean; Stephen F. Smith

2004-01-01

418

14 CFR 406.139 - Joint procedural or discovery schedule.

Code of Federal Regulations, 2010 CFR

...2010-01-01 2010-01-01 false Joint procedural or discovery schedule. 406...Transportation Adjudications § 406.139 Joint procedural or discovery schedule. ...schedule. If the parties agree to a joint procedural or discovery schedule,...

2010-01-01

420

14 CFR 406.139 - Joint procedural or discovery schedule.

Code of Federal Regulations, 2011 CFR

...2011-01-01 2011-01-01 false Joint procedural or discovery schedule. 406...Transportation Adjudications § 406.139 Joint procedural or discovery schedule. ...schedule. If the parties agree to a joint procedural or discovery schedule,...

2011-01-01

421

14 CFR 13.217 - Joint procedural or discovery schedule.

Code of Federal Regulations, 2011 CFR

...2011-01-01 2011-01-01 false Joint procedural or discovery schedule. 13...FAA Civil Penalty Actions § 13.217 Joint procedural or discovery schedule. ...schedule. If the parties agree to a joint procedural or discovery schedule,...

2011-01-01

422

14 CFR 406.139 - Joint procedural or discovery schedule.

Code of Federal Regulations, 2013 CFR

...2013-01-01 2013-01-01 false Joint procedural or discovery schedule. 406...Transportation Adjudications § 406.139 Joint procedural or discovery schedule. ...schedule. If the parties agree to a joint procedural or discovery schedule,...

2013-01-01

423

49 CFR 1503.627 - Joint procedural or discovery schedule.

Code of Federal Regulations, 2010 CFR

...2010-10-01 2010-10-01 false Joint procedural or discovery schedule. 1503...Civil Penalty Actions § 1503.627 Joint procedural or discovery schedule. ...schedule. If the parties agree to a joint procedural or discovery schedule,...

2010-10-01

424

14 CFR 13.217 - Joint procedural or discovery schedule.

Code of Federal Regulations, 2013 CFR

...2013-01-01 2013-01-01 false Joint procedural or discovery schedule. 13...FAA Civil Penalty Actions § 13.217 Joint procedural or discovery schedule. ...schedule. If the parties agree to a joint procedural or discovery schedule,...

2013-01-01

425

14 CFR 13.217 - Joint procedural or discovery schedule.

Code of Federal Regulations, 2010 CFR

...2010-01-01 2010-01-01 false Joint procedural or discovery schedule. 13...FAA Civil Penalty Actions § 13.217 Joint procedural or discovery schedule. ...schedule. If the parties agree to a joint procedural or discovery schedule,...

2010-01-01

426

49 CFR 1503.627 - Joint procedural or discovery schedule.

Code of Federal Regulations, 2011 CFR

...2011-10-01 2011-10-01 false Joint procedural or discovery schedule. 1503...Civil Penalty Actions § 1503.627 Joint procedural or discovery schedule. ...schedule. If the parties agree to a joint procedural or discovery schedule,...

2011-10-01

427

49 CFR 1503.627 - Joint procedural or discovery schedule.

Code of Federal Regulations, 2013 CFR

...2013-10-01 2013-10-01 false Joint procedural or discovery schedule. 1503...Civil Penalty Actions § 1503.627 Joint procedural or discovery schedule. ...schedule. If the parties agree to a joint procedural or discovery schedule,...

2013-10-01

428

14 CFR 406.139 - Joint procedural or discovery schedule.

Code of Federal Regulations, 2012 CFR

...2012-01-01 2012-01-01 false Joint procedural or discovery schedule. 406...Transportation Adjudications § 406.139 Joint procedural or discovery schedule. ...schedule. If the parties agree to a joint procedural or discovery schedule,...

2012-01-01

429

49 CFR 1503.627 - Joint procedural or discovery schedule.

Code of Federal Regulations, 2012 CFR

...2012-10-01 2012-10-01 false Joint procedural or discovery schedule. 1503...Civil Penalty Actions § 1503.627 Joint procedural or discovery schedule. ...schedule. If the parties agree to a joint procedural or discovery schedule,...

2012-10-01

432

14 CFR 13.217 - Joint procedural or discovery schedule.

Code of Federal Regulations, 2012 CFR

...2012-01-01 2012-01-01 false Joint procedural or discovery schedule. 13...FAA Civil Penalty Actions § 13.217 Joint procedural or discovery schedule. ...schedule. If the parties agree to a joint procedural or discovery schedule,...

2012-01-01

433

5 CFR 6.2 - Schedules of excepted positions.

Code of Federal Regulations, 2011 CFR

...be listed in Schedule A. Schedule B. Positions other than...examination shall be listed in Schedule B. Appointments to these positions shall be...may be prescribed by OPM. Schedule C. Positions of a...

2011-01-01

434

5 CFR 6.2 - Schedules of excepted positions.

Code of Federal Regulations, 2010 CFR

...be listed in Schedule A. Schedule B. Positions other than...examination shall be listed in Schedule B. Appointments to these positions shall be...may be prescribed by OPM. Schedule C. Positions of a...

2010-01-01

435

Two Efficient Twin ELM Methods With Prediction Interval.

In the operational optimization and scheduling problems of actual industrial processes, such as iron and steel or microelectronics, operational indices and process parameters usually need to be predicted. However, some input and output variables of these prediction models carry considerable uncertainty, arising from the variables themselves, measurement error, rough representation, and so on. In such cases, it is necessary to construct a prediction interval (PI) for the output of the corresponding prediction model. In this paper, two twin extreme learning machine (TELM) models for constructing PIs are proposed. First, we propose a regularized asymmetric least squares extreme learning machine (RALS-ELM) method, in which the squared-error loss is weighted differently according to whether the model's output error is positive or negative, so that the two error directions are distinguished during parameter learning, and Tikhonov regularization is introduced to reduce overfitting. Then, we propose an asymmetric Bayesian extreme learning machine (AB-ELM) method based on a Bayesian framework with an asymmetric Gaussian distribution, in which the likelihood weights are determined as in RALS-ELM, and a type II maximum likelihood algorithm is derived to learn the parameters of AB-ELM. Based on RALS-ELM and AB-ELM, we use a pair of weights in a reciprocal relationship to obtain two nonparallel regressors, a lower-bound regressor and an upper-bound regressor, which can be used for calculating the PIs. Finally, some discussion is given of how to adjust the weights adaptively to meet a desired PI, how to use the proposed TELMs for nonlinear quantile regression, and related issues.
Results of numerical comparison on data from one synthetic regression problem, three University of California Irvine benchmark regression problems, and two actual industrial regression problems show the effectiveness of the proposed models. PMID:25423657

Ning, Kefeng; Liu, Min; Dong, Mingyu; Wu, Cheng; Wu, ZhanSong

2014-11-20
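The asymmetric-weighting idea behind RALS-ELM can be sketched with plain linear features instead of ELM hidden-layer outputs (an assumption made here for brevity): weighting squared errors by w for positive residuals and 1 - w for negative ones, solved by iteratively reweighted least squares, yields expectile-style lower and upper regressors whose gap acts as a prediction interval.

```python
import numpy as np

def asymmetric_ls(x, y, w, n_iter=50):
    """Linear fit minimizing asymmetric squared error: residual r is
    weighted w if r > 0, else (1 - w). Solved by iteratively
    reweighted least squares (expectile regression)."""
    Xb = np.column_stack([np.ones(len(x)), x])
    beta = np.linalg.lstsq(Xb, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - Xb @ beta
        weights = np.where(r > 0, w, 1.0 - w)
        beta = np.linalg.solve(Xb.T @ (weights[:, None] * Xb),
                               Xb.T @ (weights * y))
    return beta

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 2 * x + rng.normal(0, 1, 200)
lower = asymmetric_ls(x, y, w=0.05)  # positive residuals barely penalized -> lower band
upper = asymmetric_ls(x, y, w=0.95)  # positive residuals heavily penalized -> upper band
```

Pairing w with 1 - w mirrors the reciprocal weight relationship the paper uses to obtain its two nonparallel regressors.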

436

Background Aerobic interval exercise training has greater benefits on cardiovascular function as compared with aerobic continuous exercise training. Objective The present study aimed at analyzing the effects of both exercise modalities on acute and subacute hemodynamic responses of healthy rats. Methods Thirty male rats were randomly assigned into three groups as follows: continuous exercise (CE, n = 10); interval exercise (IE, n = 10); and control (C, n = 10). Both IE and CE groups performed a 30-minute exercise session. The IE group session consisted of three successive 4-minute periods at 60% of maximal velocity (Max Vel), with 4-minute recovery intervals at 40% of Max Vel. The CE group ran continuously at 50% of Max Vel. Heart rate (HR), blood pressure (BP), and rate pressure product (RPP) were measured before, during and after the exercise session. Results The CE and IE groups showed an increase in systolic BP and RPP during exercise as compared with the baseline values. After the end of exercise, the CE group showed a lower response of systolic BP and RPP as compared with the baseline values, while the IE group showed lower systolic BP and mean BP values. However, only the IE group had a lower response of HR and RPP during recovery. Conclusion In healthy rats, one interval exercise session, as compared with continuous exercise, induced similar hemodynamic responses during exercise. However, during recovery, the interval exercise caused greater reductions in cardiac workload than the continuous exercise. PMID:24270864

Borges, Juliana Pereira; Masson, Gustavo Santos; Tibiriçá, Eduardo; Lessa, Marcos Adriano

2014-01-01

437

Weibull Distribution From Interval Inspection Data

NASA Technical Reports Server (NTRS)

Most likely failure sequence assumed. Memorandum discusses application of Weibull distribution to statistics of failures of turbopump blades. Is generalization of well known exponential random probability distribution and useful in describing component-failure modes including aging effects. Parameters found from experimental data by method of maximum likelihood.

Rheinfurth, Mario H.

1987-01-01
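The memorandum's maximum-likelihood step can be illustrated for complete (uncensored) failure times, a simplifying assumption made here: the Weibull shape k solves a one-dimensional profile equation that is monotone in k, so bisection suffices, and the scale then follows in closed form.

```python
import numpy as np

def weibull_mle(t, k_lo=0.1, k_hi=20.0, tol=1e-8):
    """Maximum-likelihood estimates (shape k, scale lam) for complete
    Weibull failure times, via bisection on the monotone profile equation
    g(k) = sum(t^k ln t)/sum(t^k) - 1/k - mean(ln t) = 0."""
    t = np.asarray(t, dtype=float)
    logs = np.log(t)
    def g(k):
        tk = t ** k
        return np.sum(tk * logs) / np.sum(tk) - 1.0 / k - logs.mean()
    while k_hi - k_lo > tol:
        mid = 0.5 * (k_lo + k_hi)
        if g(mid) > 0:
            k_hi = mid
        else:
            k_lo = mid
    k = 0.5 * (k_lo + k_hi)
    lam = np.mean(t ** k) ** (1.0 / k)  # closed-form scale given the shape
    return k, lam

rng = np.random.default_rng(1)
k_hat, lam_hat = weibull_mle(1.5 * rng.weibull(2.0, 5000))
# estimates should recover shape ~2.0 and scale ~1.5
```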

438

ERIC Educational Resources Information Center

Methods of sample size planning are developed from the accuracy in parameter approach in the multiple regression context in order to obtain a sufficiently narrow confidence interval for the population squared multiple correlation coefficient when regressors are random. Approximate and exact methods are developed that provide necessary sample size…

Kelley, Ken

2008-01-01

439

Adaptive Parallel Job Scheduling with Flexible CoScheduling

Abstract—Many scientific and high-performance computing applications consist of multiple processes running on different processors that communicate frequently. Because of their synchronization needs, these applications can suffer severe performance penalties if their processes are not all coscheduled to run together. Two common approaches to coscheduling jobs are batch scheduling, wherein nodes are dedicated for the duration of the run, and gang scheduling, wherein time slicing is coordinated across processors. Both work well when jobs are load-balanced and make use of the entire parallel machine. However, these conditions are rarely met and most realistic workloads consequently suffer from both internal and external fragmentation, in which resources and processors are left idle because jobs cannot be packed with perfect efficiency. This situation leads to reduced utilization and suboptimal performance. Flexible CoScheduling (FCS) addresses this problem by monitoring each job’s computation granularity and communication pattern and scheduling jobs based on their synchronization and load-balancing requirements. In particular, jobs that do not require stringent synchronization are identified, and are not coscheduled; instead, these processes are used to reduce fragmentation. FCS has been fully implemented on top of the STORM resource manager on a 256-processor Alpha cluster and compared to batch, gang, and implicit coscheduling algorithms. This paper describes in detail the implementation of FCS and its performance evaluation with a variety of workloads, including large-scale benchmarks, scientific applications, and dynamic workloads. The experimental results show that FCS saturates at higher loads than other algorithms (up to 54 percent higher in some cases), and displays lower response times and slowdown than the other algorithms in nearly all scenarios.

Frachtenberg, Eitan; Feitelson, Dror; Petrini, Fabrizio; Fernandez, Juan

2005-11-01

440

This paper considers the design and the practical implementation of a stable multiple objective real-time scheduling problem for a complex production system. In this paper, a complex production system is viewed as a kind of systems producing a variety of products (multiple-part-type) under constraints and multiple production objectives often conflicting. Previously, fuzzy control theory and fuzzy intervals arithmetic have been

Karim Tamani; Reda Boukezzoula; Georges Habchi

2011-01-01

441

This article describes a factor analytically derived, self-report instrument—the Schedule of Compulsions, Obsessions, and Pathological Impulses (SCOPI)—using data from college students, adults, psychiatric outpatients, and patients with obsessive-compulsive disorder (OCD). The five SCOPI scales all are internally consistent (with coefficient alphas of .80 and higher) and are strongly stable across a 2-month interval (with retest correlations ranging from .79 to

David Watson; Kevin D. Wu

2005-01-01

442

How to Do Random Allocation (Randomization)

Purpose To explain the concept and procedure of random allocation as used in a randomized controlled study. Methods We explain the general concept of random allocation and demonstrate how to perform the procedure easily and how to report it in a paper. PMID:24605197

Shin, Wonshik

2014-01-01
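A common way to implement the allocation procedure described above is permuted-block randomization, which keeps group sizes balanced throughout enrollment. A minimal sketch (group labels, block size, and seed are illustrative):

```python
import random

def block_randomize(n_subjects, groups=("A", "B"), block_size=4, seed=42):
    """Permuted-block randomization: each block holds the groups in
    equal numbers and is shuffled, so the allocation stays balanced
    at every block boundary."""
    assert block_size % len(groups) == 0
    rng = random.Random(seed)  # fixed seed only for reproducibility of the sketch
    per_group = block_size // len(groups)
    allocation = []
    while len(allocation) < n_subjects:
        block = [g for g in groups for _ in range(per_group)]
        rng.shuffle(block)
        allocation.extend(block)
    return allocation[:n_subjects]

alloc = block_randomize(20)  # 20 subjects -> 10 per group, balanced every 4
```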

443

A planning language for activity scheduling

NASA Technical Reports Server (NTRS)

Mission planning and scheduling of spacecraft operations are becoming more complex at NASA. Described here are a mission planning process; a robust, flexible planning language for spacecraft and payload operations; and a software scheduling system that generates schedules based on planning language inputs. The mission planning process often involves many people and organizations. Consequently, a planning language is needed to facilitate communication, to provide a standard interface, and to represent flexible requirements. The software scheduling system interprets the planning language and uses the resource, time duration, constraint, and alternative plan flexibilities to resolve scheduling conflicts.

Zoch, David R.; Lavallee, David; Weinstein, Stuart; Tong, G. Michael

1991-01-01

444

Stride Scheduling: Deterministic Proportional-Share Resource Management

This paper presents stride scheduling, a deterministic scheduling technique that efficiently supports the same flexible resource management abstractions introduced by lottery scheduling. Compared to lottery scheduling, stride scheduling achieves significantly improved accuracy over relative throughput rates, with significantly lower response time variability. Stride scheduling implements proportional-share control over processor time and other resources by cross-applying elements

Carl A. Waldspurger; William E. Weihl

1995-01-01
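The core mechanism can be sketched in a few lines (a simplification of Waldspurger and Weihl's design, omitting dynamic ticket changes and client join/leave): each client's pass value advances by a stride inversely proportional to its tickets, and the client with the lowest pass runs next.

```python
import heapq
from collections import Counter

STRIDE1 = 1 << 20  # large constant; a client's stride = STRIDE1 // tickets

def stride_schedule(tickets, n_quanta):
    """Minimal stride scheduler: pop the client with the smallest 'pass',
    run it for one quantum, advance its pass by its stride, repeat.
    CPU time converges deterministically to the ticket ratios."""
    heap = [(0, name, STRIDE1 // t) for name, t in sorted(tickets.items())]
    heapq.heapify(heap)
    counts = Counter()
    for _ in range(n_quanta):
        pass_val, name, stride = heapq.heappop(heap)
        counts[name] += 1
        heapq.heappush(heap, (pass_val + stride, name, stride))
    return counts

counts = stride_schedule({"A": 3, "B": 2, "C": 1}, 600)
# allocation tracks the 3:2:1 ticket ratio closely, roughly 300/200/100
```

Unlike lottery scheduling's probabilistic draws, the same ticket assignment always produces the same deterministic sequence, which is the source of the lower throughput variability the abstract reports.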

445

Backtracking-Based Instruction Scheduling to Fill Branch Delay Slots

Conventional schedulers schedule operations in dependence order and never revisit or undo a scheduling decision on any operation. In contrast, backtracking schedulers may unschedule operations and can often generate better schedules. This paper develops and evaluates the backtracking approach to fill branch delay slots. We first present the structure of a generic backtracking scheduling algorithm and prove that it terminates.

Ivan D. Baev; Waleed M. Meleis; Santosh G. Abraham

2002-01-01

446

Spike: AI scheduling for NASA's Hubble Space Telescope

The Spike scheduling system, developed for scheduling astronomical observations for NASA's Hubble Space Telescope (HST), is described. Spike is an activity-based AI scheduler which incorporates innovative approaches to constraint representation and reasoning and scheduling search. Although developed for space telescope scheduling, the Spike system was designed around a general scheduling-constraint framework that can be applied to other domains. Techniques for

Mark D. Johnston

1990-01-01

447

Distributed Scheduling Extension on Hadoop

NASA Astrophysics Data System (ADS)

Distributed computing splits a large-scale job into multiple tasks and processes them on clusters. Cluster resource allocation is the key factor limiting the efficiency of a distributed computing platform. Hadoop is currently the most popular open-source distributed platform. However, the existing scheduling strategies in Hadoop are relatively simple and cannot meet needs such as sharing the cluster among multiple users, ensuring a guaranteed capacity for each job, and providing good performance for interactive jobs. This paper reviews the existing scheduling strategies, analyses their shortcomings, and adds three new features to Hadoop: temporarily raising a job's weight, allowing higher-priority jobs to preempt cluster resources, and sharing computing resources among multiple users. Experiments show these features provide better performance for interactive jobs, as well as a fairer share of computing time among users.

Dadan, Zeng; Xieqin, Wang; Ningkang, Jiang

448

Routine environmental monitoring schedule, calendar year 1995

This document provides Bechtel Hanford, Inc. (BHI) and Westinghouse Hanford Company (WHC) a schedule of monitoring and sampling routines for the Operational Environmental Monitoring (OEM) program during calendar year (CY) 1995. Every attempt will be made to consistently follow this schedule; any deviation from this schedule will be documented by an internal memorandum (DSI) explaining the reason for the deviation. The DSI will be issued by the scheduled performing organization and directed to Near-Field Monitoring. The survey frequencies for particular sites are determined by the technical judgment of Near-Field Monitoring and may depend on the site history, radiological status, use and general conditions. Additional surveys may be requested at irregular frequencies if conditions warrant. All radioactive waste sites are scheduled to be surveyed at least annually. Any newly discovered waste sites not documented by this schedule will be included in the revised schedule for CY 1995.

Schmidt, J.W.; Markes, B.M.; McKinney, S.M.

1994-12-01

449

Environmental surveillance master sampling schedule

Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the U.S. Department of Energy (DOE). This document contains the planned 1994 schedules for routine collection of samples for the Surface Environmental Surveillance Project (SESP), Drinking Water Project, and Ground-Water Surveillance Project. Samples are routinely collected for the SESP and analyzed to determine the quality of air, surface water, soil, sediment, wildlife, vegetation, foodstuffs, and farm products at the Hanford Site and surrounding communities. The responsibility for monitoring onsite drinking water falls outside the scope of the SESP. PNL conducts the drinking water monitoring project concurrent with the SESP to promote efficiency and consistency, utilize expertise developed over the years, and reduce costs associated with management, procedure development, data management, quality control, and reporting. The ground-water sampling schedule identifies ground-water sampling events used by PNL for environmental surveillance of the Hanford Site. Sampling is indicated as annual, semi-annual, quarterly, or monthly in the sampling schedule. Some samples are collected and analyzed as part of ground-water monitoring and characterization programs at Hanford (e.g. Resource Conservation and Recovery Act (RCRA), Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), or Operational). The number of samples planned by other programs is identified in the sampling schedule by a number in the analysis column and a project designation in the Cosample column. Well sampling events may be merged to avoid redundancy in cases where sampling is planned by both environmental surveillance and another program.

Bisping, L.E.

1995-02-01

450

Hourly Wind Speed Interval Prediction in Arid Regions

NASA Astrophysics Data System (ADS)

The long and extended warm and dry summers, the low rate of rain and humidity are the main factors that explain the increase of electricity consumption in hot arid regions. In such regions, the ventilating and air-conditioning installations, typically the most energy-intensive among energy consumption activities, are essential for securing healthy, safe and suitable indoor thermal conditions for building occupants and stored materials. The use of renewable energy resources such as solar and wind represents one of the most relevant solutions to the challenge of increasing electricity demand. In recent years, wind energy has been gaining importance among researchers worldwide. Wind energy is intermittent in nature, and hence power system scheduling and dynamic control of wind turbines require an estimate of wind energy. Accurate forecasting of wind speed is a challenging task in wind energy research. In fact, due to the large variability of wind speed caused by the unpredictable and dynamic nature of the earth's atmosphere, there are many fluctuations in wind power production. This inherent variability of wind speed is the main cause of the uncertainty observed in wind power generation. Furthermore, wind power forecasts might be obtained indirectly by modeling the wind speed series and then transforming the forecasts through a power curve. Wind speed forecasting techniques have received substantial attention recently and several models have been developed. Basically, two main approaches have been proposed in the literature: (1) physical models such as Numerical Weather Forecast and (2) statistical models such as Autoregressive integrated moving average (ARIMA) models and Neural Networks. While the initial focus in the literature has been on point forecasts, the need to quantify forecast uncertainty and communicate the risk of extreme ramp events has led to an interest in producing probabilistic forecasts.
In a short-term context, probabilistic forecasts might be more relevant than point forecasts for a planner building scenarios. In this paper, we are interested in estimating predictive intervals of the hourly wind speed measures in a few cities in the United Arab Emirates (UAE). More precisely, given a wind speed time series, our target is to forecast the wind speed at any specific hour during the day and, in addition, to provide an interval with a given coverage probability p. To construct such an interval we need to estimate the lower band (resp. upper band), which corresponds to the (1-p)/2-th (resp. (1+p)/2-th) conditional quantile. In this paper, a kernel-smoothed estimator of the conditional quantiles is introduced. The proposed non-parametric approach has many advantages: it is flexible because it does not require a specification of the model to work with (such as a normal distribution or a linear relation). Here, we use a covariate that is correlated with the wind speed. In practice, many possible choices of the covariate are available; in addition to its historical data, wind speed is highly correlated with temperature, humidity and wind direction. In this paper a comparison, in terms of Mean Absolute Prediction Errors and Interquartile Range, between those choices is provided to show which covariates are more suitable for forecasting wind speed.

Chaouch, M.; Ouarda, T.

2013-12-01
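The kernel-smoothed conditional quantile estimator at the heart of this approach can be sketched as a weighted sample quantile, with Gaussian kernel weights centered on the query point (the bandwidth h, kernel choice, and synthetic data are illustrative assumptions, not the paper's exact estimator):

```python
import numpy as np

def kernel_conditional_quantile(x_obs, y_obs, x0, p, h):
    """Estimate the conditional p-quantile of y given x = x0 by taking
    a weighted quantile of y, with Gaussian kernel weights in x."""
    w = np.exp(-0.5 * ((x_obs - x0) / h) ** 2)
    w /= w.sum()
    order = np.argsort(y_obs)
    cum = np.cumsum(w[order])
    idx = np.searchsorted(cum, p)
    return y_obs[order][min(idx, len(y_obs) - 1)]

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 2000)          # stand-in covariate (e.g. temperature)
y = 2 * x + rng.normal(0, 0.5, 2000)  # stand-in wind speed series
lo = kernel_conditional_quantile(x, y, x0=5.0, p=0.05, h=0.2)
hi = kernel_conditional_quantile(x, y, x0=5.0, p=0.95, h=0.2)
# [lo, hi] is then a 90% predictive interval at x0, since (1-p)/2 = 0.05
# and (1+p)/2 = 0.95 for coverage p = 0.9
```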

451

We investigate the procedure of checking for overlap between confidence intervals or standard error intervals to draw conclusions regarding hypotheses about differences between population parameters. Mathematical expressions and algebraic manipulations are given, and computer simulations are performed to assess the usefulness of confidence and standard error intervals in this manner. We make recommendations for their use in situations in which standard tests of hypotheses do not exist. An example is given that tests this methodology for comparing effective dose levels in independent probit regressions, an application that is also pertinent to derivations of LC50s for insect pathogens and of detectability half-lives for prey proteins or DNA sequences in predator gut analysis. PMID:15841249

Payton, Mark E.; Greenstone, Matthew H.; Schenker, Nathaniel

2003-01-01
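The conservatism the authors analyze is easy to demonstrate: two estimates whose 95% confidence intervals overlap can still differ significantly under the standard two-sample z test. A small sketch with made-up numbers:

```python
import math

def ci(mean, se, z=1.96):
    """Two-sided 95% confidence interval."""
    return (mean - z * se, mean + z * se)

def intervals_overlap(a, b):
    return a[0] <= b[1] and b[0] <= a[1]

m1, se1 = 0.0, 1.0
m2, se2 = 3.5, 1.0
z_stat = abs(m1 - m2) / math.sqrt(se1**2 + se2**2)  # ~2.47, beyond 1.96
overlap = intervals_overlap(ci(m1, se1), ci(m2, se2))  # True: the CIs still overlap
```

The overlap check effectively demands a gap of about 1.96(se1 + se2), while the z test needs only 1.96 sqrt(se1^2 + se2^2); since the former is always at least the latter, checking overlap is conservative.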

452

Small-Sample Confidence Intervals For Impulse Response Functions

Bias-corrected bootstrap confidence intervals explicitly account for the bias and skewness of the small-sample distribution of the impulse response estimator, while retaining asymptotic validity in stationary autoregressions. Monte Carlo simulations for a wide range of bivariate models show that in small samples bias-corrected bootstrap intervals tend to be more accurate than delta method intervals, standard bootstrap intervals, and Monte Carlo

Lutz Kilian

1998-01-01

453

Maternal methadone therapy increases QTc interval in newborn infants

Introduction: Prolongation of the QT interval is a risk factor for sudden death. Methadone treatment is a well-recognised cause of QT interval lengthening in adults. The effect of maternal methadone treatment on the QT interval of the newborn infant is not known. This is the first prospective study of corrected QT (QTc) interval in infants born to mothers receiving methadone. Aim: To compare

R Parikh; T Hussain; G Holder; A Bhoyar; AK Ewer

2011-01-01
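For context, the correction referred to by "QTc" is conventionally Bazett's formula, QTc = QT/sqrt(RR) with RR in seconds; the abstract does not state which correction this study used, so the formula and the infant values below are illustrative assumptions.

```python
import math

def qtc_bazett(qt_ms, rr_s):
    """Bazett-corrected QT interval: QTc = QT / sqrt(RR), RR in seconds."""
    return qt_ms / math.sqrt(rr_s)

# Illustrative values: heart rate 140 bpm -> RR = 60/140 s; measured QT 280 ms
qtc = qtc_bazett(280, 60 / 140)  # ~428 ms
```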

454

Multifactorial QT Interval Prolongation and Takotsubo Cardiomyopathy

A 71-year-old woman collapsed while working as a grocery store cashier. CPR was performed and an AED revealed torsades de pointes (TdP). She was subsequently defibrillated resulting in restoration of sinus rhythm with a QTc interval of 544 msec. Further evaluation revealed a diagnosis of Takotsubo Cardiomyopathy (TCM) contributing to the development of a multifactorial acquired long QT syndrome (LQTS). The case highlights the role of TCM as a cause of LQTS in the setting of multiple risk factors including old age, female gender, hypokalemia, and treatment with QT prolonging medications. It also highlights the multifactorial nature of acquired LQTS and lends support to growing evidence of an association with TCM. PMID:24822130

Hancox, Jules C.; Baranchuk, Adrian

2014-01-01

455

Anthracyclines and taxanes are very effective drugs in the treatment of advanced breast cancer. With G-CSF support, the dose-intensity of this combination can be increased by reducing the interval between chemotherapy cycles, the so-called 'shortening of cycle time'. We treated 36 patients with advanced breast cancer in a multicentre phase I/II study. The treatment regimen consisted of epirubicin 75 mg m⁻² followed by paclitaxel 135 mg m⁻² (3 h) in combination with G-CSF. At least six patients were treated in each cohort and were evaluated over the first three cycles. Starting at an interval of 14 days, in subsequent cohorts of patients the interval could be shortened to 10 days. An 8-day interval was not feasible due mainly to incomplete neutrophil recovery at the day of the next scheduled cycle. In the 10-day interval cohort it was feasible to increase the paclitaxel dose to 175 mg m⁻². The haematological and non-haematological toxicity was relatively mild. No cumulative myelosuppression was observed over at least three consecutive cycles. In combination with G-CSF, epirubicin 75 mg m⁻² and paclitaxel 175 mg m⁻² could be safely administered every 10 days over at least three cycles, enabling a dose intensity of 52 and 122 mg m⁻² per week, respectively. © 2000 Cancer Research Campaign PMID:10864197

Lalisang, R I; Voest, E E; Wils, J A; Nortier, J W; Erdkamp, F L; Hillen, H F; Wals, J; Schouten, H C; Blijham, G H

2000-01-01

456

ERIC Educational Resources Information Center

The use of confidence intervals (CIs) as an addition or as an alternative to null hypothesis significance testing (NHST) has been promoted as a means to make researchers more aware of the uncertainty that is inherent in statistical inference. Little is known, however, about whether presenting results via CIs affects how readers judge the…

Hoekstra, Rink; Johnson, Addie; Kiers, Henk A. L.

2012-01-01

457

Confidence Intervals as an Alternative to Significance Testing

The article argues for replacing null hypothesis significance testing by confidence intervals. Correctly interpreted, confidence intervals avoid the problems associated with null hypothesis statistical testing. Confidence intervals are formally valid, do not depend on a-priori hypotheses and do not result in trivial knowledge. The first part presents critique of null hypothesis significance testing; the second part replies to

Eduard Brandstätter; Johannes Kepler

1999-01-01

458

Interval Volume Decomposer: A Topological Approach to Volume Traversal

The Interval Volume Decomposer (IVD) is an interface for decomposing an entire volume into interval volumes, each of which characterizes a distinctive

Takahashi, Shigeo

459

Construction of Quasi Interval Wavelet Based on Constrained Variational Principle

A construction method for interval wavelets is proposed based on a constrained variational principle, and a quasi-Shannon interval wavelet is constructed by this method. The linear, conic, quartic and sine functions are approximated using the quasi interval wavelet, and the effect of the width parameter r and the number of external collocation points L on computation precision is discussed,

Qin Ma; Shu-Li Mei; De-Hai Zhu

2009-01-01

460

Comparing Bacterial Genomes by Searching their Common Intervals

Comparing bacterial genomes is often based on a set of conserved genes. Following this assumption, the common interval appears to be a good candidate. As evidence, we propose herein an approach to compute the common intervals between two circular

Fertin, Guillaume

461

Calculating Confidence Intervals for Effect Sizes Using Noncentral Distributions.

ERIC Educational Resources Information Center

This paper provides a brief review of the concepts of confidence intervals, effect sizes, and central and noncentral distributions. The use of confidence intervals around effect sizes is discussed. A demonstration of the Exploratory Software for Confidence Intervals (G. Cuming and S. Finch, 2001; ESCI) is given to illustrate effect size confidence…

Norris, Deborah

462

Weibull Prediction Intervals for a Future Number of Failures

Prediction intervals are developed for the number of failures that will be observed in a future inspection of a sample; the sample-based prediction intervals perform better than the alternatives. Key Words: Coverage probability, Prediction bounds

464

On central tendency and dispersion measures for intervals and hypercubes

The uncertainty or the variability of the data may be treated by considering, rather than a single value for each data, the interval of values in which it may fall. This paper studies the derivation of basic description statistics for interval-valued datasets. We propose a geometrical approach in the determination of summary statistics (central tendency and dispersion measures) for interval-valued

Marie Chavent; Jérôme Saracco

2008-01-01

465

Exact interval propagation for the efficient solution of planar linkages

This paper presents an interval propagation algorithm for variables in single-loop linkages. Given allowed intervals of values for all variables, the algorithm provides, for every variable, the exact interval of values for which the linkage can actually be assembled. We show further how this algorithm can be integrated in a branch-and-bound search scheme, in order to solve

E. Celaya; T. Creemers; L. Ros
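Generic interval arithmetic underlies this kind of propagation; a toy sketch follows (not the paper's single-loop algorithm, which additionally exploits linkage geometry to achieve exactness):

```python
class Interval:
    """Minimal closed-interval arithmetic, illustrating the machinery
    behind interval propagation."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

x = Interval(1, 2)
r = x * (x + Interval(1, 1))  # propagate x in [1,2] through x*(x+1) -> [2, 6]
```

Naive interval arithmetic can overestimate ranges when a variable occurs several times in one expression (the dependency problem); returning the exact assembly intervals, as the paper does, requires handling this.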

466

Introduction. The practice of modern pediatric dentistry requires delivery of quality care in combination with adherence to excellent business as well as time management principles. A definite appointment schedule should be presented to the parents on the first or second appointment. More importantly, the prescribed schedule should be followed to the best of the professional abilities of the pediatric dentist. Aims. The aim of the study was to assess the correlation between appointment scheduling and patient satisfaction in a pediatric dental setup with the objective of understanding the parameters related to appointment scheduling to increase patient satisfaction. Method. A total of 40 patients, who visited the Department of Pediatric and Preventive Dentistry, YMT Dental College & Hospital, for dental treatment were selected on a random basis. A questionnaire with a set of 6 questions with a rating scale of 1–5 to assess the patient satisfaction related to appointment scheduling was prepared. Results. A significant number of the patients were happy with the existing appointment scheduling system barring a few exceptions. PMID:25610464

Katre, Amar N.

2014-01-01

467

Evolution of carfilzomib dose and schedule in patients with multiple myeloma: a historical overview.

Carfilzomib is a proteasome inhibitor that binds selectively and irreversibly to its target. In July 2012, carfilzomib received accelerated approval in the United States for the treatment of relapsed and refractory multiple myeloma. Based on emerging preclinical data and clinical results, the total dose, infusion time, and administration schedule of carfilzomib have evolved during phase I and phase II clinical studies, with the aim of optimizing the risk-benefit profile of the agent. Based on in vitro and in vivo findings and encouraging phase I tolerability data, a consecutive-day, twice-weekly dosing schedule was implemented early in the development program. Other phase II studies have led to further refinements in the dosing schedule of carfilzomib, resulting in the current approved schedule for carfilzomib to be administered intravenously over 2-10 min on 2 consecutive days each week for 3 weeks of a 28-day cycle. Prolonged infusion over 30 min has also been assessed in clinical studies to enable the use of higher carfilzomib doses with the aim of improving drug tolerability and efficacy. These data collectively informed the dosing and scheduling schemas for carfilzomib in ongoing trials, including phase I and II studies of combination regimens, and the randomized phase III trials ASPIRE, FOCUS, ENDEAVOR, and CLARION. Additional studies are underway to examine alternative dosing schedules (e.g., once-weekly dosing [CHAMPION-1]). PMID:24630735

Jakubowiak, Andrzej J

2014-07-01

468

NASA Astrophysics Data System (ADS)

We construct uncertainty intervals for weak Poisson signals in the presence of background. We consider the case where a primary experiment yields a realization of the signal plus background, and a second experiment yields a realization of the background. The data acquisition times, for the background-only experiment, Tbg, and the primary experiment, T, are selected so that their ratio, Tbg/T, varies from 1 to 25. The upper choice of 25 is motivated by an experimental study at the National Institute of Standards and Technology (NIST). The expected number of background counts in the primary experiment varies from 0.2 to 2. We construct 90% and 95% confidence intervals based on a propagation-of-errors method as well as two implementations of a Neyman procedure where acceptance regions are constructed based on a likelihood-ratio criterion that automatically determines whether the resulting confidence interval is one-sided or two-sided. In one of the implementations of the Neyman procedure due to Feldman and Cousins (FC), uncertainty in the expected background contribution is neglected. In the other implementation, we account for random uncertainty in the estimated expected background with a parametric bootstrap implementation of a method due to Conrad. We also construct minimum length Bayesian credibility intervals. For each method, we test for the presence of a signal based on the value of the lower endpoint of the uncertainty interval. In general, the propagation-of-errors method performs the worst compared to the other methods according to frequentist coverage and detection probability criteria, and sometimes produces nonsensical intervals where both endpoints are negative. The Neyman procedures generally yield intervals with better frequentist coverage properties compared to the Bayesian method except for some cases where Tbg/T = 1. In general, the Bayesian method yields intervals with lower detection probabilities compared to Neyman procedures. 
One of the main conclusions is that when Tbg/T is 5 or more and the expected background is 2 or less, the FC method outperforms the other methods considered. For Tbg/T = 1, 2 we observe that the Neyman procedure methods yield false detection probabilities for the case of no signal that are higher than expected given the nominal frequentist coverage of the interval. In contrast, for Tbg/T = 1, 2, the false detection probability of the Bayesian method is less than expected according to the nominal frequentist coverage.
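
The propagation-of-errors method described above can be sketched in a few lines. The function below is an illustrative reconstruction from the abstract, not the authors' code: each Poisson count serves as its own variance estimate, and the background count is rescaled by the acquisition-time ratio Tbg/T.

```python
import math

def poe_interval(n_primary, m_background, tbg_over_t, z=1.645):
    """Propagation-of-errors uncertainty interval for a Poisson signal.

    n_primary    -- counts in the primary run (signal + background)
    m_background -- counts in the background-only run
    tbg_over_t   -- ratio of acquisition times, Tbg / T
    z            -- normal quantile (1.645 gives a 90% interval)
    """
    b_hat = m_background / tbg_over_t        # expected background in primary run
    s_hat = n_primary - b_hat                # point estimate of the signal
    var = n_primary + m_background / tbg_over_t ** 2
    half = z * math.sqrt(var)
    return s_hat - half, s_hat + half
```

For example, with 10 primary counts, 25 background counts, and Tbg/T = 5, the 90% interval is roughly (-0.5, 10.5): the lower endpoint is negative, illustrating the kind of nonsensical interval this method can produce for weak signals.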

Coakley, K. J.; Splett, J. D.; Simons, D. S.

2010-03-01

469

Background. Capecitabine administered for 7 days biweekly with oxaliplatin (XELOX) biweekly has been reported to have activity and safety profiles similar to those of standard capecitabine given for 14 days triweekly. Multiple studies have shown that the addition of bevacizumab to 5-fluorouracil–based chemotherapy is active and well tolerated. Methods. Patients with metastatic colorectal cancer (mCRC) were randomized to XELOX plus bevacizumab using a standard triweekly cycle (Q3W) or a dose-dense biweekly cycle (Q2W) schedule. The primary endpoint was the progression-free survival (PFS) interval. This trial is registered on ClinicalTrials.gov (identifier, NCT00159432). Results. In total, 435 U.S. patients were randomized. The median PFS intervals were 9.6 months in the Q3W group and 9.1 months in the Q2W group. The median overall survival times were 28.4 months and 22.1 months and the median times to treatment failure were 5.5 months and 3.4 months, respectively. Overall, gastrointestinal disorders were the most common (93%) adverse event (AE). Grade 3 or 4 AEs occurred in 75% and 81% of patients in the Q3W and Q2W groups, respectively. Treatment discontinuation as a result of diarrhea (5% versus 10%) and hand–foot syndrome (2% versus 9%) was less common in the Q3W group than in the Q2W group, respectively. Conclusions. Based on these results, the first-line treatment of U.S. patients with mCRC using a biweekly combination of XELOX and bevacizumab at the doses studied cannot be recommended. XELOX Q3W remains the preferred schedule for the management of mCRC. PMID:22622147

Mitchell, Edith P.; Cartwright, Thomas; Kwok, Ambrose; Hu, Sylvia; McKenna, Edward; Patt, Yehuda Z.

2012-01-01

470

With the issuance of this final rule, the Administrator of the Drug Enforcement Administration reschedules hydrocodone combination products from schedule III to schedule II of the Controlled Substances Act. This scheduling action is pursuant to the Controlled Substances Act which requires that such actions be made on the record after opportunity for a hearing through formal rulemaking. This action imposes the regulatory controls and administrative, civil, and criminal sanctions applicable to schedule II controlled substances on persons who handle (manufacture, distribute, dispense, import, export, engage in research, conduct instructional activities with, conduct chemical analysis with, or possess) or propose to handle hydrocodone combination products. PMID:25167591

2014-08-22

471

Job Scheduling Under the Portable Batch System

NASA Technical Reports Server (NTRS)

The typical batch queuing system schedules jobs for execution by a set of queue controls. The controls determine from which queues jobs may be selected. Within a queue, jobs are ordered first-in, first-run. This limits the set of scheduling policies available to a site. The Portable Batch System removes this limitation by providing an external scheduling module. This separate program has full knowledge of the available queued jobs, running jobs, and system resource usage. Sites are able to implement any policy expressible in one of several procedural languages. Policies may range from "best fit" to "fair share" to purely political. Scheduling decisions can be made over the full set of jobs regardless of queue or order. The scheduling policy can be changed to fit a wide variety of computing environments and scheduling goals. This is demonstrated by the use of PBS on an IBM SP-2 system at NASA Ames.

Henderson, Robert L.; Woodrow, Thomas S. (Technical Monitor)

1995-01-01

472

Dynamic Proportional Share Scheduling in Hadoop

NASA Astrophysics Data System (ADS)

We present the Dynamic Priority (DP) parallel task scheduler for Hadoop. It allows users to control their allocated capacity by adjusting their spending over time. This simple mechanism allows the scheduler to make more efficient decisions about which jobs and users to prioritize and gives users the tool to optimize and customize their allocations to fit the importance and requirements of their jobs. Additionally, it gives users the incentive to scale back their jobs when demand is high, since the cost of running on a slot is then also more expensive. We envision our scheduler to be used by deadline or budget optimizing agents on behalf of users. We describe the design and implementation of the DP scheduler and experimental results. We show that our scheduler enforces service levels more accurately and also scales to more users with distinct service levels than existing schedulers.
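
The abstract does not give the allocation rule, but a plausible minimal sketch of spending-proportional slot allocation looks as follows (the function name and the largest-remainder rounding are assumptions, not details from the paper):

```python
def allocate_slots(spending, total_slots):
    """Split task slots among users in proportion to their current spending.

    Largest-remainder rounding ensures the integer allocations
    sum exactly to total_slots.
    """
    total = sum(spending.values())
    quotas = {u: total_slots * s / total for u, s in spending.items()}
    alloc = {u: int(q) for u, q in quotas.items()}
    leftover = total_slots - sum(alloc.values())
    # hand the remaining slots to the largest fractional remainders
    by_remainder = sorted(quotas, key=lambda u: quotas[u] - alloc[u], reverse=True)
    for u in by_remainder[:leftover]:
        alloc[u] += 1
    return alloc
```

Under this rule a user who doubles their spending doubles their share, which captures the incentive described above: when demand (total spending) is high, each unit of spending buys fewer slots.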

Sandholm, Thomas; Lai, Kevin

473

Multiband signal reconstruction for random equivalent sampling.

Random equivalent sampling (RES) is a sampling approach that can capture high-speed repetitive signals at a sampling rate much lower than the Nyquist rate. However, the uneven random distribution of the time interval between the excitation pulse and the signal degrades signal reconstruction performance. For sparse multiband signal sampling, a compressed sensing (CS) based signal reconstruction algorithm can tease out the band supports with overwhelming probability and reduce the impact of the uneven random distribution in RES. In this paper, a mathematical model of RES behavior is constructed in the frequency domain. Based on the constructed model, the band supports of the signal can be determined. Experimental results demonstrate that, for a signal with an unknown sparse multiband structure, the proposed CS-based signal reconstruction algorithm is feasible and outperforms the traditional RES signal reconstruction method. PMID:25362458

Zhao, Y J; Liu, C J

2014-10-01


475

The Distribution of the Product Explains Normal Theory Mediation Confidence Interval Estimation

The distribution of the product has several useful applications. One of these applications is its use to form confidence intervals for the indirect effect as the product of 2 regression coefficients. The purpose of this article is to investigate how the moments of the distribution of the product explain normal theory mediation confidence interval coverage and imbalance. Values of the critical ratio for each random variable are used to demonstrate how the moments of the distribution of the product change across values of the critical ratio observed in research studies. Results of the simulation study showed that as skewness in absolute value increases, coverage decreases, and as skewness in absolute value and kurtosis increase, imbalance increases. The difference between testing the significance of the indirect effect using the normal theory versus the asymmetric distribution of the product is further illustrated with a real data example. This article is the first study to show the direct link between the distribution of the product and indirect effect confidence intervals, and it clarifies the results of previous simulation studies by showing why normal theory confidence intervals for indirect effects are often less accurate than those obtained from the asymmetric distribution of the product or from resampling methods.
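
The asymmetric distribution-of-product interval discussed above is commonly approximated by Monte Carlo simulation. The sketch below is a generic illustration of that approximation, not the article's tabulated method; it assumes the two coefficient estimates are independent normals:

```python
import random

def product_ci(a, se_a, b, se_b, alpha=0.05, n_draws=200_000, seed=1):
    """Monte Carlo approximation of the distribution-of-product confidence
    interval for an indirect effect a*b, given the two regression
    coefficient estimates and their standard errors."""
    rng = random.Random(seed)
    draws = sorted(rng.gauss(a, se_a) * rng.gauss(b, se_b)
                   for _ in range(n_draws))
    lo = draws[int(alpha / 2 * n_draws)]
    hi = draws[int((1 - alpha / 2) * n_draws) - 1]
    return lo, hi
```

Because the product distribution is skewed, the resulting interval is asymmetric about the point estimate a*b, unlike the symmetric normal-theory interval.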

Kisbu-Sakarya, Yasemin; MacKinnon, David P.; Miočević, Milica

2014-01-01

476

Outcome variables that are semicontinuous with clumping at zero are commonly seen in biomedical research. In addition, the outcome measurement is sometimes subject to interval censoring and a lower detection limit (LDL). This gives rise to interval-censored observations with clumping below the LDL. Level of antibody against influenza virus measured by the hemagglutination inhibition assay is an example. The interval censoring is due to the assay's technical procedure. The clumping below LDL is likely a result of the lack of prior exposure in some individuals such that they either have zero level of antibodies or do not have detectable level of antibodies. Given a pair of such measurements from the same subject at two time points, a binary 'fold-increase' endpoint can be defined according to the ratio of these two measurements, as it often is in vaccine clinical trials. The intervention effect or vaccine immunogenicity can be assessed by comparing the binary endpoint between groups of subjects given different vaccines or placebos. We introduce a two-part random effects model for modeling the paired interval-censored data with clumping below the LDL. Based on the estimated model parameters, we propose to use Monte Carlo approximation for estimation of the 'fold-increase' endpoint and the intervention effect. Bootstrapping is used for variance estimation. The performance of the proposed method is demonstrated by simulation. We analyze antibody data from an influenza vaccine trial for illustration. Copyright © 2014 John Wiley & Sons, Ltd. PMID:25345814

Xu, Ying; Lam, K F; Cowling, Benjamin J; Bun Cheung, Yin

2015-01-30

477

FSILP: fuzzy-stochastic-interval linear programming for supporting municipal solid waste management.

Although many studies on municipal solid waste (MSW) management have been conducted under coexisting fuzzy, stochastic, and interval uncertainties, conventional linear programming solutions that integrate the fuzzy method with the other two have been inefficient. In this study, a fuzzy-stochastic-interval linear programming (FSILP) method is developed by integrating Nguyen's method with conventional linear programming for supporting municipal solid waste management. Nguyen's method was used to convert the fuzzy and fuzzy-stochastic linear programming problems into conventional linear programs, by measuring the attainment values of fuzzy numbers and/or fuzzy random variables, as well as superiority and inferiority between triangular fuzzy numbers/triangular fuzzy-stochastic variables. The developed method can effectively tackle uncertainties described in terms of probability density functions, fuzzy membership functions, and discrete intervals. Moreover, the method improves upon the conventional interval fuzzy programming and two-stage stochastic programming approaches, achieving comparable capabilities with fewer constraints and significantly less computation time. The developed model was applied to a case study of a municipal solid waste management system in a city. The results indicated that reasonable solutions had been generated. The solution can help quantify the relationship between the change of system cost and the uncertainties, which could support further analysis of tradeoffs between the waste management cost and the system failure risk. PMID:21232844

Li, Pu; Chen, Bing

2011-04-01

478

Dosing Schedules for Pneumococcal Conjugate Vaccine

Since second generation pneumococcal conjugate vaccines (PCVs) targeting 10 and 13 serotypes became available in 2010, the number of national policy makers considering these vaccines has steadily increased. An important consideration for a national immunization program is the timing and number of doses—the schedule—that will best prevent disease in the population. Data on disease epidemiology and the efficacy or effectiveness of PCV schedules are typically considered when choosing a schedule. Practical concerns, such as the existing vaccine schedule, and vaccine program performance are also important. In low-income countries, pneumococcal disease and deaths typically peak well before the end of the first year of life, making a schedule that provides PCV doses early in life (eg, a 6-, 10- and 14-week schedule) potentially the best option. In other settings, a schedule including a booster dose may address disease that peaks in the second year of life or may be seen to enhance a schedule already in place. A large and growing body of evidence from immunogenicity studies, as well as clinical trials and observational studies of carriage, pneumonia and invasive disease, has been systematically reviewed; these data indicate that schedules of 3 or 4 doses all work well, and that the differences between these regimens are subtle, especially in a mature program in which coverage is high and indirect (herd) effects help enhance protection provided directly by a vaccine schedule. The recent World Health Organization policy statement on PCVs endorsed a schedule of 3 primary doses without a booster or, as a new alternative, 2 primary doses with a booster dose. While 1 schedule may be preferred in a particular setting based on local epidemiology or practical considerations, achieving high coverage with 3 doses is likely more important than the specific timing of doses. PMID:24336059

2014-01-01

479

Intelligent perturbation algorithms to space scheduling optimization

NASA Technical Reports Server (NTRS)

The limited availability and high cost of crew time and scarce resources make optimization of space operations critical. Advances in computer technology coupled with new iterative search techniques permit the near optimization of complex scheduling problems that were previously considered computationally intractable. Described here is a class of search techniques called Intelligent Perturbation Algorithms. Several scheduling systems which use these algorithms to optimize the scheduling of space crew, payload, and resource operations are also discussed.

Kurtzman, Clifford R.

1991-01-01

480

Multi-agent-based agile scheduling

The multi-agent systems paradigm represents one of the most promising approaches to the development of agile scheduling systems in manufacturing. Innovative and balanced perspectives on multi-agent approaches to agile scheduling are discussed in this paper, and several achieved results and current developments are described. The description of the multi-agent framework emphasizes the use of a negotiation mechanism to improve the scheduling

Ricardo J. Rabelo; Luis M. Camarinha-matos; Hamideh Afsarmanesh

1999-01-01

481

Improved Utilization and Responsiveness with Gang Scheduling

Most commercial multicomputers use space-slicing schemes in which each scheduling decision has an unknown impact on the future: should a job be scheduled, risking that it will block other larger jobs later, or should the processors be left idle for now in anticipation of future arrivals? This dilemma is solved by using gang scheduling, because then the impact of each decision is limited to its time slice,

Dror G. Feitelson; Morris A. Jette

1997-01-01

482

A hybrid job-shop scheduling system

NASA Technical Reports Server (NTRS)

The scheduling system developed at the Fraunhofer-Institute for Material Flow and Logistics is intended to support a scheduler working in a job shop. The requirements of a job-shop scheduling system make the use of flexible knowledge representation and processing techniques necessary. Within this system, an attempt was made to combine the advantages of symbolic AI techniques with those of neural networks.

Hellingrath, Bernd; Robbach, Peter; Bayat-Sarmadi, Fahid; Marx, Andreas

1992-01-01

483

19 CFR 162.45a - Summary forfeiture of Schedule I and Schedule II controlled substances.

Code of Federal Regulations, 2010 CFR

...DEPARTMENT OF THE TREASURY (CONTINUED) INSPECTION, SEARCH, AND SEIZURE Treatment of Seized Merchandise § 162.45a Summary...881. See 21 U.S.C. 965. Accordingly, in the case of a seizure of Schedule I or Schedule II controlled...

2010-04-01

484

Scheduling and Event Services Guidelines Manual

A guide to holding an event at the University. Contents include "About the Manual" and "Audio Visual Options and Definitions".

Cina, Jeff

485

Network-aware HEFT scheduling for grid.

We present a network-aware HEFT. The original HEFT does not account for parallel network flows when designing its schedule for a computational environment whose computing nodes are at physically distant locations. In the proposed mechanism, such data transfers are stretched to their realistic completion times. A HEFT schedule with stretched data transfers exhibits the realistic makespan of the schedule. It is shown how misleading a schedule can be if the impact of parallel data transfers that share a bottleneck is ignored. A network-aware HEFT can be used to benefit Grid applications. PMID:24587719
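
The effect of stretching concurrent transfers can be illustrated with a minimal max-min fair-sharing model (an assumption for illustration; the abstract does not give the paper's exact stretching rule):

```python
def fair_share_finish(volumes, capacity):
    """Finish times of data transfers that start together and fair-share one
    bottleneck link of the given capacity (max-min sharing assumed).

    Returns finish times in the original order of `volumes`.
    """
    order = sorted(range(len(volumes)), key=lambda i: volumes[i])
    finish = [0.0] * len(volumes)
    t, prev = 0.0, 0.0
    for rank, i in enumerate(order):
        active = len(volumes) - rank          # flows still running
        t += (volumes[i] - prev) * active / capacity
        finish[i] = t
        prev = volumes[i]
    return finish
```

Two transfers of volume 10 on a link of capacity 10 each finish at t = 2, twice the naive volume/capacity estimate of 1 that a network-unaware schedule would assume.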

Yousaf, Muhammad Murtaza; Welzl, Michael

2014-01-01

486

Production scheduling and rescheduling with genetic algorithms.

A general model for job shop scheduling is described which applies to static, dynamic and non-deterministic production environments. Next, a Genetic Algorithm is presented which solves the job shop scheduling problem. This algorithm is tested in a dynamic environment under different workload situations. Thereby, a highly efficient decoding procedure is proposed which strongly improves the quality of schedules. Finally, this technique is tested for scheduling and rescheduling in a non-deterministic environment. It is shown by experiment that conventional methods of production control are clearly outperformed at reasonable run-time costs. PMID:10199993

Bierwirth, C; Mattfeld, D C

1999-01-01

487

Objective: To compare the efficacy of short messaging service (SMS) text message and telephone reminders in improving attendance rates at a health promotion center. Methods: A total of 1859 participants who had scheduled appointments at the health promotion center of our hospital from April 2007 to May 2007 were enrolled in the study and randomly assigned to 3 groups: control (no reminder), SMS text message reminder, and telephone reminder. Attendance rates and costs of the interventions were collected. Results: A total of 1848 participants were eligible for analysis. Attendance rates in the control, SMS, and telephone groups were 80.5%, 87.5%, and 88.3%, respectively. Attendance rates were significantly higher in the SMS and telephone groups than in the control group, with an odds ratio of 1.698 (95% confidence interval 1.224 to 2.316, P=0.001) for the SMS group and 1.829 (95% confidence interval 1.333 to 2.509, P<0.001) for the telephone group. However, there was no difference between the SMS group and the telephone group (P=0.670). The cost-effectiveness analysis showed that the cost per attendance in the SMS group (0.31 Yuan) was significantly lower than in the telephone group (0.48 Yuan). Conclusion: SMS and telephone reminders are effective in improving attendance rates at a health promotion center, and the SMS reminder may be more cost-effective than the telephone reminder. PMID:18196610
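
Odds ratios and confidence intervals of this kind follow from standard 2x2-table formulas, sketched below. The counts in the usage example are hypothetical, chosen only for illustration; they are not the study's data.

```python
import math

def odds_ratio_ci(attended_tx, missed_tx, attended_ctrl, missed_ctrl, z=1.96):
    """Odds ratio of attendance (treatment vs. control) with a Wald 95% CI
    computed on the log-odds scale."""
    or_ = (attended_tx * missed_ctrl) / (missed_tx * attended_ctrl)
    se_log = math.sqrt(1 / attended_tx + 1 / missed_tx
                       + 1 / attended_ctrl + 1 / missed_ctrl)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi
```

For instance, hypothetical counts of 539 attended / 77 missed in a reminder group versus 496 / 120 in a control group give an odds ratio of about 1.69.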

Chen, Zhou-wen; Fang, Li-zheng; Chen, Li-ying; Dai, Hong-lei

2008-01-01

488

Piercing d-intervals. Noga Alon. A (homogeneous) d-interval is a union of d closed intervals.

Shamir, Ron

489

Finding relevant sequences in time series containing crisp, interval, and fuzzy interval data.

Finding similar sequences in time series has received much attention and is a widely studied topic. Most existing approaches in the time series area focus on the efficiency of algorithms but seldom provide a means to handle imprecise data. In this paper, a more general approach is proposed to measure the distance of time sequences containing crisp values, intervals, and fuzzy intervals as well. The concept of distance measurement and its associated dynamic-programming-based algorithms are described. In addition to finding the sequences with similar evolving trends, a means of finding the sequences with opposite evolving tendencies is also proposed, which is usually omitted in current related research but could be of great interest to many users. PMID:15503502

Liao, Stephen Shaoyi; Tang, Tony Heng; Liu, Wei-Yi

2004-10-01

490

Function approximation using adaptive and overlapping intervals

A problem common to many disciplines is to approximate a function given only its values at various points in input-variable space. A method is proposed for approximating a mapping from several input variables to one output variable. The model takes the form of a weighted average of overlapping basis functions defined over intervals. The number of such basis functions and their parameters (widths and centers) are automatically determined from given training data by a learning algorithm. The proposed algorithm can be seen as placing a nonuniform multidimensional grid in the input domain with overlapping cells. The non-uniformity and overlap of the cells are achieved by a learning algorithm that optimizes a given objective function. This approach is motivated by the fuzzy modeling approach and by learning algorithms used for clustering and classification in pattern recognition. The basics of why and how the approach works are given. A few examples of nonlinear regression and classification are modeled. The relationship between the proposed technique, radial basis neural networks, kernel regression, probabilistic neural networks, and fuzzy modeling is explained. Finally, advantages and disadvantages are discussed.
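
The weighted-averaging scheme can be sketched as follows. The triangular shape of the basis functions is an assumption (the abstract does not fix the shape), and in the full method the centers and widths would be tuned by the learning algorithm rather than fixed by hand:

```python
def membership(x, center, width):
    # triangular basis function supported on [center - width, center + width]
    return max(0.0, 1.0 - abs(x - center) / width)

def approximate(x, centers, widths, weights):
    """Weighted average of overlapping interval basis functions:
    f(x) = sum_i mu_i(x) * w_i / sum_i mu_i(x)."""
    mus = [membership(x, c, w) for c, w in zip(centers, widths)]
    total = sum(mus)
    if total == 0.0:
        return 0.0          # x falls outside every interval
    return sum(m * w for m, w in zip(mus, weights)) / total
```

For example, with centers [0, 1], widths [1, 1], and weights [0, 2], the approximation at x = 0.5 is 1.0, halfway between the two overlapping local values.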

Patil, R.B.

1995-05-01

491

Statistical Coding and Decoding of Heartbeat Intervals

The heart integrates neuroregulatory messages into specific bands of frequency, such that the overall amplitude spectrum of the cardiac output reflects the variations of the autonomic nervous system. This modulatory mechanism seems to be well adjusted to the unpredictability of the cardiac demand, maintaining a proper cardiac regulation. A longstanding theory holds that biological organisms facing an ever-changing environment are likely to evolve adaptive mechanisms to extract essential features in order to adjust their behavior. The key question, however, has been to understand how the neural circuitry self-organizes these feature detectors to select behaviorally relevant information. Previous studies in computational perception suggest that a neural population enhances information that is important for survival by minimizing the statistical redundancy of the stimuli. Herein we investigate whether the cardiac system makes use of a redundancy reduction strategy to regulate the cardiac rhythm. Based on a network of neural filters optimized to code heartbeat intervals, we learn a population code that maximizes the information across the neural ensemble. The emerging population code displays filter tuning properties whose characteristics explain diverse aspects of the autonomic cardiac regulation, such as the compromise between fast and slow cardiac responses. We show that the filters yield responses that are quantitatively similar to observed heart rate responses during direct sympathetic or parasympathetic nerve stimulation. Our findings suggest that the heart decodes autonomic stimuli according to information theory principles analogous to how perceptual cues are encoded by sensory systems. PMID:21694763

Lucena, Fausto; Barros, Allan Kardec; Príncipe, José C.; Ohnishi, Noboru

2011-01-01

492

The Interval approach to braneworld gravity

Gravity in five-dimensional braneworld backgrounds may exhibit extra scalar degrees of freedom with problematic features, including kinetic ghosts and strong coupling behavior. Analysis of such effects is hampered by the standard heuristic approaches to braneworld gravity, which use the equations of motion as the starting point, supplemented by orbifold projections and junction conditions. Here we develop the interval approach to braneworld gravity, which begins with an action principle. This shows how to implement general covariance, despite allowing metric fluctuations that do not vanish on the boundaries. We reproduce simple Z_2 orbifolds of gravity, even though in this approach we never perform a Z_2 projection. We introduce a family of "straight gauges", which are bulk coordinate systems in which both branes appear as straight slices in a single coordinate patch. Straight gauges are extremely useful for analyzing metric fluctuations in braneworld models. By explicit gauge fixing, we show that a general AdS_5/AdS_4 setup with two branes has at most a radion, but no physical "brane-bending" modes.

Carena, Marcela (Fermilab); Lykken, Joseph D. (Fermilab; Univ. of Chicago, EFI); Park, Minjoon (Univ. of Chicago, EFI)

2005-06-01

493

Balanced reconnection intervals: four case studies

NASA Astrophysics Data System (ADS)

During steady magnetospheric convection (SMC) events the magnetosphere is active, yet there are no data signatures of a large-scale reconfiguration, such as a substorm. While this definition has been used for years, it fails to elucidate the true physics occurring within the magnetosphere, which is that the dayside merging rate and the nightside reconnection rate balance. Thus, it is suggested that these events be renamed Balanced Reconnection Intervals (BRIs). This paper investigates four diverse BRI events that support the idea that