Science.gov

Sample records for expected utility maximization

  1. Expected Power-Utility Maximization Under Incomplete Information and with Cox-Process Observations

    SciTech Connect

    Fujimoto, Kazufumi; Nagai, Hideo; Runggaldier, Wolfgang J.

    2013-02-15

    We consider the problem of maximizing expected terminal power utility (a risk-sensitive criterion). The underlying market model is a regime-switching diffusion model where the regime is determined by an unobservable factor process forming a finite-state Markov process. The main novelty is that prices are observed and the portfolio is rebalanced only at random times corresponding to a Cox process whose intensity is driven by the unobserved Markovian factor process as well. This leads to a more realistic model for many practical situations, such as markets with liquidity restrictions; on the other hand, it considerably complicates the problem to the point that traditional methodologies cannot be directly applied. The approach presented here is specific to the power utility. For log-utilities a different approach is presented in Fujimoto et al. (Preprint, 2012).
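
    A minimal numerical companion to the abstract above, under strong simplifying assumptions (a single-asset Black-Scholes market and a constant portfolio weight, rather than the paper's regime-switching model with Cox-process observations): the expected terminal power utility of each candidate weight is estimated by Monte Carlo and the best constant weight is picked. All parameter values and function names are illustrative, not taken from the paper.

      import numpy as np

      def expected_power_utility(weight, gamma=0.5, mu=0.07, sigma=0.2, r=0.02,
                                 x0=1.0, T=1.0, n_steps=252, n_paths=20000, seed=0):
          # Monte Carlo estimate of E[X_T^gamma / gamma] for a constant-weight
          # portfolio in a toy single-asset market (stand-in for the paper's model).
          rng = np.random.default_rng(seed)
          dt = T / n_steps
          x = np.full(n_paths, x0)
          for _ in range(n_steps):
              dW = rng.normal(0.0, np.sqrt(dt), n_paths)
              # self-financing wealth dynamics with fraction `weight` in the risky asset
              x = x * (1.0 + r * dt + weight * ((mu - r) * dt + sigma * dW))
          return np.mean(x ** gamma / gamma)

      # crude grid search over constant weights; for this toy market the Merton ratio
      # (mu - r) / ((1 - gamma) * sigma**2) is the known closed-form optimum
      weights = np.linspace(0.0, 2.0, 21)
      best = max(weights, key=expected_power_utility)
      print("best constant weight:", best)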

  2. Robust estimation by expectation maximization algorithm

    NASA Astrophysics Data System (ADS)

    Koch, Karl Rudolf

    2013-02-01

    A mixture of normal distributions is assumed for the observations of a linear model. The first component of the mixture represents the measurements without gross errors, while each of the remaining components gives the distribution for an outlier. Missing data are introduced to deliver the information as to which observation belongs to which component. The unknown location parameters and the unknown scale parameter of the linear model are estimated by the EM algorithm, which is applied iteratively. The E (expectation) step of the algorithm determines the expected value of the likelihood function given the observations and the current estimate of the unknown parameters, while the M (maximization) step computes new estimates by maximizing the expectation of the likelihood function. In comparison to Huber's M-estimation, the EM algorithm not only identifies outliers by introducing small weights for large residuals but also estimates the outliers, so they can be corrected by the parameters of the linear model freed from the distortions caused by gross errors. Monte Carlo methods with random variates from the normal distribution then give expectations, variances, covariances and confidence regions for functions of the parameters estimated by taking care of the outliers. The method is demonstrated by the analysis of measurements with gross errors from a laser scanner.
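
    A minimal sketch of the E and M steps described above, for the simplest case of a two-component normal mixture (a narrow "good data" component plus a broad outlier component) rather than Koch's full linear-model estimator; the starting values and data below are illustrative assumptions.

      import numpy as np

      def em_two_component(y, n_iter=50):
          # EM for a two-component normal mixture: a narrow inlier component and a
          # wide outlier component (simplified stand-in for the abstract's model).
          mu, sigma = np.median(y), np.std(y)
          mu_out, sigma_out = mu, 5.0 * sigma     # broad outlier component
          pi_in = 0.9                             # prior weight of the inlier component
          for _ in range(n_iter):
              # E step: responsibilities r_i = P(inlier | y_i) under current parameters
              p_in = pi_in * norm_pdf(y, mu, sigma)
              p_out = (1 - pi_in) * norm_pdf(y, mu_out, sigma_out)
              r = p_in / (p_in + p_out)
              # M step: weighted maximum-likelihood updates of all parameters
              pi_in = r.mean()
              mu = np.sum(r * y) / np.sum(r)
              sigma = np.sqrt(np.sum(r * (y - mu) ** 2) / np.sum(r))
              mu_out = np.sum((1 - r) * y) / np.sum(1 - r)
              sigma_out = np.sqrt(np.sum((1 - r) * (y - mu_out) ** 2) / np.sum(1 - r))
          return mu, sigma, r

      def norm_pdf(x, mu, sigma):
          return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

      # 95 clean observations plus 5 gross errors
      y = np.concatenate([np.random.normal(10.0, 0.5, 95), np.random.normal(25.0, 3.0, 5)])
      mu_hat, sigma_hat, resp = em_two_component(y)
      print(mu_hat, sigma_hat)          # location/scale of the "clean" component
      print(np.where(resp < 0.5)[0])    # indices flagged as likely outliers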

  3. Bridging Utility Maximization and Regret Minimization

    E-print Network

    Chiesa, Alessandro

    2013-12-03

    We relate the strategies obtained by (1) utility maximizers who use regret to refine their set of undominated strategies, and (2) regret minimizers who use weak domination to refine their sets of regret-minimizing strategies.

  4. Maximizing Resource Utilization in Video Streaming Systems

    ERIC Educational Resources Information Center

    Alsmirat, Mohammad Abdullah

    2013-01-01

    Video streaming has recently grown dramatically in popularity over the Internet, cable TV, and wireless networks. Because of the resource-demanding nature of video streaming applications, maximizing resource utilization in any video streaming system is a key factor in increasing the scalability and decreasing the cost of the system. Resources to…

  5. Blood detection in wireless capsule endoscopy using expectation maximization clustering

    NASA Astrophysics Data System (ADS)

    Hwang, Sae; Oh, JungHwan; Cox, Jay; Tang, Shou Jiang; Tibbals, Harry F.

    2006-03-01

    Wireless Capsule Endoscopy (WCE) is a relatively new technology (FDA approved in 2002) allowing doctors to view most of the small intestine. Other endoscopies such as colonoscopy, upper gastrointestinal endoscopy, push enteroscopy, and intraoperative enteroscopy can be used to visualize the stomach, duodenum, colon, and terminal ileum, but there existed no method to view most of the small intestine without surgery. With the miniaturization of wireless and camera technologies came the ability to view the entire gastrointestinal tract with little effort. A tiny disposable video capsule is swallowed, transmitting two images per second to a small data receiver worn by the patient on a belt. During an approximately 8-hour course, over 55,000 images are recorded to the worn device and then downloaded to a computer for later examination. Typically, a medical clinician spends more than two hours analyzing a WCE video. Research has attempted to automatically find abnormal regions (especially bleeding) to reduce the time needed to analyze the videos. The manufacturer also provides a software tool to detect bleeding, called the Suspected Blood Indicator (SBI), but its accuracy is not high enough to replace human examination; the sensitivity and specificity of SBI have been reported as about 72% and 85%, respectively. To address this problem, we propose a technique to detect bleeding regions automatically utilizing the Expectation Maximization (EM) clustering algorithm. Our experimental results indicate that the proposed bleeding detection method achieves a sensitivity of 92% and a specificity of 98%.

  6. IMPROVED COLOR BARCODES VIA EXPECTATION MAXIMIZATION STYLE INTERFERENCE CANCELLATION

    E-print Network

    Sharma, Gaurav

    A framework is presented for extending monochrome barcodes to color with increased data rates. Undesired absorptions give rise to cross-channel color interference that significantly deteriorates the performance of the color barcode; an expectation maximization style interference cancellation scheme is proposed to mitigate this.

  7. PEM-PCA: A Parallel Expectation-Maximization PCA Face Recognition Architecture

    PubMed Central

    Rujirakul, Kanokmon; Arnonkijpanich, Banchar

    2014-01-01

    Principal component analysis (PCA) has traditionally been used as a feature extraction technique in face recognition systems, yielding high accuracy while requiring a small number of features. However, the covariance matrix and eigenvalue decomposition stages cause high computational complexity, especially for a large database. Thus, this research presents an alternative approach utilizing an Expectation-Maximization algorithm to reduce the determinant matrix manipulation, resulting in reduced stage complexity. To improve the computational time, a novel parallel architecture was employed to exploit parallelization of the matrix computation during the feature extraction and classification stages, including parallel preprocessing and their combinations, yielding the so-called Parallel Expectation-Maximization PCA architecture. Compared to traditional PCA and its derivatives, the results indicate lower complexity with an insignificant difference in recognition precision, leading to high-speed face recognition systems with speed-ups of over nine and three times over PCA and Parallel PCA, respectively. PMID:24955405

  8. An Expectation-Maximization Method for Calibrating Synchronous Machine Models

    SciTech Connect

    Meng, Da; Zhou, Ning; Lu, Shuai; Lin, Guang

    2013-07-21

    The accuracy of a power system dynamic model is essential to its secure and efficient operation. Lower confidence in model accuracy usually leads to conservative operation and lowers asset usage. To improve model accuracy, this paper proposes an expectation-maximization (EM) method to calibrate the synchronous machine model using phasor measurement unit (PMU) data. First, an extended Kalman filter (EKF) is applied to estimate the dynamic states using measurement data. Then, the parameters are calculated from the estimated states using the maximum likelihood estimation (MLE) method. The EM method iterates over the preceding two steps to improve estimation accuracy. The proposed EM method's performance is evaluated using a single-machine infinite bus system and compared with a method where both states and parameters are estimated using an EKF. Sensitivity studies of the parameter calibration using the EM method are also presented to show the robustness of the proposed method for different levels of measurement noise and initial parameter uncertainty.
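
    A toy analogue of the two-step iteration described above, under simplifying assumptions: a scalar linear-Gaussian state-space model (not a synchronous machine), a Kalman filter plus RTS smoother standing in for the EKF state estimation, and a single calibrated parameter (the measurement-noise variance R) updated by maximum likelihood in each M step. All model values are illustrative.

      import numpy as np

      def em_calibrate_R(y, a=0.95, Q=0.05, R0=1.0, n_iter=30):
          # E step: Kalman filter + RTS smoother for the hidden state given current R.
          # M step: closed-form maximum-likelihood update of R.
          # Model: x_{t+1} = a*x_t + w, w ~ N(0, Q);  y_t = x_t + v, v ~ N(0, R).
          T = len(y)
          R = R0
          for _ in range(n_iter):
              xf = np.zeros(T); Pf = np.zeros(T)          # filtered mean / variance
              xpred = np.zeros(T); Ppred = np.zeros(T)    # one-step predictions
              xp, Pp = 0.0, 1.0                           # prior on x_0
              for t in range(T):
                  xpred[t], Ppred[t] = xp, Pp
                  K = Pp / (Pp + R)
                  xf[t] = xp + K * (y[t] - xp)
                  Pf[t] = (1 - K) * Pp
                  xp, Pp = a * xf[t], a * a * Pf[t] + Q
              xs = xf.copy(); Ps = Pf.copy()              # RTS smoother
              for t in range(T - 2, -1, -1):
                  J = a * Pf[t] / Ppred[t + 1]
                  xs[t] = xf[t] + J * (xs[t + 1] - xpred[t + 1])
                  Ps[t] = Pf[t] + J * J * (Ps[t + 1] - Ppred[t + 1])
              R = np.mean((y - xs) ** 2 + Ps)             # M step for R
          return R, xs

      rng = np.random.default_rng(1)
      x, ys = 0.0, []
      for _ in range(500):
          x = 0.95 * x + rng.normal(0, np.sqrt(0.05))
          ys.append(x + rng.normal(0, np.sqrt(0.4)))
      R_hat, _ = em_calibrate_R(np.array(ys))
      print("estimated measurement noise variance:", R_hat)   # roughly recovers 0.4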

  9. Polynomial-Time Approximation Schemes for Maximizing Gross Substitutes Utility

    E-print Network

    Shioura, Akiyoshi

    We consider the maximization of a gross substitutes utility function under budget constraints. This problem is a generalization of the budgeted max-weight matroid intersection problem to the one…

  10. Statistical Inference of DNA Translocation using Parallel Expectation Maximization

    NASA Astrophysics Data System (ADS)

    Emmett, Kevin; Rosenstein, Jacob; Pfau, David; Bamberger, Akiva; Shepard, Ken; Wiggins, Chris

    2013-03-01

    DNA translocation through a nanopore is an attractive candidate for a next-generation DNA sequencing platform; however, the stochastic motion of the molecules within the pore, allowing both forward and backward movement, prevents easy inference of the true sequence from observed data. We model diffusion of an input DNA sequence through a nanopore as a biased random walk with noise, and describe an algorithm for efficient statistical reconstruction of the input sequence, given data consisting of a set of time-series traces. The data are modeled as a hidden Markov model, and parallel expectation maximization is used to learn the most probable input sequence generating the observed traces. Bounds on inference accuracy are analyzed as a function of model parameters, including forward bias, error rate, and the number of traces. The number of traces is shown to have the strongest influence on algorithm performance, allowing for high inference accuracy even in extremely noisy environments. Incorrectly identified state transitions account for the majority of inference errors, and we introduce entropy-based metaheuristics for identifying and eliminating these errors. Inference is robust, fast, and scales to input sequences on the order of several kilobases.

  11. Fluorescence photobleaching correction for expectation-maximization algorithm

    NASA Astrophysics Data System (ADS)

    Conchello, Jose-Angel

    1995-03-01

    In 3D fluorescence microscopy, a series of 2D images is collected at different focal settings through the specimen. Each image in this series contains the in-focus plane plus contributions from out-of-focus structures that blur the image. Furthermore, as the series is collected, the fluorescent dye in the specimen fades over time in response to the total excitation light dosage, which progressively increases as more optical slices are collected. Thus, the different optical slices are 2D images of different 3D objects, in the sense that at each time point the object has a different overall intensity. To date, the approach to compensate for this decay has been to precondition the image by dividing the intensities in each optical slice by a decaying exponential before processing the image with any of a number of existing deblurring algorithms. We have now directly incorporated fluorescence decay into maximum-likelihood estimators for the 3D distribution of fluorescent dye. We derived a generalized expectation-maximization algorithm for the simultaneous estimation of the decay constant, considered homogeneous, and the distribution of fluorescent dye.
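
    A small illustration of the conventional pre-correction that the abstract contrasts with its joint-estimation approach: each optical slice is divided by a decaying exponential before deblurring. The decay constant and array shapes below are assumed for the example.

      import numpy as np

      def precorrect_bleaching(stack, k):
          # Conventional photobleaching pre-correction: divide slice t of the
          # through-focus stack (n_slices, ny, nx) by exp(-k * t) before deconvolution.
          t = np.arange(stack.shape[0], dtype=float)
          return stack / np.exp(-k * t)[:, None, None]

      # toy stack: constant object whose apparent intensity fades slice by slice
      true_obj = np.ones((16, 64, 64))
      observed = true_obj * np.exp(-0.05 * np.arange(16))[:, None, None]
      restored = precorrect_bleaching(observed, k=0.05)
      print(np.allclose(restored, true_obj))   # True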

  12. Matching Pupils and Teachers to Maximize Expected Outcomes.

    ERIC Educational Resources Information Center

    Ward, Joe H., Jr.; And Others

    To achieve a good teacher-pupil match, it is necessary (1) to predict the learning outcomes that will result when each student is instructed by each teacher, (2) to use the predicted performance to compute an Optimality Index for each teacher-pupil combination to indicate the quality of each combination toward maximizing learning for all students,…

  13. Coding for Parallel Links to Maximize the Expected Value of Decodable Messages

    NASA Technical Reports Server (NTRS)

    Klimesh, Matthew A.; Chang, Christopher S.

    2011-01-01

    When multiple parallel communication links are available, it is useful to consider link-utilization strategies that provide tradeoffs between reliability and throughput. Interesting cases arise when there are three or more available links. Under the model considered, the links have known probabilities of being in working order, and each link has a known capacity. The sender has a number of messages to send to the receiver. Each message has a size and a value (i.e., a worth or priority). Messages may be divided into pieces arbitrarily, and the value of each piece is proportional to its size. The goal is to choose combinations of messages to send on the links so that the expected value of the messages decodable by the receiver is maximized. There are three parts to the innovation: (1) Applying coding to parallel links under the model; (2) Linear programming formulation for finding the optimal combinations of messages to send on the links; and (3) Algorithms for assisting in finding feasible combinations of messages, as support for the linear programming formulation. There are similarities between this innovation and methods developed in the field of network coding. However, network coding has generally been concerned with either maximizing throughput in a fixed network, or robust communication of a fixed volume of data. In contrast, under this model, the throughput is expected to vary depending on the state of the network. Examples of error-correcting codes that are useful under this model but which are not needed under previous models have been found. This model can represent either a one-shot communication attempt, or a stream of communications. Under the one-shot model, message sizes and link capacities are quantities of information (e.g., measured in bits), while under the communications stream model, message sizes and link capacities are information rates (e.g., measured in bits/second). This work has the potential to increase the value of data returned from spacecraft under certain conditions.
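
    A much-simplified sketch of the linear-programming idea in part (2) above, ignoring coding entirely: messages are split fractionally across links, each link works with a known probability, and the expected decodable value is maximized with scipy.optimize.linprog. All numbers are illustrative and this is not the innovation's actual formulation.

      import numpy as np
      from scipy.optimize import linprog

      # illustrative data (assumed): link working probabilities and capacities,
      # message sizes and values (value of a piece is proportional to its size)
      p = np.array([0.9, 0.7, 0.5])          # P(link works)
      cap = np.array([100.0, 150.0, 200.0])  # link capacities
      size = np.array([120.0, 80.0, 60.0])   # message sizes
      value = np.array([240.0, 100.0, 90.0]) # message values
      density = value / size                 # value per unit size

      n_links, n_msgs = len(p), len(size)
      # decision variable x[i, j] = amount of message j on link i, flattened row-major;
      # maximize expected decodable value  =>  minimize its negative
      c = -(p[:, None] * density[None, :]).ravel()

      A_ub, b_ub = [], []
      for i in range(n_links):                        # link capacity constraints
          row = np.zeros(n_links * n_msgs); row[i * n_msgs:(i + 1) * n_msgs] = 1.0
          A_ub.append(row); b_ub.append(cap[i])
      for j in range(n_msgs):                         # cannot send more than the message size
          row = np.zeros(n_links * n_msgs); row[j::n_msgs] = 1.0
          A_ub.append(row); b_ub.append(size[j])

      res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=(0, None))
      print("expected decodable value:", -res.fun)
      print(res.x.reshape(n_links, n_msgs))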

  14. Disconfirmation of Expectations of Utility in e-Learning

    ERIC Educational Resources Information Center

    Cacao, Rosario

    2013-01-01

    Using pre-training and post-training paired surveys in e-learning based training courses, we have compared the "expectations of utility," measured at the beginning of an e-learning course, with the "perceptions of utility," measured at the end of the course, and related it with the trainees' motivation. We have concluded…

  15. Optimal weight based on energy imbalance and utility maximization

    NASA Astrophysics Data System (ADS)

    Sun, Ruoyan

    2016-01-01

    This paper investigates the optimal weight for both males and females using energy imbalance and utility maximization. Based on the difference between energy intake and expenditure, we develop a state equation that reveals the weight gain from this energy gap. We construct an objective function considering food consumption, eating habits and survival rate to measure utility. Through applying mathematical tools from optimal control methods and the qualitative theory of differential equations, we obtain the following results. For both males and females, the optimal weight is larger than the physiologically optimal weight calculated from the Body Mass Index (BMI). We also study the corresponding trajectories to the steady-state weight. Depending on the values of a few parameters, the steady state can either be a saddle point with a monotonic trajectory or a focus with dampened oscillations.
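
    The paper's state equation is not reproduced in the abstract; the sketch below is a generic energy-balance toy with assumed constants (roughly 7700 kcal per kg of body mass, expenditure proportional to weight), included only to illustrate how an energy gap drives a weight state equation.

      import numpy as np

      def simulate_weight(w0, intake_kcal, days, kcal_per_kg=7700.0, expend_per_kg=30.0):
          # Assumed toy state equation (not the paper's model):
          # dW/dt = (intake - expenditure(W)) / kcal_per_kg, with expenditure ~ 30 kcal/kg/day.
          w = w0
          traj = [w]
          for _ in range(days):
              energy_gap = intake_kcal - expend_per_kg * w   # kcal/day
              w = w + energy_gap / kcal_per_kg               # kg gained or lost that day
              traj.append(w)
          return np.array(traj)

      traj = simulate_weight(w0=70.0, intake_kcal=2400.0, days=365)
      print("steady-state weight implied by this toy model:", 2400.0 / 30.0, "kg")
      print("weight after one year:", round(traj[-1], 1), "kg")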

  16. Prediction on Travel-Time Distribution for Freeways Using Online Expectation Maximization Algorithm

    E-print Network

    Horowitz, Roberto

    An online expectation maximization algorithm is applied to freeway travel-time prediction. The approach uses the Link-Node Cell Transmission Model (LN-CTM) to model traffic and provides a probability distribution for travel time. On-ramp and mainline flow profiles…

  17. J. Mol. Biol. (1992) 223, 159-170: Expectation Maximization Algorithm for Identifying

    E-print Network

    Stormo, Gary

    1992-01-01

    An Expectation Maximization algorithm for identification of … DNA … is presented. The method is illustrated by application to 231 Escherichia coli DNA fragments known to contain promoters with variable spacings between their consensus regions. Maximum-likelihood tests…

  18. A compact formulation for maximizing the expected number of transplants in kidney exchange programs

    NASA Astrophysics Data System (ADS)

    Alvelos, Filipe; Klimentova, Xenia; Rais, Abdur; Viana, Ana

    2015-05-01

    Kidney exchange programs (KEPs) allow the exchange of kidneys between incompatible donor-recipient pairs. Optimization approaches can help KEPs in defining which transplants should be made among all incompatible pairs according to some objective. The most common objective is to maximize the number of transplants. In this paper, we propose an integer programming model which addresses the objective of maximizing the expected number of transplants, given that there are equal probabilities of failure associated with vertices and arcs. The model is compact, i.e., it has a polynomial number of decision variables and constraints, and therefore can be solved directly by a general-purpose integer programming solver (e.g., CPLEX).
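
    A brute-force illustration of the objective described above (expected number of transplants when every vertex and arc has an independent success probability), on a toy compatibility graph; the paper's contribution is a compact integer program solved by a general-purpose solver, which scales far beyond this enumeration. The graph, probabilities, and cycle lengths here are assumptions.

      from itertools import combinations

      # toy compatibility graph: arcs (donor_pair -> recipient_pair)
      arcs = {(0, 1), (1, 0), (1, 2), (2, 3), (3, 1)}
      p_vertex = 0.9   # probability a pair is still available at transplant time (assumed)
      p_arc = 0.8      # probability a planned transplant does not fail (assumed)

      def expected_transplants(cycle):
          # a cycle yields len(cycle) transplants only if every pair and every arc succeeds
          prob = (p_vertex ** len(cycle)) * (p_arc ** len(cycle))
          return len(cycle) * prob

      # enumerate simple 2- and 3-cycles
      cycles = []
      for a, b in combinations(range(4), 2):
          if (a, b) in arcs and (b, a) in arcs:
              cycles.append((a, b))
      for a, b, c in combinations(range(4), 3):
          for cyc in [(a, b, c), (a, c, b)]:
              if all((cyc[i], cyc[(i + 1) % 3]) in arcs for i in range(3)):
                  cycles.append(cyc)

      # choose a vertex-disjoint set of cycles maximizing the expected number of transplants
      best_val, best_sel = 0.0, []
      for r in range(len(cycles) + 1):
          for sel in combinations(cycles, r):
              used = [v for cyc in sel for v in cyc]
              if len(used) == len(set(used)):
                  val = sum(expected_transplants(cyc) for cyc in sel)
                  if val > best_val:
                      best_val, best_sel = val, list(sel)
      print(best_sel, round(best_val, 3))   # the 3-cycle wins despite its higher failure risk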

  19. From Ambiguity Aversion to a Generalized Expected Utility. Modeling Preferences in a Quantum Probabilistic Framework

    E-print Network

    Diederik Aerts; Sandro Sozzo

    2015-10-30

    Ambiguity and ambiguity aversion have been widely studied in decision theory and economics both at a theoretical and an experimental level. After Ellsberg's seminal studies challenging subjective expected utility theory (SEUT), several (mainly normative) approaches have been put forward to reproduce ambiguity aversion and Ellsberg-type preferences. However, Machina and other authors have pointed out some fundamental difficulties of these generalizations of SEUT to cope with some variants of Ellsberg's thought experiments, which has recently been experimentally confirmed. Starting from our quantum modeling approach to human cognition, we develop here a general probabilistic framework to model human decisions under uncertainty. We show that our quantum theoretical model faithfully represents different sets of data collected on both the Ellsberg and the Machina paradox situations, and is flexible enough to describe different subjective attitudes with respect to ambiguity. Our approach opens the way toward a quantum-based generalization of expected utility theory (QEUT), where subjective probabilities depend on the state of the conceptual entity at play and its interaction with the decision-maker, while preferences between acts are determined by the maximization of this 'state-dependent expected utility'.

  20. Outlier Detection for the Nonlinear Gauss Helmert Model With Variance Components by the Expectation Maximization Algorithm

    NASA Astrophysics Data System (ADS)

    Koch, Karl-Rudolf

    2014-09-01

    Best invariant quadratic unbiased estimates (BIQUE) of the variance and covariance components for a nonlinear Gauss-Helmert (GH) model are derived. To detect outliers, the expectation maximization (EM) algorithm based on the variance-inflation model and the mean-shift model is applied, which results in iteratively reweighted least squares. Each step of the iterations for the EM algorithm therefore includes first the iterations for linearizing the GH model and then the iterations for estimating the variance components. The method is applied to fit a surface in three-dimensional space to the three coordinates of points measured, for instance, by a laser scanner. The surface is represented by a polynomial of second degree and the variance components of the three coordinates are estimated. Outliers are detected by the EM algorithm based on the variance-inflation model and identified by the EM algorithm for the mean-shift model.

  1. Optimization in the utility maximization framework for conservation planning: a comparison of solution procedures in a study of multifunctional agriculture

    PubMed Central

    Stoms, David M.; Davis, Frank W.

    2014-01-01

    Quantitative methods of spatial conservation prioritization have traditionally been applied to issues in conservation biology and reserve design, though their use in other types of natural resource management is growing. The utility maximization problem is one form of a covering problem where multiple criteria can represent the expected social benefits of conservation action. This approach allows flexibility with a problem formulation that is more general than typical reserve design problems, though the solution methods are very similar. However, few studies have addressed optimization in utility maximization problems for conservation planning, and the effect of solution procedure is largely unquantified. Therefore, this study mapped five criteria describing elements of multifunctional agriculture to determine a hypothetical conservation resource allocation plan for agricultural land conservation in the Central Valley of CA, USA. We compared solution procedures within the utility maximization framework to determine the difference between an open source integer programming approach and a greedy heuristic, and find gains from optimization of up to 12%. We also model land availability for conservation action as a stochastic process and determine the decline in total utility compared to the globally optimal set using both solution algorithms. Our results are comparable to other studies illustrating the benefits of optimization for different conservation planning problems, and highlight the importance of maximizing the effectiveness of limited funding for conservation and natural resource management. PMID:25538868
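
    A toy contrast between the two solution procedures compared above: a greedy utility-per-cost heuristic versus exhaustive (optimal) selection under a budget. The parcels, costs, and utilities are made-up numbers; a real application would score parcels on the study's multifunctional-agriculture criteria and use an integer-programming solver.

      from itertools import combinations

      # illustrative parcels as (cost, utility) pairs and a total budget
      parcels = [(4, 9), (3, 7), (5, 10), (2, 3), (4, 6), (1, 2)]
      budget = 9

      def greedy(parcels, budget):
          # pick parcels by utility-per-cost until the budget is exhausted
          chosen, spent, total = [], 0, 0.0
          order = sorted(range(len(parcels)),
                         key=lambda i: parcels[i][1] / parcels[i][0], reverse=True)
          for i in order:
              cost, util = parcels[i]
              if spent + cost <= budget:
                  chosen.append(i); spent += cost; total += util
          return total, chosen

      def exact(parcels, budget):
          # exhaustive search: optimal for a handful of parcels (an IP solver scales further)
          best = (0.0, [])
          for r in range(len(parcels) + 1):
              for sel in combinations(range(len(parcels)), r):
                  cost = sum(parcels[i][0] for i in sel)
                  util = sum(parcels[i][1] for i in sel)
                  if cost <= budget and util > best[0]:
                      best = (util, list(sel))
          return best

      print("greedy:", greedy(parcels, budget))    # utility 18 in this example
      print("optimal:", exact(parcels, budget))    # utility 19: a gain from optimization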

  2. Optimization in the utility maximization framework for conservation planning: a comparison of solution procedures in a study of multifunctional agriculture

    USGS Publications Warehouse

    Kreitler, Jason R.; Stoms, David M.; Davis, Frank W.

    2014-01-01

    Quantitative methods of spatial conservation prioritization have traditionally been applied to issues in conservation biology and reserve design, though their use in other types of natural resource management is growing. The utility maximization problem is one form of a covering problem where multiple criteria can represent the expected social benefits of conservation action. This approach allows flexibility with a problem formulation that is more general than typical reserve design problems, though the solution methods are very similar. However, few studies have addressed optimization in utility maximization problems for conservation planning, and the effect of solution procedure is largely unquantified. Therefore, this study mapped five criteria describing elements of multifunctional agriculture to determine a hypothetical conservation resource allocation plan for agricultural land conservation in the Central Valley of CA, USA. We compared solution procedures within the utility maximization framework to determine the difference between an open source integer programming approach and a greedy heuristic, and find gains from optimization of up to 12%. We also model land availability for conservation action as a stochastic process and determine the decline in total utility compared to the globally optimal set using both solution algorithms. Our results are comparable to other studies illustrating the benefits of optimization for different conservation planning problems, and highlight the importance of maximizing the effectiveness of limited funding for conservation and natural resource management.

  3. Maximizing Light Utilization Efficiency and Hydrogen Production in Microalgal Cultures

    SciTech Connect

    Melis, Anastasios

    2014-12-31

    The project addressed the following technical barrier from the Biological Hydrogen Production section of the Fuel Cell Technologies Program Multi-Year Research, Development and Demonstration Plan: Low Sunlight Utilization Efficiency in Photobiological Hydrogen Production is due to a Large Photosystem Chlorophyll Antenna Size in Photosynthetic Microorganisms (Barrier AN: Light Utilization Efficiency).

  4. 76 FR 37376 - Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-27

    ... Management and Budget (67 FR 8452-8460), pursuant to section 515 of the Treasury and General Government... FR 8452-8460) that direct each federal agency to (1) Issue its own guidelines ensuring and maximizing... Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information...

  5. An Expectation Maximization based Method for Subcellular Particle Tracking using Multi-angle TIRF Microscopy*

    PubMed Central

    Liang, Liang; Shen, Hongying; De Camilli, Pietro; Toomre, Derek K.; Duncan, James S.

    2013-01-01

    Multi-angle total internal reflection fluorescence microscopy (MA-TIRFM) is a new generation of TIRF microscopy for studying cellular processes near the dorsal cell membrane in 4 dimensions (3D+t). To perform quantitative analysis using MA-TIRFM, it is necessary to track subcellular particles in these processes. In this paper, we propose a method based on a MAP framework for automatic particle tracking and apply it to track clathrin-coated pits (CCPs). The expectation maximization (EM) algorithm is employed to solve the MAP problem. To provide the initial estimates for the EM algorithm, we develop a forward filter based on the most probable trajectory (MPT) filter. Multiple linear models are used to model particle dynamics. For CCP tracking, we use two linear models to describe constrained Brownian motion and fluorophore variation according to CCP properties. The tracking method is evaluated on synthetic data and the results show that it has high accuracy. Results on real data, confirmed by expert cell biologists, are also presented. PMID:22003671

  6. Colocalization Estimation Using Graphical Modeling and Variational Bayesian Expectation Maximization: Towards a Parameter-Free Approach.

    PubMed

    Awate, Suyash P; Radhakrishnan, Thyagarajan

    2015-01-01

    In microscopy imaging, colocalization between two biological entities (e.g., protein-protein or protein-cell) refers to the (stochastic) dependencies between the spatial locations of the two entities in the biological specimen. Measuring colocalization between two entities relies on fluorescence imaging of the specimen using two fluorescent chemicals, each of which indicates the presence/absence of one of the entities at any pixel location. State-of-the-art methods for estimating colocalization rely on post-processing image data using an ad hoc sequence of algorithms with many free parameters that are tuned visually. This leads to loss of reproducibility of the results. This paper proposes a new framework for estimating the nature and strength of colocalization directly from corrupted image data by solving a single unified optimization problem that automatically deals with noise, object labeling, and parameter tuning. The proposed framework relies on probabilistic graphical image modeling and a novel inference scheme using variational Bayesian expectation maximization for estimating all model parameters, including colocalization, from data. Results on simulated and real-world data demonstrate improved performance over the state of the art. PMID:26221663

  7. Statistical models of synaptic transmission evaluated using the expectation-maximization algorithm.

    PubMed Central

    Stricker, C; Redman, S

    1994-01-01

    Amplitude fluctuations of evoked synaptic responses can be used to extract information on the probabilities of release at the active sites, and on the amplitudes of the synaptic responses generated by transmission at each active site. The parameters that describe this process must be obtained from an incomplete data set represented by the probability density of the evoked synaptic response. In this paper, the equations required to calculate these parameters using the Expectation-Maximization algorithm and the maximum likelihood criterion have been derived for a variety of statistical models of synaptic transmission. These models are ones where the probabilities associated with the different discrete amplitudes in the evoked responses are a) unconstrained, b) binomial, and c) compound binomial. The discrete amplitudes may be separated by equal (quantal) or unequal amounts, with or without quantal variance. Alternative models have been considered where the variance associated with the discrete amplitudes is sufficiently large such that no quantal amplitudes can be detected. These models involve the sum of a normal distribution (to represent failures) and a unimodal distribution (to represent the evoked responses). The implementation of the algorithm is described in each case, and its accuracy and convergence have been demonstrated. PMID:7948679

  8. The indexing ambiguity in serial femtosecond crystallography (SFX) resolved using an expectation maximization algorithm.

    PubMed

    Liu, Haiguang; Spence, John C H

    2014-11-01

    Crystallographic auto-indexing algorithms provide crystal orientations and unit-cell parameters and assign Miller indices based on the geometric relations between the Bragg peaks observed in diffraction patterns. However, if the Bravais symmetry is higher than the space-group symmetry, there will be multiple indexing options that are geometrically equivalent, and hence many ways to merge diffraction intensities from protein nanocrystals. Structure factor magnitudes from full reflections are required to resolve this ambiguity but only partial reflections are available from each XFEL shot, which must be merged to obtain full reflections from these 'stills'. To resolve this chicken-and-egg problem, an expectation maximization algorithm is described that iteratively constructs a model from the intensities recorded in the diffraction patterns as the indexing ambiguity is being resolved. The reconstructed model is then used to guide the resolution of the indexing ambiguity as feedback for the next iteration. Using both simulated and experimental data collected at an X-ray laser for photosystem I in the P63 space group (which supports a merohedral twinning indexing ambiguity), the method is validated. PMID:25485120

  9. A Local Scalable Distributed Expectation Maximization Algorithm for Large Peer-to-Peer Networks

    NASA Technical Reports Server (NTRS)

    Bhaduri, Kanishka; Srivastava, Ashok N.

    2009-01-01

    This paper offers a local distributed algorithm for expectation maximization in large peer-to-peer environments. The algorithm can be used for a variety of well-known data mining tasks in a distributed environment, such as clustering, anomaly detection, and target tracking, to name a few. This technology is crucial for many emerging peer-to-peer applications in bioinformatics, astronomy, social networking, sensor networks and web mining. Centralizing all or some of the data for building global models is impractical in such peer-to-peer environments because of the large number of data sources, the asynchronous nature of the peer-to-peer networks, and the dynamic nature of the data/network. The distributed algorithm we have developed in this paper is provably correct, i.e., it converges to the same result as a similar centralized algorithm, and can automatically adapt to changes in the data and the network. We show that the communication overhead of the algorithm is very low due to its local nature. This monitoring algorithm is then used as a feedback loop to sample data from the network and rebuild the model when it is outdated. We present thorough experimental results to verify our theoretical claims.

  10. An online expectation maximization algorithm for exploring general structure in massive networks

    NASA Astrophysics Data System (ADS)

    Chai, Bianfang; Jia, Caiyan; Yu, Jian

    2015-11-01

    Mixture models and the stochastic block model (SBM) for structure discovery employ a broad and flexible definition of vertex classes such that they are able to explore a wide variety of structure. Compared to the existing algorithms based on the SBM (whose time complexities are O(mc²), where m and c are the number of edges and clusters), the mixture-model algorithms are capable of dealing with networks with a large number of communities more efficiently due to their O(mc) time complexity. However, the mixture-model algorithms using the expectation maximization (EM) technique are still too slow to deal with real million-node networks, since they compute hidden variables on the entire network in each iteration. In this paper, an online variational EM algorithm is designed to improve the efficiency of the EM algorithms. In each iteration, our online algorithm samples a node and estimates its cluster memberships only from its adjacency links, and model parameters are then estimated from the memberships of the sampled node and the old model parameters obtained in the previous iteration. The provided online algorithm updates model parameters subsequently by the links of each newly sampled node and explores the general structure of massive and growing networks with millions of nodes and hundreds of clusters in hours. Compared to the relevant algorithms on synthetic and real networks, the proposed online algorithm costs less with little or no degradation of accuracy. The results illustrate that the presented algorithm offers a good trade-off between precision and efficiency.
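
    A minimal sketch of the online-EM pattern described above (process one sampled item per iteration, blend its sufficient statistics into running averages with a decaying step size, then refresh the parameters), shown for a 1-D Gaussian mixture rather than the paper's network model; the step-size schedule and data are assumptions.

      import numpy as np

      def online_em_gmm(x, k=2, n_steps=20000, seed=0):
          # Online EM for a 1-D Gaussian mixture: each step samples one observation,
          # computes its responsibilities (local E step), and updates running
          # sufficient statistics before re-deriving the parameters (M step).
          rng = np.random.default_rng(seed)
          mu = rng.choice(x, k); var = np.full(k, np.var(x)); w = np.full(k, 1.0 / k)
          # running sufficient statistics per component: [weight, weighted x, weighted x^2]
          s = np.stack([w, w * mu, w * (var + mu ** 2)], axis=1)
          for t in range(1, n_steps + 1):
              xi = x[rng.integers(len(x))]
              # local E step: responsibilities of the sampled point
              logp = -0.5 * ((xi - mu) ** 2 / var + np.log(var)) + np.log(w)
              r = np.exp(logp - logp.max()); r /= r.sum()
              # stochastic-approximation update of the sufficient statistics
              eta = (t + 10) ** -0.7
              s = (1 - eta) * s + eta * np.stack([r, r * xi, r * xi ** 2], axis=1)
              # M step: parameters from the running statistics
              w = s[:, 0] / s[:, 0].sum()
              mu = s[:, 1] / s[:, 0]
              var = np.maximum(s[:, 2] / s[:, 0] - mu ** 2, 1e-6)
          return w, mu, var

      x = np.concatenate([np.random.normal(-2, 1, 5000), np.random.normal(3, 0.5, 5000)])
      print(online_em_gmm(x))   # weights, means and variances of the two components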

  11. Bandwidth utilization maximization of scientific RF communication systems

    SciTech Connect

    Rey, D.; Ryan, W.; Ross, M.

    1997-01-01

    A method for more efficiently utilizing the frequency bandwidth allocated for data transmission is presented. Current space and range communication systems use modulation and coding schemes that transmit 0.5 to 1.0 bits per second per Hertz of radio frequency bandwidth. The goal in this LDRD project is to increase the bandwidth utilization by employing advanced digital communications techniques. This is done with little or no increase in the transmit power which is usually very limited on airborne systems. Teaming with New Mexico State University, an implementation of trellis coded modulation (TCM), a coding and modulation scheme pioneered by Ungerboeck, was developed for this application and simulated on a computer. TCM provides a means for reliably transmitting data while simultaneously increasing bandwidth efficiency. The penalty is increased receiver complexity. In particular, the trellis decoder requires high-speed, application-specific digital signal processing (DSP) chips. A system solution based on the QualComm Viterbi decoder and the Graychip DSP receiver chips is presented.

  12. Association Studies with Imputed Variants Using Expectation-Maximization Likelihood-Ratio Tests

    PubMed Central

    Huang, Kuan-Chieh; Sun, Wei; Wu, Ying; Chen, Mengjie; Mohlke, Karen L.; Lange, Leslie A.; Li, Yun

    2014-01-01

    Genotype imputation has become standard practice in modern genetic studies. As sequencing-based reference panels continue to grow, increasingly more markers are being well or better imputed, but at the same time, even more markers with relatively low minor allele frequency are being imputed with low imputation quality. Here, we propose new methods that incorporate imputation uncertainty into downstream association analysis, with improved power and/or computational efficiency. We consider two scenarios: I) when posterior probabilities of all potential genotypes are estimated; and II) when only the one-dimensional summary statistic, the imputed dosage, is available. For scenario I, we have developed an expectation-maximization likelihood-ratio test (EM-LRT) for association based on posterior probabilities. When only imputed dosages are available (scenario II), we first sample the genotype probabilities from their posterior distribution given the dosages, and then apply the EM-LRT on the sampled probabilities. Our simulations show that the type I error of the proposed EM-LRT methods is protected under both scenarios. Compared with existing methods, EM-LRT-Prob (for scenario I) offers optimal statistical power across a wide spectrum of MAF and imputation quality. EM-LRT-Dose (for scenario II) achieves a similar level of statistical power as EM-LRT-Prob and outperforms the standard Dosage method, especially for markers with relatively low MAF or imputation quality. Applications to two real data sets, the Cebu Longitudinal Health and Nutrition Survey study and the Women's Health Initiative Study, provide further support for the validity and efficiency of our proposed methods. PMID:25383782

  13. Automatic seed initialization for the expectation-maximization algorithm and its application in 3D medical imaging

    E-print Network

    Whelan, Paul F.

    Two methods for automatic seed initialization of the expectation-maximization algorithm and its application in 3D medical imaging are described: the first increases the number of candidate seeds until the required seeds are found; the second method tries to optimize the sum of squares, dividing the distribution curve into equal percentile cells.

  14. Very Slow Search and Reach: Failure to Maximize Expected Gain in an Eye-Hand Coordination Task

    E-print Network

    Maloney, Laurence T.

    We examined an eye-hand coordination task in which we recorded human observers' eye movements and hand movements and compared them with the optimal strategy…

  15. Computational rationality: linking mechanism and behavior through bounded utility maximization.

    PubMed

    Lewis, Richard L; Howes, Andrew; Singh, Satinder

    2014-04-01

    We propose a framework for including information-processing bounds in rational analyses. It is an application of bounded optimality (Russell & Subramanian, 1995) to the challenges of developing theories of mechanism and behavior. The framework is based on the idea that behaviors are generated by cognitive mechanisms that are adapted to the structure of not only the environment but also the mind and brain itself. We call the framework computational rationality to emphasize the incorporation of computational mechanism into the definition of rational action. Theories are specified as optimal program problems, defined by an adaptation environment, a bounded machine, and a utility function. Such theories yield different classes of explanation, depending on the extent to which they emphasize adaptation to bounds, and adaptation to some ecology that differs from the immediate local environment. We illustrate this variation with examples from three domains: visual attention in a linguistic task, manual response ordering, and reasoning. We explore the relation of this framework to existing "levels" approaches to explanation, and to other optimality-based modeling approaches. PMID:24648415

  16. What Does Industry Expect From An Electrical Utility?

    E-print Network

    Jensen, C. V.

    1989-01-01

    The electric utility industry is an important supplier to Union Carbide and as such must become a proactive participant in our quality programs which are aimed at continuous improvement in everything we do. The essential ingredients in the supplier...

  17. Recursive expectation-maximization clustering: A method for identifying buffering mechanisms composed of phenomic modules

    NASA Astrophysics Data System (ADS)

    Guo, Jingyu; Tian, Dehua; McKinney, Brett A.; Hartman, John L.

    2010-06-01

    Interactions between genetic and/or environmental factors are ubiquitous, affecting the phenotypes of organisms in complex ways. Knowledge about such interactions is becoming rate-limiting for our understanding of human disease and other biological phenomena. Phenomics refers to the integrative analysis of how all genes contribute to phenotype variation, entailing genome and organism level information. A systems biology view of gene interactions is critical for phenomics. Unfortunately the problem is intractable in humans; however, it can be addressed in simpler genetic model systems. Our research group has focused on the concept of genetic buffering of phenotypic variation, in studies employing the single-cell eukaryotic organism, S. cerevisiae. We have developed a methodology, quantitative high throughput cellular phenotyping (Q-HTCP), for high-resolution measurements of gene-gene and gene-environment interactions on a genome-wide scale. Q-HTCP is being applied to the complete set of S. cerevisiae gene deletion strains, a unique resource for systematically mapping gene interactions. Genetic buffering is the idea that comprehensive and quantitative knowledge about how genes interact with respect to phenotypes will lead to an appreciation of how genes and pathways are functionally connected at a systems level to maintain homeostasis. However, extracting biologically useful information from Q-HTCP data is challenging, due to the multidimensional and nonlinear nature of gene interactions, together with a relative lack of prior biological information. Here we describe a new approach for mining quantitative genetic interaction data called recursive expectation-maximization clustering (REMc). We developed REMc to help discover phenomic modules, defined as sets of genes with similar patterns of interaction across a series of genetic or environmental perturbations. Such modules are reflective of buffering mechanisms, i.e., genes that play a related role in the maintenance of physiological homeostasis. To develop the method, 297 gene deletion strains were selected based on gene-drug interactions with hydroxyurea, an inhibitor of ribonucleotide reductase enzyme activity, which is critical for DNA synthesis. To partition the gene functions, these 297 deletion strains were challenged with growth inhibitory drugs known to target different genes and cellular pathways. Q-HTCP-derived growth curves were used to quantify all gene interactions, and the data were used to test the performance of REMc. Fundamental advantages of REMc include objective assessment of total number of clusters and assignment to each cluster a log-likelihood value, which can be considered an indicator of statistical quality of clusters. To assess the biological quality of clusters, we developed a method called gene ontology information divergence z-score (GOid_z). GOid_z summarizes total enrichment of GO attributes within individual clusters. Using these and other criteria, we compared the performance of REMc to hierarchical and K-means clustering. The main conclusion is that REMc provides distinct efficiencies for mining Q-HTCP data. It facilitates identification of phenomic modules, which contribute to buffering mechanisms that underlie cellular homeostasis and the regulation of phenotypic expression.

  18. 76 FR 49473 - Petition to Maximize Practical Utility of List 1 Chemicals Screened Through EPA's Endocrine...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-10

    ... AGENCY Petition to Maximize Practical Utility of List 1 Chemicals Screened Through EPA's Endocrine... decisions on data received in response to the test orders issued under the Endocrine Disruptor Screening...'' system, which means EPA will not know your identity or contact information unless you provide it in...

  19. Optimum Charging Profile for Lithium-ion Batteries to Maximize Energy Storage and Utilization

    E-print Network

    Subramanian, Venkat

    The optimal profile of charging current for a lithium-ion battery is estimated using dynamic optimization of the system behavior of the Li-ion battery. Dynamic optimization is made possible due to the computationally…

  20. 76 FR 51060 - Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ...The Marine Mammal Commission adopts these guidelines to ensure and maximize the quality, objectivity, utility, and integrity of information disseminated by the agency in accordance with the directive issued by the Office of Management and Budget (67 FR 8452-8460), pursuant to section 515 of the Treasury and General Government Appropriations Act for Fiscal Year...

  1. Power Utility Maximization for Multiple-Supply Systems by a Load-Matching Switch

    E-print Network

    Shinozuka, Masanobu

    For embedded systems that rely on multiple power sources (MPS), power management must distribute the power by matching the supply and demand in conjunction with the traditional…

  2. Crustacean hemolymph microbiota: Endemic, tightly controlled, and utilization expectable.

    PubMed

    Wang, Xian-Wei; Wang, Jin-Xing

    2015-12-01

    An increasing body of evidence suggests that the hemolymph of many apparently healthy invertebrates is not sterile. Investigation of hemolymph microbiota properties and of the homeostasis between host and bacteria is helpful to reveal bacterial pathogenesis, host immunity, and possible utilization in disease control. Crustaceans represent a large group of aquatic animals, and crustacean fisheries are therefore of important economic value worldwide. Research related to the crustacean hemolymph microbiota has been performed over the years. In the present study, we summarize currently available information and present a comprehensive analysis of the homeostasis between host and bacteria. In general, the presence of microbiota in crustacean hemolymph is an endemic event and can be influenced by internal and external factors. Opportunistic bacteria may have undergone some changes or mutations under hemolymph stress. Meanwhile, hosts suppress hemolymph microbiota proliferation with the help of some critical antimicrobial peptides and lectins. The hemolymph microbiota may be beneficial for hosts as resistance against external damage. In addition, the hemolymph microbiota may be utilized in aquaculture. PMID:26153452

  3. Expected Utility Illustrated: A Graphical Analysis of Gambles with More than Two Possible Outcomes

    ERIC Educational Resources Information Center

    Chen, Frederick H.

    2010-01-01

    The author presents a simple geometric method to graphically illustrate the expected utility from a gamble with more than two possible outcomes. This geometric result gives economics students a simple visual aid for studying expected utility theory and enables them to analyze a richer set of decision problems under uncertainty compared to what…
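
    A small numeric companion to the graphical treatment: the expected utility of a gamble with three possible outcomes under an assumed log utility, together with its certainty equivalent. The outcome values and probabilities are illustrative.

      import numpy as np

      # a gamble with three possible outcomes (wealth levels) and their probabilities
      outcomes = np.array([50.0, 100.0, 200.0])
      probs = np.array([0.3, 0.5, 0.2])

      u = np.log                          # an assumed risk-averse (concave) utility function
      eu = np.dot(probs, u(outcomes))     # expected utility, sum_i p_i * u(x_i)
      ev = np.dot(probs, outcomes)        # expected value of the gamble
      ce = np.exp(eu)                     # certainty equivalent: u(ce) = eu
      print(f"expected value {ev:.1f}, expected utility {eu:.3f}, certainty equivalent {ce:.1f}")
      # for a concave u the certainty equivalent falls below the expected value (risk aversion)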

  4. Utility Maximization with Addictive Consumption Habit Formation in Incomplete Semimartingale Markets

    E-print Network

    Yu, Xiang

    2011-01-01

    This paper studies the problem of continuous-time utility maximization of consumption together with addictive habit formation in general incomplete semimartingale financial markets. By introducing auxiliary state processes and a modified dual space, we embed our original problem into an auxiliary time-separable utility maximization problem with a shadow random endowment. We establish existence and uniqueness of the optimal solution using a convex duality approach on the product space, defining the primal value function on both the initial wealth and the initial habit. We also provide market-independent sufficient conditions, both on the stochastic discounting processes for the habit formation process and on the utility function, for several key assertions of our main results to hold true.

  5. OPTUM : Optimum Portfolio Tool for Utility Maximization documentation and user's guide.

    SciTech Connect

    VanKuiken, J. C.; Jusko, M. J.; Samsa, M. E.; Decision and Information Sciences

    2008-09-30

    The Optimum Portfolio Tool for Utility Maximization (OPTUM) is a versatile and powerful tool for selecting, optimizing, and analyzing portfolios. The software introduces a compact interface that facilitates problem definition, complex constraint specification, and portfolio analysis. The tool allows simple comparisons between user-preferred choices and optimized selections. OPTUM uses a portable, efficient, mixed-integer optimization engine (lp-solve) to derive the optimal mix of projects that satisfies the constraints and maximizes the total portfolio utility. OPTUM provides advanced features, such as convenient menus for specifying conditional constraints and specialized graphical displays of the optimal frontier and alternative solutions to assist in sensitivity visualization. OPTUM can be readily applied to other nonportfolio, resource-constrained optimization problems.

  6. Expectation Maximization

    E-print Network

    Gogate, Vibhav

    Lecture slides on expectation maximization, covering: 1. What's a multivariate Gaussian? 2. What's a mixture model? Mixtures of Gaussians are illustrated on the Old Faithful data set (duration of last eruption versus time to eruption), comparing a single Gaussian fit with a Gaussian mixture.

  7. The Expected Utility of Movement (Neuroeconomics: Decision Making and the Brain, Elsevier, 2009)

    E-print Network

    Maloney, Laurence T.

    A chapter from Neuroeconomics: Decision Making and the Brain (Elsevier, 2009). Section topics include movement under risk, decision making under risk, and neural correlates of motor and cognitive decisions. Movement is a form of decision making, as we choose one of many possible movement strategies to accomplish any given…

  8. Expectation-maximization algorithms for learning a finite mixture of univariate survival time distributions from partially specified class values

    SciTech Connect

    Lee, Youngrok

    2013-05-15

    Heterogeneity exists in a data set when samples from different classes are merged into the data set. Finite mixture models can be used to represent a survival time distribution on a heterogeneous patient group by the proportions of each class and by the survival time distribution within each class as well. The heterogeneous data set cannot be explicitly decomposed into homogeneous subgroups unless all the samples are precisely labeled by their origin classes; such impossibility of decomposition is a barrier to overcome for estimating finite mixture models. The expectation-maximization (EM) algorithm has been used to obtain maximum likelihood estimates of finite mixture models by soft decomposition of heterogeneous samples without labels for a subset or the entire set of data. In medical surveillance databases we can find partially labeled data, that is, data that are not completely unlabeled but carry only imprecise information about class values. In this study we propose new EM algorithms that take advantage of such partial labels and thus incorporate more information than traditional EM algorithms. We propose four variants of the EM algorithm named EM-OCML, EM-PCML, EM-HCML and EM-CPCML, each of which assumes a specific mechanism of missing class values. We conducted a simulation study on exponential survival trees with five classes and showed that the advantages of incorporating a substantial amount of partially labeled data can be highly significant. We also showed that model selection based on AIC values works fairly well for selecting the best proposed algorithm on each specific data set. A case study on a real-world data set of gastric cancer provided by the Surveillance, Epidemiology and End Results (SEER) program showed the superiority of EM-CPCML not only to the other proposed EM algorithms but also to conventional supervised, unsupervised and semi-supervised learning algorithms.

  9. The role of data assimilation in maximizing the utility of geospace observations (Invited)

    NASA Astrophysics Data System (ADS)

    Matsuo, T.

    2013-12-01

    Data assimilation can facilitate maximizing the utility of existing geospace observations by offering an ultimate marriage of inductive (data-driven) and deductive (first-principles based) approaches to addressing critical questions in space weather. Assimilative approaches that incorporate dynamical models are, in particular, capable of making a diverse set of observations consistent with physical processes included in a first-principles model, and allowing unobserved physical states to be inferred from observations. These points will be demonstrated in the context of the application of an ensemble Kalman filter (EnKF) to a thermosphere and ionosphere general circulation model. An important attribute of this approach is that the feedback between plasma and neutral variables is self-consistently treated both in the forecast model as well as in the assimilation scheme. This takes advantage of the intimate coupling between the thermosphere and ionosphere described in general circulation models to enable the inference of unobserved thermospheric states from the relatively plentiful observations of the ionosphere. Given the ever-growing infrastructure for the global navigation satellite system, this is indeed a promising prospect for geospace data assimilation. In principle, similar approaches can be applied to any geospace observing systems to extract more geophysical information from a given set of observations than would otherwise be possible.

  10. MaxBin: an automated binning method to recover individual genomes from metagenomes using an expectation-maximization algorithm

    PubMed Central

    2014-01-01

    Background Recovering individual genomes from metagenomic datasets allows access to uncultivated microbial populations that may have important roles in natural and engineered ecosystems. Understanding the roles of these uncultivated populations has broad application in ecology, evolution, biotechnology and medicine. Accurate binning of assembled metagenomic sequences is an essential step in recovering the genomes and understanding microbial functions. Results We have developed a binning algorithm, MaxBin, which automates the binning of assembled metagenomic scaffolds using an expectation-maximization algorithm after the assembly of metagenomic sequencing reads. Binning of simulated metagenomic datasets demonstrated that MaxBin had high levels of accuracy in binning microbial genomes. MaxBin was used to recover genomes from metagenomic data obtained through the Human Microbiome Project, which demonstrated its ability to recover genomes from real metagenomic datasets with variable sequencing coverages. Application of MaxBin to metagenomes obtained from microbial consortia adapted to grow on cellulose allowed genomic analysis of new, uncultivated, cellulolytic bacterial populations, including an abundant myxobacterial population distantly related to Sorangium cellulosum that possessed a much smaller genome (5 MB versus 13 to 14 MB) but has a more extensive set of genes for biomass deconstruction. For the cellulolytic consortia, the MaxBin results were compared to binning using emergent self-organizing maps (ESOMs) and differential coverage binning, demonstrating that it performed comparably to these methods but had distinct advantages in automation, resolution of related genomes and sensitivity. Conclusions The automatic binning software that we developed successfully classifies assembled sequences in metagenomic datasets into recovered individual genomes. The isolation of dozens of species in cellulolytic microbial consortia, including a novel species of myxobacteria that has the smallest genome among all sequenced aerobic myxobacteria, was easily achieved using the binning software. This work demonstrates that the processes required for recovering genomes from assembled metagenomic datasets can be readily automated, an important advance in understanding the metabolic potential of microbes in natural environments. MaxBin is available at https://sourceforge.net/projects/maxbin/. PMID:25136443

  11. Optimal Decision-Making of Countermeasures by Estimating Their Expected Utilities

    NASA Astrophysics Data System (ADS)

    Park, So Ryoung; Noh, Sanguk

    This paper investigates the autonomous decision-making process of selecting alternative countermeasures against threats in electronic warfare settings. We introduce a threat model, which represents a specific threat pattern, and a methodology that decides the best countermeasure against real-time threats using decision theory. To determine the optimal countermeasure, we model the probabilities of the effects of countermeasures, if executed, and combine the probabilities with their utilities. This methodology, based upon the inductive threat model, calculates the expected utilities of the countermeasures applicable in a given situation and provides an intelligent command and control agent with the best countermeasure to the threats. We present empirical results that demonstrate the agent's capabilities of choosing countermeasures to threats in simulated electronic warfare settings.
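
    As a hedged illustration of the selection rule described above (the countermeasures, outcome probabilities, and utilities below are invented placeholders, not values from the paper), the choice amounts to computing EU(c) = sum over outcomes of P(outcome | c) * U(outcome) for each applicable countermeasure c and selecting the one with the largest value:

        # Hypothetical countermeasures with outcome probabilities (made-up numbers).
        countermeasures = {
            "chaff":   {"defeat": 0.6, "degrade": 0.3, "no_effect": 0.1},
            "jamming": {"defeat": 0.4, "degrade": 0.5, "no_effect": 0.1},
        }
        utility = {"defeat": 1.0, "degrade": 0.4, "no_effect": 0.0}

        def expected_utility(outcome_probs, utility):
            # EU(c) = sum_o P(o | c) * U(o)
            return sum(p * utility[o] for o, p in outcome_probs.items())

        best = max(countermeasures,
                   key=lambda c: expected_utility(countermeasures[c], utility))
        for c, probs in countermeasures.items():
            print(c, round(expected_utility(probs, utility), 3))  # chaff 0.72, jamming 0.6
        print("selected:", best)                                  # selected: chaff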

  12. Expected Utility Based Decision Making under Z-Information and Its Application

    PubMed Central

    Aliev, Rashad R.; Mraiziq, Derar Atallah Talal; Huseynov, Oleg H.

    2015-01-01

    Real-world decision relevant information is often partially reliable. The reasons are partial reliability of the source of information, misperceptions, psychological biases, incompetence, and so forth. Z-numbers based formalization of information (Z-information) represents a natural language (NL) based value of a variable of interest in line with the related NL based reliability. What is important is that Z-information not only is the most general representation of real-world imperfect information but also has the highest descriptive power from human perception point of view as compared to fuzzy number. In this study, we present an approach to decision making under Z-information based on direct computation over Z-numbers. This approach utilizes expected utility paradigm and is applied to a benchmark decision problem in the field of economics. PMID:26366163

  13. American trends in expectant management utilization for prostate cancer from 2000 to 2009

    PubMed Central

    Maurice, Matthew J.; Abouassaly, Robert; Zhu, Hui

    2014-01-01

    Introduction: The overtreatment of early prostate cancer has become a major public health concern. Expectant management (EM) is a strategy to minimize overtreatment, but little is known about its pattern of use. We sought to examine national EM utilization over the preceding decade. Methods: We examined prostate cancer treatment utilization from 2000 to 2009 using the National Cancer Database. EM use was analyzed in relation to other treatments and by cancer stage, age group, Charlson score, and hospital practice setting. Results: Overall, 109 997 (8.2%) men were managed initially with EM. EM usage remained stable at 7.6% to 9.5% from 2000 to 2009, with no appreciable increase for low-stage cancers. Usage was only slightly higher in elderly patients and in patients with multiple comorbidities. Veterans Affairs and low-volume hospitals had much higher and increasing EM rates (range: 18.8%–29.8% and 15.1%–24.2%, respectively), compared to community hospitals, comprehensive cancer centres, and teaching hospitals, which showed no increased adoption. On further analysis, EM use remained high for low-stage cancers at Veterans Affairs and low-volume hospitals (24.0% and 19.1%, respectively), regardless of age or comorbidity, a pattern not shared by other practice settings. Conclusions: EM utilization remained low and stable over the last decade, regardless of disease or patient characteristics. Conversely, Veterans Affairs and low-volume hospitals led the trend in national EM adoption, particularly in men with low-stage cancers and limited life expectancies. The limitations of this dataset preclude any determination of the appropriateness of EM utilization. Nonetheless, further study is needed to identify factors influencing EM adoption to ensure its proper use in the future. PMID:25485003

  14. Maximizing coupling-efficiency of high-power diode lasers utilizing hybrid assembly technology

    NASA Astrophysics Data System (ADS)

    Zontar, D.; Dogan, M.; Fulghum, S.; Müller, T.; Haag, S.; Brecher, C.

    2015-03-01

    In this paper, we present hybrid assembly technology to maximize coupling efficiency for spatially combined laser systems. High-quality components, such as center-turned focusing units, as well as suitable assembly strategies, are necessary to obtain the highest possible output ratios. Alignment strategies are challenging tasks due to their complexity and sensitivity. Especially in low-volume production, fully automated systems are economically at a disadvantage, as operator experience is often expensive. However, the reproducibility and quality of automatically assembled systems can be superior. Therefore, automated and manual assembly techniques are combined to obtain high coupling efficiency while preserving maximum flexibility. The paper describes the equipment and software necessary to enable hybrid assembly processes. Micromanipulator technology with high step resolution and six degrees of freedom provides a large number of possible evaluation points. Automated algorithms are necessary to speed up data gathering and alignment and to efficiently utilize the available granularity for manual assembly processes. Furthermore, an engineering environment is presented to enable rapid prototyping of automation tasks with simultaneous data evaluation. Integration with simulation environments, e.g. Zemax, allows the verification of assembly strategies in advance. Data-driven decision making ensures constant high quality, documents the assembly process, and is a basis for further improvement. The hybrid assembly technology has been applied in several applications with efficiencies above 80% and is discussed in this paper. High coupling efficiency has been achieved with minimized assembly as a result of semi-automated alignment. This paper focuses on hybrid automation for optimizing and attaching turning mirrors and collimation lenses.

  15. From Ambiguity Aversion to a Generalized Expected Utility. Modeling Preferences in a Quantum Probabilistic Framework

    E-print Network

    Aerts, Diederik

    2015-01-01

    Ambiguity and ambiguity aversion have been widely studied in decision theory and economics both at a theoretical and an experimental level. After Ellsberg's seminal studies challenging subjective expected utility theory (SEUT), several (mainly normative) approaches have been put forward to reproduce ambiguity aversion and Ellsberg-type preferences. However, Machina and other authors have pointed out some fundamental difficulties of these generalizations of SEUT to cope with some variants of Ellsberg's thought experiments, which has recently been experimentally confirmed. Starting from our quantum modeling approach to human cognition, we develop here a general probabilistic framework to model human decisions under uncertainty. We show that our quantum theoretical model faithfully represents different sets of data collected on both the Ellsberg and the Machina paradox situations, and is flexible enough to describe different subjective attitudes with respect to ambiguity. Our approach opens the way toward a quan...

  16. The behavioral economics of consumer brand choice: patterns of reinforcement and utility maximization.

    PubMed

    Foxall, Gordon R; Oliveira-Castro, Jorge M; Schrezenmaier, Teresa C

    2004-06-30

    Purchasers of fast-moving consumer goods generally exhibit multi-brand choice, selecting apparently randomly among a small subset or "repertoire" of tried and trusted brands. Their behavior shows both matching and maximization, though it is not clear just what the majority of buyers are maximizing. Each brand attracts, however, a small percentage of consumers who are 100%-loyal to it during the period of observation. Some of these are exclusively buyers of premium-priced brands who are presumably maximizing informational reinforcement because their demand for the brand is relatively price-insensitive or inelastic. Others buy exclusively the cheapest brands available and can be assumed to maximize utilitarian reinforcement since their behavior is particularly price-sensitive or elastic. Between them are the majority of consumers whose multi-brand buying takes the form of selecting a mixture of economy- and premium-priced brands. Based on the analysis of buying patterns of 80 consumers for 9 product categories, the paper examines the continuum of consumers so defined and seeks to relate their buying behavior to the question of how and what consumers maximize. PMID:15157975

  17. Social and Professional Participation of Individuals Who Are Deaf: Utilizing the Psychosocial Potential Maximization Framework

    ERIC Educational Resources Information Center

    Jacobs, Paul G.; Brown, P. Margaret; Paatsch, Louise

    2012-01-01

    This article documents a strength-based understanding of how individuals who are deaf maximize their social and professional potential. This exploratory study was conducted with 49 adult participants who are deaf (n = 30) and who have typical hearing (n = 19) residing in America, Australia, England, and South Africa. The findings support a…

  18. Illustrating Caffeine's Pharmacological and Expectancy Effects Utilizing a Balanced Placebo Design.

    ERIC Educational Resources Information Center

    Lotshaw, Sandra C.; And Others

    1996-01-01

    Hypothesizes that pharmacological and expectancy effects may be two principles that govern caffeine consumption in the same way they affect other drug use. Tests this theory through a balanced placebo design on 100 male undergraduate students. Expectancy set and caffeine content appeared equally powerful, and worked additionally, to affect…

  19. Maximizing the utility of monitoring to the adaptive management of natural resources

    USGS Publications Warehouse

    Kendall, William L.; Moore, Clinton T.

    2012-01-01

    Data collection is an important step in any investigation about the structure or processes related to a natural system. In a purely scientific investigation (experiments, quasi-experiments, observational studies), data collection is part of the scientific method, preceded by the identification of hypotheses and the design of any manipulations of the system to test those hypotheses. Data collection and the manipulations that precede it are ideally designed to maximize the information that is derived from the study. That is, such investigations should be designed for maximum power to evaluate the relative validity of the hypotheses posed. When data collection is intended to inform the management of ecological systems, we call it monitoring. Note that our definition of monitoring encompasses a broader range of data-collection efforts than some alternative definitions – e.g. Chapter 3. The purpose of monitoring as we use the term can vary, from surveillance or “thumb on the pulse” monitoring (see Nichols and Williams 2006), intended to detect changes in a system due to any non-specified source (e.g. the North American Breeding Bird Survey), to very specific and targeted monitoring of the results of specific management actions (e.g. banding and aerial survey efforts related to North American waterfowl harvest management). Although a role of surveillance monitoring is to detect unanticipated changes in a system, the same result is possible from a collection of targeted monitoring programs distributed across the same spatial range (Box 4.1). In the face of limited budgets and many specific management questions, tying monitoring as closely as possible to management needs is warranted (Nichols and Williams 2006). Adaptive resource management (ARM; Walters 1986, Williams 1997, Kendall 2001, Moore and Conroy 2006, McCarthy and Possingham 2007, Conroy et al. 2008a) provides a context and specific purpose for monitoring: to evaluate decisions with respect to achievement of specific management objectives; and to evaluate the relative validity of predictive system models. This latter purpose is analogous to the role of data collection within the scientific method, in a research context.

  20. Restoration of lost frequency in OpenPET imaging: comparison between the method of convex projections and the maximum likelihood expectation maximization method.

    PubMed

    Tashima, Hideaki; Katsunuma, Takayuki; Kudo, Hiroyuki; Murayama, Hideo; Obi, Takashi; Suga, Mikio; Yamaya, Taiga

    2014-07-01

    We are developing a new PET scanner based on the "OpenPET" geometry, which consists of two detector rings separated by a gap. One point that requires attention is that OpenPET image reconstruction is an incomplete inverse problem in which low-frequency components are truncated. In our previous simulations and experiments, however, OpenPET imaging was made feasible by applying iterative image reconstruction methods. We therefore expect that iterative methods have a restorative effect that compensates for the lost frequencies. There are two types of reconstruction methods for improving image quality when data truncation exists: iterative methods such as maximum-likelihood expectation maximization (ML-EM), and analytical image reconstruction followed by the method of convex projections, which had not previously been employed for OpenPET. In this study, therefore, we propose a method for applying the latter approach to OpenPET image reconstruction and compare it with ML-EM. We found that the proposed analytical method could reduce the occurrence of image artifacts caused by the lost frequencies. A similar restoration effect was observed in ML-EM image reconstruction with no additional restoration method applied. We therefore conclude that the method of convex projections and ML-EM have a similar restorative effect that compensates for the lost frequencies. PMID:24879065
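
    For readers unfamiliar with ML-EM, the following is a generic sketch of the multiplicative ML-EM update used in emission tomography. It is not the authors' OpenPET implementation; the system matrix and data here are arbitrary placeholders.

        import numpy as np

        def mlem(A, y, n_iter=50):
            """Generic ML-EM iteration for emission tomography (sketch).

            A : (n_detectors, n_voxels) non-negative system matrix
            y : (n_detectors,) measured counts
            Returns a non-negative image estimate.
            """
            x = np.ones(A.shape[1])             # uniform initial image
            sens = A.sum(axis=0)                # sensitivity image, A^T 1
            sens[sens == 0] = 1e-12             # guard against empty columns
            for _ in range(n_iter):
                proj = A @ x                    # forward projection
                proj[proj == 0] = 1e-12         # avoid division by zero
                x *= (A.T @ (y / proj)) / sens  # multiplicative EM update
            return x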

  1. Utilization of negative beat-frequencies for maximizing the update-rate of OFDR

    NASA Astrophysics Data System (ADS)

    Gabai, Haniel; Botsev, Yakov; Hahami, Meir; Eyal, Avishay

    2015-07-01

    In traditional OFDR systems, the backscattered profile of a sensing fiber is inefficiently duplicated into the negative band of the spectrum. In this work, we present a new OFDR design and algorithm that remove this redundancy and make use of negative beat frequencies. In contrast to conventional OFDR designs, it facilitates efficient use of the available system bandwidth and enables distributed sensing with the maximum allowable interrogation update-rate for a given fiber length. To enable the reconstruction of negative beat frequencies, an I/Q-type receiver is used. In this receiver, both the in-phase (I) and quadrature (Q) components of the backscatter field are detected. Following detection, both components are digitally combined to produce a complex backscatter signal. Accordingly, due to its asymmetric nature, the produced spectrum is not corrupted by the appearance of negative beat-frequencies. Here, via a comprehensive computer simulation, we show that in contrast to conventional OFDR systems, I/Q OFDR can be operated at the maximum interrogation update-rate for a given fiber length. In addition, we experimentally demonstrate, for the first time, the ability of I/Q OFDR to utilize negative beat-frequencies for long-range distributed sensing.
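
    A minimal numerical sketch of the I/Q idea follows; the sample rate and beat frequency are invented values. Combining the detected I and Q components into a complex signal yields an asymmetric spectrum, so a negative beat frequency appears as a single unambiguous peak instead of being mirrored into the positive band.

        import numpy as np

        fs = 1.0e6                      # sample rate in Hz (illustrative)
        t = np.arange(4096) / fs
        f_beat = -120e3                 # a "negative" beat frequency from a reflector

        # In-phase and quadrature components of the detected beat signal
        I = np.cos(2 * np.pi * f_beat * t)
        Q = np.sin(2 * np.pi * f_beat * t)

        z = I + 1j * Q                  # complex backscatter signal
        freqs = np.fft.fftshift(np.fft.fftfreq(len(t), 1 / fs))
        spec_complex = np.fft.fftshift(np.abs(np.fft.fft(z)))
        spec_real = np.fft.fftshift(np.abs(np.fft.fft(I)))   # I-only detection

        # The complex spectrum has a single peak near -120 kHz, while the real-signal
        # spectrum shows mirrored peaks at +/-120 kHz (the redundancy removed above).
        print(freqs[np.argmax(spec_complex)])   # approx -120 kHz
        print(freqs[np.argmax(spec_real)])      # ambiguous: +/-120 kHz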

  2. Unsupervised learning applied in MER and ECG signals through Gaussians mixtures with the Expectation-Maximization algorithm and Variational Bayesian Inference.

    PubMed

    Vargas Cardona, Hernán Darío; Orozco, Álvaro Ángel; Álvarez, Mauricio A

    2013-01-01

    Automatic identification of biosignals is one of the most studied fields in biomedical engineering. In this paper, we present an approach for the unsupervised recognition of biomedical signals: microelectrode recordings (MER) and electrocardiography (ECG) signals. The unsupervised learning is based on classical and Bayesian estimation theory. We employ Gaussian mixture models with two estimation methods. The first, the Expectation-Maximization (EM) algorithm, is derived from frequentist estimation theory. The second, variational inference, is obtained from Bayesian probabilistic estimation. In this framework, both methods are used to estimate the parameters of the Gaussian mixtures. The mixture models are used for unsupervised pattern classification through the responsibility matrix. The algorithms are applied to two real databases acquired in Parkinson's disease surgeries and electrocardiograms. The results show an accuracy above 85% for MER and 90% for ECG in the identification of two classes. These results are statistically equal to or even better than those of parametric (naive Bayes) and nonparametric (K-nearest neighbor) classifiers. PMID:24110690
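
    To illustrate the EM-with-responsibilities machinery referred to above, here is a minimal one-dimensional, two-component Gaussian mixture fitted by EM. This is a generic textbook sketch, not the authors' implementation, and the variational Bayesian variant is omitted.

        import numpy as np

        def gmm_em_1d(x, n_iter=100):
            """Minimal EM for a two-component 1-D Gaussian mixture (sketch)."""
            rng = np.random.default_rng(0)
            mu = rng.choice(x, size=2)                     # initial means
            var = np.array([x.var(), x.var()])
            pi = np.array([0.5, 0.5])
            for _ in range(n_iter):
                # E step: responsibilities r[n, k] = p(component k | x_n)
                dens = (np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
                        / np.sqrt(2 * np.pi * var))
                r = pi * dens
                r /= r.sum(axis=1, keepdims=True)
                # M step: re-estimate weights, means, and variances
                Nk = r.sum(axis=0)
                pi = Nk / len(x)
                mu = (r * x[:, None]).sum(axis=0) / Nk
                var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
            return pi, mu, var, r

        # Unsupervised labels follow from the responsibility matrix:
        #   pi, mu, var, r = gmm_em_1d(signal_features); labels = r.argmax(axis=1)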

  3. GUIDELINES FOR ENSURING AND MAXIMIZING THE QUALITY, OBJECTIVITY, UTILITY, AND INTEGRITY OF INFORMATION DISSEMINATED BY THE ENVIRONMENTAL PROTECTION AGENCY

    EPA Science Inventory

    Developed in response to guidelines issued by the Office of Management and Budget (OMB) under Section 515(a) of the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Public Law 106-554; H.R. 5658), the Guidelines for Ensuring and Maximizing the Quality, Ob...

  4. Image reconstruction of single photon emission computed tomography (SPECT) on a pebble bed reactor (PBR) using expectation maximization and exact inversion algorithms: Comparison study by means of numerical phantom

    NASA Astrophysics Data System (ADS)

    Razali, Azhani Mohd; Abdullah, Jaafar

    2015-04-01

    Single Photon Emission Computed Tomography (SPECT) is a well-known imaging technique used in medical applications, and it is one of the medical imaging modalities that have made the diagnosis and treatment of disease possible. However, the SPECT technique is not limited to the medical sector. Many works have been carried out to adapt the same concept, using high-energy photon emission to diagnose process malfunctions in critical industrial systems such as chemical reaction engineering research laboratories, as well as the oil and gas, petrochemical and petrochemical refining industries. Motivated by the vast applications of the SPECT technique, this work studies the application of SPECT to a Pebble Bed Reactor (PBR) using a numerical phantom of pebbles inside the PBR core. From the cross-sectional images obtained from SPECT, the behavior of pebbles inside the core can be analyzed for further improvement of the PBR design. As the quality of the reconstructed image is largely dependent on the algorithm used, this work compares two image reconstruction algorithms for SPECT, namely the Expectation Maximization Algorithm and the Exact Inversion Formula. The results obtained from the Exact Inversion Formula showed better image contrast and sharpness, and shorter computational time, than the Expectation Maximization Algorithm.

  5. Image reconstruction of single photon emission computed tomography (SPECT) on a pebble bed reactor (PBR) using expectation maximization and exact inversion algorithms: Comparison study by means of numerical phantom

    SciTech Connect

    Razali, Azhani Mohd Abdullah, Jaafar

    2015-04-29

    Single Photon Emission Computed Tomography (SPECT) is a well-known imaging technique used in medical applications, and it is one of the medical imaging modalities that have made the diagnosis and treatment of disease possible. However, the SPECT technique is not limited to the medical sector. Many works have been carried out to adapt the same concept, using high-energy photon emission to diagnose process malfunctions in critical industrial systems such as chemical reaction engineering research laboratories, as well as the oil and gas, petrochemical and petrochemical refining industries. Motivated by the vast applications of the SPECT technique, this work studies the application of SPECT to a Pebble Bed Reactor (PBR) using a numerical phantom of pebbles inside the PBR core. From the cross-sectional images obtained from SPECT, the behavior of pebbles inside the core can be analyzed for further improvement of the PBR design. As the quality of the reconstructed image is largely dependent on the algorithm used, this work compares two image reconstruction algorithms for SPECT, namely the Expectation Maximization Algorithm and the Exact Inversion Formula. The results obtained from the Exact Inversion Formula showed better image contrast and sharpness, and shorter computational time, than the Expectation Maximization Algorithm.

  6. Expectant Mothers Maximizing Opportunities: Maternal Characteristics Moderate Multifactorial Prenatal Stress in the Prediction of Birth Weight in a Sample of Children Adopted at Birth

    PubMed Central

    Brotnow, Line; Reiss, David; Stover, Carla S.; Ganiban, Jody; Leve, Leslie D.; Neiderhiser, Jenae M.; Shaw, Daniel S.; Stevens, Hanna E.

    2015-01-01

    Background: Mothers’ stress in pregnancy is considered an environmental risk factor in child development. Multiple stressors may combine to increase risk, and maternal personal characteristics may offset the effects of stress. This study aimed to test the effect of 1) multifactorial prenatal stress, integrating objective “stressors” and subjective “distress” and 2) the moderating effects of maternal characteristics (perceived social support, self-esteem and specific personality traits) on infant birthweight. Method: Hierarchical regression modeling was used to examine cross-sectional data on 403 birth mothers and their newborns from an adoption study. Results: Distress during pregnancy showed a statistically significant association with birthweight (R² = 0.032, F(2, 398) = 6.782, p = .001). The hierarchical regression model revealed an almost two-fold increase in variance of birthweight predicted by stressors as compared with distress measures (ΔR² = 0.049, F(4, 394) = 5.339, p < .001). Further, maternal characteristics moderated this association (ΔR² = 0.031, F(4, 389) = 3.413, p = .009). Specifically, the expected benefit to birthweight as a function of higher SES was observed only for mothers with lower levels of harm-avoidance and higher levels of perceived social support. Importantly, the results were not better explained by prematurity, pregnancy complications, exposure to drugs, alcohol or environmental toxins. Conclusions: The findings support multidimensional theoretical models of prenatal stress. Although both objective stressors and subjectively measured distress predict birthweight, they should be considered distinct and cumulative components of stress. This study further highlights that jointly considering risk factors and protective factors in pregnancy improves the ability to predict birthweight. PMID:26544958

  7. Expected Sequence Similarity Maximization

    E-print Network

    Allauzen, Cyril; Mohri, Mehryar

    …speed-up by two orders of magnitude with respect to the original method of Tromble et al. (2008) and by a factor of … large-vocabulary speech recognition (Goel and Byrne, 2000) and machine translation (Kumar and Byrne, 2004; Tromble et al. …)

  8. Managing Expectations: Results from Case Studies of US Water Utilities on Preparing for, Coping with, and Adapting to Extreme Events

    NASA Astrophysics Data System (ADS)

    Beller-Simms, N.; Metchis, K.

    2014-12-01

    Water utilities, reeling from the increased impacts of successive extreme events such as floods, droughts, and derechos, are taking a more proactive role in preparing for future incursions. A recent study by Federal and water-foundation investigators reveals how six US water utilities and their regions prepared for, responded to, and coped with recent extreme weather and climate events, and the lessons they are using to plan future adaptation and resilience activities. Two case studies will be highlighted. (1) Sonoma County, CA, has had alternating floods and severe droughts. In 2009, this area, home to competing water users, namely agricultural crops, wineries, tourism, and fisheries, faced a three-year drought accompanied at the end by intense frosts. Competing uses of water threatened the grape harvest, endangered the fish industry, and resulted in a series of regulations and court cases. Five years later, new efforts by partners across the watershed have identified mutual opportunities for increased basin sustainability in the face of a changing climate. (2) Washington DC had a derecho in late June 2012, which curtailed water, communications, and power delivery during a record heat spell that affected hundreds of thousands of residents and lasted over the height of the tourist-intensive July 4th holiday. Lessons from this event were applied three months later in anticipation of the approaching Superstorm Sandy. This study will help other communities improve their resiliency in the face of future climate extremes. For example, it revealed that (1) communities are planning for multiple types and occurrences of extreme events, which are becoming more severe and frequent and are impacting communities that are expanding into more vulnerable areas, and (2) decisions by one sector cannot be made in a vacuum and require the scientific, sectoral, and citizen communities to work toward sustainable solutions.

  9. FINANCIAL MARKETS WITH MEMORY II: INNOVATION PROCESSES AND EXPECTED UTILITY MAXIMIZATION

    E-print Network

    Inoue, Akihiko

    …the financial market with S(·) is complete and the Black-Scholes formula holds in it. The difference between the market with S(·) and the Black-Scholes market is illustrated…

  10. On deciding to have a lobotomy: either lobotomies were justified or decisions under risk should not always seek to maximise expected utility.

    PubMed

    Cooper, Rachel

    2014-02-01

    In the 1940s and 1950s thousands of lobotomies were performed on people with mental disorders. These operations were known to be dangerous, but thought to offer great hope. Nowadays, the lobotomies of the 1940s and 1950s are widely condemned. The consensus is that the practitioners who employed them were, at best, misguided enthusiasts, or, at worst, evil. In this paper I employ standard decision theory to understand and assess shifts in the evaluation of lobotomy. Textbooks of medical decision making generally recommend that decisions under risk are made so as to maximise expected utility (MEU). I show that using this procedure suggests that the 1940s and 1950s practice of psychosurgery was justifiable. In making sense of this finding we have a choice: either we can accept that psychosurgery was justified, in which case condemnation of the lobotomists is misplaced; or we can conclude that the use of formal decision procedures, such as MEU, is problematic. PMID:24449251

  11. Evidence for surprise minimization over value maximization in choice behavior.

    PubMed

    Schwartenbeck, Philipp; FitzGerald, Thomas H B; Mathys, Christoph; Dolan, Ray; Kronbichler, Martin; Friston, Karl

    2015-01-01

    Classical economic models are predicated on the idea that the ultimate aim of choice is to maximize utility or reward. In contrast, an alternative perspective highlights the fact that adaptive behavior requires agents to model their environment and minimize surprise about the states they frequent. We propose that choice behavior can be more accurately accounted for by surprise minimization than by reward or utility maximization alone. Minimizing surprise makes a prediction at variance with expected utility models; namely, that in addition to attaining valuable states, agents attempt to maximize the entropy over outcomes and thus 'keep their options open'. We tested this prediction using a simple binary choice paradigm and show that human decision-making is better explained by surprise minimization than by utility maximization. Furthermore, we replicated this entropy-seeking behavior in a control task with no explicit utilities. These findings highlight a limitation of purely economic motivations in explaining choice behavior and instead emphasize the importance of belief-based motivations. PMID:26564686
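
    The 'keep their options open' prediction can be illustrated with a toy calculation in which an entropy bonus over outcome probabilities is added to expected utility. The policies, probabilities, utilities, and the weight beta below are invented for illustration and do not reproduce the authors' formulation.

        import numpy as np

        def entropy(p):
            p = np.asarray(p, dtype=float)
            p = p[p > 0]
            return -(p * np.log(p)).sum()

        # Two hypothetical policies with outcome distributions and utilities (made up).
        policies = {
            "certain": {"probs": [1.0, 0.0], "utils": [0.60, 0.00]},  # one sure outcome
            "spread":  {"probs": [0.5, 0.5], "utils": [0.55, 0.55]},  # keeps options open
        }
        beta = 0.2   # weight on the entropy term (illustrative)

        for name, pol in policies.items():
            eu = float(np.dot(pol["probs"], pol["utils"]))
            score = eu + beta * entropy(pol["probs"])
            print(name, round(eu, 3), round(score, 3))
        # Pure expected utility prefers "certain" (0.60 > 0.55); the entropy-augmented
        # score prefers "spread" (0.55 + 0.2*ln 2 = 0.689 > 0.60).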

  12. Evidence for surprise minimization over value maximization in choice behavior

    PubMed Central

    Schwartenbeck, Philipp; FitzGerald, Thomas H. B.; Mathys, Christoph; Dolan, Ray; Kronbichler, Martin; Friston, Karl

    2015-01-01

    Classical economic models are predicated on the idea that the ultimate aim of choice is to maximize utility or reward. In contrast, an alternative perspective highlights the fact that adaptive behavior requires agents to model their environment and minimize surprise about the states they frequent. We propose that choice behavior can be more accurately accounted for by surprise minimization than by reward or utility maximization alone. Minimizing surprise makes a prediction at variance with expected utility models; namely, that in addition to attaining valuable states, agents attempt to maximize the entropy over outcomes and thus ‘keep their options open’. We tested this prediction using a simple binary choice paradigm and show that human decision-making is better explained by surprise minimization than by utility maximization. Furthermore, we replicated this entropy-seeking behavior in a control task with no explicit utilities. These findings highlight a limitation of purely economic motivations in explaining choice behavior and instead emphasize the importance of belief-based motivations. PMID:26564686

  13. Prognostic utility of predischarge dipyridamole-thallium imaging compared to predischarge submaximal exercise electrocardiography and maximal exercise thallium imaging after uncomplicated acute myocardial infarction

    SciTech Connect

    Gimple, L.W.; Hutter, A.M. Jr.; Guiney, T.E.; Boucher, C.A. )

    1989-12-01

    The prognostic value of predischarge dipyridamole-thallium scanning after uncomplicated myocardial infarction was determined by comparison with submaximal exercise electrocardiography and 6-week maximal exercise thallium imaging and by correlation with clinical events. Two endpoints were defined: cardiac events and severe ischemic potential. Of the 40 patients studied, 8 had cardiac events within 6 months (1 died, 3 had myocardial infarction and 4 had unstable angina requiring hospitalization). The finding of any redistribution on dipyridamole-thallium scanning was common (77%) in these patients and had poor specificity (29%). Redistribution outside of the infarct zone, however, had equivalent sensitivity (63%) and better specificity (75%) for events (p less than 0.05). Both predischarge dipyridamole-thallium and submaximal exercise electrocardiography identified 5 of the 8 events (p = 0.04 and 0.07, respectively). The negative predictive accuracy for events for both dipyridamole-thallium and submaximal exercise electrocardiography was 88%. In addition to the 8 patients with events, 16 other patients had severe ischemic potential (6 had coronary bypass surgery, 1 had inoperable 3-vessel disease and 9 had markedly abnormal 6-week maximal exercise tests). Predischarge dipyridamole-thallium and submaximal exercise testing also identified 8 and 7 of these 16 patients with severe ischemic potential, respectively. Six of the 8 cardiac events occurred before 6-week follow-up. A maximal exercise thallium test at 6 weeks identified 1 of the 2 additional events within 6 months correctly. Thallium redistribution after dipyridamole in coronary territories outside the infarct zone is a sensitive and specific predictor of subsequent cardiac events and identifies patients with severe ischemic potential.

  14. Aging and loss decision making: increased risk aversion and decreased use of maximizing information, with correlated rationality and value maximization

    PubMed Central

    Kurnianingsih, Yoanna A.; Sim, Sam K. Y.; Chee, Michael W. L.; Mullette-Gillman, O’Dhaniel A.

    2015-01-01

    We investigated how adult aging specifically alters economic decision-making, focusing on examining alterations in uncertainty preferences (willingness to gamble) and choice strategies (what gamble information influences choices) within both the gains and losses domains. Within each domain, participants chose between certain monetary outcomes and gambles with uncertain outcomes. We examined preferences by quantifying how uncertainty modulates choice behavior as if altering the subjective valuation of gambles. We explored age-related preferences for two types of uncertainty, risk, and ambiguity. Additionally, we explored how aging may alter what information participants utilize to make their choices by comparing the relative utilization of maximizing and satisficing information types through a choice strategy metric. Maximizing information was the ratio of the expected value of the two options, while satisficing information was the probability of winning. We found age-related alterations of economic preferences within the losses domain, but no alterations within the gains domain. Older adults (OA; 61–80 years old) were significantly more uncertainty averse for both risky and ambiguous choices. OA also exhibited choice strategies with decreased use of maximizing information. Within OA, we found a significant correlation between risk preferences and choice strategy. This linkage between preferences and strategy appears to derive from a convergence to risk neutrality driven by greater use of the effortful maximizing strategy. As utility maximization and value maximization intersect at risk neutrality, this result suggests that OA are exhibiting a relationship between enhanced rationality and enhanced value maximization. While there was variability in economic decision-making measures within OA, these individual differences were unrelated to variability within examined measures of cognitive ability. Our results demonstrate that aging alters economic decision-making for losses through changes in both individual preferences and the strategies individuals employ. PMID:26029092
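
    A small worked example of the two information types described above, with hypothetical amounts and probabilities: the maximizing cue is the ratio of the options' expected values, while the satisficing cue is simply the gamble's probability of winning.

        # Hypothetical choice between a certain amount and a gamble (made-up numbers).
        certain_amount = 4.0                                  # sure option: receive $4
        gamble = {"win": (10.0, 0.45), "lose": (0.0, 0.55)}   # (outcome, probability)

        ev_gamble = sum(outcome * p for outcome, p in gamble.values())
        maximizing_info = ev_gamble / certain_amount          # ratio of expected values
        satisficing_info = gamble["win"][1]                   # probability of winning

        print(maximizing_info)    # 4.5 / 4.0 = 1.125 -> gamble has the higher expected value
        print(satisficing_info)   # 0.45 -> the gamble wins less than half the time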

  15. DEVELOPMENT OF A VALIDATED MODEL FOR USE IN MINIMIZING NOx EMISSIONS AND MAXIMIZING CARBON UTILIZATION WHEN CO-FIRING BIOMASS WITH COAL

    SciTech Connect

    Larry G. Felix; P. Vann Bush

    2003-01-29

    This is the ninth Quarterly Technical Report for DOE Cooperative Agreement No. DE-FC26-00NT40895. A statement of the project objectives is included in the Introduction of this report. The pilot-scale testing phase of the project has been completed. Calculations are essentially completed for implementing a modeling approach to combine reaction times and temperature distributions from computational fluid dynamic models of the pilot-scale combustion furnace with char burnout and chemical reaction kinetics to predict NOx emissions and unburned carbon levels in the furnace exhaust. The REI Configurable Fireside Simulator (CFS) has proven to be an essential component to provide input for these calculations. Niksa Energy Associates expects to deliver their final report in February 2003. Work has continued on the project final report.

  16. DEVELOPMENT OF A VALIDATED MODEL FOR USE IN MINIMIZING NOx EMISSIONS AND MAXIMIZING CARBON UTILIZATION WHEN CO-FIRING BIOMASS WITH COAL

    SciTech Connect

    Larry G. Felix; P. Vann Bush; Stephen Niksa

    2003-04-30

    In full-scale boilers, the effect of biomass cofiring on NOx and unburned carbon (UBC) emissions has been found to be site-specific. Few sets of field data are comparable, and no consistent database of information exists upon which cofiring fuel choice or injection system design can be based to assure that NOx emissions will be minimized and UBC reduced. This report presents the results of a comprehensive project that generated an extensive set of pilot-scale test data that were used to validate a new predictive model for the cofiring of biomass and coal. All testing was performed at the 3.6 MMBtu/hr (1.75 MWt) Southern Company Services/Southern Research Institute Combustion Research Facility, where a variety of burner configurations, coals, biomasses, and biomass injection schemes were utilized to generate a database of consistent, scalable, experimental results (422 separate test conditions). This database was then used to validate a new model for predicting NOx and UBC emissions from the cofiring of biomass and coal. This model is based on an Advanced Post-Processing (APP) technique that generates an equivalent network of idealized reactor elements from a conventional CFD simulation. The APP reactor network is a computational environment that allows for the incorporation of all relevant chemical reaction mechanisms and provides a new tool to quantify NOx and UBC emissions for any cofired combination of coal and biomass.

  17. Maximizing the utilization of Laminaria japonica as biomass via improvement of alginate lyase activity in a two-phase fermentation system.

    PubMed

    Oh, Yuri; Xu, Xu; Kim, Ji Young; Park, Jong Moon

    2015-08-01

    Brown seaweed contains up to 67% carbohydrates by dry weight and presents high potential as a polysaccharide feedstock for biofuel production. To effectively use brown seaweed as a biomass, degradation of alginate is the major challenge due to its complicated structure and low solubility in water. This study focuses on the isolation of alginate-degrading bacteria, determination of the optimum fermentation conditions, and comparison of the conventional single fermentation system with a two-phase fermentation system that separately uses alginate and mannitol extracted from Laminaria japonica. The maximum yields of organic acid production and volatile solids (VS) reduction obtained were 0.516 g/g and 79.7%, respectively, using the two-phase fermentation system, in which alginate fermentation was carried out at pH 7 and mannitol fermentation at pH 8. The two-phase fermentation system increased the yield of organic acid production by 1.14 times and led to a 1.45-times greater reduction of VS when compared to the conventional single fermentation system at pH 8. The results show that the two-phase fermentation system improved the utilization of alginate by separating alginate from mannitol, leading to enhanced alginate lyase activity. PMID:26098412

  18. Maximally natural supersymmetry.

    PubMed

    Dimopoulos, Savas; Howe, Kiel; March-Russell, John

    2014-09-12

    We consider 4D weak scale theories arising from 5D supersymmetric (SUSY) theories with maximal Scherk-Schwarz breaking at a Kaluza-Klein scale of several TeV. Many of the problems of conventional SUSY are avoided. Apart from 3rd family sfermions the SUSY spectrum is heavy, with only ~50% tuning at a gluino mass of ~2 TeV and a stop mass of ~650 GeV. A single Higgs doublet acquires a vacuum expectation value, so the physical Higgs boson is automatically standard-model-like. A new U(1)' interaction raises m_h to 126 GeV. For minimal tuning the associated Z', as well as the 3rd family sfermions, must be accessible to LHC13. A gravitational wave signal consistent with hints from BICEP2 is possible if inflation occurs when the extra dimensions are small. PMID:25259967

  19. Maximally Expressive Modeling

    NASA Technical Reports Server (NTRS)

    Jaap, John; Davis, Elizabeth; Richardson, Lea

    2004-01-01

    Planning and scheduling systems organize tasks into a timeline or schedule. Tasks are logically grouped into containers called models. Models are a collection of related tasks, along with their dependencies and requirements, that when met will produce the desired result. One challenging domain for a planning and scheduling system is the operation of on-board experiments for the International Space Station. In these experiments, the equipment used is among the most complex hardware ever developed; the information sought is at the cutting edge of scientific endeavor; and the procedures are intricate and exacting. Scheduling is made more difficult by a scarcity of station resources. The models to be fed into the scheduler must describe both the complexity of the experiments and procedures (to ensure a valid schedule) and the flexibilities of the procedures and the equipment (to effectively utilize available resources). Clearly, scheduling International Space Station experiment operations calls for a maximally expressive modeling schema.

  20. Aspects Of Utility Maximization With Habit Formation

    E-print Network

    Karatzas, Ioannis

  1. Generation and Transmission Maximization Model

    Energy Science and Technology Software Center (ESTSC)

    2001-04-05

    GTMax was developed to study complex marketing and system operational issues facing electric utility power systems. The model maximizes the value of the electric system taking into account not only a single system's limited energy and transmission resources but also firm contracts, independent power producer (IPP) agreements, and bulk power transaction opportunities on the spot market. GTMax maximizes net revenues of power systems by finding a solution that increases income while keeping expenses at a minimum. It does this while ensuring that market transactions and system operations are within the physical and institutional limitations of the power system. When multiple systems are simulated, GTMax identifies utilities that can successfully compete on the market by tracking hourly energy transactions, costs, and revenues. Some limitations that are modeled are power plant seasonal capabilities and terms specified in firm and IPP contracts. GTMax also considers detailed operational limitations such as power plant ramp rates and hydropower reservoir constraints.

  2. Maximally incompatible quantum observables

    E-print Network

    Teiko Heinosaari; Jussi Schultz; Alessandro Toigo; Mario Ziman

    2013-12-12

    The existence of maximally incompatible quantum observables in the sense of a minimal joint measurability region is investigated. Employing the universal quantum cloning device it is argued that only infinite dimensional quantum systems can accommodate maximal incompatibility. It is then shown that two of the most common pairs of complementary observables (position and momentum; number and phase) are maximally incompatible.

  3. Branch and bound algorithms for maximizing expected improvement functions

    E-print Network

    Ranjan, Pritam

    Article history: received 19 October 2009; received in revised form 11 May 2010. … (e.g., Jones et al., 1998; Villemonteix et al., 2006; Forrester and Jones, 2008) … to be efficient if the initial design is not too sparse or deceptive (Forrester and Jones, 2008). …

  4. Maximization, learning, and economic behavior

    PubMed Central

    Erev, Ido; Roth, Alvin E.

    2014-01-01

    The rationality assumption that underlies mainstream economic theory has proved to be a useful approximation, despite the fact that systematic violations to its predictions can be found. That is, the assumption of rational behavior is useful in understanding the ways in which many successful economic institutions function, although it is also true that actual human behavior falls systematically short of perfect rationality. We consider a possible explanation of this apparent inconsistency, suggesting that mechanisms that rest on the rationality assumption are likely to be successful when they create an environment in which the behavior they try to facilitate leads to the best payoff for all agents on average, and most of the time. Review of basic learning research suggests that, under these conditions, people quickly learn to maximize expected return. This review also shows that there are many situations in which experience does not increase maximization. In many cases, experience leads people to underweight rare events. In addition, the current paper suggests that it is convenient to distinguish between two behavioral approaches to improve economic analyses. The first, and more conventional approach among behavioral economists and psychologists interested in judgment and decision making, highlights violations of the rational model and proposes descriptive models that capture these violations. The second approach studies human learning to clarify the conditions under which people quickly learn to maximize expected return. The current review highlights one set of conditions of this type and shows how the understanding of these conditions can facilitate market design. PMID:25024182

  5. Maximally nonlocal theories cannot be maximally random.

    PubMed

    de la Torre, Gonzalo; Hoban, Matty J; Dhara, Chirag; Prettico, Giuseppe; Acín, Antonio

    2015-04-24

    Correlations that violate a Bell inequality are said to be nonlocal; i.e., they do not admit a local and deterministic explanation. Great effort has been devoted to study how the amount of nonlocality (as measured by a Bell inequality violation) serves to quantify the amount of randomness present in observed correlations. In this work we reverse this research program and ask what do the randomness certification capabilities of a theory tell us about the nonlocality of that theory. We find that, contrary to initial intuition, maximal randomness certification cannot occur in maximally nonlocal theories. We go on and show that quantum theory, in contrast, permits certification of maximal randomness in all dichotomic scenarios. We hence pose the question of whether quantum theory is optimal for randomness; i.e., is it the most nonlocal theory that allows maximal randomness certification? We answer this question in the negative by identifying a larger-than-quantum set of correlations capable of this feat. Not only are these results relevant to understanding quantum mechanics' fundamental features, but also put fundamental restrictions on device-independent protocols based on the no-signaling principle. PMID:25955039

  6. Maximizing Multi-Information

    E-print Network

    Nihat Ay; Andreas Knauf

    2007-02-01

    Stochastic interdependence of a probability distribution on a product space is measured by its Kullback-Leibler distance from the exponential family of product distributions (called multi-information). Here we investigate low-dimensional exponential families that contain the maximizers of stochastic interdependence in their closure. Based on a detailed description of the structure of probability distributions with globally maximal multi-information we obtain our main result: the exponential family of pure pair-interactions contains all global maximizers of the multi-information in its closure.
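
    For reference, the standard definition of multi-information (the Kullback-Leibler distance from the family of product distributions mentioned above) can be written as

        I(X_1, \dots, X_n) \;=\; D\!\left( p(x_1,\dots,x_n) \,\Big\|\, \prod_{i=1}^{n} p_i(x_i) \right) \;=\; \sum_{i=1}^{n} H(X_i) \;-\; H(X_1,\dots,X_n),

    where D denotes the Kullback-Leibler divergence and H the Shannon entropy. This is the standard textbook form, stated here for the reader's convenience rather than quoted from the paper.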

  7. Network Maximal Correlation

    E-print Network

    Feizi, Soheil

    2015-09-21

    Identifying nonlinear relationships in large datasets is a daunting task particularly when the form of the nonlinearity is unknown. Here, we introduce Network Maximal Correlation (NMC) as a fundamental measure to capture ...

  8. BIOMASS UTILIZATION

    EPA Science Inventory

    The biomass utilization task consists of the evaluation of a biomass conversion technology including research and development initiatives. The project is expected to provide information on co-control of pollutants, as well as, to prove the feasibility of biomass conversion techn...

  9. Uplifting Maximal Gauged Supergravities

    E-print Network

    Baron, Walter H

    2015-01-01

    Which theories have a higher dimensional origin in String/M-theory is a non trivial question and it is still far from being understood in the constrained scenario of maximal supergravities. After 35 years of progress in this direction we have found supporting evidence in favor of the idea that every electric maximal supergravity in 4 dimensions can be uplifted to M-theory. We will review the current understanding of this problem with special emphasis in the uplifting of non compact supergravities and their relation with Exceptional Generalised Geometry.

  10. Uplifting Maximal Gauged Supergravities

    E-print Network

    Walter H. Baron

    2015-12-17

    Which theories have a higher dimensional origin in String/M-theory is a non trivial question and it is still far from being understood in the constrained scenario of maximal supergravities. After 35 years of progress in this direction we have found supporting evidence in favor of the idea that every electric maximal supergravity in 4 dimensions can be uplifted to M-theory. We will review the current understanding of this problem with special emphasis in the uplifting of non compact supergravities and their relation with Exceptional Generalised Geometry.

  11. How To: Maximize Google

    ERIC Educational Resources Information Center

    Branzburg, Jeffrey

    2004-01-01

    Google is shaking out to be the leading Web search engine, with recent research from Nielsen NetRatings reporting about 40 percent of all U.S. households using the tool at least once in January 2004. This brief article discusses how teachers and students can maximize their use of Google.

  12. Information Coverage Maximization in Social Networks

    E-print Network

    Wang, Zhefeng; Liu, Qi; Yang, Yu; Ge, Yong; Chang, Biao

    2015-01-01

    Social networks, due to their popularity, have been studied extensively in recent years. A rich body of these studies is related to influence maximization, which aims to select a set of seed nodes that maximizes the expected number of active nodes at the end of the propagation process. However, the set of active nodes cannot fully represent the true coverage of information propagation. A node may be informed of the information when any of its neighbours becomes active and tries to activate it, even though this node (namely, an informed node) remains inactive. Therefore, we need to consider both active nodes and informed nodes that are aware of the information when we study the coverage of information propagation in a network. Along this line, in this paper we propose a new problem called Information Coverage Maximization that aims to maximize the expected number of both active and informed nodes. After we prove that this problem is NP-hard and submodular in the independent cascade model and the linear threshold model, we design t...
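
    Since the objective is monotone and submodular, a natural baseline is the standard greedy seed-selection heuristic evaluated by Monte-Carlo simulation. The sketch below uses this generic approach on a toy graph with an invented propagation probability; it is not the algorithm proposed in the paper, whose description is truncated above.

        import random

        def simulate_coverage(graph, seeds, p=0.1, n_sims=200, rng=random.Random(0)):
            """Monte-Carlo estimate of the expected number of active + informed nodes
            under the independent cascade model. graph: dict node -> list of neighbours."""
            total = 0
            for _ in range(n_sims):
                active, frontier = set(seeds), list(seeds)
                informed = set(seeds)
                while frontier:
                    new = []
                    for u in frontier:
                        for v in graph.get(u, []):
                            informed.add(v)                 # v hears about the information
                            if v not in active and rng.random() < p:
                                active.add(v)               # v becomes active
                                new.append(v)
                    frontier = new
                total += len(active | informed)
            return total / n_sims

        def greedy_seeds(graph, k, **kw):
            """Standard greedy selection for a monotone submodular objective."""
            seeds = []
            for _ in range(k):
                best = max((n for n in graph if n not in seeds),
                           key=lambda n: simulate_coverage(graph, seeds + [n], **kw))
                seeds.append(best)
            return seeds

        # Toy usage (hypothetical graph):
        g = {"a": ["b", "c"], "b": ["c", "d"], "c": ["d"], "d": []}
        print(greedy_seeds(g, 2, p=0.3, n_sims=100))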

  13. Creating a Bridge between Data Collection and Program Planning: A Technical Assistance Model to Maximize the Use of HIV/AIDS Surveillance and Service Utilization Data for Planning Purposes

    ERIC Educational Resources Information Center

    Logan, Jennifer A.; Beatty, Maile; Woliver, Renee; Rubinstein, Eric P.; Averbach, Abigail R.

    2005-01-01

    Over time, improvements in HIV/AIDS surveillance and service utilization data have increased their usefulness for planning programs, targeting resources, and otherwise informing HIV/AIDS policy. However, community planning groups, service providers, and health department staff often have difficulty in interpreting and applying the wide array of…

  14. Infrared Maximally Abelian Gauge

    SciTech Connect

    Mendes, Tereza; Cucchieri, Attilio; Mihara, Antonio

    2007-02-27

    The confinement scenario in Maximally Abelian gauge (MAG) is based on the concepts of Abelian dominance and of dual superconductivity. Recently, several groups pointed out the possible existence in MAG of ghost and gluon condensates with mass dimension 2, which in turn should influence the infrared behavior of ghost and gluon propagators. We present preliminary results for the first lattice numerical study of the ghost propagator and of ghost condensation for pure SU(2) theory in the MAG.

  15. Quantum-Inspired Maximizer

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2008-01-01

    A report discusses an algorithm for a new kind of dynamics based on a quantum-classical hybrid: a quantum-inspired maximizer. The model is represented by a modified Madelung equation in which the quantum potential is replaced by a different, specially chosen 'computational' potential. As a result, the dynamics attains both quantum and classical properties: it preserves superposition and entanglement of random solutions, while allowing one to measure its state variables using classical methods. Such an optimal combination of characteristics is a perfect match for quantum-inspired computing. As an application, an algorithm for finding the global maximum of an arbitrary integrable function is proposed. The idea of the proposed algorithm is very simple: based upon the Quantum-inspired Maximizer (QIM), introduce a positive function to be maximized as the probability density to which the solution is attracted. Then larger values of this function will have a higher probability of appearing. Special attention is paid to simulation of integer programming and NP-complete problems. It is demonstrated that the global maximum of an integrable function can be found in polynomial time by using the proposed quantum-classical hybrid. The result is extended to constrained maximization with applications to integer programming and TSP (Traveling Salesman Problem).
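
    A loose, purely classical illustration of the stated idea follows: a positive objective is treated as an unnormalized probability density so that high-value regions are sampled most often. This sketch does not implement the modified Madelung dynamics of the report, and the objective function is an invented toy example.

        import numpy as np

        def density_guided_maximize(f, lo, hi, n_samples=20000, seed=0):
            """Sample candidate points with probability proportional to f (assumed positive)
            on a grid over [lo, hi], then return the best sample seen."""
            rng = np.random.default_rng(seed)
            grid = np.linspace(lo, hi, 2001)
            weights = f(grid)
            probs = weights / weights.sum()        # f acts as an unnormalized density
            samples = rng.choice(grid, size=n_samples, p=probs)
            best = samples[np.argmax(f(samples))]
            return best, f(best)

        # Toy positive objective with several local maxima; global maximum near x = 0.
        f = lambda x: np.exp(-x**2) * (2 + np.cos(5 * x))
        x_star, f_star = density_guided_maximize(f, -4.0, 4.0)
        print(x_star, f_star)    # approximately (0.0, 3.0)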

  16. Changing expectancies: cognitive mechanisms and context effects.

    PubMed

    Wiers, Reinout W; Wood, Mark D; Darkes, Jack; Corbin, William R; Jones, Barry T; Sher, Kenneth J

    2003-02-01

    This article presents the proceedings of a symposium at the 2002 RSA Meeting in San Francisco, organized by Reinout W. Wiers and Mark D. Wood. The symposium combined two topics of recent interest in studies of alcohol expectancies: cognitive mechanisms in expectancy challenge studies, and context-related changes of expectancies. With increasing recognition of the substantial role played by alcohol expectancies in drinking, investigators have begun to develop and evaluate expectancy challenge procedures as a potentially promising new prevention strategy. The two major issues addressed in the symposium were whether expectancy challenges result in changes in expectancies that mediate intervention-outcome relations, and the influence of simulated bar environments ("bar labs," in which challenges are usually done) on expectancies. The presentations were (1) An introduction, by Jack Darkes; (2) Investigating the utility of alcohol expectancy challenge with heavy drinking college students, by Mark D. Wood; (3) Effects of an expectancy challenge on implicit and explicit expectancies and drinking, by Reinout W. Wiers; (4) Effects of graphic feedback and simulated bar assessments on alcohol expectancies and consumption, by William R. Corbin; (5) Implicit alcohol associations and context, by Barry T Jones; and (6) A discussion by Kenneth J. Sher, who pointed out that it is important not only to study changes of expectancies in the paradigm of an expectancy challenge but also to consider the role of changing expectancies in natural development and in treatments not explicitly aimed at changing expectancies. PMID:12605068

  17. No Mikheyev-Smirnov-Wolfenstein Effect in Maximal Mixing

    E-print Network

    P. F. Harrison; D. H. Perkins; W. G. Scott

    1996-01-26

    We investigate the possible influence of the MSW effect on the expectations for the solar neutrino experiments in the maximal mixing scenario suggested by the atmospheric neutrino data. A direct numerical calculation of matter induced effects in the Sun shows that the naive vacuum predictions are left completely undisturbed in the particular case of maximal mixing, so that the MSW effect turns out to be unobservable. We give a qualitative explanation of this result.

  18. Maximizing relationship possibilities: relational maximization in romantic relationships.

    PubMed

    Mikkelson, Alan C; Pauley, Perry M

    2013-01-01

    Using Rusbult's (1980) investment model and Schwartz's (2000) conceptualization of decision maximization, we sought to understand how an individual's propensity to maximize his or her decisions factored into investment, satisfaction, and awareness of alternatives in romantic relationships. In study one, 275 participants currently involved in romantic relationships completed measures of maximization, satisfaction, investment size, quality of alternatives, and commitment. In study two, 343 participants were surveyed as part of the creation of a scale of relational maximization. Results from both studies revealed that the tendency to maximize (in general and in relationships specifically) was negatively correlated with satisfaction, investment, and commitment, and positively correlated with quality of alternatives. Furthermore, we found that satisfaction and investments mediated the relationship between maximization and relationship commitment. PMID:23951952

  19. Outside the Expected.

    ERIC Educational Resources Information Center

    Dienstfrey, Harris

    1968-01-01

    In examining the findings of "Pygmalion in the Classroom," an experimental study of the positive effects of favorable teacher expectations on the intellectual development of disadvantaged elementary school students, this review speculates about why the experimental students, whom the teachers expected to improve, and the control students, who were…

  20. Reflections on Expectations

    ERIC Educational Resources Information Center

    Santini, Joseph

    2014-01-01

    This article describes a teacher's reflections on the matter of student expectations. Santini begins with a common understanding of the "Pygmalion effect" from research projects conducted in earlier years that intimated "people's expectations could influence other people in the world around them." In the world of deaf…

  1. An Unexpected Expected Value.

    ERIC Educational Resources Information Center

    Schwartzman, Steven

    1993-01-01

    Discusses the surprising result that the expected number of marbles of one color drawn from a set of marbles of two colors after two draws without replacement is the same as the expected number of that color marble after two draws with replacement. Presents mathematical models to help explain this phenomenon. (MDH)

  2. Maximizing Brightness in Photoinjectors

    SciTech Connect

    Limborg-Deprey, C.; Tomizawa, H.; /JAERI-RIKEN, Hyogo

    2011-11-30

    If the laser pulse driving photoinjectors could be arbitrarily shaped, the emittance growth induced by space charge effects could be totally compensated for. In particular, for RF guns, the photo-electron distribution leaving the cathode should be close to a uniform distribution contained within a 3D-ellipsoidal contour, so that the emittance at the end of the injector could be as small as the cathode emittance. We explore how the emittance and the brightness can be optimized for photoinjectors based on RF guns, depending on the peak current requirements, and we discuss techniques available to produce those ideal laser pulse shapes. For photo-cathodes which have very fast emission times, and assuming a perfectly uniform emitting surface, the ideal distribution could be achieved by shaping the laser into a pulse of constant fluence limited in space by a 3D-ellipsoidal contour. Simulations show that under such conditions, with standard linear emittance compensation, the emittance at the end of the photo-injector beamline approaches the minimum value imposed by the cathode emittance. Brightness, expressed as the ratio of peak current over the product of the two transverse emittances, appears to be maximized for small charges. Numerical simulations also show that for a very high charge per bunch (10 nC), emittances as small as 2 mm-mrad could be reached by using 3D-ellipsoidal laser pulses in an S-band gun. The production of 3D-ellipsoidal pulses is very challenging, but seems worth the effort. We briefly discuss some of the present ideas and the difficulties of achieving such pulses.
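    As a point of reference for the figure of merit used in this abstract, the brightness definition quoted above can be written as below. The overall normalization constant differs between conventions, so treat this as a sketch of the definition rather than the authors' exact expression.

    ```latex
    % Brightness as the ratio of peak current to the product of the two
    % transverse emittances (some conventions include an extra numerical factor).
    B = \frac{I_{\mathrm{peak}}}{\varepsilon_x\, \varepsilon_y}
    ```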

  3. COPD: maximization of bronchodilation.

    PubMed

    Nardini, Stefano; Camiciottoli, Gianna; Locicero, Salvatore; Maselli, Rosario; Pasqua, Franco; Passalacqua, Giovanni; Pela, Riccardo; Pesci, Alberto; Sebastiani, Alfredo; Vatrella, Alessandro

    2014-01-01

    The most recent guidelines define COPD in a multidimensional way; nevertheless, the diagnosis is still linked to the limitation of airflow, usually measured by the reduction of the FEV1/FVC ratio below 70%. However, the severity of obstruction is not directly correlated with symptoms or with the disability caused by COPD. Thus, besides respiratory function, COPD should be evaluated based on symptoms, frequency and severity of exacerbations, the patient's functional status, and health-related quality of life (HRQoL). Therapy is mainly aimed at increasing exercise tolerance and reducing dyspnea, with improvement of daily activities and HRQoL. This can be accomplished by a drug-induced reduction of pulmonary hyperinflation and of exacerbation frequency and severity. All guidelines recommend bronchodilators as baseline therapy for all stages of COPD, and long-acting inhaled bronchodilators, both beta-2 agonist (LABA) and antimuscarinic (LAMA) drugs, are the most effective for regular treatment in the clinically stable phase. The effectiveness of bronchodilators should be evaluated in terms of functional improvement (relief of bronchial obstruction and pulmonary hyperinflation), symptomatic improvement (exercise tolerance and HRQoL), and clinical improvement (reduction in the number or severity of exacerbations), while the absence of a spirometric response is not a reason to interrupt treatment if there is subjective improvement in symptoms. Because LABA and LAMA act via different mechanisms of action, when administered in combination they can exert additional effects, thus optimizing (i.e., maximizing) sustained bronchodilation in COPD patients with severe airflow limitation who cannot benefit (or can get only partial benefit) from therapy with a single bronchodilator. Recently, a fixed combination of an ultra-LABA/LAMA (indacaterol/glycopyrronium) has shown that it is possible to obtain stable and persistent bronchodilation, which can help avoid undesirable fluctuations of bronchial calibre. PMID:25364503

  4. Health expectancy indicators.

    PubMed Central

    Robine, J. M.; Romieu, I.; Cambois, E.

    1999-01-01

    An outline is presented of progress in the development of health expectancy indicators, which are growing in importance as a means of assessing the health status of populations and determining public health priorities. PMID:10083720

  5. Expectations across entertainment media

    E-print Network

    Austin, Alexander Chance

    2007-01-01

    An audience's satisfaction with an entertainment product is dependent on how well their expectations are fulfilled. This study delves into the implicit contract that is formed between the purveyor of an entertainment ...

  6. Maximal quantum Fisher information for general su(2) parametrization processes

    NASA Astrophysics Data System (ADS)

    Jing, Xiao-Xing; Liu, Jing; Xiong, Heng-Na; Wang, Xiaoguang

    2015-07-01

    Quantum Fisher information is a key concept in the field of quantum metrology, which aims to enhance the accuracy of parameter estimation by using quantum resources. In this paper, utilizing a representation of quantum Fisher information for a general unitary parametrization process, we study unitary parametrization processes governed by su(2) dynamics. We obtain the analytical expression for the Hermitian operator of the parametrization and the maximal quantum Fisher information. We find that the maximal quantum Fisher information over the parameter space consists of two parts: one is quadratic in time and the other oscillates with time. We apply our result to the estimation of a magnetic field and obtain the maximal quantum Fisher information. We further discuss a driving field with a time-dependent Hamiltonian and find that the maximal quantum Fisher information of the driving frequency attains its optimum when the driving field is in resonance with the atomic frequency.
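    For background, the standard pure-state form of the quantum Fisher information for a unitary parametrization, written in terms of the Hermitian generator mentioned in the abstract, is sketched below; the explicit su(2) expressions derived in the paper are not reproduced here.

    ```latex
    % Pure initial state |\psi_0\rangle evolved by U(x); the Hermitian generator
    % is H(x) = i (\partial_x U^\dagger) U, and the quantum Fisher information
    % equals four times its variance in the initial state.
    F_Q(x) = 4\left(\langle\psi_0|H^2(x)|\psi_0\rangle
                    - \langle\psi_0|H(x)|\psi_0\rangle^2\right)
           = 4\,\mathrm{Var}_{|\psi_0\rangle}\!\left[H(x)\right]
    ```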

  7. Utilizing Partnerships to Maximize Resources in College Counseling Services

    ERIC Educational Resources Information Center

    Stewart, Allison; Moffat, Meridith; Travers, Heather; Cummins, Douglas

    2015-01-01

    Research indicates an increasing number of college students are experiencing severe psychological problems that are impacting their academic performance. However, many colleges and universities operate with constrained budgets that limit their ability to provide adequate counseling services for their student population. Moreover, accessing…

  8. Unified Utility Maximization Framework for Resource Selection Language Technology Inst.

    E-print Network

    Callan, Jamie

    centralized database for the purpose of indexing. Distributed information retrieval, also known as federated search [1,4,7,11,14,22] is different from ad-hoc information retrieval as it addresses the cases when

  9. Utility Maximization Under a Shortfall Risk Constraint Anne Gundel

    E-print Network

    Weber, Stefan

    -time financial market model under a joint budget and downside risk constraint. The risk constraint is given and management of downside risk is a key issue. Value at Risk (VaR) has emerged as the industry standard for risk in order to make a position acceptable from a risk management perspective. Second, a risk measure should

  10. Using Debate to Maximize Learning Potential: A Case Study

    ERIC Educational Resources Information Center

    Firmin, Michael W.; Vaughn, Aaron; Dye, Amanda

    2007-01-01

    Following a review of the literature, an educational case study is provided for the benefit of faculty preparing college courses. In particular, we provide a transcribed debate utilized in a General Psychology course as a best practice example of how to craft a debate which maximizes student learning. The work is presented as a model for the…

  11. Performance expectation plan

    SciTech Connect

    Ray, P.E.

    1998-09-04

    This document outlines the significant accomplishments of fiscal year 1998 for the Tank Waste Remediation System (TWRS) Project Hanford Management Contract (PHMC) team. Opportunities for improvement to better meet some performance expectations have been identified. The PHMC has performed at an excellent level in administration of leadership, planning, and technical direction. The contractor has met expectations and made notable improvement in attaining customer satisfaction in mission execution. This document includes the team's recommendation that the PHMC TWRS Performance Expectation Plan evaluation rating for fiscal year 1998 be Excellent.

  12. Maximize x(a - x)

    ERIC Educational Resources Information Center

    Lange, L. H.

    1974-01-01

    Five different methods for determining the maximizing condition for x(a - x) are presented. Included is the ancient Greek version and a method attributed to Fermat. None of the proofs use calculus. (LS)

  13. Constrained expectation-maximization (EM), dynamic analysis, linear quadratic tracking, and nonlinear constrained expectation-maximization (EM) for the analysis of genetic regulatory networks and signal transduction networks 

    E-print Network

    Xiong, Hao

    2009-05-15

    Despite the immense progress made by molecular biology in cataloging and characterizing molecular elements of life and the success in genome sequencing, there have not been comparable advances in the functional study of ...

  14. Great Expectations. [Lesson Plan].

    ERIC Educational Resources Information Center

    Devine, Kelley

    Based on Charles Dickens' novel "Great Expectations," this lesson plan presents activities designed to help students understand the differences between totalitarianism and democracy, and that a writer of a story considers theme, plot, characters, setting, and point of view. The main activity of the lesson involves students working in groups to…

  15. Relief expectation and sleep.

    PubMed

    Laverdure-Dupont, Danièle; Rainville, Pierre; Montplaisir, Jacques; Lavigne, Gilles

    2010-01-01

    Originally, a role for sleep in learning and memory has been advocated following the observation of sleep-dependent performance enhancements at simple procedural tasks. With the investigation of a variety of cognitive and behavioral abilities, multiple stages of memory were further suggested to benefit from the off-line reprocessing believed to occur during specific sleep stages. In particular, REM sleep has been implicated in the integration of new information into associative networks as well as in the abstraction and generalization of implicit rules allowing adaptive behaviors. In a recent study, we extended these observations by demonstrating that the mediating effect of expectation on placebo-induced analgesia is strengthened by sleep, and that the individual amount of REM sleep is predictive of the relief expected on the next morning. However, this relation is strongly modulated by the level of concordance between expectations and sensory information available prior to sleep. As placebo responses derive from the learned association between contextual cues and subsequent relief, these results are discussed in relation to the proposed roles of REM sleep in the integrative stages of memory processing. In light of the responsiveness of REM sleep to waking events, its expression is also proposed to reflect the cognitive demand associated with the offline reprocessing of information necessary for the assimilation of new expectations to one's belief system. PMID:21280456

  16. Parenting with High Expectations

    ERIC Educational Resources Information Center

    Timperlake, Benna Hull; Sanders, Genelle Timperlake

    2014-01-01

    In some ways raising deaf or hard of hearing children is no different than raising hearing children; expectations must be established and periodically tweaked. Benna Hull Timperlake, who with husband Roger, raised two hearing children in addition to their deaf daughter, Genelle Timperlake Sanders, and Genelle, now a deaf professional, share their…

  17. Maintaining High Expectations

    ERIC Educational Resources Information Center

    Williams, Roger; Williams, Sherry

    2014-01-01

    Author and husband, Roger Williams, is hearing and signs fluently, and author and wife, Sherry Williams, is deaf and uses both speech and signs, although she is most comfortable signing. As parents of six children--deaf and hearing--they are determined to encourage their children to do their best, and they always set their expectations high. They…

  18. Maximizing TDRS Command Load Lifetime

    NASA Technical Reports Server (NTRS)

    Brown, Aaron J.

    2002-01-01

    The GNC software onboard ISS utilizes TDRS command loads, and a simplistic model of TDRS orbital motion to generate onboard TDRS state vectors. Each TDRS command load contains five "invariant" orbital elements which serve as inputs to the onboard propagation algorithm. These elements include semi-major axis, inclination, time of last ascending node crossing, right ascension of ascending node, and mean motion. Running parallel to the onboard software is the TDRS Command Builder Tool application, located in the JSC Mission Control Center. The TDRS Command Builder Tool is responsible for building the TDRS command loads using a ground TDRS state vector, mirroring the onboard propagation algorithm, and assessing the fidelity of current TDRS command loads onboard ISS. The tool works by extracting a ground state vector at a given time from a current TDRS ephemeris, and then calculating the corresponding "onboard" TDRS state vector at the same time using the current onboard TDRS command load. The tool then performs a comparison between these two vectors and displays the relative differences in the command builder tool GUI. If the RSS position difference between these two vectors exceeds the tolerable limits, a new command load is built using the ground state vector and uplinked to ISS. A command load's lifetime is therefore defined as the time from when a command load is built to the time the RSS position difference exceeds the tolerable limit. From the outset of TDRS command load operations (STS-98), command load lifetime was limited to approximately one week due to the simplicity of both the onboard propagation algorithm, and the algorithm used by the command builder tool to generate the invariant orbital elements. It was soon desired to extend command load lifetime in order to minimize potential risk due to frequent ISS commanding. Initial studies indicated that command load lifetime was most sensitive to changes in mean motion. Finding a suitable value for mean motion was therefore the key to achieving this goal. This goal was eventually realized through development of an Excel spreadsheet tool called EMMIE (Excel Mean Motion Interactive Estimation). EMMIE utilizes ground ephemeris nodal data to perform a least-squares fit to inferred mean anomaly as a function of time, thus generating an initial estimate for mean motion. This mean motion in turn drives a plot of estimated downtrack position difference versus time. The user can then manually iterate the mean motion, and determine an optimal value that will maximize command load lifetime. Once this optimal value is determined, the mean motion initially calculated by the command builder tool is overwritten with the new optimal value, and the command load is built for uplink to ISS. EMMIE also provides the capability for command load lifetime to be tracked through multiple TDRS ephemeris updates. Using EMMIE, TDRS command load lifetimes of approximately 30 days have been achieved.
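    A minimal sketch of the least-squares step described above, assuming samples of inferred mean anomaly are available as (time, angle) pairs dense enough for phase unwrapping; the function names, sampling cadence, and numbers are illustrative assumptions, not the actual EMMIE spreadsheet logic.

    ```python
    import numpy as np

    def estimate_mean_motion(times_sec, mean_anomaly_rad):
        """Least-squares fit of inferred mean anomaly versus time.

        The slope of the fitted line is the estimated mean motion (rad/s);
        unwrapping removes the 2*pi jumps between successive revolutions.
        """
        unwrapped = np.unwrap(mean_anomaly_rad)
        slope, intercept = np.polyfit(times_sec, unwrapped, 1)
        return slope

    # Hypothetical usage: hourly samples over 48 hours for a roughly
    # geosynchronous orbit (true mean motion ~7.2921e-5 rad/s).
    t = np.arange(0.0, 48 * 3600.0, 3600.0)
    n_true = 7.2921e-5
    M = (n_true * t) % (2.0 * np.pi)  # wrapped mean anomaly samples
    print(f"initial mean-motion estimate: {estimate_mean_motion(t, M):.6e} rad/s")
    ```

    In this sketch the fitted slope would then seed the manual iteration step described in the abstract, in which the user adjusts the value to minimize the downtrack position error over the desired lifetime.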

  19. AN EXPECTED UTILITY MODEL WITH ENDOGENOUSLY DETERMINED GOALS*

    E-print Network

    Tesfatsion, Leigh

    contingency plans), functionally mapping states into consequences, is now common in economics. Yet, as has the realized outcome and the desired outcome (goal). An interesting complementary relationship exists between

  20. MAXIMALLY EQUIDISTRIBUTED COMBINED TAUSWORTHE GENERATORS

    E-print Network

    L'Ecuyer, Pierre

    Tausworthe random number generators based on a primitive trinomial allow an easy and fast implementation when their parameters obey certain restrictions. However, such generators, with those restrictions, have bad statistical…
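    For illustration, a sketch of a three-component combined Tausworthe generator of the kind analyzed in this line of work is given below. The shift and mask constants follow the widely cited taus88 parameter set and are an assumption here, not necessarily the exact parameters selected in the paper.

    ```python
    class Taus88:
        """Combined Tausworthe generator with three LFSR components (period ~2^88).

        Constants follow the commonly published taus88 parameter set; seeds must
        satisfy s1 > 1, s2 > 7, s3 > 15 for the recurrences to have full period.
        """
        MASK = 0xFFFFFFFF  # keep all state in 32-bit words

        def __init__(self, s1=12345, s2=67890, s3=13579):
            self.s1, self.s2, self.s3 = s1, s2, s3

        def next_u32(self):
            b = (((self.s1 << 13) ^ self.s1) & self.MASK) >> 19
            self.s1 = (((self.s1 & 0xFFFFFFFE) << 12) ^ b) & self.MASK
            b = (((self.s2 << 2) ^ self.s2) & self.MASK) >> 25
            self.s2 = (((self.s2 & 0xFFFFFFF8) << 4) ^ b) & self.MASK
            b = (((self.s3 << 3) ^ self.s3) & self.MASK) >> 11
            self.s3 = (((self.s3 & 0xFFFFFFF0) << 17) ^ b) & self.MASK
            return (self.s1 ^ self.s2 ^ self.s3) & self.MASK

        def random(self):
            # Uniform double in [0, 1)
            return self.next_u32() / 4294967296.0

    rng = Taus88()
    print([rng.random() for _ in range(3)])
    ```

    Combining several short Tausworthe components by XOR is what allows the resulting generator to satisfy the equidistribution criteria discussed in the paper while remaining fast to implement.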

  1. Orthogonal Procrustes Rotation Maximizing Congruence.

    ERIC Educational Resources Information Center

    Brokken, Frank B.

    1983-01-01

    Procedures for assessing the invariance of factors across data sets often use the least squares criterion, which appears to be too restrictive. Tucker's coefficient of congruence is proposed as an alternative. A method that maximizes the sum of the coefficients of congruence between two matrices of loadings is presented. (Author/JKS)

  2. Sociology of Low Expectations

    PubMed Central

    Samuel, Gabrielle; Williams, Clare

    2015-01-01

    Social scientists have drawn attention to the role of hype and optimistic visions of the future in providing momentum to biomedical innovation projects by encouraging innovation alliances. In this article, we show how less optimistic, uncertain, and modest visions of the future can also provide innovation projects with momentum. Scholars have highlighted the need for clinicians to carefully manage the expectations of their prospective patients. Using the example of a pioneering clinical team providing deep brain stimulation to children and young people with movement disorders, we show how clinicians confront this requirement by drawing on their professional knowledge and clinical expertise to construct visions of the future with their prospective patients; visions which are personalized, modest, and tainted with uncertainty. We refer to this vision-constructing work as recalibration, and we argue that recalibration enables clinicians to manage the tension between the highly optimistic and hyped visions of the future that surround novel biomedical interventions, and the exigencies of delivering those interventions in a clinical setting. Drawing on work from science and technology studies, we suggest that recalibration enrolls patients in an innovation alliance by creating a shared understanding of how the “effectiveness” of an innovation shall be judged. PMID:26527846

  3. Expectations and speech intelligibility.

    PubMed

    Babel, Molly; Russell, Jamie

    2015-05-01

    Socio-indexical cues and paralinguistic information are often beneficial to speech processing as this information assists listeners in parsing the speech stream. Associations that particular populations speak in a certain speech style can, however, make it such that socio-indexical cues have a cost. In this study, native speakers of Canadian English who identify as Chinese Canadian and White Canadian read sentences that were presented to listeners in noise. Half of the sentences were presented with a visual-prime in the form of a photo of the speaker and half were presented in control trials with fixation crosses. Sentences produced by Chinese Canadians showed an intelligibility cost in the face-prime condition, whereas sentences produced by White Canadians did not. In an accentedness rating task, listeners rated White Canadians as less accented in the face-prime trials, but Chinese Canadians showed no such change in perceived accentedness. These results suggest a misalignment between an expected and an observed speech signal for the face-prime trials, which indicates that social information about a speaker can trigger linguistic associations that come with processing benefits and costs. PMID:25994710

  4. Trust Maximization in Social Networks

    NASA Astrophysics Data System (ADS)

    Zhan, Justin; Fang, Xing

    Trust is a human-related phenomenon in social networks. Trust research on social networks has gained much attention for its usefulness and for modeling trust propagation, but there has been little focus on finding maximum trust in social networks, which is particularly important when a social network is oriented toward certain tasks. In this paper, we propose a trust maximization algorithm for task-oriented social networks.

  5. Online Influence Maximization (Extended Version)

    E-print Network

    Lei, Siyu; Mo, Luyi; Cheng, Reynold; Senellart, Pierre

    2015-01-01

    Social networks are commonly used for marketing purposes. For example, free samples of a product can be given to a few influential social network users (or "seed nodes"), with the hope that they will convince their friends to buy it. One way to formalize marketers' objective is through influence maximization (or IM), whose goal is to find the best seed nodes to activate under a fixed budget, so that the number of people who get influenced in the end is maximized. Recent solutions to IM rely on the influence probability that a user influences another one. However, this probability information may be unavailable or incomplete. In this paper, we study IM in the absence of complete information on influence probability. We call this problem Online Influence Maximization (OIM) since we learn influence probabilities at the same time we run influence campaigns. To solve OIM, we propose a multiple-trial approach, where (1) some seed nodes are selected based on existing influence information; (2) an influence campaign ...
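    Online approaches of this kind build on the classical (offline) greedy influence-maximization subroutine. Below is a minimal sketch of that classical greedy step under the independent cascade model with Monte Carlo spread estimation; the toy graph, edge probabilities, and function names are illustrative assumptions, not the OIM algorithm of the paper.

    ```python
    import random

    def simulate_cascade(graph, seeds):
        """One independent-cascade simulation; graph[u] = list of (v, p_uv)."""
        active, frontier = set(seeds), list(seeds)
        while frontier:
            u = frontier.pop()
            for v, p in graph.get(u, []):
                if v not in active and random.random() < p:
                    active.add(v)
                    frontier.append(v)
        return len(active)

    def greedy_seed_selection(graph, budget, n_sims=200):
        """Greedily add the node with the largest estimated spread."""
        nodes = set(graph) | {v for edges in graph.values() for v, _ in edges}
        seeds = []
        for _ in range(budget):
            best, best_gain = None, -1.0
            for cand in nodes - set(seeds):
                spread = sum(simulate_cascade(graph, seeds + [cand])
                             for _ in range(n_sims)) / n_sims
                if spread > best_gain:
                    best, best_gain = cand, spread
            seeds.append(best)
        return seeds

    # Hypothetical toy network with assumed influence probabilities
    g = {"a": [("b", 0.4), ("c", 0.3)], "b": [("d", 0.5)], "c": [("d", 0.2)], "d": []}
    print(greedy_seed_selection(g, budget=2))
    ```

    In the online setting described in the abstract, the edge probabilities used by this greedy step would themselves be estimated and refined across successive campaigns rather than assumed known.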

  6. Great Expectations: Temporal Expectation Modulates Perceptual Processing Speed

    ERIC Educational Resources Information Center

    Vangkilde, Signe; Coull, Jennifer T.; Bundesen, Claus

    2012-01-01

    In a crowded dynamic world, temporal expectations guide our attention in time. Prior investigations have consistently demonstrated that temporal expectations speed motor behavior. We explore effects of temporal expectation on "perceptual" speed in three nonspeeded, cued recognition paradigms. Different hazard rate functions for the cue-stimulus…

  7. ANATOMICAL GUIDED SEGMENTATION WITH NON-STATIONARY TISSUE CLASS DISTRIBUTIONS IN AN EXPECTATION-MAXIMIZATION FRAMEWORK

    E-print Network

    of a probabilistic atlas also increases the risk of systematic biases, such as an over- or underestimation with a probability atlas. Its smooth borders increase the likelihood of bridges or cavities across adjacent banks

  8. SEMI-BLIND CHANNEL IDENTIFICATION AND EQUALIZATION IN OFDM: AN EXPECTATION-MAXIMIZATION APPROACH

    E-print Network

    Al-Naffouri, Tareq Y.

    of many standards including digital audio and video broadcasting in Europe and high speed transmission] and [3] to perform channel estimation. The redundancy due to the presence of the cyclic prefix (CP

  9. Measuring Alcohol Expectancies in Youth

    ERIC Educational Resources Information Center

    Randolph, Karen A.; Gerend, Mary A.; Miller, Brenda A.

    2006-01-01

    Beliefs about the consequences of using alcohol, alcohol expectancies, are powerful predictors of underage drinking. The Alcohol Expectancies Questionnaire-Adolescent form (AEQ-A) has been widely used to measure expectancies in youth. Despite its broad use, the factor structure of the AEQ-A has not been firmly established. It is also not known…

  10. Reading Expectancy and Regression Formulas.

    ERIC Educational Resources Information Center

    Brown, Robert M.; Byrd, Carol C.

    To determine the most accurate means of estimating the expected grade or reading level of 61 sixth grade students, three expectancy formulas were compared to a custom regression equation from a user friendly microcomputer program. The expectancy formulas, based in part on student I.Q. and chronological age, and the regression equations were…

  11. Knowledge discovery by accuracy maximization

    PubMed Central

    Cacciatore, Stefano; Luchinat, Claudio; Tenori, Leonardo

    2014-01-01

    Here we describe KODAMA (knowledge discovery by accuracy maximization), an unsupervised and semisupervised learning algorithm that performs feature extraction from noisy and high-dimensional data. Unlike other data mining methods, the peculiarity of KODAMA is that it is driven by an integrated procedure of cross-validation of the results. The discovery of a local manifold’s topology is led by a classifier through a Monte Carlo procedure of maximization of cross-validated predictive accuracy. Briefly, our approach differs from previous methods in that it has an integrated procedure of validation of the results. In this way, the method ensures the highest robustness of the obtained solution. This robustness is demonstrated on experimental datasets of gene expression and metabolomics, where KODAMA compares favorably with other existing feature extraction methods. KODAMA is then applied to an astronomical dataset, revealing unexpected features. Interesting and not easily predictable features are also found in the analysis of the State of the Union speeches by American presidents: KODAMA reveals an abrupt linguistic transition sharply separating all post-Reagan from all pre-Reagan speeches. The transition occurs during Reagan’s presidency and not from its beginning. PMID:24706821
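    A loose sketch of the core idea (accepting candidate group assignments only when they do not decrease cross-validated predictive accuracy) is given below. It is illustrative only: the classifier choice, the single-label proposal move, and all names are assumptions, not the published KODAMA procedure.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    def cv_accuracy_maximization(X, n_groups=2, n_iter=200, seed=0):
        """Monte Carlo maximization of cross-validated accuracy over labelings."""
        rng = np.random.default_rng(seed)
        labels = rng.integers(0, n_groups, size=len(X))
        clf = KNeighborsClassifier(n_neighbors=3)
        best = cross_val_score(clf, X, labels, cv=3).mean()
        for _ in range(n_iter):
            proposal = labels.copy()
            proposal[rng.integers(len(X))] = rng.integers(0, n_groups)
            if np.bincount(proposal, minlength=n_groups).min() < 3:
                continue  # keep every class large enough for stratified CV
            score = cross_val_score(clf, X, proposal, cv=3).mean()
            if score >= best:  # accept moves that do not decrease CV accuracy
                labels, best = proposal, score
        return labels, best

    # Hypothetical usage on two noisy clusters
    X = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 3.0])
    groups, cv_acc = cv_accuracy_maximization(X)
    print("cross-validated accuracy:", round(cv_acc, 3))
    ```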

  12. Maximally coherent mixed states: Complementarity between maximal coherence and mixedness

    NASA Astrophysics Data System (ADS)

    Singh, Uttam; Bera, Manabendra Nath; Dhar, Himadri Shekhar; Pati, Arun Kumar

    2015-05-01

    Quantum coherence is a key element in topical research on quantum resource theories and a primary facilitator for design and implementation of quantum technologies. However, the resourcefulness of quantum coherence is severely restricted by environmental noise, which is indicated by the loss of information in a quantum system, measured in terms of its purity. In this work, we derive the limits imposed by the mixedness of a quantum system on the amount of quantum coherence that it can possess. We obtain an analytical trade-off between the two quantities that upperbound the maximum quantum coherence for fixed mixedness in a system. This gives rise to a class of quantum states, "maximally coherent mixed states," whose coherence cannot be increased further under any purity-preserving operation. For the above class of states, quantum coherence and mixedness satisfy a complementarity relation, which is crucial to understand the interplay between a resource and noise in open quantum systems.

  13. Labview utilities

    Energy Science and Technology Software Center (ESTSC)

    2011-09-30

    The software package provides several utilities written in LabView. These utilities do not form independent programs, but rather can be used as a library or as controls in other LabView programs. The utilities include several new controls (xcontrols), VIs for input and output routines, as well as other helper functions not provided in the standard LabView environment.

  14. Maximal dinucleotide and trinucleotide circular codes.

    PubMed

    Michel, Christian J; Pellegrini, Marco; Pirillo, Giuseppe

    2016-01-21

    We determine here the number and the list of maximal dinucleotide and trinucleotide circular codes. We prove that there is no maximal dinucleotide circular code having strictly less than 6 elements (the maximum size of dinucleotide circular codes). On the other hand, a computer calculation shows that there are maximal trinucleotide circular codes with less than 20 elements (the maximum size of trinucleotide circular codes). More precisely, there are maximal trinucleotide circular codes with 14, 15, 16, 17, 18 and 19 elements and no maximal trinucleotide circular code having less than 14 elements. We give the same information for the maximal self-complementary dinucleotide and trinucleotide circular codes. The amino acid distribution of maximal trinucleotide circular codes is also determined. PMID:26382231

  15. Vacuum Expectation Values of Twisted Mass Fermion Operators

    E-print Network

    Abdou Abdel-Rehim; Randy Lewis; Walter Wilcox

    2007-10-23

    Using noise methods on a quenched $20^3 \times 32$ lattice at $\beta=6.0$, we have investigated vacuum expectation values and relative linear correlations among the various Wilson and twisted mass scalar and pseudoscalar disconnected loop operators. We show results near the maximal twist lines in $\kappa$, $\mu$ parameter space, either defined as the absence of parity mixing or the vanishing of the PCAC quark mass.

  16. Maximally Expressive Modeling of Operations Tasks

    NASA Technical Reports Server (NTRS)

    Jaap, John; Richardson, Lea; Davis, Elizabeth

    2002-01-01

    Planning and scheduling systems organize "tasks" into a timeline or schedule. The tasks are defined within the scheduling system in logical containers called models. The dictionary might define a model of this type as "a system of things and relations satisfying a set of rules that, when applied to the things and relations, produce certainty about the tasks that are being modeled." One challenging domain for a planning and scheduling system is the operation of on-board experiments for the International Space Station. In these experiments, the equipment used is among the most complex hardware ever developed, the information sought is at the cutting edge of scientific endeavor, and the procedures are intricate and exacting. Scheduling is made more difficult by a scarcity of station resources. The models to be fed into the scheduler must describe both the complexity of the experiments and procedures (to ensure a valid schedule) and the flexibilities of the procedures and the equipment (to effectively utilize available resources). Clearly, scheduling International Space Station experiment operations calls for a "maximally expressive" modeling schema.

  17. Maximal acceleration and radiative processes

    E-print Network

    Giorgio Papini

    2015-07-08

    We derive the radiation characteristics of an accelerated, charged particle in a model due to Caianiello in which the proper acceleration of a particle of mass $m$ has the upper limit $\mathcal{A}_m = 2mc^3/\hbar$. We find two power laws, one applicable to lower accelerations, the other more suitable for accelerations closer to $\mathcal{A}_m$ and to the related physical singularity in the Ricci scalar. Geometrical constraints and power spectra are also discussed. By comparing the power laws due to the maximal acceleration with that for particles in gravitational fields, we find that the model of Caianiello allows, in principle, the use of charged particles as tools to distinguish inertial from gravitational fields locally.

  18. Maximal acceleration and radiative processes

    NASA Astrophysics Data System (ADS)

    Papini, Giorgio

    2015-08-01

    We derive the radiation characteristics of an accelerated, charged particle in a model due to Caianiello in which the proper acceleration of a particle of mass m has the upper limit 𝒜m = 2mc³/ℏ. We find two power laws, one applicable to lower accelerations, the other more suitable for accelerations closer to 𝒜m and to the related physical singularity in the Ricci scalar. Geometrical constraints and power spectra are also discussed. By comparing the power laws due to the maximal acceleration (MA) with that for particles in gravitational fields, we find that the model of Caianiello allows, in principle, the use of charged particles as tools to distinguish inertial from gravitational fields locally.

  19. Expected Graduation Date _______________ ____ Original ____ Revision

    E-print Network

    Selmic, Sandra


  20. Does mental exertion alter maximal muscle activation?

    PubMed

    Rozand, Vianney; Pageaux, Benjamin; Marcora, Samuele M; Papaxanthis, Charalambos; Lepers, Romuald

    2014-01-01

    Mental exertion is known to impair endurance performance, but its effects on neuromuscular function remain unclear. The purpose of this study was to test the hypothesis that mental exertion reduces torque and muscle activation during intermittent maximal voluntary contractions of the knee extensors. Ten subjects performed in a randomized order three separate mental exertion conditions lasting 27 min each: (i) high mental exertion (incongruent Stroop task), (ii) moderate mental exertion (congruent Stroop task), (iii) low mental exertion (watching a movie). In each condition, mental exertion was combined with 10 intermittent maximal voluntary contractions of the knee extensor muscles (one maximal voluntary contraction every 3 min). Neuromuscular function was assessed using electrical nerve stimulation. Maximal voluntary torque, maximal muscle activation and other neuromuscular parameters were similar across mental exertion conditions and did not change over time. These findings suggest that mental exertion does not affect neuromuscular function during intermittent maximal voluntary contractions of the knee extensors. PMID:25309404

  1. Institutional Differences: Expectations and Perceptions.

    ERIC Educational Resources Information Center

    Silver, Harold

    1982-01-01

    The history of higher education has paid scant attention to the attitudes and expectations of its customers: students and employers of graduates. Recent research on student and employer attitudes toward higher education sectors has not taken these expectations into account in the context of recent higher education history. (Author/MSE)

  2. FastStats: Life Expectancy

    MedlinePLUS

    ... in 2010? Life expectancy at age 25, by sex and education level, Health, United States, 2011, figure 32 [PDF - 9.8 MB] Life expectancy at birth, at 65 and 75 years of age by sex, race and Hispanic origin Health, United States 2014, ...

  3. Teacher Expectations and Classroom Behaviors.

    ERIC Educational Resources Information Center

    Wilkins, William E.

    To test the effect of teacher expectation on pupil achievement, subjects in 24 classrooms, grades 1-6, a) completed the Metropolitan Analysis of Learning Potential and the Metropolitan Achievement Test; b) completed a questionnaire regarding perceptions of teachers' differential expectations and treatment of students; and c) were ranked by their…

  4. A maximally supersymmetric Kondo model

    E-print Network

    Sarah Harrison; Shamit Kachru; Gonzalo Torroba

    2012-02-17

    We study the maximally supersymmetric Kondo model obtained by adding a fermionic impurity to N=4 supersymmetric Yang-Mills theory. While the original Kondo problem describes a defect interacting with a free Fermi liquid of itinerant electrons, here the ambient theory is an interacting CFT, and this introduces qualitatively new features into the system. The model arises in string theory by considering the intersection of a stack of M D5-branes with a stack of N D3-branes, at a point in the D3 worldvolume. We analyze the theory holographically, and propose a dictionary between the Kondo problem and antisymmetric Wilson loops in N=4 SYM. We perform an explicit calculation of the D5 fluctuations in the D3 geometry and determine the spectrum of defect operators. This establishes the stability of the Kondo fixed point together with its basic thermodynamic properties. Known supergravity solutions for Wilson loops allow us to go beyond the probe approximation: the D5s disappear and are replaced by three-form flux piercing a new topologically non-trivial three-sphere in the corrected geometry. This describes the Kondo model in terms of a geometric transition. A dual matrix model reflects the basic properties of the corrected gravity solution in its eigenvalue distribution.

  5. A Maximally Supersymmetric Kondo Model

    SciTech Connect

    Harrison, Sarah; Kachru, Shamit; Torroba, Gonzalo; /Stanford U., Phys. Dept. /SLAC

    2012-02-17

    We study the maximally supersymmetric Kondo model obtained by adding a fermionic impurity to N = 4 supersymmetric Yang-Mills theory. While the original Kondo problem describes a defect interacting with a free Fermi liquid of itinerant electrons, here the ambient theory is an interacting CFT, and this introduces qualitatively new features into the system. The model arises in string theory by considering the intersection of a stack of M D5-branes with a stack of N D3-branes, at a point in the D3 worldvolume. We analyze the theory holographically, and propose a dictionary between the Kondo problem and antisymmetric Wilson loops in N = 4 SYM. We perform an explicit calculation of the D5 fluctuations in the D3 geometry and determine the spectrum of defect operators. This establishes the stability of the Kondo fixed point together with its basic thermodynamic properties. Known supergravity solutions for Wilson loops allow us to go beyond the probe approximation: the D5s disappear and are replaced by three-form flux piercing a new topologically non-trivial S3 in the corrected geometry. This describes the Kondo model in terms of a geometric transition. A dual matrix model reflects the basic properties of the corrected gravity solution in its eigenvalue distribution.

  6. [What do psychiatric patients expect of inpatient psychiatric hospital treatment?].

    PubMed

    Fleischmann, Heribert

    2003-05-01

    Patients are mostly passive users of the health-care system. They are confronted with a supply of medical services and are allowed to show their satisfaction with it only retrospectively. In the future, our medical system has to develop from an effectiveness-oriented perspective toward a user-oriented medicine. Orientation toward users means asking patients about their expectations of the services offered. The aim of our investigation was to assess patients' subjective expectations before the beginning of inpatient treatment: 1. What do they think the disorder they are suffering from is called? 2. From which therapeutic measures do they expect help for themselves? 3. Do they want to play a part in planning therapeutic measures? At admission, 209 of 344 patients (61%) were willing to answer a self-designed questionnaire. Only 4% of the patients said that their disorder is called insanity. They preferred labels like mental illness (45%), somatic illness (43%) and mental health problem (42%). Pharmacological therapy was expected by 61% of the patients in total. Most often expected were drugs against depressive disorders (32%), drugs against addiction (31%) and tranquilizers (29%). Only 10% of the patients expected to get antipsychotic drugs. A verbal therapeutic intervention was expected by 76% of the patients. Talking with the doctor was the first-ranked desire (69%), followed by talking with the psychologist (60%), the nurses (58%) and fellow patients (56%). Psychotherapy in the narrower sense was expected by only 40% of the patients. Furthermore, privacy and recreation through walks ranked high among the expectations (69%), followed by relaxation (59%), occupational therapy (55%) and sports or active exercise therapy (54%). 75% of the patients wanted to be informed about the therapy, 69% wanted to cooperate in planning the therapy, and only 21% left the therapy entirely to the doctor. About one third of the patients expected a consultation with their relatives, their legal guardians and their family doctor. PMID:14509060

  7. Physical activity extends life expectancy

    Cancer.gov

    Leisure-time physical activity is associated with longer life expectancy, even at relatively low levels of activity and regardless of body weight, according to a study by a team of researchers led by the NCI.

  8. Status Characteristics and Expectation States 

    E-print Network

    Berger, Joseph; Cohen, Bernard P.; Zelditch, Morris Jr.

    2015-07-19

    Research for this paper was conducted with the support of NSF grant #G23990 for investigation of authority

  9. Specificity of a Maximal Step Exercise Test

    ERIC Educational Resources Information Center

    Darby, Lynn A.; Marsh, Jennifer L.; Shewokis, Patricia A.; Pohlman, Roberta L.

    2007-01-01

    To adhere to the principle of "exercise specificity," exercise testing should be completed using the same physical activity that is performed during exercise training. The present study was designed to assess whether aerobic step exercisers have a greater maximal oxygen consumption (VO2max) when tested using an activity specific, maximal step…

  10. An Information Maximization Model of Eye Movements

    E-print Network

    Coughlan, James M.

    We propose a sequential information maximization model as a general strategy for programming eye movements. The model reconstructs high-resolution visual information ... about the stimulus. By comparing our model performance to human eye movement data and to predictions ...

  11. Diurnal Variations in Maximal Oxygen Uptake.

    ERIC Educational Resources Information Center

    McClellan, Powell D.

    A study attempted to determine if diurnal (daily cyclical) variations were present during maximal exercise. The subjects' (30 female undergraduate physical education majors) oxygen consumption and heart rates were monitored while they walked on a treadmill on which the grade was raised every minute. Each subject was tested for maximal oxygen…

  12. Efficiently Mining Maximal Frequent Itemsets Karam Gouda

    E-print Network

    Fiat, Amos

    GenMax, a backtrack search based algorithm for mining maximal frequent itemsets. GenMax uses a number ... based on dataset characteristics. We found GenMax to be a highly efficient method to mine the exact set ...

  13. Statistical mechanics of maximal independent sets.

    PubMed

    Dall'Asta, Luca; Pin, Paolo; Ramezanpour, Abolfazl

    2009-12-01

    The graph theoretic concept of maximal independent set arises in several practical problems in computer science as well as in game theory. A maximal independent set is defined by the set of occupied nodes that satisfy some packing and covering constraints. It is known that finding minimum and maximum-density maximal independent sets are hard optimization problems. In this paper, we use the cavity method of statistical physics and Monte Carlo simulations to study the corresponding constraint satisfaction problem on random graphs. We obtain the entropy of maximal independent sets within the replica symmetric and one-step replica symmetry breaking frameworks, shedding light on the metric structure of the landscape of solutions and suggesting a class of possible algorithms. This is of particular relevance for the application to the study of strategic interactions in social and economic networks, where maximal independent sets correspond to pure Nash equilibria of a graphical game of public goods allocation. PMID:20365147
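    For readers unfamiliar with the combinatorial object itself, a short sketch of greedily computing one maximal (not maximum-cardinality) independent set is given below; this is standard background, not the cavity-method machinery used in the paper.

    ```python
    def greedy_maximal_independent_set(adj):
        """Return one maximal independent set of an undirected graph.

        adj maps each node to the set of its neighbours. The result is maximal
        (no further node can be added), which is weaker than maximum cardinality.
        """
        independent = set()
        blocked = set()
        for node in adj:                      # any node ordering works
            if node not in blocked:
                independent.add(node)
                blocked.add(node)
                blocked |= adj[node]          # neighbours can no longer be added
        return independent

    # Toy graph: a 5-cycle
    cycle5 = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 0}}
    print(greedy_maximal_independent_set(cycle5))  # {0, 2}: maximal (and maximum here)
    ```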

  14. Maximal independent sets for sparse graphs D. Eppstein, UC Irvine, SODA 2005 All Maximal Independent Sets

    E-print Network

    Eppstein, David

    Problem: list all maximal independent sets of an undirected graph. Previously known results: O(3^(n/3)) matches the lower bound on the maximum possible output size.

  15. Utility Green Pricing Programs: A Statistical Analysis of Program Effectiveness

    SciTech Connect

    Wiser, R.; Olson, S.; Bird, L.; Swezey, B.

    2004-02-01

    This report analyzes actual utility green pricing program data to provide further insight into which program features might help maximize both customer participation in green pricing programs and the amount of renewable energy purchased by customers in those programs.

  16. Biomass utilization

    SciTech Connect

    Cote, W.A.

    1983-01-01

    Proceedings are given of the NATO Advanced Study Institute on biomass utilization. The course was introduced by discussion of the basic concepts of biomass utilization. Then the raw material (forest biomass, agricultural resources, aquatic resources and municipal solid waste) was considered from the point of view of its availability, assessment, preparation and general suitability. The structure and chemical composition of the biomass were addressed by a number of speakers before the conversion methods were presented. Biological and thermochemical routes for conversion of biomass to energy, chemicals or food were discussed for several days as this is the main thrust of biomass utilization today. Finally, the engineering aspects and the economics of biomass utilization were taken up in order to examine the feasibility of the various elements that comprise this multidisciplinary field. Separate abstracts have been prepared for items within the scope of the Energy Data Base.

  17. Inflation in maximal gauged supergravities

    NASA Astrophysics Data System (ADS)

    Kodama, Hideo; Nozawa, Masato

    2015-05-01

    We discuss the dynamics of multiple scalar fields and the possibility of realistic inflation in the maximal gauged supergravity. In this paper, we address this problem in the framework of recently discovered 1-parameter deformation of SO(4,4) and SO(5,3) dyonic gaugings, for which the base point of the scalar manifold corresponds to an unstable de Sitter critical point. In the gauge-field frame where the embedding tensor takes the value in the sum of the {36} and {36'} representations of SL(8), we present a scheme that allows us to derive an analytic expression for the scalar potential. With the help of this formalism, we derive the full potential and gauge coupling functions in analytic forms for the SO(3)×SO(3)-invariant subsectors of SO(4,4) and SO(5,3) gaugings, and argue that there exist no new critical points in addition to those discovered so far. For the SO(4,4) gauging, we also study the behavior of 6-dimensional scalar fields in this sector near the Dall'Agata-Inverso de Sitter critical point at which the negative eigenvalue of the scalar mass square with the largest modulus goes to zero as the deformation parameter s approaches a critical value sc. We find that when the deformation parameter s is taken sufficiently close to the critical value, inflation lasts more than 60 e-folds even if the initial point of the inflaton allows an O(0.1) deviation in Planck units from the Dall'Agata-Inverso critical point. It turns out that the spectral index ns of the curvature perturbation at the time of the 60 e-folding number is always about 0.96 and within the 1σ range ns=0.9639±0.0047 obtained by Planck, irrespective of the value of the ? parameter at the critical saddle point. The tensor-scalar ratio predicted by this model is around 10^-3 and is close to the value in the Starobinsky model.

  18. Formation Control of the MAXIM L2 Libration Orbit Mission

    NASA Technical Reports Server (NTRS)

    Folta, David; Hartman, Kate; Howell, Kathleen; Marchand, Belinda

    2004-01-01

    The Micro-Arcsecond X-ray Imaging Mission (MAXIM), a proposed concept for the Structure and Evolution of the Universe (SEU) Black Hole Imager mission, is designed to make a ten million-fold improvement in X-ray image clarity of celestial objects by providing better than 0.1 micro-arcsecond imaging. Currently the mission architecture comprises 25 spacecraft, 24 as optics modules and one as the detector, which will form sparse sub-apertures of a grazing incidence X-ray interferometer covering the 0.3-10 keV bandpass. This formation must allow for long duration continuous science observations and also for reconfiguration that permits re-pointing of the formation. To achieve these mission goals, the formation is required to cooperatively point at desired targets. Once pointed, the individual elements of the MAXIM formation must remain stable, maintaining their relative positions and attitudes below a critical threshold. These pointing and formation stability requirements impact the control and design of the formation. In this paper, we provide analysis of control efforts that are dependent upon the stability and the configuration and dimensions of the MAXIM formation. We emphasize the utilization of natural motions in the Lagrangian regions to minimize the control efforts and we address continuous control via input feedback linearization (IFL). Results provide control cost, configuration options, and capabilities as guidelines for the development of this complex mission.

  19. Are One Man's Rags Another Man's Riches? Identifying Adaptive Expectations Using Panel Data

    ERIC Educational Resources Information Center

    Burchardt, Tania

    2005-01-01

    One of the motivations frequently cited by Sen and Nussbaum for moving away from a utility metric towards a capabilities framework is a concern about adaptive preferences or conditioned expectations. If utility is related to the satisfaction of aspirations or expectations, and if these are affected by the individual's previous experience of…

  20. Career Expectations of Accounting Students

    ERIC Educational Resources Information Center

    Elam, Dennis; Mendez, Francis

    2010-01-01

    The demographic make-up of accounting students is dramatically changing. This study sets out to measure how well the profession is ready to accommodate what may be very different needs and expectations of this new generation of students. Non-traditional students are becoming more and more of a tradition in the current college classroom.…

  1. Evaluation of Behavioral Expectation Scales.

    ERIC Educational Resources Information Center

    Zedeck, Sheldon; Baker, Henry T.

    Behavioral Expectation Scales developed by Smith and Kendall were evaluated. Results indicated slight interrater reliability between Head Nurses and Supervisors, moderate dependence among five performance dimensions, and correlation between two scales and tenure. Results are discussed in terms of procedural problems, critical incident problems,…

  2. Rehabilitation Professionals' Participation Intensity and Expectations of Transition Roles

    ERIC Educational Resources Information Center

    Oertle, Kathleen Marie

    2009-01-01

    In this mixed-methods study, an on-line survey and interviews were utilized to gather data regarding the level of participation and expectations rehabilitation professionals have of teachers, youth with disabilities, parents, and themselves during the transition process. The survey response rate was 73.0% (N = 46). Six were selected for interviews…

  3. Managing Milk Composition: Maximizing Rumen Function 

    E-print Network

    Stokes, Sandra R.; Jordan, Ellen R.; Looper, Mike; Waldner, Dan

    2000-12-11

    Feeding strategies that optimize rumen function also maximize milk production and milk component percentages and yield. This publication offers guidelines for feeding forage, grain protein and ration fiber to enhance rumen function....

  4. Random maximal isotropic subspaces and Selmer groups

    E-print Network

    Rains, Eric

    Under suitable hypotheses, we construct a probability measure on the set of closed maximal isotropic subspaces of a locally compact quadratic space over F_p. A random subspace chosen with respect to this measure ...

  5. When do nonlinear filters achieve maximal accuracy?

    E-print Network

    van Handel, Ramon

    2009-01-01

    The nonlinear filter for an ergodic signal observed in white noise is said to achieve maximal accuracy if the stationary filtering error vanishes as the signal to noise ratio diverges. We give a general characterization of the maximal accuracy property in terms of various systems theoretic notions. When the signal state space is a finite set explicit necessary and sufficient conditions are obtained, while the linear Gaussian case reduces to a classic result of Kwakernaak and Sivan (1972).

  6. Increasing inspection equipment productivity by utilizing factory automation SW on TeraScan 5XX systems

    NASA Astrophysics Data System (ADS)

    Jakubski, Thomas; Piechoncinski, Michal; Moses, Raphael; Bugata, Bharathi; Schmalfuss, Heiko; Köhler, Ines; Lisowski, Jan; Klobes, Jens; Fenske, Robert

    2009-01-01

    Especially for advanced masks the reticle inspection operation is a very significant cost factor, since it is a time consuming process and inspection tools are becoming disproportionately expensive. Analyzing and categorizing historical equipment utilization times of the reticle inspection tools however showed a significant amount of time which can be classified as non productive. In order to reduce the inspection costs the equipment utilization needed to be improved. The main contributors to non productive time were analyzed and several use cases identified, where automation utilizing a SECS1 equipment interface was expected to help to reduce these non productive times. The paper demonstrates how real time access to equipment utilization data can be applied to better control manufacturing resources. Scenarios are presented where remote monitoring and control of the inspection equipment can be used to avoid setup errors or save inspection time by faster response to problem situations. Additionally a solution to the second important need, the maximization of tool utilization in cases where not all of the intended functions are available, is explained. Both the models and the software implementation are briefly explained. For automation of the so called inspection strategy a new approach which allows separation of the business rules from the automation infrastructure was chosen. Initial results of inspection equipment performance data tracked through the SECS interface are shown. Furthermore a system integration overview is presented and examples of how the inspection strategy rules are implemented and managed are given.

  7. Understanding violations of Gricean maxims in preschoolers and adults

    PubMed Central

    Okanda, Mako; Asada, Kosuke; Moriguchi, Yusuke; Itakura, Shoji

    2015-01-01

    This study used a revised Conversational Violations Test to examine Gricean maxim violations in 4- to 6-year-old Japanese children and adults. Participants' understanding of the following maxims was assessed: be informative (first maxim of quantity), avoid redundancy (second maxim of quantity), be truthful (maxim of quality), be relevant (maxim of relation), avoid ambiguity (second maxim of manner), and be polite (maxim of politeness). Sensitivity to violations of Gricean maxims increased with age: 4-year-olds' understanding of maxims was near chance, 5-year-olds understood some maxims (first maxim of quantity and maxims of quality, relation, and manner), and 6-year-olds and adults understood all maxims. Preschoolers acquired the maxim of relation first and had the greatest difficulty understanding the second maxim of quantity. Children and adults differed in their comprehension of the maxim of politeness. The development of the pragmatic understanding of Gricean maxims and implications for the construction of developmental tasks from early childhood to adulthood are discussed. PMID:26191018

  8. Resources and energetics determined dinosaur maximal size

    PubMed Central

    McNab, Brian K.

    2009-01-01

    Some dinosaurs reached masses that were ≈8 times those of the largest, ecologically equivalent terrestrial mammals. The factors most responsible for setting the maximal body size of vertebrates are resource quality and quantity, as modified by the mobility of the consumer, and the vertebrate's rate of energy expenditure. If the food intake of the largest herbivorous mammals defines the maximal rate at which plant resources can be consumed in terrestrial environments and if that limit applied to dinosaurs, then the large size of sauropods occurred because they expended energy in the field at rates extrapolated from those of varanid lizards, which are ≈22% of the rates in mammals and 3.6 times the rates of other lizards of equal size. Of 2 species having the same energy income, the species that uses the most energy for mass-independent maintenance of necessity has a smaller size. The larger mass found in some marine mammals reflects a greater resource abundance in marine environments. The presumptively low energy expenditures of dinosaurs potentially permitted Mesozoic communities to support dinosaur biomasses that were up to 5 times those found in mammalian herbivores in Africa today. The maximal size of predatory theropods was ≈8 tons, which if it reflected the maximal capacity to consume vertebrates in terrestrial environments, corresponds in predatory mammals to a maximal mass less than a ton, which is what is observed. Some coelurosaurs may have evolved endothermy in association with the evolution of feathered insulation and a small mass. PMID:19581600

  9. Maximal Holevo Quantity Based on Weak Measurements.

    PubMed

    Wang, Yao-Kun; Fei, Shao-Ming; Wang, Zhi-Xi; Cao, Jun-Peng; Fan, Heng

    2015-01-01

    The Holevo bound is a keystone in many applications of quantum information theory. We propose the "maximal Holevo quantity for weak measurements" as a generalization of the maximal Holevo quantity, which is defined by optimal projective measurements. Weak measurements are necessary in scenarios where only weak measurements can be performed, for example because the system is macroscopic, or where one intentionally uses them so that the disturbance on the measured system can be controlled, for example in quantum key distribution protocols. We systematically evaluate the maximal Holevo quantity for weak measurements for Bell-diagonal states and find a series of results. Furthermore, we find that weak measurements can be realized by noise and projective measurements. PMID:26090962

  10. An information maximization model of eye movements

    NASA Technical Reports Server (NTRS)

    Renninger, Laura Walker; Coughlan, James; Verghese, Preeti; Malik, Jitendra

    2005-01-01

    We propose a sequential information maximization model as a general strategy for programming eye movements. The model reconstructs high-resolution visual information from a sequence of fixations, taking into account the fall-off in resolution from the fovea to the periphery. From this framework we get a simple rule for predicting fixation sequences: after each fixation, fixate next at the location that minimizes uncertainty (maximizes information) about the stimulus. By comparing our model performance to human eye movement data and to predictions from a saliency and random model, we demonstrate that our model is best at predicting fixation locations. Modeling additional biological constraints will improve the prediction of fixation sequences. Our results suggest that information maximization is a useful principle for programming eye movements.
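
    As a concrete illustration of the "fixate where uncertainty is greatest" rule described above, the sketch below implements a greedy information-maximizing fixation chooser under simplifying assumptions (independent Gaussian beliefs over image patches and observation noise that grows with eccentricity); the function name and the noise model are illustrative, not taken from the paper.

      import numpy as np

      def next_fixation(belief_var, positions, fovea_sigma=1.0):
          # belief_var: current posterior variance for each image patch
          # positions:  (N, 2) array of patch coordinates
          best_gain, best_idx = -np.inf, 0
          for i, pos in enumerate(positions):
              # Observation noise grows with distance (eccentricity) from the fixation point.
              ecc = np.linalg.norm(positions - pos, axis=1)
              obs_var = (fovea_sigma * (1.0 + ecc)) ** 2
              # Posterior variance of each patch after a Gaussian observation from this fixation.
              post_var = 1.0 / (1.0 / belief_var + 1.0 / obs_var)
              # Expected information gain = reduction in total Gaussian entropy.
              gain = 0.5 * np.sum(np.log(belief_var / post_var))
              if gain > best_gain:
                  best_gain, best_idx = gain, i
          return best_idx

      # Example: 100 patches on a 10x10 grid, all equally uncertain at first;
      # the greedy rule then favors a central fixation.
      grid = np.stack(np.meshgrid(np.arange(10), np.arange(10)), axis=-1).reshape(-1, 2).astype(float)
      print(next_fixation(np.ones(100), grid))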

  11. Energy Band Calculations for Maximally Even Superlattices

    NASA Astrophysics Data System (ADS)

    Krantz, Richard; Byrd, Jason

    2007-03-01

    Superlattices are multiple-well, semiconductor heterostructures that can be described by one-dimensional potential wells separated by potential barriers. We refer to a distribution of wells and barriers based on the theory of maximally even sets as a maximally even superlattice. The prototypical example of a maximally even set is the distribution of white and black keys on a piano keyboard. Black keys may represent wells and the white keys represent barriers. As the number of wells and barriers increase, efficient and stable methods of calculation are necessary to study these structures. We have implemented a finite-element method using the discrete variable representation (FE-DVR) to calculate E versus k for these superlattices. Use of the FE-DVR method greatly reduces the amount of calculation necessary for the eigenvalue problem.
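
    The piano-key analogy above has a simple computational counterpart. The sketch below generates a maximally even 0/1 pattern of wells and barriers using one standard construction (the Clough-Douthett floor-function rule); whether this matches the authors' exact convention for building superlattices is an assumption made for illustration.

      def maximally_even(n_sites, n_wells):
          # Spread n_wells wells over n_sites positions as evenly as possible
          # (Clough-Douthett style construction; the result is unique up to rotation).
          wells = {(i * n_sites) // n_wells for i in range(n_wells)}
          return [1 if s in wells else 0 for s in range(n_sites)]

      # 7 "white keys" among 12 positions reproduces a rotation of the diatonic pattern.
      print(maximally_even(12, 7))   # [1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 1, 0]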

  12. Malaria epidemic expected in Mozambique.

    PubMed

    Sidley, P

    2000-03-11

    Health experts fear epidemics of several infectious diseases in Mozambique as floods recede and mosquitoes begin breeding. According to Pierre Kahozi of WHO, malaria is already endemic in the region but there are fears that a much greater outbreak might occur. Scores of suspected cases of cholera were reported and more are expected, along with cases of other diarrheal conditions. Neil Cameron, chief director of communicable diseases at the health department in South Africa, said that more cases are expected within a month when the breeding cycle of mosquitoes is renewed. He reported that the number of malaria cases in South Africa increased from 12,000 in 1995 to 50,000 in 1999, and a number of people had been dying from this disease. The increase could be attributed partly to climatic changes and resistance to certain drugs. DDT had been used in the past to control mosquitoes, and it's possible that it will be used again in Mozambique. The issues involved in tackling malaria are now being considered as part of a special development initiative on infectious diseases that is being undertaken jointly by the health departments of three countries: South Africa, Mozambique, and Swaziland. PMID:10710569

  13. Price of anarchy is maximized at the percolation threshold

    NASA Astrophysics Data System (ADS)

    Skinner, Brian

    2015-05-01

    When many independent users try to route traffic through a network, the flow can easily become suboptimal as a consequence of congestion of the most efficient paths. The degree of this suboptimality is quantified by the so-called price of anarchy (POA), but so far there are no general rules for when to expect a large POA in a random network. Here I address this question by introducing a simple model of flow through a network with randomly placed congestible and incongestible links. I show that the POA is maximized precisely when the fraction of congestible links matches the percolation threshold of the lattice. Both the POA and the total cost demonstrate critical scaling near the percolation threshold.
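
    For readers unfamiliar with the metric, the price of anarchy is the ratio of the cost of selfish (Nash) routing to the cost of socially optimal routing; the classic two-link Pigou example below (a textbook illustration, not taken from this paper) shows how congestion of the efficient path makes the ratio exceed 1.

      \mathrm{POA} = \frac{\text{cost of worst Nash flow}}{\text{cost of optimal flow}} \;\ge\; 1.
      \text{Pigou's example: one unit of traffic, two parallel links with } c_1(x) = 1,\; c_2(x) = x.
      \text{Nash: all traffic on link 2, } C_{\mathrm{Nash}} = 1 \cdot 1 = 1.
      \text{Optimum: split evenly, } C_{\mathrm{opt}} = \tfrac12 \cdot \tfrac12 + \tfrac12 \cdot 1 = \tfrac34.
      \mathrm{POA} = \frac{1}{3/4} = \frac{4}{3}.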

  14. On a nonstandard Brownian motion and its maximal function

    NASA Astrophysics Data System (ADS)

    Andrade, Bernardo B. de

    2015-07-01

    This article uses Radically Elementary Probability Theory (REPT) to prove results about the Wiener walk (the radically elementary Brownian motion) without the technical apparatus required by stochastic integration. The techniques used replace measure-theoretic tools by discrete probability and the rigorous use of infinitesimals. Specifically, REPT is applied to the results in Palacios (The American Statistician, 2008) to calculate certain expectations related to the Wiener walk and its maximal function. Because Palacios uses mostly combinatorics and no measure theory his results carry over through REPT with minimal changes. The paper also presents a construction of the Wiener walk which is intended to mimic the construction of Brownian motion from "continuous" white noise. A brief review of the nonstandard model on which REPT is based is given in the Appendix in order to minimize the need for previous exposure to the subject.

  15. Graded expectations: Predictive processing and the adjustment of expectations during spoken language comprehension.

    PubMed

    Boudewyn, Megan A; Long, Debra L; Swaab, Tamara Y

    2015-09-01

    The goal of this study was to investigate the use of the local and global contexts for incoming words during listening comprehension. Local context was manipulated by presenting a target noun (e.g., "cake," "veggies") that was preceded by a word that described a prototypical or atypical feature of the noun (e.g., "sweet," "healthy"). Global context was manipulated by presenting the noun in a scenario that was consistent or inconsistent with the critical noun (e.g., a birthday party). Event-related potentials (ERPs) were examined at the feature word and at the critical noun. An N400 effect was found at the feature word, reflecting the effect of compatibility with the global context. Global predictability and the local feature word consistency interacted at the critical noun: A larger N200 was found to nouns that mismatched predictions when the context was maximally constraining, relative to nouns in the other conditions. A graded N400 response was observed at the critical noun, modulated by global predictability and feature consistency. Finally, post-N400 positivity effects of context updating were observed to nouns that were supported by one contextual cue (global/local) but were unsupported by the other. These results indicate that (1) incoming words that are compatible with context-based expectations receive a processing benefit; (2) when the context is sufficiently constraining, specific lexical items may be activated; and (3) listeners dynamically adjust their expectations when input is inconsistent with their predictions, provided that the inconsistency has some level of support from either the global or the local context. PMID:25673006

  16. Maximize crude unit No. 2 oil yield design and operation

    SciTech Connect

    Sloley, A.W.

    1997-05-01

    Recent refinery industry trends are to optimize crude unit operation with advanced control technology such as real-time-optimization. One potential crude unit optimization objective could be maximizing diesel product yields and minimizing the quantity of diesel boiling range material in the FCC feed. Appropriately designed advanced process control technology for a crude unit can be used to fully utilize existing equipment performance. The advanced process control scheme (or operator) can adjust the appropriate process variables to optimize the diesel yields against the current unit limitations. Process and equipment design changes may nevertheless be required to fully implement the diesel product optimization, depending on the crude unit equipment limitations. Therefore, crude unit process variable optimization and potential equipment design issues should be carefully addressed. While each refinery crude unit's design, operation and equipment constraints are different, the fundamental operating variables and the process and equipment design issues are common to all crude units.

  17. Uplink Array Calibration via Far-Field Power Maximization

    NASA Technical Reports Server (NTRS)

    Vilnrotter, V.; Mukai, R.; Lee, D.

    2006-01-01

    Uplink antenna arrays have the potential to greatly increase the Deep Space Network's high-data-rate uplink capabilities as well as useful range, and to provide additional uplink signal power during critical spacecraft emergencies. While techniques for calibrating an array of receive antennas have been addressed previously, proven concepts for uplink array calibration have yet to be demonstrated. This article describes a method of utilizing the Moon as a natural far-field reflector for calibrating a phased array of uplink antennas. Using this calibration technique, the radio frequency carriers transmitted by each antenna of the array are optimally phased to ensure that the uplink power received by the spacecraft is maximized.
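
    The phase-alignment step described above can be pictured with a small coordinate-ascent sketch: scan each element's commanded phase, keep the value that maximizes the power returned from the far-field reflector, and repeat. The signal model and the scan granularity here are illustrative assumptions, not the documented DSN procedure.

      import numpy as np

      def calibrate_phases(element_phase_errors, n_steps=32, n_sweeps=3):
          # element_phase_errors: unknown per-antenna phase errors (radians) to be compensated.
          n = len(element_phase_errors)
          commanded = np.zeros(n)

          def far_field_power(cmd):
              # Power of the coherent sum of unit-amplitude carriers at the far-field point.
              return np.abs(np.exp(1j * (element_phase_errors + cmd)).sum()) ** 2

          trial_phases = np.linspace(0.0, 2.0 * np.pi, n_steps, endpoint=False)
          for _ in range(n_sweeps):                # repeat sweeps until the phases settle
              for k in range(n):                   # scan one antenna at a time
                  powers = []
                  for phi in trial_phases:
                      cmd = commanded.copy()
                      cmd[k] = phi
                      powers.append(far_field_power(cmd))
                  commanded[k] = trial_phases[int(np.argmax(powers))]
          return commanded, far_field_power(commanded)

      rng = np.random.default_rng(1)
      errors = rng.uniform(0.0, 2.0 * np.pi, size=4)
      cmd, power = calibrate_phases(errors)
      print(f"power after calibration: {power:.2f} (ideal coherent power: {4**2})")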

  18. Expectation-Based Control of Noise and Chaos

    NASA Technical Reports Server (NTRS)

    Zak, Michael

    2006-01-01

    A proposed approach to control of noise and chaos in dynamic systems would supplement conventional methods. The approach is based on fictitious forces composed of expectations governed by Fokker-Planck or Liouville equations that describe the evolution of the probability densities of the controlled parameters. These forces would be utilized as feedback control forces that would suppress the undesired diffusion of the controlled parameters. Examples of dynamic systems in which the approach is expected to prove beneficial include spacecraft, electronic systems, and coupled lasers.

  19. PRODCOST: an electric utility generation simulation code

    SciTech Connect

    Hudson, II, C. R.; Reynolds, T. M.; Smolen, G. R.

    1981-02-01

    The PRODCOST computer code simulates the operation of an electric utility generation system. Through a probabilistic simulation the expected energy production, fuel consumption, and cost of operation for each plant are determined. Total system fuel consumption, energy generation by type, total generation costs, as well as system loss of load probability and expected unserved energy are also calculated.
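
    A toy Monte Carlo sketch of the kind of probabilistic quantities the abstract lists (loss-of-load probability and expected unserved energy) is given below; the unit data and the sampling approach are illustrative assumptions, not the actual PRODCOST algorithm, which the abstract does not detail.

      import numpy as np

      rng = np.random.default_rng(0)

      # Illustrative generating units: (capacity in MW, forced-outage rate).
      units = [(400, 0.08), (300, 0.10), (200, 0.05), (150, 0.12)]
      load = np.array([600.0, 750.0, 820.0, 700.0, 640.0])  # illustrative hourly loads, MW

      n_trials = 20000
      shortfall = np.zeros((n_trials, load.size))
      for t in range(n_trials):
          for h, demand in enumerate(load):
              # Each unit is independently available or on forced outage this hour.
              available = sum(cap for cap, q in units if rng.random() > q)
              shortfall[t, h] = max(demand - available, 0.0)

      lolp = np.mean(shortfall > 0)            # fraction of hours with unserved load
      eue = shortfall.sum(axis=1).mean()       # expected unserved energy per period, MWh
      print(f"LOLP ~ {lolp:.3f}, expected unserved energy ~ {eue:.1f} MWh")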

  20. Supersymmetric generalization of the maximal Abelian gauge

    E-print Network

    M. A. L. Capri; H. C. Toledo; J. A. Helayël-Neto

    2015-04-17

    We present an N=1 supersymmetric generalization of the Maximal Abelian Gauge (MAG) in its superfield and component-field formulations, and discuss its Faddeev-Popov quantization, the associated BRST symmetry and the corresponding Slavnov-Taylor identity in superspace approach.

  1. Landscapes of the Maximal Constraint Satisfaction Problem

    E-print Network

    Hao, Jin-Kao

    Landscapes of the Maximal Constraint Satisfaction Problem. Meriema Belaidouni and Jin-Kao Hao. Landscape is an important notion to study the difficulty of a combinatorial problem and the behavior of heuristics. In this paper, two new measures for landscape analysis are introduced. These measures are based

  2. How to Generate Good Profit Maximization Problems

    ERIC Educational Resources Information Center

    Davis, Lewis

    2014-01-01

    In this article, the author considers the merits of two classes of profit maximization problems: those involving perfectly competitive firms with quadratic and cubic cost functions. While relatively easy to develop and solve, problems based on quadratic cost functions are too simple to address a number of important issues, such as the use of…

  3. More Is Better: Maximizing Natural Learning Opportunities.

    ERIC Educational Resources Information Center

    Jung, Lee Ann

    2003-01-01

    This article discusses the increased emphasis on providing early intervention services within natural environments and how to maximize natural learning opportunities by using everyday activities that children experience and embedding intervention in daily routines. Guidelines for deciding the frequency of services, changing services, and the level…

  4. Maximizing the Spectacle of Water Fountains

    ERIC Educational Resources Information Center

    Simoson, Andrew J.

    2009-01-01

    For a given initial speed of water from a spigot or jet, what angle of the jet will maximize the visual impact of the water spray in the fountain? This paper focuses on fountains whose spigots are arranged in circular fashion, and couches the measurement of the visual impact in terms of the surface area and the volume under the fountain's natural…

  5. Amoebas of maximal area. Grigory Mikhalkin

    E-print Network

    Mikhalkin, Grigory

    Amoebas of maximal area. Grigory Mikhalkin, Department of Mathematics, University of Utah, Salt Lake City. To a curve A in (C*)^2 one may associate a closed infinite region A in R^2 called the amoeba of A. The amoebas of different curves of the same degree come in different shapes and sizes. All amoebas in (R*)^2

  6. Does evolution lead to maximizing behavior?

    PubMed

    Lehmann, Laurent; Alger, Ingela; Weibull, Jörgen

    2015-07-01

    A long-standing question in biology and economics is whether individual organisms evolve to behave as if they were striving to maximize some goal function. We here formalize this "as if" question in a patch-structured population in which individuals obtain material payoffs from (perhaps very complex multimove) social interactions. These material payoffs determine personal fitness and, ultimately, invasion fitness. We ask whether individuals in uninvadable population states will appear to be maximizing conventional goal functions (with population-structure coefficients exogenous to the individual's behavior), when what is really being maximized is invasion fitness at the genetic level. We reach two broad conclusions. First, no simple and general individual-centered goal function emerges from the analysis. This stems from the fact that invasion fitness is a gene-centered multigenerational measure of evolutionary success. Second, when selection is weak, all multigenerational effects of selection can be summarized in a neutral type-distribution quantifying identity-by-descent between individuals within patches. Individuals then behave as if they were striving to maximize a weighted sum of material payoffs (own and others). At an uninvadable state it is as if individuals would freely choose their actions and play a Nash equilibrium of a game with a goal function that combines self-interest (own material payoff), group interest (group material payoff if everyone does the same), and local rivalry (material payoff differences). PMID:26082379

  7. General Database Statistics Using Entropy Maximization

    E-print Network

    Suciu, Dan

    General Database Statistics Using Entropy Maximization. Raghav Kaushik, Christopher Ré, and Dan Suciu. ... towards such a general, principled theory of how optimizers should make use of statistics. The key object of our study is a statistical program, which is a set of pairs (v, d), where v

  8. Protein Function Prediction Using Dependence Maximization

    E-print Network

    Domeniconi, Carlotta

    Protein Function Prediction Using Dependence Maximization. Guoxian Yu and Carlotta Domeniconi. Abstract: Protein function prediction is one of the fundamental tasks in the post-genomic era. The vast amount of available proteomic data makes it possible to computationally annotate proteins. Most computa

  9. Γ-Limit for Transition Paths of Maximal Probability

    E-print Network

    Theil, Florian

    Γ-Limit for Transition Paths of Maximal Probability. F.J. Pinski, Physics Department. The computations in [5] were performed at low temperature over fixed long intervals. To gain mathematical insight ... the temperature is small and the transition time scales as the inverse temperature. In this paper

  10. Seamless Kernel Updates Maxim Siniavine, Ashvin Goel

    E-print Network

    Goel, Ashvin

    Seamless Kernel Updates. Maxim Siniavine, Ashvin Goel, University of Toronto. Abstract: ... component of the software stack that require significant time for updates and lose state after an update. Existing kernel update systems work at varying granularity. Dynamic patching performs updates at function

  11. Maximal rank of extremal marginal tracial states

    SciTech Connect

    Ohno, Hiromichi

    2010-09-15

    States on the coupled quantum system M_n(C) x M_n(C) whose restrictions to each subsystem are the normalized traces are called marginal tracial states. We investigate extremal marginal tracial states and compute their maximal rank. Diagonal marginal tracial states are also considered.

  12. Maximizing the Motivated Mind for Emergent Giftedness.

    ERIC Educational Resources Information Center

    Rea, Dan

    2001-01-01

    This article explains how the theory of the motivated mind conceptualizes the productive interaction of intelligence, creativity, and achievement motivation and how this theory can help educators to maximize students' emergent potential for giftedness. It discusses the integration of cold-order thinking and hot-chaotic thinking into fluid-adaptive…

  13. Supply Chain Network Design Under Profit Maximization

    E-print Network

    Nagurney, Anna

    Supply Chain Network Design Under Profit Maximization and Oligopolistic Competition. Anna Nagurney. Slide topics include examples of supply chains, food supply chains, and high-tech products.

  14. Maximal dinucleotide comma-free codes.

    PubMed

    Fimmel, Elena; Strüngmann, Lutz

    2016-01-21

    The problem of retrieval and maintenance of the correct reading frame plays a significant role in RNA transcription. Circular codes, and especially comma-free codes, can help to understand the underlying mechanisms of error-detection in this process. In recent years much attention has been paid to the investigation of trinucleotide circular codes (see, for instance, Fimmel et al., 2014; Fimmel and Strüngmann, 2015a; Michel and Pirillo, 2012; Michel et al., 2012, 2008), while dinucleotide codes had been touched on only marginally, even though dinucleotides are associated with important biological functions. Recently, all maximal dinucleotide circular codes were classified (Fimmel et al., 2015; Michel and Pirillo, 2013). The present paper studies maximal dinucleotide comma-free codes and their close connection to maximal dinucleotide circular codes. We give a construction principle for such codes and provide a graphical representation that allows them to be visualized geometrically. Moreover, we compare the results for dinucleotide codes with the corresponding situation for trinucleotide maximal self-complementary C(3)-codes. Finally, the results obtained are discussed with respect to Crick's hypothesis about frame-shift-detecting codes without commas. PMID:26562635

  15. Introduction Proper irrigation timing can maximize sugar-

    E-print Network

    O'Laughlin, Jay

    Introduction: Proper irrigation timing can maximize sugarbeet yields while minimizing disease, especially with furrow irrigation. Root diseases such as rhizomania and rhizoctonia root and crown rots will be reduced. Unnecessary irrigations can be reduced if growers use information on water status at deeper

  16. Maximizing Human Capital by Developing Multicultural Competence.

    ERIC Educational Resources Information Center

    Shaffer, Leigh S.

    1998-01-01

    Examines the growing demand for multicultural competence in college graduates, describes the course content and academic-advising activities recommended to develop it, and comments on the limits and inherent dangers of providing multicultural exposure universally. Academic advisors are urged to help students maximize their human capital by adding…

  17. Ehrenfest's Lottery--Time and Entropy Maximization

    ERIC Educational Resources Information Center

    Ashbaugh, Henry S.

    2010-01-01

    Successful teaching of the Second Law of Thermodynamics suffers from limited simple examples linking equilibrium to entropy maximization. I describe a thought experiment connecting entropy to a lottery that mixes marbles amongst a collection of urns. This mixing obeys diffusion-like dynamics. Equilibrium is achieved when the marble distribution is…

  18. Comparing maximal mean values on different scales

    E-print Network

    Thomas Havenith; Sebastian Scholtes

    2015-01-20

    When computing the average speed of a car over different time periods from given GPS data, it is conventional wisdom that the maximal average speed over all time intervals of fixed length decreases if the interval length increases. However, this intuition is wrong. We investigate this phenomenon and make rigorous in which sense this intuition is still true.
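
    The point is easy to reproduce numerically. The sketch below uses an illustrative speed profile (fast, stopped, then fast again), not data from the paper, and shows that the best 3-hour average can exceed the best 2-hour average.

      import numpy as np

      dt = 0.001
      t = np.arange(0.0, 3.0, dt)
      # Illustrative speed profile: 1 on [0, 1), 0 on [1, 2), 1 on [2, 3).
      speed = np.where((t < 1.0) | (t >= 2.0), 1.0, 0.0)

      def max_window_average(values, window_len, dt):
          # Maximal average of `values` over all contiguous windows of duration window_len.
          w = int(round(window_len / dt))
          csum = np.concatenate(([0.0], np.cumsum(values)))
          sums = csum[w:] - csum[:-w]
          return sums.max() / w

      print(max_window_average(speed, 2.0, dt))  # ~0.5
      print(max_window_average(speed, 3.0, dt))  # ~0.667 -- larger despite the longer window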

  19. A Model of College Tuition Maximization

    ERIC Educational Resources Information Center

    Bosshardt, Donald I.; Lichtenstein, Larry; Zaporowski, Mark P.

    2009-01-01

    This paper develops a series of models for optimal tuition pricing for private colleges and universities. The university is assumed to be a profit maximizing, price discriminating monopolist. The enrollment decision of students is stochastic in nature. The university offers an effective tuition rate, comprised of stipulated tuition less financial…

  20. Maximize Retirement Income and Preserve Accumulated Wealth

    E-print Network

    Mootha, Vamsi K.

    Maximize Retirement Income and Preserve Accumulated Wealth. Steven M. Dalton, CFP®. Rules for spousal benefits: the spousal benefit is based on the worker's PIA if started at full retirement age. Example: John's PIA is $2,000 and Jane's PIA is $800; if Jane applies at FRA, her benefit will be $1,000 (50% of John's PIA).

  1. North Dome decision expected soon

    SciTech Connect

    Not Available

    1981-08-01

    Decisions soon will be made which will set in motion the development of Qatar's huge North Dome gas field. The government and the state company, Qatar General Petroleum Corp. (QGPC), are studying the results of 2 feasibility studies on the economics of LNG export, although initially North Dome exploitation will be aimed at the domestic market. Decisions on the nature and timing of the North Dome development are the most important that have had to be faced in the short 10-yr history of the small Gulf state. The country's oil production is currently running at approximately 500,000 bpd, with 270,000 bpd originating from 3 offshore fields. Output is expected to decline through 1990, and it generally is accepted that there is little likelihood of further major crude discoveries. Therefore, Qatar has to begin an adjustment from an economy based on oil to one based on gas, while adhering to the underlying tenets of long-term conservation and industrial diversification.

  2. Space Utilization Initiative

    E-print Network

    Space Utilization Initiative, July 2010. Executive Committee: SVP Jones, VP O'Brien, CFO Pfutzenreuter. Team Members: Space Management. Charge: improve the utilization of University space

  3. Expected geoneutrino signal at JUNO

    NASA Astrophysics Data System (ADS)

    Strati, Virginia; Baldoncini, Marica; Callegari, Ivan; Mantovani, Fabio; McDonough, William F.; Ricci, Barbara; Xhixha, Gerti

    2015-12-01

    Constraints on the Earth's composition and on its radiogenic energy budget come from the detection of geoneutrinos. The Kamioka Liquid scintillator Antineutrino Detector (KamLAND) and Borexino experiments recently reported the geoneutrino flux, which reflects the amount and distribution of U and Th inside the Earth. The Jiangmen Underground Neutrino Observatory (JUNO) neutrino experiment, designed as a 20 kton liquid scintillator detector, will be built in an underground laboratory in South China about 53 km from the Yangjiang and Taishan nuclear power plants, each one having a planned thermal power of approximately 18 GW. Given the large detector mass and the intense reactor antineutrino flux, JUNO aims not only to collect high statistics antineutrino signals from reactors but also to address the challenge of discriminating the geoneutrino signal from the reactor background. The predicted geoneutrino signal at JUNO is terrestrial neutrino unit (TNU), based on the existing reference Earth model, with the dominant source of uncertainty coming from the modeling of the compositional variability in the local upper crust that surrounds (out to approximately 500 km) the detector. A special focus is dedicated to the 6° × 4° local crust surrounding the detector, which is estimated to contribute 44% of the signal. On the basis of a worldwide reference model for reactor antineutrinos, the ratio between reactor antineutrino and geoneutrino signals in the geoneutrino energy window is estimated to be 0.7 considering reactors operating in year 2013 and reaches a value of 8.9 by adding the contribution of the future nuclear power plants. In order to extract useful information about the mantle's composition, a refinement of the abundance and distribution of U and Th in the local crust is required, with particular attention to the geochemical characterization of the accessible upper crust, where 47% of the expected geoneutrino signal originates and which contributes the major source of uncertainty.

  4. Combustion Research Aboard the ISS Utilizing the Combustion Integrated Rack and Microgravity Science Glovebox

    NASA Technical Reports Server (NTRS)

    Sutliff, Thomas J.; Otero, Angel M.; Urban, David L.

    2002-01-01

    The Physical Sciences Research Program of NASA sponsors a broad suite of peer-reviewed research investigating fundamental combustion phenomena and applied combustion research topics. This research is performed through both ground-based and on-orbit research capabilities. The International Space Station (ISS) and two facilities, the Combustion Integrated Rack and the Microgravity Science Glovebox, are key elements in the execution of microgravity combustion flight research planned for the foreseeable future. This paper reviews the Microgravity Combustion Science research planned for the International Space Station implemented from 2003 through 2012. Examples of selected research topics, expected outcomes, and potential benefits will be provided. This paper also summarizes a multi-user hardware development approach, recapping the progress made in preparing these research hardware systems. Within the description of this approach, an operational strategy is presented that illustrates how utilization of constrained ISS resources may be maximized dynamically to increase science through design decisions made during hardware development.

  5. Hamiltonian formalism and path entropy maximization

    NASA Astrophysics Data System (ADS)

    Davis, Sergio; González, Diego

    2015-10-01

    Maximization of the path information entropy is a clear prescription for constructing models in non-equilibrium statistical mechanics. Here it is shown that, following this prescription under the assumption of arbitrary instantaneous constraints on position and velocity, a Lagrangian emerges which determines the most probable trajectory. Deviations from the probability maximum can be consistently described as slices in time by a Hamiltonian, according to a nonlinear Langevin equation and its associated Fokker-Planck equation. The connections unveiled between the maximization of path entropy and the Langevin/Fokker-Planck equations imply that missing information about the phase space coordinate never decreases in time, a purely information-theoretical version of the second law of thermodynamics. All of these results are independent of any physical assumptions, and thus valid for any generalized coordinate as a function of time, or any other parameter. This reinforces the view that the second law is a fundamental property of plausible inference.

  6. Consistent 4-form fluxes for maximal supergravity

    NASA Astrophysics Data System (ADS)

    Godazgar, Hadi; Godazgar, Mahdi; Krüger, Olaf; Nicolai, Hermann

    2015-10-01

    We derive new ansätze for the 4-form field strength of D = 11 supergravity corresponding to uplifts of four-dimensional maximal gauged supergravity. In particular, the ansätze directly yield the components of the 4-form field strength in terms of the scalars and vectors of the four-dimensional maximal gauged supergravity — in this way they provide an explicit uplift of all four-dimensional consistent truncations of D = 11 supergravity. The new ansätze provide a substantially simpler method for uplifting d = 4 flows compared to the previously available method using the 3-form and 6-form potential ansätze. The ansatz for the Freund-Rubin term allows us to conjecture a `master formula' for the latter in terms of the scalar potential of d = 4 gauged supergravity and its first derivative. We also resolve a long-standing puzzle concerning the antisymmetry of the flux obtained from uplift ansätze.

  7. Paracrine communication maximizes cellular response fidelity in wound signaling

    PubMed Central

    Handly, L Naomi; Pilko, Anna; Wollman, Roy

    2015-01-01

    Population averaging due to paracrine communication can arbitrarily reduce cellular response variability. Yet, variability is ubiquitously observed, suggesting limits to paracrine averaging. It remains unclear whether and how biological systems may be affected by such limits of paracrine signaling. To address this question, we quantify the signal and noise of Ca2+ and ERK spatial gradients in response to an in vitro wound within a novel microfluidics-based device. We find that while paracrine communication reduces gradient noise, it also reduces the gradient magnitude. Accordingly we predict the existence of a maximum gradient signal to noise ratio. Direct in vitro measurement of paracrine communication verifies these predictions and reveals that cells utilize optimal levels of paracrine signaling to maximize the accuracy of gradient-based positional information. Our results demonstrate the limits of population averaging and show the inherent tradeoff in utilizing paracrine communication to regulate cellular response fidelity. DOI: http://dx.doi.org/10.7554/eLife.09652.001 PMID:26448485

  8. Postactivation Potentiation Biases Maximal Isometric Strength Assessment

    PubMed Central

    Lima, Leonardo Coelho Rabello; Oliveira, Felipe Bruno Dias; Oliveira, Thiago Pires; Assumpção, Claudio de Oliveira; Greco, Camila Coelho; Cardozo, Adalgiso Croscato; Denadai, Benedito Sérgio

    2014-01-01

    Postactivation potentiation (PAP) is known to enhance force production. Maximal isometric strength assessment protocols usually consist of two or more maximal voluntary isometric contractions (MVCs). The objective of this study was to determine if PAP would influence isometric strength assessment. Healthy male volunteers (n = 23) performed two five-second MVCs separated by a 180-second interval. Changes in isometric peak torque (IPT), time to achieve it (tPTI), contractile impulse (CI), root mean square of the electromyographic signal during PTI (RMS), and rate of torque development (RTD), in different intervals, were measured. Significant increases in IPT (240.6 ± 55.7 N·m versus 248.9 ± 55.1 N·m), RTD (746 ± 152 N·m·s^-1 versus 727 ± 158 N·m·s^-1), and RMS (59.1 ± 12.2% RMSmax versus 54.8 ± 9.4% RMSmax) were found on the second MVC. tPTI decreased significantly on the second MVC (2373 ± 1200 ms versus 2784 ± 1226 ms). We conclude that a first MVC leads to PAP that elicits significant enhancements in strength-related variables of a second MVC performed 180 seconds later. If disregarded, this phenomenon might bias maximal isometric strength assessment, overestimating some of these variables. PMID:25133157

  9. Splitting an Arbitrary Two-qubit State Via a Seven-qubit Maximally Entangled State

    NASA Astrophysics Data System (ADS)

    Chen, Yan

    2015-05-01

    We investigate the usefulness of a recently introduced seven-qubit maximally entangled state by Zha et al. (J. Phys. A: Math. Theor. 45:255-302, [10]) for quantum information splitting. It is shown that such a seven-qubit entangled state can be utilized for quantum information splitting of an arbitrary two-qubit state by performing only the Bell-state measurements.

  10. Race Gap in Life Expectancy Is Narrowing

    MedlinePLUS

    Race Gap in Life Expectancy Is Narrowing: U.S. Study. Difference is now ... Black Americans are catching up to whites in life expectancy -- largely due to declining rates of death ...

  11. Siting Samplers to Minimize Expected Time to Detection

    SciTech Connect

    Walter, Travis; Lorenzetti, David M.; Sohn, Michael D.

    2012-05-02

    We present a probabilistic approach to designing an indoor sampler network for detecting an accidental or intentional chemical or biological release, and demonstrate it for a real building. In an earlier paper, Sohn and Lorenzetti(1) developed a proof of concept algorithm that assumed samplers could return measurements only slowly (on the order of hours). This led to optimal detect to treat architectures, which maximize the probability of detecting a release. This paper develops a more general approach, and applies it to samplers that can return measurements relatively quickly (in minutes). This leads to optimal detect to warn architectures, which minimize the expected time to detection. Using a model of a real, large, commercial building, we demonstrate the approach by optimizing networks against uncertain release locations, source terms, and sampler characteristics. Finally, we speculate on rules of thumb for general sampler placement.
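
    A minimal sketch of one way to pose the placement problem described above: given release scenarios with probabilities and a precomputed matrix of detection times (e.g., from dispersion simulations), greedily choose sampler locations that minimize the expected time to detection. The greedy heuristic and the toy numbers are assumptions for illustration, not the authors' exact algorithm.

      import numpy as np

      def greedy_placement(detect_time, scenario_prob, n_samplers):
          # detect_time[i, j]: time for a sampler at location j to detect release scenario i
          # (np.inf if that scenario is never detected from j); scenario_prob[i]: probability of scenario i.
          n_scen, n_loc = detect_time.shape
          chosen, best_so_far = [], np.full(n_scen, np.inf)
          for _ in range(n_samplers):
              # Expected detection time if location j is added to the current network
              # (remains infinite while some probable scenario is still undetected).
              exp_time = [(scenario_prob * np.minimum(best_so_far, detect_time[:, j])).sum()
                          for j in range(n_loc)]
              j_star = int(np.argmin(exp_time))
              chosen.append(j_star)
              best_so_far = np.minimum(best_so_far, detect_time[:, j_star])
          return chosen

      # Toy example: 3 release scenarios, 4 candidate locations, pick 2 samplers.
      times = np.array([[5.0, np.inf, 12.0, 3.0],
                        [np.inf, 4.0, 6.0, 20.0],
                        [8.0, 9.0, np.inf, 2.0]])
      print(greedy_placement(times, np.array([0.5, 0.3, 0.2]), 2))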

  12. Interpersonal Expectancy Effects: A Forty Year Perspective.

    ERIC Educational Resources Information Center

    Rosenthal, Robert

    Interpersonal expectancy effects--the unintentional expectations that experimenters, teachers, and authority figures bring to experiments, classrooms, and other situations--can wield significant influence on individuals. Some of the issues surrounding expectancy effects are detailed in this paper. The effect itself has been recreated in…

  13. Are Grade Expectations Rational? A Classroom Experiment

    ERIC Educational Resources Information Center

    Hossain, Belayet; Tsigaris, Panagiotis

    2015-01-01

    This study examines students' expectations about their final grade. An attempt is made to determine whether students form expectations rationally. Expectations in economics, rational or otherwise, carry valuable information and have important implications in terms of both teaching effectiveness and the role of grades as an incentive structure…

  14. STOCK MARKET CRASH AND EXPECTATIONS OF AMERICAN HOUSEHOLDS.

    PubMed

    Hudomiet, Péter; Kézdi, Gábor; Willis, Robert J

    2011-01-01

    This paper utilizes data on subjective probabilities to study the impact of the stock market crash of 2008 on households' expectations about the returns on the stock market index. We use data from the Health and Retirement Study that was fielded in February 2008 through February 2009. The effect of the crash is identified from the date of the interview, which is shown to be exogenous to previous stock market expectations. We estimate the effect of the crash on the population average of expected returns, the population average of the uncertainty about returns (subjective standard deviation), and the cross-sectional heterogeneity in expected returns (disagreement). We show estimates from simple reduced-form regressions on probability answers as well as from a more structural model that focuses on the parameters of interest and separates survey noise from relevant heterogeneity. We find a temporary increase in the population average of expectations and uncertainty right after the crash. The effect on cross-sectional heterogeneity is more significant and longer lasting, which implies substantial long-term increase in disagreement. The increase in disagreement is larger among the stockholders, the more informed, and those with higher cognitive capacity, and disagreement co-moves with trading volume and volatility in the market. PMID:21547244

  15. Effect of fatigue on maximal velocity and maximal torque during short exhausting cycling.

    PubMed

    Buttelli, O; Seck, D; Vandewalle, H; Jouanin, J C; Monod, H

    1996-01-01

    A group of 24 subjects performed on a cycle ergometer a fatigue test consisting of four successive all-out sprints against the same braking torque. The subjects were not allowed time to recover between sprints and consequently the test duration was shorter than 30 s. The pedal velocity was recorded every 10 ms from a disc fixed to the flywheel with 360 slots passing in front of a photo-electric cell linked to a microcomputer which processed the data. Taking into account the variation of kinetic energy of the ergometer flywheel, it was possible to determine the linear torque-velocity relationship from data obtained during the all-out cycling exercise by computing torque and velocity from zero velocity to peak velocity according to a method proposed previously. The maximal theoretical velocity (v0) and the maximal theoretical torque (T0) were estimated by extrapolation of each torque-velocity relationship. Maximal power (Pmax) was calculated from the values of T0 and v0 (Pmax = 0.25 v0 T0). The kinetics of v0, T0 and Pmax was assumed to express the effects of fatigue on the muscle contractile properties (maximal shortening velocity, maximal muscle strength and maximal power). Fatigue induced a parallel shift to the left of the torque-velocity relationships. The v0, T0 and Pmax decreases were equal to 16.3 percent, 17.3 percent and 31 percent, respectively. The magnitude of the decrease was similar for v0 and T0, which suggested that Pmax decreased because of a slowing of maximal shortening velocity as well as a loss in maximal muscle strength. However, the interpretation of a decrease in cycling v0, which has the dimension of a maximal cycling frequency, is made difficult by the possible interactions between the agonistic and the antagonistic muscles and could also be explained by a slowing of the muscle relaxation rate. PMID:8861688
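
    The quoted relation Pmax = 0.25 v0 T0 is simply the maximum of power under the linear torque-velocity model; a short derivation (standard algebra, not reproduced from the paper) is:

      T(v) = T_0\left(1 - \frac{v}{v_0}\right), \qquad
      P(v) = T(v)\,v = T_0\left(v - \frac{v^{2}}{v_0}\right),
      \frac{dP}{dv} = T_0\left(1 - \frac{2v}{v_0}\right) = 0
          \;\Longrightarrow\; v^{*} = \frac{v_0}{2},
      P_{\max} = T_0 \cdot \frac{v_0}{2}\left(1 - \frac12\right) = \frac{T_0\,v_0}{4} = 0.25\,T_0 v_0 .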

  16. Information and Transformation at Swiss Re: Maximizing Economic Value

    E-print Network

    Beath, Cynthia M.

    2007-12-01

    In 2007 Swiss Re was striving to maximize economic value, a metric that would allow the company to assess its performance over time despite the volatility of the reinsurance industry. Maximizing economic value required ...

  17. Lifetime Maximization in Wireless Sensor Networks by Distributed Binary Search

    E-print Network

    Lifetime Maximization in Wireless Sensor Networks by Distributed Binary Search. André Schumacher. ... scenarios include monitoring some environmental parameters (temperature, humidity, chemical concentrations). ... a power assignment that maximizes the lifetime of a data-gathering wireless sensor network

  18. Upgrading controls will maximize power plant operations

    SciTech Connect

    Smith, D.J.

    1995-03-01

    This article describes how, in order to remain competitive, electric utilities are optimizing power plant operations by upgrading and/or installing state-of-the-art control systems. With deregulation moving at a full pace, the US's electric utility industry no longer has a monopoly on the generation of electricity. Because of competition from non-utility electric generators, and the need to meet new and ever tighter emission standards, electric utilities are trying to improve the reliability and availability of their existing power plants. As a power plant ages, maintenance tends to increase while breakdowns become more frequent. This is also true for instrument and controls systems. However, one of the most cost effective solutions for improving the reliability, availability, and operation of older electric power generation plants is to upgrade and modernize a plant's instruments and controls.

  19. Note On The Maximal Primes Gaps

    E-print Network

    N. A. Carella

    2015-02-05

    This note presents a result on the maximal prime gap of the form p_(n+1) - p_n <= C(log p_n)^(1+e), where C > 0 is a constant, for any arbitrarily small real number e > 0, and all sufficiently large integers n > n_0. Equivalently, the result shows that any short interval [x, x + y], y >= C(log x)^(1+e), contains prime numbers for all sufficiently large real numbers x >= x_0 unconditionally. An application demonstrates that a prime p >= x > 2 can be determined in deterministic polynomial time O(log(x)^8).

  20. Maximally entangled mixed states and conditional entropies

    NASA Astrophysics Data System (ADS)

    Batle, J.; Casas, M.; Plastino, A.; Plastino, A. R.

    2005-02-01

    The maximally entangled mixed states of Munro [Phys. Rev. A 64, 030302 (2001)] are shown to exhibit interesting features vis-à-vis conditional entropic measures. The same happens with the Ishizaka and Hiroshima states [Phys. Rev. A 62, 022310 (2000)], whose entanglement degree cannot be increased by acting on them with logic gates. Special types of entangled states that do not violate classical entropic inequalities are seen to exist in the space of two qubits. Special meaning can be assigned to the Munro special participation ratio of 1.8.

  1. Electromagnetically induced grating with maximal atomic coherence

    SciTech Connect

    Carvalho, Silvania A.; Araujo, Luis E. E. de

    2011-10-15

    We describe theoretically an atomic diffraction grating that combines an electromagnetically induced grating with a coherence grating in a double-Λ atomic system. With the atom in a condition of maximal coherence between its lower levels, the combined gratings simultaneously diffract both the incident probe beam as well as the signal beam generated through four-wave mixing. A special feature of the atomic grating is that it will diffract any beam resonantly tuned to any excited state of the atom accessible by a dipole transition from its ground state.

  2. The superposition invariance of unitary operators and maximally entangled state

    E-print Network

    Xin-Wei Zha; Yun-Guang Zhang; Jian-Xia Qi

    2015-10-15

    In this paper, we study the superposition invariance of unitary operators and of maximally entangled states, respectively. Furthermore, we discuss the set of orthogonal maximally entangled states. We find that an orthogonal basis of maximally entangled states can be divided into k subspaces. It is shown that some entanglement properties of the superposed state in every subspace are invariant.

  3. Prior expectations facilitate metacognition for perceptual decision.

    PubMed

    Sherman, M T; Seth, A K; Barrett, A B; Kanai, R

    2015-09-01

    The influential framework of 'predictive processing' suggests that prior probabilistic expectations influence, or even constitute, perceptual contents. This notion is evidenced by the facilitation of low-level perceptual processing by expectations. However, whether expectations can facilitate high-level components of perception remains unclear. We addressed this question by considering the influence of expectations on perceptual metacognition. To isolate the effects of expectation from those of attention we used a novel factorial design: expectation was manipulated by changing the probability that a Gabor target would be presented; attention was manipulated by instructing participants to perform or ignore a concurrent visual search task. We found that, independently of attention, metacognition improved when yes/no responses were congruent with expectations of target presence/absence. Results were modeled under a novel Bayesian signal detection theoretic framework which integrates bottom-up signal propagation with top-down influences, to provide a unified description of the mechanisms underlying perceptual decision and metacognition. PMID:25973773

  4. Optimum Drop Height for Maximizing Power Output in Drop Jump: The Effect of Maximal Muscle Strength.

    PubMed

    Matic, Milan S; Pazin, Nemanja R; Mrdakovic, Vladimir D; Jankovic, Nenad N; Ilic, Dusko B; Stefanovic, Djordje L J

    2015-12-01

    Matic, MS, Pazin, NR, Mrdakovic, VD, Jankovic, NN, Ilic, DB, and Stefanovic, DLJ. Optimum drop height for maximizing power output in drop jump: The effect of maximal muscle strength. J Strength Cond Res XX(X): 000-000, 2015-The main purpose of this study was to explore the cause-and-effect relation of maximal muscle strength (MSmax) on the optimum drop height (DHopt) that maximizes power output in drop jump. In total, 30 physically active male students participated in this study, of whom 16 were selected according to their resistance strength training background (i.e., level of MSmax) and allocated into 2 equal subgroups: strong (n = 8) and weak (n = 8). The main testing session consisted of drop jumps performed from 8 different drop heights (i.e., from 0.12 to 0.82 m). The individual DHopt was determined based on the maximal value of power output across the applied range of drop heights. The tested relationships between DHopt and MSmax were moderate (r = 0.39-0.50, p ≤ 0.05). In addition, the stronger individuals, on average, showed maximal values of power output at a higher drop height than the weaker individuals (0.62 vs. 0.32 m). Finally, significant differences in the individual DHopt between groups were detected (p < 0.01). The present findings suggest that drop height should be adjusted based on a subject's neuromuscular capacity to produce MSmax. Hence, from the perspective of strength and conditioning practitioners, MSmax should be considered as an important factor that could affect the DHopt, and therefore should be used for its adjustment in terms of optimizing an athlete's testing, training, or rehabilitation intervention. PMID:26020711

  5. Maximal Oxygen Uptake, Sweating and Tolerance to Exercise in the Heat

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Castle, B. L.; Ruff, W. K.

    1972-01-01

    The physiological mechanisms that facilitate acute acclimation to heat have not been fully elucidated, but the result is the establishment of a more efficient cardiovascular system to increase heat dissipation via increased sweating that allows the acclimated man to function with a cooler internal environment and to extend his performance. Men in good physical condition with high maximal oxygen uptakes generally acclimate to heat more rapidly and retain it longer than men in poorer condition. Also, upon first exposure trained men tolerate exercise in the heat better than untrained men. Both resting in heat and physical training in a cool environment confer only partial acclimation when first exposed to work in the heat. These observations suggest separate additive stimuli of metabolic heat from exercise and environmental heat to increase sweating during the acclimation process. However, the necessity of utilizing physical exercise during acclimation has been questioned. Bradbury et al. (1964) have concluded exercise has no effect on the course of heat acclimation since increased sweating can be induced by merely heating resting subjects. Preliminary evidence suggests there is a direct relationship between the maximal oxygen uptake and the capacity to maintain thermal regulation, particularly through the control of sweating. Since increased sweating is an important mechanism for the development of heat acclimation, and fit men have high sweat rates, it follows that upon initial exposure to exercise in the heat, men with high maximal oxygen uptakes should exhibit less strain than men with lower maximal oxygen uptakes. The purpose of this study was: (1) to determine if men with higher maximal oxygen uptakes exhibit greater tolerance than men with lower oxygen uptakes during early exposure to exercise in the heat, and (2) to investigate further the mechanism of the relationship between sweating and maximal work capacity.

  6. Maximizing strain in miniaturized dielectric elastomer actuators

    NASA Astrophysics Data System (ADS)

    Rosset, Samuel; Araromi, Oluwaseun; Shea, Herbert

    2015-04-01

    We present a theoretical model to optimise the unidirectional motion of a rigid object bonded to a miniaturized dielectric elastomer actuator (DEA), a configuration found for example in AMI's haptic feedback devices, or in our tuneable RF phase shifter. Recent work has shown that unidirectional motion is maximized when the membrane is both anisotropically prestretched and subjected to a dead load in the direction of actuation. However, the use of dead weights for miniaturized devices is clearly highly impractical. Consequently, smaller devices use the membrane itself to generate the opposing force. Since the membrane covers the entire frame, one has the same prestretch condition in the active (actuated) and passive zones. Because the passive zone contracts when the active zone expands, it does not provide a constant restoring force, reducing the maximum achievable actuation strain. We have determined the optimal ratio between the size of the electrode (active zone) and the passive zone, as well as the optimal prestretch in both in-plane directions, in order to maximize the absolute displacement of the rigid object placed at the active/passive border. Our model and experiments show that the ideal active ratio is 50%, with a displacement half of what can be obtained with a dead load. We expand our fabrication process to also show how DEAs can be laser-post-processed to remove carefully chosen regions of the passive elastomer membrane, thereby increasing the actuation strain of the device.

  7. Maximal lactate steady state in Judo

    PubMed Central

    de Azevedo, Paulo Henrique Silva Marques; Pithon-Curi, Tania; Zagatto, Alessandro Moura; Oliveira, João; Perez, Sérgio

    2014-01-01

    Summary Background: the purpose of this study was to verify the validity of the respiratory compensation threshold (RCT) measured during a new single judo-specific incremental test (JSIT) for aerobic demand evaluation. Methods: to test the validity of the new test, the JSIT was compared with Maximal Lactate Steady State (MLSS), which is the gold standard procedure for aerobic demand measuring. Eight well-trained male competitive judo players (24.3 ± 7.9 years; height of 169.3 ± 6.7 cm; fat mass of 12.7 ± 3.9%) performed a maximal incremental specific test for judo to assess the RCT and performed a 30-minute MLSS test, where both tests were performed mimicking the UchiKomi drills. Results: the intensity at RCT measured on JSIT was not significantly different compared to MLSS (p=0.40). In addition, a high and significant correlation between MLSS and RCT was observed (r=0.90, p=0.002), as well as a high agreement. Conclusions: RCT measured during JSIT is a valid procedure to measure the aerobic demand, respecting the ecological validity of Judo. PMID:25332923

  8. Optimizing Population Variability to Maximize Benefit

    PubMed Central

    Izu, Leighton T.; Bányász, Tamás; Chen-Izu, Ye

    2015-01-01

    Variability is inherent in any population, regardless whether the population comprises humans, plants, biological cells, or manufactured parts. Is the variability beneficial, detrimental, or inconsequential? This question is of fundamental importance in manufacturing, agriculture, and bioengineering. This question has no simple categorical answer because research shows that variability in a population can have both beneficial and detrimental effects. Here we ask whether there is a certain level of variability that can maximize benefit to the population as a whole. We answer this question by using a model composed of a population of individuals who independently make binary decisions; individuals vary in making a yes or no decision, and the aggregated effect of these decisions on the population is quantified by a benefit function (e.g. accuracy of the measurement using binary rulers, aggregate income of a town of farmers). Here we show that an optimal variance exists for maximizing the population benefit function; this optimal variance quantifies what is often called the “right mix” of individuals in a population. PMID:26650247
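
    The "binary ruler" idea mentioned above can be illustrated with a toy simulation: a population estimates a quantity from yes/no comparisons against individually varying thresholds, and the estimation error is smallest at an intermediate amount of threshold variability. The uniform-threshold model and the numbers below are illustrative assumptions, not the authors' benefit function.

      import numpy as np

      rng = np.random.default_rng(42)
      true_value = 0.3
      n_individuals, n_trials = 200, 2000

      for spread in [0.01, 0.1, 0.5, 1.0, 5.0]:
          sq_errs = []
          for _ in range(n_trials):
              # Each individual compares the true value to its own threshold and answers yes/no.
              thresholds = rng.uniform(-spread, spread, size=n_individuals)
              frac_yes = np.mean(true_value > thresholds)
              estimate = (2.0 * frac_yes - 1.0) * spread   # invert the uniform-threshold model
              sq_errs.append((estimate - true_value) ** 2)
          print(f"threshold spread {spread:>4}: RMS error = {np.sqrt(np.mean(sq_errs)):.3f}")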

  9. Spiders Tune Glue Viscosity to Maximize Adhesion.

    PubMed

    Amarpuri, Gaurav; Zhang, Ci; Diaz, Candido; Opell, Brent D; Blackledge, Todd A; Dhinojwala, Ali

    2015-11-24

    Adhesion in humid conditions is a fundamental challenge to both natural and synthetic adhesives. Yet, glue from most spider species becomes stickier as humidity increases. We find the adhesion of spider glue, from five diverse spider species, maximizes at very different humidities that match their foraging habitats. By using high-speed imaging and spreading power law, we find that the glue viscosity varies over 5 orders of magnitude with humidity for each species, yet the viscosity at maximal adhesion for each species is nearly identical, 10^5-10^6 cP. Many natural systems take advantage of viscosity to improve functional response, but spider glue's humidity responsiveness is a novel adaptation that makes the glue stickiest in each species' preferred habitat. This tuning is achieved by a combination of proteins and hygroscopic organic salts that determines water uptake in the glue. We therefore anticipate that manipulation of polymer-salts interaction to control viscosity can provide a simple mechanism to design humidity responsive smart adhesives. PMID:26513350

  10. Rapid Expectation Adaptation during Syntactic Comprehension

    PubMed Central

    Fine, Alex B.; Jaeger, T. Florian; Farmer, Thomas A.; Qian, Ting

    2013-01-01

    When we read or listen to language, we are faced with the challenge of inferring intended messages from noisy input. This challenge is exacerbated by considerable variability between and within speakers. Focusing on syntactic processing (parsing), we test the hypothesis that language comprehenders rapidly adapt to the syntactic statistics of novel linguistic environments (e.g., speakers or genres). Two self-paced reading experiments investigate changes in readers’ syntactic expectations based on repeated exposure to sentences with temporary syntactic ambiguities (so-called “garden path sentences”). These sentences typically lead to a clear expectation violation signature when the temporary ambiguity is resolved to an a priori less expected structure (e.g., based on the statistics of the lexical context). We find that comprehenders rapidly adapt their syntactic expectations to converge towards the local statistics of novel environments. Specifically, repeated exposure to a priori unexpected structures can reduce, and even completely undo, their processing disadvantage (Experiment 1). The opposite is also observed: a priori expected structures become less expected (even eliciting garden paths) in environments where they are hardly ever observed (Experiment 2). Our findings suggest that, when changes in syntactic statistics are to be expected (e.g., when entering a novel environment), comprehenders can rapidly adapt their expectations, thereby overcoming the processing disadvantage that mistaken expectations would otherwise cause. Our findings take a step towards unifying insights from research in expectation-based models of language processing, syntactic priming, and statistical learning. PMID:24204909

  11. From entropy-maximization to equality-maximization: Gauss, Laplace, Pareto, and Subbotin

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2014-12-01

    The entropy-maximization paradigm of statistical physics is well known to generate the omnipresent Gauss law. In this paper we establish an analogous socioeconomic model which maximizes social equality, rather than physical disorder, in the context of the distributions of income and wealth in human societies. We show that, on a logarithmic scale, the Laplace law is the socioeconomic equality-maximizing counterpart of the physical entropy-maximizing Gauss law, and that this law manifests an optimized balance between two opposing forces: (i) the rich and powerful, striving to amass ever more wealth, and thus to increase social inequality; and (ii) the masses, struggling to form more egalitarian societies, and thus to increase social equality. Our results lead from log-Gauss statistics to log-Laplace statistics, yield Paretian power-law tails of income and wealth distributions, and show how the emergence of a middle class depends on the underlying levels of socioeconomic inequality and variability. Also, in the context of asset prices with Laplace-distributed returns, our results imply that financial markets generate an optimized balance between risk and predictability.
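
    A minimal sketch of one claim above, under assumed unit scale parameters rather than the paper's analysis: incomes whose logarithms are Laplace-distributed (log-Laplace) have Paretian power-law upper tails, while log-Gauss (log-normal) incomes do not. The crude tail-slope estimator below is only meant to make that contrast visible.

    # Minimal sketch (unit scale parameters assumed; not the paper's analysis):
    # compare the upper-tail behavior of log-Gauss and log-Laplace income samples.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000
    log_gauss_income = np.exp(rng.normal(loc=0.0, scale=1.0, size=n))      # log-Gauss (log-normal)
    log_laplace_income = np.exp(rng.laplace(loc=0.0, scale=1.0, size=n))   # log-Laplace

    def tail_slope(x, q=0.99):
        """Slope of log survival probability versus log income in the upper tail;
        an approximately constant slope indicates a Pareto (power-law) tail."""
        x = np.sort(x)
        tail = x[x >= np.quantile(x, q)]
        survival = 1.0 - np.searchsorted(x, tail) / len(x)
        slope, _ = np.polyfit(np.log(tail), np.log(survival), 1)
        return slope

    print("log-Gauss tail slope  :", round(tail_slope(log_gauss_income), 2))    # much steeper than -1
    print("log-Laplace tail slope:", round(tail_slope(log_laplace_income), 2))  # close to -1, i.e. Pareto exponent 1/scale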

  12. Expectancy and Repetition in Task Preparation

    NASA Technical Reports Server (NTRS)

    Ruthruff, E.; Remington, R. W.; Johnston, James C.; Null, Cynthia H. (Technical Monitor)

    1996-01-01

    We studied the mechanisms of task preparation using a design that pitted task expectancy against task repetition. In one experiment, two simple cognitive tasks were presented in a predictable sequence containing both repetitions and non-repetitions. The typical task sequence was AABBAABB. Occasional violations of this sequence allowed us to measure the effects of valid versus invalid expectancy. With this design, we were able to study the effects of task expectancy, task repetition, and their interaction.

  13. A utility oriented radio resource management algorithm for heterogenous network

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoyan; Dong, Yan; Huang, Zailu

    2007-11-01

    A utility-oriented radio resource management algorithm is proposed for a broadband non-geostationary satellite network that works in a heterogeneous network environment and provides access services for various customers on the ground. Based on game theory, the problem of optimizing the network's performance is turned into the problem of maximizing the network's long-term utility. When an access service requirement changes, the algorithm evaluates the traffic conditions and the QoS dimensions of the network at that moment, audits the influence of the service requirement on the long-term utility of the satellite network, and then makes the resource assignment decision according to the rule of maximizing the satellite network's long-term utility. The game-theoretic process guarantees that both the benefit of the network and the requirements of the customers are considered jointly. The simulation results demonstrate the effectiveness of the proposed algorithm.
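
    The decision rule can be pictured roughly as follows; this is only a toy sketch under assumed names and a made-up utility function (Request, beam loads, congestion penalty), not the paper's algorithm: each arriving request is admitted with the assignment that maximizes the estimated long-term network utility, or rejected when no feasible assignment improves it.

    # Toy admission-control sketch (all names and the utility form are assumptions).
    from dataclasses import dataclass

    @dataclass
    class Request:
        demand: float     # bandwidth units requested
        revenue: float    # value of serving this request

    def network_utility(beam_loads, capacity, revenue):
        """Toy long-term utility: revenue earned minus a congestion penalty that
        grows sharply as any beam approaches saturation."""
        penalty = sum(10.0 * max(0.0, load / capacity - 0.8) ** 2 for load in beam_loads)
        return revenue - penalty

    def admit(request, beam_loads, capacity):
        """Return the beam index that maximizes the utility gain, or None to reject."""
        baseline = network_utility(beam_loads, capacity, 0.0)
        best_beam, best_gain = None, 0.0
        for i, load in enumerate(beam_loads):
            if load + request.demand > capacity:
                continue                                  # infeasible assignment
            new_loads = list(beam_loads)
            new_loads[i] += request.demand
            gain = network_utility(new_loads, capacity, request.revenue) - baseline
            if gain > best_gain:
                best_beam, best_gain = i, gain
        return best_beam

    print(admit(Request(demand=3.0, revenue=1.0), beam_loads=[5.0, 9.0], capacity=10.0))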

  14. Comparison between static maximal force and handbrake pulling force.

    PubMed

    Chateauroux, E; Wang, X

    2012-01-01

    The measurement of maximum pulling force is important not only for specifying force limits for industrial workers but also for designing controls requiring high force. This paper presents a comparison between maximal static handbrake pulling force (FST) and the force exerted during a normal handbrake pulling task (FDY). These forces were measured for different handle locations and subject characteristics. Participants were asked to pull a handbrake on an adjustable car mock-up as they would do when parking their own car, then to exert a force as high as possible on the pulled handbrake. Hand pulling forces were measured using a six-axis force sensor. Five fixed handbrake positions were tested, as well as a neutral handbrake position defined by the subject. FST and FDY were significantly correlated. Both were found to be dependent on handbrake position, age and gender. As expected, women and older subjects exerted lower forces. FST was significantly higher than FDY. The ratio FmR (FDY divided by FST) was also analyzed. Women showed a higher FmR than men, meaning that the task required a greater share of muscle capability for women. FmR was also influenced by handbrake location. These data will be useful for handbrake design. PMID:22316898

  15. Quantum mechanics and the principle of maximal variety

    E-print Network

    Lee Smolin

    2015-06-09

    Quantum mechanics is derived from the principle that the universe contain as much variety as possible, in the sense of maximizing the distinctiveness of each subsystem. The quantum state of a microscopic system is defined to correspond to an ensemble of subsystems of the universe with identical constituents and similar preparations and environments. A new kind of interaction is posited amongst such similar subsystems which acts to increase their distinctiveness, by extremizing the variety. In the limit of large numbers of similar subsystems this interaction is shown to give rise to Bohm's quantum potential. As a result the probability distribution for the ensemble is governed by the Schroedinger equation. The measurement problem is naturally and simply solved. Microscopic systems appear statistical because they are members of large ensembles of similar systems which interact non-locally. Macroscopic systems are unique, and are not members of any ensembles of similar systems. Consequently their collective coordinates may evolve deterministically. This proposal could be tested by constructing quantum devices from entangled states of a modest number of qubits which, by its combinatorial complexity, can be expected to have no natural copies.

  16. 77 FR 46069 - Proposed Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-02

    ...BUREAU OF CONSUMER FINANCIAL PROTECTION Proposed Guidelines for Ensuring...Information Disseminated by the Bureau of Consumer Financial Protection AGENCY: Bureau of Consumer Financial Protection. ACTION: Notice of...

  17. 76 FR 49473 - Petition to Maximize Practical Utility of List 1 Chemicals Screened Through EPA's Endocrine...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-10

    ... Protection Agency, 1200 Pennsylvania Ave., NW., Washington, DC 20460-0001. Delivery: OPP Regulatory Public.... Crystal Dr., Arlington, VA. Deliveries are only accepted during the Docket Facility's normal hours of... should be made for deliveries of boxed information. The Docket Facility telephone number is (703)...

  18. Maximizing semi-active vibration isolation utilizing a magnetorheological damper with an inner bypass configuration

    SciTech Connect

    Bai, Xian-Xu; Wereley, Norman M.; Hu, Wei

    2015-05-07

    A single-degree-of-freedom (SDOF) semi-active vibration control system based on a magnetorheological (MR) damper with an inner bypass is investigated in this paper. The MR damper employing a pair of concentric tubes, between which the key structure, i.e., the inner bypass, is formed and MR fluids are energized, is designed to provide a large dynamic range (i.e., ratio of field-on damping force to field-off damping force) and damping force range. The damping force performance of the MR damper is modeled using a phenomenological model and verified by experimental tests. In order to assess its feasibility and capability in vibration control systems, the mathematical model of a SDOF semi-active vibration control system based on the MR damper and skyhook control strategy is established. Using an MTS 244 hydraulic vibration exciter system and a dSPACE DS1103 real-time simulation system, an experimental study of the SDOF semi-active vibration control system is also conducted. Simulation results are compared to experimental measurements.

  19. Maximizing semi-active vibration isolation utilizing a magnetorheological damper with an inner bypass configuration

    NASA Astrophysics Data System (ADS)

    Bai, Xian-Xu; Wereley, Norman M.; Hu, Wei

    2015-05-01

    A single-degree-of-freedom (SDOF) semi-active vibration control system based on a magnetorheological (MR) damper with an inner bypass is investigated in this paper. The MR damper employing a pair of concentric tubes, between which the key structure, i.e., the inner bypass, is formed and MR fluids are energized, is designed to provide a large dynamic range (i.e., ratio of field-on damping force to field-off damping force) and damping force range. The damping force performance of the MR damper is modeled using a phenomenological model and verified by experimental tests. In order to assess its feasibility and capability in vibration control systems, the mathematical model of a SDOF semi-active vibration control system based on the MR damper and skyhook control strategy is established. Using an MTS 244 hydraulic vibration exciter system and a dSPACE DS1103 real-time simulation system, an experimental study of the SDOF semi-active vibration control system is also conducted. Simulation results are compared to experimental measurements.

  20. 76 FR 49473 - Petition to Maximize Practical Utility of List 1 Chemicals Screened Through EPA's Endocrine...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-10

    ...substances for potential endocrine effects. Potentially...Industrial Classification System (NAICS) codes, may...chemical substances for endocrine effects. This listing...Industrial Classification System (NAICS) codes...

  1. New Irrigation System Design for Maximizing Irrigation Efficiency and Increasing Rainfall Utilization 

    E-print Network

    Lyle, W. M.; Bordovsky, J. P.

    1980-01-01

    TR-105, 1980.

  2. 77 FR 46069 - Proposed Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-02

    ... published on January 3, 2002, at 67 FR 369-378 (reprinted February 5, 2002, at 67 FR 5365). The Bureau's..., identified by the title of this notice, by any of the following methods: Electronic:...

  3. A sampling plan for conduit-flow karst springs: Minimizing sampling cost and maximizing statistical utility

    USGS Publications Warehouse

    Currens, J.C.

    1999-01-01

    Analytical data for nitrate and triazines from 566 samples collected over a 3-year period at Pleasant Grove Spring, Logan County, KY, were statistically analyzed to determine the minimum data set needed to calculate meaningful yearly averages for a conduit-flow karst spring. Results indicate that a biweekly sampling schedule augmented with bihourly samples from high-flow events will provide meaningful suspended-constituent and dissolved-constituent statistics. Unless collected over an extensive period of time, daily samples may not be representative and may also be autocorrelated. All high-flow events resulting in a significant deflection of a constituent from base-line concentrations should be sampled. Either the geometric mean or the flow-weighted average of the suspended constituents should be used. If automatic samplers are used, then they may be programmed to collect storm samples as frequently as every few minutes to provide details on the arrival time of constituents of interest. However, only samples collected bihourly should be used to calculate averages. By adopting a biweekly sampling schedule augmented with high-flow samples, the need to continuously monitor discharge, or to search for and analyze existing data to develop a statistically valid monitoring plan, is lessened.
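
    For the two summary statistics recommended for suspended constituents, a minimal sketch with hypothetical concentration and discharge values (not data from Pleasant Grove Spring) is:

    # Geometric mean and discharge (flow)-weighted average of constituent
    # concentrations; the sample values below are invented for illustration.
    import numpy as np

    concentration_mg_l = np.array([2.1, 1.8, 5.6, 9.3, 2.4, 1.9])    # e.g. nitrate: biweekly plus storm samples
    discharge_l_s = np.array([120., 110., 900., 1500., 130., 115.])  # discharge at each sampling time

    geometric_mean = np.exp(np.mean(np.log(concentration_mg_l)))
    flow_weighted_avg = np.sum(concentration_mg_l * discharge_l_s) / np.sum(discharge_l_s)

    print(f"geometric mean       : {geometric_mean:.2f} mg/L")
    print(f"flow-weighted average: {flow_weighted_avg:.2f} mg/L")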

  4. Effective multisite CD correlation to maximize high-end tool utilization

    NASA Astrophysics Data System (ADS)

    Duff, John W.; Allsop, John

    1999-04-01

    The semiconductor industry continues to accelerate its pace following the SIA roadmap. Many technical and business issues are exacerbated by this acceleration. One such obstacle that faces the mask making community touches on both the business and the technical aspects of the industry's desire to pull in the target dates for each roadmap node. In order to provide the industry with adequate high-end capacity on a global basis, the multi-site global photomask company must routinely exercise inter-site transfer for the purpose of leveling the manufacturing loads during periods of peak regional demand. A crucial part of inter-site transfer is to ensure that common standards are used, that deviations from those standards are understood, and that a statistics-based methodology for correlating metrology equipment is developed. Minimizing critical dimension correlation deltas between sites is critical to successful load leveling in an era of ever shrinking error budgets. This paper will explore the methods and practices used by Photronics to achieve routine inter-site measurement correlations whose precision far exceeds that of the best available standards. Both the statistical methods employed and the results from a large sample of production plates will be reported.

  5. Ground truth spectrometry and imagery of eruption clouds to maximize utility of satellite imagery

    NASA Technical Reports Server (NTRS)

    Rose, William I.

    1993-01-01

    Field experiments with thermal imaging infrared radiometers were performed and a laboratory system was designed for controlled study of simulated ash clouds. Using AVHRR (Advanced Very High Resolution Radiometer) thermal infrared bands 4 and 5, a radiative transfer method was developed to retrieve particle sizes, optical depth and particle mass in volcanic clouds. A model was developed for measuring the same parameters using TIMS (Thermal Infrared Multispectral Scanner), MODIS (Moderate Resolution Imaging Spectrometer), and ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer). Related publications are attached.

  6. Improving Simulated Annealing by Replacing Its Variables with Game-Theoretic Utility Maximizers

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Bandari, Esfandiar; Tumer, Kagan

    2001-01-01

    The game-theory field of Collective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved as a side-effect. Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting significantly improves simulated annealing for a model of an economic process run over an underlying small-worlds topology. Furthermore, these experiments reveal novel small-worlds phenomena, and highlight the shortcomings of conventional mechanism design in bounded rationality domains.
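
    For context, the sketch below implements only ordinary simulated annealing on a toy objective; the comment marks the acceptance step where the COIN recasting described above would substitute each variable-player's private utility. The objective, schedule, and parameters are arbitrary assumptions, not the paper's experiments.

    # Baseline simulated annealing on a toy objective (illustrative only).
    import math
    import random

    random.seed(0)

    def objective(x):
        """Toy global objective to minimize: a rough multi-well function."""
        return sum(xi ** 2 + 2.0 * math.sin(3.0 * xi) for xi in x)

    def simulated_annealing(n_vars=10, n_steps=20000, t0=2.0, cooling=0.9997):
        x = [random.uniform(-3, 3) for _ in range(n_vars)]
        energy, temp = objective(x), t0
        for _ in range(n_steps):
            i = random.randrange(n_vars)               # pick one variable ("player") to move
            old = x[i]
            x[i] += random.gauss(0.0, 0.3)
            new_energy = objective(x)
            # Global Metropolis test; a COIN variant would instead evaluate player i's
            # private utility (e.g. a difference utility) at this acceptance step.
            if new_energy > energy and random.random() > math.exp((energy - new_energy) / temp):
                x[i], new_energy = old, energy         # reject the move
            energy = new_energy
            temp *= cooling
        return energy

    print(f"final objective value: {simulated_annealing():.3f}")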

  7. Practical applicability of Nyayas – (Maxims) mentioned in Chakrapani Tika

    PubMed Central

    Vyas, Mahesh Kumar; Dwivedi, Rambabu

    2014-01-01

    The Nyayas (Maxims) are of two types: (1) Loukika Nyaya and (2) Shastriya Nyaya. Loukika Nyayas are those used by the common public in day-to-day life, whereas Shastriya Nyayas are those used by the authors of the treatises to explore their concepts. Most commonly, the Shastriya Nyayas were put forth by the Granthakaras using the meaning and gist of Loukika Nyayas. Moreover, the notion of a Nyaya depends mainly upon the situation, place, and topic of explanation. These Nyayas have helped explain the meaning of a topic since the Vaidika Kala, and they convey hidden meanings correctly. Like the Vedas, these Nyayas are also a part of other Shastras, and of Ayurveda Shastra too. The Acharyas of Ayurveda utilized these Nyayas while explaining Nidana, Chikitsa, etc. Discerning these Nyayas in their entirety in one place, with examples, is necessary for easy understanding of the Shastra. Here is an attempt to explore such Nyayas mentioned in Ayurveda for the benefit of easy understanding of the subject.

  8. Maximizing the liquid fuel yield in a biorefining process.

    PubMed

    Zhang, Bo; von Keitz, Marc; Valentas, Kenneth

    2008-12-01

    Biorefining strives to recover the maximum value from each fraction, at minimum energy cost. In order to seek an unbiased and thorough assessment of the alleged opportunity offered by biomass fuels, the direct conversion of various lignocellulosic biomass feedstocks was studied: aspen pulp wood (Populus tremuloides), aspen wood pretreated with dilute acid, aspen lignin, aspen logging residues, corn stalk, corn spathe, corn cob, corn stover, corn stover pellets, corn stover pretreated with dilute acid, and lignin extracted from corn stover. Besides the heating rate, the yield of liquid products was found to depend on the final liquefaction temperature and the length of liquefaction time. The major compounds of the liquid products from the various feedstocks were identified by GC-MS. Lignin was found to be a good candidate for the liquefaction process, and biomass fractionation was necessary to maximize the yield of the liquid bio-fuel. The results suggest a biorefinery process combining pretreatment, fermentation to ethanol, liquefaction to bio-crude oil, and other thermo-conversion technologies, such as gasification. Other biorefinery options, including supercritical water gasification and the effectual utilization of the bio-crude oil, are also addressed. PMID:18781691

  9. A maximally selected test of symmetry about zero.

    PubMed

    Laska, Eugene; Meisner, Morris; Wanderling, Joseph

    2012-11-20

    The problem of testing symmetry about zero has a long and rich history in the statistical literature. We introduce a new test that sequentially discards observations whose absolute value is below increasing thresholds defined by the data. McNemar's statistic is obtained at each threshold and the largest is used as the test statistic. We obtain the exact distribution of this maximally selected McNemar and provide tables of critical values and a program for computing p-values. Power is compared with the t-test, the Wilcoxon Signed Rank Test and the Sign Test. The new test, MM, is slightly less powerful than the t-test and Wilcoxon Signed Rank Test for symmetric normal distributions with nonzero medians and substantially more powerful than all three tests for asymmetric mixtures of normal random variables with or without zero medians. The motivation for this test derives from the need to appraise the safety profile of new medications. If pre and post safety measures are obtained, then under the null hypothesis, the variables are exchangeable and the distribution of their difference is symmetric about a zero median. Large pre-post differences are the major concern of a safety assessment. The discarded small observations are not particularly relevant to safety and can reduce power to detect important asymmetry. The new test was utilized on data from an on-road driving study performed to determine if a hypnotic, a drug used to promote sleep, has next day residual effects. PMID:22729950
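
    A sketch of the test statistic as described above; the critical values require the exact null distribution tabulated by the authors, which is not reproduced here, and the example differences are hypothetical.

    # Maximally selected McNemar statistic: observations with small absolute values
    # are sequentially discarded at data-driven thresholds, McNemar's statistic is
    # computed on the signs of the remaining observations, and the maximum is kept.
    import numpy as np

    def maximally_selected_mcnemar(diffs):
        """Return the maximally selected McNemar statistic for pre/post differences."""
        abs_vals = np.sort(np.unique(np.abs(diffs)))
        thresholds = np.concatenate(([0.0], abs_vals[:-1]))   # always keep at least the largest pair
        best = 0.0
        for c in thresholds:
            kept = diffs[np.abs(diffs) > c]
            n_pos = np.sum(kept > 0)
            n_neg = np.sum(kept < 0)
            if n_pos + n_neg == 0:
                continue
            stat = (n_pos - n_neg) ** 2 / (n_pos + n_neg)     # McNemar's chi-square statistic
            best = max(best, stat)
        return best

    # Hypothetical pre/post safety differences: mostly small, a few large increases.
    rng = np.random.default_rng(2)
    diffs = np.concatenate([rng.normal(0.0, 0.2, 80), rng.normal(2.0, 0.5, 5)])
    print(f"maximally selected McNemar statistic: {maximally_selected_mcnemar(diffs):.2f}")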

  10. Maximally Sparse Polynomials have Solid Amoebas

    E-print Network

    Nisse, Mounir

    2007-01-01

    Let $f$ be an ordinary polynomial in $\mathbb{C}[z_1,..., z_n]$ with no negative exponents and with no factor of the form $z_1^{\alpha_1}... z_n^{\alpha_n}$ where the $\alpha_i$ are nonzero natural integers. If we assume in addition that $f$ is a maximally sparse polynomial (i.e., its support is equal to the set of vertices of its Newton polytope), then a complement component of the amoeba $\mathscr{A}_f$ in $\mathbb{R}^n$ of the algebraic hypersurface $V_f\subset (\mathbb{C}^*)^n$ defined by $f$ has order lying in the support of $f$, which means that $\mathscr{A}_f$ is solid. This gives an affirmative answer to Passare and Rullgård's question in [PR2-01].

  11. Singletons and their maximal symmetry algebras

    E-print Network

    Xavier Bekaert

    2012-01-02

    Singletons are those unitary irreducible modules of the Poincare or (anti) de Sitter group that can be lifted to unitary modules of the conformal group. Higher-spin algebras are the corresponding realizations of the universal enveloping algebra of the conformal algebra on these modules. These objects appear in a wide variety of areas of theoretical physics: AdS/CFT correspondence, electric-magnetic duality, higher-spin multiplets, infinite-component Majorana equations, higher-derivative symmetries, etc. Singletons and higher-spin algebras are reviewed through a list of their many equivalent definitions in order to approach them from various perspectives. The focus of this introduction is on the symmetries of a singleton: its maximal algebra and the manifest realization thereof.

  12. Singletons and their maximal symmetry algebras

    E-print Network

    Bekaert, Xavier

    2011-01-01

    Singletons are those unitary irreducible modules of the Poincare or (anti) de Sitter group that can be lifted to unitary modules of the conformal group. Higher-spin algebras are the corresponding realizations of the universal enveloping algebra of the conformal algebra on these modules. These objects appear in a wide variety of areas of theoretical physics: AdS/CFT correspondence, electric-magnetic duality, higher-spin multiplets, infinite-component Majorana equations, higher-derivative symmetries, etc. Singletons and higher-spin algebras are reviewed through a list of their many equivalent definitions in order to approach them from various perspectives. The focus of this introduction is on the symmetries of a singleton: its maximal algebra and the manifest realization thereof.

  13. Maximizing the value of a breast center.

    PubMed

    Goldman, Mickey; Chang, Dan

    2010-08-01

    This article focuses on the value and benefit of a Breast Center to an organization by identifying the best ways to maximize its contribution in order to create and sustain a financially viable, clinically respected, and community-oriented Breast Center. The goal of the Breast Center is to ultimately benefit the community and the hospital's Comprehensive Cancer Program as a whole. The value propositions are divided into three areas that have positive impacts on the program and hospital, collectively. These value propositions are: 1. Financial Value - identified values of the Breast Center that contribute to the bottom line, or Return on Investment (ROI), of the Cancer Program. 2. Clinical Quality Values - identified values of the Breast Center that improve the quality of care and outcomes of the patients. 3. Intangible Values - identified values of the Breast Center that connect to the community and women, which is invaluable to the Cancer Program. PMID:20400310

  14. Maximal energy extraction under discrete diffusive exchange

    NASA Astrophysics Data System (ADS)

    Hay, M. J.; Schiff, J.; Fisch, N. J.

    2015-10-01

    Waves propagating through a bounded plasma can rearrange the densities of states in the six-dimensional velocity-configuration phase space. Depending on the rearrangement, the wave energy can either increase or decrease, with the difference taken up by the total plasma energy. In the case where the rearrangement is diffusive, only certain plasma states can be reached. It turns out that the set of reachable states through such diffusive rearrangements has been described in very different contexts. Building upon those descriptions, and making use of the fact that the plasma energy is a linear functional of the state densities, the maximal extractable energy under diffusive rearrangement can then be addressed through linear programming.
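
    To illustrate only the linear-programming flavor of the problem: the toy below assumes the reachable final occupations are D @ n0 with D doubly stochastic and minimizes the final energy, so the extracted energy is the difference from the initial energy. The paper characterizes the exact set reachable under discrete diffusive exchange, which may be more restrictive, and the level energies and densities here are invented.

    # Toy linear program (illustrative assumptions, not the paper's constraint set).
    import numpy as np
    from scipy.optimize import linprog

    energies = np.array([0.0, 1.0, 2.0, 3.0])   # level energies eps_i (arbitrary units)
    n0 = np.array([0.1, 0.5, 0.1, 0.3])         # initial occupation densities

    m = len(energies)
    # Decision variables: entries of D flattened row-major, x[i*m + j] = D[i, j].
    # Objective: final energy = sum_{ij} eps_i * n0_j * D[i, j].
    c = np.outer(energies, n0).ravel()

    # Doubly stochastic constraints: every row and every column of D sums to 1.
    A_eq = np.zeros((2 * m, m * m))
    for k in range(m):
        A_eq[k, k * m:(k + 1) * m] = 1.0        # row k sums to 1
        A_eq[m + k, k::m] = 1.0                 # column k sums to 1
    b_eq = np.ones(2 * m)

    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * (m * m), method="highs")
    initial_energy = float(energies @ n0)
    print(f"extractable energy in this toy model: {initial_energy - res.fun:.3f}")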

  15. Dispatch Scheduling to Maximize Exoplanet Detection

    NASA Astrophysics Data System (ADS)

    Johnson, Samson; McCrady, Nate; MINERVA

    2016-01-01

    MINERVA is a dedicated exoplanet detection telescope array using radial velocity measurements of nearby stars to detect planets. MINERVA will be a completely robotic facility, with a goal of maximizing the number of exoplanets detected. MINERVA requires a unique application of queue scheduling due to its automated nature and the requirement of high cadence observations. A dispatch scheduling algorithm is employed to create a dynamic and flexible selector of targets to observe, in which stars are chosen by assigning values through a weighting function. I designed and have begun testing a simulation which implements the functions of a dispatch scheduler and records observations based on target selections through the same principles that will be used at the commissioned site. These results will be used in a larger simulation that incorporates weather, planet occurrence statistics, and stellar noise to test the planet detection capabilities of MINERVA. This will be used to heuristically determine an optimal observing strategy for the MINERVA project.
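
    A schematic of dispatch scheduling in this spirit; the weighting terms, cadence target, and altitude cut below are hypothetical, not MINERVA's actual function: score every currently observable target at decision time and observe the one with the highest weight.

    # Toy dispatch-scheduler sketch (hypothetical weighting function and targets).
    from dataclasses import dataclass

    @dataclass
    class Target:
        name: str
        priority: float          # science priority, higher is better
        hours_since_obs: float   # time since last radial-velocity observation
        altitude_deg: float      # current altitude above the horizon

    def weight(t: Target, desired_cadence_h: float = 24.0) -> float:
        if t.altitude_deg < 30.0:                # below the assumed airmass limit: not observable now
            return float("-inf")
        cadence_term = min(t.hours_since_obs / desired_cadence_h, 2.0)   # reward overdue targets
        altitude_term = t.altitude_deg / 90.0                            # prefer low airmass
        return t.priority * (1.0 + cadence_term) * altitude_term

    def next_target(targets):
        return max(targets, key=weight)

    targets = [
        Target("HD 1461", priority=1.0, hours_since_obs=30.0, altitude_deg=65.0),
        Target("HD 20794", priority=1.5, hours_since_obs=6.0, altitude_deg=80.0),
        Target("HD 10700", priority=2.0, hours_since_obs=12.0, altitude_deg=25.0),  # too low right now
    ]
    print(next_target(targets).name)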

  16. Maximizing Educational Opportunity through Community Resources.

    ERIC Educational Resources Information Center

    Maradian, Steve

    In the face of increased demands and diminishing resources, educational administrators at correctional facilities should look beyond institutional resources and utilize the services of area community colleges. The community college has an established track record in correctional education. Besides the nationally recognized correctional programs…

  17. Leverage Expectations and Bond Credit Spreads

    E-print Network

    Flannery, Mark J.; Nikolova, Stanislava; Ö ztekin, Ö zde

    2012-08-04

    In an efficient market, spreads will reflect both the issuer’s current risk and investors’ expectations about how that risk might change over time. Collin-Dufresne and Goldstein (2001) show analytically that a firm’s expected future leverage...

  18. Trends in Life Expectancy in Wellbeing

    ERIC Educational Resources Information Center

    Perenboom, R. J. M.; Van Herten, L. M.; Boshuizen, H. C.; Van Den Bos, G. A. M.

    2004-01-01

    Objectives: This paper describes and discusses trends in life expectancy in wellbeing between 1989 and 1998. Methods: Data on wellbeing by the Bradburn Affect Balance Scale is obtained from the Netherlands Continuous Health Interview Surveys for the calendar years from 1989 to 1998. Using Sullivan's method, life expectancy in wellbeing is…

  19. Do Students Expect Compensation for Wage Risk?

    ERIC Educational Resources Information Center

    Schweri, Juerg; Hartog, Joop; Wolter, Stefan C.

    2011-01-01

    We use a unique data set about the wage distribution that Swiss students expect for themselves ex ante, deriving parametric and non-parametric measures to capture expected wage risk. These wage risk measures are unfettered by heterogeneity which handicapped the use of actual market wage dispersion as risk measure in earlier studies. Students in…

  20. Rising Tides: Faculty Expectations of Library Websites

    ERIC Educational Resources Information Center

    Nicol, Erica Carlson; O'English, Mark

    2012-01-01

    Looking at 2003-2009 LibQUAL+ responses at research-oriented universities in the United States, faculty library users report a significant and consistent rise in desires and expectations for library-provided online tools and websites, even as student user groups show declining or leveling expectations. While faculty, like students, also report…

  1. The Expectant Reader in Theory and Practice.

    ERIC Educational Resources Information Center

    Fowler, Lois Josephs; McCormick, Kathleen

    1986-01-01

    Offers a method of using reader response theory that emphasizes the expectations about a text and how those expectations are fulfilled or deflated. Specifically, students read traditional fables, fairy tales, and parables, and compare them to contemporary works such as Kafka's "Metamorphosis" and Marquez's "The Very Old Man With Enormous Wings."…

  2. Grief Experiences and Expectance of Suicide

    ERIC Educational Resources Information Center

    Wojtkowiak, Joanna; Wild, Verena; Egger, Jos

    2012-01-01

    Suicide is generally viewed as an unexpected cause of death. However, some suicides might be expected to a certain extent, which needs to be further studied. The relationships between expecting suicide, feeling understanding for the suicide, and later grief experiences were explored. In total, 142 bereaved participants completed the Grief…

  3. 47 CFR 90.743 - Renewal expectancy.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 5 2011-10-01 2011-10-01 false Renewal expectancy. 90.743 Section 90.743 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Regulations Governing Licensing and Use of Frequencies in the 220-222 MHz Band § 90.743 Renewal expectancy. (a)...

  4. Alcohol expectancies in a Native American population.

    PubMed

    Garcia-Andrade, C; Wall, T L; Ehlers, C L

    1996-11-01

    Native Americans, as a group, have a high prevalence of alcohol abuse and alcohol dependence, although specific risk factors for alcoholism among this population have yet to be clearly identified. One set of factors that may contribute to the development of alcoholism is expectations of alcohol's effects. Previous research has shown that heavy drinkers and alcoholics have higher alcohol-related expectancies. Some studies have also shown an association between alcohol expectancies and a positive familial history of alcoholism. To examine factors that are related to expectations of alcohol's effects in a Native American population, this study evaluated healthy, nonalcoholic Mission Indian men between the ages of 18 and 25 years using the short form of the Alcohol Expectancy Questionnaire (AEQ). The influence of recent drinking history, family history of alcoholism, and degree of Native American heritage on alcohol-related expectancies was determined using regression analyses for the total AEQ score and for each of the six AEQ subscales. Recent drinking history accounted for a significant proportion of the variance in the total score, as well as scale I (global positive changes) and scale VI (arousal and power) of the AEQ. Degree of Native American heritage and family history of alcoholism did not account for a significant amount of variability in alcohol expectancies. These results suggest that, consistent with findings in other populations, alcohol expectancies are related to drinking patterns in Mission Indians. However, no association with two other potential risk factors was found in this sample of Native Americans. PMID:8947322

  5. The vortex-finding property of maximal center (and other) gauges

    SciTech Connect

    Faber, M.; Greensite, J.; Olejnik, S.; Yamada, D.

    1999-10-01

    The authors argue that the vortex-finding property of maximal center gauge, i.e. the ability of this gauge to locate center vortices inserted by hand on any given lattice, is the key to its success in extracting the vortex content of thermalized lattice configurations. The authors explain how this property comes about, and why it is expected not only in maximal center gauge, but also in an infinite class of gauge conditions based on adjoint-representation link variables. In principle, the vortex-finding property can be foiled by Gribov copies. This fact is relevant to a gauge-fixing procedure devised by Kovacs and Tomboulis, and the authors show that the loss of center dominance found in that procedure is explained by a corresponding loss of the vortex-finding property. The dependence of center dominance on the vortex-finding property is demonstrated numerically in a number of other gauges.

  6. Utilization of the terrestrial cyanobacteria

    NASA Astrophysics Data System (ADS)

    Katoh, Hiroshi; Tomita-Yokotani, Kaori; Furukawa, Jun; Kimura, Shunta; Yokoshima, Mika; Yamaguchi, Yuji; Takenaka, Hiroyuki

    The terrestrial, N2-fixing cyanobacterium Nostoc commune is expected to be useful for agriculture, food, and terraforming because of its extracellular polysaccharide, desiccation tolerance, and nitrogen fixation. Previously, the first author analyzed desiccation-related genes and suggested that these genes were related to nitrogen fixation and metabolism. In this report, we suggest the possibility of agriculture using the cyanobacterium. Further, we also found that N. commune accumulated radioactive compounds in Fukushima, Japan, after the nuclear accident. We therefore investigated the decontamination of radioactive compounds from the surface soil by the cyanobacterium and showed that the cyanobacterium accumulates radioactive compounds. We will discuss the utilization of terrestrial cyanobacteria under closed environments. Keywords: desiccation, terrestrial cyanobacteria, bioremediation, agriculture

  7. Content Specificity of Expectancy Beliefs and Task Values in Elementary Physical Education

    PubMed Central

    Chen, Ang; Martin, Robert; Ennis, Catherine D.; Sun, Haichun

    2015-01-01

    The curriculum may superimpose a content-specific context that mediates motivation (Bong, 2001). This study examined content specificity of the expectancy-value motivation in elementary school physical education. Students’ expectancy beliefs and perceived task values from a cardiorespiratory fitness unit, a muscular fitness unit, and a traditional skill/game unit were analyzed using constant comparison coding procedures, multivariate analysis of variance, χ2, and correlation analyses. There was no difference in the intrinsic interest value among the three content conditions. Expectancy belief, attainment, and utility values were significantly higher for the cardiorespiratory fitness curriculum. Correlations differentiated among the expectancy-value components of the content conditions, providing further evidence of content specificity in the expectancy-value motivation process. The findings suggest that expectancy beliefs and task values should be incorporated in the theoretical platform for curriculum development based on the learning outcomes that can be specified with enhanced motivation effect. PMID:18664044

  8. Orbiter electrical equipment utilization baseline

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The baseline for utilization of Orbiter electrical equipment in both electrical and Environmental Control and Life Support System (ECLSS) thermal analyses is established. It is a composite catalog of Space Shuttle equipment, as defined in the Shuttle Operational Data Book. The major functions and expected usage of each component type are described. Functional descriptions are designed to provide a fundamental understanding of the Orbiter electrical equipment, to insure correlation of equipment usage within nominal analyses, and to aid analysts in the formulation of off-nominal, contingency analyses.

  9. Solar energy research and utilization

    NASA Technical Reports Server (NTRS)

    Cherry, W. R.

    1974-01-01

    The role of solar energy is visualized in the heating and cooling of buildings, in the production of renewable gaseous, liquid and solid fuels, and in the production of electric power over the next 45 years. Potential impacts of solar energy on various energy markets, and estimated costs of such solar energy systems are discussed. Some typical solar energy utilization processes are described in detail. It is expected that at least 20% of the U.S. total energy requirements by 2020 will be delivered from solar energy.

  10. Competition and utility financial risks

    SciTech Connect

    Studness, C.M.

    1993-07-01

    While competition for electric utilities has grown steadily for over a decade, the inroads have been small. Utilities have lost load by being forced to buy power from cogenerators. They have foregone some of their normal growth by choosing to buy power from independent power producers instead of building generating facilities themselves. They have absorbed earnings erosion by giving discounts to large industrial customers to avoid having them move production outside their service areas. Yet although competition in these areas can be expected to intensify, the real financial risk for utilities lies on other fronts, principally direct price competition. The type of competition experienced thus far will constrain utility financial progress, but utilities will no doubt find ways to mitigate its impact, an example being investment in demand-side management (DSM) programs. Direct price competition, on the other hand, offers few if any avenues of escape, and it is only a matter of time before the barriers that prevent it are removed. One of the largest is the prohibition of retail wheeling, which is the principal source of price protection for utilities. Significantly, over the course of the last year the prohibition of retail wheeling has been transformed from an untouchable issue into the central issue in the struggle over competition. Price competition, when it develops, will be driven by the cost of producing electric power with new generating facilities and whatever excess generating capacity exists in the industry. How important price competition becomes will depend on what customers want. If low-cost power turns out to be a top priority, price competition will be a very important part of the competitive picture. The experience of industries that have been deregulated, such as the airlines, suggests that low prices will have a high priority.

  11. Utilities goals determine the best AMR solution

    SciTech Connect

    Kelly, R.

    1998-01-01

    Deregulation in the electric power industry is forcing many utilities to demand flexible communications systems capable of providing core utility services while also being readily extendible for revenue-enhancing opportunities. To be competitive, particularly now with deregulation formally debuting, utility decision-makers need to acknowledge their choices. Automatic meter reading (AMR) committees everywhere are being challenged by the minute to decide on systems. Appropriate AMR systems will allow utilities to improve efficiency, enhance customer satisfaction and provide additional services in an increasingly competitive environment. Utilities need technology that meets day-to-day requirements now and in the future. They need to decide which AMR value-added services and benefits address customer expectations. They will need to educate their customers more fully and improve communication with them. AMR gives them opportunities to differentiate themselves with lower prices, more services and better customer support.

  12. [What Expectations Do Mentally Disordered People Have about Treatment in a Psychiatric Hospital?]

    PubMed

    Fleischmann, Heribert

    2003-05-01

    Patients are mostly passive utilizers of the health-care system. They are confronted with a supply of medical services and are allowed to express their satisfaction with it retrospectively. In the future, our medical system has to develop from an effectiveness-oriented perspective towards a utilizer-oriented medicine. Orientation to the utilizers means asking about the patients' expectations regarding the services supplied (at the customer's option). The aim of our investigation was to assess the subjective expectations of patients before the beginning of in-patient treatment: 1. What is their opinion about the label of the disorder from which they are suffering? 2. From which therapeutic measures do they expect help for themselves? 3. Do they want to take part in planning the therapeutic measures? At admission, 209 of 344 patients (61 %) were ready to answer a self-designed questionnaire. Only 4 % of the patients said that their disorder is called insanity. They preferred labels like mental illness (45 %), somatic illness (43 %) and mental health problem (42 %). A pharmacological therapy was expected by 61 % of the patients in total. Most often expected were drugs against depressive disorders (32 %), drugs against addiction (31 %) and tranquilizers (29 %). Only 10 % of the patients expected to receive antipsychotic drugs. A verbal therapeutic intervention was expected by 76 % of the patients. A talk with the doctor was the first-ranked desire (69 %), followed by talks with the psychologist (60 %), the nurses (58 %) and fellow patients (56 %). Psychotherapy in a narrower sense was expected by only 40 % of the patients. Furthermore, privacy and recreation through walks ranked high among the expectations (69 %), followed by relaxation (59 %), occupational therapy (55 %) and sports or active exercise therapy (54 %). 75 % of the patients want to be informed about the therapy, and 69 % want to take part in planning it. Only 21 % leave the therapy entirely to the doctor. About one third of the patients expect a consultation with their relatives, the custodians and their family doctor. PMID:13130358

  13. Maximally localized Wannier functions: Theory and applications

    NASA Astrophysics Data System (ADS)

    Marzari, Nicola; Mostofi, Arash A.; Yates, Jonathan R.; Souza, Ivo; Vanderbilt, David

    2012-10-01

    The electronic ground state of a periodic system is usually described in terms of extended Bloch orbitals, but an alternative representation in terms of localized “Wannier functions” was introduced by Gregory Wannier in 1937. The connection between the Bloch and Wannier representations is realized by families of transformations in a continuous space of unitary matrices, carrying a large degree of arbitrariness. Since 1997, methods have been developed that allow one to iteratively transform the extended Bloch orbitals of a first-principles calculation into a unique set of maximally localized Wannier functions, accomplishing the solid-state equivalent of constructing localized molecular orbitals, or “Boys orbitals” as previously known from the chemistry literature. These developments are reviewed here, and a survey of the applications of these methods is presented. This latter includes a description of their use in analyzing the nature of chemical bonding, or as a local probe of phenomena related to electric polarization and orbital magnetization. Wannier interpolation schemes are also reviewed, by which quantities computed on a coarse reciprocal-space mesh can be used to interpolate onto much finer meshes at low cost, and applications in which Wannier functions are used as efficient basis functions are discussed. Finally the construction and use of Wannier functions outside the context of electronic-structure theory is presented, for cases that include phonon excitations, photonic crystals, and cold-atom optical lattices.

  14. The Constrained Maximal Expression Level Owing to Haploidy Shapes Gene Content on the Mammalian X Chromosome.

    PubMed

    Hurst, Laurence D; Ghanbarian, Avazeh T; Forrest, Alistair R R; Huminiecki, Lukasz

    2015-12-01

    X chromosomes are unusual in many regards, not least of which is their nonrandom gene content. The causes of this bias are commonly discussed in the context of sexual antagonism and the avoidance of activity in the male germline. Here, we examine the notion that, at least in some taxa, functionally biased gene content may more profoundly be shaped by limits imposed on gene expression owing to haploid expression of the X chromosome. Notably, if the X, as in primates, is transcribed at rates comparable to the ancestral rate (per promoter) prior to the X chromosome formation, then the X is not a tolerable environment for genes with very high maximal net levels of expression, owing to transcriptional traffic jams. We test this hypothesis using The Encyclopedia of DNA Elements (ENCODE) and data from the Functional Annotation of the Mammalian Genome (FANTOM5) project. As predicted, the maximal expression of human X-linked genes is much lower than that of genes on autosomes: on average, maximal expression is three times lower on the X chromosome than on autosomes. Similarly, autosome-to-X retroposition events are associated with lower maximal expression of retrogenes on the X than seen for X-to-autosome retrogenes on autosomes. Also as expected, X-linked genes have a lesser degree of increase in gene expression than autosomal ones (compared to the human/Chimpanzee common ancestor) if highly expressed, but not if lowly expressed. The traffic jam model also explains the known lower breadth of expression for genes on the X (and the Z of birds), as genes with broad expression are, on average, those with high maximal expression. As then further predicted, highly expressed tissue-specific genes are also rare on the X and broadly expressed genes on the X tend to be lowly expressed, both indicating that the trend is shaped by the maximal expression level not the breadth of expression per se. Importantly, a limit to the maximal expression level explains biased tissue of expression profiles of X-linked genes. Tissues whose tissue-specific genes are very highly expressed (e.g., secretory tissues, tissues abundant in structural proteins) are also tissues in which gene expression is relatively rare on the X chromosome. These trends cannot be fully accounted for in terms of alternative models of biased expression. In conclusion, the notion that it is hard for genes on the Therian X to be highly expressed, owing to transcriptional traffic jams, provides a simple yet robustly supported rationale of many peculiar features of X's gene content, gene expression, and evolution. PMID:26685068

  15. The Constrained Maximal Expression Level Owing to Haploidy Shapes Gene Content on the Mammalian X Chromosome

    PubMed Central

    Hurst, Laurence D.; Ghanbarian, Avazeh T.; Forrest, Alistair R. R.; Huminiecki, Lukasz

    2015-01-01

    X chromosomes are unusual in many regards, not least of which is their nonrandom gene content. The causes of this bias are commonly discussed in the context of sexual antagonism and the avoidance of activity in the male germline. Here, we examine the notion that, at least in some taxa, functionally biased gene content may more profoundly be shaped by limits imposed on gene expression owing to haploid expression of the X chromosome. Notably, if the X, as in primates, is transcribed at rates comparable to the ancestral rate (per promoter) prior to the X chromosome formation, then the X is not a tolerable environment for genes with very high maximal net levels of expression, owing to transcriptional traffic jams. We test this hypothesis using The Encyclopedia of DNA Elements (ENCODE) and data from the Functional Annotation of the Mammalian Genome (FANTOM5) project. As predicted, the maximal expression of human X-linked genes is much lower than that of genes on autosomes: on average, maximal expression is three times lower on the X chromosome than on autosomes. Similarly, autosome-to-X retroposition events are associated with lower maximal expression of retrogenes on the X than seen for X-to-autosome retrogenes on autosomes. Also as expected, X-linked genes have a lesser degree of increase in gene expression than autosomal ones (compared to the human/Chimpanzee common ancestor) if highly expressed, but not if lowly expressed. The traffic jam model also explains the known lower breadth of expression for genes on the X (and the Z of birds), as genes with broad expression are, on average, those with high maximal expression. As then further predicted, highly expressed tissue-specific genes are also rare on the X and broadly expressed genes on the X tend to be lowly expressed, both indicating that the trend is shaped by the maximal expression level not the breadth of expression per se. Importantly, a limit to the maximal expression level explains biased tissue of expression profiles of X-linked genes. Tissues whose tissue-specific genes are very highly expressed (e.g., secretory tissues, tissues abundant in structural proteins) are also tissues in which gene expression is relatively rare on the X chromosome. These trends cannot be fully accounted for in terms of alternative models of biased expression. In conclusion, the notion that it is hard for genes on the Therian X to be highly expressed, owing to transcriptional traffic jams, provides a simple yet robustly supported rationale of many peculiar features of X’s gene content, gene expression, and evolution. PMID:26685068

  16. What to Expect during Heart Surgery

    MedlinePLUS

    What To Expect During Heart Surgery Heart surgery is done in a hospital, ... surgery, takes about 3–6 hours. Traditional Open-Heart Surgery For this type of surgery, you'll ...

  17. Classics in the Classroom: Great Expectations Fulfilled.

    ERIC Educational Resources Information Center

    Pearl, Shela

    1986-01-01

    Describes how an English teacher in a Queens, New York, ghetto school introduced her grade nine students to Charles Dickens's "Great Expectations." Focuses on students' responses, which eventually became enthusiastic, and discusses the use of classics within the curriculum. (KH)

  18. Parental outcome expectations on children's TV viewing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Children's TV viewing has been associated with increased sedentary behavior and poor eating habits. Positive intervention effects have been observed when addressing outcome expectations as a mediator in interventions targeting children's dietary behavior. Little is known about parental outcome expec...

  19. What To Expect Before a Lung Transplant

    MedlinePLUS

    ... NHLBI on Twitter. What To Expect Before a Lung Transplant If you get into a medical center's ... friends also can offer support. When a Donor Lung Becomes Available OPTN matches donor lungs to recipients ...

  20. What to Expect During a Lung Transplant

    MedlinePLUS

    What To Expect During a Lung Transplant Just before lung transplant surgery, you will ... airway and its blood vessels to your heart. Lung Transplant The illustration shows the process of a ...

  1. An expectation model of referring expressions

    E-print Network

    Kræmer, John, Ph. D. Massachusetts Institute of Technology

    2010-01-01

    This thesis introduces EMRE, an expectation-based model of referring expressions. EMRE is proposed as a model of non-syntactic dependencies - in particular, discourse-level semantic dependencies that bridge sentence gaps. ...

  2. A multistate analysis of active life expectancy.

    PubMed Central

    Rogers, A; Rogers, R G; Branch, L G

    1989-01-01

    With today's lower mortality rates, longer expectations of life, and new medical technologies, the nation's health policy focus has shifted from emphasis on individual survival to emphasis on personal health and independent living. Using longitudinal data sets and new methodological techniques, researchers have begun to assess active life expectancies, estimating not only how long a subpopulation can expect to live beyond each age, but what fractions of the expected remaining lifetime will be lived as independent, dependent, or institutionalized. New ideas are addressed, applying recently developed multistate life table methods to Waves One and Two of the Massachusetts Health Care Panel Study. Expectations of active life are presented for those 65 and older who initially are in one of two functional states of well-being. Included are expectations of life for those, for example, who were independent and remained so, or who were dependent and became independent. Although public health officials are concerned about the number of elderly who cease being independent, preliminary analysis shows that a significant number of the dependent elderly regain their independence, a situation which needs to be addressed in health care planning. PMID:2498971
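
    A minimal multistate life-table sketch, with invented transition probabilities rather than the Massachusetts Health Care Panel Study estimates: a three-state Markov chain (independent, dependent, dead) is stepped year by year to accumulate the expected years lived in each living state from age 65, including the possibility of regaining independence.

    # Toy multistate life table (all transition rates are hypothetical).
    import numpy as np

    STATES = ["Independent", "Dependent", "Dead"]

    def annual_transition(age):
        """Hypothetical one-year transition matrix at a given age (rows sum to 1)."""
        mort = min(0.01 * 1.09 ** (age - 65), 0.4)      # mortality risk, rising with age
        to_dep = min(0.03 * 1.07 ** (age - 65), 0.4)    # risk of losing independence
        dep_mort = min(2 * mort, 0.6)                   # higher mortality when dependent
        recover = 0.25                                  # chance of regaining independence
        return np.array([
            [1 - to_dep - mort, to_dep, mort],              # from Independent
            [recover, 1 - recover - dep_mort, dep_mort],    # from Dependent
            [0.0, 0.0, 1.0],                                # Dead is absorbing
        ])

    def remaining_years_by_state(start_state, start_age=65, max_age=110):
        """Crude person-year count of expected remaining years in each living state."""
        occupancy = np.zeros(3)
        occupancy[STATES.index(start_state)] = 1.0
        years = np.zeros(3)
        for age in range(start_age, max_age):
            years += occupancy                           # years lived in each state this year
            occupancy = occupancy @ annual_transition(age)
        return {state: round(float(y), 1) for state, y in zip(STATES[:2], years[:2])}

    print("starting independent:", remaining_years_by_state("Independent"))
    print("starting dependent  :", remaining_years_by_state("Dependent"))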

  3. [Live longer, suffer more? Trends in life expectancy and health].

    PubMed

    Doblhammer, G; Kreft, D

    2011-08-01

    During the 20th century, life expectancy has been continuously increasing with the majority of the additional years resulting from decreasing mortality among the old and oldest old in the last few decades. Two phases of convergence and divergence in European mortality have been identified, with a possible new phase of divergence taking place among the oldest old. Over this period, women have always been living longer than men. Explanations for this phenomenon include not only biological factors and differences in lifestyle and health care utilization, but also differences in reporting patterns. Trends in health do not follow a clear direction. Reasons are the different dimensions of health as well as inadequate data. In general, the prevalence of morbidity has been increasing, while functional limitations and ADL disabilities have been decreasing. Due to a lack of data, no information exists for trends among the oldest old at age 80 and above. In absolute terms, the number of healthy years has been increasing with increasing life expectancy. In relative terms, they have been stable, probably slightly increasing in some countries. These trends suggest that increasing life expectancy does not result in an increase in morbidity. PMID:21800237

  4. Expected Utility as a Policy Making Tool: An Environmental Health Example

    E-print Network

    ... environmental contamination. Decision-making under uncertainty has long been an aim ... The problem is cast into the framework of making a decision that is optimal in some sense, while allowing consideration of some of the more qualitative aspects of decision making. The outline of the paper is as follows ...

  5. Understanding the Hows and Whys of Decision-Making: From Expected Utility to Divisive Normalization.

    PubMed

    Glimcher, Paul

    2014-01-01

    Over the course of the last century, economists and ethologists have built detailed models from first principles of how humans and animals should make decisions. Over the course of the last few decades, psychologists and behavioral economists have gathered a wealth of data at variance with the predictions of these economic models. This has led to the development of highly descriptive models that can often predict what choices people or animals will make but without offering any insight into why people make the choices that they do--especially when those choices reduce a decision-maker's well-being. Over the course of the last two decades, neurobiologists working with economists and psychologists have begun to use our growing understanding of how the nervous system works to develop new models of how the nervous system makes decisions. The result, a growing revolution at the interdisciplinary border of neuroscience, psychology, and economics, is a new field called Neuroeconomics. Emerging neuroeconomic models stand to revolutionize our understanding of human and animal choice behavior by combining fundamental properties of neurobiological representation with decision-theoretic analyses. In this overview, one class of these models, based on the widely observed neural computation known as divisive normalization, is presented in detail. The work demonstrates not only that a discrete class of computation widely observed in the nervous system is fundamentally ubiquitous, but how that computation shapes behaviors ranging from visual perception to financial decision-making. It also offers the hope of reconciling economic analysis of what choices we should make with psychological observations of the choices we actually do make. PMID:25637264
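
    Divisive normalization itself is a one-line computation: each option's represented value is its raw value divided by a constant plus the (weighted) summed value of the whole choice set. The toy sketch below only illustrates the context dependence this produces; the parameter values and choice sets are arbitrary, not taken from the neural models discussed in the overview.

    ```python
    import numpy as np

    def divisive_normalization(values, sigma=1.0, weight=1.0):
        """Represented value of each option after divisive normalization."""
        values = np.asarray(values, dtype=float)
        return values / (sigma + weight * values.sum())

    small_set = [10.0, 8.0]
    large_set = [10.0, 8.0, 7.0, 6.0]

    # The same raw value (10) is represented less strongly in a richer choice
    # set, the kind of context dependence neuroeconomic models exploit.
    print(divisive_normalization(small_set))
    print(divisive_normalization(large_set))
    ```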

  6. Isotope gastric emptying tests in clinical practice: expectation, outcome, and utility.

    PubMed Central

    Galil, M A; Critchley, M; Mackie, C R

    1993-01-01

    Tests of gastric emptying with modern scintigraphic methods are recommended in the clinical management of gastric disorders. An audit of 472 gastric emptying tests carried out over a 10 year period was performed to discover the reasons for requests from consultant clinicians, their anticipation of the results of tests, and the influence of the results upon the subsequent management of their patients. Excluding control (n = 47) and research (n = 50) studies, there were 375 clinical referrals that could be grouped under the headings: non-ulcer dyspepsia (n = 72), suspected diabetic gastroparesis (n = 18), peptic ulcer (n = 15), suspected delayed gastric emptying after surgery (n = 154), dumping and diarrhoea (n = 107), and other indications (n = 9). Although the results were abnormal for 55 (48%) of the 'medical' patients, they did not seem to influence clinical management. Delayed gastric emptying after surgery was confirmed in only 20% of patients referred with this clinical diagnosis. Conversely, most (79%) of the patients referred with dumping and diarrhoea exhibited abnormally rapid emptying. Isotope gastric emptying studies may be useful in clinical practice. The results are often at variance with the clinical diagnosis. Clinicians must take into account the nature of the test meal used when results are correlated with clinical features. PMID:8344578

  7. Pace's Maxims for Homegrown Library Projects. Coming Full Circle

    ERIC Educational Resources Information Center

    Pace, Andrew K.

    2005-01-01

    This article discusses six maxims by which to run library automation. The following maxims are discussed: (1) Solve only known problems; (2) Avoid changing data to fix display problems; (3) Aut viam inveniam aut faciam; (4) If you cannot make it yourself, buy something; (5) Kill the alligator closest to the boat; and (6) Just because yours is…

  8. GENERATION AND RANDOM GENERATION: FROM SIMPLE GROUPS TO MAXIMAL SUBGROUPS

    E-print Network

    Burness, Tim

    Let d(G) denote the minimal number of generators for a group G. It is well known that d(G) = 2 for all (non-abelian) finite simple groups. The authors prove results on d for maximal subgroups and investigate the random generation of maximal subgroups of simple and almost simple groups. ...

  9. EMSL Strategic Plan to Maximize Scientific Impact of

    E-print Network

    EMSL Strategic Plan to Maximize Scientific Impact of the Quiet Wing (PEMP Notable Outcomes, Goal 2), Washington 99352. The plan aims to jumpstart interest within the scientific community (particularly the BER community) ...

  10. Preschoolers Can Recognize Violations of the Gricean Maxims

    ERIC Educational Resources Information Center

    Eskritt, Michelle; Whalen, Juanita; Lee, Kang

    2008-01-01

    Grice ("Syntax and semantics: Speech acts", 1975, pp. 41-58, Vol. 3) proposed that conversation is guided by a spirit of cooperation that involves adherence to several conversational maxims. Three types of maxims were explored in the current study: 1) Quality, to be truthful; 2) Relation, to say only what is relevant to a conversation; and 3)…

  11. Generative Modeling for Maximizing Precision and Recall in Information Visualization

    E-print Network

    Kaski, Samuel

    The authors turn the visualization into a generative modeling task where a simple user model maximizes pure recall; adding a mixture component that "explains away" misses allows their generative model to ...

  12. Effect of Age and Other Factors on Maximal Heart Rate.

    ERIC Educational Resources Information Center

    Londeree, Ben R.; Moeschberger, Melvin L.

    1982-01-01

    To reduce confusion regarding reported effects of age on maximal exercise heart rate, a comprehensive review of the relevant English literature was conducted. Data on maximal heart rate after exercising with a bicycle, a treadmill, and after swimming were analyzed with regard to physical fitness and to age, sex, and racial differences. (Authors/PP)

  13. The maximal body massarea relationship in island mammals

    E-print Network

    Gonzalez, Andrew

    Keywords: area, island evolution, mammal, maximal body mass. Journal of Biogeography (J. Biogeogr.), 2011. Original article: The maximal body mass-area relationship in island mammals (Virginie Millien). ... of a species' physiology, ecology and evolution (Peters, 1983; Calder, 1984; Schmidt-Nielsen, 1984), including ...

  14. Parallel Double Greedy Submodular Maximization Xinghao Pan1

    E-print Network

    McAuliffe, Jon

    Many machine learning problems can be reduced to the maximization of submodular functions; the paper discusses the tradeoffs of each approach. Important problems include sensor placement [3] and image co-segmentation, with worst-case guarantees on the quality of the solution available for several maximization problems of monotone submodular functions. ...
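
    For readers unfamiliar with the serial "double greedy" algorithm that the parallel work builds on, a minimal sketch of the deterministic variant is given below; the cut-function objective and the small graph are invented for illustration. The deterministic variant guarantees a 1/3-approximation for unconstrained submodular maximization, the randomized variant 1/2.

    ```python
    # Deterministic double greedy for unconstrained submodular maximization.
    edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4), (1, 4)]
    n = 5

    def cut_value(S):
        """Number of edges with exactly one endpoint in S (submodular, non-monotone)."""
        S = set(S)
        return sum((u in S) != (v in S) for u, v in edges)

    def double_greedy(f, n):
        X, Y = set(), set(range(n))
        for e in range(n):
            gain_add = f(X | {e}) - f(X)      # marginal gain of adding e to X
            gain_rem = f(Y - {e}) - f(Y)      # marginal gain of removing e from Y
            if gain_add >= gain_rem:
                X.add(e)
            else:
                Y.remove(e)
        return X                              # X == Y at this point

    S = double_greedy(cut_value, n)
    print("Selected set:", S, "cut value:", cut_value(S))
    ```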

  15. Surrogate Maximization/Minimization Algorithms and Extensions Zhihua Zhang

    E-print Network

    Yeung, Dit-Yan

    Surrogate maximization (or minimization) (SM) algorithms work by iterating two steps. The S-step computes a tractable surrogate function to substitute the original ...
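
    The two-step pattern is easiest to see on a toy problem. The sketch below uses the minimization flavor: a quadratic surrogate majorizes the sum of absolute deviations, and minimizing the surrogate in closed form drives the iterate toward the sample median. The data and tolerance are arbitrary; this illustrates the S-step/M-step structure, not the specific algorithms developed in the paper.

    ```python
    import numpy as np

    x = np.array([1.0, 2.0, 2.5, 7.0, 9.0, 10.0, 30.0])   # made-up data; median is 7

    def mm_median(x, iters=100, eps=1e-8):
        """Minimize sum_i |x_i - m| by iterating a quadratic surrogate."""
        m = x.mean()                                       # any starting point works
        for _ in range(iters):
            w = 1.0 / np.maximum(np.abs(x - m), eps)       # S-step: surrogate weights
            m = np.sum(w * x) / np.sum(w)                  # M-step: minimize surrogate
        return m

    print("MM estimate:", mm_median(x), " numpy median:", np.median(x))
    ```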

  16. Detrimental Relations of Maximization with Academic and Career Attitudes

    ERIC Educational Resources Information Center

    Dahling, Jason J.; Thompson, Mindi N.

    2013-01-01

    Maximization refers to a decision-making style that involves seeking the single best option when making a choice, which is generally dysfunctional because people are limited in their ability to rationally evaluate all options and identify the single best outcome. The vocational consequences of maximization are examined in two samples, college…

  17. ON THE DERIVATION OF A CONVERSATIONAL MAXIM Th. R. Hofmann

    E-print Network

    A principle believed to be a maxim for the speaker in well-formed conversation is, in fact, only a special case. This leads to questions about conversational maxims and the nature of the relation between syntactic form ... for the speaker. To illustrate their operation with the first two groups, consider someone saying "I think it is rain ..."

  18. Note on maximally entangled Eisert-Lewenstein-Wilkens quantum games

    NASA Astrophysics Data System (ADS)

    Bolonek-Laso?, Katarzyna; Kosi?ski, Piotr

    2015-10-01

    Maximally entangled Eisert-Lewenstein-Wilkens games are analyzed. For a general class of gates defined in the previous papers of the first author, the general conditions are derived which allow to determine the form of gate leading to maximally entangled games. The construction becomes particularly simple provided one does distinguish between games differing by relabeling of strategies. Some examples are presented.

  19. [The effects of teacher expectancy and self-expectancy on performance].

    PubMed

    Choi, K S

    1987-08-01

    The present study was designed to investigate the effect on performance of the relationship between teacher expectancy and self-expectancy. For the induced expectancy, a random half of 96 high school students enrolled in a four-week summer language course of a Christian association were described to the instructors as having high success potential. The remaining trainees served as controls. Correct scores on the learning task, instructor ratings of behavior and attitude of the instructors were measured on three sessions of the course. Ratings of teacher's behavior were factor-analyzed and four interpretable factors emerged: Support, Caring, Attention, and Tutoring. The induced expectancy and specific levels of self-expectancy had significant effects on the subjects' performance and ratings of the instructor. It was concluded that self-expectancy mediates the effects of teacher expectancy on learning performance. Implications of these results for the Pygmalion effect were discussed. PMID:3450908

  20. Maximally entangled states in pseudo-telepathy games

    E-print Network

    Laura Man?inska

    2015-06-23

    A pseudo-telepathy game is a nonlocal game which can be won with probability one using some finite-dimensional quantum strategy but not using a classical one. Our central question is whether there exist two-party pseudo-telepathy games which cannot be won with probability one using a maximally entangled state. Towards answering this question, we develop conditions under which maximally entangled states suffice. In particular, we show that maximally entangled states suffice for weak projection games which we introduce as a relaxation of projection games. Our results also imply that any pseudo-telepathy weak projection game yields a device-independent certification of a maximally entangled state. In particular, by establishing connections to the setting of communication complexity, we exhibit a class of games $G_n$ for testing maximally entangled states of local dimension $\\Omega(n)$. We leave the robustness of these self-tests as an open question.

  1. On the maximal efficiency of the collisional Penrose process

    E-print Network

    Leiderschneider, Elly

    2015-01-01

    The center of mass (CM) energy in a collisional Penrose process - a collision taking place within the ergosphere of a Kerr black hole - can diverge under suitable extreme conditions (maximal Kerr, near horizon collision and suitable impact parameters). We present an analytic expression for the CM energy, refining expressions given in the literature. Even though the CM energy diverges, we show that the maximal energy attained by a particle that escapes the black hole's gravitational pull and reaches infinity is modest. We obtain an analytic expression for the energy of an escaping particle resulting from a collisional Penrose process, and apply it to derive the maximal energy and the maximal efficiency for several physical scenarios: pair annihilation, Compton scattering, and the elastic scattering of two massive particles. In all physically reasonable cases (in which the incident particles initially fall from infinity towards the black hole) the maximal energy (and the corresponding efficiency) are only one o...

  2. Home care technology through an ability expectation lens.

    PubMed

    Wolbring, Gregor; Lashewicz, Bonnie

    2014-01-01

    Home care is on the rise, and its delivery is increasingly reliant on an expanding variety of health technologies ranging from computers to telephone "health apps" to social robots. These technologies are most often predicated on expectations that people in their homes (1) can actively interact with these technologies and (2) are willing to submit to the action of the technology in their home. Our purpose is to use an "ability expectations" lens to bring together, and provide some synthesis of, the types of utility and disadvantages that can arise for people with disabilities in relation to home care technology development and use. We searched the academic databases Scopus, Web of Science, EBSCO ALL, IEEE Xplore, and Compendex to collect articles that had the term "home care technology" in the abstract or as a topic (in the case of Web of Science). We also used our background knowledge and related academic literature pertaining to self-diagnosis, health monitoring, companionship, health information gathering, and care. We examined background articles and articles collected through our home care technology search in terms of ability expectations assumed in the presentation of home care technologies, or discussed in relation to home care technologies. While advances in health care support are made possible through emerging technologies, we urge critical examination of such technologies in terms of implications for the rights and dignity of people with diverse abilities. Specifically, we see potential for technologies to result in new forms of exclusion and powerlessness. Ableism influences choices made by funders, policy makers, and the public in the development and use of home health technologies and impacts how people with disabilities are served and how useful health support technologies will be for them. We urge continued critical examination of technology development and use according to ability expectations, and we recommend increasing incorporation of participatory design processes to counteract potential for health support technology to render people with disabilities technologically excluded and powerless. PMID:24956581

  3. Adolescent Expectations of Early Death Predict Young Adult Socioeconomic Status

    PubMed Central

    Nguyen, Quynh C.; Hussey, Jon M.; Halpern, Carolyn T.; Villaveces, Andres; Marshall, Stephen W.; Siddiqi, Arjumand; Poole, Charles

    2013-01-01

    Among adolescents, expectations of early death have been linked to future risk behaviors. These expectations may also reduce personal investment in education and training, thereby lowering adult socioeconomic status attainment. The importance of socioeconomic status is highlighted by pervasive health inequities and dramatic differences in life expectancy among education and income groups. The objectives of this study were to investigate patterns of change in perceived chances of living to age 35 (Perceived Survival Expectations; PSE), predictors of PSE, and associations between PSE and future socioeconomic status attainment. We utilized the U.S. National Longitudinal Study of Adolescent Health (Add Health) initiated in 1994-95 among 20,745 adolescents in grades 7-12 with follow-up interviews in 1996 (Wave II), 2001-02 (Wave III) and 2008 (Wave IV; ages 24-32). At Wave I, 14% reported ≤ 50% chance of living to age 35 and older adolescents reported lower PSE than younger adolescents. At Wave III, PSE were similar across age. Changes in PSE from Wave I to III were moderate, with 89% of respondents reporting no change (56%), one level higher (22%) or one level lower (10%) in a 5-level PSE variable. Higher block group poverty rate, perceptions that the neighborhood is unsafe, and less time in the U.S. (among the foreign-born) were related to low PSE at Waves I and III. Low PSE at Waves I and III predicted lower educational attainment and personal earnings at Wave IV in multinomial logistic regression models controlling for confounding factors such as previous family socioeconomic status, individual demographic characteristics, and depressive symptoms. Anticipation of an early death is prevalent among adolescents and predictive of lower future socioeconomic status. Low PSE reported early in life may be a marker for worse health trajectories. PMID:22405687

  4. Home Care Technology Through an Ability Expectation Lens

    PubMed Central

    2014-01-01

    Home care is on the rise, and its delivery is increasingly reliant on an expanding variety of health technologies ranging from computers to telephone “health apps” to social robots. These technologies are most often predicated on expectations that people in their homes (1) can actively interact with these technologies and (2) are willing to submit to the action of the technology in their home. Our purpose is to use an “ability expectations” lens to bring together, and provide some synthesis of, the types of utility and disadvantages that can arise for people with disabilities in relation to home care technology development and use. We searched the academic databases Scopus, Web of Science, EBSCO ALL, IEEE Xplore, and Compendex to collect articles that had the term “home care technology” in the abstract or as a topic (in the case of Web of Science). We also used our background knowledge and related academic literature pertaining to self-diagnosis, health monitoring, companionship, health information gathering, and care. We examined background articles and articles collected through our home care technology search in terms of ability expectations assumed in the presentation of home care technologies, or discussed in relation to home care technologies. While advances in health care support are made possible through emerging technologies, we urge critical examination of such technologies in terms of implications for the rights and dignity of people with diverse abilities. Specifically, we see potential for technologies to result in new forms of exclusion and powerlessness. Ableism influences choices made by funders, policy makers, and the public in the development and use of home health technologies and impacts how people with disabilities are served and how useful health support technologies will be for them. We urge continued critical examination of technology development and use according to ability expectations, and we recommend increasing incorporation of participatory design processes to counteract potential for health support technology to render people with disabilities technologically excluded and powerless. PMID:24956581

  5. MISR IDL Utilities

    Atmospheric Science Data Center

    2013-03-20

    ... information. A tar file package is available for download which is opened on Unix systems with the tar utility or on Windows systems with a utility such as WinZip, or the files can be downloaded ...

  6. Global biomass production potentials exceed expected future demand without the need for cropland expansion.

    PubMed

    Mauser, Wolfram; Klepper, Gernot; Zabel, Florian; Delzeit, Ruth; Hank, Tobias; Putzenlechner, Birgitta; Calzadilla, Alvaro

    2015-01-01

    Global biomass demand is expected to roughly double between 2005 and 2050. Current studies suggest that agricultural intensification through optimally managed crops on today's cropland alone is insufficient to satisfy future demand. In practice though, improving crop growth management through better technology and knowledge almost inevitably goes along with (1) improving farm management with increased cropping intensity and more annual harvests where feasible and (2) an economically more efficient spatial allocation of crops which maximizes farmers' profit. By explicitly considering these two factors we show that, without expansion of cropland, today's global biomass potentials substantially exceed previous estimates and even 2050s' demands. We attribute 39% increase in estimated global production potentials to increasing cropping intensities and 30% to the spatial reallocation of crops to their profit-maximizing locations. The additional potentials would make cropland expansion redundant. Their geographic distribution points at possible hotspots for future intensification. PMID:26558436

  7. Global biomass production potentials exceed expected future demand without the need for cropland expansion

    PubMed Central

    Mauser, Wolfram; Klepper, Gernot; Zabel, Florian; Delzeit, Ruth; Hank, Tobias; Putzenlechner, Birgitta; Calzadilla, Alvaro

    2015-01-01

    Global biomass demand is expected to roughly double between 2005 and 2050. Current studies suggest that agricultural intensification through optimally managed crops on today's cropland alone is insufficient to satisfy future demand. In practice though, improving crop growth management through better technology and knowledge almost inevitably goes along with (1) improving farm management with increased cropping intensity and more annual harvests where feasible and (2) an economically more efficient spatial allocation of crops which maximizes farmers' profit. By explicitly considering these two factors we show that, without expansion of cropland, today's global biomass potentials substantially exceed previous estimates and even 2050s' demands. We attribute 39% increase in estimated global production potentials to increasing cropping intensities and 30% to the spatial reallocation of crops to their profit-maximizing locations. The additional potentials would make cropland expansion redundant. Their geographic distribution points at possible hotspots for future intensification. PMID:26558436

  8. Maximizing energy transfer in vibrofluidized granular systems.

    PubMed

    Windows-Yule, C R K; Rosato, A D; Parker, D J; Thornton, A R

    2015-05-01

    Using discrete particle simulations validated by experimental data acquired using the positron emission particle tracking technique, we study the efficiency of energy transfer from a vibrating wall to a system of discrete, macroscopic particles. We demonstrate that even for a fixed input energy from the wall, energy conveyed to the granular system under excitation may vary significantly dependent on the frequency and amplitude of the driving oscillations. We investigate the manner in which the efficiency with which energy is transferred to the system depends on the system variables and determine the key control parameters governing the optimization of this energy transfer. A mechanism capable of explaining our results is proposed, and the implications of our findings in the research field of granular dynamics as well as their possible utilization in industrial applications are discussed. PMID:26066169

  9. EXPECT: Explicit Representations for Flexible Acquisition

    NASA Technical Reports Server (NTRS)

    Swartout, BIll; Gil, Yolanda

    1995-01-01

    To create more powerful knowledge acquisition systems, we not only need better acquisition tools, but we need to change the architecture of the knowledge based systems we create so that their structure will provide better support for acquisition. Current acquisition tools permit users to modify factual knowledge but they provide limited support for modifying problem solving knowledge. In this paper, the authors argue that this limitation (and others) stem from the use of incomplete models of problem-solving knowledge and inflexible specification of the interdependencies between problem-solving and factual knowledge. We describe the EXPECT architecture which addresses these problems by providing an explicit representation for problem-solving knowledge and intent. Using this more explicit representation, EXPECT can automatically derive the interdependencies between problem-solving and factual knowledge. By deriving these interdependencies from the structure of the knowledge-based system itself EXPECT supports more flexible and powerful knowledge acquisition.

  10. Information structure expectations in sentence comprehension

    PubMed Central

    Carlson, Katy; Dickey, Michael Walsh; Frazier, Lyn; Clifton, Charles

    2009-01-01

    In English, new information typically appears late in the sentence, as does primary accent. Because of this tendency, perceivers might expect the final constituent or constituents of a sentence to contain informational focus. This expectation should in turn affect how they comprehend focus-sensitive constructions such as ellipsis sentences. Results from four experiments on sluicing sentences (e.g., The mobster implicated the thug, but we can’t find out who else) suggest that perceivers do prefer to place focus late in the sentence, though that preference can be mitigated by prosodic information (pitch accents, Experiment 2) or syntactic information (clefted sentences, Experiment 3) indicating that focus is located elsewhere. Furthermore, it is not necessarily the direct object, but the informationally-focused constituent that is the preferred antecedent (Experiment 4). Expectations regarding the information structure of a sentence, which are only partly cancelable by means of overt focus markers, may explain persistent biases in ellipsis resolution. PMID:18609404

  11. Reflections on meeting women's childbirth expectations.

    PubMed

    Records, Kathie; Wilson, Barbara L

    2011-01-01

    When care providers support their personal worth, use caring communication, facilitate consumer participation in decision making, seek optimal outcomes, and know the patient holistically, female patients feel that their dignity is respected. We compare women's expectations for dignified care in contemporary society with the expectations of women 40 years ago. Some progress has been made toward valuing women's voices and participation in decision making, the availability of interventions for optimal outcomes, and recognition of the importance of cultural competence. Continued work is needed to meet women's expectations for receiving individualized and tailored care, information about intervention effectiveness and risks, and support for the birth process that the family desires. A renewed focus on the recipient of care as a coparticipant in her birthing experiences may result in improved outcomes and resolution of tensions between childbearing women and sociopolitical forces and standards of care. PMID:21771068

  12. The Dopaminergic Midbrain Encodes the Expected Certainty about Desired Outcomes.

    PubMed

    Schwartenbeck, Philipp; FitzGerald, Thomas H B; Mathys, Christoph; Dolan, Ray; Friston, Karl

    2015-10-01

    Dopamine plays a key role in learning; however, its exact function in decision making and choice remains unclear. Recently, we proposed a generic model based on active (Bayesian) inference wherein dopamine encodes the precision of beliefs about optimal policies. Put simply, dopamine discharges reflect the confidence that a chosen policy will lead to desired outcomes. We designed a novel task to test this hypothesis, where subjects played a "limited offer" game in a functional magnetic resonance imaging experiment. Subjects had to decide how long to wait for a high offer before accepting a low offer, with the risk of losing everything if they waited too long. Bayesian model comparison showed that behavior strongly supported active inference, based on surprise minimization, over classical utility maximization schemes. Furthermore, midbrain activity, encompassing dopamine projection neurons, was accurately predicted by trial-by-trial variations in model-based estimates of precision. Our findings demonstrate that human subjects infer both optimal policies and the precision of those inferences, and thus support the notion that humans perform hierarchical probabilistic Bayesian inference. In other words, subjects have to infer both what they should do as well as how confident they are in their choices, where confidence may be encoded by dopaminergic firing. PMID:25056572

  13. The Dopaminergic Midbrain Encodes the Expected Certainty about Desired Outcomes

    PubMed Central

    Schwartenbeck, Philipp; FitzGerald, Thomas H. B.; Mathys, Christoph; Dolan, Ray; Friston, Karl

    2015-01-01

    Dopamine plays a key role in learning; however, its exact function in decision making and choice remains unclear. Recently, we proposed a generic model based on active (Bayesian) inference wherein dopamine encodes the precision of beliefs about optimal policies. Put simply, dopamine discharges reflect the confidence that a chosen policy will lead to desired outcomes. We designed a novel task to test this hypothesis, where subjects played a “limited offer” game in a functional magnetic resonance imaging experiment. Subjects had to decide how long to wait for a high offer before accepting a low offer, with the risk of losing everything if they waited too long. Bayesian model comparison showed that behavior strongly supported active inference, based on surprise minimization, over classical utility maximization schemes. Furthermore, midbrain activity, encompassing dopamine projection neurons, was accurately predicted by trial-by-trial variations in model-based estimates of precision. Our findings demonstrate that human subjects infer both optimal policies and the precision of those inferences, and thus support the notion that humans perform hierarchical probabilistic Bayesian inference. In other words, subjects have to infer both what they should do as well as how confident they are in their choices, where confidence may be encoded by dopaminergic firing. PMID:25056572

  14. Design and manufacturing rules for maximizing the performance of polycrystalline piezoelectric bending actuators

    NASA Astrophysics Data System (ADS)

    Jafferis, Noah T.; Smith, Michael J.; Wood, Robert J.

    2015-06-01

    Increasing the energy and power density of piezoelectric actuators is very important for any weight-sensitive application, and is especially crucial for enabling autonomy in micro/milli-scale robots and devices utilizing this technology. This is achieved by maximizing the mechanical flexural strength and electrical dielectric strength through the use of laser-induced melting or polishing, insulating edge coating, and crack-arresting features, combined with features for rigid ground attachments to maximize force output. Manufacturing techniques have also been developed to enable mass customization, in which sheets of material are pre-stacked to form a laminate from which nearly arbitrary planar actuator designs can be fabricated using only laser cutting. These techniques have led to a 70% increase in energy density and an increase in mean lifetime of at least 15× compared to prior manufacturing methods. In addition, measurements have revealed a doubling of the piezoelectric coefficient when operating at the high fields necessary to achieve maximal energy densities, along with an increase in the Young’s modulus at the high compressive strains encountered—these two effects help to explain the higher performance of our actuators as compared to that predicted by linear models.

  15. Utility interests in cogeneration

    SciTech Connect

    Not Available

    1987-08-01

    A listing of utilities expressing an interest in cogeneration notes the difference between intention and action and the range of competition approach taken by individual utilities. The list cites the name of the utility, the cogeneration company, and the nature of the facilities. The list includes 31 facilities.

  16. D2-brane Chern-Simons theories: F-maximization = a-maximization

    E-print Network

    Fluder, Martin

    2015-01-01

    We study a system of N D2-branes probing a generic Calabi-Yau three-fold singularity in the presence of a non-zero quantized Romans mass n. We argue that the low-energy effective N = 2 Chern-Simons quiver gauge theory flows to a superconformal fixed point in the IR, and construct the dual AdS_4 solution in massive IIA supergravity. We compute the free energy F of the gauge theory on S^3 using localization. In the large N limit we find F = c(nN)^{1/3}a^{2/3}, where c is a universal constant and a is the a-function of the "parent" four-dimensional N = 1 theory on N D3-branes probing the same Calabi-Yau singularity. It follows that maximizing F over the space of admissible R-symmetries is equivalent to maximizing a for this class of theories. Moreover, we show that the gauge theory result precisely matches the holographic free energy of the supergravity solution, and provide a similar matching of the VEV of a BPS Wilson loop operator.

  17. D2-brane Chern-Simons theories: F-maximization = a-maximization

    E-print Network

    Martin Fluder; James Sparks

    2015-07-21

    We study a system of N D2-branes probing a generic Calabi-Yau three-fold singularity in the presence of a non-zero quantized Romans mass n. We argue that the low-energy effective N = 2 Chern-Simons quiver gauge theory flows to a superconformal fixed point in the IR, and construct the dual AdS_4 solution in massive IIA supergravity. We compute the free energy F of the gauge theory on S^3 using localization. In the large N limit we find F = c(nN)^{1/3}a^{2/3}, where c is a universal constant and a is the a-function of the "parent" four-dimensional N = 1 theory on N D3-branes probing the same Calabi-Yau singularity. It follows that maximizing F over the space of admissible R-symmetries is equivalent to maximizing a for this class of theories. Moreover, we show that the gauge theory result precisely matches the holographic free energy of the supergravity solution, and provide a similar matching of the VEV of a BPS Wilson loop operator.

  18. Evaluation of anti-hyperglycemic effect of Actinidia kolomikta (Maxim. et Rupr.) Maxim. root extract.

    PubMed

    Hu, Xuansheng; Cheng, Delin; Wang, Linbo; Li, Shuhong; Wang, Yuepeng; Li, Kejuan; Yang, Yingnan; Zhang, Zhenya

    2015-05-01

    This study aimed to evaluate the anti-hyperglycemic effect of ethanol extract from Actinidia kolomikta (Maxim. et Rupr.) Maxim. root (AKE). An in vitro evaluation was performed by using rat intestinal α-glucosidase (maltase and sucrase), the key enzymes linked with type 2 diabetes. An in vivo evaluation was also performed by loading maltose, sucrose, and glucose to normal rats. As a result, AKE showed concentration-dependent inhibition effects on rat intestinal maltase and rat intestinal sucrase with IC(50) values of 1.83 and 1.03mg/mL, respectively. In normal rats, after being loaded with maltose, sucrose and glucose, administration of AKE significantly reduced postprandial hyperglycemia, which is similar to acarbose used as an anti-diabetic drug. High contents of total phenolics (80.49 ± 0.05mg GAE/g extract) and total flavonoids (430.69 ± 0.91mg RE/g extract) were detected in AKE. In conclusion, AKE possessed anti-hyperglycemic effects and the possible mechanisms were associated with its inhibition on α-glucosidase and the improvement on insulin release and/or insulin sensitivity as well. The anti-hyperglycemic activity possessed by AKE may be attributable to its high contents of phenolic and flavonoid compounds. PMID:26051735

  19. Information Utility, Reader Interest, Publication Rating and Student Newspaper Readership.

    ERIC Educational Resources Information Center

    Lin, Carolyn A.

    2000-01-01

    Investigates student motivation for reading a college student newspaper, including evaluations of performance and content preferences. Finds general support for expectations derived from utility theory; that localism (i.e. campus news) remains one of the strongest niches for an urban college paper; and that heavier readers perceived higher utility

  20. College for some to college for all: social background, occupational expectations, and educational expectations over time.

    PubMed

    Goyette, Kimberly A

    2008-06-01

    The educational expectations of 10th-graders have dramatically increased from 1980 to 2002. Their rise is attributable in part to the changing educational composition of students' parents and related to the educational profiles of their expected occupations. Students whose parents have gone to college are more likely to attend college themselves, and students expect occupations that are more prestigious in 2002 than in 1980. The educational requirements of particular occupation categories have risen only slightly. These analyses also reveal that educational expectations in recent cohorts are more loosely linked to social background and occupational plans than they were in 1980. The declining importance of parents' background and the decoupling of educational and occupational plans, in addition to a strong and significant effect of cohort on educational expectations, suggest that the expectation of four-year college attainment is indeed becoming the norm. PMID:19069055

  1. A taxonomic approach to communicating maxims in interstellar messages

    NASA Astrophysics Data System (ADS)

    Vakoch, Douglas A.

    2011-02-01

    Previous discussions of interstellar messages that could be sent to extraterrestrial intelligence have focused on descriptions of mathematics, science, and aspects of human culture and civilization. Although some of these depictions of humanity have implicitly referred to our aspirations, this has not clearly been separated from descriptions of our actions and attitudes as they are. In this paper, a methodology is developed for constructing interstellar messages that convey information about our aspirations by developing a taxonomy of maxims that provide guidance for living. Sixty-six maxims providing guidance for living were judged for degree of similarity to each other. Quantitative measures of the degree of similarity between all pairs of maxims were derived by aggregating similarity judgments across individual participants. These composite similarity ratings were subjected to a cluster analysis, which yielded a taxonomy that highlights perceived interrelationships between individual maxims and that identifies major classes of maxims. Such maxims can be encoded in interstellar messages through three-dimensional animation sequences conveying narratives that highlight interactions between individuals. In addition, verbal descriptions of these interactions in Basic English can be combined with these pictorial sequences to increase intelligibility. Online projects to collect messages, such as the SETI Institute's Earth Speaks and La Tierra Habla, can be used to solicit maxims from participants around the world.
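
    The cluster-analysis step can be reproduced in outline: aggregated pairwise similarity ratings are converted to distances and passed to an agglomerative (hierarchical) clustering routine. The sketch below assumes SciPy is available; the four maxims and their similarity values are invented, not the sixty-six items rated in the study.

    ```python
    import numpy as np
    from scipy.spatial.distance import squareform
    from scipy.cluster.hierarchy import linkage, fcluster

    maxims = ["Tell the truth", "Keep promises", "Help strangers", "Share resources"]

    # Symmetric matrix of aggregated similarity judgments (diagonal = 1).
    similarity = np.array([
        [1.0, 0.8, 0.3, 0.2],
        [0.8, 1.0, 0.4, 0.3],
        [0.3, 0.4, 1.0, 0.7],
        [0.2, 0.3, 0.7, 1.0],
    ])

    distance = 1.0 - similarity                 # convert similarity to dissimilarity
    condensed = squareform(distance)            # upper-triangle vector form
    tree = linkage(condensed, method="average") # agglomerative clustering
    labels = fcluster(tree, t=2, criterion="maxclust")

    for maxim, label in zip(maxims, labels):
        print(label, maxim)
    ```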

  2. How fast-growing bacteria robustly tune their ribosome concentration to approximate growth-rate maximization

    PubMed Central

    Bosdriesz, Evert; Molenaar, Douwe; Teusink, Bas; Bruggeman, Frank J

    2015-01-01

    Maximization of growth rate is an important fitness strategy for bacteria. Bacteria can achieve this by expressing proteins at optimal concentrations, such that resources are not wasted. This is exemplified for Escherichia coli by the increase of its ribosomal protein-fraction with growth rate, which precisely matches the increased protein synthesis demand. These findings and others have led to the hypothesis that E. coli aims to maximize its growth rate in environments that support growth. However, what kind of regulatory strategy is required for a robust, optimal adjustment of the ribosome concentration to the prevailing condition is still an open question. In the present study, we analyze the ppGpp-controlled mechanism of ribosome expression used by E. coli and show that this mechanism maintains the ribosomes saturated with its substrates. In this manner, overexpression of the highly abundant ribosomal proteins is prevented, and limited resources can be redirected to the synthesis of other growth-promoting enzymes. It turns out that the kinetic conditions for robust, optimal protein-partitioning, which are required for growth rate maximization across conditions, can be achieved with basic biochemical interactions. We show that inactive ribosomes are the most suitable ‘signal’ for tracking the intracellular nutritional state and for adjusting gene expression accordingly, as small deviations from optimal ribosome concentration cause a huge fractional change in ribosome inactivity. We expect to find this control logic implemented across fast-growing microbial species because growth rate maximization is a common selective pressure, ribosomes are typically highly abundant and thus costly, and the required control can be implemented by a small, simple network. PMID:25754869

  3. Demystify Learning Expectations to Address Grade Inflation

    ERIC Educational Resources Information Center

    Hodges, Linda C.

    2014-01-01

    This article describes the subject of "grade inflation," a reference to educators giving higher grades to student work than their expectations for student achievement warrant. Of the many reasons why this practice happens, Hodges specifically discusses inflating grades as "a natural consequence" when the faculty really…

  4. Macroeconomics after Two Decades of Rational Expectations.

    ERIC Educational Resources Information Center

    McCallum, Bennett T.

    1994-01-01

    Discusses real business cycle analysis, growth theory, and other economic concepts in the context of the rational expectations revolution in macroeconomics. Focuses on post-1982 research. Concludes that the rejuvenation of growth analysis is an encouraging development because it could lead to changes in welfare policy. (CFR)

  5. Colleges and Companies Sharing Great Expectations.

    ERIC Educational Resources Information Center

    Council for Industry and Higher Education (United Kingdom).

    Companies have high expectations of the 440 colleges of further education, which will be the largest source of the skilled middle-range staff on whom industrial efficiency and innovation depend. Colleges look to employers to play four distinct roles--as customers, as places of learning, as advisors, and as joint planners in the regional or local…

  6. Future Expectations of Brasilian Street Youth

    ERIC Educational Resources Information Center

    Raffaelli, M.; Koller, S.H.

    2005-01-01

    Future expectations of youth surviving on the streets of Porto Alegre, Brasil, were examined. The sample consisted of 35 boys and 34 girls aged 10-18 (M age 14.4) who participated in a sentence completion task and semi-structured interviews. Responses to two incomplete sentences regarding the future revealed a mismatch between hoped-for and…

  7. Perspective Great Expectations: Using Whole-Brain

    E-print Network

    Deco, Gustavo

    Neuron Perspective: Great Expectations: Using Whole-Brain Computational Connectomics for Understanding Neuropsychiatric Disorders. Gustavo Deco and Morten L. Kringelbach. http://dx.doi.org/10.1016/j.neuron.2014.08.034. The study of human brain networks with in vivo neuroimaging has given ...

  8. Unrealistic Expectations Businesses Have about Translators.

    ERIC Educational Resources Information Center

    Rodriguez, Cecilia M.

    Increased international business and technological advances that speed business communication are affecting the expectations that business has for translators. More companies are asking translation agencies to translate such items as English business letters, advertising campaigns, flyers, brochures, and technical manuals into other languages,…

  9. NCAA Penalizes Fewer Teams than Expected

    ERIC Educational Resources Information Center

    Sander, Libby

    2008-01-01

    This article reports that the National Collegiate Athletic Association (NCAA) has penalized fewer teams than it expected this year over athletes' poor academic performance. For years, officials with the NCAA have predicted that strikingly high numbers of college sports teams could be at risk of losing scholarships this year because of their…

  10. Inverse momentum expectation values for hydrogenic systems

    E-print Network

    Robert Delbourgo; David Elliott

    2009-04-28

    By using the Fourier transforms of the general hydrogenic bound state wave functions (as ultraspherical polynomials) one may find expectation values of arbitrary functions of momentum p. In this manner the effect of a reciprocity perturbation 1/p can be evaluated for all hydrogenic states.

  11. Life expectancy of children with cerebral palsy

    E-print Network

    Wirosoetisno, Djoko

    Life expectancy of children with cerebral palsy. J L Hutton, K Hemming and the UKCP collaboration. What is UKCP? It is a collaboration of cerebral palsy registers which collect information about children within specific local areas, providing information about the physical effects of cerebral palsy on the everyday lives of children with cerebral palsy. They are the Mersey ...

  12. Effects of Evaluation Expectation on Artistic Creativity.

    ERIC Educational Resources Information Center

    Amabile, Teresa M.

    Conditions are examined under which the imposition of an extrinsic constraint upon performance of an activity can lead to decrements in creativity. Female college students worked on an art activity either with or without the expectation of external evaluation. In addition, subjects were asked to focus upon either the creative or the technical…

  13. Men's Alcohol Expectancies at Selected Community Colleges

    ERIC Educational Resources Information Center

    Derby, Dustin C.

    2011-01-01

    Men's alcohol expectancies are an important cognitive-behavioral component of their consumption; yet, sparse research details such behaviors for men in two-year colleges. Selected for inclusion with the current study were 563 men from seven Illinois community colleges. Logistic regression analysis indicated four significant, positive relationships…

  14. What to Expect After Pulmonary Rehabilitation

    MedlinePLUS

    ... NHLBI on Twitter. What To Expect After Pulmonary Rehabilitation Most pulmonary rehabilitation (PR) programs last a few months. At the ... of the program will show whether your symptoms, physical activity level, and ... your medical therapy. Or, your doctor might recommend more tests. These ...

  15. Culture and Caregiving: Goals, Expectations, & Conflict.

    ERIC Educational Resources Information Center

    Fenichel, Emily, Ed.

    2003-01-01

    "Zero to Three" is a single-focus bulletin of the National Center for Infants, Toddlers, and Families providing insight from multiple disciplines on the development of infants, toddlers, and their families. This issue focuses on the goals, expectations, and conflict in the relationship between culture and child caregiving and other care services.…

  16. Developing expectations regarding the boundaries of expertise.

    PubMed

    Landrum, Asheley R; Mills, Candice M

    2015-01-01

    Three experiments examined elementary school-aged children's and adults' expectations regarding what specialists (i.e., those with narrow domains of expertise) and generalists (i.e., those with broad domains of expertise) are likely to know. Experiment 1 demonstrated developmental differences in the ability to differentiate between generalists and specialists, with younger children believing generalists have more specific trivia knowledge than older children and adults believed. Experiment 2 demonstrated that children and adults expected generalists to have more underlying principles knowledge than specific trivia knowledge about unfamiliar animals. However, they believed that generalists would have more of both types of knowledge than themselves. Finally, Experiment 3 demonstrated that children and adults recognized that underlying principles knowledge can be generalized between topics closely related to the specialists' domains of expertise. However, they did not recognize when this knowledge was generalizable to topics slightly less related, expecting generalists to know only as much as they would. Importantly, this work contributes to the literature by showing how much of and what kinds of knowledge different types of experts are expected to have. In sum, this work provides insight into some of the ways children's notions of expertise change over development. The current research demonstrates that between the ages of 5 and 10, children are developing the ability to recognize how experts' knowledge is likely to be limited. That said, even older children at times struggle to determine the breadth of an expert's knowledge. PMID:25460394

  17. Diversity in Literary Response: Revisiting Gender Expectations

    ERIC Educational Resources Information Center

    Brendler, Beth M.

    2014-01-01

    Drawing on and reexamining theories on gender and literacy, derived from research performed between 1974 and 2002, this qualitative study explored the gender assumptions and expectations of Language Arts teachers in a graduate level adolescent literature course at a university in the Midwestern United States. The theoretical framework was…

  18. 10 CFR 63.304 - Reasonable expectation.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Reasonable expectation. 63.304 Section 63.304 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Postclosure Public Health and Environmental Standards §...

  19. Risk Acceptance and Expectations of Laryngeal Allotransplantation

    PubMed Central

    Jo, Hyun Kyo; Park, Jang Wan; Hwang, Jae Ha; Lee, Sam Yong; Shin, Jun Ho

    2014-01-01

    Background Laryngeal allotransplantation (LA) is a technique involving transplantation of a deceased donor's larynx into a recipient, and it may be substituted for conventional laryngeal reconstruction. There are widely different views on LA, as the recipient is administered continuous, potentially life-threatening, immunosuppressive therapy for a functional or aesthetic result, which is not directly related to life extension. The purpose of this study was to analyze the difference in risk acceptance and expectations of LA between four population groups. Methods A survey was performed to examine patients' risk acceptance and expectations of LA. The survey included 287 subjects in total (general public, n=100; kidney transplant recipients, n=53; post-laryngectomy patients, n=34; doctors, n=100), using a Korean-translated version of the Louisville Instrument for Transplantation (LIFT) questionnaire. Results All four groups responded differently at various levels of their perception in risk acceptance and expectations. The kidney transplant recipients reported the highest risk acceptance and expectations, and the doctor group the lowest. Conclusions This study examined the disparate perception between specific population groups of the risks and benefits of using LA for the promotion of the quality of life. By addressing the information gaps about LA in the different populations that have been highlighted from this survey, we suggest that LA can become a more viable alternative to classical surgery with resultant improved quality of life for patients. PMID:25276642

  20. Young Infants' Expectations about Hidden Objects

    ERIC Educational Resources Information Center

    Ruffman, Ted; Slade, Lance; Redman, Jessica

    2005-01-01

    Infants aged 3-5 months (mean of approximately 4 months) were given a novel anticipatory looking task to test object permanence understanding. They were trained to expect an experimenter to retrieve an object from behind a transparent screen upon hearing a cue ("Doors up, here comes the hand"). The experimenter then hid the object behind one of…

  1. Solving Rational Expectations Models Using Excel

    ERIC Educational Resources Information Center

    Strulik, Holger

    2004-01-01

    Simple problems of discrete-time optimal control can be solved using a standard spreadsheet software. The employed-solution method of backward iteration is intuitively understandable, does not require any programming skills, and is easy to implement so that it is suitable for classroom exercises with rational-expectations models. The author…
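
    The backward-iteration method transfers directly from a spreadsheet to code: starting from the terminal period, the value function is computed column by column back to the first period. A minimal sketch for a finite-horizon cake-eating problem with log utility follows; the horizon, discount factor, and grid are arbitrary classroom-style values, not taken from the article.

    ```python
    import numpy as np

    T = 10                                   # number of periods
    beta = 0.95                              # discount factor
    grid = np.linspace(0.01, 1.0, 200)       # possible cake sizes

    V = np.zeros(len(grid))                  # value after the final period is zero
    policy = []

    for t in range(T, 0, -1):                # iterate backwards from period T
        V_new = np.empty(len(grid))
        c_opt = np.empty(len(grid))
        for i, w in enumerate(grid):
            c = grid[grid <= w]                      # feasible consumption levels
            w_next = w - c                           # remaining cake
            V_next = np.interp(w_next, grid, V)      # continuation value
            total = np.log(c) + beta * V_next
            best = np.argmax(total)
            V_new[i], c_opt[i] = total[best], c[best]
        V = V_new
        policy.append(c_opt)

    # policy[-1] is the first-period consumption rule over the cake-size grid.
    print("Consume from a full cake in period 1:", round(policy[-1][-1], 3))
    ```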

  2. Effects of Syntactic Expectations on Speech Segmentation

    ERIC Educational Resources Information Center

    Mattys, Sven L.; Melhorn, James F.; White, Laurence

    2007-01-01

    Although the effect of acoustic cues on speech segmentation has been extensively investigated, the role of higher order information (e.g., syntax) has received less attention. Here, the authors examined whether syntactic expectations based on subject-verb agreement have an effect on segmentation and whether they do so despite conflicting acoustic…

  3. Caps and Robbers: What Can You Expect?

    ERIC Educational Resources Information Center

    Zager, Laura A.; Verghese, George C.

    2007-01-01

    The "matching" hats problem is a classic exercise in probability: if "n" people throw their hats in a box, and then each person randomly draws one out again, what is the expected number of people who draw their own hat? This paper presents several extensions to this problem, with solutions that involve interesting tricks with iterated…

  4. How prior expectations shape multisensory perception.

    PubMed

    Gau, Remi; Noppeney, Uta

    2016-01-01

    The brain generates a representation of our environment by integrating signals from a common source, but segregating signals from different sources. This fMRI study investigated how the brain arbitrates between perceptual integration and segregation based on top-down congruency expectations and bottom-up stimulus-bound congruency cues. Participants were presented audiovisual movies of phonologically congruent, incongruent or McGurk syllables that can be integrated into an illusory percept (e.g. "ti" percept for visual «ki» with auditory /pi/). They reported the syllable they perceived. Critically, we manipulated participants' top-down congruency expectations by presenting McGurk stimuli embedded in blocks of congruent or incongruent syllables. Behaviorally, participants were more likely to fuse audiovisual signals into an illusory McGurk percept in congruent than incongruent contexts. At the neural level, the left inferior frontal sulcus (lIFS) showed increased activations for bottom-up incongruent relative to congruent inputs. Moreover, lIFS activations were increased for physically identical McGurk stimuli, when participants segregated the audiovisual signals and reported their auditory percept. Critically, this activation increase for perceptual segregation was amplified when participants expected audiovisually incongruent signals based on prior sensory experience. Collectively, our results demonstrate that the lIFS combines top-down prior (in)congruency expectations with bottom-up (in)congruency cues to arbitrate between multisensory integration and segregation. PMID:26419391

  5. A Phenomenological Study to Discover Low-Income Adults' Perceptions and Expectations Regarding Financial Literacy

    ERIC Educational Resources Information Center

    Schaffer, Brigid Ann

    2013-01-01

    This phenomenological study explored the perceptions and expectations of low-income adults regarding financial literacy to discover ways to increase attendance in financial literacy programs designed for this cohort. The study utilized interviews with closed-ended questions to establish the participants' backgrounds, then open-ended questions to…

  6. "Expectations to Change" ((E2C): A Participatory Method for Facilitating Stakeholder Engagement with Evaluation Findings

    ERIC Educational Resources Information Center

    Adams, Adrienne E.; Nnawulezi, Nkiru A.; Vandenberg, Lela

    2015-01-01

    From a utilization-focused evaluation perspective, the success of an evaluation is rooted in the extent to which the evaluation was used by stakeholders. This paper details the "Expectations to Change" (E2C) process, an interactive, workshop-based method designed to engage primary users with their evaluation findings as a means of…

  7. Maximizing Exposure Therapy: An Inhibitory Learning Approach

    PubMed Central

    Craske, Michelle G.; Treanor, Michael; Conway, Chris; Zbozinek, Tomislav; Vervliet, Bram

    2014-01-01

    Exposure therapy is an effective approach for treating anxiety disorders, although a substantial number of individuals fail to benefit or experience a return of fear after treatment. Research suggests that anxious individuals show deficits in the mechanisms believed to underlie exposure therapy, such as inhibitory learning. Targeting these processes may help improve the efficacy of exposure-based procedures. Although evidence supports an inhibitory learning model of extinction, there has been little discussion of how to implement this model in clinical practice. The primary aim of this paper is to provide examples to clinicians for how to apply this model to optimize exposure therapy with anxious clients, in ways that distinguish it from a ‘fear habituation’ approach and ‘belief disconfirmation’ approach within standard cognitive-behavior therapy. Exposure optimization strategies include 1) expectancy violation, 2) deepened extinction, 3) occasional reinforced extinction, 4) removal of safety signals, 5) variability, 6) retrieval cues, 7) multiple contexts, and 8) affect labeling. Case studies illustrate methods of applying these techniques with a variety of anxiety disorders, including obsessive-compulsive disorder, posttraumatic stress disorder, social phobia, specific phobia, and panic disorder. PMID:24864005

  8. Maximal stochastic transport in the Lorenz equations

    NASA Astrophysics Data System (ADS)

    Agarwal, Sahil; Wettlaufer, J. S.

    2016-01-01

    We calculate the stochastic upper bounds for the Lorenz equations using an extension of the background method. In analogy with Rayleigh-Bénard convection the upper bounds are for heat transport versus Rayleigh number. As might be expected, the stochastic upper bounds are larger than the deterministic counterpart of Souza and Doering [1], but their variation with noise amplitude exhibits interesting behavior. Below the transition to chaotic dynamics the upper bounds increase monotonically with noise amplitude. However, in the chaotic regime this monotonicity depends on the number of realizations in the ensemble; at a particular Rayleigh number the bound may increase or decrease with noise amplitude. The origin of this behavior is the coupling between the noise and unstable periodic orbits, the degree of which depends on the degree to which the ensemble represents the ergodic set. This is confirmed by examining the close returns plots of the full solutions to the stochastic equations and the numerical convergence of the noise correlations. The numerical convergence of both the ensemble and time averages of the noise correlations is sufficiently slow that it is the limiting aspect of the realization of these bounds. Finally, we note that the full solutions of the stochastic equations demonstrate that the effect of noise is equivalent to the effect of chaos.
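
    For readers who want to reproduce the qualitative setup, the following is a minimal Euler-Maruyama sketch of the Lorenz equations with additive noise; the parameter values, the placement of the noise term, and the transport proxy are illustrative assumptions, not the authors' formulation.

        import numpy as np

        def stochastic_lorenz(T=50.0, dt=1e-3, sigma=10.0, rho=28.0,
                              beta=8.0 / 3.0, noise_amp=1.0, seed=0):
            """Euler-Maruyama integration of the Lorenz system with additive noise."""
            rng = np.random.default_rng(seed)
            n = int(T / dt)
            x = np.empty((n, 3))
            x[0] = (1.0, 1.0, 1.0)
            for i in range(n - 1):
                X, Y, Z = x[i]
                drift = np.array([sigma * (Y - X),
                                  X * (rho - Z) - Y,
                                  X * Y - beta * Z])
                x[i + 1] = x[i] + drift * dt + noise_amp * np.sqrt(dt) * rng.standard_normal(3)
            return x

        traj = stochastic_lorenz()
        print("time-averaged Z (a crude transport proxy):", traj[:, 2].mean())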

  9. Maximal Stochastic Transport in the Lorenz Equations

    E-print Network

    Sahil Agarwal; J. S. Wettlaufer

    2015-08-14

    We calculate the stochastic upper bounds for the Lorenz equations using an extension of the background method. In analogy with Rayleigh-Bénard convection the upper bounds are for heat transport versus Rayleigh number. As might be expected, the stochastic upper bounds are larger than the deterministic counterpart of Souza and Doering (2015), but their variation with noise amplitude exhibits interesting behavior. Below the transition to chaotic dynamics the upper bounds increase monotonically with noise amplitude. However, in the chaotic regime this monotonicity depends on the number of realizations in the ensemble; at a particular Rayleigh number the bound may increase or decrease with noise amplitude. The origin of this behavior is the coupling between the noise and unstable periodic orbits, the degree of which depends on the degree to which the ensemble represents the ergodic set. This is confirmed by examining the close returns plots of the full solutions to the stochastic equations and the numerical convergence of the noise correlations. The numerical convergence of both the ensemble and time averages of the noise correlations is sufficiently slow that it is the limiting aspect of the realization of these bounds. Finally, we note that the full solutions of the stochastic equations demonstrate that the effect of noise is equivalent to the effect of chaos.

  10. Parameterization of rainfall-runoff models by using utility functions for the reproduction of low and average flows

    NASA Astrophysics Data System (ADS)

    Baratti, Emanuele; Montanari, Alberto; Toth, Elena

    2014-05-01

    In the majority of rainfall-runoff modelling applications, the objective function to be minimised in the parameterisation procedure is the mean square error or another quadratic function (such as the Nash-Sutcliffe efficiency). Since the use of squares forces an arbitrarily greater influence of large errors, generally corresponding to large streamflow values, such a choice may prevent the identification of an adequate input-output relationship for the reproduction of low and average flows. This contribution presents the results of a series of calibration/validation experiments with a conceptual rainfall-runoff model, applied over several case-study catchments, where the performance function is based on the expected utility of the rainfall-runoff model. The method is based on the evidence that the performance of a hydrological model closely depends on the purpose of the application. For instance, in a flood forecasting system, the model could be used to estimate peak flow conditions (e.g. peak time and peak flow rate), whereas in a water resources management system, the capability of the model to reproduce the discharges for the entire year, or in particular those of water-scarcity periods, could be especially valuable. In the proposed method, at each time step, the comparison between simulated and observed data is carried out by using an 'ad-hoc' utility function. The calibration is performed by maximizing the overall estimated utility of the simulated data. Different utility functions are tested and the results are compared, over validation data, against those obtained with traditional squared functions. The results reveal that an adequate utility function allows an improvement of the model performance in the reproduction of low and average flows, with a moderate deterioration of the simulation of high flows. It is also pointed out that the traditional calibration procedures may be considered a particular case of the presented approach.
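
    As a rough illustration of calibrating against a utility-based objective rather than a squared-error one, the sketch below fits a toy linear-reservoir model by maximizing an ad-hoc utility that weights low observed flows more heavily. The model, the utility function, and all parameter values are placeholders, not the authors' formulation.

        import numpy as np
        from scipy.optimize import minimize

        def toy_model(params, rain):
            """Placeholder rainfall-runoff model: a single linear reservoir."""
            k, = params
            store, flow = 0.0, np.zeros_like(rain)
            for t, r in enumerate(rain):
                store += r
                flow[t] = k * store
                store -= flow[t]
            return flow

        def utility(obs, sim):
            """Ad-hoc utility that emphasises low and average flows (illustrative)."""
            weights = 1.0 / (1.0 + obs)                  # larger weight on small observed flows
            return -np.sum(weights * np.abs(obs - sim))  # higher (less negative) is better

        rng = np.random.default_rng(1)
        rain = rng.gamma(2.0, 2.0, size=365)
        obs = toy_model([0.3], rain) + rng.normal(0.0, 0.1, size=365)

        # Calibrate by maximizing total utility (i.e. minimizing its negative).
        res = minimize(lambda p: -utility(obs, toy_model(p, rain)),
                       x0=[0.5], bounds=[(0.01, 1.0)])
        print("utility-calibrated k:", res.x[0])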

  11. Bioengineering and Coordination of Regulatory Networks and Intracellular Complexes to Maximize Hydrogen Production by Phototrophic Microorganisms

    SciTech Connect

    Tabita, F. Robert

    2013-07-30

    In this study, the Principal Investigator, F.R. Tabita, has teamed up with J. C. Liao from UCLA. This project's main goal is to manipulate regulatory networks in phototrophic bacteria to affect and maximize the production of large amounts of hydrogen gas under conditions where wild-type organisms are constrained by inherent regulatory mechanisms from allowing this to occur. Unrestrained production of hydrogen has been achieved and this will allow for the potential utilization of waste materials as a feedstock to support hydrogen production. By further understanding the means by which regulatory networks interact, this study will seek to maximize the ability of currently available “unrestrained” organisms to produce hydrogen. The organisms to be utilized in this study, phototrophic microorganisms, in particular nonsulfur purple (NSP) bacteria, catalyze many significant processes including the assimilation of carbon dioxide into organic carbon, nitrogen fixation, sulfur oxidation, aromatic acid degradation, and hydrogen oxidation/evolution. Moreover, due to their great metabolic versatility, such organisms highly regulate these processes in the cell and, since virtually all such capabilities are dispensable, excellent experimental systems to study aspects of molecular control and biochemistry/physiology are available.

  12. Improving information technology to maximize fenestration energy efficiency

    SciTech Connect

    Arasteh, Dariush; Mitchell, Robin; Kohler, Christian; Huizenga,Charlie; Curcija, Dragan

    2001-06-06

    Improving software for the analysis of fenestration product energy efficiency and developing related information technology products that aid in optimizing the use of fenestration products for energy efficiency are essential steps toward ensuring that more efficient products are developed and that existing and emerging products are utilized in the applications where they will produce the greatest energy savings. Given the diversity of building types and designs and the climates in the U.S., no one fenestration product or set of properties is optimal for all applications. Future tools and procedures to analyze fenestration product energy efficiency will need to both accurately analyze fenestration product performance under a specific set of conditions and to look at whole fenestration product energy performance over the course of a yearly cycle and in the context of whole buildings. Several steps have already been taken toward creating fenestration product software that will provide the information necessary to determine which details of a fenestration product's design can be improved to have the greatest impact on energy efficiency, what effects changes in fenestration product design will have on the comfort parameters that are important to consumers, and how specific fenestration product designs will perform in specific applications. Much work remains to be done, but the energy savings potential justifies the effort. Information is relatively cheap compared to manufacturing. Information technology has already been responsible for many improvements in the global economy--it can similarly facilitate many improvements in fenestration product energy efficiency.

  13. Maximizing a transport platform through computer technology.

    PubMed

    Hudson, Timothy L

    2003-01-01

    One of the most recent innovations coalescing computer technology and medical care is the further development of integrated medical component technology coupled with a computer subsystem. One such example is the self-contained patient transport system known as the Life Support for Trauma and Transport (LSTAT(tm)). The LSTAT creates a new transport platform that integrates the most current medical monitoring and therapeutic capabilities with computer processing capacity, creating the first "smart litter". The LSTAT is built around a computer system that is network capable and acts as the data hub for multiple medical devices and utilities, including data, power, and oxygen systems. The system logs patient and device data in a simultaneous, time-synchronized, continuous format, allowing electronic transmission, storage, and electronic documentation. The third-generation LSTAT includes an oxygen system, ventilator, clinical point-of-care blood analyzer, suction, defibrillator, infusion pump, and physiologic monitor, as well as on-board power and oxygen systems. The developers of LSTAT and other developers have the ability to further expand integrative component technology by developing and integrating clinical decision support systems. PMID:12802947

  14. Microorganism Utilization for Synthetic Milk

    NASA Technical Reports Server (NTRS)

    Morford, Megan A.; Khodadad, Christina L.; Caro, Janicce I.; Spencer, LaShelle E.; Richards, Jeffery T.; Strayer, Richard F.; Birmele, Michele N.; Wheeler, Raymond M.

    2014-01-01

    A desired architecture for long duration spaceflight, like aboard the International Space Station or for future missions to Mars, is to provide a supply of fresh food crops for the astronauts. However, some crops can create a high proportion of inedible plant waste. The main goal of the Synthetic Biology project, Cow in a Column, was to produce the components of milk (sugar, lipid, protein) from inedible plant waste by utilizing microorganisms (fungi, yeast, bacteria). Of particular interest was utilizing the valuable polysaccharide, cellulose, found in plant waste, to naturally fuel, through microorganism cellular metabolism, the creation of sugar (glucose), lipid (milk fat), and protein (casein) in order to produce a synthetic edible food product. Environmental conditions such as pH, temperature, carbon source, aeration, and choice of microorganisms were optimized in the laboratory and the desired end-products, sugars and lipids, were analyzed. Trichoderma reesei, a known cellulolytic fungus, was utilized to drive the production of glucose, with the intent that the produced glucose would serve as the carbon source for milk fat production and be a substitute for the milk sugar lactose. Lipid production would be carried out by Rhodosporidium toruloides, a yeast known to accumulate those lipids that are typically found in milk fat. Results showed that glucose and total lipid content were below what was expected during this phase of experimentation. In addition, individual analysis of six fatty acids revealed that the percentage of each fatty acid was lower than in naturally produced bovine milk. Overall, this research indicates that microorganisms could be utilized to break down inedible solid waste to produce useable products. For future work, the production of the casein protein for milk would require the development of a genetically modified organism, which was beyond the scope of the original project. Additional trials would be needed to further refine the required environment/organisms for the production of desired sugar and lipid end-products.

  15. Panel construction for mapping in admixed populations via expected mutual information.

    PubMed

    Bercovici, Sivan; Geiger, Dan; Shlush, Liran; Skorecki, Karl; Templeton, Alan

    2008-04-01

    Mapping by admixture linkage disequilibrium (MALD) is an economical and powerful approach for the identification of genomic regions harboring disease susceptibility genes in recently admixed populations. We develop an information-theory-based measure, called expected mutual information (EMI), which computes the impact of a set of markers on the ability to infer ancestry at each chromosomal location. We then present a simple and effective algorithm for the selection of panels that strives to maximize the EMI score. Finally, we demonstrate via well-established simulation tools that our panels provide more power and accuracy for inferring disease gene loci via the MALD method in comparison to previous methods. PMID:18353806
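
    The EMI measure itself is not reproduced here, but the selection step the abstract describes is a greedy forward search. The sketch below uses a stand-in informativeness score (summed absolute allele-frequency differences between two hypothetical ancestral populations) purely to illustrate the structure of such an algorithm; the scoring function and data are made up.

        import numpy as np

        def panel_score(panel, freq_pop1, freq_pop2):
            """Stand-in informativeness score for a set of marker indices."""
            idx = list(panel)
            return float(np.sum(np.abs(freq_pop1[idx] - freq_pop2[idx])))

        def greedy_panel(freq_pop1, freq_pop2, panel_size):
            """Greedily add the marker that most increases the panel score."""
            chosen, remaining = [], set(range(len(freq_pop1)))
            while len(chosen) < panel_size and remaining:
                best = max(remaining,
                           key=lambda m: panel_score(chosen + [m], freq_pop1, freq_pop2))
                chosen.append(best)
                remaining.remove(best)
            return chosen

        rng = np.random.default_rng(0)
        f1, f2 = rng.uniform(0, 1, 500), rng.uniform(0, 1, 500)
        print(greedy_panel(f1, f2, panel_size=10))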

  16. Differences in Life Expectancy and Disability Free Life Expectancy in Italy. A Challenge to Health Systems

    ERIC Educational Resources Information Center

    Burgio, A.; Murianni, L.; Folino-Gallo, P.

    2009-01-01

    Background: Measures of health expectancy such as Disability Free Life Expectancy are used to evaluate and compare regional/national health statuses. These indicators are useful for understanding changes in the health status and defining health policies and decisions on the provision of services because provide useful information on possible areas…

  17. Expected Value and Variance 6.1 Expected Value of Discrete Random Variables

    E-print Network

    Indiana University

    Chapter 6: Expected Value and Variance. Section 6.1: Expected Value of Discrete Random Variables. The chapter introduces the expected value as a summary of the probability distribution of a numerically-valued random variable, illustrated with the example of tossing a fair coin three times: let X denote the number of heads which appear, and consider the possible values of X.
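
    The worked example the excerpt refers to is standard: with three fair coin tosses, P(X=0)=1/8, P(X=1)=3/8, P(X=2)=3/8, and P(X=3)=1/8, so the definition of expected value gives (in LaTeX notation):

        E[X] = \sum_{x=0}^{3} x \, P(X = x)
             = 0\cdot\tfrac{1}{8} + 1\cdot\tfrac{3}{8} + 2\cdot\tfrac{3}{8} + 3\cdot\tfrac{1}{8}
             = \tfrac{3}{2}.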

  18. The Maximization of Teacher Assignment: A Linear Programming Model

    ERIC Educational Resources Information Center

    Berrie, Phillip J.

    1974-01-01

    This study attempted to maximize the effectiveness of the teaching force through a successful assignment of teachers by considering the individual needs, competencies, and preferences of teachers and administrative staff, and selected student and curriculum factors. (Author)
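
    The abstract gives no model details, so the sketch below is only a hypothetical illustration of an assignment formulation: made-up teacher-to-section preference scores are maximized with SciPy's Hungarian-algorithm solver, the classic special case of such linear programming models.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        # Hypothetical scores combining needs, competencies, and preferences
        # (rows = teachers, columns = course sections).
        scores = np.array([[9, 4, 7],
                           [6, 8, 5],
                           [5, 7, 9]])

        # linear_sum_assignment minimizes cost, so negate the scores to maximize them.
        rows, cols = linear_sum_assignment(-scores)
        for t, c in zip(rows, cols):
            print(f"teacher {t} -> section {c} (score {scores[t, c]})")
        print("total score:", scores[rows, cols].sum())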

  19. DSS Model for Profit Maximization at Customer Enquiry Evaluation Stage

    E-print Network

    Xiong, M.H.

    This paper presents an optimal method and a heuristic approach which aim at maximizing profit when responding to a set of customer enquiries under limited capacity. The model takes into consideration the quantity of ...

  20. Maximizing the Potential of Multiculturalism in the Classroom.

    ERIC Educational Resources Information Center

    Ziegler, Suzanne

    1981-01-01

    Aimed at maximizing the positive potential of a multicultural classroom and improving intergroup relations, three behavioral and three attitudinal objectives, with suggested programs and evaluative measures, are presented. (NEC)

  1. Supply Chain Network Design Under Profit Maximization and Oligopolistic Competition

    E-print Network

    Nagurney, Anna

    Supply Chain Network Design Under Profit Maximization and Oligopolistic Competition. Anna Nagurney, Department of Finance and Operations Management, Isenberg School of Management, University of Massachusetts. The paper models the supply chain network design problem with oligopolistic firms who are involved…

  2. Sensitivity to conversational maxims in deaf and hearing children.

    PubMed

    Surian, Luca; Tedoldi, Mariantonia; Siegal, Michael

    2010-09-01

    We investigated whether access to a sign language affects the development of pragmatic competence in three groups of deaf children aged 6 to 11 years: native signers from deaf families receiving bimodal/bilingual instruction, native signers from deaf families receiving oralist instruction and late signers from hearing families receiving oralist instruction. The performance of these children was compared to a group of hearing children aged 6 to 7 years on a test designed to assess sensitivity to violations of conversational maxims. Native signers with bimodal/bilingual instruction were as able as the hearing children to detect violations that concern truthfulness (Maxim of Quality) and relevance (Maxim of Relation). On items involving these maxims, they outperformed both the late signers and native signers attending oralist schools. These results dovetail with previous findings on mindreading in deaf children and underscore the role of early conversational experience and instructional setting in the development of pragmatics. PMID:19719886

  3. Maximal slicing of D-dimensional spherically symmetric vacuum spacetime

    SciTech Connect

    Nakao, Ken-ichi; Abe, Hiroyuki; Yoshino, Hirotaka; Shibata, Masaru

    2009-10-15

    We study the foliation of a D-dimensional spherically symmetric black-hole spacetime with D ≥ 5 by two kinds of one-parameter families of maximal hypersurfaces: a reflection-symmetric foliation with respect to the wormhole slot and a stationary foliation that has an infinitely long trumpetlike shape. As in the four-dimensional case, the foliations by the maximal hypersurfaces avoid the singularity irrespective of the dimensionality. This indicates that the maximal slicing condition will be useful for simulating higher-dimensional black-hole spacetimes in numerical relativity. For the case of D=5, we present analytic solutions of the intrinsic metric, the extrinsic curvature, the lapse function, and the shift vector for the foliation by the stationary maximal hypersurfaces. These data will be useful for checking five-dimensional numerical-relativity codes based on the moving puncture approach.

  4. Morning vs. evening maximal cycle power and technical swimming ability.

    PubMed

    Deschodt, Veronique J; Arsac, Laurent M

    2004-02-01

    The aim of this study was to observe diurnal influences on maximal power and technical swimming ability at three different times (8 AM, 1 PM, and 6 PM). Prior to each test, tympanic temperature was taken. Maximal power was analyzed by cycle tests. Stroke length, stroke rate, hand pattern, and swimming velocity were recorded between the 20th and the 28th m of the 50-m freestyle. Temperature varied +/-0.4 degrees C between morning and evening. Concomitantly, maximal power (+7%) and technical ability (+3% in stroke length, +5% in stroke rate and changes in underwater hand coordinates) were greater in the evening. The present study confirms and specifies diurnal influences on all-out performances with regard to both maximal power and technical ability. Thus, when swimmers are called upon to perform at a high level in the morning, they should warm up extensively in order to "swamp" the diurnal effects of the morning. PMID:14971968

  5. Obtaining Maximal Concatenated Phylogenetic Data Sets from Large Sequence Databases

    E-print Network

    Sanderson, Michael J.

    Obtaining Maximal Concatenated Phylogenetic Data Sets from Large Sequence Databases. Michael J. Sanderson. For tree reconstruction, phylogeneticists are extracting increasingly large multigene data sets from sequence databases; the method described permits all such data sets to be obtained in reasonable computing times even for large numbers…

  6. Seed supply for broadscale restoration: maximizing evolutionary potential

    E-print Network

    Vesk, Peter

    Seed supply for broadscale restoration: maximizing evolutionary potential. A synthesis of the major paradigms that drive seed sourcing guidelines and of issues of seed quality. Correspondence: Linda M. Broadhurst, CSIRO Plant Industry, PO Box 1600, Canberra ACT 2601.

  7. Local distinguishability of maximally entangled states in canonical form

    NASA Astrophysics Data System (ADS)

    Zhang, Zhi-Chao; Gao, Fei; Qin, Su-Juan; Zuo, Hui-Juan; Wen, Qiao-Yan

    2015-10-01

    In this paper, we mainly study the local distinguishability of mutually orthogonal maximally entangled states in canonical form. In d ⊗ d, Nathanson (Phys Rev A 88:062316, 2013) presented a feasible necessary and sufficient condition for distinguishing general bipartite quantum states by one-way local operations and classical communication (LOCC). However, for maximally entangled states in canonical form, it is still unknown how to more effectively judge whether there exists a state such that the unitary operators corresponding to those maximally entangled states are pairwise orthogonal. In this work, we exhibit one method which can be used to judge this more effectively. Furthermore, we construct some sets of maximally entangled states and, with the help of our new method, show that those states cannot be distinguished by one-way LOCC.
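
    As a toy illustration of the kind of pairwise-orthogonality bookkeeping involved, the numpy sketch below builds the Weyl (generalized Pauli) unitaries that generate the canonical maximally entangled states in d ⊗ d and verifies that distinct operators are Hilbert-Schmidt orthogonal. Note that the criterion discussed in the paper concerns orthogonality with respect to a particular state, which this simple global check does not capture.

        import numpy as np

        def weyl(d, a, b):
            """Generalized Pauli unitary shift^a * clock^b in dimension d."""
            omega = np.exp(2j * np.pi / d)
            X = np.roll(np.eye(d), 1, axis=0)       # shift operator
            Z = np.diag(omega ** np.arange(d))      # clock operator
            return np.linalg.matrix_power(X, a) @ np.linalg.matrix_power(Z, b)

        d = 3
        ops = [weyl(d, a, b) for a in range(d) for b in range(d)]
        # Hilbert-Schmidt inner products Tr(U_i^dagger U_j): d on the diagonal, 0 elsewhere.
        gram = np.array([[np.trace(U.conj().T @ V) for V in ops] for U in ops])
        print(np.round(np.abs(gram), 6))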

  8. Young infants have biological expectations about animals

    PubMed Central

    Setoh, Peipei; Wu, Di; Baillargeon, Renée; Gelman, Rochel

    2013-01-01

    What are the developmental origins of our concept of animal? There has long been controversy concerning this question. At issue is whether biological reasoning develops from earlier forms of reasoning, such as physical and psychological reasoning, or whether from a young age children endow animals with biological properties. Here we demonstrate that 8-mo-old infants already expect novel objects they identify as animals to have insides. Infants detected a violation when an object that was self-propelled and agentive (but not an object that lacked one or both of these properties) was revealed to be hollow. Infants also detected a violation when an object that was self-propelled and furry (but not an object that lacked one or both of these properties) either was shown to be hollow or rattled (when shaken) as though mostly hollow. Young infants’ expectations about animals’ insides may serve as a foundation for the development of more advanced biological knowledge. PMID:24003134

  9. A new augmentation based algorithm for extracting maximal chordal subgraphs

    SciTech Connect

    Bhowmick, Sanjukta; Chen, Tzu-Yi; Halappanavar, Mahantesh

    2015-02-01

    A graph is chordal if every cycle of length greater than three contains an edge between non-adjacent vertices. Chordal graphs are of interest both theoretically, since they admit polynomial time solutions to a range of NP-hard graph problems, and practically, since they arise in many applications including sparse linear algebra, computer vision, and computational biology. A maximal chordal subgraph is a chordal subgraph that is not a proper subgraph of any other chordal subgraph. Existing algorithms for computing maximal chordal subgraphs depend on dynamically ordering the vertices, which is an inherently sequential process and therefore limits the algorithms’ parallelizability. In this paper we explore techniques to develop a scalable parallel algorithm for extracting a maximal chordal subgraph. We demonstrate that an earlier attempt at developing a parallel algorithm may induce a non-optimal vertex ordering and is therefore not guaranteed to terminate with a maximal chordal subgraph. We then give a new algorithm that first computes and then repeatedly augments a spanning chordal subgraph. After proving that the algorithm terminates with a maximal chordal subgraph, we then demonstrate that this algorithm is more amenable to parallelization and that the parallel version also terminates with a maximal chordal subgraph. That said, the complexity of the new algorithm is higher than that of the previous parallel algorithm, although the earlier algorithm computes a chordal subgraph which is not guaranteed to be maximal. We experimented with our augmentation-based algorithm on both synthetic and real-world graphs. We provide scalability results and also explore the effect of different choices for the initial spanning chordal subgraph on both the running time and on the number of edges in the maximal chordal subgraph.
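
    A serial, simplified sketch of the augmentation idea (start from a spanning forest, which is chordal, and keep adding edges that preserve chordality) can be written with networkx; this is only an illustration of the concept and omits the vertex-ordering and parallelization issues the paper addresses.

        import networkx as nx

        def maximal_chordal_subgraph(G):
            """Grow a spanning forest into a maximal chordal subgraph of G."""
            H = nx.Graph()
            H.add_nodes_from(G.nodes())
            H.add_edges_from(nx.minimum_spanning_edges(G, data=False))  # a forest is chordal
            changed = True
            while changed:                      # repeat passes until no edge can be added
                changed = False
                for u, v in G.edges():
                    if H.has_edge(u, v):
                        continue
                    H.add_edge(u, v)
                    if nx.is_chordal(H):
                        changed = True
                    else:
                        H.remove_edge(u, v)     # this edge would break chordality; undo
            return H

        G = nx.erdos_renyi_graph(30, 0.3, seed=42)
        H = maximal_chordal_subgraph(G)
        print(len(G.edges()), "edges in G ->", len(H.edges()), "in a maximal chordal subgraph")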

  10. Maximizing the value of education for university undergraduate research fellows 

    E-print Network

    Tilley, Aaron Benjamin

    2013-02-22

    Maximizing the Value of Education for University Undergraduate Research Fellows. A Senior Honors Thesis by Aaron Benjamin Tilley, submitted to the Office of Honors Programs & Academic Scholarships, Texas A&M University, in partial fulfillment of the requirements for the designation of University Undergraduate Research Fellow. April 2000. Group: Political Science.

  11. PLATO Simulator: Realistic simulations of expected observations

    NASA Astrophysics Data System (ADS)

    Marcos-Arenal, P.; Zima, W.; De Ridder, J.; Aerts, C.; Huygen, R.; Samadi, R.; Green, J.; Piotto, G.; Salmon, S.; Catala, C.; Rauer, H.

    2015-06-01

    PLATO Simulator is an end-to-end simulation software tool designed for the performance of realistic simulations of the expected observations of the PLATO mission but easily adaptable to similar types of missions. It models and simulates photometric time-series of CCD images by including models of the CCD and its electronics, the telescope optics, the stellar field, the jitter movements of the spacecraft, and all important natural noise sources.
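
    A toy version of such an end-to-end noise budget can be sketched in a few lines: a constant star whose measured counts are modulated by pointing jitter and degraded by photon and read noise. All numbers below are made-up placeholders, not PLATO Simulator parameters, and the sketch ignores the optics, stellar field, and detector models the tool actually includes.

        import numpy as np

        def toy_light_curve(n_cadences=1000, flux_e=50_000.0, jitter_rms=0.002,
                            read_noise_e=10.0, seed=0):
            """Toy photometric time series: jitter modulation plus photon and read noise."""
            rng = np.random.default_rng(seed)
            jitter = 1.0 + jitter_rms * rng.standard_normal(n_cadences)   # pointing-induced flux change
            counts = rng.poisson(flux_e * jitter) + read_noise_e * rng.standard_normal(n_cadences)
            return counts / flux_e                                        # normalized light curve

        lc = toy_light_curve()
        print("relative scatter per cadence:", lc.std())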

  12. First Contact: Expectations of Beginning Astronomy Students

    NASA Astrophysics Data System (ADS)

    Lacey, T. L.; Slater, T. F.

    1999-05-01

    Three hundred seven undergraduate students enrolled in Introductory Astronomy were surveyed at the beginning of class to determine their expectations for course content. The course serves as a survey of astronomy for non-science majors and is a distribution course for general education core requirements. The course has no prerequisites, meets three times each week for 50 minutes, and represents three semester credit hours. The university catalog describes the course with the title "PHYSICS 101 - Mysteries of the Sky" and the official course description is: a survey of the struggle to understand the Universe and our place therein. The structure, growth, methods, and limitations of science will be illustrated using the development of astronomy as a vehicle. Present day views of the Universe are presented. Two questions were asked as open response items: What made you decide to take this course? and What do you expect to learn in this course? The reasons students cited for taking the course, in order of frequency, were: interest in astronomy, an interesting or fun-sounding course, required general education fulfillment, and recommendation by a peer. Secondary reasons cited were: required for the major or minor, general interest in science, and availability in the schedule. Tertiary reasons listed were: recommendation by an advisor or orientation leader, inflating grade point average, and having heard good things about the teacher. The students' expectations about what they would learn in the course were numerous. The most common objects listed, in order of frequency, were: stars, constellations, planets, galaxies, black holes, solar system, comets, asteroids, moon, and Sun. More interesting were the aspects not specifically related to astronomy: weather, atmosphere, UFOs and the unexplained, and, generally, things in the sky. A mid-course survey suggests that students expected to learn more constellations and that the topics would be less in-depth.

  13. Investigating expectation effects using multiple physiological measures

    PubMed Central

    Siller, Alexander; Ambach, Wolfgang; Vaitl, Dieter

    2015-01-01

    The study aimed at experimentally investigating whether the human body can anticipate future events under improved methodological conditions. Previous studies have reported contradictory results for the phenomenon typically called presentiment. If the positive findings are accurate, they call into doubt our views about human perception, and if they are inaccurate, a plausible conventional explanation might be based on the experimental design of the previous studies, in which expectation due to item sequences was misinterpreted as presentiment. To address these points, we opted to collect several physiological variables, to test different randomization types and to manipulate subjective significance individually. For the latter, we combined a mock crime scenario, in which participants had to steal specific items, with a concealed information test (CIT), in which the participants had to conceal their knowledge when interrogated about items they had stolen or not stolen. We measured electrodermal activity, respiration, finger pulse, heart rate (HR), and reaction times. The participants (n = 154) were assigned randomly to four different groups. Items presented in the CIT were either drawn with replacement (full) or without replacement (pseudo) and were either presented category-wise (cat) or regardless of categories (nocat). To understand how these item sequences influence expectation and modulate physiological reactions, we compared the groups with respect to effect sizes for stolen vs. not stolen items. Group pseudo_cat yielded the highest effect sizes, and pseudo_nocat yielded the lowest. We could not find any evidence of presentiment but did find evidence of physiological correlates of expectation. Due to the design differing fundamentally from previous studies, these findings do not allow for conclusions on the question whether the expectation bias is being confounded with presentiment. PMID:26500600

  14. Setting clear expectations for safety basis development

    SciTech Connect

    MORENO, M.R.

    2003-05-03

    DOE-RL has set clear expectations for a cost-effective approach for achieving compliance with the Nuclear Safety Management requirements (10 CFR 830, Nuclear Safety Rule) which will ensure long-term benefit to Hanford. To facilitate implementation of these expectations, tools were developed to streamline and standardize safety analysis and safety document development, resulting in a shorter and more predictable DOE approval cycle. A Hanford Safety Analysis and Risk Assessment Handbook (SARAH) was issued to standardize methodologies for the development of safety analyses. A Microsoft Excel spreadsheet (RADIDOSE) was issued for the evaluation of radiological consequences for accident scenarios often postulated for Hanford. A standard Site Documented Safety Analysis (DSA) detailing the safety management programs was issued for use as a means of compliance with a majority of 3009 Standard chapters. An in-process review was developed between DOE and the Contractor to facilitate DOE approval and provide early course correction. As a result of setting expectations and providing safety analysis tools, the four Hanford Site waste management nuclear facilities were able to integrate into one Master Waste Management Documented Safety Analysis (WM-DSA).

  15. Expected performance of m-solution backtracking

    NASA Technical Reports Server (NTRS)

    Nicol, D. M.

    1986-01-01

    This paper derives upper bounds on the expected number of search tree nodes visited during an m-solution backtracking search, a search which terminates after some preselected number m of problem solutions have been found. The search behavior is assumed to have a general probabilistic structure. The results are stated in terms of node expansion and contraction. A visited search tree node is said to be expanding if the mean number of its children visited by the search exceeds 1 and is contracting otherwise. It is shown that if every node expands, or if every node contracts, then the number of search tree nodes visited by a search has an upper bound which is linear in the depth of the tree, in the mean number of children a node has, and in the number of solutions sought. Also derived are bounds linear in the depth of the tree in some situations where an upper portion of the tree contracts (expands), while the lower portion expands (contracts). While previous analyses of 1-solution backtracking have concluded that the expected performance is always linear in the tree depth, the present model allows superlinear expected performance.
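
    The node-counting behaviour is easy to explore empirically. The sketch below runs an m-solution depth-first search over a complete tree in which each leaf is a solution with some fixed probability, and reports the average number of nodes visited; the branching factor, depth, and probabilities are illustrative choices, not taken from the paper.

        import random

        def m_solution_search(depth, branching, p_solution, m, rng):
            """Depth-first search that stops after m solutions; returns nodes visited."""
            visited, found = 0, 0

            def dfs(level):
                nonlocal visited, found
                visited += 1
                if level == depth:
                    if rng.random() < p_solution:
                        found += 1
                    return
                for _ in range(branching):
                    if found >= m:          # terminate once m solutions are found
                        return
                    dfs(level + 1)

            dfs(0)
            return visited

        rng = random.Random(0)
        runs = [m_solution_search(depth=12, branching=2, p_solution=0.05, m=3, rng=rng)
                for _ in range(200)]
        print("mean nodes visited:", sum(runs) / len(runs))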

  16. On the maximal efficiency of the collisional Penrose process

    E-print Network

    Elly Leiderschneider; Tsvi Piran

    2015-10-22

    The center of mass (CM) energy in a collisional Penrose process - a collision taking place within the ergosphere of a Kerr black hole - can diverge under suitable extreme conditions (maximal Kerr, near horizon collision and suitable impact parameters). We present an analytic expression for the CM energy, refining expressions given in the literature. Even though the CM energy diverges, we show that the maximal energy attained by a particle that escapes the black hole's gravitational pull and reaches infinity is modest. We obtain an analytic expression for the energy of an escaping particle resulting from a collisional Penrose process, and apply it to derive the maximal energy and the maximal efficiency for several physical scenarios: pair annihilation, Compton scattering, and the elastic scattering of two massive particles. In all physically reasonable cases (in which the incident particles initially fall from infinity towards the black hole) the maximal energy (and the corresponding efficiency) are only one order of magnitude larger than the rest mass energy of the incident particles. The maximal efficiency found is approximately 13.92, and it is obtained for the scattering of an outgoing massless particle by a massive particle.

  17. Enumerating all maximal frequent subtrees in collections of phylogenetic trees

    PubMed Central

    2014-01-01

    Background A common problem in phylogenetic analysis is to identify frequent patterns in a collection of phylogenetic trees. The goal is, roughly, to find a subset of the species (taxa) on which all or some significant subset of the trees agree. One popular method to do so is through maximum agreement subtrees (MASTs). MASTs are also used, among other things, as a metric for comparing phylogenetic trees, computing congruence indices and to identify horizontal gene transfer events. Results We give algorithms and experimental results for two approaches to identify common patterns in a collection of phylogenetic trees, one based on agreement subtrees, called maximal agreement subtrees, the other on frequent subtrees, called maximal frequent subtrees. These approaches can return subtrees on larger sets of taxa than MASTs, and can reveal new common phylogenetic relationships not present in either MASTs or the majority rule tree (a popular consensus method). Our current implementation is available on the web at https://code.google.com/p/mfst-miner/. Conclusions Our computational results confirm that maximal agreement subtrees and all maximal frequent subtrees can reveal a more complete phylogenetic picture of the common patterns in collections of phylogenetic trees than maximum agreement subtrees; they are also often more resolved than the majority rule tree. Further, our experiments show that enumerating maximal frequent subtrees is considerably more practical than enumerating ordinary (not necessarily maximal) frequent subtrees. PMID:25061474

  18. Maximizing the detection of near-Earth objects

    NASA Astrophysics Data System (ADS)

    Albin, T.; Albrecht, S.; Koschny, D.; Drolshagen, G.

    2014-07-01

    Planetary bodies with a perihelion equal to or less than 1.3 astronomical units (au) are called near-Earth objects (NEOs). These objects are divided into 4 sub-families; two of them cross Earth's orbit and may be a potential hazard for the planet. The Tunguska event and the incident in Chelyabinsk last year have shown the devastating destructiveness of NEOs with a size of only approximately 40 and 20 meters, respectively. To predict and identify further threats, telescopic NEO surveys currently extend our knowledge of the population of these objects. Today (March 2014) approximately 10,700 NEOs are known. Based on an extrapolation of the current population, Bottke et al. (2002) predict a total number of N ≈ (1.0±0.5)×10^8 NEOs up to an absolute magnitude of H = 30.5 mag. Additionally, Bottke et al. (2002) computed a de-biased model of the expected orbital element distribution of the NEOs. They have investigated the theoretical distribution of NEOs by a dynamical simulation, following the orbital evolution of these objects from several source regions. Based on both models we performed simulations of the detectability of the theoretical NEO population for certain telescopes with certain properties. The goal of these simulations is to optimize the search strategies of NEO surveys. Our simulation models the optical telescope attributes (main and secondary mirror size, optical throughput, field-of-view), the electronics (CCD camera, pixel size, quantum efficiency, gain, exposure time, pixel binning, dark / bias noise, signal-to-noise ratio), atmospheric effects (seeing, sky background illumination) and the brightness and angular velocity of the NEOs. We present exemplary results for two telescopes currently being developed by the European Space Agency for a future NEO survey: the so-called Fly-Eye Telescope, a 1-m effective aperture telescope with a field of view of 6.5×6.5 deg^2, and the Test-Bed Telescope, with an aperture of 56 cm and a field of view of 2.2×2.2 deg^2. The results for both telescopes can be easily adapted to other telescopes with similar properties. We show different observation strategies to maximize the detection rate of undiscovered NEOs depending on different telescope operation modes (exposure time, pixel binning) and different sky conditions (seeing, sky background brightness).

  19. Electric utility credit trends

    SciTech Connect

    Burkhardt, D.A.; Whitfield, H.

    1990-10-11

    This article looks at the changes in credit ratings for electric utilities in the 1980s and some of the reasons for these changes. The ratings examined are those of Moody's and Standard and Poor's. Most of the downgrades occurred as a result of increased debt and problems with nuclear power. General Public Utilities and Allegheny Power Systems were identified as two of the utilities whose ratings improved.

  20. The moderating effect of gender on the relation between expectancies and gambling frequency among college students.

    PubMed

    Teeters, Jenni B; Ginley, Meredith K; Whelan, James P; Meyers, Andrew W; Pearlson, Godfrey D

    2015-03-01

    Compared to college females, college males are more likely to report frequent gambling. Research on gambling outcome expectancies has shown that expectations about gambling influence gambling behavior and that endorsement of particular expectancies differs by gender. Knowledge regarding the differential predictive utility of specific gambling expectancies based on gender would help to determine how beliefs about gambling may be fundamentally different for men and women. The present study explored whether gender moderates the relation between gambling expectancy and gambling frequency in a college sample. A total of 421 college students completed an online survey that included questions about their demographics, gambling frequency, and gambling expectancies. Hierarchical regression analyses indicated that gender moderated the relations between the expectancies of social consequences and material gain and gambling frequency. For females, greater endorsement of social consequences predicted less frequent gambling. For both males and females, greater endorsement of material gain predicted more frequent gambling. The current findings can help inform prevention and intervention efforts by identifying gambling expectations that are differentially related to college students' gambling behavior choices. PMID:24065315

  1. Moving multiple sinks through wireless sensor networks for lifetime maximization.

    SciTech Connect

    Petrioli, Chiara; Carosi, Alessio; Basagni, Stefano; Phillips, Cynthia Ann

    2008-01-01

    Unattended sensor networks typically watch for some phenomena such as volcanic events, forest fires, pollution, or movements in animal populations. Sensors report to a collection point periodically or when they observe reportable events. When sensors are too far from the collection point to communicate directly, other sensors relay messages for them. If the collection point location is static, sensor nodes that are closer to the collection point relay far more messages than those on the periphery. Assuming all sensor nodes have roughly the same capabilities, those with high relay burden experience battery failure much faster than the rest of the network. However, since their death disconnects the live nodes from the collection point, the whole network is then dead. We consider the problem of moving a set of collectors (sinks) through a wireless sensor network to balance the energy used for relaying messages, maximizing the lifetime of the network. We show how to compute an upper bound on the lifetime for any instance using linear and integer programming. We present a centralized heuristic that produces sink movement schedules with network lifetimes within 1.4% of the upper bound for realistic settings. We also present a distributed heuristic that produces lifetimes at most 25.3% below the upper bound. More specifically, we formulate a linear program (LP) that is a relaxation of the scheduling problem. The variables are naturally continuous, but the LP relaxes some constraints. The LP has an exponential number of constraints, but we can satisfy them all by enforcing only a polynomial number using a separation algorithm. This separation algorithm is a p-median facility location problem, which we can solve efficiently in practice for huge instances using integer programming technology. This LP selects a set of good sensor configurations. Given the solution to the LP, we can find a feasible schedule by selecting a subset of these configurations, ordering them via a traveling salesman heuristic, and computing feasible transitions using matching algorithms. This algorithm assumes sinks can get a schedule from a central server or a leader sink. If the network owner prefers that the sinks make independent decisions, they can use our distributed heuristic. In this heuristic, sinks maintain estimates of the energy distribution in the network and move greedily (with some coordination) based on local search. This application uses the new SUCASA (Solver Utility for Customization with Automatic Symbol Access) facility within the PICO (Parallel Integer and Combinatorial Optimizer) integer programming solver system. SUCASA allows rapid development of customized math programming (search-based) solvers using a problem's natural multidimensional representation. In this case, SUCASA also significantly improves runtime compared to implementations in the AMPL math programming language or in Perl.
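
    The flavour of the LP upper bound can be conveyed with a toy instance: given a handful of candidate sink configurations and the per-node energy drain rate under each, choose how long to operate each configuration so that total operating time (network lifetime) is maximized without exhausting any node's battery. The numbers below are invented, and the separation/p-median machinery described above is omitted.

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical drain rates (rows = sensor nodes, columns = sink configurations).
        drain = np.array([[3.0, 1.0, 0.5],
                          [1.0, 2.5, 1.0],
                          [0.5, 1.0, 3.0]])
        energy = np.array([100.0, 100.0, 100.0])   # per-node battery budget

        # Maximize sum of configuration durations t_c  <=>  minimize -sum(t_c),
        # subject to: total energy drained at each node stays within its budget.
        res = linprog(c=-np.ones(drain.shape[1]), A_ub=drain, b_ub=energy, bounds=(0, None))
        print("lifetime upper bound:", -res.fun)
        print("time in each configuration:", res.x)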

  2. Bison distribution under conflicting foraging strategies: site fidelity vs. energy maximization.

    PubMed

    Merkle, Jerod A; Cherry, Seth G; Fortin, Daniel

    2015-07-01

    Foraging strategies based on site fidelity and maximization of energy intake rate are two adaptive forces shaping animal behavior. Whereas these strategies can both be evolutionarily stable, they predict conflicting optimal behaviors when population abundance is in decline. In such a case, foragers employing an energy-maximizing strategy should reduce their use of low-quality patches as interference competition becomes less intense for high-quality patches. Foragers using a site fidelity strategy, however, should continue to use familiar patches. Because natural fluctuations in population abundance provide the only non-manipulative opportunity to evaluate adaptation to these evolutionary forces, few studies have examined these foraging strategies simultaneously. Using abundance and space use data from a free-ranging bison (Bison bison) population living in a meadow-forest matrix in Prince Albert National Park, Canada, we determined how individuals balance the trade-off between site fidelity and energy-maximizing patch choice strategies with respect to changes in population abundance. From 1996 to 2005, bison abundance increased from 225 to 475 and then decreased to 225 by 2013. During the period of population increase, population range size increased. This expansion involved the addition of relatively less profitable areas and patches, leading to a decrease in the mean expected profitability of the range. Yet, during the period of population decline, we detected neither a subsequent retraction in population range size nor an increase in mean expected profitability of the range. Further, patch selection models during the population decline indicated that, as density decreased, bison portrayed stronger fidelity to previously visited meadows, but no increase in selection strength for profitable meadows. Our analysis reveals that an energy-maximizing patch choice strategy alone cannot explain the distribution of individuals and populations, and site fidelity is an important evolutionary force shaping animal distribution. Animals may not always forage in the richest patches available, as ecological theory would often predict, but their use of profitable patches is dependent on population dynamics and the strength of site fidelity. Our findings are likewise relevant to applied inquiries such as forecasting species range shifts and reducing human-wildlife conflicts. PMID:26378302

  3. International utilization and operations

    NASA Technical Reports Server (NTRS)

    Goldberg, Stanley R.

    1989-01-01

    The international framework of the Space Station Freedom Program is described. The discussion covers the U.S. space policy, international agreements, international Station elements, overall program management structure, and utilization and operations management. Consideration is also given to Freedom's user community, Freedom's crew, pressurized payload and attached payload accommodations, utilization and operations planning, user integration, and user operations.

  4. The Long Arm of Expectancies: Adolescent Alcohol Expectancies Predict Adult Alcohol Use

    PubMed Central

    Patrick, Megan E.; Wray-Lake, Laura; Finlay, Andrea K.; Maggs, Jennifer L.

    2010-01-01

    Aims: Alcohol expectancies are strong concurrent predictors of alcohol use and problems, but the current study addressed their unique power to predict from adolescence to midlife. Method: Long-term longitudinal data from the national British Cohort Study 1970 (N = 2146, 59.8% female) were used to predict alcohol use and misuse in the mid-30s by alcohol expectancies reported in adolescence. Results: Cohort members with more positive alcohol expectancies at age 16 reported greater alcohol quantity concurrently, increases in alcohol quantity relative to their peers between ages 16 and 35, and a higher likelihood of lifetime and previous year alcohol misuse at age 35, independent of gender, social class in family of origin, age of alcohol use onset, adolescent delinquent behavior and age 16 exam scores. Conclusions: Alcohol expectancies were strong proximal predictors of alcohol use and predicted relative change in alcohol use and misuse across two decades into middle adulthood. PMID:19808940

  5. Expectancy and Treatment Interactions: A Dissociation between Acupuncture Analgesia and Expectancy Evoked Placebo Analgesia

    E-print Network

    Kong, Jian

    Recent advances in placebo research have demonstrated the mind's power to alter physiology. In this study, we combined an expectancy manipulation model with both verum and sham acupuncture treatments to address: 1) how and ...

  6. The WFIRST Microlensing Survey: Expectations and Unexpectations

    NASA Astrophysics Data System (ADS)

    Gaudi, B. Scott; Penny, Matthew

    2016-01-01

    The WFIRST microlensing survey will provide the definitive determination of the demographics of cool planets with semimajor axes > 1 AU and masses greater than that of the Earth, including free-floating planets. Together with the results from Kepler, TESS, and PLATO, WFIRST will complete the statistical census of planets in the Galaxy. These expectations are based on the most basic and conservative assumptions about the data quality, and assume that the analysis methodologies will be similar to those used for current ground-based microlensing. Yet, in fact, the data quality will be dramatically better, and the information content substantially richer, for the WFIRST microlensing survey as compared to current ground-based surveys. Thus WFIRST should allow for orders of magnitude improvement in both sensitivity and science yield. We will review some of these expected improvements and opportunities (the "known unknowns"), and provide a "to do list" of what tasks will need to be completed in order to take advantage of these opportunities. We will then speculate on the opportunities that we may not be aware of yet (the "unknown unknowns"), how we might go about determining what those opportunities are, and how we might figure out what we will need to do to take advantage of them. This work was partially supported by NASA grant NNX14AF63G.

  7. The superego, narcissism and Great Expectations.

    PubMed

    Ingham, Graham

    2007-06-01

    The author notes that the concepts of the superego and narcissism were linked at conception and that superego pathology may be seen as a determining factor in the formation of a narcissistic disorder; thus an examination of the superego can function as a "biopsy", indicating the condition of the personality as a whole. Charles Dickens's novel "Great Expectations" is presented as a penetrating exploration of these themes and it is argued that in Pip, the central character, Dickens provides a perceptive study of the history of a narcissistic condition. Other key figures in the book are understood as superego representations and, as such, integral to the vicissitudes of Pip's development. In particular, the lawyer Jaggers is considered as an illustration of Bion's notion of the "ego-destructive superego". In the course of the paper, the author suggests that Great Expectations affirms the psychoanalytic understanding that emotional growth and some recovery from narcissistic difficulties necessarily take place alongside modification of the superego, allowing for responsible knowledge of the state of the object and the possibility of realistic reparation. PMID:17537703

  8. Expectations and Interpretations During Causal Learning

    PubMed Central

    Luhmann, Christian C.; Ahn, Woo-kyoung

    2012-01-01

    In existing models of causal induction, 4 types of covariation information (i.e., presence/absence of an event followed by presence/absence of another event) always exert identical influences on causal strength judgments (e.g., joint presence of events always suggests a generative causal relationship). In contrast, we suggest that, due to expectations developed during causal learning, learners give varied interpretations to covariation information as it is encountered and that these interpretations influence the resulting causal beliefs. In Experiments 1A–1C, participants’ interpretations of observations during a causal learning task were dynamic, expectation based, and, furthermore, strongly tied to subsequent causal judgments. Experiment 2 demonstrated that adding trials of joint absence or joint presence of events, whose roles have been traditionally interpreted as increasing causal strengths, could result in decreased overall causal judgments and that adding trials where one event occurs in the absence of another, whose roles have been traditionally interpreted as decreasing causal strengths, could result in increased overall causal judgments. We discuss implications for traditional models of causal learning and how a more top-down approach (e.g., Bayesian) would be more compatible with the current findings. PMID:21534705

  9. Anticipatory looks reveal expectations about discourse relations.

    PubMed

    Rohde, Hannah; Horton, William S

    2014-12-01

    Previous research provides evidence for expectation-driven processing within sentences at phonological, lexical, and syntactic levels of linguistic structure. Less well-established is whether comprehenders also anticipate pragmatic relationships between sentences. To address this, we evaluate a unit of discourse structure that comprehenders must infer to hold between sentences in order for a discourse to make sense: the intersentential coherence relation. In a novel eyetracking paradigm, we trained participants to associate particular spatial locations with particular coherence relations. Experiment 1 shows that the subset of listeners who successfully acquired the location-relation mappings during training subsequently looked to these locations during testing in response to a coherence-signaling intersentential connective. Experiment 2 finds that listeners' looks during sentences containing coherence-biasing verbs reveal expectations about upcoming sentence types. This work extends existing research on prediction beyond sentence-internal structure and provides a new methodology for examining the cues that comprehenders use to establish relationships at the discourse level. PMID:25247235

  10. The expected anisotropy in solid inflation

    NASA Astrophysics Data System (ADS)

    Bartolo, Nicola; Peloso, Marco; Ricciardone, Angelo; Unal, Caner

    2014-11-01

    Solid inflation is an effective field theory of inflation in which isotropy and homogeneity are accomplished via a specific combination of anisotropic sources (three scalar fields that individually break isotropy). This results in specific observational signatures that are not found in standard models of inflation: a non-trivial angular dependence for the squeezed bispectrum, and a possibly long period of anisotropic inflation (to drive inflation, the "solid" must be very insensitive to any deformation, and thus background anisotropies are very slowly erased). In this paper we compute the expected level of statistical anisotropy in the power spectrum of the curvature perturbations of this model. To do so, we account for the classical background values of the three scalar fields that are generated on large (superhorizon) scales during inflation via a random walk sum, as the perturbation modes leave the horizon. Such an anisotropy is unavoidably generated, even starting from perfectly isotropic classical initial conditions. The expected level of anisotropy is related to the duration of inflation and to the amplitude of the squeezed bispectrum. If this amplitude is close to its current observational limit (so that one of the most interesting predictions of the model can be observed in the near future), we find that a level of statistical anisotropy F^2 gives frozen and scale invariant vector perturbations on superhorizon scales.

  11. Identifying, meeting, and assessing customer expectations

    SciTech Connect

    Danner, T.A.

    1995-02-01

    Maintaining proficiency in carrying out mission goals is fundamental to the success of any organization. The definitive mission of the Waste Management and Remedial Action Division (WMRAD) of Oak Ridge National Laboratory (ORNL) is "to conduct waste management activities in a compliant, publicly acceptable, technically sound, and cost-efficient manner". In order to effectively fulfill this mission, we must meet or exceed several standards with respect to our customers. These include: (1) identifying current and future customer expectations; (2) managing our relationships with our customers; (3) ensuring our commitment to our customers; and (4) measuring our success in customer satisfaction. Our customers have a great variety of requirements and expectations. Many of these are in the form of local, state, and federal regulations and environmental standards. Others are brought to our attention through inquiries made to the Department of Energy (DOE). Consumer surveys have proven to be effective tools which have been used to make improvements, enhance certain program elements, and identify beneficial areas in already existing programs. In addition, national working groups, technology transfer meetings, and manager/contractor meetings offer excellent opportunities to assess our activities.

  12. The expected anisotropy in solid inflation

    SciTech Connect

    Bartolo, Nicola; Ricciardone, Angelo; Peloso, Marco; Unal, Caner

    2014-11-01

    Solid inflation is an effective field theory of inflation in which isotropy and homogeneity are accomplished via a specific combination of anisotropic sources (three scalar fields that individually break isotropy). This results in specific observational signatures that are not found in standard models of inflation: a non-trivial angular dependence for the squeezed bispectrum, and a possibly long period of anisotropic inflation (to drive inflation, the ''solid'' must be very insensitive to any deformation, and thus background anisotropies are very slowly erased). In this paper we compute the expected level of statistical anisotropy in the power spectrum of the curvature perturbations of this model. To do so, we account for the classical background values of the three scalar fields that are generated on large (superhorizon) scales during inflation via a random walk sum, as the perturbation modes leave the horizon. Such an anisotropy is unavoidably generated, even starting from perfectly isotropic classical initial conditions. The expected level of anisotropy is related to the duration of inflation and to the amplitude of the squeezed bispectrum. If this amplitude is close to its current observational limit (so that one of the most interesting predictions of the model can be observed in the near future), we find that a level of statistical anisotropy F² gives frozen and scale invariant vector perturbations on superhorizon scales.

  13. Ventilatory patterns differ between maximal running and cycling.

    PubMed

    Tanner, David A; Duke, Joseph W; Stager, Joel M

    2014-01-15

    To determine the effect of exercise mode on ventilatory patterns, 22 trained men performed two maximal graded exercise tests; one running on a treadmill and one cycling on an ergometer. Tidal flow-volume (FV) loops were recorded during each minute of exercise, with maximal loops measured pre- and post-exercise. Running resulted in a greater VO2peak than cycling (62.7±7.6 vs. 58.1±7.2 mL·kg⁻¹·min⁻¹). Although maximal ventilation (VE) did not differ between modes, ventilatory equivalents for O2 and CO2 were significantly larger during maximal cycling. Arterial oxygen saturation (estimated via ear oximeter) was also greater during maximal cycling, as were end-expiratory (EELV; 3.40±0.54 vs. 3.21±0.55 L) and end-inspiratory lung volumes (EILV; 6.24±0.88 vs. 5.90±0.74 L). Based on these results we conclude that ventilatory patterns differ as a function of exercise mode and these observed differences are likely due to the differences in posture adopted during exercise in these modes. PMID:24211317

  14. 5 CFR 351.807 - Certification of Expected Separation.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... false Certification of Expected Separation. 351.807 Section 351.807 ...351.807 Certification of Expected Separation. (a) For the purpose of enabling...agency may issue a Certificate of Expected Separation to a competing employee who...

  15. Comparing life expectancy and health-adjusted life expectancy by body mass index category in adult Canadians: a descriptive study

    PubMed Central

    2013-01-01

    Background While many studies have examined differences between body mass index (BMI) categories in terms of mortality risk and health-related quality of life (HRQL), little is known about the effect of body weight on health expectancy. We examined life expectancy (LE), health-adjusted life expectancy (HALE), and proportion of LE spent in nonoptimal (or poor) health by BMI category for the Canadian adult population (age ≥ 20). Methods Respondents to the National Population Health Survey (NPHS) were followed for mortality outcomes from 1994 to 2009. Our study population at baseline (n=12,478) was 20 to 100 years old with an average age of 47. LE was produced by building abridged life tables by sex and BMI category using data from the NPHS and the Canadian Chronic Disease Surveillance System. HALE was estimated using the Health Utilities Index from the Canadian Community Health Survey as a measure of HRQL. The contribution of HRQL to loss of healthy life years for each BMI category was also assessed using two methods: by calculating differences between LE and HALE proportional to LE and by using a decomposition technique to separate out mortality and HRQL contributions to loss of HALE. Results At age 20, for both sexes, LE is significantly lower in the underweight and obesity class 2+ categories, but significantly higher in the overweight category when compared to normal weight (obesity class 1 was nonsignificant). HALE at age 20 follows these same associations and is significantly lower for class 1 obesity in women. Proportion of life spent in nonoptimal health and decomposition of HALE demonstrate progressively higher losses of healthy life associated with lowered HRQL for BMI categories in excess of normal weight. Conclusions Although being in the overweight category for adults may be associated with a gain in life expectancy as compared to normal weight adults, overweight individuals also experience a higher proportion of these years of life in poorer health. Due to the descriptive nature of this study, further research is needed to explore the causal mechanisms which explain these results, including the important differences we observed between sexes and within obesity subcategories. PMID:24252500
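
    The LE/HALE relationship used in this study can be illustrated with a minimal Sullivan-style calculation on an abridged life table. The sketch below is only a toy example: the age bands, mortality rates, and HRQL weights are invented numbers, not NPHS or CCHS estimates.

        # Toy Sullivan-method calculation of LE and HALE from age 20.
        # Each band: (width in years, annual mortality rate, mean HRQL weight in [0, 1]).
        bands = [
            (20, 0.001, 0.95),   # ages 20-39
            (20, 0.005, 0.90),   # ages 40-59
            (20, 0.020, 0.80),   # ages 60-79
            (20, 0.080, 0.65),   # ages 80-99
        ]

        alive = 1.0
        life_years = 0.0
        healthy_years = 0.0
        for width, rate, hrql in bands:
            deaths = alive * (1.0 - (1.0 - rate) ** width)
            # Survivors live the whole band; decedents are credited half the band.
            person_years = (alive - deaths) * width + deaths * width / 2.0
            life_years += person_years
            healthy_years += person_years * hrql   # Sullivan weighting by HRQL
            alive -= deaths

        print(f"LE at 20   ~ {life_years:.1f} years")
        print(f"HALE at 20 ~ {healthy_years:.1f} years")
        print(f"Share of remaining life in non-optimal health ~ {1 - healthy_years / life_years:.1%}")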

  16. The future of life expectancy and life expectancy inequalities in England and Wales: Bayesian spatiotemporal forecasting

    PubMed Central

    Bennett, James E; Li, Guangquan; Foreman, Kyle; Best, Nicky; Kontis, Vasilis; Pearson, Clare; Hambly, Peter; Ezzati, Majid

    2015-01-01

    Summary Background To plan for pensions and health and social services, future mortality and life expectancy need to be forecast. Consistent forecasts for all subnational units within a country are very rare. Our aim was to forecast mortality and life expectancy for England and Wales' districts. Methods We developed Bayesian spatiotemporal models for forecasting of age-specific mortality and life expectancy at a local, small-area level. The models included components that accounted for mortality in relation to age, birth cohort, time, and space. We used geocoded mortality and population data between 1981 and 2012 from the Office for National Statistics together with the model with the smallest error to forecast age-specific death rates and life expectancy to 2030 for 375 of England and Wales' 376 districts. We measured model performance by withholding recent data and comparing forecasts with this withheld data. Findings Life expectancy at birth in England and Wales was 79·5 years (95% credible interval 79·5–79·6) for men and 83·3 years (83·3–83·4) for women in 2012. District life expectancies ranged between 75·2 years (74·9–75·6) and 83·4 years (82·1–84·8) for men and between 80·2 years (79·8–80·5) and 87·3 years (86·0–88·8) for women. Between 1981 and 2012, life expectancy increased by 8·2 years for men and 6·0 years for women, closing the female–male gap from 6·0 to 3·8 years. National life expectancy in 2030 is expected to reach 85·7 (84·2–87·4) years for men and 87·6 (86·7–88·9) years for women, further reducing the female advantage to 1·9 years. Life expectancy will reach or surpass 81·4 years for men and reach or surpass 84·5 years for women in every district by 2030. Longevity inequality across districts, measured as the difference between the 1st and 99th percentiles of district life expectancies, has risen since 1981, and is forecast to rise steadily to 8·3 years (6·8–9·7) for men and 8·3 years (7·1–9·4) for women by 2030. Interpretation Present forecasts underestimate the expected rise in life expectancy, especially for men, and hence the need to provide improved health and social services and pensions for elderly people in England and Wales. Health and social policies are needed to curb widening life expectancy inequalities, help deprived districts catch up in longevity gains, and avoid a so-called grand divergence in health and longevity. Funding UK Medical Research Council and Public Health England. PMID:25935825

  17. Maximal CP and Bounds on the Neutron Electric Dipole Moment from P and CP Breaking

    E-print Network

    Ravi Kuchimanchi

    2012-08-09

    We find in theories with spontaneous P and CP violation that symmetries needed to set the tree level strong CP phase to zero can also set all non-zero tree level CP violating phases to the maximal value \\pi / 2 in the symmetry basis simultaneously explaining the smallness of \\bar{\\theta} and the largeness of the CKM CP violating phase. In these models we find the one loop lower bound \\bar{\\theta} > 10^{-11} relevant for early discovery of neutron edm d_n > 10^{-27} ecm. The lower bound relaxes to \\bar{\\theta} > 10^{-13} or d_n > 10^{-29} ecm for the case where the CP phases are non-maximal. Interestingly the spontaneous CP phase appears in the quark sector, not the Higgs sector, and is enabled by a heavy left-right symmetric vectorlike quark family with mass M. These results do not vanish in the decoupling limit of M_{H_2^+} > M \\rightarrow \\infty (where M_{H_2^+} is the mass of heavy Higgs at the parity breaking scale) and the age-old expectation that laws of nature (or its Lagrangian) are parity and matter-antimatter symmetric may be testable by the above predictions and EDM experiments, even if new physics occurs only at see-saw, GUT or Planck scales. There is also a region in parameter space with M_{H_2^+} < M where the above bounds are dampened by the factor (M_{H_2^+}/M)^2. By using flavour symmetries and texture arguments we also make predictions for the CKM phase that arises from the maximal phase on diagonalization to the physical basis. There are no axions predicted in this model.

  18. A Note on Locally Unextendible Non-Maximally Entangled Basis

    E-print Network

    Bin Chen; Halqem Nizamidin; Shao-Ming Fei

    2013-04-28

    We study the locally unextendible non-maximally entangled basis (LUNMEB) in $H^{d}\\bigotimes H^{d}$. We point out that there exists an error in the proof of the main result of LUNMEB [Quant. Inf. Comput. 12, 0271(2012)], which claims that there are at most $d$ orthogonal vectors in a LUNMEB, constructed from a given non-maximally entangled state. We show that both the proof and the main result are not correct in general. We present a counter example for $d=4$, in which five orthogonal vectors from a specific non-maximally entangled state are constructed. Besides, we completely solve the problem of LUNMEB for the case of $d=2$.

  19. Expected background in the LZ experiment

    NASA Astrophysics Data System (ADS)

    Kudryavtsev, Vitaly A.

    2015-08-01

    The LZ experiment, featuring a 7-tonne active liquid xenon target, is aimed at achieving unprecedented sensitivity to WIMPs with the background expected to be dominated by astrophysical neutrinos. To reach this goal, extensive simulations are carried out to accurately calculate the electron recoil and nuclear recoil rates in the detector. Both internal (from target material) and external (from detector components and surrounding environment) backgrounds are considered. A very efficient suppression of background rate is achieved with an outer liquid scintillator veto, liquid xenon skin and fiducialisation. Based on the current measurements of radioactivity of different materials, it is shown that LZ can reduce the total background for a WIMP search to about 2 events in 1000 live days for a 5.6 tonne fiducial mass.

  20. Preparing for TESS: What to Expect

    NASA Astrophysics Data System (ADS)

    Berta-Thompson, Zachory K.

    2015-08-01

    The Transiting Exoplanet Survey Satellite (TESS) will launch in 2017 as a NASA Explorer mission, and will discover hundreds of new small planets transiting nearby, bright stars. As has been the case with Kepler, understanding and limiting systematic noise sources will be key to squeezing the best photometric precision out of the TESS instrument. I will describe our efforts at MIT to minimize such systematics, speaking both generally and in regard to one very specific challenge: mitigating the scourge of cosmic rays passing through TESS's thick CCD detectors. I will present the current data collection strategy and its expected performance in light of these known error sources, and I will share detailed simulations of what the TESS survey data will be like. Harnessing the unique opportunity offered by this focus meeting, I hope to solicit feedback from the ExoStats community on what additional lessons from Kepler should be considered in advance of the launch of TESS.

  1. Expected results of Cassini Radio Science experiments

    NASA Astrophysics Data System (ADS)

    Castillo, J.; Rappaport, N.

    Cassini gravity radio science experiments scheduled from February 2005 to July 2008 are expected to improve our knowledge of the Saturnian system through direct measurements of gravity parameters performed in a multidisciplinary and comparative planetary science approach. In 2005, direct mass determination will be achieved for Enceladus, Hyperion, and Dione, as well as gravity field measurement of Rhea. Detection of an ocean suspected to lie within Titan is expected to happen by 2007. However, after the first two flybys scheduled in 2006, the determination of the dimensionless moment of inertia of this body will provide scientists with enough information to build detailed models to be compared with the Galilean satellites. Density determination of all major satellites will be performed through navigation passes scheduled throughout the tour. Accurate and independent determination of Saturn's high zonal harmonics up to degree 8 will provide crucial constraints on the interior of this giant planet by the end of the initial mission. Comparison of direct mass determination with values inferred from analytical theories is very important. In addition, density distribution sampling in the Saturnian system will provide new constraints on the models of evolution of Saturn's subnebula, as well as references for comparative planetology with the Jovian satellites. This is particularly timely as a mission toward Jupiter is being scheduled within the framework of NASA's New Frontiers program. Fresh geophysical observations of icy satellites and the finding or absence of a deep ocean within Titan will be crucial inputs for constraining numerical models of internal and external dynamics of this category of bodies. We will especially stress the synergy between the information provided by the Radio Science Subsystem and the other instruments onboard Cassini to leverage our understanding of the phenomena responsible for the dynamics and evolution of the icy satellites.

  2. Cardiovascular consequences of bed rest: effect on maximal oxygen uptake

    NASA Technical Reports Server (NTRS)

    Convertino, V. A.

    1997-01-01

    Maximal oxygen uptake (VO2max) is reduced in healthy individuals confined to bed rest, suggesting it is independent of any disease state. The magnitude of reduction in VO2max is dependent on duration of bed rest and the initial level of aerobic fitness (VO2max), but it appears to be independent of age or gender. Bed rest induces an elevated maximal heart rate which, in turn, is associated with decreased cardiac vagal tone, increased sympathetic catecholamine secretion, and greater cardiac beta-receptor sensitivity. Despite the elevation in heart rate, VO2max is reduced primarily from decreased maximal stroke volume and cardiac output. An elevated ejection fraction during exercise following bed rest suggests that the lower stroke volume is not caused by ventricular dysfunction but is primarily the result of decreased venous return associated with lower circulating blood volume, reduced central venous pressure, and higher venous compliance in the lower extremities. VO2max, stroke volume, and cardiac output are further compromised by exercise in the upright posture. The contribution of hypovolemia to reduced cardiac output during exercise following bed rest is supported by the close relationship between the relative magnitude (% delta) and time course of change in blood volume and VO2max during bed rest, and also by the fact that retention of plasma volume is associated with maintenance of VO2max after bed rest. Arteriovenous oxygen difference during maximal exercise is not altered by bed rest, suggesting that peripheral mechanisms may not contribute significantly to the decreased VO2max. However reduction in baseline and maximal muscle blood flow, red blood cell volume, and capillarization in working muscles represent peripheral mechanisms that may contribute to limited oxygen delivery and, subsequently, lowered VO2max. Thus, alterations in cardiac and vascular functions induced by prolonged confinement to bed rest contribute to diminution of maximal oxygen uptake and reserve capacity to perform physical work.

  3. Has the Brain Maximized its Information Storage Capacity?

    E-print Network

    Armen Stepanyants

    2003-07-11

    Learning and memory may rely on the ability of neuronal circuits to reorganize by dendritic spine remodeling. We have looked for geometrical parameters of cortical circuits, which maximize information storage capacity associated with this mechanism. In particular, we calculated optimal volume fractions of various neuropil components. The optimal axonal and dendritic volume fractions are not significantly different from anatomical measurements in the mouse and rat neocortex, and the rat hippocampus. This has led us to propose that the maximization of information storage capacity associated with dendritic spine remodeling may have been an important driving force in the evolution of the cortex.

  4. Anatomy of maximal stop mixing in the MSSM

    E-print Network

    Brümmer, Felix; Kulkarni, Suchita

    2012-01-01

    A Standard Model-like Higgs near 125 GeV in the MSSM requires multi-TeV stop masses, or a near-maximal contribution to its mass from stop mixing. We investigate the maximal mixing scenario, and in particular its prospects for being realized in potentially realistic GUT models. We work out constraints on the possible GUT-scale soft terms, which we compare with what can be obtained from some well-known mechanisms of SUSY breaking mediation. Finally, we analyze two promising scenarios in detail, namely gaugino mediation and gravity mediation with non-universal Higgs masses.

  5. Anatomy of maximal stop mixing in the MSSM

    E-print Network

    Felix Brümmer; Sabine Kraml; Suchita Kulkarni

    2012-05-03

    A Standard Model-like Higgs near 125 GeV in the MSSM requires multi-TeV stop masses, or a near-maximal contribution to its mass from stop mixing. We investigate the maximal mixing scenario, and in particular its prospects for being realized in potentially realistic GUT models. We work out constraints on the possible GUT-scale soft terms, which we compare with what can be obtained from some well-known mechanisms of SUSY breaking mediation. Finally, we analyze two promising scenarios in detail, namely gaugino mediation and gravity mediation with non-universal Higgs masses.

  6. Power-law cosmologies in minimal and maximal gauged supergravity

    E-print Network

    J. Blåbäck; A. Borghese; S. S. Haque

    2013-03-13

    In this paper we search for accelerating power-law solutions and ekpyrotic solutions within minimal and maximal four dimensional supergravity theories. We focus on the STU model for N=1 and on the new CSO(p,q,r) theories, which were recently obtained exploiting electromagnetic duality, for N=8. In the minimal case we find some new ekpyrotic solutions, while in the maximal case we find some new generic power-law solutions. We do not find any new accelerating solutions for these models.

  7. Projection of two biphoton qutrits onto a maximally entangled state.

    PubMed

    Halevy, A; Megidish, E; Shacham, T; Dovrat, L; Eisenberg, H S

    2011-04-01

    Bell state measurements, in which two quantum bits are projected onto a maximally entangled state, are an essential component of quantum information science. We propose and experimentally demonstrate the projection of two quantum systems with three states (qutrits) onto a generalized maximally entangled state. Each qutrit is represented by the polarization of a pair of indistinguishable photons-a biphoton. The projection is a joint measurement on both biphotons using standard linear optics elements. This demonstration enables the realization of quantum information protocols with qutrits, such as teleportation and entanglement swapping. PMID:21517363

  8. Sequential, solid-phase assay for biotin in physiologic fluids that correlates with expected biotin status

    SciTech Connect

    Mock, D.M.; DuBois, D.B.

    1986-03-01

    Interest in accurate measurement of biotin concentrations in plasma and urine has been stimulated by recent advances in the understanding of biotin-responsive inborn errors of metabolism and by several reports describing acquired biotin deficiency during parenteral alimentation. This paper presents a biotin assay utilizing radiolabeled avidin in a sequential, solid-phase method; the assay has increased sensitivity compared to previous methods (greater than or equal to 10 fmol/tube), correlates with expected trends in biotin concentrations in blood and urine in a rat model of biotin deficiency, and can utilize commercially available radiolabeled avidin.

  9. Cogeneration - A Utility Perspective 

    E-print Network

    Williams, M.

    1983-01-01

    Cogeneration has become an extremely popular subject when discussing conservation and energy saving techniques. One of the key factors which affect conservation is the utility viewpoint on PURPA and cogeneration rule making. These topics...

  10. PAM stack test utility

    Energy Science and Technology Software Center (ESTSC)

    2007-08-22

    The pamtest utility calls the normal PAM hooks using a service and username supplied on the command line. This allows an administrator to test any one of many configured PAM stacks as any existing user on the machine.

  11. The bond ionicity in A^NB^(8-N) compounds from maximally localized Wannier functions

    NASA Astrophysics Data System (ADS)

    Qteish, Abdallah

    2015-07-01

    The bond ionicity in seventy-two A^NB^(8-N) compounds is investigated according to the recently introduced first-principles ionicity scale, based on the centers of the maximally localized Wannier functions, which has several interesting features. The obtained bond ionicities (q_i) are found to exhibit the expected trends, according to electronegativity arguments. In particular, the bond ionicity in the alkaline-earth oxides increases by going from MgO to BaO. A strong crystal structure dependence of q_i is observed. A critical value of q_i (of 0.91) that separates the tetrahedrally and octahedrally coordinated systems is inferred directly from the calculated values of q_i. The volume dependence of q_i is investigated for all the considered compounds and found to decrease as the volume decreases for most of the studied systems. The adopted ionicity scale is established as a very strong competitor to the most widely accepted Phillips and Pauling ionicity measures.

  12. Utility requirements for fusion

    SciTech Connect

    Vondrasek, R.J.

    1982-02-01

    This report describes work done and results obtained during performance of Task 1 of a study of Utility Requirements and Criteria for Fusion Options. The work consisted of developing a list of utility requirements for fusion options, containing definitions of the requirements and showing their relative importance to the utility industry. The project team members developed a preliminary list which was refined by discussions and literature searches. The refined list was recast as a questionnaire which was sent to a substantial portion of the utility industry in this country. Forty-three questionnaire recipients responded, including thirty-two utilities. A workshop was held to develop a revised requirements list using the survey responses as a major input. The list prepared by the workshop was further refined by a panel consisting of vice presidents of the three project team firms. The results of the study indicate that in addition to considering the cost of energy for a power plant, utilities consider twenty-three other requirements. Four of the requirements were judged to be vital to plant acceptability: Plant Capital Cost, Financial Liability, Plant Safety and Licensability.

  13. Camera scheduling and energy allocation for lifetime maximization in user-centric visual sensor networks.

    PubMed

    Yu, Chao; Sharma, Gaurav

    2010-08-01

    We explore camera scheduling and energy allocation strategies for lifetime optimization in image sensor networks. For the application scenarios that we consider, visual coverage over a monitored region is obtained by deploying wireless, battery-powered image sensors. Each sensor camera provides coverage over a part of the monitored region and a central processor coordinates the sensors in order to gather required visual data. For the purpose of maximizing the network operational lifetime, we consider two problems in this setting: a) camera scheduling, i.e., the selection, among available possibilities, of a set of cameras providing the desired coverage at each time instance, and b) energy allocation, i.e., the distribution of total available energy between the camera sensor nodes. We model the network lifetime as a stochastic random variable that depends upon the coverage geometry for the sensors and the distribution of data requests over the monitored region, two key characteristics that distinguish our problem from other wireless sensor network applications. By suitably abstracting this model of network lifetime and utilizing asymptotic analysis, we propose lifetime-maximizing camera scheduling and energy allocation strategies. The effectiveness of the proposed camera scheduling and energy allocation strategies is validated by simulations. PMID:20350857
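
    As a rough illustration of the camera-scheduling subproblem (not the authors' stochastic-lifetime formulation), the sketch below uses an assumed greedy heuristic: serve each coverage request with cameras that still have the most remaining battery energy, so that energy drain is spread across nodes.

        # Greedy sketch: cover a requested set of cells while spreading battery drain.
        # Coverage sets, energies, and the per-activation cost are invented values.
        coverage = {                 # camera -> cells of the monitored region it can image
            "cam1": {1, 2, 3},
            "cam2": {3, 4},
            "cam3": {2, 4, 5},
        }
        energy = {"cam1": 10.0, "cam2": 6.0, "cam3": 8.0}
        cost_per_use = 1.0

        def schedule(request):
            """Pick cameras covering `request`, preferring the most charged nodes."""
            chosen, uncovered = [], set(request)
            while uncovered:
                candidates = [c for c in coverage
                              if coverage[c] & uncovered and energy[c] >= cost_per_use]
                if not candidates:
                    raise RuntimeError("request cannot be covered with the remaining energy")
                best = max(candidates, key=lambda c: energy[c])
                chosen.append(best)
                uncovered -= coverage[best]
                energy[best] -= cost_per_use
            return chosen

        print(schedule({2, 4}))   # ['cam1', 'cam3'] with the energies above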

  14. Modeling regulated water utility investment incentives

    NASA Astrophysics Data System (ADS)

    Padula, S.; Harou, J. J.

    2014-12-01

    This work attempts to model the infrastructure investment choices of privatized water utilities subject to rate of return and price cap regulation. The goal is to understand how regulation influences water companies' investment decisions such as their desire to engage in transfers with neighbouring companies. We formulate a profit maximization capacity expansion model that finds the schedule of new supply, demand management and transfer schemes that maintain the annual supply-demand balance and maximize a company's profit under the 2010-15 price control process in England. Regulatory incentives for cost savings are also represented in the model. These include the CIS scheme for capital expenditure (capex) and incentive allowance schemes for operating expenditure (opex). The profit-maximizing investment program (what to build, when and what size) is compared with the least cost program (social optimum). We apply this formulation to several water companies in South East England to model performance and sensitivity to water network particulars. Results show that if companies are able to outperform the regulatory assumption on the cost of capital, a capital bias can be generated, due to the fact that capital expenditure, unlike opex, can be remunerated through the companies' regulatory capital value (RCV). The occurrence of the 'capital bias' and its magnitude depend on the extent to which a company can finance its investments at a rate below the allowed cost of capital. The bias can be reduced by the regulatory penalties for underperformance on capital expenditure (the CIS scheme); sensitivity analysis can be applied by varying the CIS penalty to see how and to what extent this impacts the capital bias effect. We show how regulatory changes could potentially be devised to partially remove the 'capital bias' effect. Solutions potentially include allowing for incentives on total expenditure rather than separately for capex and opex, and allowing both opex and capex to be remunerated through a return on the company's regulatory capital value.
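
    The capital-bias mechanism described above can be illustrated with a deliberately crude comparison of two hypothetical schemes that close the same deficit. All numbers and the simplified profit rule below are assumptions for illustration, not the paper's regulatory model.

        # Toy illustration of capital bias: two schemes that each close the same deficit.
        schemes = {
            "reservoir (capex-heavy)":    {"capex": 100.0, "opex_per_year": 1.0},
            "bulk transfer (opex-heavy)": {"capex": 10.0,  "opex_per_year": 3.5},
        }
        allowed_return = 0.05   # return the regulator allows on the RCV
        actual_cost = 0.03      # rate at which the company can actually finance capex
        years = 30              # appraisal horizon

        for name, s in schemes.items():
            total_cost = s["capex"] + years * s["opex_per_year"]   # least-cost (social) view
            # Only capex enters the RCV, so only capex earns the allowed-minus-actual margin.
            financing_margin = years * s["capex"] * (allowed_return - actual_cost)
            print(f"{name:28s} total cost {total_cost:6.1f}   financing margin {financing_margin:5.1f}")

        # The transfer is cheaper overall (115 vs 130), yet the reservoir earns the larger
        # margin (60 vs 6), so a profit-maximizing company may prefer the capex-heavy option.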

  15. Maximizing the Value of Photovoltaic Installations on Schools in California: Choosing the Best Electricity Rates

    SciTech Connect

    Ong, S.; Denholm, P.

    2011-07-01

    Schools in California often have a choice between multiple electricity rate options. For schools with photovoltaic (PV) installations, choosing the right rate is essential to maximize the value of PV generation. The rate option that minimizes a school's electricity expenses often does not remain the most economical choice after the school installs a PV system. The complex interaction between PV generation, building load, and rate structure makes determining the best rate a challenging task. This report evaluates 22 rate structures across three of California's largest electric utilities--Pacific Gas and Electric Co. (PG&E), Southern California Edison (SCE), and San Diego Gas and Electric (SDG&E)--in order to identify common rate structure attributes that are favorable to PV installations.

  16. Operation of MRO's High Resolution Imaging Science Experiment (HiRISE): Maximizing Science Participation

    NASA Technical Reports Server (NTRS)

    Eliason, E.; Hansen, C. J.; McEwen, A.; Delamere, W. A.; Bridges, N.; Grant, J.; Gulich, V.; Herkenhoff, K.; Keszthelyi, L.; Kirk, R.

    2003-01-01

    Science return from the Mars Reconnaissance Orbiter (MRO) High Resolution Imaging Science Experiment (HiRISE) will be optimized by maximizing science participation in the experiment. MRO is expected to arrive at Mars in March 2006, and the primary science phase begins near the end of 2006 after aerobraking (6 months) and a transition phase. The primary science phase lasts for almost 2 Earth years, followed by a 2-year relay phase in which science observations by MRO are expected to continue. We expect to acquire approx. 10,000 images with HiRISE over the course of MRO's two earth-year mission. HiRISE can acquire images with a ground sampling dimension of as little as 30 cm (from a typical altitude of 300 km), in up to 3 colors, and many targets will be re-imaged for stereo. With such high spatial resolution, the percent coverage of Mars will be very limited in spite of the relatively high data rate of MRO (approx. 10x greater than MGS or Odyssey). We expect to cover approx. 1% of Mars at approx. 1m/pixel or better, approx. 0.1% at full resolution, and approx. 0.05% in color or in stereo. Therefore, the placement of each HiRISE image must be carefully considered in order to maximize the scientific return from MRO. We believe that every observation should be the result of a mini research project based on pre-existing datasets. During operations, we will need a large database of carefully researched 'suggested' observations to select from. The HiRISE team is dedicated to involving the broad Mars community in creating this database, to the fullest degree that is both practical and legal. The philosophy of the team and the design of the ground data system are geared to enabling community involvement. A key aspect of this is that image data will be made available to the planetary community for science analysis as quickly as possible to encourage feedback and new ideas for targets.

  17. Does Television Viewing Cultivate Unrealistic Expectations About Marriage?

    ERIC Educational Resources Information Center

    Segrin, Chris; Nabi, Robin L.

    2002-01-01

    Examines relationship between television viewing, holding idealistic expectations about marriage, and intentions to marry among undergraduate students. Finds overall television viewing has a negative association with idealistic marriage expectations; romantic genre programming was positively associated with high expectations; and expectations were…

  18. Expectations of Achievement: Student, Teacher and Parent Perceptions

    ERIC Educational Resources Information Center

    Rubie-Davies, Christine M.; Peterson, Elizabeth; Irving, Earl; Widdowson, Deborah; Dixon, Robyn

    2010-01-01

    Teachers' expectations of students have been extensively studied for forty years. However, students' self-expectations and the expectations of parents are less well understood. The aim of the study was to investigate the role of student, teacher and parent expectations in relation to student achievement from the perspective of each group. Focus…

  19. Predicting gambling problems from gambling outcome expectancies in college student-athletes.

    PubMed

    St-Pierre, Renée A; Temcheff, Caroline E; Gupta, Rina; Derevensky, Jeffrey; Paskus, Thomas S

    2014-03-01

    While previous research has suggested the potential importance of gambling outcome expectancies in determining gambling behaviour among adolescents, the predictive ability of gambling outcome expectancies has not yet been clearly delineated for college-aged youth. The current study aims to explore the relationships between gender and outcome expectancies in the prediction of gambling severity among college student-athletes. Data from the National Collegiate Athletic Association (NCAA) study assessing gambling behaviours and problems among U.S. college student-athletes were utilized. Complete data was available for 7,517 student-athletes. As expected, male college student-athletes reported more gambling participation as well as greater gambling problems than their female counterparts. Findings showed positive relationships between the outcome expectancies of financial gain, and negative emotional impacts and gambling problems. That is, those who endorsed more items on the outcome expectancy scales for financial gain and negative emotional impacts also tended to endorse more gambling-related problems. Findings also showed a negative relationship between outcome expectancies of fun and enjoyment, and gambling problems over and above the variance accounted for by gender. Those with gambling problems were less likely to have the expectation that gambling would be fun than those without gambling problems. Despite NCAA efforts to curb gambling activity, the results suggest that college student-athletes are at risk for over-involvement in gambling. Therefore, it is important to explore gambling outcome expectancies within this group since the motivations and reasons for gambling might be able to inform treatment initiatives. PMID:23307022

  20. Online Scheduling of Parallel Jobs on Hypercubes: Maximizing the Throughput

    E-print Network

    Sgall, Jiri

    We study the problem of scheduling unit-time parallel jobs on hypercubes. A parallel job has to be scheduled between its release time and deadline on a subcube of processors. The objective is to maximize the number of early jobs. We provide

  1. Emotional Control and Instructional Effectiveness: Maximizing a Timeout

    ERIC Educational Resources Information Center

    Andrews, Staci R.

    2015-01-01

    This article provides recommendations for best practices for basketball coaches to maximize the instructional effectiveness of a timeout during competition. Practical applications are derived from research findings linking emotional intelligence to effective coaching behaviors. Additionally, recommendations are based on the implications of the…

  2. An effective theory of metrics with maximal acceleration

    E-print Network

    Ricardo Gallego Torromé

    2015-10-15

    A geometric theory for spacetimes whose world lines associated with physical particles have an upper bound for the proper acceleration is developed. After some fundamental remarks on the requirements that the classical dynamics for point particles should hold, the notion of generalized metric and a theory of maximal proper acceleration are introduced. A perturbative approach to metrics of maximal proper acceleration is discussed and we show how it provides a consistent theory where the associated Lorentzian metric corresponds to the limit when the maximal proper acceleration goes to infinity. Then several of the physical and kinematical properties of the maximal acceleration metric are investigated, including a discussion of the rudiments of the causal theory and the introduction of the notions of radar distance and celerity function. We discuss the corresponding modification of the Einstein mass-energy relation when the associated Lorentzian geometry is flat. In such context it is also proved that the physical dispersion relation is relativistic. Two possible physical scenarios where the modified mass-energy relation could be confronted against experiment are briefly discussed.

  3. On Maximal Subalgebras and the Hypercentre of Lie Algebras.

    ERIC Educational Resources Information Center

    Honda, Masanobu

    1997-01-01

    Derives two sufficient conditions for a finitely generated Lie algebra to have the nilpotent hypercenter. Presents a relatively large class of generalized soluble Lie algebras. Proves that if a finitely generated Lie algebra has a nilpotent maximal subalgebra, the Fitting radical is nilpotent. (DDR)

  4. POLES OF MAXIMAL ORDER OF IGUSA ZETA JOHANNES NICAISE

    E-print Network

    Payne, Sam

    These are notes for the third in a series of lectures at the 2015 Simons Symposium on Tropical and Non-Archimedean Geometry … of the 2013 Symposium. Here we discuss an application to Igusa zeta functions, obtained in collaboration

  5. Maximizing Battery Life Routing in Wireless Ad Hoc Networks

    E-print Network

    Liang, Weifa

    Most wireless ad hoc networks consist of mobile devices which operate on batteries.

  6. EMSL Strategic Plan to Maximize Scientific Impact of

    E-print Network

    Section 3 outlines a specific, actionable 2-3-year plan for achieving the goals in Section 1. HRMAC will be the best, if not the only, capability in the world to characterize molecular

  7. Targeted Enrichment: Maximizing Orthologous Gene Comparisons across Deep Evolutionary Time

    E-print Network

    Hillis, David

    … a single species to test the limits of hybridization-based enrichment of hundreds of exons across frog species that diverged up to 250 million years ago. Enrichment success for a given species depends greatly

  8. EMSL Strategic Plan to Maximize Scientific Impact of

    E-print Network

    … of other DOE offices, and the greater scientific community. In order to jump-start interest within the scientific community (particularly in the BER community) and accelerate the impact of this capability …

  9. Maximal tree size of few-qubit states

    E-print Network

    Huy Nguyen Le; Yu Cai; Xingyao Wu; Rafael Rabelo; Valerio Scarani

    2014-07-10

    Tree size ($\\rm{TS}$) is an interesting measure of complexity for multiqubit states: not only is it in principle computable, but one can obtain lower bounds for it. In this way, it has been possible to identify families of states whose complexity scales superpolynomially in the number of qubits. With the goal of progressing in the systematic study of the mathematical property of $\\rm{TS}$, in this work we characterize the tree size of pure states for the case where the number of qubits is small, namely, 3 or 4. The study of three qubits does not hold great surprises, insofar as the structure of entanglement is rather simple; the maximal $\\rm{TS}$ is found to be 8, reached for instance by the $|\\rm{W}\\rangle$ state. The study of four qubits yields several insights: in particular, the most economic description of a state is found not to be recursive. The maximal $\\rm{TS}$ is found to be 16, reached for instance by a state called $|\\Psi^{(4)}\\rangle$ which was already discussed in the context of four-photon down-conversion experiments. We also find that the states with maximal tree size form a set of zero measure: a smoothed version of tree size over a neighborhood of a state ($\\epsilon-\\rm{TS}$) reduces the maximal values to 6 and 14, respectively. Finally, we introduce a notion of tree size for mixed states and discuss it for a one-parameter family of states.

  10. Statistical 3D Cranio-Facial Models Maxime Berar

    E-print Network

    Payan, Yohan

    In forensic science, 3D cranio-facial reconstruction is used … or occulted data problem. Results are visually correct.

  11. The University of Reading 1 Finding Maximal Cliques Using

    E-print Network

    Mitchell, Richard

    Slide fragments: RJM algorithm first attempt: 2.62 secs; Bron-Kerbosch in MATLAB similar; careful coding in MATLAB (use built-in matrix functions, less code to interpret; work on columns, not rows).

  12. Maximizing Efficiency of Solar-Powered Systems by Load Matching

    E-print Network

    Shinozuka, Masanobu

    Solar power is an important source of renewable energy for many low-power systems. … total energy output under a given solar profile by load matching. The power efficiency was validated

  13. Optimal technique for maximal forward rotating vaults in men's gymnastics.

    PubMed

    Hiley, Michael J; Jackson, Monique I; Yeadon, Maurice R

    2015-08-01

    In vaulting a gymnast must generate sufficient linear and angular momentum during the approach and table contact to complete the rotational requirements in the post-flight phase. This study investigated the optimization of table touchdown conditions and table contact technique for the maximization of rotation potential for forwards rotating vaults. A planar seven-segment torque-driven computer simulation model of the contact phase in vaulting was evaluated by varying joint torque activation time histories to match three performances of a handspring double somersault vault by an elite gymnast. The closest matching simulation was used as a starting point to maximize post-flight rotation potential (the product of angular momentum and flight time) for a forwards rotating vault. It was found that the maximized rotation potential was sufficient to produce a handspring double piked somersault vault. The corresponding optimal touchdown configuration exhibited hip flexion in contrast to the hyperextended configuration required for maximal height. Increasing touchdown velocity and angular momentum lead to additional post-flight rotation potential. By increasing the horizontal velocity at table touchdown, within limits obtained from recorded performances, the handspring double somersault tucked with one and a half twists, and the handspring triple somersault tucked became theoretically possible. PMID:26026290
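
    A small worked example of the "rotation potential" quantity used above (angular momentum multiplied by flight time). The numbers and the conversion to somersaults are illustrative assumptions, not values reported in the study.

        import math

        # Rotation potential = angular momentum at table take-off x flight time.
        angular_momentum = 100.0    # kg m^2 / s about the mass centre (assumed)
        flight_time = 1.0           # s (assumed)
        moment_of_inertia = 8.0     # kg m^2 in the piked shape (assumed)

        rotation_potential = angular_momentum * flight_time                     # kg m^2
        somersaults = rotation_potential / (2 * math.pi * moment_of_inertia)    # revolutions
        print(f"rotation potential ~ {rotation_potential:.0f} kg m^2")
        print(f"~ {somersaults:.2f} somersaults available in the piked shape")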

  14. Octonionization of three player, two strategy maximally entangled quantum games

    E-print Network

    Aden Ahmed; Steve Bleiler; Faisal Shah Khan

    2012-02-14

    We develop an octonionic representation of the payoff function for three player, two strategy, maximally entangled quantum games in order to obtain a computationally friendly version of this function. This computational capability is then exploited to analyze and potentially classify the Nash equilibria in the quantum games.

  15. Maximally Permissive Composition of Actors in Marten Lohstroh

    E-print Network

  16. Fertilizer placement to maximize nitrogen use by fescue

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The method of fertilizer nitrogen(N) application can affect N uptake in tall fescue and therefore its yield and quality. Subsurface-banding (knife) of fertilizer maximizes fescue N uptake in the poorly-drained clay–pan soils of southeastern Kansas. This study was conducted to determine if knifed N r...

  17. Maximizing Semantic Relatedness to Perform Word Sense Disambiguation

    E-print Network

    Pedersen, Ted

    This article presents a method of word sense disambiguation that assigns a target word the sense that is most related to the senses of its neighboring words. We explore the use

  18. CMOS Monolithic Voltage Converter ________________________________________________________________ Maxim Integrated Products 1

    E-print Network

    Berns, Hans-Gerd

    The MAX660 monolithic, charge-pump voltage inverter converts a +1.5V to +5.5V input to a corresponding -1.5V to -5.5V output.

  19. Throughput Maximization for Online Request Admissions in Mobile Cloudlets

    E-print Network

    Liang, Weifa

    In mobile cloud computing (MCC) …

  20. Density-metric unimodular gravity: Vacuum maximal symmetry

    SciTech Connect

    Abbassi, A.H.; Abbassi, A.M.

    2011-05-15

    We have investigated the vacuum maximally symmetric solutions of the recently proposed density-metric unimodular gravity theory. The results are widely different from the inflationary scenario. The exponential dependence on time in de Sitter space is replaced by a power law. Open space-times with a non-zero cosmological constant are excluded.

  1. Maximal Independent Sets in Radio Networks Thomas Moscibroda

    E-print Network

    We study the distributed complexity of computing a maximal independent set (MIS) in radio networks. … in the radio network model (as opposed to, say, message passing models). This is surprising when considering

  2. Maximizing the Benefits of an Administrative Internship: Some Practical Advice.

    ERIC Educational Resources Information Center

    Oldfield, Kenneth; Ayers, Nancy

    Recommendations to help student interns in administrative positions maximize their educational opportunities vis-a-vis the "real world" and to also help them avoid certain placement-associated problems. The suggestions may be helpful to both new and established internship directors as well. Attention is focused on governmental administrative…

  3. An effective theory of metrics with maximal proper acceleration

    NASA Astrophysics Data System (ADS)

    Gallego Torromé, Ricardo

    2015-12-01

    A geometric theory for spacetimes whose world lines associated with physical particles have an upper bound for the proper acceleration is developed. After some fundamental remarks on the requirements that the classical dynamics for point particles should hold, the notion of a generalized metric and a theory of maximal proper acceleration are introduced. A perturbative approach to metrics of maximal proper acceleration is discussed and we show how it provides a consistent theory where the associated Lorentzian metric corresponds to the limit when the maximal proper acceleration goes to infinity. Then several of the physical and kinematical properties of the maximal acceleration metric are investigated, including a discussion of the rudiments of the causal theory and the introduction of the notions of radar distance and celerity function. We discuss the corresponding modification of the Einstein mass-energy relation when the associated Lorentzian geometry is flat. In such a context it is also proved that the physical dispersion relation is relativistic. Two possible physical scenarios where the modified mass-energy relation could be confronted against the experiment are briefly discussed.

  4. Maximizing Influence of Viral Marketing via Evolutionary User Selection

    E-print Network

    Yu, Qi

    in the network for efficient marketing. Nonetheless, most existing viral marketing techniques ignore the dynamic dynamics of the network to select an optimal subset of users that maximize the marketing influence over the network. I. INTRODUCTION The selection of users in viral marketing is based on the concept of network

  5. The Profit-Maximizing Firm: Old Wine in New Bottles.

    ERIC Educational Resources Information Center

    Felder, Joseph

    1990-01-01

    Explains and illustrates a simplified use of graphical analysis for analyzing the profit-maximizing firm. Believes that graphical analysis helps college students gain a deeper understanding of marginalism and an increased ability to formulate economic problems in marginalist terms. (DB)

  6. Constructive Notions of Maximality for Ideals Douglas S. Bridges

    E-print Network

    Constructive Notions of Maximality for Ideals Douglas S. Bridges (Department of Mathematics the constructive study of rings and ideals begun in [Bridges 2001], by examining two constructively distinct and Statistics, University of Canterbury, Christchurch, New Zealand d.bridges@math.canterbury.ac.nz) Robin S

  7. Modifying Softball for Maximizing Learning Outcomes in Physical Education

    ERIC Educational Resources Information Center

    Brian, Ali; Ward, Phillip; Goodway, Jacqueline D.; Sutherland, Sue

    2014-01-01

    Softball is taught in many physical education programs throughout the United States. This article describes modifications that maximize learning outcomes and that address the National Standards and safety recommendations. The modifications focus on tasks and equipment, developmentally appropriate motor-skill acquisition, increasing number of…

  8. Maximizing Thermal Efficiency and Optimizing Energy Management (Fact Sheet)

    SciTech Connect

    Not Available

    2012-03-01

    Researchers at the Thermal Test Facility (TTF) on the campus of the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) in Golden, Colorado, are addressing maximizing thermal efficiency and optimizing energy management through analysis of efficient heating, ventilating, and air conditioning (HVAC) strategies, automated home energy management (AHEM), and energy storage systems.

  9. CHP Capacity Optimizer Identifies "Best Fit" to Maximize Cost Savings

    E-print Network

    Pennycook, Steve

    CHP Capacity Optimizer Identifies "Best Fit" to Maximize Cost Savings Spreadsheet Tool Determines, and power (CHP) applications can save both money and natural resources when system components are properly by the CHP system must be balanced with that of the conventional electricity grid and on-site boiler. The CHP

  10. Online Data Gathering for Maximizing Network Lifetime in Sensor Networks

    E-print Network

    Liang, Weifa

    sensors have significant power constraints (battery life), energy efficient methods must be employed transmission range. The main constraint of sensor nodes, however, is their low finite battery energies, which energy con- sumption but also to maximize the lifetime of each node in the network because a node failure

  11. Maximizing the heat flux in steady unicellular porous media convection

    E-print Network

    Lebovitz, Norman

    Convection in a horizontal porous layer heated from below is relevant to a variety … is to determine the maximum heat transport attainable in steady 2D unicellular porous media convection.

  12. Ground Truth Estimation by Maximizing Topological Agreements in Electron Microscopy

    E-print Network

    Choe, Yoonsuck

    … are not suited for electron microscopy (EM) images because they typically do not take into account topological results. Electron microscopy (EM) image segmentation is the first step toward the recon…

  13. Increasing life expectancy of water resources literature

    NASA Astrophysics Data System (ADS)

    Heistermann, M.; Francke, T.; Georgi, C.; Bronstert, A.

    2014-06-01

    In a study from 2008, Larivière and colleagues showed, for the field of natural sciences and engineering, that the median age of cited references is increasing over time. This result was considered counterintuitive: with the advent of electronic search engines, online journal issues and open access publications, one could have expected that cited literature is becoming younger. That study has motivated us to take a closer look at the changes in the age distribution of references that have been cited in water resources journals since 1965. Not only could we confirm the findings of Larivière and colleagues. We were also able to show that the aging is mainly happening in the oldest 10-25% of an average reference list. This is consistent with our analysis of top-cited papers in the field of water resources. Rankings based on total citations since 1965 consistently show the dominance of old literature, including text books and research papers in equal shares. For most top-cited old-timers, citations are still growing exponentially. There is strong evidence that most citations are attracted by publications that introduced methods which meanwhile belong to the standard toolset of researchers and practitioners in the field of water resources. Although we think that this trend should not be overinterpreted as a sign of stagnancy, there might be cause for concern regarding how authors select their references. We question the increasing citation of textbook knowledge as it holds the risk that reference lists become overcrowded, and that the readability of papers deteriorates.
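
    For concreteness, the aging measure discussed above (the median age of a paper's cited references) can be computed as in the sketch below; the publication year and reference years are invented.

        # Median age of the references cited by a single paper.
        pub_year = 2014
        reference_years = [1967, 1986, 1999, 2005, 2008, 2011, 2013]   # invented

        ages = sorted(pub_year - y for y in reference_years)
        n = len(ages)
        median_age = ages[n // 2] if n % 2 else (ages[n // 2 - 1] + ages[n // 2]) / 2
        print(median_age)   # 9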

  14. Rising Expectations: Access to Biomedical Information

    PubMed Central

    Lindberg, D. A. B.; Humphreys, B. L.

    2008-01-01

    Summary. Objective: To provide an overview of the expansion in public access to electronic biomedical information over the past two decades, with an emphasis on developments to which the U.S. National Library of Medicine contributed. Methods: Review of the increasingly broad spectrum of web-accessible genomic data, biomedical literature, consumer health information, clinical trials data, and images. Results: The amount of publicly available electronic biomedical information has increased dramatically over the past twenty years. Rising expectations regarding access to biomedical information were stimulated by the spread of the Internet, the World Wide Web, and advanced searching and linking techniques. These informatics advances simplified and improved access to electronic information and reduced costs, which enabled inter-organizational collaborations to build and maintain large international information resources and also aided outreach and education efforts. The demonstrated benefits of free access to electronic biomedical information encouraged the development of public policies that further increase the amount of information available. Conclusions: Continuing rapid growth of publicly accessible electronic biomedical information presents tremendous opportunities and challenges, including the need to ensure uninterrupted access during disasters or emergencies and to manage digital resources so they remain available for future generations. PMID:18587496

  15. When sentences live up to your expectations.

    PubMed

    Tuennerhoff, Johannes; Noppeney, Uta

    2016-01-01

    Speech recognition is rapid, automatic and amazingly robust. How the brain is able to decode speech from noisy acoustic inputs is unknown. We show that the brain recognizes speech by integrating bottom-up acoustic signals with top-down predictions. Subjects listened to intelligible normal and unintelligible fine structure speech that lacked the predictability of the temporal envelope and did not enable access to higher linguistic representations. Their top-down predictions were manipulated using priming. Activation for unintelligible fine structure speech was confined to primary auditory cortices, but propagated into posterior middle temporal areas when fine structure speech was made intelligible by top-down predictions. By contrast, normal speech engaged posterior middle temporal areas irrespective of subjects' predictions. Critically, when speech violated subjects' expectations, activation increases in anterior temporal gyri/sulci signalled a prediction error and the need for new semantic integration. In line with predictive coding, our findings compellingly demonstrate that top-down predictions determine whether and how the brain translates bottom-up acoustic inputs into intelligible speech. PMID:26363344

  16. The expected metric principle for probabilistic information retrieval

    E-print Network

    Chen, Harr

    2007-01-01

    Traditionally, information retrieval systems aim to maximize the number of relevant documents returned to a user within some window of the top. For that goal, the Probability Ranking Principle, which ranks documents in ...

  17. The Online Expectations of College-Bound Juniors and Seniors. E-Expectations Report, 2012

    ERIC Educational Resources Information Center

    Noel-Levitz, Inc, 2012

    2012-01-01

    Noel-Levitz, OmniUpdate, CollegeWeekLive, and NRCCUA[R] (National Research Center for College & University Admissions) conducted a survey of 2,000 college-bound juniors and seniors about their expectations for college Web sites, mobile usage, e-mail, and social media. Among the findings: (1) More than 50 percent of students said the Web played a…

  18. Expectations Lead to Performance: The Transformative Power of High Expectations in Preschool

    ERIC Educational Resources Information Center

    Wang, Ye; Engler, Karen S.; Oetting, Tara L.

    2014-01-01

    This article describes the preschool program at Missouri State University where deaf and hard of hearing children with all communication modalities and all styles of personal assistive listening devices are served. The job of the early intervention providers is to model for parents what high expectations look like and how to translate those…

  19. It Is Not What You Expect: Dissociating Conflict Adaptation from Expectancies in a Stroop Task

    ERIC Educational Resources Information Center

    Jimenez, Luis; Mendez, Amavia

    2013-01-01

    In conflict tasks, congruency effects are modulated by the sequence of preceding trials. This modulation effect has been interpreted as an influence of a proactive mechanism of adaptation to conflict (Botvinick, Nystrom, Fissell, Carter, & Cohen, 1999), but the possible contribution of explicit expectancies to this adaptation effect remains…

  20. What To Expect When You're Expected To Teach: The Anxious Craft of Teaching Composition.

    ERIC Educational Resources Information Center

    Bramblett, Anne, Ed.; Knoblauch, Alison, Ed.

    This collection of essays addresses the anxieties and problems of beginning writing teachers and provides a reality check for those who expect success from "day one." Following an Introduction: "Silences in Our Teaching Stories; What Do We Leave Out and Why?" (Thomas Newkirk), essays in the collection are: (1) "Forty-Eight Eyeballs" (Carrie…

  1. Confidence intervals for expected moments algorithm flood quantile estimates

    USGS Publications Warehouse

    Cohn, T.A.; Lane, W.L.; Stedinger, J.R.

    2001-01-01

    Historical and paleoflood information can substantially improve flood frequency estimates if appropriate statistical procedures are properly applied. However, the Federal guidelines for flood frequency analysis, set forth in Bulletin 17B, rely on an inefficient "weighting" procedure that fails to take advantage of historical and paleoflood information. This has led researchers to propose several more efficient alternatives including the Expected Moments Algorithm (EMA), which is attractive because it retains Bulletin 17B's statistical structure (method of moments with the Log Pearson Type 3 distribution) and thus can be easily integrated into flood analyses employing the rest of the Bulletin 17B approach. The practical utility of EMA, however, has been limited because no closed-form method has been available for quantifying the uncertainty of EMA-based flood quantile estimates. This paper addresses that concern by providing analytical expressions for the asymptotic variance of EMA flood-quantile estimators and confidence intervals for flood quantile estimates. Monte Carlo simulations demonstrate the properties of such confidence intervals for sites where a 25- to 100-year streamgage record is augmented by 50 to 150 years of historical information. The experiments show that the confidence intervals, though not exact, should be acceptable for most purposes.
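
    A rough sketch of the flavor of calculation involved, assuming only a systematic gage record: a Log Pearson Type 3 fit by ordinary method of moments in log space, with a simple percentile-bootstrap interval standing in for the paper's analytical asymptotic variance. This is not the Expected Moments Algorithm itself, and every number below is simulated.

      # Hedged illustration: Bulletin-17B-style LP3 quantile with a bootstrap CI.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      peaks = rng.lognormal(mean=6.0, sigma=0.5, size=60)      # fake annual peak flows

      def lp3_quantile(sample, T=100):
          x = np.log10(sample)                                 # fit in log10 space
          mu, sigma = x.mean(), x.std(ddof=1)
          g = stats.skew(x, bias=False)
          return 10.0 ** stats.pearson3.ppf(1.0 - 1.0 / T, g, loc=mu, scale=sigma)

      q100 = lp3_quantile(peaks)
      boot = [lp3_quantile(rng.choice(peaks, size=peaks.size, replace=True))
              for _ in range(2000)]
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"100-yr quantile ~ {q100:.0f}, 95% bootstrap CI ({lo:.0f}, {hi:.0f})")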

  2. Surface expressions of mantle upwellings: Expect the unexpected

    NASA Astrophysics Data System (ADS)

    Druken, K. A.; Kincaid, C. R.; Griffiths, R. W.

    2011-12-01

    Surface expressions of deep mantle upwellings are oftentimes complex and differ from highly simplified schematic models. We present results from a series of 3-D laboratory experiments that examine patterns of strain alignment within mantle upwellings under a variety of tectonic settings (e.g. mid-plate, spreading centers, subduction zones) and compare these to observations from seismic anisotropy studies. Laboratory experiments utilize a glucose working fluid with a temperature-dependent density and viscosity. In the strain alignment cases, ~5 mm long synthetic paintbrush hairs, or "whiskers", are embedded within the fluid and used as passive markers for the local orientation of maximum finite strain. Contrary to the common expectation, results show that finite strain aligns tangent, not parallel, to the radial flow within plume heads. In cases where a plume rises under a moving plate or spreading center, strain markers are initially tangential and then evolve towards alignment with the shear flow. Within subduction settings, upwellings are so severely distorted by slab-driven flow that they appear seismically invisible in terms of anisotropy patterns.

  3. Utility terrestrial biodiversity issues

    SciTech Connect

    Breece, G.A.; Ward, B.J.

    1996-11-01

    Results from a survey of power utility biologists indicate that terrestrial biodiversity is considered a major issue by only a few utilities; however, a majority believe it may be a future issue. Over half of the respondents indicated that their company is involved in some management for biodiversity, and nearly all feel that it should be a goal for resource management. Only a few utilities are funding biodiversity research, but a majority felt more research was needed. Generally, larger utilities with extensive land holdings had greater opportunities and resources for biodiversity management. Biodiversity will most likely be a concern with transmission rights-of-way construction and maintenance, endangered species issues and general land resource management, including mining reclamation and hydro relicensing commitments. Over half of the companies surveyed have established voluntary partnerships with management groups, and biodiversity is a goal in nearly all the joint projects. Endangered species management and protection, prevention of forest fragmentation, wetland protection, and habitat creation and protection are the most common partnerships involving utility companies. Common management practices and unique approaches are presented, along with details of the survey. 4 refs.

  4. Utility View of Risk Assessment 

    E-print Network

    Bickham, J.

    1985-01-01

    This paper will address a utility perspective in regard to risk assessment, reliability, and impact on the utility system. Discussions will also include the critical issues for utilities when contracting for energy and ...

  5. Arctic Climate Change: Where Reality Exceeds Expectations

    NASA Astrophysics Data System (ADS)

    Serreze, M. C.

    2007-12-01

    It was probably around the year 2000 when I had an epiphany. A realization, after years of sitting on the fence, that the changes unfolding in the Arctic were too persistent, and too coherent among different parts of the system, to be simply dismissed as natural climate fluctuations. Seven years have passed, and despite imprints of natural variability, the Arctic has continued along a warming path. The emerging surprise is the rapidity of change. In many ways, it seems that reality has exceeded expectations, and that our vision of the Arctic's future is already upon us. The most visually striking evidence of rapid change is the Arctic's shrinking sea ice cover. While climate models tell us that sea ice extent should already be declining in response to greenhouse gas loading, observed trends are much steeper - we are perhaps 30 years ahead of schedule. Climate models also tell us that largely as a result of sea ice loss, Arctic warming will be outsized compared to the rest of the northern hemisphere. However, this so-called Arctic Amplification is already here. The signal appears to be firm, and growing in strength. In turn, the Greenland ice sheet seems to be stirring in ways quite unexpected ten years ago, with disturbing implications for sea level rise. Why is the Arctic changing so rapidly? What are the missing pieces of the puzzle? Given where we stand today, might we realize a seasonally ice free Arctic Ocean as soon as 30 years from now? This Nye lecture will attempt to shed some light on these issues.

  6. Water in stars: expected and unexpected

    NASA Astrophysics Data System (ADS)

    Tsuji, T.; Aoki, W.; Ohnaka, K.

    1999-03-01

    We have confirmed the presence of water in the early M giant α Cet (M1.5III) and the supergiant KK Per (M2Iab) with the highest-resolution grating mode of SWS, but this result is quite unexpected from present model atmospheres. In late M giant and supergiant stars, the observed water originates partly in the photosphere, as expected from the model atmospheres, but ISO SWS has revealed that the 2.7 μm absorption bands appear somewhat stronger than predicted while the 6.5 μm bands appear weaker, indicating contamination by an emission component. In the mid-infrared region extending to 45 μm, pure rotation lines of H2O appear as distinct emission in the high-resolution SWS spectra of 30g Her (M7III) and S Per (M4-7Ia), along with dust emission at 10, 13 and 20 μm and a new unidentified feature at 30 μm. Thus, together with the dust, water contributes to the thermal balance of the outer atmosphere already in the mid-infrared. The excitation temperature of the H2O gas is estimated to be 500 - 1000 K. In view of this result for late M (super)giants, the unexpected water observed in early M (super)giants should also be non-photospheric in origin. Thus, ISO has finally established the presence of a new component of the outer atmosphere - a warm molecular envelope - in red giant and supergiant stars from early to late types. Such a rather warm molecular envelope will be a site of various activities such as chemical reactions, dust formation, mass outflow, etc.

  7. Deregulation of electric utilities

    SciTech Connect

    Zaccour, G.

    1998-07-01

    This volume is a collection of fourteen, mainly applied, economic papers examining electric utility deregulation in many parts of the world. These papers were presented at the International Workshop on Deregulation of Electric Utilities held in Montreal, Canada in September 1997. As the title suggests, these papers cover a broad range of topics. Despite the book's scattershot approach, a small subset of contributors asks a fundamental question: Is the industry sufficiently deregulated? This book succeeds in providing some concrete and well-analyzed examples that examine this important question.

  8. On the expected properties of exomoons

    NASA Astrophysics Data System (ADS)

    Canup, Robin

    2015-12-01

    The potential discovery of exomoons is important, as they could provide constraints on their host planets’ formation, and large exomoons may represent potentially habitable environments. Detection of exomoons is extremely challenging. However, upper limits on exomoon masses have now been determined for a few dozen planets (Kipping et al. 2015), and additional constraints and/or detections are anticipated in the next several years.In our solar system, regular satellites are thought to have originated by two main processes: giant impacts and co-accretion. The origin of moons by collisions into solid planets is reasonably well-understood. Depending primarily on the impact angle and the mass of the impactor compared to the target, collisions can produce a broad range of satellite-to-planet mass ratios, Msat/Mp, ranging from tiny moons to relatively massive satellites such as the Moon (Msat/Mp = 0.01; e.g., Canup 2004) and Pluto’s Charon (Msat/Mp = 0.12; e.g., Canup 2005). In contrast, the satellite systems of the gas planets in our solar system all have Msat/Mp ~10^{-4}. This similarity is striking given what were presumably different accretion histories for each of these planets. It has been shown that a common satellite system mass ratio results when satellites co-accrete within disks produced by gas and solids inflowing to a planet, with the predicted value of (Msat/Mp) depending rather weakly on the ratio of the disk’s gas viscosity parameter to the gas-to-solids ratio in the inflow (Canup & Ward 2006).The transition between these two modes of origin is unclear, but could reasonably occur once a planet grows large enough to accrete substantial gas through a circumplanetary disk (e.g., Mp ~ 5 to 10 Earth masses; e.g., Machida et al. 2008; 2010). Alternative satellite-forming mechanisms are also possible, e.g., intact capture. However to date, exomoon upper limits appear consistent with expectations based on formation by impact or co-accretion. If exomoons form primarily by these two processes, the most likely hosts of a Mars-sized exomoon would be predominantly solid planets of several Earth masses, or gas giants substantially more massive than Jupiter.

  9. A 'figure-of-merit' approach to extraterrestrial resource utilization

    NASA Technical Reports Server (NTRS)

    Ramohalli, Kumar; Kirsch, Thomas

    1989-01-01

    An approach for interrelated optimizations in space missions that utilize extraterrestrial resources is developed, based on the concept of an overall mission 'figure-of-merit', which leads to more realistic designs than individual performance maximizations. After a brief discussion of this concept, the synthesis of four major components of any future space mission is considered. The four major components are: orbital mechanics of the transportation; performance of the rocket motors; support systems including power, thermal and process controls, and instruments; and in situ resource utilization plant equipment. The Mars Sample Return mission is used to illustrate the proposed concept. A popular spreadsheet is used to quantitatively demonstrate the interdependent nature of the mission optimization. Future prospects that promise great economy through extraterrestrial resource utilization are outlined, along with a quick evaluation technique.
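
    A toy sketch of what a single mission-level figure of merit looks like in practice, as opposed to maximizing each subsystem separately; the candidate designs, metrics, and weights below are invented for illustration and are not from the paper:

      # Hedged illustration: rank candidate mission designs by one combined
      # figure of merit instead of optimizing each subsystem in isolation.
      candidates = {
          "all-propellant-from-Earth": dict(delta_v_margin=0.9, isru_mass_saved=0.0,
                                            power_margin=0.8, plant_mass_penalty=0.0),
          "partial-ISRU":              dict(delta_v_margin=0.7, isru_mass_saved=0.5,
                                            power_margin=0.6, plant_mass_penalty=0.3),
          "full-ISRU":                 dict(delta_v_margin=0.6, isru_mass_saved=0.9,
                                            power_margin=0.4, plant_mass_penalty=0.6),
      }
      weights = dict(delta_v_margin=0.3, isru_mass_saved=0.4,
                     power_margin=0.2, plant_mass_penalty=-0.3)   # penalties enter negatively

      def figure_of_merit(metrics):
          return sum(weights[k] * v for k, v in metrics.items())

      for name, m in sorted(candidates.items(), key=lambda kv: -figure_of_merit(kv[1])):
          print(f"{name:28s} FOM = {figure_of_merit(m):.2f}")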

  10. Complete utilization of spent coffee to biodiesel, bio-oil and biochar

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Energy production from renewable or waste biomass/material is a more attractive alternative compared to conventional feedstocks, such as corn and soybean. The objective of this study is to maximize utilization of any waste organic carbon material to produce renewable energy. This study presents tota...

  11. The last recession was good for life expectancy.

    PubMed

    Kristjuhan, Ulo; Taidre, Erika

    2012-04-01

    Most people think that economic growth and a good economy are prerequisites for good health and high life expectancy. As such, a recession should decrease life expectancy or stop it from rising. In fact, recessions can boost life expectancy. This was the case during the Great Depression in the United States from 1929 to 1932 and during the recession in the European Union in 2009. In 2009, life expectancy increased most rapidly in European countries where the decrease in gross domestic product was greatest: Estonia, Latvia, and Lithuania. Studies of life expectancy increasing during recessions can yield valuable information regarding extending average life expectancy without essential costs. PMID:22533416

  12. Relaunching a national social marketing campaign: expectations and challenges for the "new" ParticipACTION.

    PubMed

    Faulkner, Guy; McCloy, Cora; Plotnikoff, Ronald C; Tremblay, Mark S

    2011-07-01

    ParticipACTION is a Canadian physical activity communications and social marketing organization that was relaunched in 2007 after a 6-year hiatus. The purpose of this study is to qualitatively identify and describe the expectations and challenges the relaunch of the new ParticipACTION may present for existing physical activity organizations. Using a purposeful sampling strategy, the authors conduct semistructured telephone interviews with 49 key informants representing a range of national, provincial, and local organizations with a mandate to promote physical activity. Overall, there is strong support for the relaunch of ParticipACTION. However, organizational expectations and/or their ideal vision for it are mixed. Organizations envision and support ParticipACTION performing an overarching social marketing and advocacy role and providing tools and resources that supplement existing organizational activities. Four major organizational challenges are identified, concerning overlapping mandates, partnership and leadership concerns, competition for funding, and capacity concerns. Social marketing initiatives, such as ParticipACTION, may not be able to maximize their impact unless they address the expectations and concerns of competing organizations with a mandate to promote physical activity. PMID:19861703

  13. Module utilization committee

    NASA Technical Reports Server (NTRS)

    Volkmer, K.; Praver, G.

    1984-01-01

    Photovoltaic collector modules were declared surplus to the needs of the U.S. Dept. of Energy. The Module Utilization Committee was formed to make appropriate disposition of the surplus modules on a national basis and to act as a broker for requests for these modules originating outside of the National Photovoltaics Program.

  14. Utility Cost Analysis 

    E-print Network

    Horn, S.

    1984-01-01

    In addition to these air systems, 5 Liebert chillers supply 76 tons of cooling water directly to the CPUs. Three DX systems with economizer cycles provide a total of 96 tons of mechanical cooling to switchgear and UPS equipment rooms. ENERGY UTILIZATION...

  15. Utility spot pricing, California

    E-print Network

    Schweppe, Fred C.

    1982-01-01

    The objective of the present spot pricing study carried out for SCE and PG&E is to develop the concepts which would lead to an experimental design for spot pricing in the two utilities. The report suggests a set of experiments ...

  16. Classroom Use and Utilization.

    ERIC Educational Resources Information Center

    Fink, Ira

    2002-01-01

    Discusses how classrooms are distributed by size on a campus, how well they are used, and how their use changes with faculty and student needs and desires. Details how to analyze classroom space, use, and utilization, taking into account such factors as scheduling and classroom stations. (EV)

  17. Advanced fossil energy utilization

    SciTech Connect

    Shekhawat, D.; Berry, D.; Spivey, J.; Pennline, H.; Granite, E.

    2010-01-01

    This special issue of Fuel is a selection of papers presented at the symposium 'Advanced Fossil Energy Utilization', co-sponsored by the Fuels and Petrochemicals Division and the Research and New Technology Committee at the 2009 American Institute of Chemical Engineers (AIChE) Spring National Meeting in Tampa, FL, April 26–30, 2009.

  18. Industrial - Utility Cogeneration Systems 

    E-print Network

    Harkins, H. L.

    1979-01-01

    In a conventional electric utility power plant, considerable energy is wasted in the form of heat rejection to the atmosphere through cooling towers, ponds, or lakes, or to rivers. In a cogeneration system, heat rejection can be minimized by systems which apply the otherwise...

  19. Technology utilization program report

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The application of aerospace technology to the solution of public health and industrial problems is reported. Data cover: (1) development of an externally rechargeable cardiac pacemaker, (2) utilization of ferrofluids (colloidal suspensions of ferrite particles) in the efficient separation of nonferrous metals such as Ni, Zn, Cu, and Al from shredded automobile scrap, and (3) development of a breathing system for fire fighters.

  20. Electric Utility

    E-print Network

    Schrijver, Karel

    Fragmentary front matter from report ORNL-6665 (Electric Utility): section headings "2. Effects on Electric Power Systems" and "2.1 General System Problems"; acknowledgments cite Taylor, Jr. of Power Systems, Inc., and measured data from Phil R. Gattens of Allegheny Power and Cliff Bush of Atlantic Electric.

  1. Promoting Physical Activity in Hong Kong Chinese Young People: Factors Influencing Their Subjective Task Values and Expectancy Beliefs in Physical Activity

    ERIC Educational Resources Information Center

    Pang, Bonnie

    2014-01-01

    According to Eccles et al.'s (1983) Expectancy Value Model, the two major constructs that influence young people's activity choice are subjective task value and expectancy beliefs (Eccles et al., 1983). Eccles et al. (1983) conceptually distinguished four dimensions of subjective task value: attainment value, intrinsic value, utility

  2. Maximally informative ensembles for SIC-POVMs in dimension 3

    NASA Astrophysics Data System (ADS)

    Szymusiak, Anna

    2014-11-01

    In order to find out for which initial states of the system the uncertainty of the measurement outcomes will be minimal, one can look for the minimizers of the Shannon entropy of the measurement. In the case of group-covariant measurements, this question becomes closely related to the problem of how informative the measurement is in the sense of its informational power. Namely, the orbit under group action of the entropy minimizer corresponds to a maximally informative ensemble of equiprobable elements. We give a characterization of such ensembles for three-dimensional group-covariant (Weyl-Heisenberg) symmetric informationally complete positive operator valued measures (SIC-POVMs) in both geometric and algebraic terms. It turns out that a maximally informative ensemble arises from the input state orthogonal to a subspace spanned by three linearly dependent vectors defining a SIC-POVM (geometrically) or from an eigenstate of a certain Weyl matrix (algebraically).
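
    For orientation, the quantity being minimized is the Shannon entropy of the outcome distribution that the measurement assigns to an input state (a standard definition, restated here rather than quoted from the record). In LaTeX notation, for a POVM with effects $\{\Pi_i\}$ and input state $\rho$:

      H(\rho) \;=\; -\sum_i p_i \log p_i, \qquad p_i = \operatorname{Tr}(\rho\,\Pi_i),

    and, as the abstract notes, the group orbit of an entropy-minimizing state yields a maximally informative ensemble of equiprobable elements.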

  3. Maximal aerobic power measurement in runners and swimmers.

    PubMed Central

    Corry, I.; Powers, N.

    1982-01-01

    Five cross-country runners and five competitive swimmers performed a pulling exercise with elastic shock cords and a treadmill run to exhaustion. The mean VO2 max related to lean body mass of the runners was significantly higher than that of the swimmers on the treadmill (p less than 0.05), while on the pulling test the mean VO2 max of the swimmers was significantly higher than that of the runners (p less than 0.01). The maximum heart rates achieved pulling were 95% of the running maximum by runners and 96% by swimmers, with no significant difference between them. Their mean oxygen pulse was almost the same for maximal running, but the swimmers had a significantly higher oxygen pulse than the runners for maximal pulling (p less than 0.01). The swimmers could reach about 79% of their running VO2 max by pulling while the runners used 53% of their running VO2 max. PMID:7139226

  4. Maximally supersymmetric G-backgrounds of IIB supergravity

    E-print Network

    U. Gran; J. Gutowski; G. Papadopoulos; D. Roest

    2006-04-11

    We classify the geometry of all supersymmetric IIB backgrounds which admit the maximal number of $G$-invariant Killing spinors. For compact stability subgroups $G=G_2$, $SU(3)$ and $SU(2)$, the spacetime is locally isometric to a product $X_n \times Y_{10-n}$ with $n=3,4,6$, where $X_n$ is a maximally supersymmetric solution of an $n$-dimensional supergravity theory and $Y_{10-n}$ is a Riemannian manifold with holonomy $G$. For non-compact stability subgroups $G=K\ltimes\mathbb{R}^8$, with $K=Spin(7)$, $SU(4)$, $Sp(2)$, $SU(2)\times SU(2)$ and $\{1\}$, the spacetime is a pp-wave propagating in an eight-dimensional manifold with holonomy $K$. We find new supersymmetric pp-wave solutions of IIB supergravity.

  5. Magellan Project: Evolving enhanced operations efficiency to maximize science value

    NASA Technical Reports Server (NTRS)

    Cheuvront, Allan R.; Neuman, James C.; Mckinney, J. Franklin

    1994-01-01

    Magellan has been one of NASA's most successful spacecraft, returning more science data than all other planetary spacecraft combined. The Magellan Spacecraft Team (SCT) has maximized the science return with innovative operational techniques to overcome anomalies and to perform activities for which the spacecraft was not designed. Commanding the spacecraft was originally time consuming because the standard development process was envisioned as manual tasks. The Program understood that reducing mission operations costs was essential for an extended mission. Management created an environment which encouraged automation of routine tasks, allowing staff reduction while maximizing the science data returned. Data analysis and trending, command preparation, and command reviews are some of the tasks that were automated. The SCT has accommodated personnel reductions by improving operations efficiency while returning the maximum science data possible.

  6. Maximize uniformity summation heuristic (MUSH): a highly accurate simple method for intracranial delineation

    NASA Astrophysics Data System (ADS)

    Pierson, Ronald; Harris, Gregory; Johnson, Hans J.; Dunn, Steve; Magnotta, Vincent A.

    2009-02-01

    A common procedure performed by many groups in the analysis of neuroimaging data is separating the brain from other tissues. This procedure is often utilized both by volumetric studies and by functional imaging studies. Regardless of the intent, an accurate, robust method of identifying the brain or cranial vault is imperative. While this is a common requirement, there are relatively few tools to perform this task. Most of these tools require a T1-weighted image and are therefore not able to accurately define a region that includes surface CSF. In this paper, we have developed a novel brain extraction technique termed Maximize Uniformity by Summation Heuristic (MUSH) optimization. The algorithm was designed for extraction of the brain and surface CSF from a multi-modal magnetic resonance (MR) imaging study. The method forms a linear combination of multi-modal MR imaging data to make the signal intensity within the brain as uniform as possible. The resulting image is thresholded, and simple morphological operators are then used to generate the final representation of the brain. The method was applied to a sample of 20 MR brain scans and compared to the results generated by 3dSkullStrip, 3dIntracranial, BET, and BET2. The average Jaccard metric for the twenty subjects was 0.66 (BET), 0.61 (BET2), 0.88 (3dIntracranial), 0.91 (3dSkullStrip), and 0.94 (MUSH).
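
    A hedged numpy/scipy sketch of the general idea, not the published MUSH implementation: choose weights for a linear combination of co-registered modalities that make the intensity inside a rough brain mask as uniform as possible, then threshold and clean up morphologically. All arrays and parameters below are placeholders.

      # Sketch of a uniformity-maximizing skull strip. Not the authors' code.
      import numpy as np
      from scipy import ndimage, optimize

      def mush_like_mask(modalities, rough_mask):
          """modalities: co-registered 3-D arrays; rough_mask: boolean array."""
          inside = np.stack([m[rough_mask] for m in modalities])   # (n_mod, n_voxels)

          def cov_of_mix(w):                                       # coefficient of variation
              mix = w @ inside
              return mix.std() / (abs(mix.mean()) + 1e-9)

          w0 = np.ones(len(modalities)) / len(modalities)
          w = optimize.minimize(cov_of_mix, w0, method="Nelder-Mead").x

          mixed = sum(wi * m for wi, m in zip(w, modalities))
          ref = w @ inside
          mask = mixed > ref.mean() - 2.0 * ref.std()              # keep near-uniform intensities
          mask = ndimage.binary_opening(mask, iterations=2)
          return ndimage.binary_fill_holes(mask)

      # toy call with random volumes standing in for T1/T2/PD images
      t1, t2, pd_img = (np.random.rand(32, 32, 32) for _ in range(3))
      rough = np.zeros((32, 32, 32), bool); rough[8:24, 8:24, 8:24] = True
      print(mush_like_mask([t1, t2, pd_img], rough).sum(), "voxels retained")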

  7. Maternal Expectations for Toddlers’ Reactions to Novelty: Relations of Maternal Internalizing Symptoms and Parenting Dimensions to Expectations and Accuracy of Expectations

    PubMed Central

    Kiel, Elizabeth J.; Buss, Kristin A.

    2010-01-01

    SYNOPSIS. Objective: Although maternal internalizing symptoms and parenting dimensions have been linked to reports and perceptions of children’s behavior, it remains relatively unknown whether these characteristics relate to expectations or the accuracy of expectations for toddlers’ responses to novel situations. Design: A community sample of 117 mother-toddler dyads participated in a laboratory visit and questionnaire completion. At the laboratory, mothers were interviewed about their expectations for their toddlers’ behaviors in a variety of novel tasks; toddlers then participated in these activities, and trained coders scored their behaviors. Mothers completed questionnaires assessing demographics, depressive and worry symptoms, and parenting dimensions. Results: Mothers who reported more worry expected their toddlers to display more fearful behavior during the laboratory tasks, but worry did not moderate how accurately maternal expectations predicted toddlers’ observed behavior. When also reporting a low level of authoritative-responsive parenting, maternal depressive symptoms moderated the association between maternal expectations and observed toddler behavior, such that, as depressive symptoms increased, maternal expectations related less strongly to toddler behavior. Conclusions: When mothers were asked about their expectations for their toddlers’ behavior in the same novel situations from which experimenters observe this behavior, symptoms and parenting had minimal effect on the accuracy of mothers’ expectations. When in the context of low authoritative-responsive parenting, however, depressive symptoms related to less accurate predictions of their toddlers’ fearful behavior. PMID:21037974

  8. Self-dual metrics with maximally superintegrable geodesic flows

    NASA Astrophysics Data System (ADS)

    Filyukov, Sergei; Galajinsky, Anton

    2015-05-01

    A class of self-dual and geodesically complete spacetimes with maximally superintegrable geodesic flows is constructed by applying the Eisenhart lift to mechanics in pseudo-Euclidean spacetime of signature (1,1). It is characterized by the presence of a second-rank Killing tensor. Spacetimes of the ultrahyperbolic signature (2, q) with q > 2, which admit a second-rank Killing tensor and possess superintegrable geodesic flows, are built.

  9. Cardiovascular changes during maximal breath-holding in elite divers.

    PubMed

    Guaraldi, Pietro; Serra, Maria; Barletta, Giorgio; Pierangeli, Giulia; Terlizzi, Rossana; Calandra-Buonaura, Giovanna; Cialoni, Danilo; Cortelli, Pietro

    2009-12-01

    During maximal breath-holding six healthy elite breath-hold divers, after an initial "easy-going" phase in which cardiovascular changes resembled the so-called "diving response", exhibited a sudden and severe rise in blood pressure during the "struggle" phase of the maneuver. These changes may represent the first tangible expression of a defense reaction, which overrides the classic diving reflex, aiming to reduce the hypoxic damage and to break the apnea before the loss of consciousness. PMID:19655193

  10. Maximal mixing as a `sum' of small mixings

    E-print Network

    Joydeep Chakrabortty; Anjan S. Joshipura; Poonam Mehta; Sudhir K. Vempati

    2009-09-17

    In models with two sources of neutrino masses, we look at the possibility of generating maximal/large mixing angles in the total mass matrix, where both the sources have only small mixing angles. We show that in the two generation case, maximal mixing can naturally arise only when the total neutrino mass matrix has a quasi-degenerate pattern. The best way to demonstrate this is by decomposing the quasi-degenerate spectrum into hierarchical and inverse-hierarchical mass matrices, both with small mixing. Such a decomposition of the quasi-degenerate spectra is in fact very general and can be done irrespective of the mixing present in the mass matrices. With three generations, and two sources, we show that only one or all the three small mixing angles in the total neutrino mass matrix can be converted to maximal/large mixing angles. The decomposition of the degenerate pattern in this case is best realised into sub-matrices whose dominant eigenvalues have an alternating pattern. On the other hand, it is possible to generate two large and one small mixing angle if either one or both of the sub-matrices contain maximal mixing. We present example textures of this. With three sources of neutrino masses, the results remain almost the same as long as all the sub-matrices contribute equally. The Left-Right Symmetric model, where Type I and Type II seesaw mechanisms are related, provides a framework where small mixings can be converted to large mixing angles for degenerate neutrinos.
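
    A brief two-flavour reminder of why maximal mixing points to degeneracy (a textbook identity, not a result of the paper). For a real symmetric mass matrix with diagonal entries $M_{11}$, $M_{22}$ and off-diagonal entry $M_{12}$, the mixing angle obeys

      \tan 2\theta \;=\; \frac{2\,M_{12}}{M_{22} - M_{11}},

    so $\theta \to \pi/4$ (maximal mixing) forces $M_{11} \approx M_{22}$; if the off-diagonal entry built from the small-mixing sources is itself small, the eigenvalues $M_{11} \pm M_{12}$ are then quasi-degenerate, consistent with the quasi-degenerate pattern identified in the abstract.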

  11. Dynamics of hydrogen-like atom bounded by maximal acceleration

    E-print Network

    Yaakov Friedman; Emanuel Resin

    2012-04-03

    The existence of a maximal acceleration for massive objects was conjectured by Caianiello 30 years ago based on the Heisenberg uncertainty relations. Many consequences of this hypothesis have been studied, but until now, there has been no evidence that boundedness of the acceleration may lead to quantum behavior. In previous research, we predicted the existence of a universal maximal acceleration and developed a new dynamics for which all admissible solutions have an acceleration bounded by the maximal one. Based on W. Kündig's experiment, as reanalyzed by Kholmetskii et al., we estimated its value to be of the order $10^{19}\,\mathrm{m/s^2}$. We present here a solution of our dynamical equation for a classical hydrogen-like atom and show that this dynamics leads to some aspects of quantum behavior. We show that the position of an electron in a hydrogen-like atom can be described only probabilistically. We also show that in this model, the notion of "center of mass" must be modified. This modification supports the non-existence of a magnetic moment in the atom and explains the relevance of the conformal group in the quantum region.

  12. Polarity related influence maximization in signed social networks.

    PubMed

    Li, Dong; Xu, Zhi-Ming; Chakraborty, Nilanjan; Gupta, Anika; Sycara, Katia; Li, Sheng

    2014-01-01

    Influence maximization in social networks has been widely studied, motivated by applications like the spread of ideas or innovations in a network and viral marketing of products. Current studies focus almost exclusively on unsigned social networks containing only positive relationships (e.g. friend or trust) between users. Influence maximization in signed social networks containing both positive relationships and negative relationships (e.g. foe or distrust) between users is still a challenging problem that has not been studied. Thus, in this paper, we propose the polarity-related influence maximization (PRIM) problem, which aims to find the seed node set with maximum positive influence or maximum negative influence in signed social networks. To address the PRIM problem, we first extend the standard Independent Cascade (IC) model to signed social networks and propose a Polarity-related Independent Cascade (named IC-P) diffusion model. We prove that the influence function of the PRIM problem under the IC-P model is monotonic and submodular. Thus, a greedy algorithm can be used to achieve an approximation ratio of 1-1/e for solving the PRIM problem in signed social networks. Experimental results on two signed social network datasets, Epinions and Slashdot, validate that our approximation algorithm for solving the PRIM problem outperforms state-of-the-art methods. PMID:25061986
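
    A hedged sketch of the greedy selection whose 1-1/e guarantee the abstract invokes, run here under the standard (unsigned) Independent Cascade model on a toy graph; the paper's polarity-aware IC-P diffusion model is not reproduced:

      # Greedy influence maximization under plain Independent Cascade (toy sizes).
      import random

      def simulate_ic(graph, seeds, p=0.1, rng=random):
          """graph: dict node -> list of neighbours; returns activated-set size."""
          active, frontier = set(seeds), list(seeds)
          while frontier:
              nxt = []
              for u in frontier:
                  for v in graph.get(u, []):
                      if v not in active and rng.random() < p:
                          active.add(v)
                          nxt.append(v)
              frontier = nxt
          return len(active)

      def greedy_seeds(graph, k, mc=200):
          seeds = []
          for _ in range(k):
              spread = {v: sum(simulate_ic(graph, seeds + [v]) for _ in range(mc)) / mc
                        for v in graph if v not in seeds}
              seeds.append(max(spread, key=spread.get))
          return seeds

      toy = {0: [1, 2], 1: [2, 3], 2: [3], 3: [4], 4: []}
      print("greedy seeds:", greedy_seeds(toy, k=2))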

  13. The Distribution of Mass in (Disk) Galaxies: Maximal or Not?

    NASA Astrophysics Data System (ADS)

    Courteau, Stéphane

    2015-02-01

    The relative distribution of matter in galaxies ought to be one of the most definitive predictions of galaxy formation models, yet its validation is hindered by numerous observational, theoretical, and operational challenges. All galaxies are believed to be dominated by an invisible matter component in their outskirts. A debate has, however, been blazing for the last two decades regarding the relative fraction of baryons and dark matter in the inner parts of galaxies: whether galaxies are centrally dominated by baryons ("maximal disk") is at issue. Some of those debates have been misconstrued on account of operational confusion, such as dark matter fractions being measured and compared at different radii. All galaxies are typically baryon-dominated (maximal) at the center and dark-matter dominated (sub-maximal) in their outskirts; for low-mass galaxies (Vtot ≲ 200 km s⁻¹), the mass of the dark halo equals the stellar mass at least within 2 disk scale lengths, while the transition occurs at larger effective radii for more massive galaxies. An ultimate goal for galaxy structure studies is to achieve accurate data-model comparisons for the relative fractions of baryonic to total matter at any radius.

  14. Codep: maximizing co-evolutionary interdependencies to discover interacting proteins.

    PubMed

    Tillier, Elisabeth R M; Biro, Laurence; Li, Ginny; Tillo, Desiree

    2006-06-01

    Approaches for the determination of interacting partners from different protein families (such as ligands and their receptors) have made use of the property that interacting proteins follow similar patterns and relative rates of evolution. Interacting protein partners can then be predicted from the similarity of their phylogenetic trees or evolutionary distance matrices. We present a novel method called Codep for the determination of interacting protein partners by maximizing co-evolutionary signals. The order of sequences in the multiple sequence alignments from two protein families is determined in such a manner as to maximize the similarity of substitution patterns at amino acid sites in the two alignments and, thus, phylogenetic congruency. This is achieved by maximizing the total number of interdependencies of amino acid sites between the alignments. Once ordered, the corresponding sequences in the two alignments indicate the predicted interacting partners. We demonstrate the efficacy of this approach with computer simulations and in analyses of several protein families. A program implementing our method, Codep, is freely available to academic users from our website: http://www.uhnresearch.ca/labs/tillier/. PMID:16634043
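
    A hedged sketch of the co-evolution signal that partner-matching methods exploit: score a candidate pairing of sequences by the correlation between the two families' pairwise evolutionary distance matrices. This is the classic mirror-tree score on simulated data; Codep's interdependency counting is not reproduced here.

      # Mirror-tree-style pairing score; brute-force search is for toy sizes only.
      import itertools
      import numpy as np

      def pairing_score(dist_a, dist_b, pairing):
          """pairing[i] = index of the family-B sequence matched to family-A sequence i."""
          reordered = dist_b[np.ix_(pairing, pairing)]
          iu = np.triu_indices_from(dist_a, k=1)
          return np.corrcoef(dist_a[iu], reordered[iu])[0, 1]

      def best_pairing(dist_a, dist_b):
          n = dist_a.shape[0]
          return max(itertools.permutations(range(n)),
                     key=lambda p: pairing_score(dist_a, dist_b, list(p)))

      # toy data: family B distances are a noisy, relabelled copy of family A's
      rng = np.random.default_rng(1)
      a = rng.random((5, 5)); a = (a + a.T) / 2; np.fill_diagonal(a, 0)
      perm = [2, 0, 4, 1, 3]
      b = a[np.ix_(perm, perm)] + rng.normal(0, 0.01, (5, 5))
      b = (b + b.T) / 2; np.fill_diagonal(b, 0)
      print("best pairing found:", best_pairing(a, b))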

  15. Random effects structure for confirmatory hypothesis testing: Keep it maximal

    PubMed Central

    Barr, Dale J.; Levy, Roger; Scheepers, Christoph; Tily, Harry J.

    2013-01-01

    Linear mixed-effects models (LMEMs) have become increasingly prominent in psycholinguistics and related areas. However, many researchers do not seem to appreciate how random effects structures affect the generalizability of an analysis. Here, we argue that researchers using LMEMs for confirmatory hypothesis testing should minimally adhere to the standards that have been in place for many decades. Through theoretical arguments and Monte Carlo simulation, we show that LMEMs generalize best when they include the maximal random effects structure justified by the design. The generalization performance of LMEMs including data-driven random effects structures strongly depends upon modeling criteria and sample size, yielding reasonable results on moderately-sized samples when conservative criteria are used, but with little or no power advantage over maximal models. Finally, random-intercepts-only LMEMs used on within-subjects and/or within-items data from populations where subjects and/or items vary in their sensitivity to experimental manipulations always generalize worse than separate F1 and F2 tests, and in many cases, even worse than F1 alone. Maximal LMEMs should be the ‘gold standard’ for confirmatory hypothesis testing in psycholinguistics and beyond. PMID:24403724
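
    A hedged sketch of the contrast at issue, using Python/statsmodels on simulated data: a random-intercepts-only fit versus a "maximal" fit that adds a by-subject random slope for the within-subject manipulation. Note that statsmodels MixedLM handles a single grouping factor, so the crossed subjects-and-items designs the paper discusses would need lme4 or a comparable package.

      # Intercepts-only vs. maximal (intercept + slope) random-effects structure.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      rows = []
      for s in range(30):                                      # 30 simulated subjects
          intercept_s = 500 + rng.normal(0, 20)                # subject-specific baseline
          slope_s = 20 + rng.normal(0, 15)                     # subject-specific condition effect
          for _ in range(40):                                  # 40 trials each
              cond = int(rng.integers(0, 2))
              rt = intercept_s + slope_s * cond + rng.normal(0, 30)
              rows.append(dict(subject=s, cond=cond, rt=rt))
      df = pd.DataFrame(rows)

      intercepts_only = smf.mixedlm("rt ~ cond", df, groups=df["subject"]).fit()
      maximal = smf.mixedlm("rt ~ cond", df, groups=df["subject"], re_formula="~cond").fit()
      print(intercepts_only.summary())
      print(maximal.summary())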

  16. A measurement of the maximal forces in plasmonic tweezers

    NASA Astrophysics Data System (ADS)

    Kim, Jung-Dae; Choi, Jun-Hee; Lee, Yong-Gu

    2015-10-01

    Plasmonic tweezers that are designed to trap nanoscale objects create many new possibilities for single-molecule targeted studies. Numerous novel designs of plasmonic nanostructures are proposed in order to attain stronger forces and weaker laser intensity. Most experiments have consisted only of immobilization observations—that is, particles stick when the laser is turned on and fall away when the laser is turned off. Studies of the exertable forces were only theoretical. A few studies have experimentally measured trap stiffness. However, as far as we know, no studies have addressed maximal forces. In this paper, we present a new experimental design in which the motion of the trapped particle can be monitored in either parallel or orthogonal directions to the plasmonic structure’s symmetric axis. We measured maximal trapping force through such monitoring. Although stiffness would be useful for force-calibration or immobilization purposes, for which most plasmonic tweezers are used, we believe that the maximal endurable force is significant and thus, this paper presents this aspect.

  17. Reference Values of Maximal Oxygen Uptake for Polish Rowers

    PubMed Central

    Klusiewicz, Andrzej; Starczewski, Micha?; ?adyga, Maria; D?ugo??cka, Barbara; Braksator, Wojciech; Mamcarz, Artur; Sitkowski, Dariusz

    2014-01-01

    The aim of this study was to characterize changes in maximal oxygen uptake over several years and to elaborate current reference values of this index based on determinations carried out in large and representative groups of top Polish rowers. For this study 81 female and 159 male rowers from the sub-junior to senior categories were recruited from the Polish National Team and its direct backup. All the subjects performed an incremental exercise test on a rowing ergometer. During the test, maximal oxygen uptake was measured with the BxB method. The calculated reference values for elite Polish junior and U23 rowers allow the athletes’ fitness level to be evaluated against the respective reference group and may aid the coach in controlling the training process. Mean values of VO2max achieved by members of the top Polish rowing crews who competed in the Olympic Games or World Championships over the last five years were also presented. The results of the research on the “trainability” of maximal oxygen uptake may lead to the conclusion that the growth rate of the index is larger in the case of high-level athletes and that the index (in absolute values) increases significantly between the ages of 19–22 years (U23 category). PMID:25713672

  18. The Business of Expectations: How Promissory Organisations Shape Technology & Innovation 

    E-print Network

    Pollock, N.; Williams, R.

    2010-01-01

    The business of technological expectations has yet to be thoroughly explored by scholars interested in the role of expectations and visions in the emergence of technological innovations. However, intermediaries specialising ...

  19. The Other Half of the Expectancy Equation: Pygmalion

    ERIC Educational Resources Information Center

    Rappaport, Margaret M.; Rappaport, Herbert

    1975-01-01

    Studies the concept of the communication of positive expectancies in the classroom by examining the effects of different sources of expectancy on reading performance for compensatory program pupils. (Author/DEP)

  20. A Stochastic Theory for Self-Other Expectations 

    E-print Network

    Berger, Joseph; Snell, J. Laurie

    2015-07-06

    …a Markov model predicting stability or change in expectation states from different behaviors. Markov models assume that change of state (from one expectation pattern to another) depends only on the previous state and the transition probabilities...