Science.gov

Sample records for allocations rate base

  1. Bit-rate allocation for multiple video streams using a pricing-based mechanism.

    PubMed

    Tiwari, Mayank; Groves, Theodore; Cosman, Pamela

    2011-11-01

    We consider the problem of bit-rate allocation for multiple video users sharing a common transmission channel. Previously, overall quality of multiple users was improved by exploiting relative video complexity. Users with high-complexity video benefit at the expense of video quality reduction for other users with simpler videos. The quality of all users can be improved by collectively allocating the bit rate in a centralized fashion, which requires sharing video information with a central controller. In this paper, we present an informationally decentralized bit-rate allocation for multiple users where a user only needs to report his demand to an allocator. Each user separately calculates his bit-rate demand based on his video complexity and the bit-rate price, where the bit-rate price is announced by the allocator. The allocator adjusts the bit-rate price for the next period based on the bit rate demanded by the users and the total available bit-rate supply. Simulation results show that all users improve their quality by the pricing-based decentralized bit-rate allocation method compared with their allocation when acting individually. The results of our proposed method are comparable to the centralized bit-rate allocation.
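    The price-adjustment loop described above can be sketched in a few lines. The toy Python sketch below is an illustration under stated assumptions (the reciprocal demand model, step size, and tolerance are hypothetical, not the authors' formulation): the allocator announces a price, each user privately computes a demand from its own complexity, and the price is nudged until total demand balances the available bit-rate supply.

    ```python
    # Hypothetical sketch of a pricing-based decentralized allocation loop;
    # the demand model and constants are illustrative assumptions.

    def user_demand(complexity, price):
        """Each user computes its bit-rate demand privately from its own video
        complexity and the announced price (higher complexity -> more demand,
        higher price -> less demand)."""
        return complexity / price

    def allocate(complexities, supply, price=1.0, step=0.05, tol=1e-3, max_iter=1000):
        """Allocator iteratively adjusts the price until total demand meets supply."""
        for _ in range(max_iter):
            demands = [user_demand(c, price) for c in complexities]
            excess = sum(demands) - supply
            if abs(excess) < tol:
                break
            # Raise the price when demand exceeds supply, lower it otherwise.
            price *= 1.0 + step * excess / supply
        return demands, price

    if __name__ == "__main__":
        demands, price = allocate(complexities=[3.0, 1.5, 0.8], supply=6.0)
        print(price, demands)
    ```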

  2. Rate Adaptive Based Resource Allocation with Proportional Fairness Constraints in OFDMA Systems

    PubMed Central

    Yin, Zhendong; Zhuang, Shufeng; Wu, Zhilu; Ma, Bo

    2015-01-01

    Orthogonal frequency division multiple access (OFDMA), which is widely used in wireless sensor networks, allows different users to obtain different subcarriers according to their subchannel gains. Therefore, how to assign subcarriers and power to different users to achieve a high system sum rate is an important research area in OFDMA systems. In this paper, the focus of study is on rate adaptive (RA) based resource allocation with proportional fairness constraints. Since the resource allocation is an NP-hard and non-convex optimization problem, a new efficient resource allocation algorithm, ACO-SPA, is proposed, which combines ant colony optimization (ACO) and suboptimal power allocation (SPA). To reduce the computational complexity, the optimization problem of resource allocation in OFDMA systems is separated into two steps. In the first step, the ant colony optimization algorithm is performed to solve the subcarrier allocation. Then, the suboptimal power allocation algorithm is developed with strict proportional fairness, and the algorithm is based on the principle that the sums of power and the reciprocal of the channel-to-noise ratio for each user in different subchannels are equal. Extensive simulation results are presented to support the approach. In contrast with root-finding and linear methods, the proposed method provides better performance in solving the proportional resource allocation problem in OFDMA systems. PMID:26426016
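    One plausible reading of the stated SPA principle (power plus reciprocal channel-to-noise ratio held constant across a user's subchannels) is classic water-filling within each user's subcarrier set. The sketch below illustrates that reading for a single user; the per-user power budget, which the paper derives from the proportional fairness constraints, is simply taken as an input here, so this is an assumption-based illustration rather than the paper's algorithm.

    ```python
    # Minimal water-filling-style sketch of the stated SPA principle: within one
    # user's subcarrier set, p_n + 1/CNR_n is held at a common level, subject to
    # that user's power budget. Budget splitting among users is out of scope.

    def spa_power_allocation(cnr, budget):
        """cnr: channel-to-noise ratios on the subcarriers assigned to one user;
        budget: total power available to that user. Returns per-subcarrier powers
        p_n such that p_n + 1/cnr_n is equal on every active subcarrier."""
        inv = [1.0 / g for g in cnr]
        active = list(range(len(cnr)))
        while active:
            # Water level that spends the whole budget over the active subcarriers.
            level = (budget + sum(inv[i] for i in active)) / len(active)
            powers = [max(level - inv[i], 0.0) if i in active else 0.0
                      for i in range(len(cnr))]
            # Drop subcarriers that would receive negative power and recompute.
            negative = [i for i in active if level - inv[i] <= 0.0]
            if not negative:
                return powers
            active = [i for i in active if i not in negative]
        return [0.0] * len(cnr)

    if __name__ == "__main__":
        print(spa_power_allocation(cnr=[4.0, 2.0, 0.5], budget=1.0))
    ```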

  3. Error rate information in attention allocation pilot models

    NASA Technical Reports Server (NTRS)

    Faulkner, W. H.; Onstott, E. D.

    1977-01-01

    The Northrop urgency decision pilot model was used in a command tracking task to compare the optimized performance of multiaxis attention allocation pilot models whose urgency functions were (1) based on tracking error alone, and (2) based on both tracking error and error rate. A matrix of system dynamics and command inputs was employed to create both symmetric and asymmetric two-axis compensatory tracking tasks. All tasks were single loop on each axis. Analysis showed that a model that allocates control attention through nonlinear urgency functions using only error information could not achieve the performance of the full model, whose attention shifting algorithm included both error and error rate terms. Subsequent to this analysis, tracking performance predictions for the full model were verified by piloted flight simulation. Complete model and simulation data are presented.

  4. Cognitive radio resource allocation based on coupled chaotic genetic algorithm

    NASA Astrophysics Data System (ADS)

    Zu, Yun-Xiao; Zhou, Jie; Zeng, Chang-Chang

    2010-11-01

    A coupled chaotic genetic algorithm for cognitive radio resource allocation, based on a genetic algorithm and a coupled Logistic map, is proposed. A fitness function for cognitive radio resource allocation is provided. Simulations are conducted for cognitive radio resource allocation using the coupled chaotic genetic algorithm, a simple genetic algorithm and a dynamic allocation algorithm, respectively. The simulation results show that, compared with the simple genetic and dynamic allocation algorithms, the coupled chaotic genetic algorithm reduces the total transmission power and bit error rate in the cognitive radio system, and has a faster convergence speed.
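    For readers unfamiliar with coupled Logistic maps, the following minimal sketch (with an assumed coupling constant and chaos parameter, not the paper's settings) shows the kind of chaotic sequence such a map produces, which a chaotic genetic algorithm can use to seed or perturb chromosomes.

    ```python
    # Illustrative sketch of a coupled Logistic map; mu and eps are hypothetical.

    def coupled_logistic_step(x, y, mu=4.0, eps=0.1):
        """One step of two symmetrically coupled Logistic maps in (0, 1)."""
        fx = mu * x * (1.0 - x)
        fy = mu * y * (1.0 - y)
        return (1.0 - eps) * fx + eps * fy, (1.0 - eps) * fy + eps * fx

    def chaotic_sequence(x0=0.31, y0=0.67, length=10):
        """Generate a chaotic sequence that can seed or perturb GA chromosomes."""
        seq = []
        x, y = x0, y0
        for _ in range(length):
            x, y = coupled_logistic_step(x, y)
            seq.append((x, y))
        return seq

    if __name__ == "__main__":
        for x, y in chaotic_sequence():
            print(round(x, 4), round(y, 4))
    ```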

  5. 47 CFR 69.502 - Base factor allocation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 3 2014-10-01 2014-10-01 false Base factor allocation. 69.502 Section 69.502... Segregation of Common Line Element Revenue Requirement § 69.502 Base factor allocation. Projected revenues from the following shall be deducted from the base factor portion to determine the amount that...

  6. 47 CFR 69.502 - Base factor allocation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Base factor allocation. 69.502 Section 69.502... Segregation of Common Line Element Revenue Requirement § 69.502 Base factor allocation. Projected revenues from the following shall be deducted from the base factor portion to determine the amount that...

  7. 47 CFR 69.502 - Base factor allocation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 3 2013-10-01 2013-10-01 false Base factor allocation. 69.502 Section 69.502... Segregation of Common Line Element Revenue Requirement § 69.502 Base factor allocation. Projected revenues from the following shall be deducted from the base factor portion to determine the amount that...

  8. 47 CFR 69.502 - Base factor allocation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 3 2011-10-01 2011-10-01 false Base factor allocation. 69.502 Section 69.502... Segregation of Common Line Element Revenue Requirement § 69.502 Base factor allocation. Projected revenues from the following shall be deducted from the base factor portion to determine the amount that...

  9. 47 CFR 69.502 - Base factor allocation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 3 2012-10-01 2012-10-01 false Base factor allocation. 69.502 Section 69.502... Segregation of Common Line Element Revenue Requirement § 69.502 Base factor allocation. Projected revenues from the following shall be deducted from the base factor portion to determine the amount that...

  10. FOR Allocation to Distribution Systems based on Credible Improvement Potential (CIP)

    NASA Astrophysics Data System (ADS)

    Tiwary, Aditya; Arya, L. D.; Arya, Rajesh; Choube, S. C.

    2016-06-01

    This paper describes an algorithm for forced outage rate (FOR) allocation to each section of an electrical distribution system subject to satisfaction of reliability constraints at each load point. These constraints include threshold values of basic reliability indices, for example, failure rate, interruption duration and interruption duration per year at load points. A component improvement potential measure has been used for FOR allocation. The component with the greatest magnitude of the credible improvement potential (CIP) measure is selected for improving reliability performance. The approach adopted is a monovariable method in which one component is selected for FOR allocation and, in the next iteration, another component is selected for FOR allocation based on the magnitude of CIP. The developed algorithm is implemented on a sample radial distribution system.
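    The monovariable loop described above amounts to a greedy iteration. The sketch below is a hypothetical skeleton in which the CIP scoring function, the reliability check, and the FOR reduction step are placeholder arguments, not the paper's formulas.

    ```python
    # Hypothetical greedy sketch of monovariable FOR allocation: at each step,
    # reduce the FOR of the section with the largest CIP score until every
    # load-point reliability constraint is satisfied.

    def allocate_for(sections, cip_score, constraints_met, step=0.9, max_iter=100):
        """sections: dict name -> forced outage rate (FOR).
        cip_score(name, sections): returns the CIP measure for one section.
        constraints_met(sections): True when all load-point indices satisfy thresholds."""
        for _ in range(max_iter):
            if constraints_met(sections):
                break
            # Monovariable step: improve only the component with the largest CIP.
            best = max(sections, key=lambda name: cip_score(name, sections))
            sections[best] *= step  # reduce that section's FOR
        return sections

    if __name__ == "__main__":
        sections = {"S1": 0.20, "S2": 0.10, "S3": 0.05}
        # Toy placeholders: score by current FOR, stop when total FOR is small enough.
        result = allocate_for(
            sections,
            cip_score=lambda n, s: s[n],
            constraints_met=lambda s: sum(s.values()) < 0.2,
        )
        print(result)
    ```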

  11. Ethical considerations surrounding survival benefit-based liver allocation.

    PubMed

    Keller, Eric J; Kwo, Paul Y; Helft, Paul R

    2014-02-01

    The disparity between the demand for and supply of donor livers has continued to grow over the last 2 decades, and this has placed greater weight on the need for efficient and effective liver allocation. Although the use of extended criteria donors has shown great potential, it remains unregulated. A survival benefit-based model was recently proposed to answer calls to increase efficiency and reduce futile transplants. However, it was previously determined that the current allocation system was not in need of modification and that instead geographic disparities should be addressed. In contrast, we believe that there is a significant need to replace the current allocation system and complement efforts to improve donor liver distribution. We illustrate this need first by identifying major ethical concerns shaping liver allocation and then by using these concerns to identify strengths and shortcomings of the Model for End-Stage Liver Disease/Pediatric End-Stage Liver Disease system and a survival benefit-based model. The latter model is a promising means of improving liver allocation: it incorporates a greater number of ethical principles, uses a sophisticated statistical model to increase efficiency and reduce waste, minimizes bias, and parallels developments in the allocation of other organs. However, it remains limited in its posttransplant predictive accuracy and may raise potential issues regarding informed consent. In addition, the proposed model fails to include quality-of-life concerns and prioritize younger patients. We feel that it is time to take the next steps toward better liver allocation not only through reductions in geographic disparities but also through the adoption of a model better equipped to balance the many ethical concerns shaping organ allocation. Thus, we support the development of a similar model with suggested amendments. PMID:24166860

  12. 76 FR 61472 - Revised Fiscal Year 2011 Tariff-Rate Quota Allocations for Refined Sugar

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-04

    ... Proclamation 6763 (60 FR 1007). On August 5, 2010, the Secretary of Agriculture established the FY 2011... TRADE REPRESENTATIVE Revised Fiscal Year 2011 Tariff-Rate Quota Allocations for Refined Sugar AGENCY... the fiscal year (FY) 2011 in-quota quantity of the tariff-rate quota (TRQ) for imported refined...

  13. Negative correlation between male allocation and rate of self-fertilization in a hermaphroditic animal

    PubMed Central

    Johnston, Mark O.; Das, Bijon; Hoeh, Walter R.

    1998-01-01

    Sex-allocation theory predicts that the evolution of increased rates of self-fertilization should be accompanied by decreased allocation to male reproduction (sperm production and broadcast). This prediction has found support in plants but has not previously been tested in animals, which, in contrast to biotically pollinated plants, are free of complications associated with incorporating the costs of attractive structures such as petals. Here we report rates of self-fertilization as well as proportional allocation to male reproductive tissues within populations of the simultaneous hermaphrodite Utterbackia imbecillis, a freshwater mussel. Individuals from populations with higher selfing rates devoted a lower proportion of reproductive tissue to sperm production (correlation = −0.99), in support of theory. PMID:9435241

  14. Auction-based resource allocation game under a hierarchical structure

    NASA Astrophysics Data System (ADS)

    Cui, Yingying; Zou, Suli; Ma, Zhongjing

    2016-01-01

    This paper studies a class of auction-based resource allocation games under a hierarchical structure, such that each supplier is assigned a certain amount of resource from a single provider and allocates it to its buyers with auction mechanisms. To implement efficient allocations for the underlying hierarchical system, we first design an auction mechanism, for each local system composed of a supplier and its buyers, which inherits the advantages of the progressive second price mechanism. By employing a dynamic algorithm, each local system converges to its own efficient Nash equilibrium, at which the efficient resource allocation is achieved and the bidding prices of all the buyers in this local system are identical. After the local systems reach their respective equilibria, the resources assigned to suppliers are readjusted via a dynamic hierarchical algorithm with respect to the bidding prices associated with the implemented equilibria of the local systems. By applying the proposed hierarchical process, the formulated hierarchical system can converge to the efficient allocation under certain mild conditions. The developed results in this work are demonstrated with simulations.

  15. 23 CFR 1240.15 - Allocations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 23 Highways 1 2010-04-01 2010-04-01 false Allocations. 1240.15 Section 1240.15 Highways NATIONAL... GUIDELINES SAFETY INCENTIVE GRANTS FOR USE OF SEAT BELTS-ALLOCATIONS BASED ON SEAT BELT USE RATES Determination of Allocations § 1240.15 Allocations. (a) Funds allocated under this part shall be available...

  16. Allocating Great Lakes forage bases in response to multiple demand

    USGS Publications Warehouse

    Brown, Edward H.; Busiahn, Thomas R.; Jones, Michael L.; Argyle, Ray L.; Taylor, William W.; Ferreri, C. Paola

    1999-01-01

    Forage base allocation, which has become an important issue because of major changes in the fish communities and fisheries of the Great Lakes since the 1950s, is examined and documented in this chapter. Management initiatives that were used to address the issue, and supporting research and development that provided new or improved methods of field sampling and analysis, are also highlighted.

  17. Respiratory allocation and standard rate of metabolism in the African lungfish, Protopterus aethiopicus.

    PubMed

    Seifert, Ashley W; Chapman, Lauren J

    2006-01-01

    This paper quantifies the relationship between respiratory allocation (air vs. water) and the standard rate of metabolism (SMR) in the primitive air-breathing lungfish, Protopterus aethiopicus. Simultaneous measurements of oxygen consumed from both air and water were made to determine the SMR at ecologically relevant aquatic oxygen levels for juveniles 2 to 221 g. Total metabolic rate was positively correlated with body mass with a scaling exponent of 0.78. Aerial oxygen consumption averaged 98% (range=94% to 100%) of total respiratory allocation under low aquatic oxygen levels. Measurements of oxygen consumption made across a gradient of dissolved oxygen from normoxia to anoxia showed that P. aethiopicus maintains its SMR despite a change in respiratory allocation between water and air. PMID:16380279

  18. Earning points for moral behavior: organ allocation based on reciprocity.

    PubMed

    Ravelingien, An; Krom, Andre

    2005-01-01

    Anticipating the reevaluation of the Dutch organ procurement system, in late 2003 the Rathenau Institute published a study entitled 'Gift or Contribution?' In this study, the author, Govert den Hartogh, carries out a thorough moral analysis of the problem of organ shortage and fair allocation of organs. He suggests there should be a change in mentality whereby organ donation is no longer viewed in terms of charity and the volunteer spirit, but rather in terms of duty and reciprocity. The procurement and allocation of donor organs should be seen as a system of mutually assured help. Fair allocation would imply to give priority to those who recognize and comply with their duty: the registered donors. The idea of viewing organ donation as an undertaking involving mutual benefit rather than as a matter of charity, however, is not new. Notwithstanding the fact that reference to charity and altruism is not required in order for the organ donation to be of moral significance, we will argue against the reciprocity-based scenario. Steering organ allocation towards those who are themselves willing to donate organs is both an ineffective and morally questionable means of attempting to counter organ shortage.

  19. FLDA: Latent Dirichlet Allocation Based Unsteady Flow Analysis.

    PubMed

    Hong, Fan; Lai, Chufan; Guo, Hanqi; Shen, Enya; Yuan, Xiaoru; Li, Sikun

    2014-12-01

    In this paper, we present a novel feature extraction approach called FLDA for unsteady flow fields based on the Latent Dirichlet allocation (LDA) model. Analogous to topic modeling in text analysis, in our approach, pathlines and features in a given flow field are defined as documents and words, respectively. Flow topics are then extracted based on Latent Dirichlet allocation. Different from other feature extraction methods, our approach clusters pathlines with probabilistic assignment, and aggregates features into meaningful topics at the same time. We build a prototype system to support exploration of unsteady flow fields with our proposed LDA-based method. Interactive techniques are also developed to explore the extracted topics and to gain insight from the data. We conduct case studies to demonstrate the effectiveness of our proposed approach. PMID:26356968
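    The document/word analogy is the core of the approach: each pathline becomes a "document" whose "words" are quantized local flow features, and LDA yields a soft assignment of pathlines to flow topics. The toy sketch below illustrates the analogy with fabricated counts, using scikit-learn's LDA implementation as a stand-in for the model; the feature binning is an assumption, not the paper's feature definition.

    ```python
    # Toy illustration of the pathline-as-document analogy using scikit-learn's LDA.

    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    # Rows = pathlines (documents), columns = quantized feature bins (words),
    # entries = how often a pathline exhibits each feature bin (fabricated data).
    pathline_word_counts = np.array([
        [8, 1, 0, 0],   # pathline dominated by feature bin 0 (e.g., high vorticity)
        [7, 2, 1, 0],
        [0, 0, 6, 3],   # pathline dominated by bins 2-3 (e.g., straight, fast flow)
        [1, 0, 5, 4],
    ])

    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    pathline_topic_mix = lda.fit_transform(pathline_word_counts)

    # Soft (probabilistic) assignment of each pathline to the extracted flow topics.
    print(np.round(pathline_topic_mix, 2))
    ```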

  20. Hierarchical dynamic allocation procedures based on modified Zelen's approach in multiregional studies with unequal allocation.

    PubMed

    Kuznetsova, Olga M; Tymofyeyev, Yevgen

    2014-01-01

    Morrissey, McEntegart, and Lang (2010) showed that in multicenter studies with equal allocation to several treatment arms, the modified Zelen's approach provides excellent within-center and across-study balance in treatment assignments. In this article, hierarchical balancing procedures for equal allocation to more than two arms (with some elements different from earlier versions) and their unequal allocation expansions that incorporate modified Zelen's approach at the center level are described. The balancing properties of the described procedures for a case study of a multiregional clinical trial with 1:2 allocation where balance within regions as well as in other covariates is required are examined through simulations.

  1. Multi-user cognitive radio network resource allocation based on the adaptive niche immune genetic algorithm

    NASA Astrophysics Data System (ADS)

    Zu, Yun-Xiao; Zhou, Jie

    2012-01-01

    Multi-user cognitive radio network resource allocation based on the adaptive niche immune genetic algorithm is proposed, and a fitness function is provided. Simulations are conducted using the adaptive niche immune genetic algorithm, the simulated annealing algorithm, the quantum genetic algorithm and the simple genetic algorithm, respectively. The results show that the adaptive niche immune genetic algorithm performs better than the other three algorithms in multi-user cognitive radio network resource allocation, and has a fast convergence speed and strong global searching capability, which effectively reduce the system power consumption and bit error rate.

  2. [Optimal allocation of irrigation water resources based on systematical strategy].

    PubMed

    Cheng, Shuai; Zhang, Shu-qing

    2015-01-01

    With the development of the society and economy, as well as the rapid increase of population, more and more water is needed by humans, which has intensified the shortage of water resources. The scarcity of water resources and growing competition for water among different water use sectors reduce water availability for irrigation, so it is important to plan and manage irrigation water resources scientifically and reasonably to improve water use efficiency (WUE) and ensure food security. Many investigations indicate that WUE can be increased by optimization of water use. However, present studies focused primarily on a particular aspect or scale, and lack systematic analysis of the problem of irrigation water allocation. By summarizing previous related studies, especially those based on intelligent algorithms, this article proposed a multi-level, multi-scale framework for allocating irrigation water, and illustrated the basic theory of each component of the framework. A systematic strategy for optimal irrigation water allocation can not only control the total volume of irrigation water on the time scale, but also reduce water loss on the spatial scale. It could provide a scientific basis and technical support for improving the irrigation water management level and ensuring food security. PMID:25985685

  3. Joint source-channel rate allocation and client clustering for scalable multistream IPTV.

    PubMed

    Chakareski, Jacob

    2015-08-01

    We design a system framework for streaming scalable internet protocol television (IPTV) content to heterogeneous clients. The backbone bandwidth is optimally allocated between source and parity data layers that are delivered to the client population. The assignment of stream layers to clients is done based on their access link data rate and packet loss characteristics, and is part of the optimization. We design three techniques for jointly computing the optimal number of multicast sessions, their respective source and parity rates, and client membership, either exactly or approximately, at lower complexity. The latter is achieved via an iterative coordinate descent algorithm that only marginally underperforms relative to the exact analytic solution. Through experiments, we study the advantages of our framework over common IPTV systems that deliver the same source and parity streams to every client. We observe substantial gains in video quality in terms of both its average value and standard deviation over the client population. In addition, for energy efficiency, we propose to move the parity data generation part to the edge of the backbone network, where each client connects to its IPTV stream. We analytically study the conditions under which such an approach delivers energy savings relative to the conventional case of source and parity data generation at the IPTV streaming server. Finally, we demonstrate that our system enables more consistent streaming performance, when the clients' access link packet loss distribution is varied, relative to the two baseline methods used in our investigation, and maintains the same performance as an ideal system that serves each client independently.

  4. 75 FR 53013 - Fiscal Year 2011 Tariff-rate Quota Allocations for Raw Cane Sugar, Refined and Specialty Sugar...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-30

    ... TRADE REPRESENTATIVE Fiscal Year 2011 Tariff-rate Quota Allocations for Raw Cane Sugar, Refined and Specialty Sugar, and Sugar-containing Products; Revision AGENCY: Office of the United States Trade... allocations of raw cane sugar, refined and special sugar, and sugar-containing products. USTR is revising...

  5. Regional bit allocation and rate distortion optimization for multiview depth video coding with view synthesis distortion model.

    PubMed

    Zhang, Yun; Kwong, Sam; Xu, Long; Hu, Sudeng; Jiang, Gangyi; Kuo, C-C Jay

    2013-09-01

    In this paper, we propose a view synthesis distortion model (VSDM) that establishes the relationship between depth distortion and view synthesis distortion for regions with different characteristics: color texture area corresponding depth (CTAD) regions and color smooth area corresponding depth (CSAD) regions, respectively. With this VSDM, we propose regional bit allocation (RBA) and rate distortion optimization (RDO) algorithms for multiview depth video coding (MDVC) by allocating more bits to CTAD for rendering quality and fewer bits to CSAD for compression efficiency. Experimental results show that the proposed VSDM-based RBA and RDO can improve the coding efficiency significantly for the test sequences. In addition, the proposed overall MDVC algorithm, which integrates VSDM-based RBA and RDO, achieves 9.99% and 14.51% bit rate reduction on average at the high and low bit rates, respectively. It can improve virtual view image quality by 0.22 and 0.24 dB on average at the high and low bit rates, respectively, when compared with the original joint multiview video coding model. The RD performance comparisons using five different metrics also validate the effectiveness of the proposed overall algorithm. In addition, the proposed algorithms can be applied to both INTRA and INTER frames.

  6. 15 CFR 335.4 - Allocation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Allocation. 335.4 Section 335.4... § 335.4 Allocation. (a) For HTS 9902.51.11 and HTS 9902.51.15 each Tariff Rate Quota will be allocated separately. Allocation will be based on an applicant's Worsted Wool Suit production, on a weighted...

  7. Analysis and simulation of the dynamic spectrum allocation based on parallel immune optimization in cognitive wireless networks.

    PubMed

    Huixin, Wu; Duo, Mo; He, Li

    2014-01-01

    Spectrum allocation is one of the key issues in improving spectrum efficiency and has become a hot topic in cognitive wireless network research. This paper discusses the real-time performance and efficiency of dynamic spectrum allocation and presents a new spectrum allocation algorithm based on a master-slave parallel immune optimization model. The algorithm designs a new encoding scheme for the antibody based on the demands for convergence rate and population diversity. To improve computational efficiency, the antibody affinities in the population are calculated in multiple computing nodes at the same time. Simulation results show that the algorithm reduces the total spectrum allocation time and can achieve higher network profits. Compared with traditional serial algorithms, the algorithm proposed in this paper has a better speedup ratio and parallel efficiency.

  8. Analysis and Simulation of the Dynamic Spectrum Allocation Based on Parallel Immune Optimization in Cognitive Wireless Networks

    PubMed Central

    Huixin, Wu; Duo, Mo; He, Li

    2014-01-01

    Spectrum allocation is one of the key issues in improving spectrum efficiency and has become a hot topic in cognitive wireless network research. This paper discusses the real-time performance and efficiency of dynamic spectrum allocation and presents a new spectrum allocation algorithm based on a master-slave parallel immune optimization model. The algorithm designs a new encoding scheme for the antibody based on the demands for convergence rate and population diversity. To improve computational efficiency, the antibody affinities in the population are calculated in multiple computing nodes at the same time. Simulation results show that the algorithm reduces the total spectrum allocation time and can achieve higher network profits. Compared with traditional serial algorithms, the algorithm proposed in this paper has a better speedup ratio and parallel efficiency. PMID:25254255

  9. Auction-based bandwidth allocation in the Internet

    NASA Astrophysics Data System (ADS)

    Wei, Jiaolong; Zhang, Chi

    2002-07-01

    It has been widely accepted that auctioning, which is the pricing approach with minimal information requirements, is a proper tool to manage scarce network resources. Previous works focus on the Vickrey auction, which is incentive compatible in classic auction theory. In the beginning of this paper, the faults of the most representative auction-based mechanisms are discussed. Then a new method called uniform-price auction (UPA), which has the simplest auction rule, is proposed and its incentive compatibility in the network environment is also proved. Finally, the basic model is extended to support applications which require minimum bandwidth guarantees for a given time period by introducing a derivative market, and a market mechanism for network resource allocation which is predictable, riskless, and simple for end-users is completed.

  10. S-EMG signal compression based on domain transformation and spectral shape dynamic bit allocation

    PubMed Central

    2014-01-01

    Background Surface electromyographic (S-EMG) signal processing has been emerging in the past few years due to its non-invasive assessment of muscle function and structure and because of the fast growing rate of digital technology, which brings about new solutions and applications. Factors such as sampling rate, quantization word length, number of channels and experiment duration can lead to a potentially large volume of data. Efficient transmission and/or storage of S-EMG signals is an active research issue and is the aim of this work. Methods This paper presents an algorithm for the data compression of surface electromyographic (S-EMG) signals recorded during an isometric contraction protocol and during dynamic experimental protocols such as cycling. The proposed algorithm is based on the discrete wavelet transform to perform spectral decomposition and de-correlation, on a dynamic bit allocation procedure to code the wavelet-transformed coefficients, and on entropy coding to minimize the remaining redundancy and to pack all data. The bit allocation scheme is based on mathematical decreasing spectral shape models, which indicate a shorter digital word length to code high-frequency wavelet-transformed coefficients. Four bit allocation spectral shape methods were implemented and compared: decreasing exponential spectral shape, decreasing linear spectral shape, decreasing square-root spectral shape and rotated hyperbolic tangent spectral shape. Results The proposed method is demonstrated and evaluated for an isometric protocol and for a dynamic protocol using a real S-EMG signal data bank. Objective performance evaluation metrics are presented. In addition, comparisons with other encoders proposed in the scientific literature are shown. Conclusions The decreasing bit allocation shape applied to the quantized wavelet coefficients, combined with arithmetic coding, results in an efficient procedure. The performance comparisons of the proposed S-EMG data
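    The decreasing-spectral-shape idea is straightforward to illustrate: lower-frequency wavelet subbands receive longer digital word lengths than higher-frequency ones. The sketch below uses a decreasing exponential profile with assumed constants (decay rate and bit bounds), not the fitted models from the paper.

    ```python
    # Minimal sketch of decreasing-exponential bit allocation across wavelet subbands.

    import math

    def exponential_bit_allocation(n_subbands, max_bits=16, min_bits=2, decay=0.35):
        """Return a word length per subband, decreasing exponentially from the
        lowest-frequency subband (index 0) to the highest-frequency one."""
        bits = []
        for k in range(n_subbands):
            b = max_bits * math.exp(-decay * k)
            bits.append(max(min_bits, int(round(b))))
        return bits

    if __name__ == "__main__":
        # e.g., an 8-level wavelet decomposition of one S-EMG segment
        print(exponential_bit_allocation(8))
    ```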

  11. Adjacency Matrix-Based Transmit Power Allocation Strategies in Wireless Sensor Networks

    PubMed Central

    Consolini, Luca; Medagliani, Paolo; Ferrari, Gianluigi

    2009-01-01

    In this paper, we present an innovative transmit power control scheme, based on optimization theory, for wireless sensor networks (WSNs) which use carrier sense multiple access (CSMA) with collision avoidance (CA) as the medium access control (MAC) protocol. In particular, we focus on schemes where several remote nodes send data directly to a common access point (AP). Under the assumption of finite overall network transmit power and low traffic load, we derive the optimal transmit power allocation strategy that minimizes the packet error rate (PER) at the AP. This approach is based on modeling the CSMA/CA MAC protocol through a finite state machine and takes into account the network adjacency matrix, depending on the transmit power distribution and determining the network connectivity. It will then be shown that the transmit power allocation problem reduces to a convex constrained minimization problem. Our results show that, under the assumption of low traffic load, the power allocation strategy, which guarantees minimal delay, requires the maximization of network connectivity, which can be equivalently interpreted as the maximization of the number of non-zero entries of the adjacency matrix. The obtained theoretical results are confirmed by simulations for unslotted Zigbee WSNs. PMID:22346705
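    The dependence of the adjacency matrix on the transmit power distribution can be illustrated with a toy path-loss model: node i reaches node j when its received power clears a sensitivity threshold, and connectivity is scored by counting non-zero entries. The path-loss model, constants, and threshold below are illustrative assumptions, not those of the paper.

    ```python
    # Toy sketch: adjacency matrix induced by a transmit power distribution.

    import numpy as np

    def adjacency_from_powers(tx_powers, distances, path_loss_exp=3.0, threshold=1e-3):
        """tx_powers: per-node transmit powers; distances: symmetric matrix of
        node-to-node distances. Returns a 0/1 adjacency matrix and its number of
        non-zero (connected) entries."""
        n = len(tx_powers)
        adj = np.zeros((n, n), dtype=int)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                rx_power = tx_powers[i] / (distances[i, j] ** path_loss_exp)
                adj[i, j] = 1 if rx_power >= threshold else 0
        return adj, int(adj.sum())

    if __name__ == "__main__":
        d = np.array([[0.0, 5.0, 9.0],
                      [5.0, 0.0, 4.0],
                      [9.0, 4.0, 0.0]])
        adj, links = adjacency_from_powers(tx_powers=[1.0, 0.5, 0.5], distances=d)
        print(adj, links)
    ```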

  12. Random property allocation: A novel geographic imputation procedure based on a complete geocoded address file.

    PubMed

    Walter, Scott R; Rose, Nectarios

    2013-09-01

    Allocating an incomplete address to randomly selected property coordinates within a locality, known as random property allocation, has many advantages over other geoimputation techniques. We compared the performance of random property allocation to four other methods under various conditions using a simulation approach. All methods performed well for large spatial units, but random property allocation was the least prone to bias and error under volatile scenarios with small units and low prevalence. Both its coordinate-based approach and its random process of assignment contribute to its increased accuracy and reduced bias in many scenarios. Hence it is preferable to fixed or areal geoimputation for many epidemiological and surveillance applications.
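    A minimal sketch of the procedure, assuming a toy geocoded address file: a record whose address resolves only to a locality is assigned the coordinates of a property drawn at random from that locality's complete property list. The localities and coordinates below are fabricated for illustration.

    ```python
    # Minimal sketch of random property allocation for geographic imputation.

    import random

    GEOCODED_PROPERTIES = {
        # locality -> list of (latitude, longitude) for every known property
        "Newtown": [(-33.897, 151.179), (-33.899, 151.181), (-33.901, 151.178)],
        "Summer Hill": [(-33.891, 151.138), (-33.893, 151.140)],
    }

    def random_property_allocation(locality, rng=random):
        """Impute coordinates for an incomplete address known only to locality level."""
        return rng.choice(GEOCODED_PROPERTIES[locality])

    if __name__ == "__main__":
        random.seed(1)
        print(random_property_allocation("Newtown"))
    ```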

  13. Evidence-based medicine: a new tool for resource allocation?

    PubMed

    Nunes, Rui

    2003-01-01

    Evidence-Based Medicine (EBM) is defined as the conscious and judicious use of current best evidence in making decisions about the care of individual patients. The greater the level of evidence, the greater the grade of recommendation. This pioneering explicit concept of EBM is embedded in a particular view of medical practice, namely the singular nature of the patient-physician relation and the commitment of the latter towards a specific goal: the treatment and the well-being of his or her client. Nevertheless, in many European countries as well as in the United States, this "integration of the best evidence from systematic research with clinical expertise and patient values" appears to be re-interpreted in light of the scarcity of healthcare resources. The purpose of this paper is twofold: first, to claim that from an ethical perspective EBM should be a guideline to clinical practice; and second, that in specific circumstances EBM might be a useful tool in the macro-allocation of healthcare resources. Methodologically, the author follows Norman Daniels' theory of "democratic accountability" to justify this assumption. That is, choices in healthcare must be accountable by democratic procedures. This perspective of distributive justice is responsible for the scope and limits of healthcare services. It follows that particular entitlements to healthcare--namely expensive innovative treatments and medicines--may be fairly restricted as long as this decision is socially and democratically accountable and imposed by financial restrictions of the system. In conclusion, the implementation of EBM, as long as it limits the access to drugs and treatments of unproven scientific results, is in accordance with this perspective. The use of EBM is regarded as an instrument to facilitate the access of all citizens to a reasonable level of healthcare and to promote the efficiency of the system.

  14. Classification Based on Tree-Structured Allocation Rules

    ERIC Educational Resources Information Center

    Vaughn, Brandon K.; Wang, Qui

    2008-01-01

    The authors consider the problem of classifying an unknown observation into 1 of several populations by using tree-structured allocation rules. Although many parametric classification procedures are robust to certain assumption violations, there is need for classification procedures that can be used regardless of the group-conditional…

  15. Impact-Based Area Allocation for Yield Optimization in Integrated Circuits

    NASA Astrophysics Data System (ADS)

    Abraham, Billion; Widodo, Arif; Chen, Poki

    2016-06-01

    In analog integrated circuit (IC) layout, area allocation is a very important issue for achieving good mismatch cancellation. However, most IC layout papers focus only on layout strategy to reduce systematic mismatch. In 2006, an outstanding paper presenting an area allocation strategy was published, introducing a technique for random mismatch reduction. Instead of using a general theoretical study to prove the strategy, that work presented close-to-optimum simulations only on a case-by-case basis. Impact-based area allocation for yield optimization in integrated circuits is proposed in this chapter. To demonstrate the corresponding strategy, not only a theoretical analysis but also an integral nonlinearity-based yield simulation is given to derive the optimum area allocation for a binary weighted current-steering digital-to-analog converter (DAC). The results show IC designers how to allocate area for critical devices in an optimum way.

  16. 77 FR 57180 - Fiscal Year 2013 Tariff-rate Quota Allocations for Raw Cane Sugar, Refined and Specialty Sugar...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-17

    ... Trade Representative in Presidential Proclamation 6763 (60 FR 1007). On September 10, 2012, the... REPRESENTATIVE Fiscal Year 2013 Tariff-rate Quota Allocations for Raw Cane Sugar, Refined and Specialty Sugar, and Sugar-Containing Products AGENCY: Office of the United States Trade Representative. ACTION:...

  17. 75 FR 50796 - Fiscal Year 2011 Tariff-Rate Quota Allocations for Raw Cane Sugar, Refined and Specialty Sugar...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-17

    ... United States Trade Representative under Presidential Proclamation 6763 (60 FR 1007). On July 30, 2010... TRADE REPRESENTATIVE Fiscal Year 2011 Tariff-Rate Quota Allocations for Raw Cane Sugar, Refined and Specialty Sugar, and Sugar-Containing Products AGENCY: Office of the United States Trade...

  18. 76 FR 50285 - Fiscal Year 2012 Tariff-Rate Quota Allocations for Raw Cane Sugar, Refined and Specialty Sugar...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-12

    ... under Presidential Proclamation 6763 (60 FR 1007). On August 1, 2011, the Secretary of Agriculture... TRADE REPRESENTATIVE Fiscal Year 2012 Tariff-Rate Quota Allocations for Raw Cane Sugar, Refined and Specialty Sugar and Sugar-Containing Products AGENCY: Office of the United States Trade...

  19. 78 FR 57445 - Fiscal Year 2014 WTO Tariff-Rate Quota Allocations for Raw Cane Sugar, Refined and Specialty...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-18

    ... United States Trade Representative in Presidential Proclamation 6763 (60 FR 1007). On September 13, 2013... TRADE REPRESENTATIVE Fiscal Year 2014 WTO Tariff-Rate Quota Allocations for Raw Cane Sugar, Refined and Specialty Sugar, and Sugar-Containing Products AGENCY: Office of the United States Trade...

  20. 77 FR 58525 - Notice of Solicitation of Applications for Allocation of Tariff Rate Quotas on the Import of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-21

    ... on the Import of Certain Worsted Wool Fabrics to Persons Who Weave Such Fabrics in the United States... wool fabric to persons who weave such fabrics in the United States. SUMMARY: The Department hereby... worsted wool fabrics in the United States for an allocation of the 2013 tariff rate quotas on...

  1. 75 FR 54599 - Notice of Solicitation of Applications for Allocation of Tariff Rate Quotas on the Import of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-08

    ... on the Import of Certain Worsted Wool Fabrics to Persons Who Weave Such Fabrics in the United States... wool fabric to persons who weave such fabrics in the United States. SUMMARY: The Department hereby... worsted wool fabrics in the United States for an allocation of the 2011 tariff rate quotas on...

  2. Resource Allocation.

    ERIC Educational Resources Information Center

    Stennett, R. G.

    A resource allocation formula employed in London, Ontario elementary schools, as well as supporting data on the method, are provided in this report. Attempts to improve on the traditional methods of resource allocation in London's schools were based on two principles: (1) that need for a particular service could and should be determined…

  3. Neural Network Based Modeling and Analysis of LP Control Surface Allocation

    NASA Technical Reports Server (NTRS)

    Langari, Reza; Krishnakumar, Kalmanje; Gundy-Burlet, Karen

    2003-01-01

    This paper presents an approach to interpretive modeling of LP based control allocation in intelligent flight control. The emphasis is placed on a nonlinear interpretation of the LP allocation process as a static map to support analytical study of the resulting closed loop system, albeit in approximate form. The approach makes use of a bi-layer neural network to capture the essential functioning of the LP allocation process. It is further shown via Lyapunov based analysis that under certain relatively mild conditions the resulting closed loop system is stable. Some preliminary conclusions from a study at Ames are stated and directions for further research are given at the conclusion of the paper.

  4. Functional Allocation for Ground-Based Automated Separation Assurance in NextGen

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas; Mercer, Joey; Martin, Lynne; Homola, Jeffrey; Cabrall, Christopher; Brasil, Connie

    2010-01-01

    As part of an ongoing research effort into functional allocation in a NextGen environment, a controller-in-the-loop study on ground-based automated separation assurance was conducted at NASA Ames' Airspace Operations Laboratory in February 2010. Participants included six FAA front line managers, who are currently certified professional controllers and four recently retired controllers. Traffic scenarios were 15 and 30 minutes long where controllers interacted with advanced technologies for ground-based separation assurance, weather avoidance, and arrival metering. The automation managed the separation by resolving conflicts automatically and involved controllers only by exception, e.g., when the automated resolution would have been outside preset limits. Results from data analyses show that workload was low despite high levels of traffic, Operational Errors did occur but were closely tied to local complexity, and safety acceptability ratings varied with traffic levels. Positive feedback was elicited for the overall concept with discussion on the proper allocation of functions and trust in automation.

  5. 41 CFR Appendix B to Chapter 301 - Allocation of M&IE Rates To Be Used in Making Deductions From the M&IE Allowance

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 4 2010-07-01 2010-07-01 false Allocation of M&IE Rates To Be Used in Making Deductions From the M&IE Allowance B Appendix B to Chapter 301 Public Contracts.... 301, App. B Appendix B to Chapter 301—Allocation of M&IE Rates To Be Used in Making Deductions...

  6. Routing and spectrum allocation in multi-ring based data center networks

    NASA Astrophysics Data System (ADS)

    Zhang, Zitian; Hu, Weisheng; Ye, Tong; Sun, Weiqiang; Zhao, Li; Zhang, Kuo

    2016-02-01

    Recently, we proposed a multi-ring based optical circuit switching (OCS) network following the principle of a Clos network. The network can provide connectivity to a large number of racks which may be distributed across a relatively large geographical space in a data center. However, a property of the ring-based switch in the central stage of the multi-ring based OCS network introduces a unique routing and spectrum allocation (RSA) problem which is more complex than the routing problem in a classical Clos switching network. In this paper, we extend our work to investigate the RSA problem. For a given set of inter-rack traffic requests, we consider two spectrum allocation schemes, namely fixed spectrum allocation and flexible spectrum allocation. For the fixed case, we show that the RSA problem degenerates into the traditional routing problem of the Clos network. As for the flexible case, the properties of spectrum division multiplexing technology and the bandwidth limitation of the ring-based switches should be taken into consideration during allocation of the central module, such that the system throughput can be maximized. This paper presents an integer linear program (ILP) formulation as well as a heuristic algorithm we developed to solve the flexible RSA problem. We evaluate the performance of both spectrum allocation schemes under different traffic patterns. Our results demonstrate that, to handle the uneven inter-rack traffic patterns of general data center networks, flexible spectrum allocation can lead to an increase of about 120% in system throughput, although its computational complexity is slightly higher than that of the fixed spectrum allocation scheme.

  7. Optimal conservation resource allocation under variable economic and ecological time discounting rates in boreal forest.

    PubMed

    Mazziotta, Adriano; Pouzols, Federico Montesino; Mönkkönen, Mikko; Kotiaho, Janne S; Strandman, Harri; Moilanen, Atte

    2016-09-15

    Resource allocation to multiple alternative conservation actions is a complex task. A common trade-off occurs between protection of smaller, expensive, high-quality areas versus larger, cheaper, partially degraded areas. We investigate optimal allocation into three actions in boreal forest: current standard forest management rules, setting aside of mature stands, or setting aside of clear-cuts. We first estimated how habitat availability for focal indicator species and economic returns from timber harvesting develop through time as a function of forest type and action chosen. We then developed an optimal resource allocation by accounting for budget size and habitat availability of indicator species in different forest types. We also accounted for the perspective adopted towards sustainability, modeled via temporal preference and economic and ecological time discounting. Controversially, we found that in boreal forest, set-aside followed by protection of clear-cuts can become a winning, cost-effective strategy when accounting for habitat requirements of multiple species, a long planning horizon, and a limited budget. It is particularly effective when adopting a long-term sustainability perspective and accounting for present revenues from timber harvesting. The present analysis assesses the cost-effective conditions for allocating resources into an inexpensive conservation strategy that nevertheless has the potential to produce high ecological values in the future. PMID:27262031

  8. Model-based metrics of human-automation function allocation in complex work environments

    NASA Astrophysics Data System (ADS)

    Kim, So Young

    issues with function allocation. Then, based on the eight issues, eight types of metrics are established. The purpose of these metrics is to assess the extent to which each issue exists with a given function allocation. Specifically, the eight types of metrics assess workload, coherency of a function allocation, mismatches between responsibility and authority, interruptive automation, automation boundary conditions, human adaptation to context, stability of the human's work environment, and mission performance. Finally, to validate the modeling framework and the metrics, a case study was conducted modeling four different function allocations between a pilot and flight deck automation during the arrival and approach phases of flight. A range of pilot cognitive control modes and maximum human taskload limits were also included in the model. The metrics were assessed for these four function allocations and analyzed to validate capability of the metrics to identify important issues in given function allocations. In addition, the design insights provided by the metrics are highlighted. This thesis concludes with a discussion of mechanisms for further validating the modeling framework and function allocation metrics developed here, and highlights where these developments can be applied in research and in the design of function allocations in complex work environments such as aviation operations.

  9. Rate of Belowground Carbon Allocation Differs with Successional Habit of Two Afromontane Trees

    PubMed Central

    Shibistova, Olga; Yohannes, Yonas; Boy, Jens; Richter, Andreas; Wild, Birgit; Watzka, Margarethe; Guggenberger, Georg

    2012-01-01

    Background Anthropogenic disturbance of old-growth tropical forests increases the abundance of early successional tree species at the cost of late successional ones. Quantifying differences in terms of carbon allocation and the proportion of recently fixed carbon in soil CO2 efflux is crucial for addressing the carbon footprint of creeping degradation. Methodology We compared the carbon allocation pattern of the late successional gymnosperm Podocarpus falcatus (Thunb.) Mirb. and the early successional (gap filling) angiosperm Croton macrostachyus Hochst. ex Del. in an Ethiopian Afromontane forest by whole tree 13CO2 pulse labeling. Over a one-year period we monitored the temporal resolution of the label in the foliage, the phloem sap, the arbuscular mycorrhiza, and in soil-derived CO2. Further, we quantified the overall losses of assimilated 13C with soil CO2 efflux. Principal Findings 13C in leaves of C. macrostachyus declined more rapidly, with a larger size of a fast pool (64% vs. 50% of the assimilated carbon) having a shorter mean residence time (14 h vs. 55 h), than in leaves of P. falcatus. Phloem sap velocity was about 4 times higher for C. macrostachyus. Likewise, the label appeared earlier in the arbuscular mycorrhiza of C. macrostachyus and in the soil CO2 efflux than in the case of P. falcatus (24 h vs. 72 h). Within one year soil CO2 efflux amounted to a loss of 32% of assimilated carbon for the gap filling tree and to 15% for the late successional one. Conclusions Our results showed clear differences in carbon allocation patterns between tree species, although we caution that this experiment was unreplicated. A shift in tree species composition of tropical montane forests (e.g., by degradation) accelerates carbon allocation belowground and increases respiratory carbon losses by the autotrophic community. If ongoing disturbance keeps early successional species in dominance, the larger allocation to fast cycling compartments may deplete soil organic carbon in

  10. A Minimalistic Resource Allocation Model to Explain Ubiquitous Increase in Protein Expression with Growth Rate

    PubMed Central

    Keren, Leeat; Segal, Eran; Milo, Ron

    2016-01-01

    Most proteins show changes in level across growth conditions. Many of these changes seem to be coordinated with the specific growth rate rather than the growth environment or the protein function. Although cellular growth rates, gene expression levels and gene regulation have been at the center of biological research for decades, there are only a few models giving a baseline prediction of the dependence of the proteome fraction occupied by a gene on the specific growth rate. We present a simple model that predicts a widely coordinated increase in the fraction of many proteins out of the proteome, proportionally with the growth rate. The model reveals how passive redistribution of resources, due to active regulation of only a few proteins, can have proteome-wide effects that are quantitatively predictable. Our model provides a potential explanation for why and how such a coordinated response of a large fraction of the proteome to the specific growth rate arises under different environmental conditions. The simplicity of our model can also be useful by serving as a baseline null hypothesis in the search for active regulation. We exemplify the usage of the model by analyzing the relationship between growth rate and proteome composition for the model microorganism E. coli as reflected in recent proteomics data sets spanning various growth conditions. We find that the fraction out of the proteome of a large number of proteins, and from different cellular processes, increases proportionally with the growth rate. Notably, ribosomal proteins, which have been previously reported to increase in fraction with growth rate, are only a small part of this group of proteins. We suggest that, although the fractions of many proteins change with the growth rate, such changes may be partially driven by a global effect, not necessarily requiring specific cellular control mechanisms. PMID:27073913
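    The baseline model can be exercised with a toy calculation: fit a line of proteome fraction against specific growth rate and treat deviations from the line as candidates for active regulation. The numbers below are fabricated for illustration and are not the paper's data.

    ```python
    # Toy illustration of a growth-rate-proportional baseline for proteome fractions.

    import numpy as np

    growth_rate = np.array([0.2, 0.4, 0.6, 0.8, 1.0])        # 1/h, assumed conditions
    proteome_fraction = np.array([0.8, 1.1, 1.5, 1.8, 2.1])   # % of proteome, toy data

    slope, intercept = np.polyfit(growth_rate, proteome_fraction, 1)
    predicted = slope * growth_rate + intercept
    residuals = proteome_fraction - predicted

    print(f"fraction ~ {slope:.2f} * growth_rate + {intercept:.2f}")
    print("residuals (deviation from the passive, growth-rate-driven baseline):",
          np.round(residuals, 3))
    ```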

  11. Larval growth rate and sex determine resource allocation and stress responsiveness across life stages in juvenile frogs.

    PubMed

    Warne, Robin W; Crespi, Erica J

    2015-03-01

    The extent to which interactions between environmental stressors and phenotypic variation during larval life stages impose carry-over effects on adult phenotypes in wildlife is not clear. Using semi-natural mesocosms, we examined how chronically low food availability and size-specific phenotypes in larval amphibians interact and carry over to influence frog growth, resource allocation, endocrine activity and survival. We tagged three cohorts of larvae that differed in body size and developmental stage at 3 weeks after hatching, and tracked them through 10 weeks after metamorphosis in high and low food conditions. We found that growth and development rates during the early tadpole stage not only affected metamorphic rates, but also shaped resource allocation and stress responsiveness in frogs: the slowest growing larvae from low-food mesocosms exhibited a suppressed glucocorticoid response to a handling stressor and reduced growth rate and fat storage as frogs. We also show for the first time that larval developmental trajectories varied with sex, with females developing faster than males, especially in food-restricted conditions. Last, while larval food restriction profoundly affected body size in larvae and frogs, time to metamorphosis was highly constrained, which suggests that the physiology and development of this ephemeral pond-breeding amphibian are adapted for rapid metamorphosis despite large potential variation in nutrient availability. Taken together, these results suggest that larval phenotypic variation significantly influences multiple dimensions of post-metamorphic physiology and resource allocation, which likely affect overall performance.

  12. Rational selective exploitation and distress: employee reactions to performance-based and mobility-based reward allocations.

    PubMed

    Rusbult, C E; Campbell, M A; Price, M E

    1990-09-01

    Prior research has demonstrated that allocators frequently distribute greater rewards to persons with high professional and geographic mobility than to persons with constrained mobility, especially among the very competent. This phenomenon has been termed rational selective exploitation. Do the recipients of such allocations actually experience this distribution rule as unjust and distressing, or is it a misnomer to refer to this phenomenon as exploitation? Two studies were conducted to explore this question. Study 1 was a laboratory experiment in which we manipulated relative performance level, relative mobility level, and allocation standard: performance based versus mobility based. Study 2 was a cross-sectional survey of actual employees in which subjects reported the degree to which performance and mobility were the basis for pay decisions at their places of employment, as well as the degree to which they perceived each standard to be fair. Both studies demonstrated that people regard mobility-based allocations as less fair and more distressing than performance-based allocations. Furthermore, the degree of distress resulting from mobility-based allocations is greater among persons who are disadvantaged by that standard: among people with constrained mobility, especially those who perform at high levels. These findings provide good support for the assertion that so-called rational selective exploitation is indeed distressing to employees. Reactions to this form of distress are also explored, and the implications of these findings for the allocation process are discussed.

  13. Rational allocation of water resources based on ecological groundwater levels:a case study in Jinghui Irrigation District in China

    NASA Astrophysics Data System (ADS)

    Li, H.; Zhou, W. B.; Dong, Q. G.; Liu, B. Y.; Ma, C.

    2016-08-01

    To address the hydrogeological and environmental problems caused by over-exploitation and unreasonable utilization of water resources in Jinghui Irrigation District, this paper discusses the ecological groundwater level of the study area and establishes a three-layer optimal allocation model of water resources based on the theory of large-scale systems. The genetic algorithm method was then employed to optimize the model and obtain the optimal crop irrigation schedule and allocation of water resources under the condition of a 75% assurance rate. Finally, the numerical simulation model of the groundwater was applied to analyze the balance of the groundwater on the basis of the optimal allocation scheme. The results show that the upper limit of the ecological groundwater level in Jinghui Irrigation District ranged from 1.8 m to 4.2 m, while the lower limit ranged from 8 m to 28 m. By 2020, the groundwater imbalance resulting from adopting the optimal allocation scheme will be much less severe than that caused by the current water utilization scheme. With the exception of only a few areas, the groundwater level in most parts of Jinghui Irrigation District will not exceed the lower limit of the ecological groundwater level.

  14. Stackelberg Game Based Power Allocation for Physical Layer Security of Device-to-device Communication Underlaying Cellular Networks

    NASA Astrophysics Data System (ADS)

    Qu, Junyue; Cai, Yueming; Wu, Dan; Chen, Hualiang

    2014-05-01

    The problem of power allocation for device-to-device (D2D) underlay communication to improve physical layer security is addressed. Specifically, to improve the secure communication of the cellular users, we introduce a Stackelberg game for allocating the power of the D2D link under a total power constraint and a rate constraint at the D2D pair. In the introduced Stackelberg game, the D2D pair acts as a seller and the cellular UEs act as buyers. First, because the interference signals from the D2D pair are unknown to both the legitimate receiver and the illegitimate eavesdropper, it is possible that a cellular UE declines to participate in the introduced Stackelberg game, so the condition under which a legitimate user will participate in the game is discussed. Then, based on the Stackelberg game, we propose a semi-distributed power allocation algorithm, which is proven to terminate after a finite number of iterations. Finally, simulations are presented to verify the improvement in the physical layer security of cellular UEs achieved by the proposed power allocation algorithm. The results show that, while the D2D pair's communication demand is met, the physical layer security of cellular UEs can be improved.
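    As a rough illustration of the seller-buyer structure described above (not the paper's utility functions): the D2D pair posts a price, each cellular UE responds with the power purchase that maximizes a simple concave utility, and the price is adjusted until total demand matches the power budget. All constants and the utility form below are assumptions.

```python
# Illustrative Stackelberg price iteration (utilities and constants are hypothetical).
# The D2D pair (leader) posts a price, each cellular UE (follower) buys the amount
# that maximizes its own utility, and the leader adjusts the price via a crude
# tatonnement until total demand roughly matches its power budget.
P_TOTAL = 1.0                     # leader's total power available for sale
GAINS = [0.9, 0.6, 0.4]           # hypothetical per-UE valuation coefficients

def ue_demand(price, gain):
    # Follower's best response for utility u(x) = gain*ln(1+x) - price*x  =>  x* = gain/price - 1
    return max(0.0, gain / price - 1.0)

def stackelberg(price=1.0, step=0.01, iters=2000):
    for _ in range(iters):
        demand = sum(ue_demand(price, g) for g in GAINS)
        price = max(1e-6, price + step * (demand - P_TOTAL))   # raise price if over-demanded
    return price, [ue_demand(price, g) for g in GAINS]

price, allocation = stackelberg()
print(round(price, 3), [round(x, 3) for x in allocation])
```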

  15. A trust-based sensor allocation algorithm in cooperative space search problems

    NASA Astrophysics Data System (ADS)

    Shen, Dan; Chen, Genshe; Pham, Khanh; Blasch, Erik

    2011-06-01

    Sensor allocation is an important and challenging problem within the field of multi-agent systems. The sensor allocation problem involves deciding how to assign a number of targets or cells to a set of agents according to some allocation protocol. Generally, in order to make efficient allocations, we need to design mechanisms that consider both the task performers' costs for the service and the associated probability of success (POS). In our problem, the costs are the sensor resources used, and the POS is the target tracking performance. POS may be perceived differently by different agents because they typically have different standards or means of evaluating the performance of their counterparts (other sensors in the search and tracking problem). Given this, we turn to the notion of trust to capture such subjective perceptions. In our approach, we develop a trust model to construct a novel mechanism that motivates sensor agents to limit their greediness or selfishness. We then model the sensor allocation optimization problem as a trust-in-loop negotiation game and solve it using a subgame perfect equilibrium. Numerical simulations are performed to demonstrate the trust-based sensor allocation algorithm in cooperative space situation awareness (SSA) search problems.

  16. Comparing administered and market-based water allocation systems using an agent-based modeling approach

    NASA Astrophysics Data System (ADS)

    Zhao, J.; Cai, X.; Wang, Z.

    2009-12-01

    It has been well recognized that market-based systems can have significant advantages over administered systems for water allocation. However, there are not yet many successful water markets around the world, and administered systems remain common in water allocation management practice. This paradox has been under discussion for decades and still calls for attention in both research and practice. This paper explores some insights into the paradox and tries to address why market systems have not been widely implemented for water allocation. Adopting the theory of agent-based systems, we develop a consistent analytical model to interpret both systems. First, we derive some theorems from the analytical model concerning the necessary conditions for economic efficiency of water allocation. Following that, the agent-based model is used to illustrate the coherence and differences between administered and market-based systems. The two systems are compared from three aspects: 1) the driving forces acting on the system state, 2) system efficiency, and 3) equity. Regarding economic efficiency, a penalty on the violation of water use permits (or rights) under an administered system can lead to system-wide economic efficiency, as well as being acceptable to some agents, which follows the theory of so-called rational violation. Ideal equity will be realized if the penalty equals the incentive under an administered system and if transaction costs are zero under a market system. The performances of both the agents and the overall system are explained for an administered system and a market system, respectively. The performances of agents are subject to different mechanisms of interaction between agents under the two systems. The system emergence (e.g., system benefit, equilibrium market price), resulting from the performance at the agent level, reflects the different mechanisms of the two systems: the "invisible hand" with the market system and administrative measures (penalties) with the administered system.

  17. A Control Allocation Technique to Recover From Pilot-Induced Oscillations (CAPIO) Due to Actuator Rate Limiting

    NASA Technical Reports Server (NTRS)

    Yildiz, Yildiray; Kolmanovsky, Ilya V.

    2010-01-01

    This paper proposes a control allocation technique that can help pilots recover from pilot-induced oscillations (PIOs). When actuators are rate-saturated due to aggressive pilot commands, high-gain flight control systems or some anomaly in the system, the effective delay in the control loop may increase depending on the nature of the cause. This effective delay increase manifests itself as a phase shift between the commanded and actual system signals and can instigate PIOs. The proposed control allocator reduces the effective time delay by minimizing the phase shift between the commanded and the actual attitude accelerations. Simulation results are reported, which demonstrate phase shift minimization and recovery from PIOs. The conversion of the objective function to be minimized and of the constraints into a form suitable for implementation is also given.

  18. Developing Subdomain Allocation Algorithms Based on Spatial and Communicational Constraints to Accelerate Dust Storm Simulation

    PubMed Central

    Gui, Zhipeng; Yu, Manzhu; Yang, Chaowei; Jiang, Yunfeng; Chen, Songqing; Xia, Jizhe; Huang, Qunying; Liu, Kai; Li, Zhenlong; Hassan, Mohammed Anowarul; Jin, Baoxuan

    2016-01-01

    Dust storms have serious disastrous impacts on the environment, human health, and assets. The development and application of dust storm models have contributed significantly to better understanding and predicting the distribution, intensity and structure of dust storms. However, dust storm simulation is a data- and computing-intensive process. To improve computing performance, high performance computing has been widely adopted by dividing the entire study area into multiple subdomains and allocating each subdomain to different computing nodes in a parallel fashion. Inappropriate allocation may introduce imbalanced task loads and unnecessary communications among computing nodes. Therefore, allocation is a key factor that may impact the efficiency of the parallel process. An allocation algorithm is expected to consider the computing cost and communication cost of each computing node to minimize total execution time and reduce overall communication cost for the entire simulation. This research introduces three algorithms to optimize the allocation by considering spatial and communicational constraints: 1) an Integer Linear Programming (ILP) based algorithm from a combinatorial optimization perspective; 2) a K-Means and Kernighan-Lin combined heuristic algorithm (K&K) integrating geometric and coordinate-free methods by merging local and global partitioning; 3) an automatic seeded region growing based geometric and local partitioning algorithm (ASRG). The performance and effectiveness of the three algorithms are compared based on different factors. Further, we adopt the K&K algorithm as the demonstrated algorithm for the experiment of dust model simulation with the non-hydrostatic mesoscale model (NMM-dust) and compare its performance with the MPI default sequential allocation. The results demonstrate that the K&K method significantly improves the simulation performance with better subdomain allocation. This method can also be adopted for other relevant atmospheric and numerical models.
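    The K&K algorithm itself combines K-Means seeding with Kernighan-Lin refinement; the sketch below illustrates only the geometric K-Means seeding step on a hypothetical rectangular grid of cells, not the full method or the NMM-dust setup.

```python
import random

def kmeans_subdomains(cells, k, iters=20):
    """Group grid-cell coordinates into k subdomains by plain k-means (the geometric
    seeding step; the paper's K&K method would further refine the cut with Kernighan-Lin)."""
    centers = random.sample(cells, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for (x, y) in cells:
            j = min(range(k), key=lambda i: (x - centers[i][0]) ** 2 + (y - centers[i][1]) ** 2)
            groups[j].append((x, y))
        centers = [
            (sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return groups

# Hypothetical 20x20 study area split among 4 computing nodes
grid = [(x, y) for x in range(20) for y in range(20)]
parts = kmeans_subdomains(grid, k=4)
print([len(p) for p in parts])   # cell counts per subdomain (task-load balance indicator)
```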

  1. An Optimization Model for the Allocation of University Based Merit Aid

    ERIC Educational Resources Information Center

    Sugrue, Paul K.

    2010-01-01

    The allocation of merit-based financial aid during the college admissions process presents postsecondary institutions with complex and financially expensive decisions. This article describes the application of linear programming as a decision tool in merit based financial aid decisions at a medium size private university. The objective defined for…

  2. Word Learning and Attention Allocation Based on Word Class and Category Knowledge

    ERIC Educational Resources Information Center

    Hupp, Julie M.

    2015-01-01

    Attention allocation in word learning may vary developmentally based on the novelty of the object. It has been suggested that children differentially learn verbs based on the novelty of the agent, but adults do not because they automatically infer the object's category and thus treat it like a familiar object. The current research examined…

  3. Research on multirobot pursuit task allocation algorithm based on emotional cooperation factor.

    PubMed

    Fang, Baofu; Chen, Lu; Wang, Hao; Dai, Shuanglu; Zhong, Qiubo

    2014-01-01

    Multirobot task allocation is a hot issue in the field of robot research. A new emotional model is used with the self-interested robot, which gives a new way to measure a self-interested robot's individual cooperative willingness in the multirobot task allocation problem. An emotional cooperation factor is introduced into the self-interested robot; it is updated based on emotional attenuation and external stimuli. A multirobot pursuit task allocation algorithm based on the emotional cooperation factor is then proposed. Combined with a two-step auction algorithm, it recruits team leaders and team collaborators, sets up pursuit teams, and finally uses certain strategies to complete the pursuit task. To verify the effectiveness of this algorithm, comparative experiments were conducted against the instantaneous greedy optimal auction algorithm; the results show that the total pursuit time and total team revenue can be optimized by using this algorithm.
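    The abstract states only that the emotional cooperation factor decays over time and is reinforced by external stimuli; a minimal sketch of one plausible update rule (the decay rate, gain and clipping are assumptions, not the paper's model) is:

```python
def update_cooperation_factor(factor, stimulus, decay=0.9, gain=0.5):
    """Hypothetical update: the factor decays over time (emotional attenuation) and
    is boosted by external stimuli such as a teammate's request for help."""
    factor = decay * factor + gain * stimulus
    return min(1.0, max(0.0, factor))

f = 0.2
for stimulus in [0.0, 1.0, 0.0, 0.0]:   # one cooperation request at step 2
    f = update_cooperation_factor(f, stimulus)
    print(round(f, 3))
```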

  4. Efficient Allocation of Resources for Defense of Spatially Distributed Networks Using Agent-Based Simulation.

    PubMed

    Kroshl, William M; Sarkani, Shahram; Mazzuchi, Thomas A

    2015-09-01

    This article presents ongoing research that focuses on efficient allocation of defense resources to minimize the damage inflicted on a spatially distributed physical network, such as a pipeline, water system, or power distribution system, from an attack by an active adversary. The work recognizes the fundamental difference between preparing for natural disasters, such as hurricanes, earthquakes, or even accidental system failures, and the problem of allocating resources to defend against an opponent who is aware of, and anticipating, the defender's efforts to mitigate the threat. Our approach is to utilize a combination of integer programming and agent-based modeling to allocate the defensive resources. We conceptualize the problem as a Stackelberg "leader follower" game where the defender first places his assets to defend key areas of the network, and the attacker then seeks to inflict the maximum damage possible within the constraints of resources and network structure. The criticality of arcs in the network is estimated by a deterministic network interdiction formulation, which then informs an evolutionary agent-based simulation. The evolutionary agent-based simulation is used to determine the allocation of resources for attackers and defenders that results in evolutionarily stable strategies, where actions by either side alone cannot increase its share of victories. We demonstrate these techniques on an example network, comparing the evolutionary agent-based results to a more traditional, probabilistic risk analysis (PRA) approach. Our results show that the agent-based approach results in a greater percentage of defender victories than does the PRA-based approach.

  6. A subcarrier-pair based resource allocation scheme using proportional fairness for cooperative OFDM-based cognitive radio networks.

    PubMed

    Ma, Yongtao; Zhou, Liuji; Liu, Kaihua

    2013-08-09

    The paper presents a joint subcarrier-pair based resource allocation algorithm to improve the efficiency and fairness of cooperative multiuser orthogonal frequency division multiplexing (MU-OFDM) cognitive radio (CR) systems. A communication model is considered in which one source node communicates with one destination node assisted by one half-duplex decode-and-forward (DF) relay. An interference-limited environment is considered, with constraints on the transmitted sum-power over all channels and the aggregate average interference towards multiple primary users (PUs). The proposed resource allocation algorithm is capable of maximizing both the system transmission efficiency and fairness among secondary users (SUs). In addition, the proposed algorithm can keep the interference introduced to the PU bands below a threshold. A proportional fairness constraint is used to ensure that each SU can achieve a required data rate, with quality of service guarantees. Moreover, we extend the analysis to the scenario where each cooperative SU has no channel state information (CSI) about non-adjacent links. We analyze the throughput and fairness tradeoff in the CR system. A detailed analysis of the performance of the proposed algorithm is presented with the simulation results.
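    A much-simplified illustration of proportional-fairness-driven subcarrier assignment (ignoring the relay pairing, power allocation and interference constraints handled by the actual algorithm): each subcarrier is given to the secondary user currently furthest below its target rate share. The channel gains and rate-share targets below are made up.

```python
import math

# Hypothetical channel gains: GAINS[u][n] for user u on subcarrier n
GAINS = [
    [3.0, 0.5, 1.2, 0.8],
    [0.7, 2.5, 0.6, 1.9],
]
TARGET_RATIO = [1.0, 1.0]    # desired proportional rate shares

def assign_subcarriers(gains, ratio):
    rates = [1e-9] * len(gains)          # tiny epsilon so the first pick is well defined
    assignment = {}
    for n in range(len(gains[0])):
        # give subcarrier n to the user furthest below its proportional share
        u = min(range(len(gains)), key=lambda i: rates[i] / ratio[i])
        assignment[n] = u
        rates[u] += math.log2(1.0 + gains[u][n])   # unit-power rate on that subcarrier
    return assignment, rates

assignment, rates = assign_subcarriers(GAINS, TARGET_RATIO)
print(assignment, [round(r, 2) for r in rates])
```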

  7. Intra-District Resource Allocation and Criteria Used for Student Based Funding in Urban School Districts

    ERIC Educational Resources Information Center

    Aloo, Peter Mangla

    2011-01-01

    Resource allocation to school sites in public school districts is inequitable. While Student Based Funding (SBF) has been implemented in several major urban school districts, there are few empirical studies about how SBF policies are derived and implemented. Current efforts to align resources with student need are hindered by a lack of systematic,…

  8. Outcome Based Budgeting: Connecting Budget Development, Allocation and Outcomes.

    ERIC Educational Resources Information Center

    Anderes, Thomas

    This plan for outcome-based budgeting (OBB) is the result of growing demands for increased fiscal accountability, measurable outcomes, strengthened assessment processes, and more meaningful performance indicators as mandated by many State and Federal legislators. OBB focuses on linking funding with outputs and outcomes. Higher education…

  9. MDP-based resource allocation for triple-play transmission on xDSL systems

    NASA Astrophysics Data System (ADS)

    de Souza, Lamartine V.; de Carvalho, Glaucio H. S.; Cardoso, Diego L.; de Carvalho, Solon V.; Frances, Carlos R. L.; Costa, João C. W. A.; Riu, Jaume Rius i.

    2007-09-01

    Many broadband services are based on multimedia applications, such as voice over internet protocol (VoIP), video conferencing, video on demand (VoD), and internet protocol television (IPTV). The term "triple-play" is often used with IPTV; it simply means offering voice, video and data together. IPTV and other services use digital broadband networks such as ADSL2+ (Asymmetric Digital Subscriber Line) and VDSL (Very High Rate DSL) to transmit the data. We have formulated an MDP (Markov Decision Process) for triple-play transmission in a DSL environment. In this paper, we establish the relationship between DSL transmission characteristics and a finite-state Markov model for a triple-play transmission system. This relationship can be used for resource management for multimedia applications delivered through a broadband infrastructure. The solution to our optimization problem can be found using dynamic programming (DP) techniques, such as value iteration and its variants. Our study results in a transmission strategy that chooses the optimal resource allocation according to the triple-play traffic requirements defined in technical report TR-126 (Triple-Play Services Quality of Experience Requirements) from the DSL Forum, minimizing quality of service (QoS) violations with respect to bandwidth. Three traffic classes (video, audio, and best-effort internet data) are defined and analyzed. Our simulation results show parameters such as the blocking probability for each class, link utilization and optimal control policies. The MDP-based approach provides a satisfactory way of managing resources for a DSL system.
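    Since the MDP above is solved with dynamic programming such as value iteration, a generic value-iteration sketch over a toy admission-control MDP may help fix ideas; the states, transition probabilities and rewards below are placeholders, not the TR-126-based model of the paper.

```python
# Generic value iteration over a toy admission-control MDP (states = busy bandwidth
# units, actions = reject/admit an arriving video flow). Transition probabilities and
# rewards are made-up placeholders.
STATES = [0, 1, 2, 3]
ACTIONS = ["reject", "admit"]

def transitions(s, a):
    """Return a list of (probability, next_state, reward) triples."""
    if a == "admit" and s < 3:
        return [(0.7, s + 1, 1.0), (0.3, max(0, s - 1), 1.0)]   # reward for serving the flow
    # rejecting (or a full system) incurs a blocking penalty
    return [(0.6, s, -1.0), (0.4, max(0, s - 1), -1.0)]

def value_iteration(gamma=0.9, eps=1e-6):
    V = {s: 0.0 for s in STATES}
    while True:
        delta, newV, policy = 0.0, {}, {}
        for s in STATES:
            q = {a: sum(p * (r + gamma * V[s2]) for p, s2, r in transitions(s, a))
                 for a in ACTIONS}
            policy[s] = max(q, key=q.get)
            newV[s] = q[policy[s]]
            delta = max(delta, abs(newV[s] - V[s]))
        V = newV
        if delta < eps:
            return V, policy

V, policy = value_iteration()
print(policy)   # optimal admit/reject decision per occupancy state
```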

  10. A Control Allocation System for Automatic Detection and Compensation of Phase Shift Due to Actuator Rate Limiting

    NASA Technical Reports Server (NTRS)

    Yildiz, Yidiray; Kolmanovsky, Ilya V.; Acosta, Diana

    2011-01-01

    This paper proposes a control allocation system that can detect and compensate for the phase shift between the desired and the actual total control effort due to rate limiting of the actuators. Phase shifting is an important problem in control system applications since it effectively introduces a time delay which may destabilize the closed loop dynamics. A relevant example comes from flight control, where aggressive pilot commands, high gain of the flight control system or some anomaly in the system may cause actuator rate limiting and introduce an effective time delay. This time delay can instigate Pilot Induced Oscillations (PIO), which is an abnormal coupling between the pilot and the aircraft resulting in unintentional and undesired oscillations. The proposed control allocation system reduces the effective time delay by first detecting the phase shift and then minimizing it using constrained optimization techniques. Flight control simulation results for an unstable aircraft with inertial cross coupling are reported, which demonstrate phase shift minimization and recovery from a PIO event.

  11. Iterative resource allocation based on propagation feature of node for identifying the influential nodes

    NASA Astrophysics Data System (ADS)

    Zhong, Lin-Feng; Liu, Jian-Guo; Shang, Ming-Sheng

    2015-10-01

    The identification of influential nodes in networks is one of the most promising research domains. In this paper, we present an improved iterative resource allocation (IIRA) method that considers the centrality information of neighbors and the influence of the spreading rate for a target node. Compared with the results of the Susceptible Infected Recovered (SIR) model for four real networks, the IIRA method identifies influential nodes more accurately than the traditional IRA method. Specifically, in the Erdös network, Kendall's tau could be enhanced by 23% when the spreading rate is 0.12. In the Protein network, Kendall's tau could be enhanced by 24% when the spreading rate is 0.08.
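    The IIRA update itself is not reproduced in the abstract; the sketch below shows a generic iterative resource-allocation ranking on a toy graph, with neighbor degree standing in (as an assumption) for the paper's neighbor-centrality and spreading-rate weights.

```python
# Minimal iterative resource-allocation ranking on a toy undirected graph.
# Each node starts with one unit of resource and repeatedly redistributes it to its
# neighbors in proportion to their degree; the degree weighting is only a stand-in
# for the IIRA paper's neighbor-centrality and spreading-rate terms.
EDGES = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]

def neighbors(n, edges):
    return [b if a == n else a for a, b in edges if n in (a, b)]

def iterative_resource_allocation(edges, iters=50):
    nodes = sorted({n for e in edges for n in e})
    degree = {n: len(neighbors(n, edges)) for n in nodes}
    resource = {n: 1.0 for n in nodes}
    for _ in range(iters):
        incoming = {n: 0.0 for n in nodes}
        for n in nodes:
            nbrs = neighbors(n, edges)
            total_w = sum(degree[m] for m in nbrs)
            for m in nbrs:
                incoming[m] += resource[n] * degree[m] / total_w
        resource = incoming
    return sorted(resource.items(), key=lambda kv: -kv[1])

print(iterative_resource_allocation(EDGES))   # nodes ranked by accumulated resource
```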

  12. Multi-Agent Based Simulation of Optimal Urban Land Use Allocation in the Middle Reaches of the Yangtze River, China

    NASA Astrophysics Data System (ADS)

    Zeng, Y.; Huang, W.; Jin, W.; Li, S.

    2016-06-01

    The optimization of land-use allocation is one of the important approaches to achieving regional sustainable development. This study selects the Chang-Zhu-Tan agglomeration as the study area and proposes a new land use optimization allocation model. Using a multi-agent based simulation model, future urban land use allocation was simulated for 2020 and 2030 under three different scenarios. This kind of quantitative information about urban land use optimization allocation and future urban expansion would be of great interest for urban planning, water and land resource management, and climate change research.

  13. Effects of reinforcer rate and reinforcer quality on time allocation: Extensions of matching theory to educational settings.

    PubMed

    Neef, N A

    1992-01-01

    We examined how 3 special education students allocated their responding across two concurrently available tasks associated with unequal rates and equal versus unequal qualities of reinforcement. The students completed math problems from two alternative sets on concurrent variable-interval (VI) 30-s VI 120-s schedules of reinforcement. During the equal-quality reinforcer condition, high-quality (nickels) and low-quality items ("program money" in the school's token economy) were alternated across sessions as the reinforcer for both sets of problems. During the unequal-quality reinforcer condition, the low-quality reinforcer was used for the set of problems on the VI 30-s schedule, and the high-quality reinforcer was used for the set of problems on the VI 120-s schedule. Equal- and unequal-quality reinforcer conditions were alternated using a reversal design. Results showed that sensitivity to the features of the VI reinforcement schedules developed only after the reinforcement intervals were signaled through countdown timers. Thereafter, when reinforcer quality was equal, the time allocated to concurrent response alternatives was approximately proportional to obtained reinforcement, as predicted by the matching law. However, the matching relation was disrupted when, as occurs in most natural choice situations, the quality of the reinforcers differed across the response options.
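    The matching law invoked here can be written (in standard notation, not quoted from the article) as
    $$\frac{T_1}{T_1+T_2}=\frac{R_1}{R_1+R_2}, \qquad \text{or, more generally,} \qquad \frac{T_1}{T_2}=b\left(\frac{R_1}{R_2}\right)^{a},$$
    where $T_i$ is the time allocated to alternative $i$, $R_i$ is the obtained rate of reinforcement on that alternative, $a$ is a sensitivity parameter and $b$ a bias term; the study's equal-quality condition corresponds approximately to the simple form with $a \approx 1$ and $b \approx 1$.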

  14. Staged optimization algorithms based MAC dynamic bandwidth allocation for OFDMA-PON

    NASA Astrophysics Data System (ADS)

    Liu, Yafan; Qian, Chen; Cao, Bingyao; Dun, Han; Shi, Yan; Zou, Junni; Lin, Rujian; Wang, Min

    2016-06-01

    Orthogonal frequency division multiple access passive optical network (OFDMA-PON) has been considered a promising solution for next generation PONs due to its high spectral efficiency and flexible bandwidth allocation scheme. In order to take full advantage of these merits of OFDMA-PON, a high-efficiency medium access control (MAC) dynamic bandwidth allocation (DBA) scheme is needed. In this paper, we propose two DBA algorithms that act on two different stages of the resource allocation process. To achieve higher bandwidth utilization and ensure equity among ONUs, we propose a DBA algorithm based on the frame structure for the physical layer mapping stage. Targeting the global quality of service (QoS) of the OFDMA-PON, we propose a full-range DBA algorithm with service level agreement (SLA) and class of service (CoS) for the bandwidth allocation arbitration stage. The performance of the proposed MAC DBA scheme containing these two algorithms is evaluated using numerical simulations. Simulations of a 15 Gbps network with 1024 sub-carriers and 32 ONUs demonstrate a maximum network throughput of 14.87 Gbps and a maximum packet delay of 1.45 ms for the highest priority CoS under high load conditions.

  15. Motion-Based Piloted Simulation Evaluation of a Control Allocation Technique to Recover from Pilot Induced Oscillations

    NASA Technical Reports Server (NTRS)

    Craun, Robert W.; Acosta, Diana M.; Beard, Steven D.; Leonard, Michael W.; Hardy, Gordon H.; Weinstein, Michael; Yildiz, Yildiray

    2013-01-01

    This paper describes the maturation of a control allocation technique designed to assist pilots in the recovery from pilot induced oscillations (PIOs). The Control Allocation technique to recover from Pilot Induced Oscillations (CAPIO) is designed to enable next generation high efficiency aircraft designs. Energy efficient next generation aircraft require feedback control strategies that will enable lowering the actuator rate limit requirements for optimal airframe design. One of the common issues flying with actuator rate limits is PIOs caused by the phase lag between the pilot inputs and control surface response. CAPIO utilizes real-time optimization for control allocation to eliminate phase lag in the system caused by control surface rate limiting. System impacts of the control allocator were assessed through a piloted simulation evaluation of a non-linear aircraft simulation in the NASA Ames Vertical Motion Simulator. Results indicate that CAPIO helps reduce oscillatory behavior, including the severity and duration of PIOs, introduced by control surface rate limiting.

  16. Knowledge-based load leveling and task allocation in human-machine systems

    NASA Technical Reports Server (NTRS)

    Chignell, M. H.; Hancock, P. A.

    1986-01-01

    Conventional human-machine systems use task allocation policies which are based on the premise of a flexible human operator. This individual is most often required to compensate for and augment the capabilities of the machine. The development of artificial intelligence and improved technologies have allowed for a wider range of task allocation strategies. In response to these issues a Knowledge Based Adaptive Mechanism (KBAM) is proposed for assigning tasks to human and machine in real time, using a load leveling policy. This mechanism employs an online workload assessment and compensation system which is responsive to variations in load through an intelligent interface. This interface consists of a loading strategy reasoner which has access to information about the current status of the human-machine system as well as a database of admissible human/machine loading strategies. Difficulties standing in the way of successful implementation of the load leveling strategy are examined.

  17. Optimizing insecticide allocation strategies based on houses and livestock shelters for visceral leishmaniasis control in Bihar, India.

    PubMed

    Gorahava, Kaushik K; Rosenberger, Jay M; Mubayi, Anuj

    2015-07-01

    Visceral leishmaniasis (VL) is the most deadly form of the leishmaniasis family of diseases, which affects numerous developing countries. The Indian state of Bihar has the highest prevalence and mortality rate of VL in the world. Insecticide spraying is believed to be an effective vector control program for controlling the spread of VL in Bihar; however, it is expensive and less effective if not implemented systematically. This study develops and analyzes a novel optimization model for VL control in Bihar that identifies an optimal (best possible) allocation of the chosen insecticide (dichlorodiphenyltrichloroethane [DDT] or deltamethrin) based on the sizes of the human and cattle populations in the region. The model maximizes the insecticide-induced sandfly death rate in human and cattle dwellings while staying within the current state budget for VL vector control efforts. The model results suggest that deltamethrin might not be a good replacement for DDT because insecticide-induced sandfly deaths are 3.72 times higher for DDT, even 90 days post-spray. Different insecticide allocation strategies between the two types of sites (houses and cattle sheds) are suggested based on the state VL-control budget, with direct implications for VL elimination efforts in a resource-limited region.
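    The budget-constrained allocation described above has the structure of a small linear program; a minimal sketch using scipy.optimize.linprog is given below, where the per-site kill rates, spraying costs, site counts and budget are hypothetical placeholders rather than the paper's Bihar data.

```python
from scipy.optimize import linprog

# Decision variables: x0 = houses sprayed, x1 = cattle sheds sprayed.
# Hypothetical per-site sandfly kill rates and spraying costs (not the paper's values).
kill_rate = [120.0, 300.0]      # sandfly deaths per sprayed house / cattle shed
cost = [2.0, 3.5]               # cost (USD) to spray one site
budget = 10000.0
sites = [4000, 1500]            # number of houses and cattle sheds in the district

# linprog minimizes, so negate the objective to maximize total kills.
res = linprog(
    c=[-k for k in kill_rate],
    A_ub=[cost],                 # single budget constraint: cost @ x <= budget
    b_ub=[budget],
    bounds=[(0, sites[0]), (0, sites[1])],
    method="highs",
)
print(res.x, -res.fun)           # optimal numbers of sprayed sites and total kills
```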

  19. Reduced memory multi-layer multi-component rate allocation for JPEG2000

    NASA Astrophysics Data System (ADS)

    Kulkarni, Prajit; Bilgin, Ali; Marcellin, Michael W.; Dagher, Joseph C.; Flohr, Thomas; Rountree, Janet

    2005-03-01

    Remote sensing images are often multispectral in nature and are acquired by on-board sensors in a "push-broom" fashion. These images are compressed and transmitted to ground stations for further analysis. Since they are extremely large, buffering all acquired data before encoding requires huge amounts of memory and introduces latency. Incremental compression schemes work on small chunks of raw data as soon as they are acquired and help reduce buffer memory requirements. However, incremental processing leads to large variations in quality across the reconstructed image. We propose two "leaky bucket" rate control algorithms that can be employed for incrementally compressing hyperspectral images using JPEG2000. Both schemes perform rate control using the fine granularity afforded by JPEG2000. The proposed algorithms have low memory requirements and enable SNR scalability through the use of quality layers. Experiments show that the proposed schemes provide significant reduction in quality variation with no loss in mean overall PSNR performance.
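    The abstract does not detail the two leaky-bucket algorithms, so the following is only a generic leaky-bucket byte-budget sketch for incremental coding: unused budget carries over (up to a cap) so quality is smoothed across chunks without exceeding the long-run target rate. All byte figures are illustrative.

```python
def leaky_bucket_budgets(chunk_demands, target_bytes_per_chunk, bucket_cap):
    """Yield a byte budget for each incrementally coded chunk: unused budget carries
    over (up to bucket_cap), smoothing quality without exceeding the long-run rate."""
    bucket = 0.0
    for demand in chunk_demands:          # bytes each chunk would ideally like to use
        bucket = min(bucket_cap, bucket + target_bytes_per_chunk)
        spend = min(bucket, demand)
        bucket -= spend
        yield spend

demands = [800, 1500, 400, 2000, 600]
budgets = list(leaky_bucket_budgets(demands, target_bytes_per_chunk=1000, bucket_cap=2500))
print(budgets, sum(budgets))   # per-chunk budgets stay within the cumulative rate target
```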

  20. Allocation Variable-Based Probabilistic Algorithm to Deal with Label Switching Problem in Bayesian Mixture Models

    PubMed Central

    Pan, Jia-Chiun; Liu, Chih-Min; Hwu, Hai-Gwo; Huang, Guan-Hua

    2015-01-01

    The label switching problem occurs as a result of the nonidentifiability of the posterior distribution over various permutations of component labels when using a Bayesian approach to estimate parameters in mixture models. For cases where the number of components is fixed and known, we propose a relabelling algorithm, an allocation variable-based (denoted AVP) probabilistic relabelling approach, to deal with the label switching problem. We establish a model for the posterior distribution of allocation variables with the label switching phenomenon. The AVP algorithm stochastically relabels the posterior samples according to the posterior probabilities of the established model. Some existing deterministic and other probabilistic algorithms are compared with the AVP algorithm in simulation studies, and the success of the proposed approach is demonstrated in simulation studies and a real dataset. PMID:26458185

  1. 77 FR 58524 - Notice of Solicitation of Applications for Allocation of Tariff Rate Quotas on the Import of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-21

    ... procedures for allocating the TRQ. See 70 FR 61363; 19 CFR 335. In order to be eligible for an allocation, an... on the Import of Certain Worsted Wool Fabrics to Persons Who Cut and Sew Men's and Boys' Worsted Wool... sew men's and boys' worsted wool suits, suit-type jackets and trousers in the United States....

  2. Comparing administered and market-based water allocation systems through a consistent agent-based modeling framework.

    PubMed

    Zhao, Jianshi; Cai, Ximing; Wang, Zhongjing

    2013-07-15

    Water allocation can be undertaken through administered systems (AS), market-based systems (MS), or a combination of the two. The debate on the performance of the two systems has lasted for decades but still calls for attention in both research and practice. This paper compares water users' behavior under AS and MS through a consistent agent-based modeling framework for water allocation analysis that incorporates variables particular to both MS (e.g., water trade and trading prices) and AS (water use violations and penalties/subsidies). Analogous to the economic theory of water markets under MS, the theory of rational violation justifies the exchange of entitled water under AS through the use of cross-subsidies. Under water stress conditions, a unique water allocation equilibrium can be achieved by following a simple bargaining rule that does not depend upon initial market prices under MS, or initial economic incentives under AS. The modeling analysis shows that the behavior of water users (agents) depends on transaction, or administrative, costs, as well as their autonomy. Reducing transaction costs under MS or administrative costs under AS will mitigate the effect that equity constraints (originating with primary water allocation) have on the system's total net economic benefits. Moreover, hydrologic uncertainty is shown to increase market prices under MS and penalties/subsidies under AS and, in most cases, also increases transaction, or administrative, costs. PMID:23597927

  4. 47 CFR 65.800 - Rate base.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 3 2014-10-01 2014-10-01 false Rate base. 65.800 Section 65.800 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) INTERSTATE RATE OF RETURN PRESCRIPTION PROCEDURES AND METHODOLOGIES Rate Base § 65.800 Rate base. The rate base...

  5. 47 CFR 65.800 - Rate base.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 3 2011-10-01 2011-10-01 false Rate base. 65.800 Section 65.800 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) INTERSTATE RATE OF RETURN PRESCRIPTION PROCEDURES AND METHODOLOGIES Rate Base § 65.800 Rate base. The rate base...

  6. 47 CFR 65.800 - Rate base.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 3 2012-10-01 2012-10-01 false Rate base. 65.800 Section 65.800 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) INTERSTATE RATE OF RETURN PRESCRIPTION PROCEDURES AND METHODOLOGIES Rate Base § 65.800 Rate base. The rate base...

  7. 47 CFR 65.800 - Rate base.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 3 2013-10-01 2013-10-01 false Rate base. 65.800 Section 65.800 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) INTERSTATE RATE OF RETURN PRESCRIPTION PROCEDURES AND METHODOLOGIES Rate Base § 65.800 Rate base. The rate base...

  8. 47 CFR 65.800 - Rate base.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Rate base. 65.800 Section 65.800 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) INTERSTATE RATE OF RETURN PRESCRIPTION PROCEDURES AND METHODOLOGIES Rate Base § 65.800 Rate base. The rate base...

  9. Co-allocation model for complex equipment project risk based on gray information

    NASA Astrophysics Data System (ADS)

    Zhi-geng, Fang; Jin-yu, Sun

    2013-10-01

    Because a complex equipment project is a multi-level collaborative development network system in which milestones are connected according to the logical relationships between levels, such a project can be decomposed into several multi-level milestones. This paper designs several connecting nodes for collaborative milestones and establishes a new co-allocation model for complex equipment project risk based on gray information. The comprehensive trial phase of a large aircraft development project is taken as an example to demonstrate the effectiveness and feasibility of the above models and algorithms, which provide new analysis methods and research ideas.

  10. Outcome based state budget allocation for diabetes prevention programs using multi-criteria optimization with robust weights.

    PubMed

    Mehrotra, Sanjay; Kim, Kibaek

    2011-12-01

    We consider the problem of outcome-based budget allocation to chronic disease prevention programs across the United States (US) to achieve greater geographical healthcare equity. We use the Diabetes Prevention and Control Programs (DPCP) of the Centers for Disease Control and Prevention (CDC) as an example. We present a multi-criteria robust weighted sum model for such multi-criteria decision making in a group decision setting. Principal component analysis and an inverse linear programming technique are presented and used to study the actual 2009 budget allocation by the CDC. Our results show that the CDC budget allocation process for the DPCPs is likely not model based. In our empirical study, the relative weights for different prevalence and comorbidity factors and the corresponding budgets obtained under different weight regions are discussed. Parametric analysis suggests that money should be allocated to states to promote diabetes education and to increase patient-healthcare provider interactions in order to reduce disparity across the US.
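    One way to read "robust weights" is a worst-case weighted sum over an uncertainty set of criterion weights; the sketch below scores each state by its minimum weighted sum over a small weight grid and splits the budget proportionally. The state criteria, weight ranges and budget are invented for illustration and are not the CDC data used in the paper.

```python
import itertools

# Hypothetical per-state criteria (diabetes prevalence %, comorbidity index), not CDC data.
STATES = {"A": (9.5, 1.2), "B": (12.1, 0.9), "C": (7.3, 1.5)}
BUDGET = 1_000_000.0

# Weight uncertainty set: each criterion weight varies over a small range (assumed).
WEIGHT_GRID = list(itertools.product([0.4, 0.5, 0.6], [0.4, 0.5, 0.6]))

def robust_score(criteria):
    # Worst-case (minimum) weighted sum over all weight vectors in the uncertainty set
    return min(w1 * criteria[0] + w2 * criteria[1] for w1, w2 in WEIGHT_GRID)

scores = {s: robust_score(c) for s, c in STATES.items()}
total = sum(scores.values())
allocation = {s: BUDGET * v / total for s, v in scores.items()}
print({s: round(a) for s, a in allocation.items()})   # budget split by robust score
```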

  11. Non-structural carbon dynamics and allocation relate to growth rate and leaf habit in California oaks.

    PubMed

    Trumbore, Susan; Czimczik, Claudia I; Sierra, Carlos A; Muhr, Jan; Xu, Xiaomei

    2015-11-01

    Trees contain non-structural carbon (NSC), but it is unclear for how long these reserves are stored and to what degree they are used to support plant activity. We used radiocarbon ((14)C) to show that the carbon (C) in stemwood NSC can achieve ages of several decades in California oaks. We separated NSC into two fractions: soluble (∼50% sugars) and insoluble (mostly starch) NSC. Soluble NSC contained more C than insoluble NSC, but we found no consistent trend in the amount of either pool with depth in the stem. There was no systematic difference in C age between the two fractions, although ages increased with stem depth. The C in both NSC fractions was consistently younger than the structural C from which they were extracted. Together, these results indicate considerable inward mixing of NSC within the stem and rapid exchange between soluble and insoluble pools, compared with the timescale of inward mixing. We observed similar patterns in sympatric evergreen and deciduous oaks and the largest differences among tree stems with different growth rates. The (14)C signature of carbon dioxide (CO2) emitted from tree stems was higher than expected from very recent photoassimilates, indicating that the mean age of C in respiration substrates included a contribution from C fixed years previously. A simple model that tracks NSC produced each year, followed by loss (through conversion to CO2) in subsequent years, matches our observations of inward mixing of NSC in the stem and higher (14)C signature of stem CO2 efflux. Together, these data support the idea of continuous accumulation of NSC in stemwood and that 'vigor' (growth rate) and leaf habit (deciduous vs evergreen) control NSC pool size and allocation.

  13. K-Shortest-Path-Based Evacuation Routing with Police Resource Allocation in City Transportation Networks

    PubMed Central

    He, Yunyue; Liu, Zhong; Shi, Jianmai; Wang, Yishan; Zhang, Jiaming; Liu, Jinyuan

    2015-01-01

    Emergency evacuation aims to transport people from dangerous places to safe shelters as quickly as possible. Police play an important role in the evacuation process, as they can handle traffic accidents immediately and help people move smoothly on roads. This paper investigates an evacuation routing problem that involves police resource allocation. We propose a novel k-th-shortest-path-based technique that uses explicit congestion control to optimize evacuation routing and police resource allocation. A nonlinear mixed-integer programming model is presented to formulate the problem. The model’s objective is to minimize the overall evacuation clearance time. Two algorithms are given to solve the problem. The first one linearizes the original model and solves the linearized problem with CPLEX. The second one is a heuristic algorithm that uses a police resource utilization efficiency index to directly solve the original model. This police resource utilization efficiency index significantly aids in the evaluation of road links from an evacuation throughput perspective. The proposed algorithms are tested with a number of examples based on real data from cities of different sizes. The computational results show that the police resource utilization efficiency index is very helpful in finding near-optimal solutions. Additionally, comparing the performance of the heuristic algorithm and the linearization method by using randomly generated examples indicates that the efficiency of the heuristic algorithm is superior. PMID:26226109
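    The routing ingredient of the model, generating the k shortest candidate evacuation paths, can be sketched with networkx's shortest_simple_paths generator; the toy road network and travel times below are invented, and the police-allocation and congestion terms of the full model are omitted.

```python
import itertools
import networkx as nx

# Toy road network: nodes are intersections, edge weights are travel times (made up).
G = nx.Graph()
G.add_weighted_edges_from([
    ("origin", "a", 4), ("origin", "b", 2), ("a", "shelter", 5),
    ("b", "a", 1), ("b", "shelter", 7),
])

# Candidate evacuation routes: the k shortest simple paths by travel time.
k = 3
paths = itertools.islice(
    nx.shortest_simple_paths(G, "origin", "shelter", weight="weight"), k)
for p in paths:
    print(p, nx.path_weight(G, p, weight="weight"))
```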

  16. Base Rates: Both Neglected and Intuitive

    ERIC Educational Resources Information Center

    Pennycook, Gordon; Trippas, Dries; Handley, Simon J.; Thompson, Valerie A.

    2014-01-01

    Base-rate neglect refers to the tendency for people to underweight base-rate probabilities in favor of diagnostic information. It is commonly held that base-rate neglect occurs because effortful (Type 2) reasoning is required to process base-rate information, whereas diagnostic information is accessible to fast, intuitive (Type 1) processing…

  17. Agent-based evacuation simulation for spatial allocation assessment of urban shelters

    NASA Astrophysics Data System (ADS)

    Yu, Jia; Wen, Jiahong; Jiang, Yong

    2015-12-01

    The construction of urban shelters is one of the most important tasks in urban planning and disaster prevention, and spatial allocation assessment is a fundamental pre-step for the spatial location-allocation of urban shelters. This paper introduces a new method which makes use of agent-based technology to implement evacuation simulation and thereby conduct dynamic spatial allocation assessment of urban shelters. The method can not only accomplish traditional geospatial evaluation of urban shelters, but also simulate the evacuation process of residents to shelters. The advantage of utilizing this method lies in three aspects: (1) the evacuation time of each citizen from a residential building to the shelter can be estimated more reasonably; (2) the total evacuation time of all the residents in a region can be obtained; (3) road congestion during evacuation to shelters can be detected so that precautionary measures can be taken to prevent potential risks. In this study, three types of agents are designed: shelter agents, government agents and resident agents. Shelter agents select specified land uses as shelter candidates for different disasters. Government agents delimit the service area of each shelter, in other words, regulate which shelter a person should go to, in accordance with administrative boundaries and the road distance between the person's position and the location of the shelter. Resident agents have a series of attributes, such as age, position and walking speed, and several behaviors, such as reducing speed when walking in a crowd and helping old people and children. Integrating these three types of agents, which are correlated with each other, evacuation procedures can be simulated and dynamic allocation assessment of shelters can be achieved. A case study in Jing'an District, Shanghai, China, was conducted to demonstrate the feasibility of the method, using a scenario of an earthquake disaster occurring at nighttime

  18. Dynamic Allocation of SPM Based on Time-Slotted Cache Conflict Graph for System Optimization

    NASA Astrophysics Data System (ADS)

    Wu, Jianping; Ling, Ming; Zhang, Yang; Mei, Chen; Wang, Huan

    This paper proposes a novel dynamic Scratch-pad Memory (SPM) allocation strategy to optimize the energy consumption of the memory sub-system. Firstly, the whole program execution is sliced into several time slots along the temporal dimension; thereafter, a Time-Slotted Cache Conflict Graph (TSCCG) is introduced to model the behavior of Data Cache (D-Cache) conflicts within each time slot. Then, Integer Nonlinear Programming (INP), which avoids a time-consuming linearization process, is applied to select the most profitable data pages. A Virtual Memory System (VMS) is adopted to remap those data pages that cause severe cache conflicts within a time slot to the SPM. To minimize the swapping overhead of dynamic SPM allocation, a novel SPM controller with a tightly coupled DMA is introduced to issue the swapping operations without the CPU's intervention. Last but not least, this paper quantitatively discusses how the system energy profit fluctuates with the MMU page size and the time-slot duration. According to our design-space exploration, the proposed method can optimize all of the data segments, including global data, heap and stack data in general, and reduce the total energy consumption by 27.28% on average, and by up to 55.22%, with a marginal performance improvement. Compared to the conventional static CCG (Cache Conflict Graph), our approach obtains 24.7% energy profit on average, and up to 30.5%, with a slight boost in performance.

  19. Risk-based objectives for the allocation of chemical, biological, and radiological air emissions sensors.

    PubMed

    Lambert, James H; Farrington, Mark W

    2006-12-01

    This article addresses the problem of allocating devices for localized hazard protection across a region. Each identical device provides only local protection, and the devices serve localities that are exposed to nonidentical intensities of hazard. A method for seeking the optimal allocation policy decisions is described, highlighting the potentially competing objectives of maximizing local risk reductions and coverage risk reductions. The metric for local risk reductions is the sum of the local economic risks avoided. The metric for coverage risk reductions is adapted from the p-median problem and is equal to the sum of squares of the distances from all unserved localities to their closest associated served locality. Three linked graphical techniques for interpreting the policy decisions are presented and applied serially. The first technique identifies policy decisions that are nearly Pareto optimal. The second identifies locations where sensor placements are most justified, based on a risk-cost-benefit analysis under uncertainty. The third displays the decision space for any particular policy decision. The method is illustrated in an application to chemical, biological, and/or radiological weapon sensor placement, but has implications for disaster preparedness, transportation safety, and other arenas of public safety.
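
    The two objectives described above can be written down directly; the short sketch below restates them with hypothetical toy data (the distances, risk values and function names are invented for illustration, not taken from the article):

        def local_risk_reduction(served, risk_avoided):
            # Sum of the local economic risks avoided at localities that receive a device.
            return sum(risk_avoided[i] for i in served)

        def coverage_penalty(served, unserved, dist):
            # Metric adapted from the p-median problem: sum of squared distances from
            # each unserved locality to its closest served locality (lower is better).
            return sum(min(dist[u][s] for s in served) ** 2 for u in unserved)

        # Toy example: a device is placed at locality 0; localities 1 and 2 are unserved
        risk_avoided = {0: 5.0, 1: 2.0, 2: 1.5}
        dist = {1: {0: 3.0}, 2: {0: 7.0}}
        print(local_risk_reduction({0}, risk_avoided))   # 5.0
        print(coverage_penalty({0}, {1, 2}, dist))       # 3.0**2 + 7.0**2 = 58.0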

  20. A Web-based graphical user interface for evidence-based decision making for health care allocations in rural areas

    PubMed Central

    Schuurman, Nadine; Leight, Margo; Berube, Myriam

    2008-01-01

    Background The creation of successful health policy and location of resources increasingly relies on evidence-based decision-making. The development of intuitive, accessible tools to analyse, display and disseminate spatial data potentially provides the basis for sound policy and resource allocation decisions. As health services are rationalized, the development of tools such as graphical user interfaces (GUIs) is especially valuable, as they assist decision makers in allocating resources such that the maximum number of people are served. GIS can be used to develop GUIs that enable spatial decision making. Results We have created a Web-based GUI (wGUI) to assist health policy makers and administrators in the Canadian province of British Columbia in making well-informed decisions about the location and allocation of time-sensitive service capacities in rural regions of the province. This tool integrates datasets for existing hospitals and services, regional populations and road networks to allow users to ascertain the percentage of the population in any given service catchment that is served by a specific health service, or by baskets of linked services. The wGUI allows policy makers to map trauma and obstetric services against rural populations within pre-specified travel distances, illustrating service capacity by region. Conclusion The wGUI can be used by health policy makers and administrators with little or no formal GIS training to visualize multiple health resource allocation scenarios. The GUI is poised to become a critical decision-making tool, especially as evidence is increasingly required for the distribution of health services. PMID:18793428

  1. A swarm intelligence based memetic algorithm for task allocation in distributed systems

    NASA Astrophysics Data System (ADS)

    Sarvizadeh, Raheleh; Haghi Kashani, Mostafa

    2011-12-01

    This paper proposes a Swarm Intelligence based Memetic algorithm for task allocation and scheduling in distributed systems. Task scheduling in distributed systems is known to be an NP-complete problem. Hence, many genetic algorithms have been proposed to search for optimal solutions in the entire solution space. However, these existing approaches scan the entire solution space without considering techniques that can reduce the complexity of the optimization, and spending too much time on scheduling is their main shortcoming. Therefore, in this paper a memetic algorithm is used to cope with this shortcoming. To balance load efficiently, Bee Colony Optimization (BCO) is applied as the local search in the proposed memetic algorithm. Extensive experimental results demonstrate that the proposed method outperforms the existing GA-based method in terms of CPU utilization.
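
    As a rough illustration of the memetic structure described above (a genetic algorithm whose offspring are refined by a local search), the sketch below uses a simple greedy reassignment step in place of the paper's BCO-based load balancing; all names, operators and parameters are hypothetical and not the authors' implementation:

        import random

        def memetic_task_allocation(num_tasks, num_nodes, cost, pop_size=20, generations=50):
            # cost[t][n] is the (hypothetical) cost of running task t on node n; num_tasks >= 2.
            def fitness(assign):
                return sum(cost[t][assign[t]] for t in range(num_tasks))   # lower is better

            def local_search(assign):                  # stands in for the BCO refinement
                best = list(assign)
                for t in range(num_tasks):
                    for n in range(num_nodes):
                        trial = list(best)
                        trial[t] = n
                        if fitness(trial) < fitness(best):
                            best = trial
                return best

            pop = [[random.randrange(num_nodes) for _ in range(num_tasks)] for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness)
                parents = pop[: pop_size // 2]         # keep the better half
                children = []
                for _ in range(pop_size - len(parents)):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, num_tasks)
                    child = a[:cut] + b[cut:]          # one-point crossover
                    if random.random() < 0.1:          # mutation
                        child[random.randrange(num_tasks)] = random.randrange(num_nodes)
                    children.append(local_search(child))
                pop = parents + children
            return min(pop, key=fitness)

        costs = [[random.random() for _ in range(3)] for _ in range(6)]   # 6 tasks, 3 nodes
        print(memetic_task_allocation(6, 3, costs))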

  2. A swarm intelligence based memetic algorithm for task allocation in distributed systems

    NASA Astrophysics Data System (ADS)

    Sarvizadeh, Raheleh; Haghi Kashani, Mostafa

    2012-01-01

    This paper proposes a Swarm Intelligence based Memetic algorithm for task allocation and scheduling in distributed systems. Task scheduling in distributed systems is known to be an NP-complete problem. Hence, many genetic algorithms have been proposed to search for optimal solutions in the entire solution space. However, these existing approaches scan the entire solution space without considering techniques that can reduce the complexity of the optimization, and spending too much time on scheduling is their main shortcoming. Therefore, in this paper a memetic algorithm is used to cope with this shortcoming. To balance load efficiently, Bee Colony Optimization (BCO) is applied as the local search in the proposed memetic algorithm. Extensive experimental results demonstrate that the proposed method outperforms the existing GA-based method in terms of CPU utilization.

  3. Challenges in defining an optimal approach to formula-based allocations of public health funds in the United States

    PubMed Central

    Buehler, James W; Holtgrave, David R

    2007-01-01

    Background Controversy and debate can arise whenever public health agencies determine how program funds should be allocated among constituent jurisdictions. Two common strategies for making such allocations are expert review of competitive applications and the use of funding formulas. Despite widespread use of funding formulas by public health agencies in the United States, formula allocation strategies in public health have been subject to relatively little formal scrutiny, with the notable exception of the attention focused on formula funding of HIV care programs. To inform debates and deliberations in the selection of a formula-based approach, we summarize key challenges to formula-based funding, based on prior reviews of federal programs in the United States. Discussion The primary challenge lies in identifying data sources and formula calculation methods that both reflect and serve program objectives, with or without adjustments for variations in the cost of delivering services, the availability of local resources, capacity, or performance. Simplicity and transparency are major advantages of formula-based allocations, but these advantages can be offset if formula-based allocations are perceived to under- or over-fund some jurisdictions, which may result from how guaranteed minimum funding levels are set or from "hold-harmless" provisions intended to blunt the effects of changes in formula design or random variations in source data. While fairness is considered an advantage of formula-based allocations, the design of a formula may implicitly reflect unquestioned values concerning equity versus equivalence in setting funding policies. Whether or how past or projected trends are taken into account can also have substantial impacts on allocations. Summary Insufficient attention has been focused on how the approach to designing funding formulas in public health should differ for treatment or service versus prevention programs. Further evaluations of formula-based

  4. Using bi-directional communications in a market-based resource allocation system

    SciTech Connect

    Chassin, David P; Pratt, Robert G

    2014-04-01

    Disclosed herein are representative embodiments of methods, apparatus, and systems for distributing a resource (such as electricity) using a resource allocation system. In one exemplary embodiment, a plurality of requests for electricity are received from a plurality of end-use consumers. The requests indicate a requested quantity of electricity and a consumer-requested index value indicative of a maximum price a respective end-use consumer will pay for the requested quantity of electricity. A plurality of offers for supplying electricity are received from a plurality of resource suppliers. The offers indicate an offered quantity of electricity and a supplier-requested index value indicative of a minimum price for which a respective supplier will produce the offered quantity of electricity. A dispatched index value is computed at which electricity is to be supplied based at least in part on the consumer-requested index values and the supplier-requested index values.
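
    The clearing step described in this family of patent records (matching consumer bids against supplier offers to arrive at a dispatched index value) resembles a double auction. The sketch below is a generic double-auction clearing routine with invented data, not the patented procedure:

        def dispatch_index(bids, offers):
            # bids:   (quantity, maximum index value a consumer will pay)
            # offers: (quantity, minimum index value a supplier will accept)
            demand = sorted(bids, key=lambda x: -x[1])    # most willing to pay first
            supply = sorted(offers, key=lambda x: x[1])   # cheapest supply first
            supplied, clearing = 0.0, None
            d_iter = iter(demand)
            qty_d, price_d = next(d_iter, (0.0, float("-inf")))
            for qty_s, price_s in supply:
                while qty_s > 0 and price_d >= price_s:
                    take = min(qty_s, qty_d)
                    qty_s -= take
                    qty_d -= take
                    supplied += take
                    clearing = price_s                    # marginal supplier sets the index
                    if qty_d == 0:
                        qty_d, price_d = next(d_iter, (0.0, float("-inf")))
            return clearing, supplied

        # Two consumers bidding (10 units at index 0.9, 5 units at 0.4) against two suppliers
        print(dispatch_index([(10, 0.9), (5, 0.4)], [(8, 0.3), (10, 0.5)]))   # (0.5, 10.0)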

  5. Electric power grid control using a market-based resource allocation system

    SciTech Connect

    Chassin, David P.

    2015-07-21

    Disclosed herein are representative embodiments of methods, apparatus, and systems for distributing a resource (such as electricity) using a resource allocation system. In one exemplary embodiment, a plurality of requests for electricity are received from a plurality of end-use consumers. The requests indicate a requested quantity of electricity and a consumer-requested index value indicative of a maximum price a respective end-use consumer will pay for the requested quantity of electricity. A plurality of offers for supplying electricity are received from a plurality of resource suppliers. The offers indicate an offered quantity of electricity and a supplier-requested index value indicative of a minimum price for which a respective supplier will produce the offered quantity of electricity. A dispatched index value is computed at which electricity is to be supplied based at least in part on the consumer-requested index values and the supplier-requested index values.

  6. Using bi-directional communications in a market-based resource allocation system

    SciTech Connect

    Chassin, David P.; Pratt, Robert G.

    2015-09-08

    Disclosed herein are representative embodiments of methods, apparatus, and systems for distributing a resource (such as electricity) using a resource allocation system. In one exemplary embodiment, a plurality of requests for electricity are received from a plurality of end-use consumers. The requests indicate a requested quantity of electricity and a consumer-requested index value indicative of a maximum price a respective end-use consumer will pay for the requested quantity of electricity. A plurality of offers for supplying electricity are received from a plurality of resource suppliers. The offers indicate an offered quantity of electricity and a supplier-requested index value indicative of a minimum price for which a respective supplier will produce the offered quantity of electricity. A dispatched index value is computed at which electricity is to be supplied based at least in part on the consumer-requested index values and the supplier-requested index values.

  7. Electric power grid control using a market-based resource allocation system

    DOEpatents

    Chassin, David P

    2014-01-28

    Disclosed herein are representative embodiments of methods, apparatus, and systems for distributing a resource (such as electricity) using a resource allocation system. In one exemplary embodiment, a plurality of requests for electricity are received from a plurality of end-use consumers. The requests indicate a requested quantity of electricity and a consumer-requested index value indicative of a maximum price a respective end-use consumer will pay for the requested quantity of electricity. A plurality of offers for supplying electricity are received from a plurality of resource suppliers. The offers indicate an offered quantity of electricity and a supplier-requested index value indicative of a minimum price for which a respective supplier will produce the offered quantity of electricity. A dispatched index value is computed at which electricity is to be supplied based at least in part on the consumer-requested index values and the supplier-requested index values.

  8. Using bi-directional communications in a market-based resource allocation system

    SciTech Connect

    Chassin, David P; Pratt, Robert G

    2015-05-05

    Disclosed herein are representative embodiments of methods, apparatus, and systems for distributing a resource (such as electricity) using a resource allocation system. In one exemplary embodiment, a plurality of requests for electricity are received from a plurality of end-use consumers. The requests indicate a requested quantity of electricity and a consumer-requested index value indicative of a maximum price a respective end-use consumer will pay for the requested quantity of electricity. A plurality of offers for supplying electricity are received from a plurality of resource suppliers. The offers indicate an offered quantity of electricity and a supplier-requested index value indicative of a minimum price for which a respective supplier will produce the offered quantity of electricity. A dispatched index value is computed at which electricity is to be supplied based at least in part on the consumer-requested index values and the supplier-requested index values.

  9. Using one-way communications in a market-based resource allocation system

    DOEpatents

    Chassin, David P.; Pratt, Robert G.

    2014-07-22

    Disclosed herein are representative embodiments of methods, apparatus, and systems for distributing a resource (such as electricity) using a resource allocation system. In one exemplary embodiment, a plurality of requests for electricity are received from a plurality of end-use consumers. The requests indicate a requested quantity of electricity and a consumer-requested index value indicative of a maximum price a respective end-use consumer will pay for the requested quantity of electricity. A plurality of offers for supplying electricity are received from a plurality of resource suppliers. The offers indicate an offered quantity of electricity and a supplier-requested index value indicative of a minimum price for which a respective supplier will produce the offered quantity of electricity. A dispatched index value is computed at which electricity is to be supplied based at least in part on the consumer-requested index values and the supplier-requested index values.

  10. 75 FR 54598 - Notice of Solicitation of Applications for Allocation of Tariff Rate Quotas on the Import of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-08

    ... regulations establishing procedures for allocating the TRQ. See 70 FR 61363; 19 CFR part 335. In order to be... on the Import of Certain Worsted Wool Fabrics to Persons Who Cut and Sew Men's and Boys' Worsted Wool... boys' worsted wool suits, suit-type jackets and trousers in the United States. SUMMARY: The...

  11. 76 FR 1402 - Notice of allocation of Tariff Rate Quotas (TRQ) on the Import of Certain Worsted Wool Fabrics...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-10

    ... establishing procedures for applying for, and determining, such allocations (66 FR 6459, 15 CFR 335). These interim regulations were adopted, without change, as a final rule published on October 24, 2005 (70 FR 61363). On September 8, 2010, the Department published a notice in the Federal Register (75 FR...

  12. 76 FR 58465 - Notice of Solicitation of Applications for Allocation of Tariff Rate Quotas on the Import of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-21

    ... diameters of 18.5 microns or less (HTS heading 9902.51.12). On August 6, 2002, President Bush signed into..., 2006, the Act was further amended pursuant to the Pension Protection Act of 2006, Public Law 109-280... adopted final regulations establishing procedures for allocating the TRQ. See 70 FR 61363; 19 CFR 335....

  13. 78 FR 5775 - Notice of Allocation of Tariff Rate Quotas (TRQ) on the Import of Certain Worsted Wool Fabrics...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-28

    ... Miscellaneous Trade Act of 2004 (Pub. L. 108-249), and the Pension Protection Act of 2006 (Pub. L. 109- 280..., the Miscellaneous Trade Act of 2004, the Pension Protection Act of 2006, and the Emergency Economic... Miscellaneous Trade Act of 2004 requires the President to ensure that such fabrics are fairly allocated...

  14. 76 FR 58465 - Notice of Solicitation of Applications for Allocation of Tariff Rate Quotas on the Import of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-21

    ....12). On August 6, 2002, President Bush signed into law the Trade Act of 2002, which includes several... Pension Protection Act of 2006, Public Law 109-280, which extended both TRQs, 9902.51.11 and 9902.51.15... Department adopted final regulations establishing procedures for allocating the TRQ. See 70 FR 61363; 19...

  15. Allocation of new growth between shoot, root and mycorrhiza in relation to carbon, nitrogen and phosphate supply: teleonomy with maximum growth rate.

    PubMed

    Thornley, John H M; Parsons, Anthony J

    2014-02-01

    Treating resource allocation within plants, and between plants and associated organisms, is essential for plant, crop and ecosystem modelling. However, it is still an unresolved issue. It is also important to consider quantitatively when it is efficient and to what extent a plant can invest profitably in a mycorrhizal association. A teleonomic model is used to address these issues. A six state-variable model giving exponential growth is constructed. This represents carbon (C), nitrogen (N) and phosphorus (P) substrates with structure in shoot, root and mycorrhiza. The shoot is responsible for uptake of substrate C, the root for substrates N and P, and the mycorrhiza also for substrates N and P. A teleonomic goal, maximizing proportional growth rate, is solved analytically for the allocation fractions. Expressions allocating new dry matter to shoot, root and mycorrhiza are derived which maximize growth rate. These demonstrate several key intuitive phenomena concerning resource sharing between plant components and associated mycorrhizae. For instance, if root uptake rate for phosphorus is equal to that achievable by mycorrhiza and without detriment to root uptake rate for nitrogen, then this gives a faster growing mycorrhizal-free plant. However, if root phosphorus uptake is below that achievable by mycorrhiza, then a mycorrhizal association may be a preferred strategy. The approach offers a methodology for introducing resource sharing between species into ecosystem models. Applying teleonomy may provide a valuable short-term means of modelling allocation, avoiding the circularity of empirical models, and circumventing the complexities and uncertainties inherent in mechanistic approaches. However it is subjective and brings certain irreducible difficulties with it.

  16. Allocating physicians' overhead costs to services: an econometric/accounting-activity based-approach.

    PubMed

    Peden, Al; Baker, Judith J

    2002-01-01

    Using the optimizing properties of econometric analysis, this study analyzes how physician overhead costs (OC) can be allocated to multiple activities to maximize precision in reimbursing the costs of services. Drawing on work by Leibenstein and Friedman, the analysis also shows that allocating OC to multiple activities unbiased by revenue requires controlling for revenue when making the estimates. Further econometric analysis shows that it is possible to save about 10 percent of OC by paying only for those that are necessary.
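
    The idea of allocating overhead across activities without letting revenue bias the estimates can be illustrated with an ordinary least-squares fit that includes revenue as a control. The data and variable names below are entirely hypothetical and only show the general shape of such an estimation, not the authors' econometric model:

        import numpy as np

        # Hypothetical practice-level observations
        overhead = np.array([120.0, 150.0, 90.0, 200.0, 170.0])   # overhead cost
        visits   = np.array([400.0, 500.0, 300.0, 700.0, 600.0])  # office-visit volume
        procs    = np.array([ 50.0,  80.0,  20.0, 120.0,  90.0])  # procedure volume
        revenue  = np.array([900.0, 1100.0, 700.0, 1600.0, 1300.0])

        # Regress overhead on activity volumes while controlling for revenue, so that the
        # per-activity cost estimates are not driven by revenue-related spending.
        X = np.column_stack([np.ones_like(overhead), visits, procs, revenue])
        coef, *_ = np.linalg.lstsq(X, overhead, rcond=None)
        intercept, cost_per_visit, cost_per_proc, revenue_effect = coef
        print(cost_per_visit, cost_per_proc)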

  17. Efficient Nash Equilibrium Resource Allocation Based on Game Theory Mechanism in Cloud Computing by Using Auction

    PubMed Central

    Nezarat, Amin; Dastghaibifard, GH

    2015-01-01

    One of the most complex issues in the cloud computing environment is the problem of resource allocation: on one hand, the cloud provider expects the most profitability, and on the other hand, users expect to have the best resources at their disposal given their budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is economic, using economic methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game theory mechanism and holding a repetitive game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and repetitions and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium point, where players are no longer inclined to alter their bid for that resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used; the proposed model is simulated in CloudSim and the results are compared with previous work. It is concluded that this method converges to a response in a shorter time, provides the lowest service level agreement violations and provides the most utility to the provider. PMID:26431035

  18. Efficient Nash Equilibrium Resource Allocation Based on Game Theory Mechanism in Cloud Computing by Using Auction.

    PubMed

    Nezarat, Amin; Dastghaibifard, G H

    2015-01-01

    One of the most complex issues in the cloud computing environment is the problem of resource allocation: on one hand, the cloud provider expects the most profitability, and on the other hand, users expect to have the best resources at their disposal given their budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is economic, using economic methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game theory mechanism and holding a repetitive game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and repetitions and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium point, where players are no longer inclined to alter their bid for that resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used; the proposed model is simulated in CloudSim and the results are compared with previous work. It is concluded that this method converges to a response in a shorter time, provides the lowest service level agreement violations and provides the most utility to the provider.

  19. Asymmetric programming: a highly reliable metadata allocation strategy for MLC NAND flash memory-based sensor systems.

    PubMed

    Huang, Min; Liu, Zhaoqing; Qiao, Liyan

    2014-01-01

    While the NAND flash memory is widely used as the storage medium in modern sensor systems, the aggressive shrinking of process geometry and an increase in the number of bits stored in each memory cell will inevitably degrade the reliability of NAND flash memory. In particular, it's critical to enhance metadata reliability, which occupies only a small portion of the storage space, but maintains the critical information of the file system and the address translations of the storage system. Metadata damage will cause the system to crash or a large amount of data to be lost. This paper presents Asymmetric Programming, a highly reliable metadata allocation strategy for MLC NAND flash memory storage systems. Our technique exploits for the first time the property of the multi-page architecture of MLC NAND flash memory to improve the reliability of metadata. The basic idea is to keep metadata in most significant bit (MSB) pages which are more reliable than least significant bit (LSB) pages. Thus, we can achieve relatively low bit error rates for metadata. Based on this idea, we propose two strategies to optimize address mapping and garbage collection. We have implemented Asymmetric Programming on a real hardware platform. The experimental results show that Asymmetric Programming can achieve a reduction in the number of page errors of up to 99.05% with the baseline error correction scheme.
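
    The placement rule at the heart of the strategy described above (keep metadata in the more reliable MSB pages and ordinary data in LSB pages) can be sketched as follows. The page-type layout is a simplifying assumption; real MLC devices pair MSB and LSB pages according to the vendor's scheme, and this is not the authors' implementation:

        def classify_pages(pages_per_block):
            # Assumption for illustration: odd page indices are MSB pages, even are LSB pages.
            return {p: ("MSB" if p % 2 else "LSB") for p in range(pages_per_block)}

        def allocate_page(free_pages, page_types, is_metadata):
            # Prefer the more reliable MSB pages for metadata; use LSB pages for ordinary data.
            preferred = "MSB" if is_metadata else "LSB"
            for p in sorted(free_pages):
                if page_types[p] == preferred:
                    free_pages.remove(p)
                    return p
            return free_pages.pop() if free_pages else None   # fall back when exhausted

        types = classify_pages(8)
        free = set(range(8))
        print(allocate_page(free, types, is_metadata=True))    # an MSB page (index 1)
        print(allocate_page(free, types, is_metadata=False))   # an LSB page (index 0)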

  20. Dynamic resource allocation engine for cloud-based real-time video transcoding in mobile cloud computing environments

    NASA Astrophysics Data System (ADS)

    Adedayo, Bada; Wang, Qi; Alcaraz Calero, Jose M.; Grecos, Christos

    2015-02-01

    The recent explosion in video-related Internet traffic has been driven by the widespread use of smart mobile devices, particularly smartphones with advanced cameras that are able to record high-quality videos. Although many of these devices offer the facility to record videos at different spatial and temporal resolutions, primarily with local storage considerations in mind, most users only ever use the highest quality settings. The vast majority of these devices are optimised for compressing the acquired video using a single built-in codec and have neither the computational resources nor battery reserves to transcode the video to alternative formats. This paper proposes a new low-complexity dynamic resource allocation engine for cloud-based video transcoding services that are both scalable and capable of being delivered in real-time. Firstly, through extensive experimentation, we establish resource requirement benchmarks for a wide range of transcoding tasks. The set of tasks investigated covers the most widely used input formats (encoder type, resolution, amount of motion and frame rate) associated with mobile devices and the most popular output formats derived from a comprehensive set of use cases, e.g. a mobile news reporter directly transmitting videos to the TV audience of various video format requirements, with minimal usage of resources both at the reporter's end and at the cloud infrastructure end for transcoding services.

  1. Asymmetric programming: a highly reliable metadata allocation strategy for MLC NAND flash memory-based sensor systems.

    PubMed

    Huang, Min; Liu, Zhaoqing; Qiao, Liyan

    2014-01-01

    While the NAND flash memory is widely used as the storage medium in modern sensor systems, the aggressive shrinking of process geometry and an increase in the number of bits stored in each memory cell will inevitably degrade the reliability of NAND flash memory. In particular, it's critical to enhance metadata reliability, which occupies only a small portion of the storage space, but maintains the critical information of the file system and the address translations of the storage system. Metadata damage will cause the system to crash or a large amount of data to be lost. This paper presents Asymmetric Programming, a highly reliable metadata allocation strategy for MLC NAND flash memory storage systems. Our technique exploits for the first time the property of the multi-page architecture of MLC NAND flash memory to improve the reliability of metadata. The basic idea is to keep metadata in most significant bit (MSB) pages which are more reliable than least significant bit (LSB) pages. Thus, we can achieve relatively low bit error rates for metadata. Based on this idea, we propose two strategies to optimize address mapping and garbage collection. We have implemented Asymmetric Programming on a real hardware platform. The experimental results show that Asymmetric Programming can achieve a reduction in the number of page errors of up to 99.05% with the baseline error correction scheme. PMID:25310473

  2. Allocation of environmental loads in recycling: a model based in the qualitative value of recycled material

    NASA Astrophysics Data System (ADS)

    Ferreira, Jose V. R.; Domingos, Idalina; Antunes, Paula

    2001-02-01

    When the life cycle of a product, material or service studied in a Life Cycle Assessment (LCA) affects other life cycles not included in the system under analysis, it is necessary to apply allocation rules. Allocation can be defined, in the LCA context, as the act of assigning the environmental loads of a system to the functions of that system in proportionate shares. Most of the available allocation methods are of generic application and do not take into account the quality of the materials to be recycled, and the few that do take it into account attribute to it a quantitative value. Methods that are both generic and scientifically correct are generally difficult to apply, while rules that are too simple for case-by-case application could lead to life cycle assessment studies that cannot be compared. The best option is therefore a set of methods that are simple yet fair and mathematically correct, with an application driven primarily by the sector of economic activity concerned. In this study an allocation method is proposed which takes into account primarily the qualitative value of the material to be recycled, irrespective of whether it is a post-consumption product/material or a secondary material from a production process. Through a case study of the recycling of products from the wood and wood-derivative industries, the potential application of the proposed method is demonstrated in comparison with other allocation methods.

  3. Dynamic bandwidth allocation algorithms for local storage based VoD delivery: Comparison between single and dual receiver configurations

    NASA Astrophysics Data System (ADS)

    Abeywickrama, Sandu; Wong, Elaine

    2015-02-01

    The benefits of using distributed caching servers to optimize the traditional video-on-demand delivery have been extensively discussed in literature. In our previous work, we introduced a dual-receiver based dynamic bandwidth allocation algorithm to improve video-on-demand services using a local storage placed within the access network. The main drawback of this algorithm lies in the additional power consumption at the optical network unit that arises from using two receivers. In this paper, we present two novel single-receiver based dynamic bandwidth allocation algorithms to further optimize local storage aided video-on-demand over passive optical networks. The quality-of-service and power performances of the algorithms are critically analyzed using packet level simulations and formulation of power consumption models. We show that the energy-efficiency of a local storage based video-on-demand scheme can be increased without compromising the quality-of-service by the use of single receiver algorithms. Further, we compare the two newly introduced algorithms against dual-receiver based and without local storage schemes to find the most appropriate bandwidth allocation algorithm for local storage based video-on-demand delivery over passive optical networks.

  4. Priority-based time-slot allocation in wireless body area networks during medical emergency situations: an evolutionary game-theoretic perspective.

    PubMed

    Misra, Sudip; Sarkar, Subhadeep

    2015-03-01

    In critical medical emergency situations, wireless body area network (WBAN) equipped health monitoring systems treat data packets with critical information regarding patients' health in the same way as data packets bearing regular healthcare information. This snag results in a higher average waiting time for the local data processing units (LDPUs) transmitting data packets of higher importance. In this paper, we formulate an algorithm for Priority-based Allocation of Time Slots (PATS) that considers a fitness parameter characterizing the criticality of health data that a packet carries, energy consumption rate for a transmitting LDPU, and other crucial LDPU properties. Based on this fitness parameter, we design the constant model hawk-dove game that ensures prioritizing the LDPUs based on crucial properties. In comparison with the existing works on priority-based wireless transmission, we measure and take into consideration the urgency, seriousness, and criticality associated with an LDPU and, thus, allocate transmission time slots proportionately. We show that the number of transmitting LDPUs in medical emergency situations can be reduced by 25.97%, in comparison with the existing time-division-based techniques. PMID:24686307
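
    A much-simplified view of the allocation idea described above is to score each LDPU by a fitness value built from the criticality of its data and its energy-consumption rate, and to share transmission slots in proportion to that score. The weights, formula and data below are hypothetical; the paper's hawk-dove game mechanism is not reproduced here:

        def fitness(criticality, energy_rate, w_crit=0.7, w_energy=0.3):
            # Hypothetical fitness: favour critical data, penalise high energy drain.
            return w_crit * criticality + w_energy * (1.0 - energy_rate)

        def allocate_slots(ldpus, total_slots):
            # Share transmission slots in proportion to each LDPU's fitness score.
            scores = {name: fitness(c, e) for name, (c, e) in ldpus.items()}
            total = sum(scores.values())
            return {name: round(total_slots * s / total) for name, s in scores.items()}

        # (criticality, normalised energy-consumption rate) per LDPU
        ldpus = {"patient_A": (0.9, 0.4), "patient_B": (0.3, 0.2), "patient_C": (0.6, 0.8)}
        print(allocate_slots(ldpus, total_slots=20))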

  5. Priority-based time-slot allocation in wireless body area networks during medical emergency situations: an evolutionary game-theoretic perspective.

    PubMed

    Misra, Sudip; Sarkar, Subhadeep

    2015-03-01

    In critical medical emergency situations, wireless body area network (WBAN) equipped health monitoring systems treat data packets with critical information regarding patients' health in the same way as data packets bearing regular healthcare information. This snag results in a higher average waiting time for the local data processing units (LDPUs) transmitting data packets of higher importance. In this paper, we formulate an algorithm for Priority-based Allocation of Time Slots (PATS) that considers a fitness parameter characterizing the criticality of health data that a packet carries, energy consumption rate for a transmitting LDPU, and other crucial LDPU properties. Based on this fitness parameter, we design the constant model hawk-dove game that ensures prioritizing the LDPUs based on crucial properties. In comparison with the existing works on priority-based wireless transmission, we measure and take into consideration the urgency, seriousness, and criticality associated with an LDPU and, thus, allocate transmission time slots proportionately. We show that the number of transmitting LDPUs in medical emergency situations can be reduced by 25.97%, in comparison with the existing time-division-based techniques.

  6. 45 CFR 402.31 - Determination of allocations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... State Allocations § 402.31 Determination of allocations. (a) Allocation formula. Allocations will be computed according to a formula using the following factors and weights: (1) 50 percent based on the...

  7. Budgeting based on need: a model to determine sub-national allocation of resources for health services in Indonesia

    PubMed Central

    2012-01-01

    Background Allocating national resources to regions based on need is a key policy issue in most health systems. Many systems utilise proxy measures of need as the basis for allocation formulae. Increasingly these are underpinned by complex statistical methods to separate need from supplier induced utilisation. Assessment of need is then used to allocate existing global budgets to geographic areas. Many low and middle income countries are beginning to use formula methods for funding; however, these attempts are often hampered by a lack of information on utilisation, relative needs and whether the budgets allocated bear any relationship to cost. An alternative is to develop bottom-up estimates of the cost of providing for local need. This method is viable where public funding is focused on a relatively small number of targeted services. We describe a bottom-up approach to developing a formula for the allocation of resources. The method is illustrated in the context of the state minimum service package mandated to be provided by the Indonesian public health system. Methods A standardised costing methodology was developed that is sensitive to the main expected drivers of local cost variation, including demographic structure, epidemiology and location. Essential package costing is often undertaken at a country level; it is less usual to utilise the methods across different parts of a country in a way that takes account of variation in population needs and location. Costing was based on best clinical practice in Indonesia and province-specific data on the distribution and costs of facilities. The resulting model was used to estimate essential package costs in a representative district in each province of the country. Findings Substantial differences were found in the costs of providing basic services, ranging from USD 15 in urban Yogyakarta to USD 48 in sparsely populated North Maluku. These costs are driven largely by the structure of the population, particularly numbers of births

  8. Activity-based resource allocation: a system for predicting nursing costs.

    PubMed

    Crockett, M J; DiBlasi, M; Flaherty, P; Sampson, K

    1997-01-01

    As hospital-based managers are being confronted with changing patterns of reimbursement, ranging from revenue generating to cost management, it is imperative that hospitals know the exact nursing costs associated with the actual care delivered to specific patients. Nursing care has traditionally been bundled into the room rate for patients. This approach is extremely limiting when facilities are negotiating per diem rates and capitated rate contracts. At Braintree Hospital Rehabilitation Network, the nursing department has developed and implemented an activity-based management system to determine the actual cost of nursing care provided to each patient. This approach, which differentiates nursing costs accurately by diagnostic group and by intensity of nursing care, has contributed to the hospital's success in negotiating individual patient contracts with insurers in the managed care environment that increasingly focuses on costs and outcomes. Another result has been to enhance the accuracy of the network's cost accounting system. PMID:9416189

  9. Planning for emission reduction credit allocation in Naval base closure and realignment actions in the San Francisco Bay region

    SciTech Connect

    Peoples, C.; Kannapel, P.; Heroy-Rogalski, K.

    1997-12-31

    Several Naval bases in the California San Francisco Bay Area are closing due to the Defense Base Closure and Realignment Acts recently passed by the US Congress. These were home to significant manufacturing and repair facilities that, when fully operating, generated over 100 tons per year of nitrogen oxide and ozone precursor compound emissions. As the bases close, these emissions are dropping, and there is an opportunity to gain credit for them through emissions banking. In order to distribute these emission reductions to meet both the needs of the government and the local redevelopment objectives, an allocation plan was developed. The allocation plan included generating emission reduction credits from the shutdown of permitted air emission sources. This paper highlights the benefits of early planning for emission reductions and subsequent allocation of those emission reductions in the case of the closure of Naval facilities in the San Francisco Bay Area. It illustrates that early planning can ensure that the needs of the military and local community are satisfied and that the highest quantity of bankable emission reductions are identified, and properly quantified and documented, and that emission reduction credits (ERCs) are generated. This paper also presents lessons learned from the experience of banking emissions from closing Navy facilities.

  10. A QoS aware resource allocation strategy for 3D A/V streaming in OFDMA based wireless systems.

    PubMed

    Chung, Young-Uk; Choi, Yong-Hoon; Park, Suwon; Lee, Hyukjoon

    2014-01-01

    Three-dimensional (3D) video is expected to be a "killer app" for OFDMA-based broadband wireless systems. The main limitation of 3D video streaming over a wireless system is the shortage of radio resources due to the large size of the 3D traffic. This paper presents a novel resource allocation strategy to address this problem. In the paper, the video-plus-depth 3D traffic type is considered. The proposed resource allocation strategy focuses on the relationship between 2D video and the depth map, handling them with different priorities. It is formulated as an optimization problem and is solved using a suboptimal heuristic algorithm. Numerical results show that the proposed scheme provides a better quality of service compared to conventional schemes.
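
    The priority relationship between the 2D video and the depth map can be illustrated with a greedy stand-in for the paper's heuristic: satisfy every user's 2D-video demand before spending resource blocks on depth-map data. The demands and block counts below are invented for illustration:

        def allocate_blocks(users, total_blocks):
            # Greedy sketch: the 2D video layer has strict priority over the depth map.
            alloc = {u: {"video": 0, "depth": 0} for u in users}
            remaining = total_blocks
            for layer in ("video", "depth"):
                for u, demand in users.items():
                    grant = min(demand[layer], remaining)
                    alloc[u][layer] = grant
                    remaining -= grant
                    if remaining == 0:
                        return alloc
            return alloc

        # Per-user demands in resource blocks (hypothetical numbers)
        users = {"u1": {"video": 6, "depth": 3}, "u2": {"video": 5, "depth": 2}}
        print(allocate_blocks(users, total_blocks=13))   # depth maps share what is left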

  11. Metro-access integrated network based on optical OFDMA with dynamic sub-carrier allocation and power distribution.

    PubMed

    Zhang, Chongfu; Zhang, Qiongli; Chen, Chen; Jiang, Ning; Liu, Deming; Qiu, Kun; Liu, Shuang; Wu, Baojian

    2013-01-28

    We propose and demonstrate a novel optical orthogonal frequency-division multiple access (OFDMA)-based metro-access integrated network with dynamic resource allocation. It consists of a single fiber OFDMA ring and many single fiber OFDMA trees, which transparently integrates metropolitan area networks with optical access networks. The single fiber OFDMA ring connects the core network and the central nodes (CNs), the CNs are on demand reconfigurable and use multiple orthogonal sub-carriers to realize parallel data transmission and dynamic resource allocation, meanwhile, they can also implement flexible power distribution. The remote nodes (RNs) distributed in the user side are connected by the single fiber OFDMA trees with the corresponding CN. The obtained results indicate that our proposed metro-access integrated network is feasible and the power distribution is agile.

  12. 76 FR 21418 - Fiscal Year 2011 Allocation of Additional Tariff-Rate Quota Volume for Raw Cane Sugar and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-15

    ... 6763 (60 FR 1007). On April 11, 2011, The Secretary of Agriculture announced an additional in-quota... Sugar and Reallocation of Unused Fiscal Year 2011 Tariff-Rate Quota Volume for Raw Cane Sugar AGENCY... Fiscal Year (FY) 2011 in-quota quantity of the tariff-rate quota (TRQ) for imported raw cane sugar and...

  13. 77 FR 25012 - Fiscal Year 2012 Allocation of Additional Tariff-Rate Quota Volume for Raw Cane Sugar and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-26

    ... Representative under Presidential Proclamation 6763 (60 FR 1007). On April 19, 2012, the Secretary of Agriculture... Sugar and Reallocation of Unused Fiscal Year 2012 Tariff-Rate Quota Volume for Raw Cane Sugar AGENCY... Fiscal Year (FY) 2012 in-quota quantity of the tariff-rate quota (TRQ) for imported raw cane sugar and...

  14. A New Subcarrier Allocation Strategy for MIMO-OFDMA Multicellular Networks Based on Cooperative Interference Mitigation

    PubMed Central

    Gkonis, Panagiotis K.; Seimeni, Maria A.; Asimakis, Nikolaos P.; Kaklamani, Dimitra I.; Venieris, Iakovos S.

    2014-01-01

    The goal of the study presented in this paper is to investigate the performance of a new subcarrier allocation strategy for Orthogonal Frequency Division Multiple Access (OFDMA) multicellular networks which employ Multiple Input Multiple Output (MIMO) architecture. For this reason, a hybrid system-link level simulator has been developed executing independent Monte Carlo (MC) simulations in parallel. Up to two tiers of cells around the central cell are taken into consideration and increased loading per cell. The derived results indicate that this strategy can provide up to 12% capacity gain for 16-QAM modulation and two tiers of cells around the central cell in a symmetric 2 × 2 MIMO configuration. This gain is derived when comparing the proposed strategy to the traditional approach of allocating subcarriers that maximize only the desired user's signal. PMID:24683351
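
    The reported gain comes from choosing subcarriers with co-channel interference from neighbouring cells in mind rather than maximizing only the desired user's signal. The toy comparison below contrasts the two selection rules; the gain and interference figures are invented and this is not the simulator used in the study:

        def best_subcarrier_signal_only(gains):
            # Traditional rule: pick the subcarrier with the strongest desired signal.
            return max(range(len(gains)), key=lambda k: gains[k])

        def best_subcarrier_interference_aware(gains, interference, noise=1e-3):
            # Cooperative rule: pick the subcarrier with the best signal-to-interference ratio.
            return max(range(len(gains)), key=lambda k: gains[k] / (interference[k] + noise))

        gains        = [0.9, 0.7, 0.5]    # desired-link channel gain per subcarrier
        interference = [0.8, 0.05, 0.3]   # co-channel interference per subcarrier
        print(best_subcarrier_signal_only(gains))                       # 0
        print(best_subcarrier_interference_aware(gains, interference))  # 1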

  15. [Spatial optimum allocation of shelter-forest types in Three Gorges Reservoir Area based on multiple objective grey situation decision].

    PubMed

    Ma, Hao; Zhou, Zhi-xiang; Wang, Peng-cheng; Wu, Chang-guang; Xiao, Wen-fa

    2010-12-01

    Based on the 2007 Landsat TM images and the dominant environmental factors of shelter forest, the forest sites in the Three Gorges Reservoir Area were classified, and, by using a multiple objective grey situation decision model with three indices (water conservation amount, biomass, and stand productivity), the spatial allocation of the present four kinds of shelter forest (coniferous forest, broadleaf forest, mixed coniferous-broadleaf forest, and shrub) in the Area was optimized. The forest sites in the Area in 2007 could be classified into 40 types, and after the optimization of the spatial allocation, the proportions of coniferous forest, broadleaf forest, mixed coniferous-broadleaf forest, and shrub would be 32.55%, 29.43%, 34.95%, and 3.07%, respectively. Compared with the situation before optimization, the proportions of coniferous forest and shrub after optimization were reduced by 8.79% and 28.55%, while those of broadleaf forest and mixed coniferous-broadleaf forest were increased by 10.23% and 27.11%, respectively. After the optimization of the spatial allocation, the amount of water conservation, biomass, and stand productivity of the shelter forests in the Area would be increased by 14.09 x 10^8 m^3, 0.35 x 10^8 t, and 1.08 x 10^6 t, respectively.

  16. Optimization model for the allocation of water resources based on the maximization of employment in the agriculture and industry sectors

    NASA Astrophysics Data System (ADS)

    Habibi Davijani, M.; Banihabib, M. E.; Nadjafzadeh Anvar, A.; Hashemi, S. R.

    2016-02-01

    In many discussions, the work force is mentioned as the most important factor of production; principally, it is a factor which can, to a large extent, compensate for the physical and material limitations and shortcomings of other factors and thereby help increase the production level. On the other hand, employment is considered an effective factor in social issues. The goal of the present research is the allocation of water resources so as to maximize the number of jobs created in the industry and agriculture sectors. An objective that has attracted the attention of policy makers involved in water supply and distribution is the maximization of the interests of beneficiaries and consumers under the policies adopted. The present model applies the particle swarm optimization (PSO) algorithm to determine the optimum amount of water allocated to each water-demanding sector, the area under cultivation, agricultural production, employment in the agriculture sector, industrial production and employment in the industry sector. Based on the results obtained from this research, by optimally allocating water resources in the central desert region of Iran, 1096 jobs can be created in the industry and agriculture sectors, which constitutes an improvement of about 13% relative to the previous situation (non-optimal water utilization). It is also worth mentioning that by optimizing the employment factor as a social parameter, other areas such as the economic sector are influenced as well. For example, in this investigation, the resulting economic benefits (incomes) improve from 73 billion Rials at baseline employment figures to 112 billion Rials under the optimized employment condition. Therefore, it is necessary to change the inter-sector and intra-sector water allocation models in this region, because this change not only leads to more jobs in this area, but also improves the region's economic conditions.
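
    The optimization loop described above can be sketched with a bare-bones particle swarm that maximizes a toy employment objective under a total-supply constraint. The jobs-per-unit coefficients, constraint handling and PSO parameters are all hypothetical and much simpler than the model in the paper:

        import random

        def pso_water_allocation(jobs_per_unit, total_water, n_particles=20, iters=200):
            dims = len(jobs_per_unit)

            def employment(x):
                x = [max(0.0, v) for v in x]
                scale = min(1.0, total_water / (sum(x) + 1e-9))   # enforce the water budget
                return sum(j * v * scale for j, v in zip(jobs_per_unit, x))

            pos = [[random.uniform(0, total_water) for _ in range(dims)] for _ in range(n_particles)]
            vel = [[0.0] * dims for _ in range(n_particles)]
            pbest = [p[:] for p in pos]
            gbest = max(pbest, key=employment)
            for _ in range(iters):
                for i in range(n_particles):
                    for d in range(dims):
                        r1, r2 = random.random(), random.random()
                        vel[i][d] = (0.7 * vel[i][d]
                                     + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                                     + 1.5 * r2 * (gbest[d] - pos[i][d]))
                        pos[i][d] += vel[i][d]
                    if employment(pos[i]) > employment(pbest[i]):
                        pbest[i] = pos[i][:]
                gbest = max(pbest, key=employment)
            return gbest, employment(gbest)

        # Hypothetical jobs created per million m^3 allocated to agriculture vs. industry
        print(pso_water_allocation(jobs_per_unit=[3.5, 5.0], total_water=100.0))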

  17. Comparison of Ground-Based and Airborne Function Allocation Concepts for NextGen Using Human-In-The-Loop Simulations

    NASA Technical Reports Server (NTRS)

    Wing, David J.; Prevot, Thomas; Murdoch, Jennifer L.; Cabrall, Christopher D.; Homola, Jeffrey R.; Martin, Lynne H.; Mercer, Joey S.; Hoadley, Sherwood T.; Wilson, Sara R.; Hubbs, Clay E.; Chamberlain, James P.; Chartrand, Ryan C.; Consiglio, Maria C.; Palmer, Michael T.

    2010-01-01

    Investigation of function allocation for the Next Generation Air Transportation System is being conducted by the National Aeronautics and Space Administration (NASA). To provide insight on comparability of different function allocations for separation assurance, two human-in-the-loop simulation experiments were conducted on homogeneous airborne and ground-based approaches to four-dimensional trajectory-based operations, one referred to as ground-based automated separation assurance (ground-based) and the other as airborne trajectory management with self-separation (airborne). In the coordinated simulations at NASA's Ames and Langley Research Centers, controllers for the ground-based concept at Ames and pilots for the airborne concept at Langley managed the same traffic scenarios using the two different concepts. The common scenarios represented a significant increase in airspace demand over current operations. Using common independent variables, the simulations varied traffic density, scheduling constraints, and the timing of trajectory change events. Common metrics were collected to enable a comparison of relevant results. Where comparisons were possible, no substantial differences in performance or operator acceptability were observed. Mean schedule conformance and flight path deviation were considered adequate for both approaches. Conflict detection warning times and resolution times were mostly adequate, but certain conflict situations were detected too late to be resolved in a timely manner. This led to some situations in which safety was compromised and/or workload was rated as being unacceptable in both experiments. Operators acknowledged these issues in their responses and ratings but gave generally positive assessments of the respective concept and operations they experienced. Future studies will evaluate technical improvements and procedural enhancements to achieve the required level of safety and acceptability and will investigate the integration of

  18. 15 CFR 700.11 - Priority ratings.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... OF INDUSTRY AND SECURITY, DEPARTMENT OF COMMERCE NATIONAL SECURITY INDUSTRIAL BASE REGULATIONS DEFENSE PRIORITIES AND ALLOCATIONS SYSTEM Industrial Priorities § 700.11 Priority ratings. (a) Levels...

  19. Soluble carbohydrate allocation to roots, photosynthetic rate of leaves, and nitrate assimilation as affected by nitrogen stress and irradiance

    NASA Technical Reports Server (NTRS)

    Henry, L. T.; Raper, C. D. Jr

    1991-01-01

    Upon resupply of exogenous nitrogen to nitrogen-stressed plants, uptake rate of nitrogen is enhanced relative to nonstressed plants. Absorption of nitrogen presumably is dependent on availability of carbohydrates in the roots. A buildup in soluble carbohydrates thus should occur in roots of nitrogen-stressed plants, and upon resupply of exogenous nitrogen the increased uptake rate should be accompanied by a rapid decline in carbohydrates to prestress levels. To evaluate this relationship, three sets of tobacco plants growing in a complete hydroponic solution containing 1.0 mM NO3- were either continued in the complete solution for 21 d, transferred to a minus-nitrogen solution for 21 d, or transferred to a minus-nitrogen solution for 8-9 d and then returned to the 1.0 mM NO3- solution. These nitrogen treatments were imposed upon plants growing at photosynthetic photon flux densities of 700 and 350 micromoles m-2 s-1. Soluble carbohydrate levels in roots increased during onset of nitrogen stress to levels that were fourfold greater than in roots of non-stressed plants. Following resupply of external nitrogen, a rapid resumption of nitrogen uptake was accompanied by a decline in soluble carbohydrates in roots to levels characteristic of nonstressed plants. This pattern of soluble carbohydrate levels in roots during onset of and recovery from nitrogen stress occurred at both irradiance levels. The response of net photosynthetic rate to nitrogen stress could be expressed as a nonlinear function of concentration of reduced nitrogen in leaves. The net photosynthetic rate at a given concentration of reduced nitrogen, however, averaged 10% less at the lower than at the higher irradiance. The decline in net photosynthetic rate per unit of reduced nitrogen in leaves at the lower irradiance was accompanied by an increase in the nitrate fraction of total nitrogen in leaves from 20% at the higher irradiance to 38% at the lower irradiance.

  20. Soluble carbohydrate allocation to roots, photosynthetic rate of leaves, and nitrate assimilation as affected by nitrogen stress and irradiance.

    PubMed

    Henry, L T; Raper, C D

    1991-03-01

    Upon resupply of exogenous nitrogen to nitrogen-stressed plants, uptake rate of nitrogen is enhanced relative to nonstressed plants. Absorption of nitrogen presumably is dependent on availability of carbohydrates in the roots. A buildup in soluble carbohydrates thus should occur in roots of nitrogen-stressed plants, and upon resupply of exogenous nitrogen the increased uptake rate should be accompanied by a rapid decline in carbohydrates to prestress levels. To evaluate this relationship, three sets of tobacco plants growing in a complete hydroponic solution containing 1.0 mM NO3- were either continued in the complete solution for 21 d, transferred to a minus-nitrogen solution for 21 d, or transferred to a minus-nitrogen solution for 8-9 d and then returned to the 1.0 mM NO3- solution. These nitrogen treatments were imposed upon plants growing at photosynthetic photon flux densities of 700 and 350 micromoles m-2 s-1. Soluble carbohydrate levels in roots increased during onset of nitrogen stress to levels that were fourfold greater than in roots of non-stressed plants. Following resupply of external nitrogen, a rapid resumption of nitrogen uptake was accompanied by a decline in soluble carbohydrates in roots to levels characteristic of nonstressed plants. This pattern of soluble carbohydrate levels in roots during onset of and recovery from nitrogen stress occurred at both irradiance levels. The response of net photosynthetic rate to nitrogen stress could be expressed as a nonlinear function of concentration of reduced nitrogen in leaves. The net photosynthetic rate at a given concentration of reduced nitrogen, however, averaged 10% less at the lower than at the higher irradiance. The decline in net photosynthetic rate per unit of reduced nitrogen in leaves at the lower irradiance was accompanied by an increase in the nitrate fraction of total nitrogen in leaves from 20% at the higher irradiance to 38% at the lower irradiance.

  1. Mental workload prediction based on attentional resource allocation and information processing.

    PubMed

    Xiao, Xu; Wanyan, Xiaoru; Zhuang, Damin

    2015-01-01

    Mental workload is an important component in complex human-machine systems. The limited applicability of empirical workload measures produces the need for workload modeling and prediction methods. In the present study, a mental workload prediction model is built on the basis of attentional resource allocation and information processing to ensure pilots' accuracy and speed in understanding large amounts of flight information on the cockpit display interface. Validation with an empirical study of an abnormal attitude recovery task showed that this model's prediction of mental workload highly correlated with experimental results. This mental workload prediction model provides a new tool for optimizing human factors interface design and reducing human errors. PMID:26406085

  2. Base-Rate Neglect as a Function of Base Rates in Probabilistic Contingency Learning

    ERIC Educational Resources Information Center

    Kutzner, Florian; Freytag, Peter; Vogel, Tobias; Fiedler, Klaus

    2008-01-01

    When humans predict criterion events based on probabilistic predictors, they often lend excessive weight to the predictor and insufficient weight to the base rate of the criterion event. In an operant analysis, using a matching-to-sample paradigm, Goodie and Fantino (1996) showed that humans exhibit base-rate neglect when predictors are associated…

  3. Optimality-based modeling of nitrogen allocation and photoacclimation in photosynthesis

    NASA Astrophysics Data System (ADS)

    Armstrong, Robert A.

    2006-03-01

    The ability to predict phytoplankton growth rates under light and nutrient limitation is fundamental to modeling the ocean carbon cycle. Equally fundamental is the ability to predict chlorophyll:carbon ratios, since satellite-based chlorophyll estimates are one of the few data sets to which model output can be compared globally. Because the Geider et al. [Geider, R.J., MacIntyre, H.L., Kana, T.M., 1998. A dynamic regulatory model of phytoplanktonic acclimation to light, nutrients, and temperature. Limnology and Oceanography 43, 679-694] model addresses both desiderata, it has become the model of choice for representing photosynthesis in a wide range of ecosystem models. The Geider et al. [Geider, R.J., MacIntyre, H.L., Kana, T.M., 1998. A dynamic regulatory model of phytoplanktonic acclimation to light, nutrients, and temperature. Limnology and Oceanography 43, 679-694] model follows previous models in positing that maximum photosynthetic rate can be reached only when nitrogen cell quota (nitrogen:carbon ratio) reaches a fixed maximum value qN,maxC. Empirically, this assumption is contradicted by the extremely thorough data set of Laws and Bannister [Laws, E.A., Bannister, T.T., 1980. Nutrient- and light-limited growth of Thalassiosira fluviatilis in continuous culture, with implications for phytoplankton growth in the ocean. Limnology and Oceanography 25, 457-473] and by other studies: maximum growth rate does not seem to require maximum nitrogen cell quota. To the extent that existing models do not reflect this key characteristic, they may fail to yield reliable predictions of chlorophyll:carbon ratios as functions of nitrogen:carbon ratios. They also may fail to reflect differences in growth rates of competing phytoplankton species, an essential feature of state-of-the-art ecosystem models used in biogeochemistry simulations. In the present paper I replace the nitrogen limitation function of previous models by one that does not require maximum nitrogen cell

  4. Improved waiting-list outcomes in Argentina after the adoption of a model for end-stage liver disease-based liver allocation policy.

    PubMed

    Cejas, Nora Gabriela; Villamil, Federico G; Lendoire, Javier C; Tagliafichi, Viviana; Lopez, Arturo; Krogh, Daniela Hansen; Soratti, Carlos A; Bisigniano, Liliana

    2013-07-01

    In July 2005, Argentina became the first country after the United States to introduce the Model for End-Stage Liver Disease (MELD) for organ allocation. In this study, we investigated waiting-list (WL) outcomes (n = 3272) and post-liver transplantation (LT) survival in 2 consecutive periods of 5 years before and after the implementation of a MELD-based allocation policy. Data were obtained from the database of the national institute for organ allocation in Argentina. After the adoption of the MELD system, there were significant reductions in WL mortality [28.5% versus 21.9%, P < 0.001, hazard ratio (HR) = 1.57, 95% confidence interval (CI) = 1.37-1.81] and total dropout rates (38.6% versus 29.1%, P < 0.001, HR = 1.31, 95% CI = 1.16-1.48) despite significantly less LT accessibility (57.4% versus 50.7%, P < 0.001, HR = 1.53, 95% CI = 1.39-1.68). The annual number of deaths per 1000 patient-years at risk decreased from 273 in 2005 to 173 in 2010, and the number of LT procedures per 1000 patient-years at risk decreased from 564 to 422. MELD and Model for End-Stage Liver Disease-Sodium scores were excellent predictors of 3-month WL mortality with c statistics of 0.828 and 0.857, respectively (P < 0.001). No difference was observed in 1-year posttransplant survival between the 2 periods (81.1% versus 81.3%). Although patients with a MELD score > 30 had lower posttransplant survival, the global accuracy of the score for predicting outcomes was poor, as indicated by a c statistic of only 0.523. Patients with granted MELD exceptions (158 for hepatocellular carcinoma and 52 for other reasons) had significantly higher access to LT (80.4%) in comparison with nonexception patients with equivalent listing priority (MELD score = 18-25; 54.6%, P < 0.001, HR = 0.49, 95% CI = 0.40-0.61). In conclusion, the adoption of the MELD model in Argentina has resulted in improved liver organ allocation without compromising

  5. Statistical mechanics of competitive resource allocation using agent-based models

    NASA Astrophysics Data System (ADS)

    Chakraborti, Anirban; Challet, Damien; Chatterjee, Arnab; Marsili, Matteo; Zhang, Yi-Cheng; Chakrabarti, Bikas K.

    2015-01-01

Demand outstrips available resources in most situations, which gives rise to competition, interaction and learning. In this article, we review a broad spectrum of multi-agent models of competition (El Farol Bar problem, Minority Game, Kolkata Paise Restaurant problem, Stable marriage problem, Parking space problem and others) and the methods used to understand them analytically. We emphasize the power of concepts and tools from statistical mechanics to understand and explain fully collective phenomena such as phase transitions and long memory, and the mapping between agent heterogeneity and physical disorder. As these methods can be applied to any large-scale model of competitive resource allocation made up of heterogeneous adaptive agents with non-linear interactions, they provide a prospective unifying paradigm for many scientific disciplines.

  6. PIYAS-proceeding to intelligent service oriented memory allocation for flash based data centric sensor devices in wireless sensor networks.

    PubMed

    Rizvi, Sanam Shahla; Chung, Tae-Sun

    2010-01-01

Flash memory has become a more widespread storage medium for modern wireless devices because of its effective characteristics like non-volatility, small size, light weight, fast access speed, shock resistance, high reliability and low power consumption. Sensor nodes are highly resource constrained in terms of limited processing speed, runtime memory, persistent storage, communication bandwidth and finite energy. Therefore, for wireless sensor networks supporting sense, store, merge and send schemes, an efficient and reliable file system is highly required with consideration of sensor node constraints. In this paper, we propose a novel log structured external NAND flash memory based file system, called Proceeding to Intelligent service oriented memorY Allocation for flash based data centric Sensor devices in wireless sensor networks (PIYAS). This is the extended version of our previously proposed PIYA [1]. The main goals of the PIYAS scheme are to achieve instant mounting and reduced SRAM space by keeping memory mapping information to a very small size, and to provide high query response throughput by allocating memory to the sensor data according to network business rules. The scheme intelligently samples and stores the raw data and provides high in-network data availability by keeping the aggregate data for a longer period of time than any other scheme has done before. We propose effective garbage collection and wear-leveling schemes as well. The experimental results show that PIYAS is an optimized memory management scheme allowing high performance for wireless sensor networks.
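
    The following is a minimal, self-contained sketch (not the authors' implementation) of the two ideas a log-structured flash file system such as PIYAS relies on: append-only writes with a small in-RAM mapping, and garbage collection that rewrites only live data. All sizes, thresholds and names here are hypothetical.

      # Toy log-structured store: writes are appended, a small in-RAM map points to
      # the latest copy of each key, and garbage collection rewrites only live data.
      # All sizes and names are illustrative, not PIYAS parameters.

      class ToyLogStore:
          def __init__(self, gc_threshold=0.5):
              self.log = []          # append-only "flash" log of (key, value) records
              self.index = {}        # key -> position of newest record (kept small in RAM)
              self.gc_threshold = gc_threshold

          def write(self, key, value):
              self.index[key] = len(self.log)   # older copies become stale, not erased
              self.log.append((key, value))
              if len(self.index) / max(len(self.log), 1) < self.gc_threshold:
                  self._garbage_collect()

          def read(self, key):
              return self.log[self.index[key]][1]

          def _garbage_collect(self):
              # copy only the newest copy of each key into a fresh log ("erase" stale blocks)
              live = sorted(self.index.items(), key=lambda kv: kv[1])
              self.log = [(k, self.log[pos][1]) for k, pos in live]
              self.index = {k: i for i, (k, _) in enumerate(self.log)}

      store = ToyLogStore()
      for t in range(10):
          store.write("sensor_7", 20.0 + t)     # repeated samples for one sensor
      print(store.read("sensor_7"))             # -> 29.0, from the newest record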

  7. PIYAS-Proceeding to Intelligent Service Oriented Memory Allocation for Flash Based Data Centric Sensor Devices in Wireless Sensor Networks

    PubMed Central

    Rizvi, Sanam Shahla; Chung, Tae-Sun

    2010-01-01

Flash memory has become a more widespread storage medium for modern wireless devices because of its effective characteristics like non-volatility, small size, light weight, fast access speed, shock resistance, high reliability and low power consumption. Sensor nodes are highly resource constrained in terms of limited processing speed, runtime memory, persistent storage, communication bandwidth and finite energy. Therefore, for wireless sensor networks supporting sense, store, merge and send schemes, an efficient and reliable file system is highly required with consideration of sensor node constraints. In this paper, we propose a novel log structured external NAND flash memory based file system, called Proceeding to Intelligent service oriented memorY Allocation for flash based data centric Sensor devices in wireless sensor networks (PIYAS). This is the extended version of our previously proposed PIYA [1]. The main goals of the PIYAS scheme are to achieve instant mounting and reduced SRAM space by keeping memory mapping information to a very small size, and to provide high query response throughput by allocating memory to the sensor data according to network business rules. The scheme intelligently samples and stores the raw data and provides high in-network data availability by keeping the aggregate data for a longer period of time than any other scheme has done before. We propose effective garbage collection and wear-leveling schemes as well. The experimental results show that PIYAS is an optimized memory management scheme allowing high performance for wireless sensor networks. PMID:22315541

  8. PIYAS-proceeding to intelligent service oriented memory allocation for flash based data centric sensor devices in wireless sensor networks.

    PubMed

    Rizvi, Sanam Shahla; Chung, Tae-Sun

    2010-01-01

Flash memory has become a more widespread storage medium for modern wireless devices because of its effective characteristics like non-volatility, small size, light weight, fast access speed, shock resistance, high reliability and low power consumption. Sensor nodes are highly resource constrained in terms of limited processing speed, runtime memory, persistent storage, communication bandwidth and finite energy. Therefore, for wireless sensor networks supporting sense, store, merge and send schemes, an efficient and reliable file system is highly required with consideration of sensor node constraints. In this paper, we propose a novel log structured external NAND flash memory based file system, called Proceeding to Intelligent service oriented memorY Allocation for flash based data centric Sensor devices in wireless sensor networks (PIYAS). This is the extended version of our previously proposed PIYA [1]. The main goals of the PIYAS scheme are to achieve instant mounting and reduced SRAM space by keeping memory mapping information to a very small size, and to provide high query response throughput by allocating memory to the sensor data according to network business rules. The scheme intelligently samples and stores the raw data and provides high in-network data availability by keeping the aggregate data for a longer period of time than any other scheme has done before. We propose effective garbage collection and wear-leveling schemes as well. The experimental results show that PIYAS is an optimized memory management scheme allowing high performance for wireless sensor networks. PMID:22315541

  9. A group-based tasks allocation algorithm for the optimization of long leave opportunities in academic departments

    NASA Astrophysics Data System (ADS)

    Eyono Obono, S. D.; Basak, Sujit Kumar

    2011-12-01

The general formulation of the assignment problem consists in the optimal allocation of a given set of tasks to a workforce. This problem is covered by existing literature for different domains such as distributed databases, distributed systems, transportation, packet radio networks, IT outsourcing, and teaching allocation. This paper presents a new version of the assignment problem for the allocation of academic tasks to staff members in departments with long leave opportunities. It describes a workload allocation scheme and its algorithm for allocating an equitable number of tasks in academic departments where long leaves are necessary.
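
    As a hedged illustration of the general idea (not the algorithm proposed in the paper), the sketch below assigns tasks one by one to the currently least-loaded staff member who is not on long leave; all names and numbers are hypothetical.

      # Illustrative equitable allocation: tasks go round-robin to the currently
      # least-loaded staff member who is not on leave. Names are hypothetical and
      # this is not the algorithm proposed in the paper.

      def allocate_tasks(tasks, staff, on_leave):
          load = {s: 0 for s in staff if s not in on_leave}
          assignment = {s: [] for s in load}
          for task in tasks:
              target = min(load, key=load.get)      # least-loaded available member
              assignment[target].append(task)
              load[target] += 1
          return assignment

      courses = [f"course_{i}" for i in range(7)]
      print(allocate_tasks(courses, ["ann", "bob", "eve", "kim"], on_leave={"eve"}))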

  10. Acoustically based fetal heart rate monitor

    NASA Technical Reports Server (NTRS)

    Baker, Donald A.; Zuckerwar, Allan J.

    1991-01-01

The acoustically based fetal heart rate monitor permits an expectant mother to perform the fetal Non-Stress Test in her home. The potential market would include the one million U.S. pregnancies per year requiring this type of prenatal surveillance. The monitor uses polyvinylidene fluoride (PVF2) piezoelectric polymer film for the acoustic sensors, which are mounted in a seven-element array on a cummerbund. Evaluation of the sensor output signals utilizes a digital signal processor, which performs a linear prediction routine in real time. Clinical tests reveal that the acoustically based monitor provides Non-Stress Test records which are comparable to those obtained with a commercial ultrasonic transducer.

  11. Constant time worker thread allocation via configuration caching

    DOEpatents

    Eichenberger, Alexandre E; O'Brien, John K. P.

    2014-11-04

    Mechanisms are provided for allocating threads for execution of a parallel region of code. A request for allocation of worker threads to execute the parallel region of code is received from a master thread. Cached thread allocation information identifying prior thread allocations that have been performed for the master thread are accessed. Worker threads are allocated to the master thread based on the cached thread allocation information. The parallel region of code is executed using the allocated worker threads.
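
    A rough sketch of the caching idea, assuming a simplified setting: the first request for a parallel region computes a worker-thread allocation and caches it keyed by master thread and region, so later requests are served from the cache. This is only an illustration, not the patented mechanism; all names are hypothetical.

      # Sketch of configuration caching for worker-thread allocation: the first
      # request for a parallel region computes an allocation and caches it keyed by
      # (master thread, region); later requests reuse the cached configuration.
      # This only illustrates the idea, not the patented mechanism.

      from concurrent.futures import ThreadPoolExecutor

      _allocation_cache = {}   # (master_id, region_id) -> number of worker threads

      def workers_for(master_id, region_id, requested, available):
          key = (master_id, region_id)
          if key not in _allocation_cache:                  # slow path: compute once
              _allocation_cache[key] = min(requested, available)
          return _allocation_cache[key]                     # fast path: cached lookup

      def run_parallel_region(master_id, region_id, chunks):
          n = workers_for(master_id, region_id, requested=len(chunks), available=4)
          with ThreadPoolExecutor(max_workers=n) as pool:
              return list(pool.map(sum, chunks))

      print(run_parallel_region(master_id=0, region_id="loop1",
                                chunks=[[1, 2], [3, 4], [5, 6]]))  # -> [3, 7, 11]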

  12. Performance impact of mutation operators of a subpopulation-based genetic algorithm for multi-robot task allocation problems.

    PubMed

    Liu, Chun; Kroll, Andreas

    2016-01-01

Multi-robot task allocation determines the task sequence and distribution for a group of robots in multi-robot systems; it is a constrained combinatorial optimization problem that becomes more complex in the case of cooperative tasks, because they introduce additional spatial and temporal constraints. To solve multi-robot task allocation problems with cooperative tasks efficiently, a subpopulation-based genetic algorithm, a crossover-free genetic algorithm employing mutation operators and elitism selection in each subpopulation, is developed in this paper. Moreover, the impact of mutation operators (swap, insertion, inversion, displacement, and their various combinations) is analyzed when solving several industrial plant inspection problems. The experimental results show that: (1) the proposed genetic algorithm can obtain better solutions than the tested binary tournament genetic algorithm with partially mapped crossover; (2) inversion mutation performs better than other tested mutation operators when solving problems without cooperative tasks, and the swap-inversion combination performs better than other tested mutation operators/combinations when solving problems with cooperative tasks. As it is difficult to produce all desired effects with a single mutation operator, using multiple mutation operators (including both inversion and swap) is suggested when solving similar combinatorial optimization problems.
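
    The four permutation mutations named in the abstract (swap, insertion, inversion, displacement) can be illustrated on a task-sequence chromosome as follows; these are generic textbook versions, not necessarily the exact implementations used by the authors.

      # The four permutation mutations named above, applied to a task sequence;
      # index choices are random, and this is only a generic illustration.
      import random

      def swap(seq):
          seq = seq[:]
          i, j = random.sample(range(len(seq)), 2)
          seq[i], seq[j] = seq[j], seq[i]
          return seq

      def insertion(seq):
          seq = seq[:]
          i, j = random.sample(range(len(seq)), 2)
          seq.insert(j, seq.pop(i))
          return seq

      def inversion(seq):
          i, j = sorted(random.sample(range(len(seq)), 2))
          return seq[:i] + seq[i:j + 1][::-1] + seq[j + 1:]

      def displacement(seq):
          i, j = sorted(random.sample(range(len(seq)), 2))
          segment, rest = seq[i:j + 1], seq[:i] + seq[j + 1:]
          k = random.randint(0, len(rest))
          return rest[:k] + segment + rest[k:]

      tasks = ["inspect_A", "inspect_B", "inspect_C", "inspect_D", "inspect_E"]
      for op in (swap, insertion, inversion, displacement):
          print(op.__name__, op(tasks))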

  13. Performance impact of mutation operators of a subpopulation-based genetic algorithm for multi-robot task allocation problems.

    PubMed

    Liu, Chun; Kroll, Andreas

    2016-01-01

Multi-robot task allocation determines the task sequence and distribution for a group of robots in multi-robot systems; it is a constrained combinatorial optimization problem that becomes more complex in the case of cooperative tasks, because they introduce additional spatial and temporal constraints. To solve multi-robot task allocation problems with cooperative tasks efficiently, a subpopulation-based genetic algorithm, a crossover-free genetic algorithm employing mutation operators and elitism selection in each subpopulation, is developed in this paper. Moreover, the impact of mutation operators (swap, insertion, inversion, displacement, and their various combinations) is analyzed when solving several industrial plant inspection problems. The experimental results show that: (1) the proposed genetic algorithm can obtain better solutions than the tested binary tournament genetic algorithm with partially mapped crossover; (2) inversion mutation performs better than other tested mutation operators when solving problems without cooperative tasks, and the swap-inversion combination performs better than other tested mutation operators/combinations when solving problems with cooperative tasks. As it is difficult to produce all desired effects with a single mutation operator, using multiple mutation operators (including both inversion and swap) is suggested when solving similar combinatorial optimization problems. PMID:27588254

  14. Application of portfolio theory to risk-based allocation of surveillance resources in animal populations.

    PubMed

    Prattley, D J; Morris, R S; Stevenson, M A; Thornton, R

    2007-09-14

    Distribution of finite levels of resources between multiple competing tasks can be a challenging problem. Resources need to be distributed across time periods and geographic locations to increase the probability of detection of a disease incursion or significant change in disease pattern. Efforts should focus primarily on areas and populations where risk factors for a given disease reach relatively high levels. In order to target resources into these areas, the overall risk level can be evaluated periodically across locations to create a dynamic national risk landscape. Methods are described to integrate the levels of various risk factors into an overall risk score for each area, to account for the certainty or variability around those measures and then to allocate surveillance resources across this risk landscape. In addition to targeting resources into high risk areas, surveillance continues in lower risk areas where there is a small yet positive chance of disease occurrence. In this paper we describe the application of portfolio theory concepts, routinely used in finance, to design surveillance portfolios for a series of examples. The appropriate level of resource investment is chosen for each disease or geographical area and time period given the degree of disease risk and uncertainty present. PMID:17509705
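
    A hedged sketch of the flavor of such a risk-based portfolio: each area's share of the surveillance budget grows with its mean risk score and shrinks with the uncertainty around that score, while every area keeps a small baseline share. The weighting scheme and all numbers are hypothetical, not the authors' method.

      # Illustrative risk-weighted allocation in a portfolio spirit: shares grow with
      # mean risk, shrink with uncertainty, and every area keeps a floor allocation.
      # All weights, areas and figures are hypothetical.

      def allocate_surveillance(budget, areas, floor_share=0.2, risk_aversion=1.0):
          """areas: {name: (mean_risk, variance_of_risk)} -> {name: funds}."""
          scores = {a: max(m - risk_aversion * v, 0.0) for a, (m, v) in areas.items()}
          total = sum(scores.values()) or 1.0
          floor = budget * floor_share / len(areas)     # keep watching low-risk areas
          rest = budget * (1.0 - floor_share)
          return {a: floor + rest * scores[a] / total for a in areas}

      areas = {"north": (0.8, 0.10), "coast": (0.4, 0.30), "plains": (0.1, 0.05)}
      print(allocate_surveillance(100.0, areas))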

  15. Application of portfolio theory to risk-based allocation of surveillance resources in animal populations.

    PubMed

    Prattley, D J; Morris, R S; Stevenson, M A; Thornton, R

    2007-09-14

    Distribution of finite levels of resources between multiple competing tasks can be a challenging problem. Resources need to be distributed across time periods and geographic locations to increase the probability of detection of a disease incursion or significant change in disease pattern. Efforts should focus primarily on areas and populations where risk factors for a given disease reach relatively high levels. In order to target resources into these areas, the overall risk level can be evaluated periodically across locations to create a dynamic national risk landscape. Methods are described to integrate the levels of various risk factors into an overall risk score for each area, to account for the certainty or variability around those measures and then to allocate surveillance resources across this risk landscape. In addition to targeting resources into high risk areas, surveillance continues in lower risk areas where there is a small yet positive chance of disease occurrence. In this paper we describe the application of portfolio theory concepts, routinely used in finance, to design surveillance portfolios for a series of examples. The appropriate level of resource investment is chosen for each disease or geographical area and time period given the degree of disease risk and uncertainty present.

  16. Region-of-interest based rate control for UAV video coding

    NASA Astrophysics Data System (ADS)

    Zhao, Chun-lei; Dai, Ming; Xiong, Jing-ying

    2016-05-01

To meet the requirement of high-quality transmission of videos captured by unmanned aerial vehicles (UAV) with low bandwidth, a novel rate control (RC) scheme based on region-of-interest (ROI) is proposed. First, the ROI information is sent to the encoder with the latest High Efficiency Video Coding (HEVC) standard to generate an ROI map. Then, by using the ROI map, bit allocation methods are developed at the frame level and the large coding unit (LCU) level, to avoid inaccurate bit allocation produced by camera movement. Finally, by using a more robust R-λ model, the quantization parameter (QP) for each LCU is calculated. The experimental results show that the proposed RC method achieves a lower bit-rate error and higher quality for the reconstructed video by choosing appropriate pixel weights on the HEVC platform.
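
    A minimal sketch of ROI-weighted bit allocation followed by an R-λ style QP mapping, assuming one frame split into a few LCUs. The ROI weight and the model parameters alpha and beta are placeholders, and the QP-from-lambda constants are values commonly cited for HEVC rate control, not figures taken from this paper.

      # Sketch of ROI-weighted bit allocation plus an R-lambda style QP mapping for
      # one frame. Weights, alpha and beta are placeholders; c1/c2 are commonly cited
      # HEVC rate-control defaults, not values from this paper.
      import math

      def allocate_frame_bits(frame_bits, lcus_in_roi, roi_weight=3.0):
          # lcus_in_roi: list of booleans, True if the LCU overlaps the region of interest
          weights = [roi_weight if in_roi else 1.0 for in_roi in lcus_in_roi]
          total = sum(weights)
          return [frame_bits * w / total for w in weights]

      def qp_from_bits(bits, pixels, alpha=3.2, beta=-1.367, c1=4.2005, c2=13.7122):
          bpp = bits / pixels
          lam = alpha * (bpp ** beta)          # R-lambda model: lambda = alpha * bpp^beta
          return round(c1 * math.log(lam) + c2)

      bits = allocate_frame_bits(frame_bits=40_000, lcus_in_roi=[True, True, False, False])
      print([qp_from_bits(b, pixels=64 * 64) for b in bits])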

  17. Constrained Allocation Flux Balance Analysis.

    PubMed

    Mori, Matteo; Hwa, Terence; Martin, Olivier C; De Martino, Andrea; Marinari, Enzo

    2016-06-01

New experimental results on bacterial growth inspire a novel top-down approach to study cell metabolism, combining mass balance and proteomic constraints to extend and complement Flux Balance Analysis. We introduce here Constrained Allocation Flux Balance Analysis, CAFBA, in which the biosynthetic costs associated with growth are accounted for in an effective way through a single additional genome-wide constraint. Its roots lie in the experimentally observed pattern of proteome allocation for metabolic functions, allowing regulation and metabolism to be bridged in a transparent way under the principle of growth-rate maximization. We provide a simple method to solve CAFBA efficiently and propose an "ensemble averaging" procedure to account for unknown protein costs. Applying this approach to modeling E. coli metabolism, we find that, as the growth rate increases, CAFBA solutions cross over from respiratory, growth-yield maximizing states (preferred at slow growth) to fermentative states with carbon overflow (preferred at fast growth). In addition, CAFBA allows for quantitatively accurate predictions on the rate of acetate excretion and growth yield based on only 3 parameters determined by empirical growth laws.
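
    A toy illustration of the central idea, adding one genome-wide allocation constraint (a proteome budget on weighted fluxes) to a miniature flux balance problem solved as a linear program. The stoichiometry, cost weights and bounds below are invented for illustration; this is not the CAFBA formulation or its E. coli parameterization.

      # Toy flux balance problem with one extra genome-wide "proteome allocation"
      # constraint: maximize growth subject to mass balance and sum_i w_i * v_i <= phi.
      # All numbers are made up; this is not the CAFBA model.
      import numpy as np
      from scipy.optimize import linprog

      # fluxes v = [uptake, fermentation, respiration, growth]
      S = np.array([[1.0, -1.0, -1.0, -1.0],     # carbon balance at steady state
                    [0.0,  2.0, 10.0, -5.0]])    # energy balance (hypothetical yields)
      w = np.array([0.0, 0.01, 0.05, 0.02])      # proteome cost per unit flux (hypothetical)
      phi = 0.25                                  # available proteome budget

      res = linprog(c=[0, 0, 0, -1.0],            # maximize the growth flux
                    A_ub=[w], b_ub=[phi],         # single genome-wide allocation constraint
                    A_eq=S, b_eq=[0.0, 0.0],
                    bounds=[(0, 10), (0, None), (0, None), (0, None)],
                    method="highs")
      print(np.round(res.x, 3), round(-res.fun, 3))   # fluxes and maximal growth rate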

  18. Constrained Allocation Flux Balance Analysis.

    PubMed

    Mori, Matteo; Hwa, Terence; Martin, Olivier C; De Martino, Andrea; Marinari, Enzo

    2016-06-01

New experimental results on bacterial growth inspire a novel top-down approach to study cell metabolism, combining mass balance and proteomic constraints to extend and complement Flux Balance Analysis. We introduce here Constrained Allocation Flux Balance Analysis, CAFBA, in which the biosynthetic costs associated with growth are accounted for in an effective way through a single additional genome-wide constraint. Its roots lie in the experimentally observed pattern of proteome allocation for metabolic functions, allowing regulation and metabolism to be bridged in a transparent way under the principle of growth-rate maximization. We provide a simple method to solve CAFBA efficiently and propose an "ensemble averaging" procedure to account for unknown protein costs. Applying this approach to modeling E. coli metabolism, we find that, as the growth rate increases, CAFBA solutions cross over from respiratory, growth-yield maximizing states (preferred at slow growth) to fermentative states with carbon overflow (preferred at fast growth). In addition, CAFBA allows for quantitatively accurate predictions on the rate of acetate excretion and growth yield based on only 3 parameters determined by empirical growth laws. PMID:27355325

  19. Constrained Allocation Flux Balance Analysis

    PubMed Central

    Mori, Matteo; Hwa, Terence; Martin, Olivier C.

    2016-01-01

New experimental results on bacterial growth inspire a novel top-down approach to study cell metabolism, combining mass balance and proteomic constraints to extend and complement Flux Balance Analysis. We introduce here Constrained Allocation Flux Balance Analysis, CAFBA, in which the biosynthetic costs associated with growth are accounted for in an effective way through a single additional genome-wide constraint. Its roots lie in the experimentally observed pattern of proteome allocation for metabolic functions, allowing regulation and metabolism to be bridged in a transparent way under the principle of growth-rate maximization. We provide a simple method to solve CAFBA efficiently and propose an “ensemble averaging” procedure to account for unknown protein costs. Applying this approach to modeling E. coli metabolism, we find that, as the growth rate increases, CAFBA solutions cross over from respiratory, growth-yield maximizing states (preferred at slow growth) to fermentative states with carbon overflow (preferred at fast growth). In addition, CAFBA allows for quantitatively accurate predictions on the rate of acetate excretion and growth yield based on only 3 parameters determined by empirical growth laws. PMID:27355325

  20. Energy-efficient orthogonal frequency division multiplexing-based passive optical network based on adaptive sleep-mode control and dynamic bandwidth allocation

    NASA Astrophysics Data System (ADS)

    Zhang, Chongfu; Xiao, Nengwu; Chen, Chen; Yuan, Weicheng; Qiu, Kun

    2016-02-01

We propose an energy-efficient orthogonal frequency division multiplexing-based passive optical network (OFDM-PON) using adaptive sleep-mode control and dynamic bandwidth allocation. In this scheme, a bidirectional-centralized algorithm named the receiver and transmitter accurate sleep control and dynamic bandwidth allocation (RTASC-DBA), which has an overall bandwidth scheduling policy, is employed to enhance the energy efficiency of the OFDM-PON. The RTASC-DBA algorithm is used in an optical line terminal (OLT) to control the sleep mode of an optical network unit (ONU) and to guarantee the quality of service of the different services of the OFDM-PON. The obtained results show that, by using the proposed scheme, the average power consumption of the ONU is reduced by ~40% when the normalized ONU load is less than 80%, compared with the average power consumption without using the proposed scheme.

  1. Water consumption and allocation strategies along the river oases of Tarim River based on large-scale hydrological modelling

    NASA Astrophysics Data System (ADS)

    Yu, Yang; Disse, Markus; Yu, Ruide

    2016-04-01

With a main stem of 1,321 km located in an arid area in northwest China, the Tarim River is China's longest inland river. The Tarim basin on the northern edge of the Taklamakan desert is an extremely arid region. In this region, agricultural water consumption and allocation management are crucial to address the conflicts among irrigation water users from upstream to downstream. In 2011, the German Ministry of Science and Education (BMBF) established the Sino-German SuMaRiO project for the sustainable management of river oases along the Tarim River. The project aims to contribute to a sustainable land management which explicitly takes into account ecosystem functions and ecosystem services. SuMaRiO will identify realizable management strategies, considering social, economic and ecological criteria. This will have positive effects for nearly 10 million inhabitants of different ethnic groups. The modelling of water consumption and allocation strategies is a core block in the SuMaRiO cluster. A large-scale hydrological model (MIKE HYDRO Basin) was established for the purpose of sustainable agricultural water management in the main stem Tarim River. MIKE HYDRO Basin is an integrated, multipurpose, map-based decision support tool for river basin analysis, planning and management. It provides detailed simulation results concerning water resources and land use in the catchment areas of the river. Calibration data and future predictions based on a large amount of data were acquired. The results of model calibration indicated a close correlation between simulated and observed values. Scenarios with changes in irrigation strategies and land use distributions were investigated. Irrigation scenarios revealed that the available irrigation water has significant and varying effects on the yields of different crops. Irrigation water saving could reach up to 40% in the water-saving irrigation scenario. Land use scenarios illustrated that an increase of farmland area in the

  2. Blocking Probability of a Preemption-Based Bandwidth-Allocation Scheme for Service Differentiation in OBS Networks

    NASA Astrophysics Data System (ADS)

    Phuritatkul, Jumpot; Ji, Yusheng; Zhang, Yongbing

    2006-08-01

For the next generation optical Internet, optical burst switching (OBS) is considered a promising solution to exploit the capacity provided by wavelength-division-multiplexing technology. In this paper, the authors analyze a preemption-based bandwidth-allocation (PBA) scheme for supporting service differentiation in OBS networks. They first present a mathematical analysis of the burst blocking probability (BBP) for a general probabilistic wavelength-preemption algorithm. The BBP of a new arrival burst for a K-channel N-class system is presented. They then apply this model to PBA. The results of the analytical loss model are investigated and compared with simulations. The simulation results validate their analytical model and show that the BBP can be controlled for different service classes with the PBA scheme.

  3. Proto-object based rate control for JPEG2000: an approach to content-based scalability.

    PubMed

    Xue, Jianru; Li, Ce; Zheng, Nanning

    2011-04-01

The JPEG2000 system provides scalability with respect to quality, resolution and color component in the transfer of images. However, scalability with respect to semantic content is still lacking. We propose a biologically plausible salient-region-based bit allocation mechanism within the JPEG2000 codec for the purpose of augmenting scalability with respect to semantic content. First, an input image is segmented into several salient proto-objects (a region that possibly contains a semantically meaningful physical object) and background regions (a region that contains no object of interest) by modeling visual focus of attention on salient proto-objects. Then, a novel rate control scheme distributes a target bit rate to each individual region according to its saliency, and constructs quality layers of proto-objects for the purpose of more precise truncation comparable to original quality layers in the standard. Empirical results show that the suggested approach adds to the JPEG2000 system scalability with respect to content as well as the functionality of selectively encoding, decoding, and manipulating each individual proto-object in the image, with only slight modifications to the JPEG2000 standard. Furthermore, the proposed rate control approach efficiently reduces the computational complexity and memory usage, and maintains the quality of the image at a level comparable to the conventional post-compression rate distortion (PCRD) optimum truncation algorithm for JPEG2000.

  4. Designing a robust feature extraction method based on optimum allocation and principal component analysis for epileptic EEG signal classification.

    PubMed

    Siuly, Siuly; Li, Yan

    2015-04-01

    The aim of this study is to design a robust feature extraction method for the classification of multiclass EEG signals to determine valuable features from original epileptic EEG data and to discover an efficient classifier for the features. An optimum allocation based principal component analysis method named as OA_PCA is developed for the feature extraction from epileptic EEG data. As EEG data from different channels are correlated and huge in number, the optimum allocation (OA) scheme is used to discover the most favorable representatives with minimal variability from a large number of EEG data. The principal component analysis (PCA) is applied to construct uncorrelated components and also to reduce the dimensionality of the OA samples for an enhanced recognition. In order to choose a suitable classifier for the OA_PCA feature set, four popular classifiers: least square support vector machine (LS-SVM), naive bayes classifier (NB), k-nearest neighbor algorithm (KNN), and linear discriminant analysis (LDA) are applied and tested. Furthermore, our approaches are also compared with some recent research work. The experimental results show that the LS-SVM_1v1 approach yields 100% of the overall classification accuracy (OCA), improving up to 7.10% over the existing algorithms for the epileptic EEG data. The major finding of this research is that the LS-SVM with the 1v1 system is the best technique for the OA_PCA features in the epileptic EEG signal classification that outperforms all the recent reported existing methods in the literature.
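
    A generic sketch of the pipeline shape described above (subsample, reduce with PCA, classify with an SVM) on synthetic data. The random column subsampling is only a crude stand-in for the paper's optimum-allocation step, and all dimensions and parameters are arbitrary.

      # Generic subsample -> PCA -> classifier pipeline on synthetic data; the random
      # subsampling below merely stands in for the optimum-allocation step and the
      # parameters are arbitrary, not the paper's OA_PCA procedure.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.svm import SVC
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 256))            # 300 epochs x 256 samples ("EEG")
      y = rng.integers(0, 3, size=300)           # three classes
      X[y == 1] += 0.5                           # inject some separable structure
      X[y == 2] -= 0.5

      cols = rng.choice(256, size=64, replace=False)   # crude stand-in for OA sampling
      X_oa = X[:, cols]
      clf = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
      print(cross_val_score(clf, X_oa, y, cv=5).mean())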

  5. Exploring a Model of Study Time Allocation in a Problem-Based Medical Curriculum.

    ERIC Educational Resources Information Center

    Gijselaers, Wim H.; Schmidt, Henk G.

    A study was done of the relation of time for individual study versus instruction time in a non-traditional, problem-based medical curriculum at the University of Limburg (Netherlands). The study collected data on 86 courses conducted in 5 consecutive academic years. In this problem-based approach, each curriculum year in the first 4 years…

  6. 40 CFR 74.26 - Allocation formula.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

Title 40 (Protection of Environment), Sulfur Dioxide Opt-Ins, Allowance Calculations for Combustion Sources, § 74.26 Allocation formula. (a) The Administrator will calculate the annual allowance allocation for a combustion source based on the data...

  7. 39 CFR 3060.12 - Asset allocation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

Title 39 (Postal Service), Competitive Products Enterprise, § 3060.12 Asset allocation. Within 6 months of January 23, 2009, and for each... competitive products enterprise using a method of allocation based on appropriate revenue or cost...

  8. District Allocation of Human Resources Utilizing the Evidence Based Model: A Study of One High Achieving School District in Southern California

    ERIC Educational Resources Information Center

    Lane, Amber Marie

    2013-01-01

    This study applies the Gap Analysis Framework to understand the gaps that exist in human resource allocation of one Southern California school district. Once identified, gaps are closed with the reallocation of human resources, according to the Evidenced Based Model, requiring the re-purposing of core classroom teachers, specialists, special…

  9. Web-based Electronic Sharing and RE-allocation of Assets

    SciTech Connect

    Leverett, Dave; Miller, Robert A.; Berlin, Gary J.

    2002-09-09

The Electronic Asset Sharing Program is a web-based application that provides the capability for complex-wide sharing and reallocation of assets that are excess, under-utilized, or un-utilized. Through a web-based front-end and a supporting hash database with a search engine, users can search for assets that they need, search for assets needed by others, enter assets they need, and enter assets they have available for reallocation. In addition, entire listings of available assets and needed assets can be viewed. The application is written in Java; the hash database and search engine are in Object-oriented Java Database Management (OJDBM). The application will be hosted on an SRS-managed server outside the firewall, and access will be controlled via a protected realm. An example of the application can be viewed at the following (temporary) URL: http://idgdev.srs.gov/servlet/srs.weshare.WeShare

  10. Web-based Electronic Sharing and RE-allocation of Assets

    2002-09-09

The Electronic Asset Sharing Program is a web-based application that provides the capability for complex-wide sharing and reallocation of assets that are excess, under-utilized, or un-utilized. Through a web-based front-end and a supporting hash database with a search engine, users can search for assets that they need, search for assets needed by others, enter assets they need, and enter assets they have available for reallocation. In addition, entire listings of available assets and needed assets can be viewed. The application is written in Java; the hash database and search engine are in Object-oriented Java Database Management (OJDBM). The application will be hosted on an SRS-managed server outside the firewall, and access will be controlled via a protected realm. An example of the application can be viewed at the following (temporary) URL: http://idgdev.srs.gov/servlet/srs.weshare.WeShare

  11. Developing an Agent-based Model for the Depot-based Water Allocation System in the Bakken Field in Western North Dakota

    NASA Astrophysics Data System (ADS)

    Lin, T.; Lin, Z.; Lim, S.; Borders, M.

    2015-12-01

Oil production at the Bakken Shale increased more than tenfold from 2008 to 2013 due to technological advancement in hydraulic fracturing, and North Dakota has been the second largest oil producing state in the U.S., behind only Texas, since 2012. On average it requires about 2-4 million gallons of freshwater to complete one oil well in the Bakken field, and the number of oil well completions (i.e., hydraulic fracturing) in the Bakken field increased from 500 in 2008 to 2085 in 2013. The large quantity of freshwater used for hydraulic fracturing has a significant impact on water resource management in this semi-arid region. A novel water allocation system - water depots - was spontaneously created to distribute surface and ground water for industrial uses. A GIS-based multi-agent model is developed to simulate the emergent patterns and dynamics of the water depot-based water allocation system and to explore its economic and environmental consequences. Four different types of water depot are defined as agents, and water price, climate condition, water source, geology, and other physical and economic constraints are considered in the model. A decentralized optimization algorithm will be used to determine the agents' behaviors. The agent-based model for water depots will be coupled with hydrological models to improve the region's water resources management.

  12. Risk-based decision making for staggered bioterrorist attacks : resource allocation and risk reduction in "reload" scenarios.

    SciTech Connect

    Lemaster, Michelle Nicole; Gay, David M.; Ehlen, Mark Andrew; Boggs, Paul T.; Ray, Jaideep

    2009-10-01

Staggered bioterrorist attacks with aerosolized pathogens on population centers present a formidable challenge to resource allocation and response planning. The response and planning will commence immediately after the detection of the first attack and with little or no information about the second attack. In this report, we outline a method by which resource allocation may be performed. It involves probabilistic reconstruction of the bioterrorist attack from partial observations of the outbreak, followed by an optimization-under-uncertainty approach to perform resource allocations. We consider both single-site and time-staggered multi-site attacks (i.e., a reload scenario) under conditions when resources (personnel and equipment which are difficult to gather and transport) are insufficient. Both communicable (plague) and non-communicable diseases (anthrax) are addressed, and we also consider cases when the data, the time-series of people reporting with symptoms, are confounded with a reporting delay. We demonstrate how our approach develops allocation profiles that have the potential to reduce the probability of an extremely adverse outcome in exchange for a more certain, but less adverse outcome. We explore the effect of placing limits on daily allocations. Further, since our method is data-driven, the resource allocation progressively improves as more data becomes available.

  13. Agent based model of effects of task allocation strategies in flat organizations

    NASA Astrophysics Data System (ADS)

    Sobkowicz, Pawel

    2016-09-01

    A common practice in many organizations is to pile the work on the best performers. It is easy to implement by the management and, despite the apparent injustice, appears to be working in many situations. In our work we present a simple agent based model, constructed to simulate this practice and to analyze conditions under which the overall efficiency of the organization (for example measured by the backlog of unresolved issues) breaks down, due to the cumulative effect of the individual overloads. The model confirms that the strategy mentioned above is, indeed, rational: it leads to better global results than an alternative one, using equal workload distribution among all workers. The presented analyses focus on the behavior of the organizations close to the limit of the maximum total throughput and provide results for the growth of the unprocessed backlog in several situations, as well as suggestions related to avoiding such buildup.
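
    A minimal agent-based sketch comparing the two strategies discussed above: route each new issue to the agent expected to finish it soonest (which naturally piles work on the best performers) versus distributing issues in equal numbers. Speeds, arrival rate and horizon are arbitrary illustrations, not the paper's parameters.

      # Toy comparison of the two workload strategies below the throughput limit:
      # "pile_on_best" sends each issue to the agent with the shortest expected
      # completion time, "even" distributes issues in equal numbers. All numbers
      # are hypothetical and the model is far simpler than the paper's.

      def simulate(strategy, speeds, arrivals_per_step=5, steps=200):
          backlog = [0.0] * len(speeds)            # outstanding issues per agent
          rr = 0
          for _ in range(steps):
              for _ in range(arrivals_per_step):
                  if strategy == "pile_on_best":
                      k = min(range(len(speeds)),
                              key=lambda i: (backlog[i] + 1) / speeds[i])
                  else:                            # "even": equal counts, round robin
                      k = rr % len(speeds)
                      rr += 1
                  backlog[k] += 1.0
              backlog = [max(b - s, 0.0) for b, s in zip(backlog, speeds)]
          return sum(backlog)                      # total unresolved backlog at the end

      speeds = [2.5, 1.5, 1.0, 0.7]                # issues resolved per step per agent
      for strat in ("pile_on_best", "even"):
          print(strat, simulate(strat, speeds))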

  14. FMO-based H.264 frame layer rate control for low bit rate video transmission

    NASA Astrophysics Data System (ADS)

    Cajote, Rhandley D.; Aramvith, Supavadee; Miyanaga, Yoshikazu

    2011-12-01

    The use of flexible macroblock ordering (FMO) in H.264/AVC improves error resiliency at the expense of reduced coding efficiency with added overhead bits for slice headers and signalling. The trade-off is most severe at low bit rates, where header bits occupy a significant portion of the total bit budget. To better manage the rate and improve coding efficiency, we propose enhancements to the H.264/AVC frame layer rate control, which take into consideration the effects of using FMO for video transmission. In this article, we propose a new header bits model, an enhanced frame complexity measure, a bit allocation and a quantization parameter adjustment scheme. Simulation results show that the proposed improvements achieve better visual quality compared with the JM 9.2 frame layer rate control with FMO enabled using a different number of slice groups. Using FMO as an error resilient tool with better rate management is suitable in applications that have limited bandwidth and in error prone environments such as video transmission for mobile terminals.

  15. Microeconomics-based resource allocation in overlay networks by using non-strategic behavior modeling

    NASA Astrophysics Data System (ADS)

    Analoui, Morteza; Rezvani, Mohammad Hossein

    2011-01-01

Behavior modeling has recently been investigated for designing self-organizing mechanisms in the context of communication networks in order to exploit the natural selfishness of the users with the goal of maximizing the overall utility. In strategic behavior modeling, the users of the network are assumed to be game players who seek to maximize their utility while taking into account the decisions that the other players might make. The essential difference between the aforementioned research and this work is that it incorporates non-strategic decisions in order to design the mechanism for the overlay network. In this solution concept, the decisions that a peer might make do not affect the actions of the other peers at all. The consumer-firm theory developed in microeconomics is the model of non-strategic behavior that we have adopted in our research. Based on it, we have presented distributed algorithms for peers' "joining" and "leaving" operations. We have modeled the overlay network as a competitive economy in which the content provided by an origin server can be viewed as a commodity, and the origin server and the peers who multicast the content to their downstream peers are considered as the firms. On the other hand, due to the dual role of the peers in the overlay network, they can be considered as the consumers as well. On joining the overlay economy, each peer is provided with an income and tries to get hold of the service regardless of the behavior of the other peers. We have designed the scalable algorithms in such a way that the existence of an equilibrium price (known as the Walrasian equilibrium price) is guaranteed.
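
    A generic tatonnement-style sketch of how a market-clearing (Walrasian) price can be found when non-strategic consumers demand bandwidth based only on the posted price and their own income. The demand form, incomes and step size are hypothetical, and this is not the authors' distributed algorithm.

      # Generic tatonnement: each non-strategic peer demands bandwidth from price and
      # income alone; the price rises when demand exceeds supply and falls otherwise.
      # Demand form, incomes and step size are hypothetical.

      def demand(price, income, need=10.0):
          return min(need, income / price)           # spend the income, capped by need

      def tatonnement(incomes, supply, price=1.0, step=0.05, iters=500):
          for _ in range(iters):
              excess = sum(demand(price, w) for w in incomes) - supply
              price *= 1.0 + step * excess / supply  # excess demand -> raise price
          return price

      incomes = [4.0, 8.0, 12.0]
      price = tatonnement(incomes, supply=20.0)
      print(round(price, 3), round(sum(demand(price, w) for w in incomes), 2))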

  16. Understanding the impacts of allocation approaches during process-based life cycle assessment of water treatment chemicals.

    PubMed

    Alvarez-Gaitan, Juan P; Peters, Gregory M; Short, Michael D; Schulz, Matthias; Moore, Stephen

    2014-01-01

    Chemicals are an important component of advanced water treatment operations not only in terms of economics but also from an environmental standpoint. Tools such as life cycle assessment (LCA) are useful for estimating the environmental impacts of water treatment operations. At the same time, LCA analysts must manage several fundamental and as yet unresolved methodological challenges, one of which is the question of how best to "allocate" environmental burdens in multifunctional processes. Using water treatment chemicals as a case study example, this article aims to quantify the variability in greenhouse gas emissions estimates stemming from methodological choices made in respect of allocation during LCA. The chemicals investigated and reported here are those most important to coagulation and disinfection processes, and the outcomes are illustrated on the basis of treating 1000 ML of noncoagulated and nondisinfected water. Recent process and economic data for the production of these chemicals is used and methodological alternatives for solving the multifunctionality problem, including system expansion and mass, exergy, and economic allocation, are applied to data from chlor-alkali plants. In addition, Monte Carlo simulation is included to provide a comprehensive picture of the robustness of economic allocation results to changes in the market price of these industrial commodities. For disinfection, results demonstrate that chlorine gas has a lower global warming potential (GWP) than sodium hypochlorite regardless of the technique used to solve allocation issues. For coagulation, when mass or economic allocation is used to solve the multifunctionality problem in the chlor-alkali facility, ferric chloride was found to have a higher GWP than aluminum sulfate and a slightly lower burden where system expansion or exergy allocation are applied instead. Monte Carlo results demonstrate that when economic allocation is used, GWP results were relatively robust and resilient
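
    A toy numerical example of how the allocation basis changes the result: the same plant-level burden is split among chlor-alkali co-products first by mass and then by revenue. All masses, prices and the burden figure are hypothetical.

      # Toy comparison of mass vs. economic allocation of one plant-level burden
      # among chlor-alkali co-products. All figures are hypothetical and only show
      # how the choice of basis shifts the results.

      burden_kg_co2e = 1000.0
      co_products = {                      # name: (mass in kg, price in $/kg), hypothetical
          "chlorine":         (1.00, 0.25),
          "sodium_hydroxide": (1.13, 0.35),
          "hydrogen":         (0.03, 1.50),
      }

      def allocate(burden, products, basis):
          if basis == "mass":
              weights = {k: m for k, (m, p) in products.items()}
          else:                            # economic: weight by revenue = mass * price
              weights = {k: m * p for k, (m, p) in products.items()}
          total = sum(weights.values())
          return {k: burden * w / total for k, w in weights.items()}

      for basis in ("mass", "economic"):
          shares = allocate(burden_kg_co2e, co_products, basis)
          print(basis, {k: round(v, 1) for k, v in shares.items()})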

  17. Understanding the impacts of allocation approaches during process-based life cycle assessment of water treatment chemicals.

    PubMed

    Alvarez-Gaitan, Juan P; Peters, Gregory M; Short, Michael D; Schulz, Matthias; Moore, Stephen

    2014-01-01

    Chemicals are an important component of advanced water treatment operations not only in terms of economics but also from an environmental standpoint. Tools such as life cycle assessment (LCA) are useful for estimating the environmental impacts of water treatment operations. At the same time, LCA analysts must manage several fundamental and as yet unresolved methodological challenges, one of which is the question of how best to "allocate" environmental burdens in multifunctional processes. Using water treatment chemicals as a case study example, this article aims to quantify the variability in greenhouse gas emissions estimates stemming from methodological choices made in respect of allocation during LCA. The chemicals investigated and reported here are those most important to coagulation and disinfection processes, and the outcomes are illustrated on the basis of treating 1000 ML of noncoagulated and nondisinfected water. Recent process and economic data for the production of these chemicals is used and methodological alternatives for solving the multifunctionality problem, including system expansion and mass, exergy, and economic allocation, are applied to data from chlor-alkali plants. In addition, Monte Carlo simulation is included to provide a comprehensive picture of the robustness of economic allocation results to changes in the market price of these industrial commodities. For disinfection, results demonstrate that chlorine gas has a lower global warming potential (GWP) than sodium hypochlorite regardless of the technique used to solve allocation issues. For coagulation, when mass or economic allocation is used to solve the multifunctionality problem in the chlor-alkali facility, ferric chloride was found to have a higher GWP than aluminum sulfate and a slightly lower burden where system expansion or exergy allocation are applied instead. Monte Carlo results demonstrate that when economic allocation is used, GWP results were relatively robust and resilient

  18. Impact of one or two visits strategy on hypertension burden estimation in HYDY, a population-based cross-sectional study: implications for healthcare resource allocation decision making

    PubMed Central

    Modesti, Pietro Amedeo; Rapi, Stefano; Bamoshmoosh, Mohamed; Baldereschi, Marzia; Massetti, Luciano; Padeletti, Luigi; Gensini, Gian Franco; Zhao, Dong; Al-Hidabi, Dawood; Al Goshae, Husni

    2012-01-01

    Context The prevalence of hypertension in developing countries is coming closer to values found in developed countries. However, surveys usually rely on readings taken at a single visit, the option to implement the diagnosis on readings taken at multiple visits, being limited by costs. Objective To estimate more accurately the magnitude and extent of the resource that should be allocated to the prevention of hypertension. Design Population-based cross-sectional survey with triplicate blood pressure (BP) readings taken on two separate home-visits. Setting Rural and urban locations in three areas of Yemen (capital, inland and coast). Participants A nationally representative sample of the Yemen population aged 15–69 years (5063 men and 5179 women), with an overall response rate of 92% in urban and 94% in rural locations. Main outcome measure Hypertension diagnosed as systolic BP ≥140 mm Hg and/or diastolic BP ≥90 mm Hg and/or self-reported use of antihypertensive drugs. Results Hypertension prevalence (age-standardised to the WHO world population 2001) based on fulfilling the same criteria on both visits (11.3%; 95% Cl 10.7% to 11.9%), was 35% lower than estimation based on the first visit (17.3%; 16.5% to 18.0%). Advanced age, blood glucose ≥7 mmol/l or proteinuria ≥1+ at dipstick test at visit one were significant predictors of confirmation at visit 2. The 959 participants found to be hypertensive at visit 1 or at visit 2 only and thus excluded from the final diagnosis had a rate of proteinuria (5.0%; 3.8% to 6.5%) comparable to rates of the general population (6.1%; 5.6% to 6.6%), and of subjects normotensive at both visits (5.6%; 5.1% to 6.2%). Only 1.9% of Yemen population classified at high or very high cardiovascular (CV) risk at visit 1 moved to average, low or moderate CV risk categories after two visits. Conclusions Hypertension prevalence based on readings obtained after two visits is 35% lower than estimation based on the first visit

  19. 9 CFR 391.2 - Base time rate.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

Title 9 (Animals and Animal Products), Food Safety and Inspection Service, Department of Agriculture, Accreditation, § 391.2 Base time rate. The base time rate for inspection services provided pursuant to §§...

  20. 9 CFR 592.510 - Base time rate.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

Title 9 (Animals and Animal Products), Egg Products Inspection, Voluntary Inspection of Egg Products, Fees and Charges, § 592.510 Base time rate. The base time rate for voluntary inspection services for egg products is $47.79 per hour per...

  1. 9 CFR 391.2 - Base time rate.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

Title 9 (Animals and Animal Products), Food Safety and Inspection Service, Department of Agriculture, Accreditation, § 391.2 Base time rate. The base time rate for inspection services provided pursuant to §§...

  2. 9 CFR 592.510 - Base time rate.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

Title 9 (Animals and Animal Products), Egg Products Inspection, Voluntary Inspection of Egg Products, Fees and Charges, § 592.510 Base time rate. The base time rate for voluntary inspection services for egg products is $47.79 per hour per...

  3. A bit allocation method for sparse source coding.

    PubMed

    Kaaniche, Mounir; Fraysse, Aurélia; Pesquet-Popescu, Béatrice; Pesquet, Jean-Christophe

    2014-01-01

    In this paper, we develop an efficient bit allocation strategy for subband-based image coding systems. More specifically, our objective is to design a new optimization algorithm based on a rate-distortion optimality criterion. To this end, we consider the uniform scalar quantization of a class of mixed distributed sources following a Bernoulli-generalized Gaussian distribution. This model appears to be particularly well-adapted for image data, which have a sparse representation in a wavelet basis. In this paper, we propose new approximations of the entropy and the distortion functions using piecewise affine and exponential forms, respectively. Because of these approximations, bit allocation is reformulated as a convex optimization problem. Solving the resulting problem allows us to derive the optimal quantization step for each subband. Experimental results show the benefits that can be drawn from the proposed bit allocation method in a typical transform-based coding application.
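
    For contrast, the classical equal-slope (reverse water-filling) allocation under the high-rate Gaussian model D_i(R_i) = var_i * 2^(-2 R_i) can be written in a few lines; the paper's method instead uses a Bernoulli-generalized-Gaussian model with piecewise approximations, which this sketch does not reproduce.

      # Classical equal-slope bit allocation across subbands under the high-rate
      # Gaussian model, shown only as a baseline for comparison with the paper's
      # convex formulation. Variances and the bit budget are hypothetical.
      import math

      def allocate_bits(variances, total_bits):
          active = list(range(len(variances)))
          while True:
              log_gm = sum(math.log2(variances[i]) for i in active) / len(active)
              avg = total_bits / len(active)
              rates = [0.0] * len(variances)
              for i in active:
                  rates[i] = avg + 0.5 * (math.log2(variances[i]) - log_gm)
              negative = [i for i in active if rates[i] < 0]
              if not negative:                     # all active rates are non-negative
                  return rates
              active = [i for i in active if i not in negative]   # drop and re-solve

      subband_vars = [25.0, 9.0, 1.0, 0.04]
      print([round(r, 2) for r in allocate_bits(subband_vars, total_bits=8.0)])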

  4. Competitive allocation of resources on a network: an agent-based model of air companies competing for the best routes

    NASA Astrophysics Data System (ADS)

    Gurtner, Gérald; Valori, Luca; Lillo, Fabrizio

    2015-05-01

We present a stylized model of the allocation of resources on a network. By considering as a concrete example the network of sectors of the airspace, where each node is a sector characterized by a maximal number of simultaneously present aircraft, we consider the problem of air companies competing for the allocation of the airspace. Each company is characterized by a cost function, weighting punctuality and flight length differently. We consider the model in the presence of pure and mixed populations of types of airline companies, and we study how the equilibria depend on the characteristics of the network.
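
    As a hedged illustration of the kind of cost function described above, the sketch below scores candidate routes by a weighted sum of expected congestion delay and route length, with the weights defining the company type. Routes, sector loads and weights are all hypothetical.

      # Illustrative company cost function: cost = w_delay * expected_delay +
      # w_length * route_length; the weights define the company "type".
      # All routes, loads, capacities and weights are hypothetical.

      def route_cost(route, sector_load, capacity, w_delay, w_length):
          # expected delay grows with congestion in the sectors the route crosses
          delay = sum(max(sector_load[s] - capacity[s], 0) for s in route["sectors"])
          return w_delay * delay + w_length * route["length"]

      sector_load = {"A": 5, "B": 9, "C": 3}
      capacity = {"A": 6, "B": 7, "C": 6}
      routes = [{"name": "short_congested", "sectors": ["A", "B"], "length": 300},
                {"name": "long_clear", "sectors": ["A", "C"], "length": 420}]

      for w_delay, w_length in [(100, 1), (5, 1)]:   # punctuality-driven vs. cost-driven
          best = min(routes,
                     key=lambda r: route_cost(r, sector_load, capacity, w_delay, w_length))
          print((w_delay, w_length), best["name"])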

  5. Development of a web GIS application for emissions inventory spatial allocation based on open source software tools

    NASA Astrophysics Data System (ADS)

    Gkatzoflias, Dimitrios; Mellios, Giorgos; Samaras, Zissis

    2013-03-01

    Combining emission inventory methods and geographic information systems (GIS) remains a key issue for environmental modelling and management purposes. This paper examines the development of a web GIS application as part of an emission inventory system that produces maps and files with spatially allocated emissions in a grid format. The study is not confined to the maps produced but also presents the features and capabilities of a web application that can be used by any user, even without prior knowledge of the GIS field. The development of the application was based on open source software tools such as MapServer for the GIS functions, PostgreSQL and PostGIS for data management, and HTML, PHP and JavaScript as programming languages. In addition, background processes are used in an innovative manner to handle the time-consuming and computationally costly procedures of the application. Furthermore, a web map service was created to provide maps to other clients, such as the Google Maps API v3 that is used as part of the user interface. The output of the application includes maps in vector and raster format, maps with temporal resolution on a daily and hourly basis, grid files that can be used by air quality management systems, and grid files consistent with the European Monitoring and Evaluation Programme grid. Although the system was developed and validated for the Republic of Cyprus, covering a remarkably wide range of pollutants and emission sources, it can easily be customized for use in other countries or smaller areas, as long as geospatial and activity data are available.

  6. A new approach to assessing the water footprint of hydroelectric power based on allocation of water footprints among reservoir ecosystem services

    NASA Astrophysics Data System (ADS)

    Zhao, Dandan; Liu, Junguo

    Hydroelectric power is an important energy source to meet the growing demand for energy, and large amounts of water are consumed to generate this energy. Previous studies often assumed that the water footprint of hydroelectric power equaled the reservoir's water footprint, but failed to allocate the reservoir water footprint among the many beneficiaries; dealing with this allocation remains a challenge. In this study, we developed a new approach to quantify the water footprint of hydroelectric power (WFh) by separating it from the reservoir water footprint (WF) using an allocation coefficient (ηh) based on the ratio of the benefits from hydroelectric power to the total ecosystem service benefits. We used this approach in a case study of the Three Gorges Reservoir, the world's largest reservoir, which provides multiple ecosystem services. We found large differences between the WFh and the water footprint per unit of hydroelectric production (PWFh) calculated using ηh and those calculated without this factor. From 2003 to 2012, ηh decreased sharply (from 0.76 in 2005 to 0.41 in 2012), owing to large increases in the value of non-energy ecosystem services, particularly flood control. In 2009, flood control replaced hydroelectricity as the largest ecosystem service of water from the Three Gorges Reservoir. Using our approach, WFh and PWFh averaged 331.0 × 106 m3 and 1.5 m3 GJ-1, respectively. However, these values would almost double without allocating water footprints among different reservoir ecosystem services. Previous studies have therefore overestimated the WFh and PWFh of reservoirs, especially for reservoirs that serve multiple purposes, and the allocation coefficient should not be ignored when calculating the WF of a product or service.
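    A minimal sketch of the allocation arithmetic described above: the hydroelectric water footprint is the reservoir water footprint scaled by the share of hydropower benefits in total ecosystem-service benefits, and the per-unit footprint divides that by energy produced. All numbers below are invented for illustration, not Three Gorges data.

```python
# WFh = eta_h * WF, with eta_h = hydro benefits / total ecosystem-service benefits.
def hydro_water_footprint(reservoir_wf_m3, benefits_by_service, energy_gj):
    eta_h = benefits_by_service["hydropower"] / sum(benefits_by_service.values())
    wf_h = eta_h * reservoir_wf_m3           # hydroelectric share of the reservoir WF
    pwf_h = wf_h / energy_gj                 # water footprint per unit energy, m3/GJ
    return eta_h, wf_h, pwf_h

benefits = {"hydropower": 4.1e9, "flood_control": 4.5e9, "navigation": 1.2e9}  # hypothetical values
print(hydro_water_footprint(reservoir_wf_m3=8.0e8, benefits_by_service=benefits, energy_gj=2.2e8))
```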

  7. Allocations for HANDI 2000 business management system

    SciTech Connect

    Wilson, D.

    1998-08-24

    The Data Integration 2000 Project will result in an integrated and comprehensive set of functional applications containing core information necessary to support the Project Hanford Management Contract. It is based on a Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. The COTS product solution set, consisting of PassPort and PeopleSoft software, supports finance, supply, chemical management/Material Safety Data Sheets, and human resources. Allocations at Fluor Daniel Hanford are burdens added to base costs using a predetermined rate.

  8. Design and implementation of priority and time-window based traffic scheduling and routing-spectrum allocation mechanism in elastic optical networks

    NASA Astrophysics Data System (ADS)

    Wang, Honghuan; Xing, Fangyuan; Yin, Hongxi; Zhao, Nan; Lian, Bizhan

    2016-02-01

    With the explosive growth of network services, reasonable traffic scheduling and efficient configuration of network resources are important for increasing the efficiency of the network. In this paper, an adaptive traffic scheduling policy based on priority and time window is proposed, and its performance is evaluated in terms of scheduling ratio. Routing and spectrum allocation are achieved by using the Floyd shortest path algorithm and by establishing a node spectrum resource allocation model based on a greedy algorithm that we propose. A fairness index is introduced to improve the spectrum configuration. The results show that the designed traffic scheduling strategy can be applied to networks with multicast and broadcast functionalities and provides real-time, efficient responses, while the node spectrum configuration scheme improves frequency resource utilization and hence overall network efficiency.
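    As a rough illustration of the spectrum-assignment step only: given a route (for example, a Floyd shortest path), a first-fit greedy search picks the first block of contiguous slots that is free on every link of the path, respecting spectrum continuity and contiguity. The node spectrum model and fairness index of the paper are not reproduced; link names and slot counts are hypothetical.

```python
# First-fit (greedy) spectrum assignment along a precomputed route.
def first_fit_slots(path_links, occupancy, demand_slots, total_slots):
    """path_links: list of link ids; occupancy[link] is the set of used slot indices."""
    for start in range(total_slots - demand_slots + 1):
        block = range(start, start + demand_slots)
        if all(s not in occupancy[link] for link in path_links for s in block):
            for link in path_links:              # commit the allocation on every link
                occupancy[link].update(block)
            return list(block)
    return None                                  # request blocked: no contiguous block free

occ = {"A-B": {0, 1}, "B-C": {1, 2}}
print(first_fit_slots(["A-B", "B-C"], occ, demand_slots=2, total_slots=8))   # -> [3, 4]
```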

  9. Divergence in plant and microbial allocation strategies explains continental patterns in microbial allocation and biogeochemical fluxes.

    PubMed

    Averill, Colin

    2014-10-01

    Allocation trade-offs shape ecological and biogeochemical phenomena at local to global scale. Plant allocation strategies drive major changes in ecosystem carbon cycling. Microbial allocation to enzymes that decompose carbon vs. organic nutrients may similarly affect ecosystem carbon cycling. Current solutions to this allocation problem prioritise stoichiometric tradeoffs implemented in plant ecology. These solutions may not maximise microbial growth and fitness under all conditions, because organic nutrients are also a significant carbon resource for microbes. I created multiple allocation frameworks and simulated microbial growth using a microbial explicit biogeochemical model. I demonstrate that prioritising stoichiometric trade-offs does not optimise microbial allocation, while exploiting organic nutrients as carbon resources does. Analysis of continental-scale enzyme data supports the allocation patterns predicted by this framework, and modelling suggests large deviations in soil C loss based on which strategy is implemented. Therefore, understanding microbial allocation strategies will likely improve our understanding of carbon cycling and climate.

  10. Effects of feeder space allocations during rearing, female strain, and feed increase rate from photostimulation to peak egg production on broiler breeder female performance.

    PubMed

    Leksrisompong, N; Romero-Sanchez, H; Oviedo-Rondón, E O; Brake, J

    2014-05-01

    A study was conducted to determine if there were differences in female broiler breeder performance of 2 strains that had been subjected to 2 feeder space allocations during the growing period followed by 2 feeding to peak programs. Ross 308 and 708 pullets were reared with a single feeding program to 23 wk of age and with 2 circumferential feeder space allocations (5.3 cm/female or 7.0 cm/female) and then assigned to 2 feed increase programs (slow or fast) from photostimulation to peak egg production. The flock was moved to the laying house with 8.8 cm/female of female feeder space and photostimulated at 23 wk of age when Ross 344 males were added to create 16 pens with 60 females and 7 males each in a 2 × 2 × 2 design. The fast feed increase program significantly increased female BW at 31 wk of age, which could have contributed to an increased female mortality during the summer weather of early lay. The 708 females with 5.3 cm/female feeder space produced smaller eggs at 28 and 30 wk of age. The 708 females exhibited better fertile hatchability than 308 females due to fewer late dead embryos. There were no differences in egg production, fertility, or fertile hatchability due to the main effects of feeding to peak program or growing feeder space, but the slow feed increase from photostimulation to peak production reduced cumulative mortality.

  11. 15 CFR 336.4 - Allocation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Allocation. 336.4 Section 336.4 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) INTERNATIONAL... § 336.4 Allocation. (a) The Tariff Rate Quota licenses will be issued to eligible manufacturers on...

  12. Computer Processor Allocator

    2004-03-01

    The Compute Processor Allocator (CPA) provides an efficient and reliable mechanism for managing and allotting processors in a massively parallel (MP) computer. It maintains information in a database on the health, configuration, and allocation of each processor. This persistent information is factored into each allocation decision. The CPA runs in a distributed fashion to avoid a single point of failure.

  13. Computationally efficient control allocation

    NASA Technical Reports Server (NTRS)

    Durham, Wayne (Inventor)

    2001-01-01

    A computationally efficient method for calculating near-optimal solutions to the three-objective, linear control allocation problem is disclosed. The control allocation problem is that of distributing the effort of redundant control effectors to achieve some desired set of objectives. The problem is deemed linear if control effectiveness is affine with respect to the individual control effectors. The optimal solution is that which exploits the collective maximum capability of the effectors within their individual physical limits. Computational efficiency is measured by the number of floating-point operations required for solution. The method presented returned optimal solutions in more than 90% of the cases examined; non-optimal solutions returned by the method were typically much less than 1% different from optimal, and the errors tended to become smaller than 0.01% as the number of controls was increased. The magnitude of the errors returned by the present method was much smaller than those that resulted from either pseudoinverse or cascaded generalized inverse solutions. The computational complexity of the method presented varied linearly with increasing numbers of controls; the number of required floating-point operations increased 5.5 to 7 times faster than did the minimum-norm solution (the pseudoinverse), and at about the same rate as did the cascaded generalized inverse solution. The computational requirements of the method presented were much better than those of previously described facet-searching methods, which increase in proportion to the square of the number of controls.
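    For orientation, the sketch below shows only the minimum-norm (pseudoinverse) baseline that the patented method is compared against, not the near-optimal facet-based algorithm itself. The effectiveness matrix, demanded moments, and effector limits are hypothetical.

```python
# Pseudoinverse control allocation: u = B+ d, then crude clipping to effector limits
# (which is exactly what costs this baseline optimality near saturation).
import numpy as np

def pseudoinverse_allocation(B, demanded_moments, lower, upper):
    u = np.linalg.pinv(B) @ demanded_moments     # minimum-norm solution
    return np.clip(u, lower, upper)

B = np.array([[1.0, 0.5, 0.0, -0.5],
              [0.0, 1.0, 1.0,  0.0],
              [0.2, 0.0, 0.3,  1.0]])            # 3 objectives, 4 redundant effectors
d = np.array([0.4, -0.2, 0.1])
print(pseudoinverse_allocation(B, d, lower=-0.5, upper=0.5))
```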

  14. Network coding based joint signaling and dynamic bandwidth allocation scheme for inter optical network unit communication in passive optical networks

    NASA Astrophysics Data System (ADS)

    Wei, Pei; Gu, Rentao; Ji, Yuefeng

    2014-06-01

    As an innovative and promising technology, network coding has been introduced to passive optical networks (PON) in recent years to support inter optical network unit (ONU) communication, yet the signaling process and dynamic bandwidth allocation (DBA) in PON with network coding (NC-PON) still need further study. Thus, we propose a joint signaling and DBA scheme for efficiently supporting differentiated services of inter ONU communication in NC-PON. In the proposed joint scheme, the signaling process lays the foundation for fulfilling network coding in PON; it not only avoids the potential threat to downstream security present in previous schemes but is also suitable for the proposed hybrid dynamic bandwidth allocation (HDBA) scheme. In HDBA, a DBA cycle is divided into two sub-cycles so that different coding, scheduling and bandwidth allocation strategies can be applied to differentiated classes of services. Besides, as the network traffic load varies, the entire upstream transmission window for all REPORT messages slides accordingly, leaving the transmission time of one or two sub-cycles to overlap with the bandwidth allocation calculation time at the optical line terminal (OLT), so that upstream idle time can be efficiently eliminated. Performance evaluation results validate that, compared with the two existing DBA algorithms deployed in NC-PON, HDBA provides the best quality of service (QoS) support in terms of delay for all classes of services and, in particular, guarantees the end-to-end delay bound of high-class services. Specifically, HDBA can eliminate the queuing and scheduling delay of high-class services, reduce those of lower-class services by at least 20%, and reduce the average end-to-end delay of all services by over 50%. Moreover, HDBA also achieves the maximum delay fairness between coded and uncoded lower-class services, and medium delay fairness for high-class services.

  15. Individual-Based Completion Rates for Apprentices. Technical Paper

    ERIC Educational Resources Information Center

    Karmel, Tom

    2011-01-01

    Low completion rates for apprentices and trainees have received considerable attention recently and it has been argued that NCVER seriously understates completion rates. In this paper Tom Karmel uses NCVER data on recommencements to estimate individual-based completion rates. It is estimated that around one-quarter of trade apprentices swap…

  16. Algorithms for resource allocation of substance abuse prevention funds based on social indicators: a case study on state of Florida--Part 3.

    PubMed

    Kim, S; Wurster, L; Williams, C; Hepler, N

    1998-01-01

    The purpose of Part 3 is to develop an algorithm for an equitable distribution of state prevention funds to its substate jurisdictions based on the need for prevention services. In this series, the need for prevention services is measured in terms of the existing social indicators observed at the county level. In order to establish a conceptual link as well as the empirical relevance of the selected social indicators as proxy measurements of the estimated need for prevention at the county level, we have employed both concurrent and construct validity tests using the following three constructs as the criterion variables in a multiple regression setting: 1) the county-based composite drug use index score (COMDRUG) measured via the statewide drug survey; 2) county-based proportions of prevention target populations using the conceptual definition advanced by the Institute of Medicine (IOM); and 3) the composite risk factor score (COMRISK) assembled from a list of twenty-two risk and protective factors observed for each county. These constructs were identified previously in Parts 1 and 2. While employing eight social indicators to estimate the overall prevention needs observed at the county level, the social indicators thus selected were able to explain 69 percent of the variation in COMDRUG, 68 percent of the variation in the proportions of students in need of prevention services using the IOM definition, and 60 percent of the variation in COMRISK. Following successful validation of the social indicators as viable media with which to estimate county-based prevention needs, the ensuing multiple regression equation is then used to build a resource allocation model by determining the proportion of each county's share of the total statewide COMDRUG predicted from the social indicators, and then weighting the latter proportion by the population size of each county under age eighteen. In this way, we have devised county-based Prevention Needs Index (PNI) scores based solely
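    The final allocation step described above reduces to simple arithmetic: each county's share is its regression-predicted need score weighted by its under-18 population, normalized to the statewide total. A hedged sketch follows; the county names, predicted-need scores, and populations are made up for illustration.

```python
# Share-based allocation: funds_i proportional to predicted_need_i * pop_under_18_i.
def allocate_funds(counties, total_funds):
    weights = [c["predicted_need"] * c["pop_under_18"] for c in counties]
    total_w = sum(weights)
    return {c["name"]: total_funds * w / total_w for c, w in zip(counties, weights)}

counties = [
    {"name": "A", "predicted_need": 0.12, "pop_under_18": 50_000},
    {"name": "B", "predicted_need": 0.08, "pop_under_18": 120_000},
    {"name": "C", "predicted_need": 0.20, "pop_under_18": 30_000},
]
print(allocate_funds(counties, total_funds=1_000_000))
```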

  17. 47 CFR 76.924 - Allocation to service cost categories.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... equipment basket based on direct analysis of the origin of the costs. (ii) Where allocation based on direct analysis is not possible, common costs for which no allocator has been specified by the Commission shall... be found, common costs shall be allocated to each service cost category based on the ratio of...

  18. 76 FR 4569 - Market-Based Rate Affiliate Restrictions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-26

    ..., fuel procurement or resource planning may not be shared under the market- based rate affiliate..., 75 FR 20796 (Apr. 21, 2010), Notice of Proposed Rulemaking, FERC Stats. & Regs. ] 32,567 (2010). I... affiliates, i.e., affiliates whose power sales are regulated in whole or in part on a market-based rate...

  19. Highly accurate moving object detection in variable bit rate video-based traffic monitoring systems.

    PubMed

    Huang, Shih-Chia; Chen, Bo-Hao

    2013-12-01

    Automated motion detection, which segments moving objects from video streams, is the key technology of intelligent transportation systems for traffic management. Traffic surveillance systems use video communication over real-world networks with limited bandwidth, which frequently suffers because of either network congestion or unstable bandwidth. Evidence supporting these problems abounds in publications about wireless video communication. Thus, to effectively perform the arduous task of motion detection over a network with unstable bandwidth, a process by which bit-rate is allocated to match the available network bandwidth is required. This process is accomplished by the rate control scheme. This paper presents a new motion detection approach that is based on the cerebellar-model-articulation-controller (CMAC) through artificial neural networks to completely and accurately detect moving objects in both high and low bit-rate video streams. The proposed approach consists of a probabilistic background generation (PBG) module and a moving object detection (MOD) module. To ensure that the properties of variable bit-rate video streams are accommodated, the proposed PBG module effectively produces a probabilistic background model through an unsupervised learning process over variable bit-rate video streams. Next, the MOD module, which is based on the CMAC network, completely and accurately detects moving objects in both low and high bit-rate video streams by implementing two procedures: 1) a block selection procedure and 2) an object detection procedure. The detection results show that our proposed approach is capable of performing with higher efficacy when compared with the results produced by other state-of-the-art approaches in variable bit-rate video streams over real-world limited bandwidth networks. Both qualitative and quantitative evaluations support this claim; for instance, the proposed approach achieves Similarity and F1 accuracy rates that are 76

  1. Make or buy analysis model based on tolerance allocation to minimize manufacturing cost and fuzzy quality loss

    NASA Astrophysics Data System (ADS)

    Rosyidi, C. N.; Puspitoingrum, W.; Jauhari, W. A.; Suhardi, B.; Hamada, K.

    2016-02-01

    The specification of tolerances has a significant impact on product quality and final production cost. A company should therefore pay careful attention to component and product tolerances so that it can produce a good quality product at the lowest cost. Tolerance allocation has been widely used to solve the problem of selecting a particular process or supplier. Before getting into the selection process, however, the company must first analyse whether a component should be made in house (make), purchased from a supplier (buy), or sourced through a combination of both. This paper discusses an optimization model of process and supplier selection that minimizes the manufacturing cost and the fuzzy quality loss. The model can also be used to determine the allocation of components to the selected processes or suppliers. Tolerance, process capability and production capacity are three important constraints that affect the decision. A fuzzy quality loss function is used to describe the semantics of quality, in which the product quality level is divided into several grades. The implementation of the proposed model is demonstrated by solving a numerical example involving a simple assembly product that consists of three components. The metaheuristic approach was implemented with the OptQuest solver from Oracle Crystal Ball in order to obtain the optimal solution of the numerical example.
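    A very small sketch of the underlying selection logic, under stated simplifications: the fuzzy quality-loss grades and capacity constraints of the paper are reduced to a crisp quadratic loss and a single tolerance-capability check, and each make/buy option is enumerated to find the cheapest feasible one. All option data are hypothetical.

```python
# Pick the feasible make/buy option minimizing unit cost plus a quadratic quality loss.
def pick_option(options, tol_required):
    feasible = [o for o in options if o["process_tol"] <= tol_required]
    return min(feasible, key=lambda o: o["unit_cost"] + o["k_loss"] * o["process_tol"] ** 2)

component_options = [
    {"source": "make-machine-1", "process_tol": 0.02, "unit_cost": 3.0, "k_loss": 500.0},
    {"source": "make-machine-2", "process_tol": 0.05, "unit_cost": 2.2, "k_loss": 500.0},
    {"source": "buy-supplier-X", "process_tol": 0.03, "unit_cost": 2.6, "k_loss": 500.0},
]
print(pick_option(component_options, tol_required=0.04))   # -> the supplier option here
```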

  2. Resource Allocation in Classrooms. Final Report.

    ERIC Educational Resources Information Center

    Thomas, J. Alan

    This report deals with the allocation of resources within classrooms and homes. It is based on the assumption that learning occurs through a set of processes that require the utilization of human and material resources. It is assumed that the study of resource allocation at the micro level will help provide an understanding of the effect on…

  3. Acquisitions Allocations: Fairness, Equity and Bundled Pricing.

    ERIC Educational Resources Information Center

    Packer, Donna

    2001-01-01

    Examined the effect of an interdisciplinary Web-based citation database with full text, the ProQuest Research Library, on the Western State University library's acquisitions allocation plan. Used list price of full-text journals to calculate increases in acquisitions funding. A list of articles discussing formula allocation is appended.…

  4. An intelligent allocation algorithm for parallel processing

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Homaifar, Abdollah; Ananthram, Kishan G.

    1988-01-01

    The problem of allocating nodes of a program graph to processors in a parallel processing architecture is considered. The algorithm is based on critical path analysis, some allocation heuristics, and the execution granularity of nodes in a program graph. These factors, and the structure of the interprocessor communication network, influence the allocation. To achieve realistic estimations of the execution durations of allocations, the algorithm considers the fact that nodes in a program graph have to communicate through varying numbers of tokens. Coarse and fine granularities have been implemented, with interprocessor token-communication duration varying from zero up to values comparable to the execution durations of individual nodes. The effect of communication network structures on allocation is demonstrated by performing allocations for crossbar (non-blocking) and star (blocking) networks. The algorithm assumes the availability of as many processors as it needs for the optimal allocation of any program graph. Hence, the focus of allocation has been on varying token-communication durations rather than varying the number of processors. The algorithm always utilizes as many processors as necessary for the optimal allocation of any program graph, depending upon granularity and characteristics of the interprocessor communication network.
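    To make the ingredients concrete, here is a simplified list-scheduling sketch that mirrors them without reproducing the NASA algorithm: nodes are ordered by critical-path priority and each is assigned to whichever processor lets it start earliest once predecessor results, delayed by a token-communication cost, are available. The example graph and durations are hypothetical.

```python
# Critical-path-priority list scheduling onto n_procs processors with a communication delay.
import functools

def schedule(graph, durations, comm_delay, n_procs):
    """graph: {node: [successor, ...]} describing a DAG; durations: {node: time}."""
    preds = {n: [] for n in graph}
    for u, succs in graph.items():
        for v in succs:
            preds[v].append(u)

    @functools.lru_cache(maxsize=None)
    def rank(n):                        # length of the longest path from n to a sink
        return durations[n] + max((rank(s) for s in graph[n]), default=0)

    proc_free = [0.0] * n_procs
    finish, placed = {}, {}
    for n in sorted(graph, key=rank, reverse=True):    # rank order respects precedence
        def ready_on(p):                # earliest time all predecessor tokens reach p
            return max((finish[u] + (0 if placed[u] == p else comm_delay)
                        for u in preds[n]), default=0.0)
        p = min(range(n_procs), key=lambda q: max(ready_on(q), proc_free[q]))
        start = max(ready_on(p), proc_free[p])
        finish[n], placed[n] = start + durations[n], p
        proc_free[p] = finish[n]
    return placed, max(finish.values())

g = {"a": ["c"], "b": ["c"], "c": ["d"], "d": []}
print(schedule(g, {"a": 2, "b": 3, "c": 4, "d": 1}, comm_delay=1, n_procs=2))
```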

  5. Model-based Heart rate prediction during Lokomat walking.

    PubMed

    Koenig, Alexander C; Somaini, Luca; Pulfer, Michael; Holenstein, Thomas; Omlin, Ximena; Wieser, Martin; Riener, Robert

    2009-01-01

    We implemented a model for prediction of heart rate during Lokomat walking. Using this model, we can predict potential overstressing of the patient and adapt the physical load accordingly. Current models for treadmill-based heart rate control neglect the fact that the interaction torques between the Lokomat and the human can have a significant effect on heart rate. Tests with five healthy subjects led to a sixth-order model with walking speed and power expenditure as inputs and heart rate prediction as output. Recordings from five different subjects were used for model validation. Future work includes model identification and predictive heart rate control with spinal cord injured and stroke patients. PMID:19963765

  6. SECURITY MODELING FOR MARITIME PORT DEFENSE RESOURCE ALLOCATION

    SciTech Connect

    Harris, S.; Dunn, D.

    2010-09-07

    Redeployment of existing law enforcement resources and optimal use of geographic terrain are examined for countering the threat of a maritime based small-vessel radiological or nuclear attack. The evaluation was based on modeling conducted by the Savannah River National Laboratory that involved the development of options for defensive resource allocation that can reduce the risk of a maritime based radiological or nuclear threat. A diverse range of potential attack scenarios has been assessed. As a result of identifying vulnerable pathways, effective countermeasures can be deployed using current resources. The modeling involved the use of the Automated Vulnerability Evaluation for Risks of Terrorism (AVERT{reg_sign}) software to conduct computer based simulation modeling. The models provided estimates for the probability of encountering an adversary based on allocated resources including response boats, patrol boats and helicopters over various environmental conditions including day, night, rough seas and various traffic flow rates.

  7. THE 1985 NAPAP EMISSIONS INVENTORY: DEVELOPMENT OF SPECIES ALLOCATION FACTORS

    EPA Science Inventory

    The report describes the methodologies and data bases used to develop species allocation factors and data processing software used to develop the 1985 National Acid Precipitation Assessment Program (NAPAP) Modelers' Emissions Inventory (Version 2). Species allocation factors were...

  8. Optimal resource allocation for multicast sessions in multi-hop wireless networks.

    PubMed

    Bui, Loc; Srikant, R; Stolyar, Alexander

    2008-06-13

    In this paper, we extend recent results on fair and stable resource allocation in wireless networks to include multicast sessions, in particular multi-rate multicast. The solution for multi-rate multicast is based on scheduling virtual (shadow) 'traffic' that 'moves' in reverse direction from destinations to sources. This shadow scheduling algorithm can also be used to control delays in wireless networks.

  9. Medicare and Medicaid programs; Home Health Prospective Payment System rate update for CY 2014, home health quality reporting requirements, and cost allocation of home health survey expenses. Final rule.

    PubMed

    2013-12-01

    This final rule will update the Home Health Prospective Payment System (HH PPS) rates, including the national, standardized 60-day episode payment rates, the national per-visit rates, the low-utilization payment adjustment (LUPA) add-on, and the non-routine medical supply (NRS) conversion factor under the Medicare prospective payment system for home health agencies (HHAs), effective January 1, 2014. As required by the Affordable Care Act, this rule establishes rebasing adjustments, with a 4-year phase-in, to the national, standardized 60-day episode payment rates; the national per-visit rates; and the NRS conversion factor. In addition, this final rule will remove 170 diagnosis codes from assignment to diagnosis groups within the HH PPS Grouper, effective January 1, 2014. Finally, this rule will establish home health quality reporting requirements for CY 2014 payment and subsequent years and will clarify that a state Medicaid program must provide that, in certifying HHAs, the state's designated survey agency carry out certain other responsibilities that already apply to surveys of nursing facilities and Intermediate Care Facilities for Individuals with Intellectual Disabilities (ICF-IID), including sharing in the cost of HHA surveys. For that portion of costs attributable to Medicare and Medicaid, we will assign 50 percent to Medicare and 50 percent to Medicaid, the standard method that CMS and states use in the allocation of expenses related to surveys of nursing homes. PMID:24294635

  11. 75 FR 72581 - Assessments, Assessment Base and Rates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ...\\ The 2009 assessments rule established the following initial base assessment rate schedule: \\3\\ 74 FR... for Comment on Assessment Dividends, Assessment Rates and Designated Reserve Ratio, 75 FR 66271. III... adjustments. \\7\\ 74 FR 9525. Unsecured Debt Adjustment All other things equal, greater amounts of...

  12. Field Based Constraints on Reaction Rates in the Crust

    NASA Astrophysics Data System (ADS)

    Baxter, E. F.

    2004-12-01

    Modern research in plate boundary processes involving metamorphism frequently employs complex physical models. Such models require some quantification (or assumption) of the rate at which metamorphic reactions, or chemical exchange, proceed in natural systems. Here, a compilation of available quantitative field-based constraints on high-temperature reaction rates will be presented. These include quantifications based on isotopic exchange, porphyroblast and reaction corona growth models, geochronology, and textural analysis. Additionally, natural strain rates provide an important upper bound on simultaneous reaction rates by virtue of a direct mechanistic link between reaction and strain that applies in most situations within the deforming crust. These data show that reaction rates attending regional metamorphism are 4-7 orders of magnitude slower than most laboratory-based predictions. A general rate law for regional metamorphic reactions has been derived which best describes these field-based data: log10(Rnet) = 0.0029T - 9.6 (±1), where Rnet is the net reaction rate in g/cm2/yr and T is temperature (C) (Baxter 2003, JGSL). Reaction rates attending contact metamorphism differ from laboratory-based predictions by less than 2 orders of magnitude, and are in closest agreement at higher temperatures. Regional metamorphic reaction rates may be limited by comparatively lesser (or transient) availability of aqueous fluid in the intergranular medium, slower heat input, and smaller deviations from equilibrium. Implications of slow natural metamorphic reaction rates may include a delay in the completion of metamorphic reactions which release (or take in) volatiles, and transform the mineralogy of the crust in dynamic plate boundary settings such as subduction zones.
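    The quoted rate law can be evaluated directly; the snippet below is just that evaluation, keeping in mind that the ±1 term means the true rate spans roughly an order of magnitude either side of the central estimate.

```python
# Field-based regional metamorphic rate law: log10(Rnet) = 0.0029*T - 9.6 (+/- 1),
# with T in degrees Celsius and Rnet in g/cm^2/yr.
def net_reaction_rate(T_celsius):
    return 10 ** (0.0029 * T_celsius - 9.6)

for T in (400, 500, 600):
    print(T, net_reaction_rate(T))   # central estimates only
```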

  13. The role of physical formidability in human social status allocation.

    PubMed

    Lukaszewski, Aaron W; Simmons, Zachary L; Anderson, Cameron; Roney, James R

    2016-03-01

    Why are physically formidable men willingly allocated higher social status by others in cooperative groups? Ancestrally, physically formidable males would have been differentially equipped to generate benefits for groups by providing leadership services of within-group enforcement (e.g., implementing punishment of free riders) and between-group representation (e.g., negotiating with other coalitions). Therefore, we hypothesize that adaptations for social status allocation are designed to interpret men's physical formidability as a cue to these leadership abilities, and to allocate greater status to formidable men on this basis. These hypotheses were supported in 4 empirical studies wherein young adults rated standardized photos of subjects (targets) who were described as being part of a white-collar business consultancy. In Studies 1 and 2, male targets' physical strength positively predicted ratings of their projected status within the organization, and this effect was mediated by perceptions that stronger men possessed greater leadership abilities of within-group enforcement and between-group representation. Moreover, (a) these same patterns held whether status was conceptualized as overall ascendancy, prestige-based status, or dominance-based status, and (b) strong men who were perceived as aggressively self-interested were not allocated greater status. Finally, 2 experiments established the causality of physical formidability's effects on status-related perceptions by manipulating targets' relative strength (Study 3) and height (Study 4). In interpreting our findings, we argue that adaptations for formidability-based status allocation may have facilitated the evolution of group cooperation in humans and other primates. PMID:26653896

  14. An examination of two concentrate allocation strategies which are based on the early lactation milk yield of autumn calving Holstein Friesian cows.

    PubMed

    Lawrence, D; O'Donovan, M; Boland, T M; Lewis, E; Kennedy, E

    2016-05-01

    The objective of this experiment was to compare the effects of two concentrate feeding strategies offered with a grass silage and maize silage diet on the dry matter (DM) intake, milk production (MP) and estimated energy balance of autumn calved dairy cows. Over a 2-year period, 180 autumn calving Holstein Friesian cows were examined. Within year, cows were blocked into three MP sub-groups (n=9) (high (HMP), medium (MMP) and low (LMP)) based on the average MP data from weeks 3 and 4 of lactation. Within a block cows were randomly assigned to one of two treatments (n=54), flat rate (FR) concentrate feeding or feed to yield (FY) based on MP sub-group. Cows on the FR treatment were offered a fixed rate of concentrate (5.5 kg DM/cow per day) irrespective of MP sub-group. In the FY treatment HMP, MMP and LMP cows were allocated 7.3, 5.5 and 3.7 kg DM of concentrate, respectively. The mean concentrate offered to the FR and FY treatments was the same. On the FR treatment there was no significant difference in total dry matter intake (TDMI, 17.3 kg) between MP sub-groups. In the FY treatment, however, the TDMI of HMP-FY was 2.2 kg greater than MMP-FY, and 4.5 kg greater than LMP-FY (15.2 kg DM). The milk yield of LMP-FR was 3.5 kg less than the mean of the HMP-FR and MMP-FR treatments (24.5 kg). The milk yield of the HMP-FY treatment was 3.6 and 7.9 kg greater than the MMP-FY and LMP-FY treatments, respectively. The difference in MP between the HMP sub-groups was 2.6 kg, which translates to a response of 1.4 kg of milk per additional 1 kg of concentrate offered. There was no significant difference in MP between the two LMP sub-groups; however, MP increased 0.8 kg per additional 1 kg of concentrate offered between cows on the LMP-FR and LMP-FY treatments. The estimated energy balance was positive for cows on the LMP-FR treatment, but negative for cows on the other treatments. The experiment highlights the variation within a herd in MP response to concentrate, as cows with a

  15. An enhanced rate-based emission trading program for NOX: the Dutch model.

    PubMed

    Sholtz, A M; Van Amburg, B; Wochnick, V K

    2001-12-01

    Since 1997 government and industry in The Netherlands have been engaged in intensive policy discussions on how to design an emission trading program that would satisfy the Government's policy objectives within the national and international regulatory framework and accommodate industry's need for a flexible and cost-effective approach. Early on in the discussion the most promising solution was a rate-based approach, which dynamically allocated saleable emission credits based on a performance standard rate and actual energy used by facilities. All industrial facilities above a threshold of 20 MWth would be judged on their ability to meet this performance rate. Those "cleaner" than the standard can sell excess credits to others with an allocation that is less than their actual NOX emission. With some changes in law, such a design could be made to fit well into the national and EU legislative framework while at the same time uniquely meeting industry's requirement of flexibility toward economic growth and facility expansion. (An analysis of the legislative changes required will be given in a separate paper by Chris Dekkers.) However, the environmental outcome of such a system is not as certain as under an absolute emission cap. At the request of the Netherlands Ministry of Housing, Spatial Planning and the Environment (VROM), Automated Credit Exchange (ACE), in close cooperation with the working group of government and industry representatives introduced a number of features into the Dutch NOX program allowing full exploitation of market mechanisms while allowing intermediate adjustments in the performance standard rates. The design is geared toward meeting environmental targets without jeopardizing the trading market the program intends to create. The paper discusses the genesis of the two-tier credit system ACE helped to design, explains the differences between primary (fixed) and secondary (variable) credits, and outlines how the Dutch system is expected to
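    The crediting arithmetic described above is straightforward, and the toy example below illustrates it under stated assumptions: each facility's allocation is the performance-standard rate times its actual energy use, and its saleable credit (or deficit) is that allocation minus its actual emissions. The standard rate and facility figures are invented, not Dutch program values, and the two-tier primary/secondary credit distinction is not modelled.

```python
# Rate-based crediting: allocation = standard_rate * energy; credit = allocation - emissions.
def credit_position(standard_rate_g_per_gj, facilities):
    positions = {}
    for name, f in facilities.items():
        allocation = standard_rate_g_per_gj * f["energy_gj"]
        positions[name] = allocation - f["nox_g"]      # positive = saleable credits
    return positions

plants = {"clean": {"energy_gj": 1.0e6, "nox_g": 3.0e7},
          "dirty": {"energy_gj": 1.0e6, "nox_g": 6.0e7}}
print(credit_position(40.0, plants))                    # hypothetical standard: 40 g NOx per GJ
```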

  17. Adaptive Estimation of Intravascular Shear Rate Based on Parameter Optimization

    NASA Astrophysics Data System (ADS)

    Nitta, Naotaka; Takeda, Naoto

    2008-05-01

    The relationship between the intravascular wall shear stress, which is governed by flow dynamics, and the progression of arteriosclerotic plaque has been clarified by various studies. Since the shear stress is determined by the viscosity coefficient and the shear rate, both factors must be estimated accurately. In this paper, an adaptive method for improving the accuracy of quantitative shear rate estimation was investigated. First, the parameter dependence of the estimated shear rate was investigated in terms of the differential window width and the number of averaged velocity profiles, based on simulation and experimental data, and then the shear rate calculation was optimized. The results revealed that the proposed adaptive method was effective for improving the accuracy of the shear rate calculation.
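    As a minimal sketch of the estimation step only: the shear rate is the radial derivative of the velocity profile, computed here with a centred difference over an adjustable window after averaging several profiles to suppress noise. The synthetic parabolic profile and the window width are assumptions, not the paper's data or its optimized parameters.

```python
# Shear rate as a windowed centred difference of an averaged velocity profile.
import numpy as np

def shear_rate(velocity_profiles, dr, window=1):
    """velocity_profiles: (n_frames, n_points) array [m/s]; dr: radial spacing [m];
    window: half-width of the differential window in samples."""
    v = np.mean(velocity_profiles, axis=0)                 # average profiles to reduce noise
    return (v[2 * window:] - v[:-2 * window]) / (2 * window * dr)

r = np.linspace(0, 2e-3, 21)                               # 2 mm radius, 21 samples
profiles = np.tile(0.5 * (1 - (r / 2e-3) ** 2), (5, 1))    # synthetic parabolic (Poiseuille-like) flow
print(shear_rate(profiles, dr=r[1] - r[0], window=2))
```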

  18. Making sense of peak load cost allocations

    SciTech Connect

    Power, T.M.

    1995-03-15

    When it comes to cost allocation, common wisdom assigns costs in proportion to class contributions to peak loads. The justification is simple: since the equipment had to be sized to meet peak-day loads, those costs should be allocated on the same basis. Many different peak allocators have been developed on this assumption: single coincident peak contribution, sum of coincident peaks, noncoincident peak, average and excess demand, peak and average demand, base and extra capacity, and so on. Such pure peak-load allocators may not be politically acceptable, but conceptually, at least, they appear to offer the only defensible approach. Nevertheless, where capacity can be added with significant economies of scale, making cost allocations in proportion to peak loads violates well-known relationships between economics and engineering. What is missing is any tracing of the way in which the peak-load design criteria actually influence the costs incurred.
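    For readers unfamiliar with the mechanics, here is a worked example of the single coincident peak allocator that the article critiques: capacity cost is split among customer classes in proportion to their demand at the system's coincident peak hour. The class loads and the capacity cost are illustrative only.

```python
# Single coincident peak (1-CP) allocation of capacity cost among customer classes.
def coincident_peak_allocation(class_loads_mw, capacity_cost):
    hours = range(len(next(iter(class_loads_mw.values()))))
    peak_hour = max(hours, key=lambda h: sum(loads[h] for loads in class_loads_mw.values()))
    cp_demand = {c: loads[peak_hour] for c, loads in class_loads_mw.items()}
    total = sum(cp_demand.values())
    return {c: capacity_cost * d / total for c, d in cp_demand.items()}

loads = {"residential": [300, 520, 410],
         "commercial":  [280, 350, 390],
         "industrial":  [250, 260, 255]}          # MW in three sample hours
print(coincident_peak_allocation(loads, capacity_cost=1_000_000))
```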

  19. Prediction of Postoperative Mortality in Liver Transplantation in the Era of MELD-Based Liver Allocation: A Multivariate Analysis

    PubMed Central

    Schultze, Daniel; Hillebrand, Norbert; Hinz, Ulf; Büchler, Markus W.; Schemmer, Peter

    2014-01-01

    Background and Aims Liver transplantation is the only curative treatment for end-stage liver disease. While waiting list mortality can be predicted by the MELD-score, reliable scoring systems for the postoperative period do not exist. This study's objective was to identify risk factors that contribute to postoperative mortality. Methods Between December 2006 and March 2011, 429 patients underwent liver transplantation in our department. Risk factors for postoperative mortality in 266 consecutive liver transplantations were identified using univariate and multivariate analyses. Patients who were <18 years, HU-listings, and split-, living related, combined or re-transplantations were excluded from the analysis. The correlation between number of risk factors and mortality was analyzed. Results A labMELD ≥20, female sex, coronary heart disease, donor risk index >1.5 and donor Na+>145 mmol/L were identified to be independent predictive factors for postoperative mortality. With increasing number of these risk-factors, postoperative 90-day and 1-year mortality increased (0–1: 0 and 0%; 2: 2.9 and 17.4%; 3: 5.6 and 16.8%; 4: 22.2 and 33.3%; 5–6: 60.9 and 66.2%). Conclusions In this analysis, a simple score was derived that adequately identified patients at risk after liver transplantation. Opening a discussion on the inclusion of these parameters in the process of organ allocation may be a worthwhile venture. PMID:24905210

  20. Reward Allocation and Academic versus Social Orientation toward School.

    ERIC Educational Resources Information Center

    Peterson, Candida C.; Peterson, James L.

    1978-01-01

    Correlates 138 elementary school children's views about the purposes of school to their styles of reward allocation: academically motivated students allocated rewards equally to two hypothetical performers who had unequally helped a teacher perform a manual chore, while socially motivated children allocated rewards in an equity (performance-based)…

  1. Carbon allocation to ectomycorrhizal fungi correlates with belowground allocation in culture studies.

    PubMed

    Hobbie, Erik A

    2006-03-01

    Ectomycorrhizal fungi form symbioses with most temperate and boreal tree species, but difficulties in measuring carbon allocation to these symbionts have prevented the assessment of their importance in forest ecosystems. Here, I surveyed allocation patterns in 14 culture studies and five field studies of ectomycorrhizal plants. In culture studies, allocation to ectomycorrhizal fungi (NPPf) was linearly related to total belowground net primary production (NPPb) by the equation NPPf = 41.5% x NPPb - 11.3% (r2 = 0.55, P < 0.001) and ranged from 1% to 21% of total net primary production. As a percentage of NPP, allocation to ectomycorrhizal fungi was highest at lowest plant growth rates and lowest nutrient availabilities. Because total belowground allocation can be estimated using carbon balance techniques, these relationships should allow ecologists to incorporate mycorrhizal fungi into existing ecosystem models. In field studies, allocation to ectomycorrhizal fungi ranged from 0% to 22% of total allocation, but wide differences in measurement techniques made intercomparisons difficult. Techniques such as fungal in-growth cores, root branching-order studies, and isotopic analyses could refine our estimates of turnover rates of fine roots, mycorrhizae, and extraradical hyphae. Together with ecosystem modeling, such techniques could soon provide good estimates of the relative importance of root vs. fungal allocation in belowground carbon budgets.
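    The reported culture-study regression can be applied directly, as the one-liner below shows; it assumes (my reading of the abstract) that both allocation to fungi and belowground production are expressed as fractions of total NPP, so the percentages become decimal coefficients.

```python
# Regression from the culture studies: NPPf = 41.5% * NPPb - 11.3%,
# here with both terms taken as fractions of total NPP (an interpretation, not stated units).
def mycorrhizal_allocation_fraction(belowground_fraction):
    return max(0.0, 0.415 * belowground_fraction - 0.113)

print(mycorrhizal_allocation_fraction(0.50))   # ~0.095, i.e. roughly 9% of NPP to the fungi
```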

  2. The Unobtrusive Memory Allocator

    2003-03-31

    This library implements a memory allocator/manager which asks its host program or library for memory regions to manage rather than requesting them from the operating system. The allocator supports multiple distinct heaps within a single executable, each of which may grow either upward or downward in memory. The GNU mmalloc library has been modified in such a way that its allocation algorithms have been preserved, but the manner in which it obtains regions to manage has been changed to request memory from the host program or library. Additional modifications allow the allocator to manage each heap as either upward- or downward-growing. By allowing the hosting program or library to determine what memory is managed, this package allows a greater degree of control than other memory allocation/management libraries. Additional distinguishing features include the ability to manage multiple distinct heaps within a single executable, each of which may grow either upward or downward in memory. The most common use of this library is in conjunction with the Berkeley Unified Parallel C (UPC) Runtime Library. This package is a modified version of the LGPL-licensed "mmalloc" allocator from release 5.2 of the "gdb" debugger's source code.

  3. Evidence for rule-based processes in the inverse base-rate effect.

    PubMed

    Winman, Anders; Wennerholm, Pia; Juslin, Peter; Shanks, David R

    2005-07-01

    Three studies provide convergent evidence that the inverse base-rate effect (Medin & Edelson, 1988) is mediated by rule-based cognitive processes. Experiment 1 shows that, in contrast to adults, prior to the formal operational stage most children do not exhibit the inverse base-rate effect. Experiments 2 and 3 demonstrate that an adult sample is a mix of participants relying on associative processes who categorize according to the base-rate and participants relying on rule-based processes who exhibit a strong inverse base-rate effect. The distribution of the effect is bimodal, and removing participants independently classified as prone to rule-based processing effectively eliminates the inverse base-rate effect. The implications for current explanations of the inverse base-rate effect are discussed. PMID:16194936

  4. Allocation Games: Addressing the Ill-Posed Nature of Allocation in Life-Cycle Inventories.

    PubMed

    Hanes, Rebecca J; Cruze, Nathan B; Goel, Prem K; Bakshi, Bhavik R

    2015-07-01

    Allocation is required when a life cycle contains multi-functional processes. One approach to allocation is to partition the embodied resources in proportion to a criterion, such as product mass or cost. Many practitioners apply multiple partitioning criteria to avoid choosing one arbitrarily. However, life cycle results from different allocation methods frequently contradict each other, making it difficult or impossible for the practitioner to draw any meaningful conclusions from the study. Using the matrix notation for life-cycle inventory data, we show that an inventory that requires allocation leads to an ill-posed problem: an inventory based on allocation is one of an infinite number of inventories that are highly dependent upon allocation methods. This insight is applied to comparative life-cycle assessment (LCA), in which products with the same function but different life cycles are compared. Recently, there have been several studies that applied multiple allocation methods and found that different products were preferred under different methods. We develop the Comprehensive Allocation Investigation Strategy (CAIS) to examine any given inventory under all possible allocation decisions, enabling us to detect comparisons that are not robust to allocation, even when the comparison appears robust under conventional partitioning methods. While CAIS does not solve the ill-posed problem, it provides a systematic way to parametrize and examine the effects of partitioning allocation. The practical usefulness of this approach is demonstrated with two case studies. The first compares ethanol produced from corn stover hydrolysis, corn stover gasification, and corn grain fermentation. This comparison was not robust to allocation. The second case study compares 1,3-propanediol (PDO) produced from fossil fuels and from biomass, which was found to be a robust comparison. PMID:26061700

  6. Global Earthquake Activity Rate models based on version 2 of the Global Strain Rate Map

    NASA Astrophysics Data System (ADS)

    Bird, P.; Kreemer, C.; Kagan, Y. Y.; Jackson, D. D.

    2013-12-01

    Global Earthquake Activity Rate (GEAR) models have usually been based on either relative tectonic motion (fault slip rates and/or distributed strain rates), or on smoothing of seismic catalogs. However, a hybrid approach appears to perform better than either parent, at least in some retrospective tests. First, we construct a Tectonic ('T') forecast of shallow (≤ 70 km) seismicity based on global plate-boundary strain rates from version 2 of the Global Strain Rate Map. Our approach is the SHIFT (Seismic Hazard Inferred From Tectonics) method described by Bird et al. [2010, SRL], in which the character of the strain rate tensor (thrusting and/or strike-slip and/or normal) is used to select the most comparable type of plate boundary for calibration of the coupled seismogenic lithosphere thickness and corner magnitude. One difference is that activity of offshore plate boundaries is spatially smoothed using empirical half-widths [Bird & Kagan, 2004, BSSA] before conversion to seismicity. Another is that the velocity-dependence of coupling in subduction and continental-convergent boundaries [Bird et al., 2009, BSSA] is incorporated. Another forecast component is the smoothed-seismicity ('S') forecast model of [Kagan & Jackson, 1994, JGR; Kagan & Jackson, 2010, GJI], which was based on optimized smoothing of the shallow part of the GCMT catalog, years 1977-2004. Both forecasts were prepared for threshold magnitude 5.767. Then, we create hybrid forecasts by one of 3 methods: (a) taking the greater of S or T; (b) simple weighted-average of S and T; or (c) log of the forecast rate is a weighted average of the logs of S and T. In methods (b) and (c) there is one free parameter, which is the fractional contribution from S. All hybrid forecasts are normalized to the same global rate. Pseudo-prospective tests for 2005-2012 (using versions of S and T calibrated on years 1977-2004) show that many hybrid models outperform both parents (S and T), and that the optimal weight on S
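    The three hybridization rules listed above translate into a few lines of array arithmetic; the sketch below applies them cell-wise to a tectonic (T) and a smoothed-seismicity (S) rate map and renormalizes to a common global rate. The toy four-cell arrays stand in for real gridded forecasts, and the weight value is arbitrary.

```python
# Hybrid forecasts: cell-wise max, linear blend, or log-linear blend of S and T,
# each renormalized to the same global rate.
import numpy as np

def hybrid(S, T, method, w_s=0.5):
    if method == "max":
        H = np.maximum(S, T)
    elif method == "linear":
        H = w_s * S + (1 - w_s) * T
    elif method == "log":
        H = np.exp(w_s * np.log(S) + (1 - w_s) * np.log(T))   # multiplicative blend
    else:
        raise ValueError(method)
    return H * S.sum() / H.sum()          # keep the global rate fixed

S = np.array([0.10, 0.02, 0.30, 0.05])    # hypothetical smoothed-seismicity rates
T = np.array([0.08, 0.06, 0.20, 0.10])    # hypothetical tectonic rates
for m in ("max", "linear", "log"):
    print(m, hybrid(S, T, m))
```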

  7. The organ allocation controversy: how did we arrive here?

    PubMed

    Van Meter, C H

    1999-01-01

The Department of Health and Human Services (HHS) recently issued a final regulation governing the Organ Procurement and Transplantation Network (OPTN) that directs the allocation of organs to the sickest patients first without regard to a host of medical, geographic, and social factors that members of the transplant community view as an essential part of a sound organ allocation policy. Current organ allocation mechanisms are based on policies that reflect a broad consensus of medical experts and provide equal consideration for both the needs of the sickest patients and the efficient use of organs. This system also reduces potential waste of organs by minimizing cold ischemic time, increases access to transplantation for patients in local communities, provides positive incentives for local citizens and medical professionals to support organ donation initiatives, and decreases the cost of organ transplantation. Representatives of the American Society of Transplant Surgeons have testified before Congress that "giving priority to the sickest patients first over broad geographic areas would be wasteful and dangerous, resulting in fewer patients transplanted, increased death rates, increased retransplantation due to poor organ function, and increased overall cost of transplantation." In response, Congress enacted a 1-year moratorium on the implementation of the HHS rule and provided for a study of the current organ allocation policy and HHS regulation by The Institute of Medicine. PMID:21845113

  8. Statistical inference for extinction rates based on last sightings.

    PubMed

    Nakamura, Miguel; Del Monte-Luna, Pablo; Lluch-Belda, Daniel; Lluch-Cota, Salvador E

    2013-09-21

    Rates of extinction can be estimated from sighting records and are assumed to be implicitly constant by many data analysis methods. However, historical sightings are scarce. Frequently, the only information available for inferring extinction is the date of the last sighting. In this study, we developed a probabilistic model and a corresponding statistical inference procedure based on last sightings. We applied this procedure to data on recent marine extirpations and extinctions, seeking to test the null hypothesis of a constant extinction rate. We found that over the past 500 years extirpations in the ocean have been increasing but at an uncertain rate, whereas a constant rate of global marine extinctions is statistically plausible. The small sample sizes of marine extinction records generate such high uncertainty that different combinations of model inputs can yield different outputs that fit the observed data equally well. Thus, current marine extinction trends may be idiosyncratic.

  9. Increasing Response Rates to Web-Based Surveys

    ERIC Educational Resources Information Center

    Monroe, Martha C.; Adams, Damian C.

    2012-01-01

We review a popular method for collecting data--Web-based surveys. Although Web surveys are popular, one major concern is their typically low response rates. Using the Dillman et al. (2009) approach, we designed, pre-tested, and implemented a survey on climate change with Extension professionals in the Southeast. The Dillman approach worked well,…

  10. A multilevel region-of-interest based rate control scheme for video communication

    NASA Astrophysics Data System (ADS)

    Zhou, QiLui; Liu, Jiaying; Guo, Zongming

    2009-10-01

ROI-based video coding is widely applied in video communication. In this paper, we propose a multilevel ROI model, which includes the eye-mouth core region (CR), the face profile region (PR), the edge region (ER) and the background region (BR), to classify the subjective importance level of regions in the scene. Based on the proposed model, we first segment the current frame into four regions through skin color detection and feature location. Then, we improve the rate control algorithm of the JVT-G012 proposal. We consider two factors, a subjective factor from our multilevel ROI model and an objective factor from the direct difference with the reference frame, to model the complexity weight of each macroblock (MB). We allocate resources both at the frame layer and the basic unit layer, and adjust the QP at the MB layer. Finally, we restrict the QP of each MB with three strategies to maintain spatial and temporal smoothness. The experimental results illustrate that the PSNR of the ROI (CR plus PR) area using the proposed method is on average over 0.5 dB higher than with JM8.6, while there are only slight changes in the PSNR of the whole frame between the two methods. Subjective quality based on our method also achieves much better performance.

  11. Comparison of Airborne and Ground-Based Function Allocation Concepts for NextGen Using Human-In-The-Loop Simulations

    NASA Technical Reports Server (NTRS)

    Wing, David J.; Prevot, Thomas; Murdoch, Jennifer L.; Cabrall, Christopher D.; Homola, Jeffrey R.; Martin, Lynne H.; Mercer, Joey S.; Hoadley, Sherwood T.; Wilson, Sara R.; Hubbs, Clay E.; Chamberlain, James P.; Chartrand, Ryan C.; Consiglio, Maria C.; Palmer, Michael T.

    2010-01-01

This paper presents an air/ground functional allocation experiment conducted by the National Aeronautics and Space Administration (NASA) using two human-in-the-loop simulations to compare airborne and ground-based approaches to NextGen separation assurance. The approaches under investigation are two trajectory-based four-dimensional (4D) concepts; one referred to as "airborne trajectory management with self-separation" (airborne), the other as "ground-based automated separation assurance" (ground-based). In coordinated simulations at NASA's Ames and Langley Research Centers, the primary operational participants - controllers for the ground-based concept and pilots for the airborne concept - manage the same traffic scenario using the two different 4D concepts. The common scenarios are anchored in traffic problems that require a significant increase in airspace capacity - on average, double, and in some local areas, close to 250% over current day levels - in order to enable aircraft to safely and efficiently traverse the test airspace. The simulations vary common independent variables such as traffic density, sequencing and scheduling constraints, and timing of trajectory change events. A set of common metrics is collected to enable a direct comparison of relevant results. The simulations will be conducted in spring 2010. If accepted, this paper will be the first publication of the experimental approach and early results. An initial comparison of safety and efficiency as well as operator acceptability under the two concepts is expected.

  12. Performance Invalidity Base Rates Among Healthy Undergraduate Research Participants.

    PubMed

    Ross, Thomas P; Poston, Ashley M; Rein, Patricia A; Salvatore, Andrew N; Wills, Nathan L; York, Taylor M

    2016-02-01

Few studies have examined base rates of suboptimal effort among healthy, undergraduate students recruited for neuropsychological research. An and colleagues (2012, Conducting research with non-clinical healthy undergraduates: Does effort play a role in neuropsychological test performance? Archives of Clinical Neuropsychology, 27, 849-857) reported high rates of performance invalidity (30.8%-55.6%), calling into question the validity of findings generated from samples of college students. In contrast, subsequent studies have reported much lower base rates ranging from 2.6% to 12%. The present study replicated and extended previous work by examining the performance of 108 healthy undergraduates on the Dot Counting Test, Victoria Symptom Validity Test, Word Memory Test, and a brief battery of neuropsychological measures. During initial testing, 8.3% of the sample scored below cutoffs on at least one Performance Validity Test, while 3.7% were classified as invalid at Time 2 (M interval = 34.4 days). The present findings add to a growing number of studies that suggest performance invalidity base rates in samples of non-clinical, healthy college students are much lower than An and colleagues' initial findings. Although suboptimal effort is much less problematic than suggested by An and colleagues, recent reports as high as 12% indicate that including measures of effort may be of value when using college students as participants. Methodological issues and recommendations for future research are presented.

  13. Collaborative Resource Allocation

    NASA Technical Reports Server (NTRS)

    Wang, Yeou-Fang; Wax, Allan; Lam, Raymond; Baldwin, John; Borden, Chester

    2007-01-01

    Collaborative Resource Allocation Networking Environment (CRANE) Version 0.5 is a prototype created to prove the newest concept of using a distributed environment to schedule Deep Space Network (DSN) antenna times in a collaborative fashion. This program is for all space-flight and terrestrial science project users and DSN schedulers to perform scheduling activities and conflict resolution, both synchronously and asynchronously. Project schedulers can, for the first time, participate directly in scheduling their tracking times into the official DSN schedule, and negotiate directly with other projects in an integrated scheduling system. A master schedule covers long-range, mid-range, near-real-time, and real-time scheduling time frames all in one, rather than the current method of separate functions that are supported by different processes and tools. CRANE also provides private workspaces (both dynamic and static), data sharing, scenario management, user control, rapid messaging (based on Java Message Service), data/time synchronization, workflow management, notification (including emails), conflict checking, and a linkage to a schedule generation engine. The data structure with corresponding database design combines object trees with multiple associated mortal instances and relational database to provide unprecedented traceability and simplify the existing DSN XML schedule representation. These technologies are used to provide traceability, schedule negotiation, conflict resolution, and load forecasting from real-time operations to long-range loading analysis up to 20 years in the future. CRANE includes a database, a stored procedure layer, an agent-based middle tier, a Web service wrapper, a Windows Integrated Analysis Environment (IAE), a Java application, and a Web page interface.

  14. 77 FR 24198 - Notice of Revocation of Market-Based Rate Authority and Termination of Market-Based Rate Tariffs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-23

From the Federal Register Online via the Government Publishing Office. Department of Energy, Federal Energy Regulatory Commission: Notice of Revocation of Market-Based Rate Authority and Termination of Market-Based Rate Tariffs. The docket list is truncated in the source; visible entries include ER05-419-000, Verde Renewable Energy, Inc. (ER07-48-000), and Vesta Capital Partners LP (ER05-1434-000).

  15. Ground-Based Remote Retrievals of Cumulus Entrainment Rates

    SciTech Connect

    Wagner, Timothy J.; Turner, David D.; Berg, Larry K.; Krueger, Steven K.

    2013-07-26

While fractional entrainment rates for cumulus clouds have typically been derived from airborne observations, this limits the size and scope of available data sets. To increase the number of continental cumulus entrainment rate observations available for study, an algorithm for retrieving them from ground-based remote sensing observations has been developed. This algorithm, called the Entrainment Rate In Cumulus Algorithm (ERICA), uses the suite of instruments at the Southern Great Plains (SGP) site of the United States Department of Energy's Atmospheric Radiation Measurement (ARM) Climate Research Facility as inputs into a Gauss-Newton optimal estimation scheme, in which an initial guess of the entrainment rate is iteratively adjusted through intercomparison of modeled liquid water path and cloud droplet effective radius to their observed counterparts. The forward model in this algorithm is the Explicit Mixing Parcel Model (EMPM), a cloud parcel model that treats entrainment as a series of discrete entrainment events. A quantified value for measurement uncertainty is also returned as part of the retrieval. Sensitivity testing and information content analysis demonstrate the robust nature of this method for retrieving accurate observations of the entrainment rate without the drawbacks of airborne sampling. Results from a test of ERICA on three months of shallow cumulus cloud events show significant variability of the entrainment rate of clouds in a single day and from one day to the next. The mean value of 1.06 km⁻¹ for the entrainment rate in this dataset corresponds well with prior observations and simulations of the entrainment rate in cumulus clouds.

  16. Optimality versus stability in water resource allocation.

    PubMed

    Read, Laura; Madani, Kaveh; Inanloo, Bahareh

    2014-01-15

    Water allocation is a growing concern in a developing world where limited resources like fresh water are in greater demand by more parties. Negotiations over allocations often involve multiple groups with disparate social, economic, and political status and needs, who are seeking a management solution for a wide range of demands. Optimization techniques for identifying the Pareto-optimal (social planner solution) to multi-criteria multi-participant problems are commonly implemented, although often reaching agreement for this solution is difficult. In negotiations with multiple-decision makers, parties who base decisions on individual rationality may find the social planner solution to be unfair, thus creating a need to evaluate the willingness to cooperate and practicality of a cooperative allocation solution, i.e., the solution's stability. This paper suggests seeking solutions for multi-participant resource allocation problems through an economics-based power index allocation method. This method can inform on allocation schemes that quantify a party's willingness to participate in a negotiation rather than opt for no agreement. Through comparison of the suggested method with a range of distance-based multi-criteria decision making rules, namely, least squares, MAXIMIN, MINIMAX, and compromise programming, this paper shows that optimality and stability can produce different allocation solutions. The mismatch between the socially-optimal alternative and the most stable alternative can potentially result in parties leaving the negotiation as they may be too dissatisfied with their resource share. This finding has important policy implications as it justifies why stakeholders may not accept the socially optimal solution in practice, and underlies the necessity of considering stability where it may be more appropriate to give up an unstable Pareto-optimal solution for an inferior stable one. Authors suggest assessing the stability of an allocation solution as an

  17. Predicting online ratings based on the opinion spreading process

    NASA Astrophysics Data System (ADS)

    He, Xing-Sheng; Zhou, Ming-Yang; Zhuo, Zhao; Fu, Zhong-Qian; Liu, Jian-Guo

    2015-10-01

    Predicting users' online ratings is always a challenge issue and has drawn lots of attention. In this paper, we present a rating prediction method by combining the user opinion spreading process with the collaborative filtering algorithm, where user similarity is defined by measuring the amount of opinion a user transfers to another based on the primitive user-item rating matrix. The proposed method could produce a more precise rating prediction for each unrated user-item pair. In addition, we introduce a tunable parameter λ to regulate the preferential diffusion relevant to the degree of both opinion sender and receiver. The numerical results for Movielens and Netflix data sets show that this algorithm has a better accuracy than the standard user-based collaborative filtering algorithm using Cosine and Pearson correlation without increasing computational complexity. By tuning λ, our method could further boost the prediction accuracy when using Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) as measurements. In the optimal cases, on Movielens and Netflix data sets, the corresponding algorithmic accuracy (MAE and RMSE) are improved 11.26% and 8.84%, 13.49% and 10.52% compared to the item average method, respectively.
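
    For context, a minimal user-based collaborative-filtering prediction step is sketched below in Python with a hypothetical rating matrix and the standard Cosine similarity that the paper uses as a baseline; the paper's contribution, an opinion-transfer similarity tuned by the parameter λ, is not reproduced here.

        import numpy as np

        R = np.array([[5, 3, 0, 1],        # hypothetical user-item ratings,
                      [4, 0, 0, 1],        # 0 marks an unrated pair
                      [1, 1, 0, 5],
                      [0, 1, 5, 4]], dtype=float)

        def cosine_sim(R):
            """Baseline user-user similarity (the paper replaces this measure)."""
            norm = np.linalg.norm(R, axis=1, keepdims=True)
            return (R @ R.T) / (norm @ norm.T + 1e-12)

        def predict(R, sim, u, i):
            """Predict user u's rating of item i as a similarity-weighted average."""
            rated = R[:, i] > 0
            if not rated.any():
                return float(R[R > 0].mean())
            w = sim[u, rated]
            return float(np.dot(w, R[rated, i]) / (np.abs(w).sum() + 1e-12))

        sim = cosine_sim(R)
        print(predict(R, sim, u=1, i=1))   # predicted rating of user 1 for item 1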

  18. Heart rate measurement based on face video sequence

    NASA Astrophysics Data System (ADS)

    Xu, Fang; Zhou, Qin-Wu; Wu, Peng; Chen, Xing; Yang, Xiaofeng; Yan, Hong-jian

    2015-03-01

    This paper proposes a new non-contact heart rate measurement method based on photoplethysmography (PPG) theory. With this method we can measure heart rate remotely with a camera and ambient light. We collected video sequences of subjects, and detected remote PPG signals through video sequences. Remote PPG signals were analyzed with two methods, Blind Source Separation Technology (BSST) and Cross Spectral Power Technology (CSPT). BSST is a commonly used method, and CSPT is used for the first time in the study of remote PPG signals in this paper. Both of the methods can acquire heart rate, but compared with BSST, CSPT has clearer physical meaning, and the computational complexity of CSPT is lower than that of BSST. Our work shows that heart rates detected by CSPT method have good consistency with the heart rates measured by a finger clip oximeter. With good accuracy and low computational complexity, the CSPT method has a good prospect for the application in the field of home medical devices and mobile health devices.

  19. Diffusion-Based Density-Equalizing Maps: an Interdisciplinary Approach to Visualizing Homicide Rates and Other Georeferenced Statistical Data

    NASA Astrophysics Data System (ADS)

    Mazzitello, Karina I.; Candia, Julián

    2012-12-01

    In every country, public and private agencies allocate extensive funding to collect large-scale statistical data, which in turn are studied and analyzed in order to determine local, regional, national, and international policies regarding all aspects relevant to the welfare of society. One important aspect of that process is the visualization of statistical data with embedded geographical information, which most often relies on archaic methods such as maps colored according to graded scales. In this work, we apply nonstandard visualization techniques based on physical principles. We illustrate the method with recent statistics on homicide rates in Brazil and their correlation to other publicly available data. This physics-based approach provides a novel tool that can be used by interdisciplinary teams investigating statistics and model projections in a variety of fields such as economics and gross domestic product research, public health and epidemiology, sociodemographics, political science, business and marketing, and many others.

  20. A high-rate PCI-based telemetry processor system

    NASA Astrophysics Data System (ADS)

    Turri, R.

    2002-07-01

The high performance reached by satellite on-board telemetry generation and transmission will consequently require ground facilities with higher processing capabilities at low cost, to allow these ground stations to be widely deployed. The equipment normally used is based on complex, proprietary bus and computing architectures that prevent the systems from exploiting the continuous and rapid increase in computing power available on the market. PCI bus systems now allow processing of high-rate data streams in a standard PC system. At the same time, the Windows NT operating system supports multitasking and symmetric multiprocessing, giving the capability to process high-data-rate signals. In addition, high-speed networking, 64-bit PCI-bus technologies, and the increase in processor power and software allow a system to be built from COTS products (which in the future may be easily and inexpensively upgraded). In the frame of the EUCLID RTP 9.8 project, a specific work element was dedicated to developing the architecture of a system able to acquire telemetry data at up to 600 Mbps. Laben S.p.A. - a Finmeccanica company entrusted with this work - has designed a PCI-based telemetry system making possible the communication between a satellite down-link and a wide area network at the required rate.

  1. Fast adaptive OFDM-PON over single fiber loopback transmission using dynamic rate adaptation-based algorithm for channel performance improvement

    NASA Astrophysics Data System (ADS)

    Kartiwa, Iwa; Jung, Sang-Min; Hong, Moon-Ki; Han, Sang-Kook

    2014-03-01

In this paper, we propose a novel fast adaptive approach that was applied to an OFDM-PON 20-km single fiber loopback transmission system to improve channel performance in terms of a stabilized BER below 2 × 10⁻³ and higher throughput beyond 10 Gb/s. The upstream transmission is performed through light source-seeded modulation using a 1-GHz RSOA at the ONU. Experimental results indicated that the dynamic rate adaptation algorithm based on greedy Levin-Campello could be an effective solution to mitigate channel instability and data rate degradation caused by the Rayleigh backscattering effect and inefficient subcarrier resource allocation.
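
    Levin-Campello loading is a greedy bit-allocation scheme; the Python sketch below is a textbook-style version under an assumed SNR-gap incremental-power model and hypothetical channel values, not the authors' exact implementation.

        import numpy as np

        def levin_campello(cnr, power_budget, gap_db=6.0, max_bits=10):
            """Greedy bit loading: repeatedly grant one bit to the subcarrier that
            needs the least extra power. cnr = per-subcarrier channel-to-noise ratio."""
            gap = 10 ** (gap_db / 10.0)
            bits = np.zeros(len(cnr), dtype=int)

            def inc_power(k):
                # extra power needed to move subcarrier k from bits[k] to bits[k] + 1
                return gap * (2 ** (bits[k] + 1) - 2 ** bits[k]) / cnr[k]

            used = 0.0
            while True:
                candidates = [k for k in range(len(cnr)) if bits[k] < max_bits]
                if not candidates:
                    break
                k = min(candidates, key=inc_power)
                if used + inc_power(k) > power_budget:
                    break
                used += inc_power(k)
                bits[k] += 1
            return bits

        cnr = np.array([50.0, 30.0, 10.0, 5.0])   # hypothetical per-subcarrier CNR
        print(levin_campello(cnr, power_budget=3.0))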

  2. Quantitative prediction of genome-wide resource allocation in bacteria.

    PubMed

    Goelzer, Anne; Muntel, Jan; Chubukov, Victor; Jules, Matthieu; Prestel, Eric; Nölker, Rolf; Mariadassou, Mahendra; Aymerich, Stéphane; Hecker, Michael; Noirot, Philippe; Becher, Dörte; Fromion, Vincent

    2015-11-01

    Predicting resource allocation between cell processes is the primary step towards decoding the evolutionary constraints governing bacterial growth under various conditions. Quantitative prediction at genome-scale remains a computational challenge as current methods are limited by the tractability of the problem or by simplifying hypotheses. Here, we show that the constraint-based modeling method Resource Balance Analysis (RBA), calibrated using genome-wide absolute protein quantification data, accurately predicts resource allocation in the model bacterium Bacillus subtilis for a wide range of growth conditions. The regulation of most cellular processes is consistent with the objective of growth rate maximization except for a few suboptimal processes which likely integrate more complex objectives such as coping with stressful conditions and survival. As a proof of principle by using simulations, we illustrated how calibrated RBA could aid rational design of strains for maximizing protein production, offering new opportunities to investigate design principles in prokaryotes and to exploit them for biotechnological applications.

  3. Caesarean Delivery Rate Review: An Evidence-Based Analysis

    PubMed Central

    Degani, N; Sikich, N

    2015-01-01

    Background In 2007, caesarean deliveries comprised 28% of all hospital deliveries in Ontario. Provincial caesarean delivery rates increased with maternal age and varied by Local Health Integration Network. However, the accepted rate of caesarean delivery in a low-risk maternal population remains unclear. Objectives To review the literature to assess factors that affect the likelihood of experiencing a caesarean delivery, and to examine Ontario caesarean delivery rates to determine whether there is rate variation across the province. Data Sources Data sources included publications from OVID MEDLINE, OVID MEDLINE In-Process and Other Non-Indexed Citations, OVID Embase, EBSCO Cumulative Index to Nursing & Allied Health Literature (CINAHL), and EBM Reviews, as well as data from the Canadian Institute for Health Information Discharge Abstracts Database and the Better Outcomes and Registry Network. Review Methods A mixed-methods approach was used, which included a systematic review of the literature to delineate factors associated with the likelihood of caesarean delivery and an analysis of administrative and clinical data on hospital deliveries in Ontario to determine provincial caesarean delivery rates, variation in rates, and reasons for variation. Results Fourteen systematic reviews assessed 14 factors affecting the likelihood of caesarean delivery; 7 factors were associated with an increased likelihood of caesarean delivery, and 2 factors were associated with a decreased likelihood. Five factors had no influence. One factor provided moderate-quality evidence supporting elective induction policies in low-risk women. The overall Ontario caesarean delivery rate in a very-low-risk population was 17%, but varied significantly across Ontario hospitals. Limitations The literature review included a 5–year period and used only systematic reviews. The determination of Robson class for women is based on care received in hospital only, and the low-risk population may have

  4. Confidentiality Protection of User Data and Adaptive Resource Allocation for Managing Multiple Workflow Performance in Service-Based Systems

    ERIC Educational Resources Information Center

    An, Ho

    2012-01-01

    In this dissertation, two interrelated problems of service-based systems (SBS) are addressed: protecting users' data confidentiality from service providers, and managing performance of multiple workflows in SBS. Current SBSs pose serious limitations to protecting users' data confidentiality. Since users' sensitive data is sent in…

  5. Approaches to Resource Allocation

    ERIC Educational Resources Information Center

    Dressel, Paul; Simon, Lou Anna Kimsey

    1976-01-01

    Various budgeting patterns and strategies are currently in use, each with its own particular strengths and weaknesses. Neither cost-benefit analysis nor cost-effectiveness analysis offers any better solution to the allocation problem than do the unsupported contentions of departments or the historical unit costs. An operable model that performs…

  6. Monetary-based consequences for drug abstinence: Methods of implementation and some considerations about the allocation of finances in substance abusers

    PubMed Central

    Dallery, Jesse; Raiff, Bethany

    2012-01-01

    Conceptualizing drug abuse within the framework of behavioral theories of choice highlights the relevance of environmental variables in shifting behavior away from drug-related purchases. Choosing to use drugs results in immediate, certain consequences (e.g., drug high and relief from withdrawal), whereas choosing abstinence typically results in delayed, and often uncertain, consequences (e.g., improved health, interpersonal relationships, money). Contingency management (CM) increases choice for drug abstinence via the availability of immediate, financial-based gains, contingent on objective evidence of abstinence. In this selective review of the literature, we highlight a variety of methods to deliver CM in practical, effective, and sustainable ways. We consider a number of parameters that are critical to the success of monetary-based CM, and the role of the context in influencing CM’s effects. To illustrate the broad range of applications of CM, we also review different methods for arranging contingencies to promote abstinence and other relevant behavior. Finally, we discuss some considerations about how drug-dependent individuals allocate their finances in the context of CM interventions. PMID:22149758

  7. Video-rate volumetric optical coherence tomography-based microangiography

    NASA Astrophysics Data System (ADS)

    Baran, Utku; Wei, Wei; Xu, Jingjiang; Qi, Xiaoli; Davis, Wyatt O.; Wang, Ruikang K.

    2016-04-01

    Video-rate volumetric optical coherence tomography (vOCT) is relatively young in the field of OCT imaging but has great potential in biomedical applications. Due to the recent development of the MHz range swept laser sources, vOCT has started to gain attention in the community. Here, we report the first in vivo video-rate volumetric OCT-based microangiography (vOMAG) system by integrating an 18-kHz resonant microelectromechanical system (MEMS) mirror with a 1.6-MHz FDML swept source operating at ˜1.3 μm wavelength. Because the MEMS scanner can offer an effective B-frame rate of 36 kHz, we are able to engineer vOMAG with a video rate up to 25 Hz. This system was utilized for real-time volumetric in vivo visualization of cerebral microvasculature in mice. Moreover, we monitored the blood perfusion dynamics during stimulation within mouse ear in vivo. We also discussed this system's limitations. Prospective MEMS-enabled OCT probes with a real-time volumetric functional imaging capability can have a significant impact on endoscopic imaging and image-guided surgery applications.

  9. Cardiac rate detection method based on the beam splitter prism

    NASA Astrophysics Data System (ADS)

    Yang, Lei; Liu, Xiaohua; Liu, Ming; Zhao, Yuejin; Dong, Liquan; Zhao, Ruirui; Jin, Xiaoli; Zhao, Jingsheng

    2013-09-01

A new cardiac rate measurement method is proposed. Through the beam splitter prism, a common-path optical system for transmitting and receiving signals is achieved. By the focusing effect of the lens, small-amplitude motion artifacts are suppressed and the signal-to-noise ratio is improved. The cardiac rate is obtained based on photoplethysmography (PPG). We use an LED as the light source and a photoelectric diode as the receiving tube. The LED and the photoelectric diode are on different sides of the beam splitter prism and together form the optical system. The signal processing and display unit is composed of the signal processing circuit, a data acquisition device, and a computer. The light emitted by the modulated LED is collimated by the lens and irradiates the measurement target through the beam splitter prism. The light reflected by the target is focused on the receiving tube through the beam splitter prism and another lens. The signal received by the photoelectric diode is processed by the analog circuit and captured by the data acquisition device. Through filtering and the Fast Fourier Transform, the cardiac rate is obtained. We get the real-time cardiac rate by the moving-average method. We experimented with 30 volunteers of different genders and ages. We compared the signals captured by this method to a conventional PPG signal captured concurrently from a finger. The experimental results agree relatively well, and the biggest deviation is about 2 bpm.

  10. A count rate based contamination control standard for electron accelerators

    SciTech Connect

    May, R.T.; Schwahn, S.O.

    1996-12-31

    Accelerators of sufficient energy and particle fluence can produce radioactivity as an unwanted byproduct. The radioactivity is typically imbedded in structural materials but may also be removable from surfaces. Many of these radionuclides decay by positron emission or electron capture; they often have long half lives and produce photons of low energy and yield making detection by standard devices difficult. The contamination control limit used throughout the US nuclear industry and the Department of Energy is 1,000 disintegrations per minute. This limit is based on the detection threshold of pancake type Geiger-Mueller probes for radionuclides of relatively high radiotoxicity, such as cobalt-60. Several radionuclides of concern at a high energy electron accelerator are compared in terms of radiotoxicity with radionuclides commonly found in the nuclear industry. Based on this comparison, a count-rate based contamination control limit and associated measurement strategy is proposed which provides adequate detection of contamination at accelerators without an increase in risk.
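
    The arithmetic behind a count-rate limit is a direct conversion from the activity limit through detector efficiency and probe coverage; the Python sketch below uses hypothetical instrument values for illustration.

        def count_rate_limit(dpm_limit=1000.0, efficiency=0.10,
                             probe_area_cm2=15.0, source_area_cm2=100.0):
            """Net counts-per-minute alarm point equivalent to a dpm/100 cm^2 limit.
            efficiency = total detection efficiency (counts per disintegration)."""
            dpm_under_probe = dpm_limit * probe_area_cm2 / source_area_cm2
            return dpm_under_probe * efficiency

        print(count_rate_limit())   # 15 net cpm for these hypothetical values

    A lower detection efficiency for the low-energy, low-yield photon emitters discussed above pushes the equivalent count-rate alarm point correspondingly lower.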

  11. [Design of Oxygen Saturation, Heart Rate, Respiration Rate Detection System Based on Smartphone of Android Operating System].

    PubMed

    Zhu, Mingshan; Zeng, Bixin

    2015-03-01

In this paper, we designed an oxygen saturation, heart rate, and respiration rate monitoring system based on a smartphone running the Android operating system; the physiological signals are acquired by an MSP430 microcontroller and transmitted via a Bluetooth module. PMID:26524782

  13. Doppler-Based Flow Rate Sensing in Microfluidic Channels

    PubMed Central

    Stern, Liron; Bakal, Avraham; Tzur, Mor; Veinguer, Maya; Mazurski, Noa; Cohen, Nadav; Levy, Uriel

    2014-01-01

We design, fabricate and experimentally demonstrate a novel generic method to detect flow rates and precise changes of flow velocity in microfluidic devices. Using our method we can measure flow rates of ∼2 mm/s with a resolution of 0.08 mm/s. The operation principle is based on the Doppler shifting of light diffracted from a self-generated periodic array of bubbles within the channel, with self-heterodyne detection used to analyze the diffracted light. As such, the device is appealing for a variety of “lab on chip” bio-applications where a simple and accurate speed measurement is needed, e.g., flow cytometry and cell sorting. PMID:25211195

  14. Rate-Based Model Predictive Control of Turbofan Engine Clearance

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan A.

    2006-01-01

    An innovative model predictive control strategy is developed for control of nonlinear aircraft propulsion systems and sub-systems. At the heart of the controller is a rate-based linear parameter-varying model that propagates the state derivatives across the prediction horizon, extending prediction fidelity to transient regimes where conventional models begin to lose validity. The new control law is applied to a demanding active clearance control application, where the objectives are to tightly regulate blade tip clearances and also anticipate and avoid detrimental blade-shroud rub occurrences by optimally maintaining a predefined minimum clearance. Simulation results verify that the rate-based controller is capable of satisfying the objectives during realistic flight scenarios where both a conventional Jacobian-based model predictive control law and an unconstrained linear-quadratic optimal controller are incapable of doing so. The controller is evaluated using a variety of different actuators, illustrating the efficacy and versatility of the control approach. It is concluded that the new strategy has promise for this and other nonlinear aerospace applications that place high importance on the attainment of control objectives during transient regimes.

  15. Infrared imaging based hyperventilation monitoring through respiration rate estimation

    NASA Astrophysics Data System (ADS)

    Basu, Anushree; Routray, Aurobinda; Mukherjee, Rashmi; Shit, Suprosanna

    2016-07-01

A change in skin temperature is used as an indicator of physical illness and can be detected through infrared thermography. Thermograms, or thermal images, can be used as an effective diagnostic tool for the monitoring and diagnosis of various diseases. This paper describes an infrared thermography based approach for detecting hyperventilation caused by stress and anxiety in human beings by computing their respiration rates. The work employs computer vision techniques for tracking the region of interest in thermal video to compute the breath rate. Experiments have been performed on 30 subjects. Corner feature extraction using the minimum eigenvalue (Shi-Tomasi) algorithm and registration using the Kanade-Lucas-Tomasi algorithm have been used. The thermal signature around the extracted region is detected and subsequently filtered through a band-pass filter to compute the respiration profile of an individual. If the respiration profile shows an unusual pattern and exceeds the threshold, we conclude that the person is stressed and tending to hyperventilate. The results obtained are compared with standard contact-based methods and show significant correlations. It is envisaged that the thermal image based approach will not only help in detecting hyperventilation but can also assist in regular stress monitoring, as it is a non-invasive method.
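
    A minimal sketch of the described chain - Shi-Tomasi corner detection, Kanade-Lucas-Tomasi tracking, band-pass filtering, and a dominant-frequency estimate - is given below using OpenCV and SciPy; the frame rate, filter band, and feature parameters are assumptions rather than the authors' settings.

        import cv2
        import numpy as np
        from scipy.signal import butter, filtfilt

        def respiration_rate(frames, fps=30.0, band_hz=(0.1, 0.85)):
            """frames: list of grayscale thermal frames (uint8). Returns breaths/min."""
            prev = frames[0]
            pts = cv2.goodFeaturesToTrack(prev, maxCorners=50, qualityLevel=0.01,
                                          minDistance=5)            # Shi-Tomasi corners
            signal = []
            for frame in frames[1:]:
                pts, status, _ = cv2.calcOpticalFlowPyrLK(prev, frame, pts, None)  # KLT
                pts = pts[status.ravel() == 1].reshape(-1, 1, 2)
                xs = pts[:, 0, 0].astype(int).clip(0, frame.shape[1] - 1)
                ys = pts[:, 0, 1].astype(int).clip(0, frame.shape[0] - 1)
                signal.append(frame[ys, xs].mean())   # mean thermal signature at tracked points
                prev = frame
            b, a = butter(2, [f / (fps / 2) for f in band_hz], btype="band")
            filtered = filtfilt(b, a, np.asarray(signal) - np.mean(signal))
            freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
            dominant = freqs[np.argmax(np.abs(np.fft.rfft(filtered)))]
            return dominant * 60.0                    # breaths per minute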

  16. Visual Attention Allocation Between Robotic Arm and Environmental Process Control: Validating the STOM Task Switching Model

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher; Vieanne, Alex; Clegg, Benjamin; Sebok, Angelia; Janes, Jessica

    2015-01-01

Fifty-six participants time-shared a spacecraft environmental control system task with a realistic space robotic arm control task in either a manual or highly automated version. The former could suffer minor failures, whose diagnosis and repair were supported by a decision aid. At the end of the experiment this decision aid unexpectedly failed. We measured visual attention allocation and switching between the two tasks in each of the eight conditions formed by manual/automated arm × expected/unexpected failure × monitoring/failure management. We also used our multi-attribute task switching model, based on task attributes of priority, interest, difficulty, and salience that were self-rated by participants, to predict allocation. An un-weighted model based on the attributes of difficulty, interest, and salience accounted for 96 percent of the task allocation variance across the eight different conditions. Task difficulty served as an attractor, with more difficult tasks increasing the tendency to stay on task.
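
    One plausible reading of the un-weighted attribute model is a normalized sum of each task's self-rated attributes; the Python sketch below illustrates that reading with hypothetical ratings and is not the exact STOM formulation.

        def predicted_allocation(task_attributes):
            """task_attributes: {task: {'difficulty': x, 'interest': y, 'salience': z}}
            Returns each task's predicted share of attention (un-weighted attribute sum)."""
            scores = {t: sum(a.values()) for t, a in task_attributes.items()}
            total = sum(scores.values())
            return {t: s / total for t, s in scores.items()}

        # Hypothetical self-ratings on a 1-5 scale
        tasks = {
            "robotic_arm":   {"difficulty": 4, "interest": 4, "salience": 3},
            "environmental": {"difficulty": 2, "interest": 3, "salience": 2},
        }
        print(predicted_allocation(tasks))   # e.g. {'robotic_arm': 0.61, 'environmental': 0.39}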

  17. Networks in financial markets based on the mutual information rate.

    PubMed

    Fiedor, Paweł

    2014-05-01

In the last few years there have been many efforts in econophysics studying how network theory can facilitate understanding of complex financial markets. These efforts consist mainly of the study of correlation-based hierarchical networks. This is somewhat surprising, as the underlying assumption of research looking at financial markets is that they are complex systems and thus behave in a nonlinear manner, which is confirmed by numerous studies, making the use of correlations, which inherently capture only linear dependencies, baffling. In this paper we introduce a way to incorporate nonlinear dynamics and dependencies into hierarchical networks to study financial markets using mutual information and its dynamical extension: the mutual information rate. We show that this approach leads to different results than the correlation-based approach used in most studies, on the basis of 91 companies listed on the New York Stock Exchange 100 between 2003 and 2013, using minimal spanning trees and planar maximally filtered graphs.

  19. Rate-based degradation modeling of lithium-ion cells

    SciTech Connect

    E.V. Thomas; I. Bloom; J.P. Christophersen; V.S. Battaglia

    2012-05-01

    Accelerated degradation testing is commonly used as the basis to characterize battery cell performance over a range of stress conditions (e.g., temperatures). Performance is measured by some response that is assumed to be related to the state of health of the cell (e.g., discharge resistance). Often, the ultimate goal of such testing is to predict cell life at some reference stress condition, where cell life is defined to be the point in time where performance has degraded to some critical level. These predictions are based on a degradation model that expresses the expected performance level versus the time and conditions under which a cell has been aged. Usually, the degradation model relates the accumulated degradation to the time at a constant stress level. The purpose of this article is to present an alternative framework for constructing a degradation model that focuses on the degradation rate rather than the accumulated degradation. One benefit of this alternative approach is that prediction of cell life is greatly facilitated in situations where the temperature exposure is not isothermal. This alternative modeling framework is illustrated via a family of rate-based models and experimental data acquired during calendar-life testing of high-power lithium-ion cells.
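
    A minimal sketch of the rate-based idea, assuming an Arrhenius temperature dependence (not necessarily the authors' functional form): define a degradation rate as a function of temperature and integrate it over an arbitrary, non-isothermal exposure history, which is the stated advantage over accumulated-degradation models.

        import numpy as np

        def degradation_rate(T_kelvin, A=1e5, Ea=0.5, k_B=8.617e-5):
            """Arrhenius-type rate (hypothetical units: degradation fraction per day)."""
            return A * np.exp(-Ea / (k_B * T_kelvin))

        def accumulated_degradation(temps_kelvin, dt_days=1.0):
            """Integrate the rate over a (possibly non-isothermal) temperature history."""
            return float(np.sum(degradation_rate(np.asarray(temps_kelvin)) * dt_days))

        # Hypothetical exposure: 100 days at 25 C, then 50 days at 45 C
        profile = [298.15] * 100 + [318.15] * 50
        print(accumulated_degradation(profile))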

  20. Rate-based screening of pressure-dependent reaction networks

    NASA Astrophysics Data System (ADS)

    Matheu, David M.; Lada, Thomas A.; Green, William H.; Dean, Anthony M.; Grenda, Jeffrey M.

    2001-08-01

Computer tools to automatically generate large gas-phase kinetic models find increasing use in industry. Until recently, mechanism generation algorithms have been restricted to generating kinetic models in the high-pressure limit, unless special adjustments are made for particular cases. A new approach, recently presented, allows the automated generation of pressure-dependent reaction networks for chemically and thermally activated reactions (Grenda et al., 2000; Grenda and Dean, in preparation; Grenda et al., 1998; see Refs. [1-3]). These pressure-dependent reaction networks can be quite large and can contain a large number of unimportant pathways. We thus present an algorithm for the automated screening of pressure-dependent reaction networks. It allows a computer to discover and incorporate pressure-dependent reactions in a manner consistent with the existing rate-based model generation method. The new algorithm works by using a partially-explored (or "screened") pressure-dependent reaction network to predict rate constants, and updating predictions as more parts of the network are discovered. It requires only partial knowledge of the network connectivity, and allows the user to explore only the important channels at a given temperature and pressure. Applications to vinyl + O2, 1-naphthyl + acetylene and phenylvinyl radical dissociation are presented. We show that the error involved in using a truncated pressure-dependent network to predict a rate constant is insignificant, for all channels whose yields are significantly greater than a user-specified tolerance. A bound for the truncation error is given. This work demonstrates the feasibility of using screened networks to predict pressure-dependent rate constants k(T,P).

  1. 7 CFR 6.25 - Allocation of licenses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

Title 7 (Agriculture), Volume 1, revised as of 2013-01-01. Office of the Secretary of Agriculture, Import Quotas and Fees, Dairy Tariff-Rate Import Quota Licensing, § 6.25 Allocation of licenses: (a) Historical licenses for the 1997 quota year (Appendix 1). (1)...

  2. Satellite altimetry based rating curves throughout the entire Amazon basin

    NASA Astrophysics Data System (ADS)

    Paris, A.; Calmant, S.; Paiva, R. C.; Collischonn, W.; Silva, J. S.; Bonnet, M.; Seyler, F.

    2013-05-01

The Amazonian basin is the largest hydrological basin in the world. In recent years, the basin has experienced an unusual succession of extreme droughts and floods, whose origin is still a matter of debate. Yet the amount of data available is poor, over both time and space scales, owing to factors such as the basin's size and difficulty of access. One of the major obstacles is obtaining discharge series distributed over the entire basin. Satellite altimetry can be used to improve our knowledge of the hydrological streamflow conditions in the basin through rating curves. Rating curves are mathematical relationships between stage and discharge at a given place. The common way to determine the parameters of the relationship is to compute a non-linear regression between the discharge and stage series. In this study, the discharge data were obtained by simulation over the entire basin using the MGB-IPH model with TRMM Merge input rainfall data and assimilation of gage data, run from 1998 to 2010. The stage dataset is made of ~800 altimetry series at ENVISAT and JASON-2 virtual stations. Altimetry series span between 2002 and 2010. In the present work we present the benefits of using stochastic methods instead of probabilistic ones to determine a dataset of rating curve parameters that is consistent throughout the entire Amazon basin. The rating curve parameters have been computed using a parameter optimization technique based on a Markov Chain Monte Carlo sampler and a Bayesian inference scheme. This technique provides an estimate of the best parameters for the rating curve, as well as their posterior probability distribution, allowing the determination of a credibility interval for the rating curve. Also included in the rating curve determination is the error over the discharge estimates from the MGB-IPH model. These MGB-IPH errors come from either errors in the discharge derived from the gage readings or errors in the satellite rainfall estimates. The present
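
    A minimal sketch of the idea in Python: fit a standard power-law rating curve Q = a(h - h0)^b to paired stage/discharge samples with a random-walk Metropolis sampler. The data, step sizes, and 10% error model are hypothetical and far simpler than the study's scheme, but the posterior draws show how a credibility interval for the curve is obtained.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical paired samples: stage h (m) and modeled discharge Q (m3/s)
        h = np.array([10.2, 11.0, 12.5, 13.1, 14.0, 15.2])
        Q = np.array([730., 1350., 3040., 3900., 5400., 7780.])

        def log_post(theta):
            a, h0, b = theta
            if a <= 0 or b <= 0 or h0 >= h.min():
                return -np.inf
            pred = a * (h - h0) ** b
            sigma = 0.1 * Q                       # assumed 10% discharge uncertainty
            return -0.5 * np.sum(((Q - pred) / sigma) ** 2)

        theta = np.array([100.0, 8.5, 1.8])       # initial guess for (a, h0, b)
        lp = log_post(theta)
        samples = []
        for _ in range(20000):                    # random-walk Metropolis
            prop = theta + rng.normal(0.0, [10.0, 0.05, 0.05])
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            samples.append(theta.copy())

        post = np.array(samples[5000:])           # discard burn-in
        print(post.mean(axis=0))                  # posterior mean of (a, h0, b)
        print(np.percentile(post, [2.5, 97.5], axis=0))   # 95% credibility bounds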

  3. Myrmics Memory Allocator

    SciTech Connect

    Lymperis, S.

    2011-09-23

    MMA is a stand-alone memory management system for MPI clusters. It implements a shared Partitioned Global Address Space, where multiple MPI processes request objects from the allocator and the latter provides them with system-wide unique memory addresses for each object. It provides applications with an intuitive way of managing the memory system in a unified way, thus enabling easier writing of irregular application code.

  4. Metabolically Derived human ventilation rates: A revised approach based upon oxygen consumption rates (Final Report) 2009

    EPA Science Inventory

    The purpose of this report is to provide a revised approach for calculating an individual's ventilation rate directly from their oxygen consumption rate. This revised approach will be used to update the ventilation rate information in the Exposure Factors Handbook, which serve as...

  5. Synaptic Tagging During Memory Allocation

    PubMed Central

    Rogerson, Thomas; Cai, Denise; Frank, Adam; Sano, Yoshitake; Shobe, Justin; Aranda, Manuel L.; Silva, Alcino J.

    2014-01-01

    There is now compelling evidence that the allocation of memory to specific neurons (neuronal allocation) and synapses (synaptic allocation) in a neurocircuit is not random and that instead specific mechanisms, such as increases in neuronal excitability and synaptic tagging and capture, determine the exact sites where memories are stored. We propose an integrated view of these processes, such that neuronal allocation, synaptic tagging and capture, spine clustering and metaplasticity reflect related aspects of memory allocation mechanisms. Importantly, the properties of these mechanisms suggest a set of rules that profoundly affect how memories are stored and recalled. PMID:24496410

  6. What Does it Really Cost? Allocating Indirect Costs.

    ERIC Educational Resources Information Center

    Snyder, Herbert; Davenport, Elisabeth

    1997-01-01

Better managerial control in terms of decision making and understanding the costs of a system/service result from allocating indirect costs. Allocation requires a three-step process: selecting cost objectives, pooling related overhead costs, and selecting cost bases to connect the objectives to the pooled costs. Argues that activity-based costing…

  7. 50 CFR 660.323 - Pacific whiting allocations, allocation attainment, and inseason allocation reapportionment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

Title 50 (Wildlife and Fisheries), Volume 9, revised as of 2010-10-01. Fisheries off West Coast States, West Coast Groundfish Fisheries, § 660.323 Pacific whiting allocations, allocation attainment, and inseason allocation reapportionment (text truncated in source).

  8. [Health resources allocation in Canada provinces: the role of indicators of health needs].

    PubMed

    Thouez, Jean-Pierre

    2002-01-01

In an attempt to limit their health care expenditures, Canadian provinces have stressed the necessity of allocating health care resources according to their populations' needs. The difficulties and limitations of the needs-based approach are explored. First, indicators of population needs for health care were introduced into a formula of resource allocation for hospital-based services in England in the late 1970s. Secondly, there are broad similarities between both the philosophy and the resource allocation strategies of Canada and Britain. Thirdly, the main function of a needs indicator is to measure the level of equity - or inequity - in the distribution of health care resources between regions. Fourthly, a needs indicator, at least as developed by the Canadian provinces, concerns general and specialized services that should be found in each of their regions. Fifthly, a needs indicator constitutes a tool for the calculation of a capitation rate. Finally, future research should focus on parameters which are not an integral part of the allocation method but which have a strong impact on the attainment of regional equity, such as administrative decisions that are taken when budgets are to be allocated or reduced between regions. PMID:12050941

  10. Stage-discharge rating curves based on satellite altimetry and modeled discharge in the Amazon basin

    NASA Astrophysics Data System (ADS)

    Paris, Adrien; Dias de Paiva, Rodrigo; Santos da Silva, Joecila; Medeiros Moreira, Daniel; Calmant, Stephane; Garambois, Pierre-André; Collischonn, Walter; Bonnet, Marie-Paule; Seyler, Frederique

    2016-05-01

    In this study, rating curves (RCs) were determined by applying satellite altimetry to a poorly gauged basin. This study demonstrates the synergistic application of remote sensing and watershed modeling to capture the dynamics and quantity of flow in the Amazon River Basin, respectively. Three major advancements for estimating basin-scale patterns in river discharge are described. The first advancement is the preservation of the hydrological meanings of the parameters expressed by Manning's equation to obtain a data set containing the elevations of the river beds throughout the basin. The second advancement is the provision of parameter uncertainties and, therefore, the uncertainties in the rated discharge. The third advancement concerns estimating the discharge while considering backwater effects. We analyzed the Amazon Basin using nearly one thousand series that were obtained from ENVISAT and Jason-2 altimetry for more than 100 tributaries. Discharge values and related uncertainties were obtained from the rain-discharge MGB-IPH model. We used a global optimization algorithm based on the Monte Carlo Markov Chain and Bayesian framework to determine the rating curves. The data were randomly allocated into 80% calibration and 20% validation subsets. A comparison with the validation samples produced a Nash-Sutcliffe efficiency (Ens) of 0.68. When the MGB discharge uncertainties were less than 5%, the Ens value increased to 0.81 (mean). A comparison with the in situ discharge resulted in an Ens value of 0.71 for the validation samples (and 0.77 for calibration). The Ens values at the mouths of the rivers that experienced backwater effects significantly improved when the mean monthly slope was included in the RC. Our RCs were not mission-dependent, and the Ens value was preserved when applying ENVISAT rating curves to Jason-2 altimetry at crossovers. The cease-to-flow parameter of our RCs provided a good proxy for determining river bed elevation. This proxy was validated

  11. Ground data systems resource allocation process

    NASA Technical Reports Server (NTRS)

    Berner, Carol A.; Durham, Ralph; Reilly, Norman B.

    1989-01-01

    The Ground Data Systems Resource Allocation Process at the Jet Propulsion Laboratory provides medium- and long-range planning for the use of Deep Space Network and Mission Control and Computing Center resources in support of NASA's deep space missions and Earth-based science. Resources consist of radio antenna complexes and associated data processing and control computer networks. A semi-automated system was developed that allows operations personnel to interactively generate, edit, and revise allocation plans spanning periods of up to ten years (as opposed to only two or three weeks under the manual system) based on the relative merit of mission events. It also enhances scientific data return. A software system known as the Resource Allocation and Planning Helper (RALPH) merges the conventional methods of operations research, rule-based knowledge engineering, and advanced data base structures. RALPH employs a generic, highly modular architecture capable of solving a wide variety of scheduling and resource sequencing problems. The rule-based RALPH system has saved significant labor in resource allocation. Its successful use affirms the importance of establishing and applying event priorities based on scientific merit, and the benefit of continuity in planning provided by knowledge-based engineering. The RALPH system exhibits a strong potential for minimizing development cycles of resource and payload planning systems throughout NASA and the private sector.

  12. Base rates of hate crime victimization among college students.

    PubMed

    Rayburn, Nadine Recker; Earleywine, Mitchell; Davison, Gerald C

    2003-10-01

    This study uses the unmatched count technique (UCT) to estimate base rates for hate crime victimization in college students and compares the results with estimates found using conventional methods. Hate crimes are criminal acts perpetrated against individuals or members of specific stigmatized groups and intended to express condemnation, hate, disapproval, dislike, or distrust for that group. The UCT is a promising tool in the investigation of hate crime because it does not require participants to directly answer sensitive questions, which may yield more accurate responses than other methods. The UCT revealed higher estimates for a variety of serious hate crimes, including physical and sexual assault. These higher estimates give a clearer picture of the level of hate crime victimization and point to the increased need for hate crime victims' assistance programs on college campuses.
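
    In the unmatched count technique, one randomly assigned group reports how many items on a list of innocuous statements apply to them, while the other group receives the same list plus the sensitive item; the base rate is estimated as the difference in mean counts. A minimal sketch with hypothetical counts:

      import numpy as np

      def uct_prevalence(counts_control, counts_treatment):
          """Estimate the base rate of the sensitive item as the difference in mean item counts
          between the treatment list (sensitive item included) and the control list."""
          return float(np.mean(counts_treatment) - np.mean(counts_control))

      # Hypothetical reported counts from two randomly assigned groups of students
      control = np.array([2, 3, 1, 2, 4, 2, 3, 1])     # list of 4 innocuous items
      treatment = np.array([3, 3, 2, 2, 4, 3, 3, 2])   # same 4 items plus the victimization item
      print(f"estimated victimization rate: {uct_prevalence(control, treatment):.2f}")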

  13. Optimization of Surface Acoustic Wave-Based Rate Sensors

    PubMed Central

    Xu, Fangqian; Wang, Wen; Shao, Xiuting; Liu, Xinlu; Liang, Yong

    2015-01-01

    The optimization of a surface acoustic wave (SAW)-based rate sensor incorporating metallic dot arrays was performed by using the approach of partial-wave analysis in layered media. The optimal sensor chip designs, including the material choice of piezoelectric crystals and metallic dots, dot thickness, and sensor operation frequency, were determined theoretically. The theoretical predictions were confirmed experimentally by using the developed SAW sensor composed of differential delay line-oscillators and a metallic dot array deposited along the acoustic wave propagation path of the SAW delay lines. A significant improvement in sensor sensitivity was achieved when 128° YX LiNbO3, a thicker Au dot array, and a low operation frequency were used to structure the sensor. PMID:26473865

  14. Heat release rate properties of wood-based materials

    SciTech Connect

    Chamberlain, D.L.

    1983-07-01

    A background to the present heat release rate calorimetry is presented. Heat release rates and cumulative heat release were measured for 16 different lumber and wood products, using three different heat release rate instruments. The effects of moisture content, exposure heat flux, density of product, and fire retardant on rate of heat release were measured. The three small-scale heat release rate calorimeters were compared, and equations relating the data from each were developed.

  15. Foraging Allocation in the Honey Bee, Apis mellifera L. (Hymenoptera, Apidae), Tuned by the Presence of the Spinosad-Based Pesticide GF-120.

    PubMed

    Cabrera-Marín, N V; Liedo, P; Vandame, R; Sánchez, D

    2015-04-01

    Agroecosystem management commonly involves the use of pesticides. As a result, a heterogeneous landscape is created, in which suitable and unsuitable spaces are defined by the absence/presence of pesticides. In this study, we explored how foragers of the honey bee, Apis mellifera L., adapt to such a context. We specifically evaluated the effect of GF-120, a spinosad-based fruit fly toxic bait, on the allocation of foragers between food sources, under the hypothesis that foragers will move from food sources with GF-120 to food sources without it. We carried out three experiments: in experiment 1, a group of foragers was trained to collect honey solution from a feeder; next, this feeder offered a GF-120/honey solution. A minority of foragers continued collecting the GF-120/honey solution. In experiment 2, we trained two groups of foragers from a colony to two equally rewarding food sources. Next, GF-120 was added to one of the food sources. We found that the majority of foragers moved from the GF-120-treated feeder to the feeder without GF-120 and that the minority that continued visiting the GF-120-treated feeder did not collect the GF-120/honey solution. In a third experiment, we examined whether foragers in an experimental setup like that of experiment 1 would perform waggle dances: none of the foragers that collected GF-120/honey were observed dancing. Our results emphasize the importance of "food refuges" for non-target species, since they minimize the impact of agrochemicals upon them. PMID:26013135

  16. High performance in healthcare priority setting and resource allocation: A literature- and case study-based framework in the Canadian context.

    PubMed

    Smith, Neale; Mitton, Craig; Hall, William; Bryan, Stirling; Donaldson, Cam; Peacock, Stuart; Gibson, Jennifer L; Urquhart, Bonnie

    2016-08-01

    Priority setting and resource allocation, or PSRA, are key functions of executive teams in healthcare organizations. Yet decision-makers often base their choices on historical patterns of resource distribution or political pressures. Our aim was to provide leaders with guidance on how to improve PSRA practice, by creating organizational contexts which enable high performance. We carried out in-depth case studies of six Canadian healthcare organizations to obtain from healthcare leaders their understanding of the concept of high performance in PSRA and the factors which contribute to its achievement. Individual and group interviews were carried out (n = 62) with senior managers, middle managers and Board members. Site observations and document review were used to assist researchers in interpreting the interview data. Qualitative data were analyzed iteratively with the literature on empirical examples of PSRA practice, in order to develop a framework of high performance in PSRA. The framework consists of four domains - structures, processes, attitudes and behaviours, and outcomes - within which are 19 specific elements. The emergent themes derive from case studies in different kinds of health organizations (urban/rural, small/large) across Canada. The elements can serve as a checklist for 'high performance' in PSRA. This framework provides a means by which decision-makers in healthcare might assess their practice and identify key areas for improvement. The findings are likely generalizable, certainly within Canada but also across countries. This work constitutes, to our knowledge, the first attempt to present a full package of elements comprising high performance in health care PSRA.

  18. Simplifying rules for optimal allocation of preventive care resources.

    PubMed

    Gandjour, Afschin

    2012-04-01

    Given the limited resources for preventive care, policy-makers need to consider the efficiency/cost-effectiveness of preventive measures, such as drugs and vaccines, when allocating preventive care resources. However, in many settings only limited information on lifetime costs and effects of preventive measures exists. Therefore, it seems useful to provide policy-makers with some simplifying rules when allocating preventive care resources. The purpose of this article is to investigate the relevance of risk and severity of the disease to be prevented for the optimal allocation of preventive care resources. The report shows - based on a constrained optimization model - that optimal allocation of preventive care resources does, in fact, depend on both factors. Resources should be allocated to the prevention of diseases with a higher probability of occurrence or larger severity. This article also identifies situations where preventive care resources should be allocated to the prevention of less severe disease.
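
    The ranking implied by the constrained optimization result, prioritizing diseases with a higher probability of occurrence or greater severity, can be illustrated with a toy greedy allocation; this sketch is not the article's model, and all program names and numbers are hypothetical.

      def allocate_prevention(budget, programs):
          """Greedy sketch: fund programs in order of expected harm averted per unit cost,
          where expected harm = probability of disease * severity. Illustrative only."""
          ranked = sorted(programs, key=lambda p: p["prob"] * p["severity"] / p["cost"], reverse=True)
          funded, remaining = [], budget
          for p in ranked:
              if p["cost"] <= remaining:
                  funded.append(p["name"])
                  remaining -= p["cost"]
          return funded, remaining

      programs = [
          {"name": "vaccine A", "prob": 0.10, "severity": 8, "cost": 40},
          {"name": "screening B", "prob": 0.02, "severity": 9, "cost": 25},
          {"name": "drug C", "prob": 0.20, "severity": 2, "cost": 30},
      ]
      print(allocate_prevention(budget=60, programs=programs))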

  19. [Organ allocation. Ethical issues].

    PubMed

    Cattorini, P

    2010-01-01

    The criteria for allocating organs are among the most debated ethical issues in transplantation programs. The article examines some rules and principles followed by the "Nord Italia Transplant program", summarized in its Principles' Charter and explained in a recent interdisciplinary book. General theories of justice and their application to individual clinical cases are commented on and evaluated, in order to foster a public, democratic, transparent debate among professionals and citizens, scientific associations and customers' organizations. Some specific moral dilemmas are examined, regarding the concepts of proportionate treatment, unselfish donation by living persons, and the promotion of the efficiency of local institutions. PMID:20677677

  20. A comparison between computer-controlled and set work rate exercise based on target heart rate

    NASA Technical Reports Server (NTRS)

    Pratt, Wanda M.; Siconolfi, Steven F.; Webster, Laurie; Hayes, Judith C.; Mazzocca, Augustus D.; Harris, Bernard A., Jr.

    1991-01-01

    Two methods are compared for observing the heart rate (HR), metabolic equivalents, and time in the target HR zone (defined as the target HR ± 5 bpm) during 20 min of exercise at a prescribed intensity of the maximum working capacity. In one method, called set-work rate exercise, the information from a graded exercise test is used to select a target HR and to calculate a corresponding constant work rate that should induce the desired HR. In the other method, the work rate is controlled by a computer algorithm to achieve and maintain a prescribed target HR. It is shown that computer-controlled exercise is an effective alternative to the traditional set work rate exercise, particularly when tight control of cardiovascular responses is necessary.
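
    A closed-loop controller of this kind adjusts the work rate whenever the measured HR drifts from the target. The toy simulation below is not the study's algorithm: it assumes a first-order HR response and illustrative gains, and simply shows the idea of servoing the work rate to keep HR within the ±5 bpm zone.

      def simulate_hr_control(target_hr, minutes=20, dt=1.0 / 60):
          """Toy proportional controller: raise or lower the work rate each step so that the
          simulated heart rate converges to the target zone (target +/- 5 bpm)."""
          hr, work = 70.0, 50.0            # resting HR (bpm), initial work rate (W)
          kp, tau, gain = 2.0, 0.75, 0.4   # controller gain, HR time constant (min), bpm per W
          in_zone_min = 0.0
          for _ in range(int(minutes / dt)):
              work = max(0.0, work + kp * (target_hr - hr) * dt)  # adjust work toward target
              hr_ss = 70.0 + gain * work                          # steady-state HR for this work rate
              hr += (hr_ss - hr) * dt / tau                       # first-order HR response
              if abs(hr - target_hr) <= 5.0:
                  in_zone_min += dt
          return hr, work, in_zone_min

      print(simulate_hr_control(target_hr=130))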

  1. Artificial intelligent techniques for optimizing water allocation in a reservoir watershed

    NASA Astrophysics Data System (ADS)

    Chang, Fi-John; Chang, Li-Chiu; Wang, Yu-Chung

    2014-05-01

    This study proposes a systematic water allocation scheme that integrates system analysis with artificial intelligence techniques for reservoir operation, in consideration of the great hydrometeorological uncertainty, to mitigate drought impacts on the public and irrigation sectors. The AI techniques mainly include a genetic algorithm and an adaptive-network-based fuzzy inference system (ANFIS). We first derive evaluation diagrams through systematic interactive evaluations of long-term hydrological data to provide a clear simulation perspective of all possible drought conditions tagged with their corresponding water shortages; then search for the optimal reservoir operating histogram using a genetic algorithm (GA), based on given demands and hydrological conditions, which can be regarded as the optimal base of input-output training patterns for modelling; and finally build a suitable water allocation scheme by constructing an ANFIS model that learns the mechanism between designed inputs (water discount rates and hydrological conditions) and outputs (two scenarios: simulated and optimized water deficiency levels). The effectiveness of the proposed approach is tested on the operation of the Shihmen Reservoir in northern Taiwan for the first paddy crop in the study area to assess the water allocation mechanism during drought periods. We demonstrate that the proposed water allocation scheme substantially helps water managers reliably determine a suitable discount rate on water supply for both the irrigation and public sectors, and thus can reduce the drought risk and the compensation costs induced by restricting agricultural water use.

  2. Non-contact Laser-based Human Respiration Rate Measurement

    NASA Astrophysics Data System (ADS)

    Scalise, L.; Marchionni, P.; Ercoli, I.

    2011-08-01

    At present, the majority of the instrumentation used in clinical environments to measure human respiration rate is based on invasive, contact devices. The spirometer is considered the gold standard instrument and is widely used; it requires direct contact and the patient's cooperation. The Laser Doppler Vibrometer (LDVi) is an optical, non-contact measurement system for the assessment of surface velocity and displacement. LDVi has already been used for the measurement of cardiac activity and of chest-wall displacements. The aims of this work are to select the best measurement point on the thoracic surface for LDVi monitoring of the respiration rate (RR) and to compare the measured data with the RR values provided by a spirometer. The measurement system is composed of an LDV system and a data acquisition board installed on a PC. Tests were made on 10 different points of the thorax for each patient. The patient population comprised 33 subjects (17 male and 16 female). The optimal measurement point was chosen considering the maximum peak-to-peak value of the displacement measured by LDV. Before extracting RR we used a wavelet decomposition for better selection of the expiration peaks. A standard spirometer was used for validation of the data. The tests indicate that the optimal measurement point is located on the inferior part of the thoracic region (left, front side). We obtained a close correlation between the RR values measured by the spirometer and those measured by the proposed method: a difference of 14±211 ms in the RR value is reported for the entire population of 33 subjects. Our method allows non-contact measurement of lung activity (respiration period), reducing electrical and biological risks. Moreover, it allows measurement in critical environments, such as during MRI or on burned skin, where it is difficult or impossible to apply electrodes.
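
    As a rough illustration of turning a chest-wall displacement trace into a respiration rate, the sketch below detects expiration peaks and averages the peak-to-peak period; it replaces the study's wavelet-based peak selection with a plain distance-gated peak search, and the sampling rate and signal are synthetic.

      import numpy as np
      from scipy.signal import find_peaks

      def respiration_rate(displacement, fs_hz, min_period_s=1.5):
          """Estimate breaths per minute from a displacement trace by detecting expiration peaks
          and averaging the peak-to-peak periods (distance-gated peak search, illustrative only)."""
          peaks, _ = find_peaks(displacement, distance=int(min_period_s * fs_hz))
          if len(peaks) < 2:
              return float("nan")
          mean_period_s = np.mean(np.diff(peaks)) / fs_hz
          return 60.0 / mean_period_s

      fs = 100.0                                  # Hz, illustrative sampling rate
      t = np.arange(0, 60, 1 / fs)
      trace = np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.default_rng(3).normal(size=t.size)
      print(round(respiration_rate(trace, fs), 1))   # ~15 breaths per minute for a 0.25 Hz signal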

  3. Estimating glomerular filtration rate in a population-based study

    PubMed Central

    Shankar, Anoop; Lee, Kristine E; Klein, Barbara EK; Muntner, Paul; Brazy, Peter C; Cruickshanks, Karen J; Nieto, F Javier; Danforth, Lorraine G; Schubert, Carla R; Tsai, Michael Y; Klein, Ronald

    2010-01-01

    Background: Glomerular filtration rate (GFR)-estimating equations are used to determine the prevalence of chronic kidney disease (CKD) in population-based studies. However, it has been suggested that since the commonly used GFR equations were originally developed from samples of patients with CKD, they underestimate GFR in healthy populations. Few studies have made side-by-side comparisons of the effect of various estimating equations on the prevalence estimates of CKD in a general population sample. Patients and methods: We examined a population-based sample comprising adults from Wisconsin (age, 43–86 years; 56% women). We compared the prevalence of CKD, defined as a GFR of <60 mL/min per 1.73 m2 estimated from serum creatinine, by applying various commonly used equations including the modification of diet in renal disease (MDRD) equation, Cockcroft–Gault (CG) equation, and the Mayo equation. We compared the performance of these equations against the CKD definition of cystatin C >1.23 mg/L. Results: We found that the prevalence of CKD varied widely among different GFR equations. Although the prevalence of CKD was 17.2% with the MDRD equation and 16.5% with the CG equation, it was only 4.8% with the Mayo equation. Only 24% of those identified to have GFR in the range of 50–59 mL/min per 1.73 m2 by the MDRD equation had cystatin C levels >1.23 mg/L; their mean cystatin C level was only 1 mg/L (interquartile range, 0.9–1.2 mg/L). This finding was similar for the CG equation. For the Mayo equation, 62.8% of those patients with GFR in the range of 50–59 mL/min per 1.73 m2 had cystatin C levels >1.23 mg/L; their mean cystatin C level was 1.3 mg/L (interquartile range, 1.2–1.5 mg/L). The MDRD and CG equations showed a false-positive rate of >10%. Discussion: We found that the MDRD and CG equations, the current standard to estimate GFR, appeared to overestimate the prevalence of CKD in a general population sample. PMID:20730018
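
    The creatinine-based equations compared above have closed-form expressions; the sketch below gives the abbreviated four-variable MDRD and Cockcroft-Gault formulas in their conventional (non-IDMS-calibrated) form with an illustrative patient, and is not a substitute for the study's exact implementation.

      def gfr_mdrd(scr_mg_dl, age, female, black=False):
          """Abbreviated 4-variable MDRD study equation (mL/min/1.73 m^2); the 186 coefficient
          assumes non-IDMS-calibrated serum creatinine."""
          gfr = 186.0 * scr_mg_dl ** -1.154 * age ** -0.203
          if female:
              gfr *= 0.742
          if black:
              gfr *= 1.212
          return gfr

      def crcl_cockcroft_gault(scr_mg_dl, age, weight_kg, female):
          """Cockcroft-Gault creatinine clearance (mL/min, not indexed to body surface area)."""
          crcl = (140.0 - age) * weight_kg / (72.0 * scr_mg_dl)
          return crcl * 0.85 if female else crcl

      # Illustrative patient: 65-year-old woman, 70 kg, serum creatinine 1.1 mg/dL
      print(round(gfr_mdrd(1.1, 65, female=True), 1),
            round(crcl_cockcroft_gault(1.1, 65, 70, female=True), 1))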

  4. Biomass Resource Allocation among Competing End Uses

    SciTech Connect

    Newes, E.; Bush, B.; Inman, D.; Lin, Y.; Mai, T.; Martinez, A.; Mulcahy, D.; Short, W.; Simpkins, T.; Uriarte, C.; Peck, C.

    2012-05-01

    The Biomass Scenario Model (BSM) is a system dynamics model developed by the U.S. Department of Energy as a tool to better understand the interaction of complex policies and their potential effects on the biofuels industry in the United States. However, it does not currently have the capability to account for allocation of biomass resources among the various end uses, which limits its utilization in analysis of policies that target biomass uses outside the biofuels industry. This report provides a more holistic understanding of the dynamics surrounding the allocation of biomass among uses that include traditional use, wood pellet exports, bio-based products and bioproducts, biopower, and biofuels by (1) highlighting the methods used in existing models' treatments of competition for biomass resources; (2) identifying coverage and gaps in industry data regarding the competing end uses; and (3) exploring options for developing models of biomass allocation that could be integrated with the BSM to actively exchange and incorporate relevant information.

  5. Online rate control in digital cameras for near-constant distortion based on minimum/maximum criterion

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Yong; Ortega, Antonio

    2000-04-01

    We address the problem of online rate control in digital cameras, where the goal is to achieve near-constant distortion for each image. Digital cameras usually have a pre-determined number of images that can be stored for the given memory size and require limited time delay and constant quality for each image. Due to time delay restrictions, each image should be stored before the next image is received. Therefore, we need to define an online rate control that is based on the amount of memory used by previously stored images, the current image, and the estimated rate of future images. In this paper, we propose an algorithm for online rate control, in which an adaptive reference, a 'buffer-like' constraint, and a minimax criterion (as a distortion metric to achieve near-constant quality) are used. The adaptive reference is used to estimate future images, and the 'buffer-like' constraint is required to keep enough memory for future images. We show that using our algorithm to select the online bit allocation for each image in a randomly given set of images provides near-constant quality. Also, we show that our result is near optimal when a minimax criterion is used, i.e., it achieves a performance close to that obtained by applying an off-line rate control that assumes exact knowledge of the images. Suboptimal behavior is only observed in situations where the distribution of images is not truly random (e.g., if most of the 'complex' images are captured at the end of the sequence). Finally, we propose a T-step delay rate control algorithm and, using the result of the 1-step delay rate control algorithm, show that it removes the suboptimal behavior.
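
    The interplay of the adaptive reference and the 'buffer-like' constraint can be sketched as a per-image bit budget that reserves memory for the images still expected, using the recent average size as the estimate of future images. This toy function is not the paper's algorithm; the headroom factor and bounds are illustrative.

      def next_image_budget(memory_left_bits, images_left, recent_sizes, headroom=0.9):
          """Toy online budget: subtract memory reserved for the remaining images (estimated from
          the recent average size, the 'adaptive reference') and clip the result to sane bounds."""
          if images_left <= 1:
              return memory_left_bits * headroom
          if recent_sizes:
              avg_recent = sum(recent_sizes) / len(recent_sizes)
          else:
              avg_recent = memory_left_bits / images_left
          budget = memory_left_bits - avg_recent * (images_left - 1)
          fair_share = memory_left_bits / images_left
          # never exceed a headroom-scaled share of memory, never fall below half the fair share
          return max(0.5 * fair_share, min(budget, headroom * memory_left_bits))

      print(next_image_budget(memory_left_bits=8_000_000, images_left=20,
                              recent_sizes=[350_000, 420_000]))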

  6. An Approach for Transmission Loss and Cost Allocation by Loss Allocation Index and Co-operative Game Theory

    NASA Astrophysics Data System (ADS)

    Khan, Baseem; Agnihotri, Ganga; Mishra, Anuprita S.

    2016-03-01

    In the present work, the authors propose a novel method for transmission loss and cost allocation to users (generators and loads). In the developed methodology, transmission losses are allocated to users based on their usage of the transmission line. After usage allocation, particular loss allocation indices (PLAI) are evaluated for loads and generators. A cooperative game theory approach is also applied for comparison of the results. The proposed method is simple and easy to implement on a practical power system. A sample 6-bus system and the IEEE 14-bus system are used to show the effectiveness of the proposed method.

  7. Motion-related resource allocation in dynamic wireless visual sensor network environments.

    PubMed

    Katsenou, Angeliki V; Kondi, Lisimachos P; Parsopoulos, Konstantinos E

    2014-01-01

    This paper investigates quality-driven cross-layer optimization for resource allocation in direct sequence code division multiple access wireless visual sensor networks. We consider a single-hop network topology, where each sensor transmits directly to a centralized control unit (CCU) that manages the available network resources. Our aim is to enable the CCU to jointly allocate the transmission power and source-channel coding rates for each node, under four different quality-driven criteria that take into consideration the varying motion characteristics of each recorded video. For this purpose, we studied two approaches with a different tradeoff of quality and complexity. The first one allocates the resources individually for each sensor, whereas the second clusters them according to the recorded level of motion. In order to address the dynamic nature of the recorded scenery and re-allocate the resources whenever it is dictated by the changes in the amount of motion in the scenery, we propose a mechanism based on the particle swarm optimization algorithm, combined with two restarting schemes that either exploit the previously determined resource allocation or conduct a rough estimation of it. Experimental simulations demonstrate the efficiency of the proposed approaches.

  8. The Study of CIQ Inspection Rate's Setting Problem Based on Gray-fuzzy Comprehensive Evaluation Theory

    NASA Astrophysics Data System (ADS)

    Hui, Liu; Ding, Liu Wen

    Inspection is not only one of the most significant duties but also an important prerequisite for clearing Customs. Setting the inspection rate scientifically can not only resolve the contradiction between "checking" and "service" for CIQ (China Entry-Exit Inspection and Quarantine) site inspection personnel, but also highlight key inspection targets and improve the detection ratio, achieving the optimal allocation of CIQ's limited human and material resources. In this article, starting from the characteristics of inspection risk evaluation and considering the inspection rate setting problem from a data mining perspective, an index system is constructed rationally and objectively. On the basis of fuzzy theory, the risk of inward and outward goods is evaluated with the grey-fuzzy comprehensive evaluation method, and the inspection rate is established scientifically.

  9. Allocation of Resources. SPEC Kit 31.

    ERIC Educational Resources Information Center

    Association of Research Libraries, Washington, DC. Office of Management Studies.

    This kit on resource allocation in academic and research libraries contains nine primary source documents and a concise summary of a 1976 Association of Research Libraries (ARL) survey on management of fiscal spending activities in ARL libraries. Based on responses from 70 libraries, the summary discusses 3 specific subjects within the general…

  10. Generalized multidimensional dynamic allocation method.

    PubMed

    Lebowitsch, Jonathan; Ge, Yan; Young, Benjamin; Hu, Feifang

    2012-12-10

    Dynamic allocation has received considerable attention since it was first proposed in the 1970s as an alternative means of allocating treatments in clinical trials that helps to secure the balance of prognostic factors across treatment groups. The purpose of this paper is to present a generalized multidimensional dynamic allocation method that simultaneously balances treatment assignments at three key levels: within the overall study, within each level of each prognostic factor, and within each stratum, that is, each combination of levels of different factors. Further, it offers capabilities for unbalanced and adaptive designs. The treatment balancing performance of the proposed method is investigated through simulations that compare multidimensional dynamic allocation with traditional stratified block randomization and the Pocock-Simon method. On the basis of these results, we conclude that this generalized multidimensional dynamic allocation method is an improvement over conventional dynamic allocation methods and is flexible enough to be applied in most trial settings, including Phase I, II and III trials.
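
    The Pocock-Simon method used as a comparator assigns each new patient to the arm that minimizes a marginal imbalance measure, with a biased-coin probability. The sketch below is a minimal two-arm version using the range of counts within each of the patient's factor levels; the generalized method in the paper additionally balances at the study and stratum levels and supports unbalanced ratios, which this sketch omits.

      import random
      from collections import defaultdict

      def pocock_simon_assign(counts, factors, p_best=0.8, treatments=("A", "B")):
          """Pick the arm that minimizes total marginal imbalance across the new patient's
          prognostic factor levels, choosing that arm with probability p_best."""
          imbalance = {}
          for t in treatments:
              total = 0
              for f, level in factors.items():
                  trial = dict(counts[(f, level)])
                  trial[t] = trial.get(t, 0) + 1                    # hypothetically add the patient
                  total += max(trial.values()) - min(trial.get(x, 0) for x in treatments)
              imbalance[t] = total
          best = min(imbalance, key=imbalance.get)
          others = [t for t in treatments if t != best]
          choice = best if random.random() < p_best else random.choice(others)
          for f, level in factors.items():                          # record the actual assignment
              counts[(f, level)][choice] = counts[(f, level)].get(choice, 0) + 1
          return choice

      counts = defaultdict(dict)   # (factor, level) -> {treatment: n}
      print(pocock_simon_assign(counts, {"sex": "F", "age_group": "60+"}))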

  11. Metabolically-Derived Human Ventilation Rates: A Revised Approach Based Upon Oxygen Consumption Rates (External Review Draft)

    EPA Science Inventory

    EPA has released a draft report entitled, Metabolically-Derived Human Ventilation Rates: A Revised Approach Based Upon Oxygen Consumption Rates, for independent external peer review and public comment. NCEA published the Exposure Factors Handbook in 1997. This comprehens...

  12. 28 CFR 100.14 - Directly allocable costs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... the benefits accruing to the multiple cost objectives. (6) The base period for allocating allocable... incurred for the same purpose in like circumstances have been included as a direct cost of that, or any... of the benefits accruing to the multiple cost objectives. (ii) Similarly, the particular case...

  13. Resource Allocation Strategies Employed in Large versus Small School Systems.

    ERIC Educational Resources Information Center

    Gutierrez, Eugene J.

    Education is faced with a declining resource base coupled with overwhelming demands for categorical programs. The current resource allocation strategy common to all systems is cutting spending. The difference between large and small districts is less important than differences in complexity. Complexity in resource allocations is more a function of…

  14. Sexually violent predators: toward reasonable estimates of recidivism base rates.

    PubMed

    Neller, Daniel J; Petris, Giovanni

    2013-01-01

    The sexual recidivism rate of sex offenders is a controversial issue. Perhaps as controversial is the sexual recidivism rate of the select group of sex offenders who are examined pursuant to sexually violent predator (SVP) statutes. At present, reliable estimates of SVP recidivism are unavailable. We propose that reasonable estimates of SVP recidivism can be reached by considering three available pieces of data: (i) a likely recidivism rate of the general population of sex offenders; (ii) procedures typically followed by jurisdictions that civilly commit sex offenders; and (iii) classification accuracy of procedures. Although sexual recidivism rates vary across jurisdictions, the results of our analyses suggest sex offenders referred for examination pursuant to SVP statutes recidivate at substantially higher rates than typical sex offenders. Our results further suggest that sex offenders recommended for commitment as SVPs recidivate at even greater rates than SVP respondents who are not recommended for commitment. We discuss practice and policy implications of these findings.
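
    The article's reasoning, combining an assumed base rate with the assumed classification accuracy of the commitment procedure, amounts to a Bayes calculation of the recidivism rate among offenders recommended (and not recommended) for commitment. The numbers below are purely hypothetical placeholders, not the article's estimates.

      def recidivism_given_recommendation(base_rate, sensitivity, specificity):
          """Bayes sketch: recidivism rate among those recommended for commitment (PPV) and
          among those not recommended, given an assumed base rate and classification accuracy."""
          p_recommended = base_rate * sensitivity + (1 - base_rate) * (1 - specificity)
          ppv = base_rate * sensitivity / p_recommended
          p_not_recommended = base_rate * (1 - sensitivity) + (1 - base_rate) * specificity
          rate_if_not_recommended = base_rate * (1 - sensitivity) / p_not_recommended
          return ppv, rate_if_not_recommended

      # Hypothetical inputs: 15% base rate, 75% sensitivity, 70% specificity
      print(recidivism_given_recommendation(base_rate=0.15, sensitivity=0.75, specificity=0.70))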

  15. The choice between allocation principles: amplifying when equality dominates.

    PubMed

    Eek, Daniel; Selart, Marcus

    2009-04-01

    One hundred and ninety participants (95 undergraduates and 95 employees) responded to a factorial survey in which a number of case-based organizational allocation tasks were described. Participants were asked to imagine themselves as employees in fictitious organizations and chose among three allocations of employee-development schemes invested by the manager in different work groups. The allocations regarded how such investments should be allocated between two parties. Participants chose twice, once picking the fairest and once the best allocation. One between-subjects factor varied whether the parties represented social (i.e., choosing among allocations between two different work groups) or temporal comparisons (i.e., choosing among allocations between the present and the following year). Another between-subjects factor varied whether participants' in-group was represented by the parties or not. One allocation maximized the outcome to one party, another maximized the joint outcome received by both parties, and a third provided both parties with equal but lower outcomes. It was predicted that equality, although always deficient to both parties, would be the preferred allocation when parties represented social comparisons and when choices were based on fairness. When parties represented temporal comparisons, and when choices were based on preference, maximizing the joint outcome was hypothesized to be the preferred allocation. Results supported these hypotheses. Against what was predicted, whether the in-group was represented by the parties or not did not moderate the results, indicating that participants' allocation preferences were not affected by self-interest. The main message is that people make sensible distinctions between what they prefer and what they regard as fair. The results were the same for participating students who imagined themselves as being employees and participants who were true employees, suggesting that no serious threats to external

  17. A Framework for Optimal Control Allocation with Structural Load Constraints

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Taylor, Brian R.; Jutte, Christine V.; Burken, John J.; Trinh, Khanh V.; Bodson, Marc

    2010-01-01

    Conventional aircraft generally employ mixing algorithms or lookup tables to determine control surface deflections needed to achieve moments commanded by the flight control system. Control allocation is the problem of converting desired moments into control effector commands. Next generation aircraft may have many multipurpose, redundant control surfaces, adding considerable complexity to the control allocation problem. These issues can be addressed with optimal control allocation. Most optimal control allocation algorithms have control surface position and rate constraints. However, these constraints are insufficient to ensure that the aircraft's structural load limits will not be exceeded by commanded surface deflections. In this paper, a framework is proposed to enable a flight control system with optimal control allocation to incorporate real-time structural load feedback and structural load constraints. A proof of concept simulation that demonstrates the framework in a simulation of a generic transport aircraft is presented.
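
    One way to see how structural load limits can enter an optimal control allocation is to treat them, under a simplifying per-surface linear load model, as additional bounds alongside position and rate limits, and solve the resulting bounded least-squares problem. The sketch below assumes that simplification and an illustrative effectiveness matrix; it is not the framework proposed in the paper.

      import numpy as np
      from scipy.optimize import lsq_linear

      def allocate(B, m_des, u_prev, u_min, u_max, rate_lim, dt, load_gain, load_lim):
          """Find surface deflections u with B u ~= m_des subject to position, rate, and
          (assumed per-surface linear, load_i = load_gain_i * u_i) structural load limits."""
          lo = np.maximum.reduce([u_min, u_prev - rate_lim * dt, -load_lim / load_gain])
          hi = np.minimum.reduce([u_max, u_prev + rate_lim * dt, load_lim / load_gain])
          return lsq_linear(B, m_des, bounds=(lo, hi)).x

      B = np.array([[1.0, 0.8, -0.8, 1.0],   # roll effectiveness of 4 surfaces (illustrative)
                    [0.1, 0.9,  0.9, 0.1],   # pitch
                    [0.0, 0.3, -0.3, 0.0]])  # yaw
      u = allocate(B, m_des=np.array([0.5, 0.2, 0.0]),
                   u_prev=np.zeros(4), u_min=-np.ones(4) * 0.35, u_max=np.ones(4) * 0.35,
                   rate_lim=np.ones(4) * 1.2, dt=0.02, load_gain=np.ones(4) * 2.0,
                   load_lim=np.ones(4) * 0.5)
      print(np.round(u, 3), np.round(B @ u, 3))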

  18. Research on allocation efficiency of the daisy chain allocation algorithm

    NASA Astrophysics Data System (ADS)

    Shi, Jingping; Zhang, Weiguo

    2013-03-01

    With the improvement of aircraft performance in reliability, maneuverability and survivability, the number of control effectors increases considerably. How to distribute the three-axis moments among the control surfaces reasonably becomes an important problem. The daisy chain method is simple and easy to implement in the design of the allocation system, but it cannot solve the allocation problem for the entire attainable moment subset. For the lateral-directional allocation problem, the allocation efficiency of the daisy chain can be directly measured by the area of its subset of attainable moments. Because of the non-linear allocation characteristic, the subset of attainable moments of the daisy-chain method is a complex non-convex polygon, and it is difficult to compute directly. By analyzing the two-dimensional allocation problem with a "micro-element" idea, a numerical calculation algorithm is proposed to compute the area of the non-convex polygon. In order to improve the allocation efficiency of the algorithm, a genetic algorithm with the allocation efficiency chosen as the fitness function is proposed to find the best pseudo-inverse matrix.

  19. Estimating mental fatigue based on electroencephalogram and heart rate variability

    NASA Astrophysics Data System (ADS)

    Zhang, Chong; Yu, Xiaolin

    2010-01-01

    The effects of a long-term mental arithmetic task on psychology are investigated by subjective self-report measures and an action performance test. Based on the electroencephalogram (EEG) and heart rate variability (HRV), the impacts of prolonged cognitive activity on the central nervous system and autonomic nervous system are observed and analyzed. Wavelet packet parameters of the EEG and power spectral indices of HRV are combined to estimate the change in mental fatigue. Wavelet packet parameters of the EEG that change significantly are then extracted as features of brain activity in different mental fatigue states, and a support vector machine (SVM) algorithm is applied to differentiate two mental fatigue states. The experimental results show that the long-term mental arithmetic task induces mental fatigue. The wavelet packet parameters of the EEG and the power spectral indices of HRV are strongly correlated with mental fatigue. The predominant activity of the subjects' autonomic nervous system shifts from parasympathetic to sympathetic activity after the task. Moreover, the slow waves of the EEG increase, while the fast waves of the EEG and the degree of disorder of the brain decrease compared with the pre-task state. The SVM algorithm can effectively differentiate the two mental fatigue states, achieving a maximum classification accuracy of 91%, and could be a promising tool for the evaluation of mental fatigue. Fatigue, especially mental fatigue, is a common phenomenon in modern life and a persistent occupational hazard for professionals. Mental fatigue is usually accompanied by a sense of weariness, reduced alertness, and reduced mental performance, which can lead to accidents, decrease workplace productivity, and harm health. Therefore, the evaluation of mental fatigue is important for occupational risk protection, productivity, and occupational health.

  20. Photosynthetic rates derived from satellite-based chlorophyll concentration

    SciTech Connect

    Behrenfeld, M.J.; Falkowski, P.G.

    1997-01-01

    We assembled a dataset of {sup 14}C-based productivity measurements to understand the critical variables required for accurate assessment of daily depth-integrated phytoplankton carbon fixation (PP{sub eu}) from measurements of sea surface pigment concentrations (C{sub sat}). From this dataset, we developed a light-dependent, depth-resolved model for carbon fixation (VGPM) that partitions environmental factors affecting primary production into those that influence the relative vertical distribution of primary production (P{sub z}) and those that control the optimal assimilation efficiency of the productivity profile (P{sub opt}{sup B}). The VGPM accounted for 79% of the observed variability in P{sub z} and 86% of the variability in PP{sub eu} by using measured values of P{sub opt}{sup B}. Our results indicate that the accuracy of productivity algorithms in estimating PP{sub eu} depends primarily upon the ability to accurately represent variability in P{sub opt}{sup B}. We developed a temperature-dependent P{sub opt}{sup B} model that was used in conjunction with monthly climatological images of C{sub sat}, sea surface temperature, and cloud-corrected estimates of surface irradiance to calculate a global annual phytoplankton carbon fixation (PP{sub annu}) rate of 43.5 Pg C yr{sup {minus}1}. The geographical distribution of PP{sub annu} was distinctly different from the results of previous models. Our results illustrate the importance of focusing P{sub opt}{sup B} model development on temporal and spatial, rather than vertical, variability. 87 refs., 9 figs., 2 tabs.
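
    As commonly cited, the VGPM estimates depth-integrated production from surface chlorophyll, euphotic depth, surface irradiance, day length, and the optimal assimilation efficiency. The sketch below uses that commonly quoted form with illustrative input values and takes P{sub opt}{sup B} as a given input rather than reproducing the paper's temperature-dependent polynomial.

      def vgpm_pp_eu(pb_opt, e0, z_eu, c_sat, day_length_h):
          """Depth-integrated production (mg C m^-2 d^-1) in the commonly cited VGPM form:
          PP_eu = 0.66125 * Pb_opt * E0/(E0 + 4.1) * Z_eu * C_sat * D_irr.
          pb_opt: optimal assimilation efficiency (mg C (mg Chl)^-1 h^-1); e0: surface PAR;
          z_eu: euphotic depth (m); c_sat: surface chlorophyll (mg m^-3); day_length_h: photoperiod (h)."""
          return 0.66125 * pb_opt * (e0 / (e0 + 4.1)) * z_eu * c_sat * day_length_h

      # Illustrative mid-latitude summer values
      print(round(vgpm_pp_eu(pb_opt=4.5, e0=45.0, z_eu=60.0, c_sat=0.3, day_length_h=14.0), 1))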

  1. Adaptive resource allocation architecture applied to line tracking

    NASA Astrophysics Data System (ADS)

    Owen, Mark W.; Pace, Donald W.

    2000-04-01

    Recent research has demonstrated the benefits of a multiple hypothesis, multiple model sonar line tracking solution, achieved at significant computational cost. We have developed an adaptive architecture that trades computational resources for algorithm complexity based on environmental conditions. A Fuzzy Logic Rule-Based approach is applied to adaptively assign algorithmic resources to meet system requirements. The resources allocated by the Fuzzy Logic algorithm include (1) the number of hypotheses permitted (yielding multi-hypothesis and single-hypothesis modes), (2) the number of signal models to use (yielding an interacting multiple model capability), (3) a new track likelihood for hypothesis generation, (4) track attribute evaluator activation (for signal to noise ratio, frequency bandwidth, and others), and (5) adaptive cluster threshold control. Algorithm allocation is driven by a comparison of current throughput rates to a desired real time rate. The Fuzzy Logic Controlled (FLC) line tracker, a single hypothesis line tracker, and a multiple hypothesis line tracker are compared on real sonar data. System resource usage results demonstrate the utility of the FLC line tracker.

  2. A New Model for Equitable and Efficient Resource Allocation to Schools: The Israeli Case

    ERIC Educational Resources Information Center

    BenDavid-Hadar, Iris; Ziderman, Adrian

    2011-01-01

    This paper sets out a new budget allocation formula for schools, designed to achieve a more equitable distribution of educational achievement. In addition to needs-based elements, the suggested composite allocation formula includes an improvement component, whereby schools receive budgetary allocations based on a new incentive measure developed in…

  3. Scenario-Based Multi-Objective Optimum Allocation Model for Earthquake Emergency Shelters Using a Modified Particle Swarm Optimization Algorithm: A Case Study in Chaoyang District, Beijing, China

    PubMed Central

    Zhao, Xiujuan; Xu, Wei; Ma, Yunjia; Hu, Fuyu

    2015-01-01

    The correct location of earthquake emergency shelters and their allocation to residents can effectively reduce the number of casualties by providing safe havens and efficient evacuation routes during the chaotic period of the unfolding disaster. However, diverse and strict constraints and the discrete feasible domain of the required models make the problem of shelter location and allocation difficult. A number of models have been developed to solve this problem, but there are still large differences between the models and the actual situation because the characteristics of the evacuees and the construction costs of the shelters have been excessively simplified. We report here the development of a multi-objective model for the allocation of residents to earthquake shelters that considers these factors, using the Chaoyang district, Beijing, China as a case study. The two objectives of this model were to minimize the total weighted evacuation time from residential areas to a specified shelter and to minimize the total area of all the shelters. The two constraints were the shelter capacity and the service radius. Three scenarios were considered to estimate the number of people who would need to be evacuated. The particle swarm optimization algorithm was first modified by applying the von Neumann structure in earlier loops and the global structure in later loops, and then used to solve this problem. The results show that increasing the shelter area can result in a large decrease in the total weighted evacuation time, from scheme 1 to scheme 9 in scenario A, from scheme 1 to scheme 9 in scenario B, and from scheme 1 to scheme 19 in scenario C. If funding were not a limitation, the final scheme of each scenario would be the best solution; otherwise, the earlier schemes are more reasonable. The modified model proved to be useful for the optimization of shelter allocation, and the result can be used as a scientific reference for planning shelters in the Chaoyang district

  4. The Principal as Resource Allocator.

    ERIC Educational Resources Information Center

    Peterson, Kent D.

    The effect of political influences on the allocation of personnel, money, facilities, and equipment by elementary school principals is discussed in this paper. The use of Zald's political economy framework as a tool for understanding the principal's role in allocating resources is described by the author. He suggests that the principal occupies a…

  5. Digital heart rate measurement based on Atmega16L

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoqing; Feng, Lishuang; Wang, Jiqiang

    2008-02-01

    The photoelectric heart rate meter reported in this paper picks up heart rate signals with a photoelectric cell, converts them into a standard TTL pulse signal, and sends them to the input capture interface of an Atmega16L single-chip microcontroller. The input capture register can capture the Timer/Counter value at a given external (edge-triggered) event on the input capture pin (ICP1) of T/C1. The counter value is transferred into T/C1's input capture register ICR1 when the voltage on the input capture pin ICP1 transitions, according to the program settings. The microcontroller captures the input pulse signal as Timer/Counter (T/C1) values, works out a single heart rate cycle, and displays the result on three seven-segment displays that serve as its peripherals. The ICCAVR integrated compiler is used to assemble and compile the heart rate meter's software. After the programs compile successfully, a HEX file is produced and downloaded into the microcontroller with the SLISP software. This photoelectric heart rate meter can measure heart rate efficiently, with a measurement range of 10-200 beats per minute, a precision of +/- 1%, low cost, and reliable performance.
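
    The conversion from captured Timer/Counter ticks to beats per minute is simple arithmetic once the timer clock is known. The sketch below illustrates it in Python for clarity; the 1 MHz clock and prescaler of 8 are illustrative assumptions, not the meter's documented configuration.

      def bpm_from_capture(ticks_between_beats, f_cpu_hz=1_000_000, prescaler=8):
          """Convert the Timer/Counter ticks captured between two pulse edges into beats per minute.
          The clock frequency and prescaler are illustrative, not the meter's actual configuration."""
          timer_hz = f_cpu_hz / prescaler
          period_s = ticks_between_beats / timer_hz
          return 60.0 / period_s

      # e.g. 100 000 ticks at a 125 kHz timer clock -> 0.8 s period -> 75 bpm
      print(round(bpm_from_capture(100_000), 1))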

  6. 24 CFR 92.50 - Formula allocation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Formula allocation. 92.50 Section... Development HOME INVESTMENT PARTNERSHIPS PROGRAM Allocation Formula § 92.50 Formula allocation. (a) Jurisdictions eligible for a formula allocation. HUD will provide allocations of funds in amounts determined...

  7. On Estimation of GPS-based Indonesian Strain Rate Map

    NASA Astrophysics Data System (ADS)

    Susilo, Susilo; Abidin, Hasanuddin Z.; Meilano, Irwan; Sapiie, Benyamin; Wijanarto, Antonius B.

    2016-04-01

    Using GPS-derived rates at survey-mode (sGPS) stations and continuous GPS stations across the Indonesian region, covering the 22-year period from 1993 to 2014, linear deformation velocities are derived with an accuracy of about 2 to 3 mm/year. These velocities are corrected for the coseismic and postseismic deformation caused by significant earthquakes in that period. In this study, we use this GPS velocity field to construct a crustal strain rate map, without yet including a physical model. An interpolation method was used to compute the velocity model. By differentiating the continuous velocity model, we derive the strain rate map of Indonesia. At present, our result is only the magnitude of the strain rate. The Indonesian strain rate map is very important for studying the deformation characteristics of the region and for establishing a deformation (velocity) model to support the implementation of the Indonesian Geospatial Reference System 2013 (IGRS 2013). This is a new semi-dynamic geocentric datum of Indonesia, which uses the global ITRF2008 reference frame with a reference epoch of 1 January 2012. A deformation (velocity) model is required to transform coordinates from an observation epoch to or from this reference epoch.
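
    Once the velocities are interpolated onto a regular grid, the strain rate magnitude follows from spatial derivatives of the velocity field. The sketch below computes a common scalar measure (the second invariant of the horizontal strain-rate tensor) with numerical gradients on a synthetic grid; the grid, spacing, and velocity values are illustrative, and the exact magnitude definition used in the study may differ.

      import numpy as np

      def strain_rate_magnitude(ve, vn, dx_m, dy_m):
          """Second invariant of the horizontal strain-rate tensor from gridded east (ve) and
          north (vn) velocity fields (m/yr) on a regular grid with spacing dx_m, dy_m."""
          dvedx = np.gradient(ve, dx_m, axis=1)
          dvedy = np.gradient(ve, dy_m, axis=0)
          dvndx = np.gradient(vn, dx_m, axis=1)
          dvndy = np.gradient(vn, dy_m, axis=0)
          exx, eyy = dvedx, dvndy
          exy = 0.5 * (dvedy + dvndx)
          return np.sqrt(exx**2 + eyy**2 + 2.0 * exy**2)   # per year

      # Toy velocity fields (~2 cm/yr east, ~1 cm/yr south) on a 10 km grid
      ve = np.random.default_rng(1).normal(0.02, 0.005, (50, 50))
      vn = np.random.default_rng(2).normal(-0.01, 0.005, (50, 50))
      print(strain_rate_magnitude(ve, vn, dx_m=10_000.0, dy_m=10_000.0).max())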

  8. Cognitive cost as dynamic allocation of energetic resources.

    PubMed

    Christie, S Thomas; Schrater, Paul

    2015-01-01

    While it is widely recognized that thinking is somehow costly, involving cognitive effort and producing mental fatigue, these costs have alternatively been assumed to exist, treated as the brain's assessment of lost opportunities, or suggested to be metabolic but with implausible biological bases. We present a model of cognitive cost based on the novel idea that the brain senses and plans for longer-term allocation of metabolic resources by purposively conserving brain activity. We identify several distinct ways the brain might control its metabolic output, and show how a control-theoretic model that models decision-making with an energy budget can explain cognitive effort avoidance in terms of an optimal allocation of limited energetic resources. The model accounts for both subject responsiveness to reward and the detrimental effects of hypoglycemia on cognitive function. A critical component of the model is using astrocytic glycogen as a plausible basis for limited energetic reserves. Glycogen acts as an energy buffer that can temporarily support high neural activity beyond the rate supported by blood glucose supply. The published dynamics of glycogen depletion and repletion are consonant with a broad array of phenomena associated with cognitive cost. Our model thus subsumes both the "cost/benefit" and "limited resource" models of cognitive cost while retaining valuable contributions of each. We discuss how the rational control of metabolic resources could underpin the control of attention, working memory, cognitive look ahead, and model-free vs. model-based policy learning. PMID:26379482

  11. Radiocarbon Based Ages and Growth Rates: Hawaiian Deep Sea Corals

    SciTech Connect

    Roark, E B; Guilderson, T P; Dunbar, R B; Ingram, B L

    2006-01-13

    The radial growth rates and ages of three different groups of Hawaiian deep-sea 'corals' were determined using radiocarbon measurements. Specimens of Corallium secundum, Gerardia sp., and Leiopathes glaberrima, were collected from 450 {+-} 40 m at the Makapuu deep-sea coral bed using a submersible (PISCES V). Specimens of Antipathes dichotoma were collected at 50 m off Lahaina, Maui. The primary source of carbon to the calcitic C. secundum skeleton is in situ dissolved inorganic carbon (DIC). Using bomb {sup 14}C time markers we calculate radial growth rates of {approx} 170 {micro}m y{sup -1} and ages of 68-75 years on specimens as tall as 28 cm of C. secundum. Gerardia sp., A. dichotoma, and L. glaberrima have proteinaceous skeletons and labile particulate organic carbon (POC) is their primary source of architectural carbon. Using {sup 14}C we calculate a radial growth rate of 15 {micro}m y{sup -1} and an age of 807 {+-} 30 years for a live collected Gerardia sp., showing that these organisms are extremely long lived. Inner and outer {sup 14}C measurements on four sub-fossil Gerardia spp. samples produce similar growth rate estimates (range 14-45 {micro}m y{sup -1}) and ages (range 450-2742 years) as observed for the live collected sample. Similarly, with a growth rate of < 10 {micro}m y{sup -1} and an age of {approx}2377 years, L. glaberrima at the Makapuu coral bed, is also extremely long lived. In contrast, the shallow-collected A. dichotoma samples yield growth rates ranging from 130 to 1,140 {micro}m y{sup -1}. These results show that Hawaiian deep-sea corals grow more slowly and are older than previously thought.

  12. High removal rate laser-based coating removal system

    DOEpatents

    Matthews, Dennis L.; Celliers, Peter M.; Hackel, Lloyd; Da Silva, Luiz B.; Dane, C. Brent; Mrowka, Stanley

    1999-11-16

    A compact laser system that removes surface coatings (such as paint, dirt, etc.) at a removal rate as high as 1000 ft.sup.2 /hr or more without damaging the surface. A high repetition rate laser with multiple amplification passes propagating through at least one optical amplifier is used, along with a delivery system consisting of a telescoping and articulating tube which also contains an evacuation system for simultaneously sweeping up the debris produced in the process. The amplified beam can be converted to an output beam by passively switching the polarization of at least one amplified beam. The system also has a personal safety system which protects against accidental exposures.

  13. Philosophy of the Spike: Rate-Based vs. Spike-Based Theories of the Brain

    PubMed Central

    Brette, Romain

    2015-01-01

    Does the brain use a firing rate code or a spike timing code? Considering this controversial question from an epistemological perspective, I argue that progress has been hampered by its problematic phrasing. It takes the perspective of an external observer looking at whether those two observables vary with stimuli, and thereby misses the relevant question: which one has a causal role in neural activity? When rephrased in a more meaningful way, the rate-based view appears as an ad hoc methodological postulate, one that is practical but with virtually no empirical or theoretical support. PMID:26617496

  14. Relaxed Time Slot Negotiation for Grid Resource Allocation

    NASA Astrophysics Data System (ADS)

    Son, Seokho; Sim, Kwang Mong

    Since participants in a computational grid may be independent bodies, some mechanisms are necessary for resolving the differences in their preferences for price and desirable time slots for utilizing/leasing computing resources. Whereas there are mechanisms for supporting price negotiation for grid resource allocation, there is little or no negotiation support for allocating mutually acceptable time slots for grid participants. The contribution of this work is the design of a negotiation mechanism for facilitating time slot negotiations between grid participants. In particular, this work adopts a relaxed time slot negotiation protocol designed to enhance the success rate and resource utilization level by allowing some flexibility for making slight adjustments following a tentative agreement on a mutually acceptable time slot. The ideas of the relaxed time slot negotiation are implemented in an agent-based grid testbed, and empirical results show that (i) a consumer and a provider agent reach a mutually satisfying agreement on time slot and price, (ii) consumer agents achieve higher success rates in negotiation, and (iii) provider agents achieve higher utility and resource utilization of the overall grid.

  15. Gender Systematics in Telescope Time Allocation at ESO

    NASA Astrophysics Data System (ADS)

    Patat, F.

    2016-09-01

    The results of a comprehensive statistical analysis of gender systematics in the time allocation process at ESO are presented. The sample on which the study is based includes more than 13 000 Normal and Short proposals, submitted by about 3000 principal investigators (PI) over eight years. The genders of PIs, and of the panel members of the Observing Programmes Committee (OPC), were used, together with their career level, to analyse the grade distributions and the proposal success rates. Proposals submitted by female PIs show a significantly lower probability of being allocated time. The proposal success rates (defined as number of top ranked runs over requested runs) are 16.0 ± 0.6% and 22.0 ± 0.4% for females and males, respectively. To a significant extent the disparity is related to different input distributions in terms of career level. The seniority of male PIs is significantly higher than that of female PIs, with only 34% of the female PIs being professionally employed astronomers (compared to 53% for male PIs). A small, but statistically significant, gender-dependent behaviour is measured for the OPC referees: both genders show the same systematics, but they are larger for males than females. The PI female/male fraction is very close to 30/70; although far from parity, the fraction is higher than that observed, for instance, among IAU membership.

  16. Resource Balancing Control Allocation

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Bodson, Marc

    2010-01-01

    Next generation aircraft with a large number of actuators will require advanced control allocation methods to compute the actuator commands needed to follow desired trajectories while respecting system constraints. Previously, algorithms were proposed to minimize the l1 or l2 norms of the tracking error and of the control effort. The paper discusses the alternative choice of using the l1 norm for minimization of the tracking error and a normalized l∞ norm, or sup norm, for minimization of the control effort. The algorithm computes the norm of the actuator deflections scaled by the actuator limits. Minimization of the control effort then translates into the minimization of the maximum actuator deflection as a percentage of its range of motion. The paper shows how the problem can be solved effectively by converting it into a linear program and solving it using a simplex algorithm. Properties of the algorithm are investigated through examples. In particular, the min-max criterion results in a type of resource balancing, where the resources are the control surfaces and the algorithm balances these resources to achieve the desired command. A study of the sensitivity of the algorithms to the data is presented, which shows that the normalized l∞ algorithm has the lowest sensitivity, although high sensitivities are observed whenever the limits of performance are reached.
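
    As a rough, self-contained illustration of the min-max idea described above, the sketch below poses the allocation as a linear program and solves it with SciPy. The effectiveness matrix, moment command, and actuator limits are invented values, not data from the paper, and exact moment tracking is assumed feasible.

```python
# Sketch: min-max (normalized l-infinity) control allocation as a linear program.
# B (control effectiveness), d (commanded moments), and u_max are hypothetical.
import numpy as np
from scipy.optimize import linprog

B = np.array([[1.0, 0.8, -0.5, 0.3],        # moments produced per unit deflection
              [0.2, -1.0, 0.6, 0.9]])        # (2 moments, 4 actuators)
d = np.array([0.4, -0.2])                    # commanded moments
u_max = np.array([0.5, 0.4, 0.6, 0.3])       # actuator deflection limits

n = B.shape[1]
# Decision vector x = [u_1..u_n, t]; minimize t = max_i |u_i| / u_max_i.
c = np.zeros(n + 1); c[-1] = 1.0
# |u_i| <= t * u_max_i  ->  u_i - t*u_max_i <= 0  and  -u_i - t*u_max_i <= 0
A_ub = np.vstack([np.hstack([np.eye(n), -u_max[:, None]]),
                  np.hstack([-np.eye(n), -u_max[:, None]])])
b_ub = np.zeros(2 * n)
A_eq = np.hstack([B, np.zeros((B.shape[0], 1))])   # exact moment tracking assumed
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=d,
              bounds=[(None, None)] * n + [(0, None)])
u = res.x[:n]
print("deflections:", np.round(u, 3), "max usage fraction:", round(res.x[-1], 3))
```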

  17. Phase plane based identification of fetal heart rate patterns

    PubMed Central

    Vairavan, Srinivasan; Sriram, Bhargavi; Wilson, James D.; Preissl, Hubert; Eswaran, Hari

    2012-01-01

    Using a phase plane analysis (PPA) of the spatial spread of trajectories of the fetal heart rate and its time-derivative we characterize the fetal heart rate patterns (fHRP) as defined by Nijhuis. For this purpose, we collect 22 fetal magnetocardiograms using a 151-SQUID system from 22 low-risk fetuses in gestational ages ranging from 30 to 37 weeks. Each study lasted for 30 minutes. After the attenuation of the maternal cardiac signals, we identify the R waves using an adaptive Hilbert transform approach and calculate the fetal heart rate. On these datasets, we apply the proposed approach and the traditionally used approaches such as standard deviation of the normal to normal intervals (SDNN) and root mean square of the successive difference (RMSSD). Heart rate patterns are scored by an expert using Nijhuis criteria and revealed A, B, and D patterns. A receiver operator characteristic (ROC) curve is used to assess the performance of the metric to differentiate the different patterns. Results showed that only PPA was able to differentiate all pairs of fHRP with high performance. PMID:22254593
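
    For readers unfamiliar with the conventional metrics the phase plane approach is compared against, the following minimal sketch computes SDNN and RMSSD from a hypothetical series of R-R intervals; the phase plane analysis itself is not reproduced here.

```python
# Minimal sketch of SDNN and RMSSD from made-up R-R intervals (milliseconds).
import numpy as np

rr_ms = np.array([430, 425, 440, 435, 428, 432, 445, 438])  # hypothetical R-R intervals

sdnn = np.std(rr_ms, ddof=1)                     # SD of normal-to-normal intervals
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))    # RMS of successive differences
heart_rate_bpm = 60000.0 / rr_ms.mean()          # mean heart rate

print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms, HR = {heart_rate_bpm:.0f} bpm")
```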

  18. A liquid cooled garment temperature controller based on sweat rate

    NASA Technical Reports Server (NTRS)

    Chambers, A. B.; Blackaby, J. R.

    1972-01-01

    An automatic controller for liquid cooled space suits is reported that utilizes human sweat rate as the primary input signal. The controller is so designed that the coolant inlet temperature is inversely proportional to the subject's latent heat loss as evidenced by evaporative water loss.

  19. An Empirical Approach to Determining Employee Deviance Base Rates.

    ERIC Educational Resources Information Center

    Slora, Karen B.

    Employee deviance may reflect either acts of employee theft or of production deviance. Employee theft refers to the unauthorized taking of cash, merchandise, or property. Production deviance refers to counterproductive activities which serve to slow the rate or quality of output, such as intentionally doing slow or sloppy work or using drugs on…

  20. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, Jamie; Toon, Troy; Adams, Timothy C.; Miranda, David J.

    2016-01-01

    This paper describes the methodology that was developed to allocate reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program's subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. Allocating is an iterative process; as systems moved beyond their conceptual and preliminary design phases this provided an opportunity for the reliability engineering team to reevaluate allocations based on updated designs and maintainability characteristics of the components. Trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper will discuss the value of modifying reliability and maintainability allocations made for the GSDO subsystems as the program nears the end of its design phase.

  1. High base pair opening rates in tracts of GC base pairs.

    PubMed

    Dornberger, U; Leijon, M; Fritzsche, H

    1999-03-12

    Sequence-dependent structural features of the DNA double helix have a strong influence on the base pair opening dynamics. Here we report a detailed study of the kinetics of base pair breathing in tracts of GC base pairs in DNA duplexes derived from 1H NMR measurements of the imino proton exchange rates upon titration with the exchange catalyst ammonia. In the limit of infinite exchange catalyst concentration, the exchange times of the guanine imino protons of the GC tracts extrapolate to much shorter base pair lifetimes than commonly observed for isolated GC base pairs. The base pair lifetimes in the GC tracts are below 5 ms for almost all of the base pairs. The unusually rapid base pair opening dynamics of GC tracts are in striking contrast to the behavior of AT tracts, where very long base pair lifetimes are observed. The implications of these findings for the structural principles governing spontaneous helix opening, as well as for the DNA-binding specificity of the cytosine-5-methyltransferases, where flipping of the cytosine base has been observed, are discussed.

  2. Review of the Uruguayan Kidney Allocation System: the solution to a complex problem, preliminary data.

    PubMed

    Bengochea, M; Alvarez, I; Toledo, R; Carretto, E; Forteza, D

    2010-01-01

    The National Kidney Transplant Program with cadaveric donors is based on a centralized, unique waitlist, a serum bank, and allocation criteria approved by the Instituto Nacional de Donación y Trasplante (INDT) in agreement with clinical teams. The median donor rate over the last 3 years is 20 per million population, and the median number of waitlist candidates is 450. The increased number of waiting list patients and the rapid aging of our populations demanded strategies for donor acceptance, candidate assignment, and analysis of more efficient and equitable allocation models. The objectives of the new national allocation system were to improve posttransplant patient and graft survivals, allow equal access to transplantation, and reduce waitlist times. The objective of this study was to analyze variables in our current allocation system and to create a mathematical/simulation model to evaluate a new allocation system. We compared candidates and transplanted patients for gender, age, ABO blood group, human leukocyte antigens (HLA), percentage of reactive antibodies (PRA), and waiting list and dialysis times. Only 2 factors showed differences: highly sensitized patients and patients >65 years old (Bernoulli test). An agreement between the INDT and the Engineering Faculty opened a major field of study. During 2008 the data analysis and model building began. The waiting list data of the last decade of donors and transplants were processed to develop a virtual model. We used inputs of candidates and donors, with outputs and structure of the simulation system to evaluate the proposed changes. Currently, the INDT and the Mathematics and Statistics Institute are working to develop a simulation model that is able to analyze our new national allocation system.

  3. Real-Time Adaptive Control Allocation Applied to a High Performance Aircraft

    NASA Technical Reports Server (NTRS)

    Davidson, John B.; Lallman, Frederick J.; Bundick, W. Thomas

    2001-01-01

    This paper presents the development and application of one approach to the control of aircraft with large numbers of control effectors. This approach, referred to as real-time adaptive control allocation, combines a nonlinear method for control allocation with actuator failure detection and isolation. The control allocator maps moment (or angular acceleration) commands into physical control effector commands as functions of individual control effectiveness and availability. The actuator failure detection and isolation algorithm is a model-based approach that uses models of the actuators to predict actuator behavior and an adaptive decision threshold to achieve acceptable false alarm/missed detection rates. This integrated approach provides control reconfiguration when an aircraft is subjected to actuator failure, thereby improving maneuverability and survivability of the degraded aircraft. This method is demonstrated on a next generation military aircraft (Lockheed-Martin Innovative Control Effector) simulation that has been modified to include a novel nonlinear fluid flow control effector based on passive porosity. Desktop and real-time piloted simulation results demonstrate the performance of this integrated adaptive control allocation approach.

  4. Bounding quantum gate error rate based on reported average fidelity

    NASA Astrophysics Data System (ADS)

    Sanders, Yuval R.; Wallman, Joel J.; Sanders, Barry C.

    2016-01-01

    Remarkable experimental advances in quantum computing are exemplified by recent announcements of impressive average gate fidelities exceeding 99.9% for single-qubit gates and 99% for two-qubit gates. Although these high numbers engender optimism that fault-tolerant quantum computing is within reach, the connection of average gate fidelity with fault-tolerance requirements is not direct. Here we use reported average gate fidelity to determine an upper bound on the quantum-gate error rate, which is the appropriate metric for assessing progress towards fault-tolerant quantum computation, and we demonstrate that this bound is asymptotically tight for general noise. Although this bound is unlikely to be saturated by experimental noise, we demonstrate using explicit examples that the bound indicates a realistic deviation between the true error rate and the reported average fidelity. We introduce the Pauli distance as a measure of this deviation, and we show that knowledge of the Pauli distance enables tighter estimates of the error rate of quantum gates.
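
    A hedged numerical aside: the first step in relating a reported average fidelity to an error rate is the standard conversion between average and process (entanglement) fidelity, F_avg = (d·F_pro + 1)/(d + 1). The sketch below applies only that textbook relation to the headline numbers; the paper's worst-case error-rate bound and the Pauli distance are not reproduced here.

```python
# Convert a reported average gate fidelity into average and process infidelities
# via the standard relation F_avg = (d*F_pro + 1)/(d + 1). Illustration only;
# this is not the paper's worst-case bound.

def infidelities(f_avg: float, d: int):
    r_avg = 1.0 - f_avg                          # average gate infidelity
    f_pro = ((d + 1) * f_avg - 1.0) / d          # process (entanglement) fidelity
    return r_avg, 1.0 - f_pro

for f_avg, d in [(0.999, 2), (0.99, 4)]:         # single- and two-qubit examples
    r_avg, r_pro = infidelities(f_avg, d)
    print(f"d={d}: average infidelity {r_avg:.4f}, process infidelity {r_pro:.4f}")
```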

  5. Bounding quantum gate error rate based on reported average fidelity

    NASA Astrophysics Data System (ADS)

    Sanders, Yuval; Wallman, Joel; Sanders, Barry

    Remarkable experimental advances in quantum computing are exemplified by recent announcements of impressive average gate fidelities exceeding 99.9% for single-qubit gates and 99% for two-qubit gates. Although these high numbers engender optimism that fault-tolerant quantum computing is within reach, the connection of average gate fidelity with fault-tolerance requirements is not direct. Here we use reported average gate fidelity to determine an upper bound on the quantum-gate error rate, which is the appropriate metric for assessing progress towards fault-tolerant quantum computation, and we demonstrate that this bound is asymptotically tight for general noise. Although this bound is unlikely to be saturated by experimental noise, we demonstrate using explicit examples that the bound indicates a realistic deviation between the true error rate and the reported average fidelity. We introduce the Pauli-distance as a measure of this deviation, and we show that knowledge of the Pauli-distance enables tighter estimates of the error rate of quantum gates.

  6. Infant breathing rate counter based on variable resistor for pneumonia

    NASA Astrophysics Data System (ADS)

    Sakti, Novi Angga; Hardiyanto, Ardy Dwi; La Febry Andira R., C.; Camelya, Kesa; Widiyanti, Prihartini

    2016-03-01

    Pneumonia is one of the leading causes of death in newborn babies in Indonesia. According to WHO in 2002, breathing rate is a very important index and symptom of pneumonia. In the Community Health Center, the nurses count breaths with a stopwatch for exactly one minute. Miscalculations occur at the Community Health Center because of the long period of concentration required and the need to focus on two objects at once. These calculation errors can cause a baby who should be admitted to the hospital to be attended only at home. Therefore, an accurate breathing rate counter at the Community Health Center level is necessary. In this work, the resistance change of a variable resistor is used as a breathing rate counter. A resistance change in a voltage divider produces a voltage change; if the variable resistance moves periodically, the voltage changes periodically too. The voltage change is counted by software in the microcontroller. Every millimetre of shift at the variable resistor produces an average voltage change of 0.96 V. The software counts the number of waves generated by the shifting resistor.
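
    A minimal sketch of the counting idea, assuming a simple threshold-crossing rule on the sampled voltage-divider output; the sampling rate, threshold, and simulated chest-movement signal are hypothetical, and the actual firmware is not described in the abstract.

```python
# Count one breath per rising crossing of a threshold on the sampled voltage.
# Sampling rate, threshold, and the simulated signal are assumptions.
import numpy as np

fs = 50.0                                   # samples per second (assumed)
t = np.arange(0, 60, 1 / fs)                # one minute of data
voltage = 2.5 + 0.96 * np.sin(2 * np.pi * (40 / 60) * t)   # ~40 breaths/min infant

threshold = 2.5
rising = (voltage[1:] >= threshold) & (voltage[:-1] < threshold)
breaths_per_minute = int(rising.sum())
print("breathing rate:", breaths_per_minute, "breaths/min")
```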

  7. Ocean Color Based Estimates of Global Photochemical Rate Processes

    NASA Astrophysics Data System (ADS)

    Nelson, N. B.; Siegel, D. A.; Toole, D. A.

    2005-12-01

    The development and validation of new ocean color data products beyond chlorophyll allows for the assessment of biogeochemically relevant rate processes other than primary production, such as CO production and DMS photolysis. We present here a proof-of-concept study in which we integrate multiple global remote sensing data streams to estimate the solar irradiance absorbed by chromophoric dissolved organic matter (CDOM) in the euphotic zone. This quantity can be convolved with apparent quantum yield spectra to estimate photochemical reaction rates. In this study we use ocean color reflectance spectra from SeaWiFS and/or MODIS to estimate in-water light absorption and backscattering spectra using the Garver-Siegel-Maritorena ocean color model. These quantities were used to empirically estimate the diffuse attenuation coefficient spectrum (Kd) for surface waters, and thus depth profiles of light penetration. UV Irradiance spectra at the surface were estimated using TOMS data. We also estimated the scalar to vector irradiance ratio using information from radiative transfer modeling in conjunction with absorption and backscattering coefficient spectra. These quantities were combined to estimate the spectrum of light absorption by CDOM, or photochemically active radiation. Finally, we combined the photochemically active radiation spectra with open ocean estimates of apparent quantum yield to produce maps of photochemical production of CO. Global maps of time integrated production rates closely resemble similar maps of CDOM distribution, indicating a proximal control of photochemistry by CDOM.
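
    The final convolution step can be illustrated with a toy calculation: multiply the in-water scalar irradiance by CDOM absorption to obtain absorbed photons, weight by an apparent quantum yield spectrum, and integrate over wavelength. All spectra below are invented placeholders, not SeaWiFS/MODIS/TOMS retrievals.

```python
# Toy convolution of absorbed photons with an apparent quantum yield (AQY)
# spectrum to estimate a CO photoproduction rate. All spectra are invented.
import numpy as np

wavelength = np.arange(300, 601, 10)                                   # nm
scalar_irradiance = 1e-5 * np.exp(-(wavelength - 480) ** 2 / 8000.0)   # mol photons m-2 s-1 nm-1
a_cdom = 0.5 * np.exp(-0.018 * (wavelength - 300))                     # CDOM absorption, m-1
aqy_co = 1e-4 * np.exp(-0.02 * (wavelength - 300))                     # mol CO per mol photons absorbed

absorbed = scalar_irradiance * a_cdom                       # photons absorbed by CDOM
co_production = np.trapz(absorbed * aqy_co, wavelength)     # mol CO m-3 s-1
print(f"CO photoproduction ~ {co_production:.3e} mol m^-3 s^-1")
```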

  8. The Impact of Statistical Choices on Neonatal Intensive Care Unit Quality Ratings Based on Nosocomial Infection Rates

    PubMed Central

    Lee, Henry Chong; Chien, Alyna T.; Bardach, Naomi S.; Clay, Ted; Gould, Jeffrey B.; Dudley, R. Adams

    2014-01-01

    Objective To examine the extent to which performance assessment methodologies affect the percent of neonatal intensive care units (NICUs) and very low birth weight (VLBW) infants included in performance assessments, distribution of NICU performance ratings, and level of agreement in those ratings. Design Cross-sectional study based on risk-adjusted nosocomial infection rates. Setting NICUs belonging to the California Perinatal Quality Care Collaborative 2007–2008. Participants 126 California NICUs and 10,487 VLBW infants. Main Exposure Three performance assessment choices: 1. Excluding “low-volume” NICUs (those caring for < 30 VLBW infants in a year) vs. a criterion based on confidence intervals, 2. Using Bayesian vs. frequentist hierarchical models, and 3. Pooling data across one vs. two years. Main Outcome Measures Proportion of NICUs and patients included in quality assessment, distribution of ratings for NICUs, and agreement between methods using the kappa statistic. Results Depending on the methods applied, between 51% and 85% of NICUs were included in performance assessment, the percent of VLBW infants included in performance assessment ranged from 72% to 96%, between 76% and 87% of NICUs were considered “average,” and the level of agreement between NICU ratings ranged from 0.26 to 0.89. Conclusions The percent of NICUs included in performance assessment and their ratings can shift dramatically depending on performance measurement methodology. Physicians, payers, and policymakers should continue to closely examine which existing performance assessment methodologies are most appropriate for evaluating pediatric care quality. PMID:21536958
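
    As a small illustration of the agreement measure used above, the sketch below computes Cohen's kappa for two hypothetical sets of NICU ratings; the ratings and categories are made up, not CPQCC data.

```python
# Cohen's kappa for agreement between two rating methods (made-up ratings).
import numpy as np

def cohens_kappa(a, b, labels):
    a, b = np.asarray(a), np.asarray(b)
    p_obs = np.mean(a == b)                                           # observed agreement
    p_exp = sum(np.mean(a == c) * np.mean(b == c) for c in labels)    # chance agreement
    return (p_obs - p_exp) / (1.0 - p_exp)

method_1 = ["average", "average", "better", "worse", "average", "better"]
method_2 = ["average", "better",  "better", "worse", "average", "average"]
print(round(cohens_kappa(method_1, method_2, ["worse", "average", "better"]), 2))
```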

  9. 23 CFR 1240.13 - Determination of national average seat belt use rate.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 23 Highways 1 2010-04-01 2010-04-01 false Determination of national average seat belt use rate... BASED ON SEAT BELT USE RATES Determination of Allocations § 1240.13 Determination of national average seat belt use rate. The national average seat belt use rate for a calendar year shall be the sum of...

  10. THE 1985 NAPAP EMISSIONS INVENTORY: DEVELOPMENT OF TEMPORAL ALLOCATION FACTORS

    EPA Science Inventory

    The report documents the development and processing of temporal allocation factors for the 1985 National Acid Precipitation Assessment Program (NAPAP) emissions inventory (Version 2). The NAPAP emissions inventory represents the most comprehensive emissions data base available fo...

  11. Probabilistic resource allocation system with self-adaptive capability

    NASA Technical Reports Server (NTRS)

    Yufik, Yan M. (Inventor)

    1996-01-01

    A probabilistic resource allocation system is disclosed containing a low capacity computational module (Short Term Memory or STM) and a self-organizing associative network (Long Term Memory or LTM) where nodes represent elementary resources, terminal end nodes represent goals, and directed links represent the order of resource association in different allocation episodes. Goals and their priorities are indicated by the user, and allocation decisions are made in the STM, while candidate associations of resources are supplied by the LTM based on the association strength (reliability). Reliability values are automatically assigned to the network links based on the frequency and relative success of exercising those links in the previous allocation decisions. Accumulation of allocation history in the form of an associative network in the LTM reduces computational demands on subsequent allocations. For this purpose, the network automatically partitions itself into strongly associated high reliability packets, allowing fast approximate computation and display of allocation solutions satisfying the overall reliability and other user-imposed constraints. System performance improves in time due to modification of network parameters and partitioning criteria based on the performance feedback.

  12. Probabilistic resource allocation system with self-adaptive capability

    NASA Technical Reports Server (NTRS)

    Yufik, Yan M. (Inventor)

    1998-01-01

    A probabilistic resource allocation system is disclosed containing a low capacity computational module (Short Term Memory or STM) and a self-organizing associative network (Long Term Memory or LTM) where nodes represent elementary resources, terminal end nodes represent goals, and weighted links represent the order of resource association in different allocation episodes. Goals and their priorities are indicated by the user, and allocation decisions are made in the STM, while candidate associations of resources are supplied by the LTM based on the association strength (reliability). Weights are automatically assigned to the network links based on the frequency and relative success of exercising those links in the previous allocation decisions. Accumulation of allocation history in the form of an associative network in the LTM reduces computational demands on subsequent allocations. For this purpose, the network automatically partitions itself into strongly associated high reliability packets, allowing fast approximate computation and display of allocation solutions satisfying the overall reliability and other user-imposed constraints. System performance improves in time due to modification of network parameters and partitioning criteria based on the performance feedback.

  13. Variable-rate colour image quantization based on quadtree segmentation

    NASA Astrophysics Data System (ADS)

    Hu, Y. C.; Li, C. Y.; Chuang, J. C.; Lo, C. C.

    2011-09-01

    A novel variable-sized block encoding with threshold control for colour image quantization (CIQ) is presented in this paper. In CIQ, the colour palette used has a great influence on the reconstructed image quality. Typically, a higher image quality and a larger storage cost are obtained when a larger palette is used in CIQ. To cut down the storage cost while preserving the quality of the reconstructed images, a threshold control policy for quadtree segmentation is used in this paper. Experimental results show that the proposed method adaptively provides the desired bit rates while achieving better image quality than CIQ using multiple palettes of different sizes.
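
    A minimal sketch of threshold-controlled quadtree segmentation in the spirit described above: a block is split into four quadrants whenever its colour variance exceeds a threshold. The variance criterion, threshold value, and recursion floor are illustrative assumptions; palette design and encoding are omitted.

```python
# Threshold-controlled quadtree segmentation sketch (variance criterion assumed).
import numpy as np

def quadtree_blocks(img, x, y, size, threshold, min_size=4):
    block = img[y:y + size, x:x + size]
    if size <= min_size or block.var() <= threshold:
        return [(x, y, size)]          # leaf: quantize this block with one sub-palette
    half = size // 2
    blocks = []
    for dx, dy in [(0, 0), (half, 0), (0, half), (half, half)]:
        blocks += quadtree_blocks(img, x + dx, y + dy, half, threshold, min_size)
    return blocks

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64)).astype(float)   # stand-in grey image
leaves = quadtree_blocks(image, 0, 0, 64, threshold=4000.0)
print(len(leaves), "leaf blocks")
```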

  14. 44 CFR 61.12 - Rates based on a flood protection system involving Federal funds.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false Rates based on a flood... EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program INSURANCE COVERAGE AND RATES § 61.12 Rates based on a flood protection system...

  15. 44 CFR 61.12 - Rates based on a flood protection system involving Federal funds.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 44 Emergency Management and Assistance 1 2012-10-01 2011-10-01 true Rates based on a flood... EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program INSURANCE COVERAGE AND RATES § 61.12 Rates based on a flood protection system...

  16. 44 CFR 61.12 - Rates based on a flood protection system involving Federal funds.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false Rates based on a flood... EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program INSURANCE COVERAGE AND RATES § 61.12 Rates based on a flood protection system...

  17. 44 CFR 61.12 - Rates based on a flood protection system involving Federal funds.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Rates based on a flood... EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program INSURANCE COVERAGE AND RATES § 61.12 Rates based on a flood protection system...

  18. 44 CFR 61.12 - Rates based on a flood protection system involving Federal funds.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Rates based on a flood... EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program INSURANCE COVERAGE AND RATES § 61.12 Rates based on a flood protection system...

  19. 48 CFR 1616.7002 - Clause-contracts based on cost analysis (experience rated).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... cost analysis (experience rated). 1616.7002 Section 1616.7002 Federal Acquisition Regulations System... based on cost analysis (experience rated). The clause at section 1652.216-71 shall be inserted in all FEHBP contracts based on cost analysis (experience rated)....

  20. High-level reasoning and base-rate use: do we need cue-competition to explain the inverse base-rate effect?

    PubMed

    Juslin, P; Wennerholm, P; Winman, A

    2001-05-01

    Previous accounts of the inverse base-rate effect (D. L. Medin & S. M. Edelson, 1988) have revolved around the concept of cue-competition. In this article, the authors propose that high-level reasoning in the form of an eliminative inference mechanism may contribute to the effect. A quantitative implementation of this idea demonstrates that it has the power by itself to produce the pattern of base-rate effects in the Medin and Edelson (1988) design. Four predictions are derived that contradict the predictions by attention to distinctive input (ADIT; J. K. Kruschke, 1996), up to date the most successful account of the inverse base-rate effect. Results from 3 experiments disconfirm the predictions by ADIT and demonstrate the importance of high-level reasoning in designs of the Medin and Edelson kind. Implications for the interpretation of the inverse base-rate effect and the attention-shifting mechanisms presumed by ADIT are discussed. PMID:11394684

  1. A fault-based model for crustal deformation, fault slip-rates and off-fault strain rate in California

    USGS Publications Warehouse

    Zeng, Yuehua; Shen, Zheng-Kang

    2016-01-01

    We invert Global Positioning System (GPS) velocity data to estimate fault slip rates in California using a fault‐based crustal deformation model with geologic constraints. The model assumes buried elastic dislocations across the region using Uniform California Earthquake Rupture Forecast Version 3 (UCERF3) fault geometries. New GPS velocity and geologic slip‐rate data were compiled by the UCERF3 deformation working group. The result of least‐squares inversion shows that the San Andreas fault slips at 19–22 mm/yr along Santa Cruz to the North Coast, 25–28 mm/yr along the central California creeping segment to the Carrizo Plain, 20–22 mm/yr along the Mojave, and 20–24 mm/yr along the Coachella to the Imperial Valley. Modeled slip rates are 7–16 mm/yr lower than the preferred geologic rates from the central California creeping section to the San Bernardino North section. For the Bartlett Springs section, fault slip rates of 7–9 mm/yr fall within the geologic bounds but are twice the preferred geologic rates. For the central and eastern Garlock, inverted slip rates of 7.5 and 4.9 mm/yr, respectively, match closely with the geologic rates. For the western Garlock, however, our result suggests a low slip rate of 1.7 mm/yr. Along the eastern California shear zone and southern Walker Lane, our model shows a cumulative slip rate of 6.2–6.9 mm/yr across its east–west transects, which is an increase of ∼1 mm/yr over the geologic estimates. For the off‐coast faults of central California, from Hosgri to San Gregorio, fault slips are modeled at 1–5 mm/yr, similar to the lower geologic bounds. For the off‐fault deformation, the total moment rate amounts to 0.88×10¹⁹ N·m/yr, with fast straining regions found around the Mendocino triple junction, Transverse Ranges and Garlock fault zones, Landers and Brawley seismic zones, and farther south. The overall California moment rate is 2.76×10¹⁹ N·m/yr.

  2. Strain Rate Behavior of HTPB-Based Magnetorheological Materials

    NASA Astrophysics Data System (ADS)

    Stoltz, Chad; Seminuk, Kenneth; Joshi, Vasant

    2013-06-01

    It is of particular interest to determine whether the mechanical properties of binder systems can be manipulated by adding ferrous or Magnetostrictive particulates. Strain rate response of two HTPB/Fe (Hydroxyl-terminated Polybutadiene/Iron) compositions under electromagnetic fields has been investigated using a Split Hopkinson Pressure bar arrangement equipped with aluminum bars. Two HTPB/Fe compositions were developed, the first without plasticizer and the second containing plasticizer. Samples were tested with and without the application of a 0.01 Tesla magnetic field coil. Strain gauge data taken from the Split Hopkinson Pressure bar has been used to determine what mechanical properties were changed by inducing a mild electromagnetic field onto each sample. The data reduction method to obtain stress-strain plots included dispersion corrections for deciphering minute changes due to compositional alterations. Data collected from the Split Hopkinson Pressure bar indicate changes in the Mechanical Stress-Strain curves and suggest that the impedance of a binder system can be altered by means of a magnetic field. We acknowledge the Defense Threat Reduction Agency for funding.

  3. Adaptive allocation for binary outcomes using decreasingly informative priors.

    PubMed

    Sabo, Roy T

    2014-01-01

    A method of outcome-adaptive allocation is presented using Bayes methods, where a natural lead-in is incorporated through the use of informative yet skeptical prior distributions for each treatment group. These prior distributions are modeled on unobserved data in such a way that their influence on the allocation scheme decreases as the trial progresses. Simulation studies show this method to behave comparably to the Bayesian adaptive allocation method described by Thall and Wathen (2007), who incorporate a natural lead-in through sample-size-based exponents.
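
    A toy sketch in the spirit of this approach is given below: Beta priors centred at 0.5 act as skeptical pseudo-data, and a Thall-Wathen-style exponent keeps early allocation near 1:1. The prior strength, true response rates, and tuning exponent are illustrative assumptions, not the author's exact specification.

```python
# Toy outcome-adaptive allocation for binary outcomes with skeptical Beta priors.
# Prior strength, true rates, and the tuning exponent are assumptions.
import numpy as np

rng = np.random.default_rng(1)
true_p = {"A": 0.35, "B": 0.55}                      # unknown true response rates
prior = {"A": (2.0, 2.0), "B": (2.0, 2.0)}           # skeptical priors centred at 0.5
successes = {"A": 0, "B": 0}
n = {"A": 0, "B": 0}
N_max = 200

for i in range(N_max):
    # Posterior probability that B beats A, by Monte Carlo over Beta posteriors.
    draws = {k: rng.beta(prior[k][0] + successes[k],
                         prior[k][1] + n[k] - successes[k], 5000) for k in ("A", "B")}
    p_b_better = np.mean(draws["B"] > draws["A"])
    c = (i + 1) / (2.0 * N_max)                      # lead-in exponent: early trials ~1:1
    prob_assign_b = p_b_better ** c / (p_b_better ** c + (1 - p_b_better) ** c)
    arm = "B" if rng.random() < prob_assign_b else "A"
    n[arm] += 1
    successes[arm] += rng.random() < true_p[arm]

print("allocations:", n, "successes:", successes)
```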

  4. Fair resource allocation and stability for communication networks with multipath routing

    NASA Astrophysics Data System (ADS)

    Li, Shiyong; Sun, Wei; Hua, Changchun

    2014-11-01

    Multipath networks allow each source-destination pair to have several different paths for data transmission; they therefore improve the performance of increasingly bandwidth-hungry applications and cater well for traffic load balancing and bandwidth usage efficiency. This paper investigates fair resource allocation for users in multipath networks and formulates it as a multipath network utility maximisation problem with several fairness concepts. By applying the Lagrangian method, sub-problems for users and paths are derived from the resource allocation model and interpreted from an economic point of view. In order to solve the model, a novel rate-based flow control algorithm is proposed for achieving optimal resource allocation, which depends only on local information. In the presence of round-trip delays, sufficient conditions are obtained for local stability of the delayed algorithm. As for the end-to-end implementation in the Internet, a window-based flow control mechanism is presented since it is more convenient to implement than rate-based flow control.
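
    A much-simplified sketch of the price-based (dual) rate control that this line of work builds on is shown below, for log utilities and fixed single paths rather than the full multipath model: links update prices from congestion, and each user sets its rate from its path price. Topology, capacities, weights, and step size are invented for illustration.

```python
# Simplified dual (price-based) rate control for network utility maximisation
# with log utilities and fixed single paths. All parameters are hypothetical.
import numpy as np

# routing matrix R[l, s] = 1 if user s uses link l (assumed fixed paths)
R = np.array([[1, 1, 0],
              [0, 1, 1]], dtype=float)
capacity = np.array([1.0, 2.0])
w = np.array([1.0, 1.0, 1.0])         # utility weights, U_s(x) = w_s * log(x)
price = np.ones(2)
gamma = 0.05                          # price step size

for _ in range(2000):
    path_price = R.T @ price                       # total price seen by each user
    x = w / np.maximum(path_price, 1e-9)           # U'(x) = w/x = path price
    link_load = R @ x
    price = np.maximum(price + gamma * (link_load - capacity), 0.0)

print("rates:", np.round(x, 3), "link loads:", np.round(R @ x, 3))
```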

  5. Legal briefing: organ donation and allocation.

    PubMed

    Pope, Thaddeus Mason

    2010-01-01

    This issue's "Legal Briefing" column covers legal developments pertaining to organ donation and allocation. This topic has been the subject of recent articles in JCE. Organ donation and allocation have also recently been the subjects of significant public policy attention. In the past several months, legislatures and regulatory agencies across the United States and across the world have changed, or considered changing, the methods for procuring and distributing human organs for transplantation. Currently, in the U.S., more than 100,000 persons are waiting for organ transplantation. In China, more than 1.5 million people are waiting. Given the chronic shortage of available organs (especially kidneys and livers) relative to demand, the primary focus of most legal developments has been on increasing the rate of donation. These and related developments are usefully divided into the following 12 topical categories: 1. Revised Uniform Anatomical Gift Act. 2. Presumed Consent and Opt-Out. 3. Mandated Choice. 4. Donation after Cardiac Death. 5. Payment and Compensation. 6. Donation by Prisoners. 7. Donor Registries. 8. Public Education. 9. Other Procurement Initiatives. 10. Lawsuits and Liability. 11. Trafficking and Tourism. 12. Allocation and Distribution. PMID:21089996

  6. Agreement in cardiovascular risk rating based on anthropometric parameters

    PubMed Central

    Dantas, Endilly Maria da Silva; Pinto, Cristiane Jordânia; Freitas, Rodrigo Pegado de Abreu; de Medeiros, Anna Cecília Queiroz

    2015-01-01

    Objective To investigate the agreement in evaluation of risk of developing cardiovascular diseases based on anthropometric parameters in young adults. Methods The study included 406 students; weight, height, and waist and neck circumferences were measured, and the waist-to-height ratio and the conicity index were calculated. The kappa coefficient was used to assess agreement in risk classification for cardiovascular diseases. The positive and negative specific agreement values were calculated as well. The Pearson chi-square (χ2) test was used to assess associations between categorical variables (p<0.05). Results The majority of the parameters assessed (44%) showed slight (k=0.21 to 0.40) and/or poor agreement (k<0.20), with low values of negative specific agreement. The best agreement was observed between waist circumference and waist-to-height ratio both for the general population (k=0.88) and between sexes (k=0.93 to 0.86). There was a significant association (p<0.001) between the risk of cardiovascular diseases and females when using waist circumference and conicity index, and with males when using neck circumference. This resulted in a wide variation in the prevalence of cardiovascular disease risk (5.5%-36.5%), depending on the parameter and the sex that was assessed. Conclusion The results indicate variability in agreement in assessing risk for cardiovascular diseases, based on anthropometric parameters, and which also seems to be influenced by sex. Further studies in the Brazilian population are required to better understand this issue. PMID:26466060

  7. Rethinking Reinforcement: Allocation, Induction, and Contingency

    PubMed Central

    Baum, William M

    2012-01-01

    The concept of reinforcement is at least incomplete and almost certainly incorrect. An alternative way of organizing our understanding of behavior may be built around three concepts: allocation, induction, and correlation. Allocation is the measure of behavior and captures the centrality of choice: All behavior entails choice and consists of choice. Allocation changes as a result of induction and correlation. The term induction covers phenomena such as adjunctive, interim, and terminal behavior—behavior induced in a situation by occurrence of food or another Phylogenetically Important Event (PIE) in that situation. Induction resembles stimulus control in that no one-to-one relation exists between induced behavior and the inducing event. If one allowed that some stimulus control were the result of phylogeny, then induction and stimulus control would be identical, and a PIE would resemble a discriminative stimulus. Much evidence supports the idea that a PIE induces all PIE-related activities. Research also supports the idea that stimuli correlated with PIEs become PIE-related conditional inducers. Contingencies create correlations between “operant” activity (e.g., lever pressing) and PIEs (e.g., food). Once an activity has become PIE-related, the PIE induces it along with other PIE-related activities. Contingencies also constrain possible performances. These constraints specify feedback functions, which explain phenomena such as the higher response rates on ratio schedules in comparison with interval schedules. Allocations that include a lot of operant activity are “selected” only in the sense that they generate more frequent occurrence of the PIE within the constraints of the situation; contingency and induction do the “selecting.” PMID:22287807

  8. Rethinking reinforcement: allocation, induction, and contingency.

    PubMed

    Baum, William M

    2012-01-01

    The concept of reinforcement is at least incomplete and almost certainly incorrect. An alternative way of organizing our understanding of behavior may be built around three concepts: allocation, induction, and correlation. Allocation is the measure of behavior and captures the centrality of choice: All behavior entails choice and consists of choice. Allocation changes as a result of induction and correlation. The term induction covers phenomena such as adjunctive, interim, and terminal behavior-behavior induced in a situation by occurrence of food or another Phylogenetically Important Event (PIE) in that situation. Induction resembles stimulus control in that no one-to-one relation exists between induced behavior and the inducing event. If one allowed that some stimulus control were the result of phylogeny, then induction and stimulus control would be identical, and a PIE would resemble a discriminative stimulus. Much evidence supports the idea that a PIE induces all PIE-related activities. Research also supports the idea that stimuli correlated with PIEs become PIE-related conditional inducers. Contingencies create correlations between "operant" activity (e.g., lever pressing) and PIEs (e.g., food). Once an activity has become PIE-related, the PIE induces it along with other PIE-related activities. Contingencies also constrain possible performances. These constraints specify feedback functions, which explain phenomena such as the higher response rates on ratio schedules in comparison with interval schedules. Allocations that include a lot of operant activity are "selected" only in the sense that they generate more frequent occurrence of the PIE within the constraints of the situation; contingency and induction do the "selecting." PMID:22287807

  9. Rethinking reinforcement: allocation, induction, and contingency.

    PubMed

    Baum, William M

    2012-01-01

    The concept of reinforcement is at least incomplete and almost certainly incorrect. An alternative way of organizing our understanding of behavior may be built around three concepts: allocation, induction, and correlation. Allocation is the measure of behavior and captures the centrality of choice: All behavior entails choice and consists of choice. Allocation changes as a result of induction and correlation. The term induction covers phenomena such as adjunctive, interim, and terminal behavior-behavior induced in a situation by occurrence of food or another Phylogenetically Important Event (PIE) in that situation. Induction resembles stimulus control in that no one-to-one relation exists between induced behavior and the inducing event. If one allowed that some stimulus control were the result of phylogeny, then induction and stimulus control would be identical, and a PIE would resemble a discriminative stimulus. Much evidence supports the idea that a PIE induces all PIE-related activities. Research also supports the idea that stimuli correlated with PIEs become PIE-related conditional inducers. Contingencies create correlations between "operant" activity (e.g., lever pressing) and PIEs (e.g., food). Once an activity has become PIE-related, the PIE induces it along with other PIE-related activities. Contingencies also constrain possible performances. These constraints specify feedback functions, which explain phenomena such as the higher response rates on ratio schedules in comparison with interval schedules. Allocations that include a lot of operant activity are "selected" only in the sense that they generate more frequent occurrence of the PIE within the constraints of the situation; contingency and induction do the "selecting."

  10. 42 CFR 413.174 - Prospective rates for hospital-based and independent ESRD facilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Prospective rates for hospital-based and....174 Prospective rates for hospital-based and independent ESRD facilities. Link to an amendment... January 1, 2009, the methodology differentiates between hospital-based and independent ESRD facilities;...

  11. Patch Network for Power Allocation and Distribution in Smart Materials

    NASA Technical Reports Server (NTRS)

    Golembiewski, Walter T.

    2000-01-01

    The power allocation and distribution (PAD) circuitry is capable of allocating and distributing a single or multiple sources of power over multi-elements of a power user grid system. The purpose of this invention is to allocate and distribute power that is collected by individual patch rectennas to a region of specific power-user devices, such as actuators. The patch rectenna converts microwave power into DC power. Then this DC power is used to drive actuator devices. However, the power from patch rectennas is not sufficient to drive actuators unless all the collected power is effectively used to drive another group by allocation and distribution. The power allocation and distribution (PAD) circuitry solves the shortfall of power for devices in a large array. The PAD concept is based on the networked power control in which power collected over the whole array of rectennas is allocated to a sub domain where a group of devices is required to be activated for operation. Then the allocated power is distributed to individual element of power-devices in the sub domain according to a selected run-mode.

  12. Preserving the allocation ratio at every allocation with biased coin randomization and minimization in studies with unequal allocation.

    PubMed

    Kuznetsova, Olga M; Tymofyeyev, Yevgen

    2012-04-13

    The demand for unequal allocation in clinical trials is growing. Most commonly, the unequal allocation is achieved through permuted block randomization. However, other allocation procedures might be required to better approximate the allocation ratio in small samples, reduce the selection bias in open-label studies, or balance on baseline covariates. When these allocation procedures are generalized to unequal allocation, special care is to be taken to preserve the allocation ratio at every allocation step. This paper offers a way to expand the biased coin randomization to unequal allocation that preserves the allocation ratio at every allocation. The suggested expansion works with biased coin randomization that balances only on treatment group totals and with covariate-adaptive procedures that use a random biased coin element at every allocation. Balancing properties of the allocation ratio preserving biased coin randomization and minimization are described through simulations. It is demonstrated that these procedures are asymptotically protected against the shift in the rerandomization distribution identified for some examples of minimization with 1:2 allocation. The asymptotic shift in the rerandomization distribution of the difference in treatment means for an arbitrary unequal allocation procedure is explicitly derived in the paper.
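
    For orientation only, the sketch below shows a naive Efron-style biased coin extended to 1:2 allocation, the kind of baseline whose unconditional allocation ratio the paper's modification is designed to preserve; it balances group totals but is not the authors' procedure, and the bias parameter is an arbitrary choice.

```python
# Naive biased-coin rule for 1:2 allocation (illustrative baseline, not the
# allocation-ratio-preserving procedure described in the paper).
import random

def next_assignment(n_a, n_b, target_a=1, target_b=2, bias=0.8):
    total = n_a + n_b
    if total == 0:
        return "A" if random.random() < target_a / (target_a + target_b) else "B"
    expected_a = total * target_a / (target_a + target_b)
    if n_a < expected_a:                 # A behind target -> favour A
        p_a = bias
    elif n_a > expected_a:               # A ahead of target -> favour B
        p_a = 1.0 - bias
    else:                                # on target -> use the unconditional ratio
        p_a = target_a / (target_a + target_b)
    return "A" if random.random() < p_a else "B"

random.seed(3)
counts = {"A": 0, "B": 0}
for _ in range(300):
    counts[next_assignment(counts["A"], counts["B"])] += 1
print(counts)   # stays close to a 1:2 split in group totals
```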

  13. Collective credit allocation in science

    PubMed Central

    Shen, Hua-Wei; Barabási, Albert-László

    2014-01-01

    Collaboration among researchers is an essential component of the modern scientific enterprise, playing a particularly important role in multidisciplinary research. However, we continue to wrestle with allocating credit to the coauthors of publications with multiple authors, because the relative contribution of each author is difficult to determine. At the same time, the scientific community runs an informal field-dependent credit allocation process that assigns credit in a collective fashion to each work. Here we develop a credit allocation algorithm that captures the coauthors’ contribution to a publication as perceived by the scientific community, reproducing the informal collective credit allocation of science. We validate the method by identifying the authors of Nobel-winning papers that are credited for the discovery, independent of their positions in the author list. The method can also compare the relative impact of researchers working in the same field, even if they did not publish together. The ability to accurately measure the relative credit of researchers could affect many aspects of credit allocation in science, potentially impacting hiring, funding, and promotion decisions. PMID:25114238

  14. Collective credit allocation in science.

    PubMed

    Shen, Hua-Wei; Barabási, Albert-László

    2014-08-26

    Collaboration among researchers is an essential component of the modern scientific enterprise, playing a particularly important role in multidisciplinary research. However, we continue to wrestle with allocating credit to the coauthors of publications with multiple authors, because the relative contribution of each author is difficult to determine. At the same time, the scientific community runs an informal field-dependent credit allocation process that assigns credit in a collective fashion to each work. Here we develop a credit allocation algorithm that captures the coauthors' contribution to a publication as perceived by the scientific community, reproducing the informal collective credit allocation of science. We validate the method by identifying the authors of Nobel-winning papers that are credited for the discovery, independent of their positions in the author list. The method can also compare the relative impact of researchers working in the same field, even if they did not publish together. The ability to accurately measure the relative credit of researchers could affect many aspects of credit allocation in science, potentially impacting hiring, funding, and promotion decisions. PMID:25114238

  15. Base Rates of Social Skills Acquisition/Performance Deficits, Strengths, and Problem Behaviors: An Analysis of the Social Skills Improvement System-Rating Scales

    ERIC Educational Resources Information Center

    Gresham, Frank M.; Elliott, Stephen N.; Kettler, Ryan J.

    2010-01-01

    Base rate information is important in clinical assessment because one cannot know how unusual or typical a phenomenon is without first knowing its base rate in the population. This study empirically determined the base rates of social skills acquisition and performance deficits, social skills strengths, and problem behaviors using a nationally…

  16. Experimental ocean acidification alters the allocation of metabolic energy

    PubMed Central

    Pan, T.-C. Francis; Applebaum, Scott L.; Manahan, Donal T.

    2015-01-01

    Energy is required to maintain physiological homeostasis in response to environmental change. Although responses to environmental stressors frequently are assumed to involve high metabolic costs, the biochemical bases of actual energy demands are rarely quantified. We studied the impact of a near-future scenario of ocean acidification [800 µatm partial pressure of CO2 (pCO2)] during the development and growth of an important model organism in developmental and environmental biology, the sea urchin Strongylocentrotus purpuratus. Size, metabolic rate, biochemical content, and gene expression were not different in larvae growing under control and seawater acidification treatments. Measurements limited to those levels of biological analysis did not reveal the biochemical mechanisms of response to ocean acidification that occurred at the cellular level. In vivo rates of protein synthesis and ion transport increased ∼50% under acidification. Importantly, the in vivo physiological increases in ion transport were not predicted from total enzyme activity or gene expression. Under acidification, the increased rates of protein synthesis and ion transport that were sustained in growing larvae collectively accounted for the majority of available ATP (84%). In contrast, embryos and prefeeding and unfed larvae in control treatments allocated on average only 40% of ATP to these same two processes. Understanding the biochemical strategies for accommodating increases in metabolic energy demand and their biological limitations can serve as a quantitative basis for assessing sublethal effects of global change. Variation in the ability to allocate ATP differentially among essential functions may be a key basis of resilience to ocean acidification and other compounding environmental stressors. PMID:25825763

  17. Experimental ocean acidification alters the allocation of metabolic energy.

    PubMed

    Pan, T-C Francis; Applebaum, Scott L; Manahan, Donal T

    2015-04-14

    Energy is required to maintain physiological homeostasis in response to environmental change. Although responses to environmental stressors frequently are assumed to involve high metabolic costs, the biochemical bases of actual energy demands are rarely quantified. We studied the impact of a near-future scenario of ocean acidification [800 µatm partial pressure of CO2 (pCO2)] during the development and growth of an important model organism in developmental and environmental biology, the sea urchin Strongylocentrotus purpuratus. Size, metabolic rate, biochemical content, and gene expression were not different in larvae growing under control and seawater acidification treatments. Measurements limited to those levels of biological analysis did not reveal the biochemical mechanisms of response to ocean acidification that occurred at the cellular level. In vivo rates of protein synthesis and ion transport increased ∼50% under acidification. Importantly, the in vivo physiological increases in ion transport were not predicted from total enzyme activity or gene expression. Under acidification, the increased rates of protein synthesis and ion transport that were sustained in growing larvae collectively accounted for the majority of available ATP (84%). In contrast, embryos and prefeeding and unfed larvae in control treatments allocated on average only 40% of ATP to these same two processes. Understanding the biochemical strategies for accommodating increases in metabolic energy demand and their biological limitations can serve as a quantitative basis for assessing sublethal effects of global change. Variation in the ability to allocate ATP differentially among essential functions may be a key basis of resilience to ocean acidification and other compounding environmental stressors.

  18. Background activities, induction, and behavioral allocation in operant performance.

    PubMed

    Baum, William M; Davison, Michael

    2014-09-01

    In experiments on operant behavior, other activities, called "background" activities, compete with the operant activities. Herrnstein's (1970) formulation of the matching law included background reinforcers in the form of a parameter rO, but remained vague about the activities (BO) that produce rO. To gain more understanding, we analyzed data from three studies of performance with pairs of variable-interval schedules that changed frequently in the relative rate at which they produced food: Baum and Davison (2014), Belke and Heyman (1994), and Soto, McDowell, and Dallery (2005). Results sometimes deviated from the matching law, suggesting variation in rO. When rO was calculated from the matching equation, two results emerged: (a) rO is directly proportional to BO, as in a ratio schedule; and (b) rO and BO depend on the food rate, which is to say that BO consists of activities induced by food, as a phylogenetically important event. Other activities unrelated to food (BN ) correspond to Herrnstein's original conception of rO and may be included in the matching equation. A model based on Baum's (Baum, 2012) concepts of allocation, induction, and contingency explained the deviations from the matching law. In the model, operant activity B, BO, and BN competed unequally in the time allocation: B and BO both replaced BN , BO replaced lever pressing (Soto et al.), and key pecking replaced BO (Baum & Davison). Although the dependence of rO and BO on food rate changes Herrnstein's (1970) formulation, the model preserved the generalized matching law for operant activities by incorporating power-function induction.
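
    The step of calculating rO from the matching equation can be illustrated with Herrnstein's single-alternative hyperbola B = k·r/(r + rO), solved for rO; the response and reinforcer rates below are invented, not data from the cited studies.

```python
# Solve Herrnstein's hyperbola B = k*r/(r + rO) for the background parameter rO.
# The example numbers are invented for illustration.

def r_o_from_matching(B, k, r):
    """rO given response rate B, asymptotic rate k, and reinforcer rate r."""
    return r * (k - B) / B

for B, k, r in [(45.0, 60.0, 2.0), (30.0, 60.0, 0.5)]:
    print(f"B={B}, r={r} -> rO = {r_o_from_matching(B, k, r):.2f} per min")
```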

  19. Accept/decline decision module for the liver simulated allocation model.

    PubMed

    Kim, Sang-Phil; Gupta, Diwakar; Israni, Ajay K; Kasiske, Bertram L

    2015-03-01

    Simulated allocation models (SAMs) are used to evaluate organ allocation policies. An important component of SAMs is a module that decides whether each potential recipient will accept an offered organ. The objective of this study was to develop and test accept-or-decline classifiers based on several machine-learning methods in an effort to improve the SAM for liver allocation. Feature selection and imbalance correction methods were tested and best approaches identified for application to organ transplant data. Then, we used 2011 liver match-run data to compare classifiers based on logistic regression, support vector machines, boosting, classification and regression trees, and Random Forests. Finally, because the accept-or-decline module will be embedded in a simulation model, we also developed an evaluation tool for comparing performance of predictors, which we call sample-path accuracy. The Random Forest method resulted in the smallest overall error rate, and boosting techniques had greater accuracy when both sensitivity and specificity were simultaneously considered important. Our comparisons show that no method dominates all others on all performance measures of interest. A logistic regression-based classifier is easy to implement and allows for pinpointing the contribution of each feature toward the probability of acceptance. Other methods we tested did not have a similar interpretation. The Scientific Registry of Transplant Recipients decided to use the logistic regression-based accept-decline decision module in the next generation of liver SAM.
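
    A hedged sketch of this kind of comparison is shown below, fitting a logistic regression and a random forest on synthetic stand-in data and comparing simple hold-out error rates; the SRTR match-run features and the sample-path accuracy metric are not reproduced.

```python
# Compare a logistic regression and a random forest on synthetic accept/decline
# data. Feature meanings are hypothetical stand-ins, not SRTR variables.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([rng.normal(size=n),           # e.g. donor risk index (synthetic)
                     rng.normal(size=n),           # e.g. candidate severity score (synthetic)
                     rng.integers(0, 2, size=n)])  # e.g. regional share flag (synthetic)
logit = -0.5 + 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.6 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))       # accept (1) / decline (0)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, model in [("logistic", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
    model.fit(X_tr, y_tr)
    print(name, "error rate:", round(1 - model.score(X_te, y_te), 3))
```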

  20. Irrational time allocation in decision-making.

    PubMed

    Oud, Bastiaan; Krajbich, Ian; Miller, Kevin; Cheong, Jin Hyun; Botvinick, Matthew; Fehr, Ernst

    2016-01-13

    Time is an extremely valuable resource but little is known about the efficiency of time allocation in decision-making. Empirical evidence suggests that in many ecologically relevant situations, decision difficulty and the relative reward from making a correct choice, compared to an incorrect one, are inversely linked, implying that it is optimal to use relatively less time for difficult choice problems. This applies, in particular, to value-based choices, in which the relative reward from choosing the higher valued item shrinks as the values of the other options get closer to the best option and are thus more difficult to discriminate. Here, we experimentally show that people behave sub-optimally in such contexts. They do not respond to incentives that favour the allocation of time to choice problems in which the relative reward for choosing the best option is high; instead they spend too much time on problems in which the reward difference between the options is low. We demonstrate this by showing that it is possible to improve subjects' time allocation with a simple intervention that cuts them off when their decisions take too long. Thus, we provide a novel form of evidence that organisms systematically spend their valuable time in an inefficient way, and simultaneously offer a potential solution to the problem. PMID:26763695

  2. A Base Rate Approach to Evaluating Developmental Mathematics and English Courses at a Community College.

    ERIC Educational Resources Information Center

    Hector, Judith H.

    A study was conducted at Walters State Community College (WSCC) using a "base rate" approach to evaluate developmental mathematics and English courses. The study involved the collection of data on students' success rates in developmental courses and later in college-level courses; on developmental students' success rates compared with those of…

  3. A rate-compatible family of protograph-based LDPC codes built by expurgation and lengthening

    NASA Technical Reports Server (NTRS)

    Dolinar, Sam

    2005-01-01

    We construct a protograph-based rate-compatible family of low-density parity-check codes that cover a very wide range of rates from 1/2 to 16/17, perform within about 0.5 dB of their capacity limits for all rates, and can be decoded conveniently and efficiently with a common hardware implementation.
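
    As a rough, hedged illustration of how code rate moves under the two operations named in the title, using the usual coding-theory sense of the terms and the convention that a protograph with V transmitted variable nodes and C independent check nodes has design rate (V - C)/V; the paper's actual protograph construction and any punctured nodes are not reproduced here.

```python
# Minimal sketch: lengthening (adding transmitted variable nodes) raises the design
# rate of a protograph, while expurgation (adding a check constraint, i.e. removing
# codewords) lowers it. The base protograph below is hypothetical.
def design_rate(variable_nodes: int, check_nodes: int) -> float:
    return (variable_nodes - check_nodes) / variable_nodes

v, c = 4, 2                          # hypothetical base protograph, rate 1/2
print(design_rate(v, c))             # 0.5
for added in (4, 12, 28, 30):        # lengthening: add transmitted variable nodes
    print(design_rate(v + added, c)) # rate climbs; with 30 added nodes it reaches 16/17
print(design_rate(v, c + 1))         # expurgation: an extra check lowers the rate to 1/4
```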

  4. Effects of Personalization and Invitation Email Length on Web-Based Survey Response Rates

    ERIC Educational Resources Information Center

    Trespalacios, Jesús H.; Perkins, Ross A.

    2016-01-01

    Individual strategies to increase response rate and survey completion have been extensively researched. Recently, efforts have been made to investigate a combination of interventions to yield better response rates for web-based surveys. This study examined the effects of four different survey invitation conditions on response rate. From a large…

  5. Software For Allocation Of Tolerances

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken; Raman, Shivakumar; Pulat, Simin

    1992-01-01

    Collection of computer programs being developed to assist engineers in allocating tolerances to dimensions of components and assemblies. System reflects tolerancing expertise of design and manufacturing engineers; helps engineers maintain comprehensive tolerancing policy and overview that might otherwise get lost when attending to details of design and manufacturing processes. Tolerances must be allocated for three main reasons: they allow for variations in the dimensions of components as manufactured; when two or more components are assembled, the overall dimensions must lie between specified limits; and a replacement part must fit in place.
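
    The record does not describe the allocation algorithm itself; as a hedged sketch of one common approach (not necessarily the one used in this software), the snippet below distributes an assembly tolerance over component dimensions in proportion to nominal size and checks the worst-case stack-up. Dimensions and the assembly limit are hypothetical.

```python
# Sketch of proportional tolerance allocation for a one-dimensional stack-up.
# Assumption: each component tolerance is proportional to its nominal dimension,
# scaled so the worst-case sum equals the allowed assembly tolerance.
def allocate_tolerances(nominals, assembly_tol):
    total = sum(nominals)
    return [assembly_tol * d / total for d in nominals]

nominals = [25.0, 40.0, 15.0]      # hypothetical component dimensions (mm)
assembly_tol = 0.30                # allowed variation of the assembly (mm)
tols = allocate_tolerances(nominals, assembly_tol)
print(tols)                        # [0.09375, 0.15, 0.05625]
assert abs(sum(tols) - assembly_tol) < 1e-9   # worst-case stack-up meets the limit
```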

  6. Compliance of Perioperative Antibiotic Dosing and Surgical Site Infection Rate in Office-Based Elective Surgery

    PubMed Central

    Davison, Steven P.; Jackson, Monica

    2016-01-01

    Background: A best practice goal to reduce surgical site infection includes administration of antibiotics in the ideal preoperative window. This article evaluates an office surgical suite antibiotic administration rate and compares it with the timing of a local hospital treating a similar patient population. The hypothesis was that similar or better compliance and surgical site infection rates can be achieved in the office-based suite. Methods: A total of 277 office-based surgeries were analyzed for antibiotic administration time before incision and their corresponding surgical site infection rate. Results: Our facility administered timely prophylactic antibiotics in 96% of cases with a surgical site infection rate of 0.36%. This rate was significantly lower than a reported rate of 3.7%. Conclusion: Low infection rates with high antibiotic administration rate suggest that compliance with best possible practice protocols is possible in the outpatient setting. PMID:27579234

  7. Antenna Allocation in MIMO Radar with Widely Separated Antennas for Multi-Target Detection

    PubMed Central

    Gao, Hao; Wang, Jian; Jiang, Chunxiao; Zhang, Xudong

    2014-01-01

    In this paper, we explore a new resource called multi-target diversity to optimize the performance of multiple input multiple output (MIMO) radar with widely separated antennas for detecting multiple targets. In particular, we allocate antennas of the MIMO radar to probe different targets simultaneously in a flexible manner based on the performance metric of relative entropy. Two antenna allocation schemes are proposed. In the first scheme, each antenna is allocated to illuminate a proper target over the entire illumination time, so that the detection performance of each target is guaranteed. The problem is formulated as a minimum makespan scheduling problem in the combinatorial optimization framework. Antenna allocation is implemented through a branch-and-bound algorithm and an enhanced factor 2 algorithm. In the second scheme, called antenna-time allocation, each antenna is allocated to illuminate different targets with different illumination time. Both antenna allocation and time allocation are optimized based on illumination probabilities. Over a large range of transmitted power, target fluctuations and target numbers, both of the proposed antenna allocation schemes outperform the scheme without antenna allocation. Moreover, the antenna-time allocation scheme achieves a more robust detection performance than branch-and-bound algorithm and the enhanced factor 2 algorithm when the target number changes. PMID:25350505
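
    As a rough, hedged illustration of the "factor 2" idea invoked above: Graham-style list scheduling is the classic 2-approximation for minimum makespan, sketched below. How the paper maps antennas, targets, and illumination demands onto jobs and machines is not reproduced here.

```python
# Minimal sketch of greedy list scheduling, the classic factor-2 approximation for
# minimum makespan: assign each job to the currently least-loaded machine. Here
# "machines" could stand for targets and "jobs" for illumination demands; the
# actual formulation in the paper may differ.
import heapq

def list_schedule(job_sizes, num_machines):
    loads = [(0.0, m) for m in range(num_machines)]
    heapq.heapify(loads)
    assignment = {m: [] for m in range(num_machines)}
    for j, size in enumerate(job_sizes):
        load, m = heapq.heappop(loads)       # least-loaded machine so far
        assignment[m].append(j)
        heapq.heappush(loads, (load + size, m))
    makespan = max(load for load, _ in loads)
    return assignment, makespan

assignment, makespan = list_schedule([5, 3, 7, 2, 4, 6], num_machines=3)
print(assignment, makespan)
```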

  8. Antenna allocation in MIMO radar with widely separated antennas for multi-target detection.

    PubMed

    Gao, Hao; Wang, Jian; Jiang, Chunxiao; Zhang, Xudong

    2014-10-27

    In this paper, we explore a new resource called multi-target diversity to optimize the performance of multiple input multiple output (MIMO) radar with widely separated antennas for detecting multiple targets. In particular, we allocate antennas of the MIMO radar to probe different targets simultaneously in a flexible manner based on the performance metric of relative entropy. Two antenna allocation schemes are proposed. In the first scheme, each antenna is allocated to illuminate a proper target over the entire illumination time, so that the detection performance of each target is guaranteed. The problem is formulated as a minimum makespan scheduling problem in the combinatorial optimization framework. Antenna allocation is implemented through a branch-and-bound algorithm and an enhanced factor 2 algorithm. In the second scheme, called antenna-time allocation, each antenna is allocated to illuminate different targets with different illumination time. Both antenna allocation and time allocation are optimized based on illumination probabilities. Over a large range of transmitted power, target fluctuations and target numbers, both of the proposed antenna allocation schemes outperform the scheme without antenna allocation. Moreover, the antenna-time allocation scheme achieves a more robust detection performance than branch-and-bound algorithm and the enhanced factor 2 algorithm when the target number changes.

  9. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, Troy; Toon, Jamie; Conner, Angelo C.; Adams, Timothy C.; Miranda, David J.

    2016-01-01

    This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program’s subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. These new allocations are based on updated designs and maintainability characteristics of the components. It was found that trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.
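
    The abstract does not give the reallocation method itself; below is a minimal sketch of one textbook approach, apportioning a series-system failure-rate requirement across subsystems with weighting factors. The subsystem names, weights, and requirement are hypothetical, and any resemblance to the GSDO method is assumed, not confirmed.

```python
# Sketch of weighted failure-rate apportionment for a series system.
# Assumption: the system requirement is a maximum failure rate, and each
# subsystem receives a share proportional to its weight (e.g., complexity).
def apportion_failure_rate(system_lambda, weights):
    total = sum(weights)
    return [system_lambda * w / total for w in weights]

system_lambda = 1.0e-4                      # hypothetical required failures/hour
weights = {"power": 3, "fluids": 5, "control": 2}
allocs = apportion_failure_rate(system_lambda, list(weights.values()))
for name, lam in zip(weights, allocs):
    print(f"{name}: {lam:.2e} failures/hour")
# The allocated rates sum to the system requirement, as needed for a series system.
```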

  10. 77 FR 42722 - Berry Petroleum Company; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-20

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Berry Petroleum Company; Supplemental Notice That Initial Market-Based Rate...-referenced proceeding, of Berry Petroleum Company's application for market-based rate authority, with...

  11. 76 FR 39868 - Amerigreen Energy, Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-07

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Amerigreen Energy, Inc.; Supplemental Notice That Initial Market- Based Rate...-referenced proceeding of Amerigreen Energy, Inc.'s application for market-based rate authority, with...

  12. 77 FR 35374 - Independence Electricity; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-13

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Independence Electricity; Supplemental Notice That Initial Market-Based Rate...-referenced proceeding of Independence Electricity's application for market-based rate authority, with...

  13. Unobtrusive heart rate monitor based on a fiber specklegram sensor and a single-board computer

    NASA Astrophysics Data System (ADS)

    Benevides, Alessandro B.; Frizera, Anselmo; Cotrina, Anibal; Ribeiro, Moisés. R. N.; Segatto, Marcelo E. V.; Pontes, Maria José

    2015-09-01

    This paper proposes a portable and unobtrusive heart rate monitor based on fiber specklegram sensors. The proposed module uses a Raspberry Pi to perform image acquisition from the fiber specklegram sensor, which is based on multimode plastic optical fiber. The heart rate is obtained from a Welch power spectral density estimate, and the heartbeats are identified by means of a threshold analysis.
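
    A minimal sketch of the Welch-based rate estimate described above; the optical front end and the paper's threshold analysis are not reproduced, and the input signal below is synthetic.

```python
# Sketch: estimate heart rate from a 1-D specklegram-derived signal using the
# Welch power spectral density. Sampling rate and signal are hypothetical.
import numpy as np
from scipy.signal import welch

fs = 100.0                                   # hypothetical sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)
signal = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(t.size)   # ~72 bpm + noise

freqs, psd = welch(signal, fs=fs, nperseg=1024)
band = (freqs >= 0.7) & (freqs <= 3.0)       # plausible heart-rate band, 42-180 bpm
hr_hz = freqs[band][np.argmax(psd[band])]
print(f"estimated heart rate: {hr_hz * 60:.0f} bpm")
```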

  14. Can Sample-Specific Simulations Help Detect Low Base-Rate Taxonicity?

    ERIC Educational Resources Information Center

    Beach, Steven R. H.; Amir, Nader; Bau, Jinn Jonp

    2005-01-01

    The authors examined the role of the sample-specific simulations (SSS; A. M. Ruscio & J. Ruscio, 2002; J. Ruscio & A. M. Ruscio, 2004) procedure in detecting low base-rate taxa that might otherwise prove elusive. The procedure preserved key distributional characteristics for moderate to high base-rate taxa, but it performed inadequately for low…

  15. 77 FR 23475 - Ironwood Windpower, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-19

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Ironwood Windpower, LLC; Supplemental Notice That Initial Market- Based Rate...-referenced proceeding of Ironwood Windpower, LLC's application for market-based rate authority, with...

  16. 77 FR 30521 - Community Energy, Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-23

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Community Energy, Inc.; Supplemental Notice That Initial Market- Based Rate...-referenced proceeding of Community Energy, Inc.'s application for market-based rate authority, with...

  17. High Data Rate OFDM-Based Radio over FSO Communication System Using M-QAM Modulation

    NASA Astrophysics Data System (ADS)

    Kumar, Naresh; Teixeira, Antonio Luis Jesus

    2015-12-01

    In this paper, we present an analysis of an OFDM-based Radio over FSO communication system using M-QAM modulation under different data rates and attenuation levels. A distance of 1,000 m was achieved at a 10 Gbit/s data rate in the OFDM-based Radio over FSO communication system.
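
    A bare-bones sketch of the transmit-side chain named in the title (16-QAM mapping followed by an IFFT to form one OFDM symbol); the FSO channel model, the paper's actual subcarrier count, and other parameters are omitted and the values below are assumptions.

```python
# Sketch: map random bits to 16-QAM symbols and form one OFDM symbol via IFFT.
import numpy as np

rng = np.random.default_rng(1)
n_subcarriers = 64
bits = rng.integers(0, 2, size=n_subcarriers * 4)          # 4 bits per 16-QAM symbol

def qam16_map(b):
    # Simple (non-Gray) mapping: two bits per I/Q axis onto levels {-3,-1,1,3}.
    levels = np.array([-3, -1, 1, 3])
    i = levels[2 * b[0::4] + b[1::4]]
    q = levels[2 * b[2::4] + b[3::4]]
    return (i + 1j * q) / np.sqrt(10)                        # unit average symbol power

symbols = qam16_map(bits)                                    # frequency-domain symbols
ofdm_symbol = np.fft.ifft(symbols) * np.sqrt(n_subcarriers)  # time-domain OFDM symbol
cyclic_prefix = ofdm_symbol[-16:]                            # prepended to fight dispersion
tx = np.concatenate([cyclic_prefix, ofdm_symbol])
print(tx.shape)                                              # (80,)
```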

  18. Determining the Scoring Validity of a Co-Constructed CEFR-Based Rating Scale

    ERIC Educational Resources Information Center

    Deygers, Bart; Van Gorp, Koen

    2015-01-01

    Considering scoring validity as encompassing both reliable rating scale use and valid descriptor interpretation, this study reports on the validation of a CEFR-based scale that was co-constructed and used by novice raters. The research questions this paper wishes to answer are (a) whether it is possible to construct a CEFR-based rating scale with…

  19. 78 FR 56690 - Seneca Generation, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-13

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Seneca Generation, LLC; Supplemental Notice That Initial Market- Based Rate...-referenced proceeding, of Seneca Generation, LLC's application for market-based rate authority, with...

  20. COMBINING RATE-BASED AND CAP-AND-TRADE EMISSIONS POLICIES. (R828628)

    EPA Science Inventory

    Rate-based emissions policies (like tradable performance standards, TPS) fix average emissions intensity, while cap-and-trade (CAT) policies fix total emissions. This paper shows that unfettered trade between rate-based and cap-and-trade programs always raises combined emissio...

  1. 78 FR 75560 - Biofuels Washington LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Biofuels Washington LLC; Supplemental Notice That Initial Market- Based Rate...-referenced proceeding, of Biofuels Washington LLC's application for market-based rate authority, with...

  2. 18 CFR 284.502 - Procedures for applying for market-based rates.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Procedures for applying for market-based rates. 284.502 Section 284.502 Conservation of Power and Water Resources FEDERAL... POLICY ACT OF 1978 AND RELATED AUTHORITIES Applications for Market-Based Rates for Storage §...

  3. 76 FR 1609 - Morris Cogeneration, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-11

    ... Energy Regulatory Commission Morris Cogeneration, LLC; Supplemental Notice That Initial Market-Based Rate... notice in the above-referenced proceeding Morris Cogeneration, LLC's application for market-based rate... by clicking on the appropriate link in the above list. They are also available for review in...

  4. An Exploration of the Base Rate Scores of the Millon Clinical Multiaxial Inventory-III

    ERIC Educational Resources Information Center

    Grove, William M.; Vrieze, Scott I.

    2009-01-01

    The Millon Clinical Multiaxial Inventory (3rd ed.; MCMI-III) is a widely used psychological assessment of clinical and personality disorders. Unlike typical tests, the MCMI-III uses a base-rate score transformation to incorporate prior probabilities of disorder (i.e., base rates) in test output and diagnostic thresholds. The authors describe the…

  5. Resource allocation in circuit-switched all-optical networks

    NASA Astrophysics Data System (ADS)

    Marquis, Douglas; Barry, Richard A.; Finn, Steven G.; Parikh, Salil A.; Swanson, Eric A.; Thomas, Robert E.

    1996-03-01

    We describe an all-optical network testbed deployed in the Boston area, and research surrounding the allocation of optical resources -- frequencies and time slots -- within the network. The network was developed by a consortium of AT&T Bell Laboratories, Digital Equipment Corporation, and Massachusetts Institute of Technology under a grant from ARPA. The network is organized as a hierarchy consisting of local, metropolitan, and wide area nodes that support optical broadcast and routing modes. Frequencies are shared and reused to enhance network scalability. Electronic access is provided through optical terminals that support multiple services having data rates between 10 Mbps/user and 10 Gbps/user. Of particular interest for this work is the 'B-service,' which simultaneously hops frequency and time slots on each optical terminal to allow frequency sharing within the AON. B-service provides 1.244 Gbps per optical terminal, with bandwidth for individual connections divided into increments as small as 10 Mbps. We have created interfaces between the AON and commercially available electronic circuit-switched and packet-switched networks. The packet switches provide FDDI (datacomm), T3 (telecomm), and ATM/SONET switching at backplane rates of over 3 Gbps. We show results on network applications that dynamically allocate optical bandwidth between electronic packet-switches based on the offered load presented by users. Bandwidth allocation granularity is proportional to B-Service slots (10-1244 Mbps), and switching times are on the order of one second. We have also studied the effects of wavelength changers upon the network capacity and blocking probabilities in wide area all-optical networks. Wavelength changers allow a change in the carrier frequency (within the network) without disturbing the data modulation. The study includes both a theoretical model of blocking probabilities based on network design parameters, and a computer simulation of blocking in networks with and
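
    As a hedged sketch of the load-driven allocation described (10 Mbps granularity up to 1.244 Gbps per terminal), the snippet below rounds each terminal's offered load up to the slot granularity and caps it at the B-service ceiling; the actual AON control protocol and slot scheduling are not modeled, and the cap used here is a rounded assumption.

```python
# Sketch: allocate B-service bandwidth in 10 Mbps increments, capped at roughly
# 1.244 Gbps per optical terminal, based on each terminal's offered load.
import math

SLOT_MBPS = 10
MAX_MBPS = 1244

def allocate(offered_load_mbps):
    slots = math.ceil(offered_load_mbps / SLOT_MBPS)
    return min(slots * SLOT_MBPS, MAX_MBPS)

for load in (3, 87, 640, 2000):          # hypothetical offered loads (Mbps)
    print(load, "->", allocate(load), "Mbps")
```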

  6. Task allocation among multiple intelligent robots

    NASA Technical Reports Server (NTRS)

    Gasser, L.; Bekey, G.

    1987-01-01

    Researchers describe the design of a decentralized mechanism for allocating assembly tasks in a multiple robot assembly workstation. Currently, the approach focuses on distributed allocation to explore its feasibility and its potential for adaptability to changing circumstances, rather than for optimizing throughput. Individual greedy robots make their own local allocation decisions using both dynamic allocation policies which propagate through a network of allocation goals, and local static and dynamic constraints describing which robots are eligible for which assembly tasks. Global coherence is achieved by proper weighting of allocation pressures propagating through the assembly plan. Deadlock avoidance and synchronization are achieved using periodic reassessments of local allocation decisions, ageing of allocation goals, and short-term allocation locks on goals.
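
    A much-simplified sketch of the greedy, decentralized claiming idea (each robot claims the eligible task with the highest current allocation pressure); the propagation of pressures through an assembly-plan network, ageing, and allocation locks are not modeled, and the robots, tasks, and pressures below are hypothetical.

```python
# Sketch: greedy decentralized task claiming under eligibility constraints.
pressure = {"insert_peg": 0.9, "fasten_bolt": 0.7, "place_cover": 0.4}
eligible = {"r1": {"insert_peg", "place_cover"}, "r2": {"fasten_bolt", "place_cover"}}

claimed = {}
for robot, tasks in eligible.items():
    open_tasks = [t for t in tasks if t not in claimed.values()]
    if open_tasks:
        claimed[robot] = max(open_tasks, key=lambda t: pressure[t])
print(claimed)   # {'r1': 'insert_peg', 'r2': 'fasten_bolt'}
```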

  7. The allocation of pancreas allografts on donor age and duration of intensive care unit stay: the experience of the North Italy Transplant program.

    PubMed

    Cardillo, Massimo; Nano, Rita; de Fazio, Nicola; Melzi, Raffaella; Drago, Francesca; Mercalli, Alessia; Dell'Acqua, Antonio; Scavini, Marina; Piemonti, Lorenzo

    2014-04-01

    Starting in 2011, the North Italy Transplant program (NITp) has based the allocation of pancreas allografts on donor age and duration of intensive care unit (ICU) stay, but not on donor weight or BMI. We analyzed the detailed allocation protocols of all NITp pancreas donors (2011-2012; n = 433). Outcome measures included donor characteristics and pancreas loss reasons during the allocation process. Twenty-three percent of the 433 pancreases offered for allocation were transplanted. Younger age, shorter ICU stay, traumatic brain death, and higher eGFR were predictors of pancreas transplant, either as vascularized organ or as islets. Among pancreas allografts offered to vascularized organ programs, 35% were indeed transplanted, and younger donor age was the only predictor of transplant. The most common reasons for pancreas withdrawal from the allocation process were donor-related factors. Among pancreases offered to islet programs, 48% were processed, but only 14.2% were indeed transplanted, with unsuccessful isolation being the most common reason for pancreas loss. Younger donor age and higher BMI were predictors of islet allograft transplant. The current allocation strategy has allowed an equal distribution of pancreas allografts between programs for either vascularized organ or islet transplant. The high rate of discarded organs remained an unresolved issue.

  8. Priorities and allocations support for energy: keeping energy programs on schedule

    SciTech Connect

    Not Available

    1985-08-01

    This publication has covered DOE and DOC procedures related to an application for rating authority for an energy program, the DPAS (Defense Priorities and Allocations System) and the Special Priorities Assistance Program's relation to priorities and allocations for energy programs. A person engaged in an eligible energy program or project must be familiar with DPAS rules, regulations and procedures. There are important benefits to be gained by the Government and the energy industry through the proper use of the priorities and allocations system.

  9. Rate-based process modeling study of CO{sub 2} capture with aqueous monoethanolamine solution

    SciTech Connect

    Zhang, Y.; Chen, H.; Chen, C.C.; Plaza, J.M.; Dugas, R.; Rochelle, G.T.

    2009-10-15

    Rate-based process modeling technology has matured and is increasingly gaining acceptance over traditional equilibrium-stage modeling approaches. Recently comprehensive pilot plant data for carbon dioxide (CO{sub 2}) capture with aqueous monoethanolamine (MEA) solution have become available from the University of Texas at Austin. The pilot plant data cover key process variables including CO{sub 2} concentration in the gas stream, CO{sub 2} loading in lean MEA solution, liquid to gas ratio, and packing type. In this study, we model the pilot plant operation with Aspen RateSep, a second generation rate-based multistage separation unit operation model in Aspen Plus. After a brief review of rate-based modeling, thermodynamic and kinetic models for CO{sub 2} absorption with the MEA solution, and transport property models, we show excellent match of the rate-based model predictions against the comprehensive pilot plant data and we validate the superiority of the rate-based models over the traditional equilibrium-stage models. We further examine the impacts of key rate-based modeling options, i.e., film discretization options and flow model options. The rate-based model provides excellent predictive capability, and it should be very useful for design and scale-up of CO{sub 2} capture processes.
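
    For orientation only: rate-based column models compute interfacial mass-transfer fluxes rather than assuming stage equilibrium. A minimal two-film sketch is shown below, with a liquid-side enhancement factor E for the CO2-MEA reaction; the symbols are generic (not the paper's notation), and Aspen RateSep's actual film discretization is considerably more detailed.

```latex
% Two-film flux expression commonly used in rate-based absorber models:
% gas-side and (reaction-enhanced) liquid-side resistances in series, where
% k_g and k_l are film coefficients and the superscript i marks the interface.
N_{\mathrm{CO_2}} = k_g \left( p_{\mathrm{CO_2}} - p_{\mathrm{CO_2}}^{\,i} \right)
                  = E\, k_l \left( c_{\mathrm{CO_2}}^{\,i} - c_{\mathrm{CO_2}} \right)
```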

  10. Regulating nutrient allocation in plants

    SciTech Connect

    Udvardi, Michael; Yang, Jiading; Worley, Eric

    2014-12-09

    The invention provides coding and promoter sequences for a VS-1 and AP-2 gene, which affects the developmental process of senescence in plants. Vectors, transgenic plants, seeds, and host cells comprising heterologous VS-1 and AP-2 genes are also provided. Additionally provided are methods of altering nutrient allocation and composition in a plant using the VS-1 and AP-2 genes.

  11. Administrators' Decisions about Resource Allocation

    ERIC Educational Resources Information Center

    Knight, William E.; Folkins, John W.; Hakel, Milton D.; Kennell, Richard P.

    2011-01-01

    Do academic administrators make decisions about resource allocation differently depending on the discipline receiving the funding? Does an administrator's academic identity influence these decisions? This study explored those questions with a sample of 1,690 academic administrators at doctoral-research universities. Participants used fictional…

  12. Report on Tribal Priority Allocations.

    ERIC Educational Resources Information Center

    Bureau of Indian Affairs (Dept. of Interior), Washington, DC.

    As part of Bureau of Indian Affairs (BIA) funding, Tribal Priority Allocations (TPA) are the principal source of funds for tribal governments and agency offices at the reservation level. According to their unique needs and circumstances, tribes may prioritize funding among eight general categories: government, human services, education, public…

  13. The Discipline of Asset Allocation.

    ERIC Educational Resources Information Center

    Petzel, Todd E.

    2000-01-01

    Discussion of asset allocation for college/university endowment funds focuses on three levels of risk: (1) the absolute risk of the portfolio (usually leading to asset diversification); (2) the benchmark risk (usually comparison with peer institutions; and (3) personal career risk (which may incline managers toward maximizing short-term returns,…

  14. Food restriction alters energy allocation strategy during growth in tobacco hornworms (Manduca sexta larvae).

    PubMed

    Jiao, Lihong; Amunugama, Kaushalya; Hayes, Matthew B; Jennings, Michael; Domingo, Azriel; Hou, Chen

    2015-08-01

    Growing animals must alter their energy budget in the face of environmental changes and prioritize the energy allocation to metabolism for life-sustaining requirements and energy deposition in new biomass growth. We hypothesize that when food availability is low, larvae of holometabolic insects with a short development stage (relative to the low food availability period) prioritize biomass growth at the expense of metabolism. Driven by this hypothesis, we develop a simple theoretical model, based on conservation of energy and allometric scaling laws, for understanding the dynamic energy budget of growing larvae under food restriction. We test the hypothesis by manipulative experiments on fifth instar hornworms at three temperatures. At each temperature, food restriction increases the scaling power of growth rate but decreases that of metabolic rate, as predicted by the hypothesis. During the fifth instar, the energy budgets of larvae change dynamically. The free-feeding larvae slightly decrease the energy allocated to growth as body mass increases and increase the energy allocated to life sustaining. The opposite trends were observed in food restricted larvae, indicating the predicted prioritization in the energy budget under food restriction. We compare the energy budgets of a few endothermic and ectothermic species and discuss how different life histories lead to the differences in the energy budgets under food restriction. PMID:26105046
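
    A minimal sketch of the kind of energy-balance-plus-allometry model described, with generic symbols rather than the paper's exact parameterization: assimilated energy F is partitioned between metabolism B, which scales allometrically with body mass m, and the energetic cost of depositing new biomass.

```latex
% Energy conservation during growth: food assimilation rate F is split between
% metabolic rate B and deposition of new biomass (E_c = energy content per unit
% mass); B follows an allometric scaling law with exponent alpha.
F = B + E_c \frac{dm}{dt}, \qquad B = B_0\, m^{\alpha}
```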

  15. Food restriction alters energy allocation strategy during growth in tobacco hornworms ( Manduca sexta larvae)

    NASA Astrophysics Data System (ADS)

    Jiao, Lihong; Amunugama, Kaushalya; Hayes, Matthew B.; Jennings, Michael; Domingo, Azriel; Hou, Chen

    2015-08-01

    Growing animals must alter their energy budget in the face of environmental changes and prioritize the energy allocation to metabolism for life-sustaining requirements and energy deposition in new biomass growth. We hypothesize that when food availability is low, larvae of holometabolic insects with a short development stage (relative to the low food availability period) prioritize biomass growth at the expense of metabolism. Driven by this hypothesis, we develop a simple theoretical model, based on conservation of energy and allometric scaling laws, for understanding the dynamic energy budget of growing larvae under food restriction. We test the hypothesis by manipulative experiments on fifth instar hornworms at three temperatures. At each temperature, food restriction increases the scaling power of growth rate but decreases that of metabolic rate, as predicted by the hypothesis. During the fifth instar, the energy budgets of larvae change dynamically. The free-feeding larvae slightly decrease the energy allocated to growth as body mass increases and increase the energy allocated to life sustaining. The opposite trends were observed in food restricted larvae, indicating the predicted prioritization in the energy budget under food restriction. We compare the energy budgets of a few endothermic and ectothermic species and discuss how different life histories lead to the differences in the energy budgets under food restriction.

  17. Acid-base chemical reaction model for nucleation rates in the polluted atmospheric boundary layer.

    PubMed

    Chen, Modi; Titcombe, Mari; Jiang, Jingkun; Jen, Coty; Kuang, Chongai; Fischer, Marc L; Eisele, Fred L; Siepmann, J Ilja; Hanson, David R; Zhao, Jun; McMurry, Peter H

    2012-11-13

    Climate models show that particles formed by nucleation can affect cloud cover and, therefore, the earth's radiation budget. Measurements worldwide show that nucleation rates in the atmospheric boundary layer are positively correlated with concentrations of sulfuric acid vapor. However, current nucleation theories do not correctly predict either the observed nucleation rates or their functional dependence on sulfuric acid concentrations. This paper develops an alternative approach for modeling nucleation rates, based on a sequence of acid-base reactions. The model uses empirical estimates of sulfuric acid evaporation rates obtained from new measurements of neutral molecular clusters. The model predicts that nucleation rates equal the sulfuric acid vapor collision rate times a prefactor that is less than unity and that depends on the concentrations of basic gaseous compounds and preexisting particles. Predicted nucleation rates and their dependence on sulfuric acid vapor concentrations are in reasonable agreement with measurements from Mexico City and Atlanta. PMID:23091030
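
    The functional form implied by the abstract can be written compactly; the sketch below is a hedged paraphrase with generic symbols (J = nucleation rate, k_coll = collision rate coefficient, CS = condensation sink of preexisting particles), not the paper's fitted expression.

```latex
% Nucleation rate as an empirical prefactor times the sulfuric acid vapor
% collision rate, which scales with the square of the H2SO4 concentration.
J = P\!\left(\mathrm{base},\, \mathrm{CS}\right)\, k_{\mathrm{coll}}\,[\mathrm{H_2SO_4}]^{2},
\qquad 0 < P < 1
```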

  19. The influence of sensor orientation on activity-based rate responsive pacing. Sensor Orientation Study Group.

    PubMed

    Theres, H; Philippon, F; Melzer, C; Combs, W; Prest-Berg, K

    1998-11-01

    Piezoelectric activity-based rate responsive pacemakers are commonly implanted with the sensor facing inward. This study was conducted to assess the safe and effective rate response of an activity-based rate responsive pacemaker implanted with the sensor facing outward. A comparison was made to a previously studied patient group with sensor facing inward. Patient and pacemaker data were collected at predischarge and 2-month follow-up. Two-minute hall walks in conjunction with programmer-assisted rate response assessment were utilized to standardize initial rate response parameter settings for both patient groups. At 2-month follow-up, sensor rate response to a stage 3 limited CAEP protocol was recorded. Adequate sensor rate response was achieved for both patient groups. No difference was noted in reported patient complications for both groups. A statistically significant difference in programmed rate response curve setting and activity threshold for the two groups was noted at 2-month follow-up. Adequate sensor rate response was achieved for a patient population implanted with an activity-based rate responsive pacemaker with sensor facing outward. In this orientation, one higher rate response curve setting and an activity threshold one value more sensitive were required on average when compared to the normal sensor orientation group. PMID:9826862

  20. 48 CFR 1631.203-70 - Allocation techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... base varies in proportion to the services performed. (4) Other method. Some cost groupings cannot.... Overall management costs should be grouped in relation to the activities managed. The base selected to measure the allocation of these indirect costs to cost objectives should be a base representative of...

  1. A review of alternative approaches to healthcare resource allocation.

    PubMed

    Petrou, S; Wolstenholme, J

    2000-07-01

    The resources available for healthcare are limited compared with demand, if not need, and all healthcare systems, regardless of their financing and organisation, employ mechanisms to ration or prioritise finite healthcare resources. This paper reviews alternative approaches that can be used to allocate healthcare resources. It discusses the problems encountered when allocating healthcare resources according to free market principles. It then proceeds to discuss the advantages and disadvantages of alternative resource allocation approaches that can be applied to public health systems. These include: (i) approaches based on the concept of meeting the needs of the population to maximising its capacity to benefit from interventions; (ii) economic approaches that identify the most efficient allocation of resources with the view of maximising health benefits or other measures of social welfare; (iii) approaches that seek to ration healthcare by age; and (iv) approaches that resolve resource allocation disputes through debate and bargaining. At present, there appears to be no consensus about the relative importance of the potentially conflicting principles that can be used to guide resource allocation decisions. It is concluded that whatever shape tomorrow's health service takes, the requirement to make equitable and efficient use of finite healthcare resources will remain.

  2. Optimal load allocation of multiple fuel boilers.

    PubMed

    Dunn, Alex C; Du, Yan Yi

    2009-04-01

    This paper presents a new methodology for optimally allocating a set of multiple industrial boilers that each simultaneously consumes multiple fuel types. Unlike recent similar approaches in the utility industry that use soft computing techniques, this approach is based on a second-order gradient search method that is easy to implement without any specialized optimization software. The algorithm converges rapidly and the application yields significant savings benefits, up to 3% of the overall operating cost of industrial boiler systems in the examples given and potentially higher in other cases, depending on the plant circumstances. Given today's energy prices, this can yield significant savings benefits to manufacturers that raise steam for plant operations.
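
    The abstract does not give the algorithm's details; as a hedged stand-in for the gradient idea, the sketch below allocates a steam demand across boilers by repeatedly loading the boiler with the lowest marginal (incremental) cost, assuming hypothetical quadratic cost curves. This is a simplification of any second-order method and not the paper's implementation.

```python
# Sketch: allocate steam load across boilers by equal-incremental-cost logic.
# Cost curves are hypothetical quadratics: cost_i(x) = a_i * x + b_i * x**2,
# and each boiler has a capacity cap. Units are arbitrary.
def allocate_load(demand, boilers, step=1.0):
    loads = {name: 0.0 for name in boilers}
    remaining = demand
    while remaining > 1e-9:
        def marginal(name):
            a, b, cap = boilers[name]
            if loads[name] + step > cap:
                return float("inf")          # boiler is at capacity
            return a + 2 * b * loads[name]   # derivative of the quadratic cost curve
        best = min(loads, key=marginal)
        loads[best] += step
        remaining -= step
    return loads

boilers = {"B1": (2.0, 0.010, 120), "B2": (2.5, 0.004, 150), "B3": (3.0, 0.002, 200)}
print(allocate_load(300.0, boilers))         # hypothetical steam demand of 300 units
```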

  3. Pediatric heart allocation and transplantation in Eurotransplant.

    PubMed

    Smits, Jacqueline M; Thul, Josef; De Pauw, Michel; Delmo Walter, Eva; Strelniece, Agita; Green, Dave; de Vries, Erwin; Rahmel, Axel; Bauer, Juergen; Laufer, Guenther; Hetzer, Roland; Reichenspurner, Hermann; Meiser, Bruno

    2014-09-01

    Pediatric heart allocation in Eurotransplant (ET) has evolved over the past decades to better serve patients and improve utilization. Pediatric heart transplants (HT) account for 6% of the annual transplant volume in ET. Death rates on the pediatric heart transplant waiting list have decreased over the years, from 25% in 1997 to 18% in 2011. Within the first year after listing, 32% of all infants (<12 months), 20% of all children aged 1-10 years, and 15% of all children aged 11-15 years died without having received a heart transplant. Survival after transplantation improved over the years, and in almost a decade, the 1-year survival went from 83% to 89%, and the 3-year rates increased from 81% to 85%. Improved medical management of heart failure patients and the availability of mechanical support for children have significantly improved the prospects for children on the heart transplant waiting list.

  4. Effects of adaptive task allocation on monitoring of automated systems

    NASA Technical Reports Server (NTRS)

    Parasuraman, R.; Mouloua, M.; Molloy, R.

    1996-01-01

    The effects of adaptive task allocation on monitoring for automation failure during multitask flight simulation were examined. Participants monitored an automated engine status task while simultaneously performing tracking and fuel management tasks over three 30-min sessions. Two methods of adaptive task allocation, both involving temporary return of the automated engine status task to the human operator ("human control"), were examined as a possible countermeasure to monitoring inefficiency. For the model-based adaptive group, the engine status task was allocated to all participants in the middle of the second session for 10 min, following which it was again returned to automation control. The same occurred for the performance-based adaptive group, but only if an individual participant's monitoring performance up to that point did not meet a specified criterion. For the nonadaptive control groups, the engine status task remained automated throughout the experiment. All groups had low probabilities of detection of automation failures for the first 40 min spent with automation. However, following the 10-min intervening period of human control, both adaptive groups detected significantly more automation failures during the subsequent blocks under automation control. The results show that adaptive task allocation can enhance monitoring of automated systems. Both model-based and performance-based allocation improved monitoring of automation. Implications for the design of automated systems are discussed.

  5. Optimal irrigation water allocation for a center pivot using remotely sensed data

    NASA Astrophysics Data System (ADS)

    Hassan Esfahani, L.; McKee, M.

    2013-12-01

    Efficient irrigation can help avoid crop water stress, undesirable leaching of nutrients, and yield reduction due to water shortage, and runoff and soil erosion due to over irrigation. Gains in water use efficiency can be achieved when water application is precisely matched to the spatially distributed crop water demand. This is important to produce high quality crops and otherwise conserve water for greatest efficiency of use. Irrigation efficiency is a term which defines irrigation performance based on indicators such as irrigation uniformity, crop production, economic return and water resources sustainability. The present paper introduces a modeling approach for optimal water allocation to a center pivot irrigation unit in consideration of these types of indicators. Landsat images, weather data and field measurements were used to develop a soil water balance model using Artificial Neural Networks (ANN). The model includes two main modules, one for soil water forecasting and one for optimization of water allocation. The optimization module uses Genetic Algorithms (GA) to identify optimal crop water application rates based on the crop type, growing stage and sensitivity to water stress. The forecasting module allocates water through time across the area covered by the center pivot considering the results from the previous period of irrigation and the operational limitations of the center pivot irrigation system. The model was tested for a farm equipped with a modern sprinkler irrigation system in Scipio, Utah. The solution obtained from the model used up to 30 percent less water than traditional operating procedures, without reducing the benefits realized by the irrigator.
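
    As a hedged sketch of the GA optimization step only (the ANN soil water forecasting and the actual yield-response functions are not reproduced), the snippet below evolves per-zone water depths toward high total yield subject to a limited water budget; zone demands, the yield model, and all GA settings are assumptions.

```python
# Sketch: a small genetic algorithm allocating water depths across pivot zones.
import numpy as np

rng = np.random.default_rng(2)
n_zones = 12
demand = rng.uniform(20, 40, n_zones)        # hypothetical zone water demands (mm)
budget = 0.7 * demand.sum()                  # only 70% of total demand is available

def fitness(alloc):
    # Diminishing-returns yield response per zone, penalized for exceeding the budget.
    relative = np.clip(alloc / demand, 0.0, 1.0)
    return np.sqrt(relative).sum() - 10.0 * max(0.0, alloc.sum() - budget)

def genetic_search(pop_size=60, generations=200, sigma=2.0):
    pop = rng.uniform(0, demand, size=(pop_size, n_zones))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        def tournament():
            contenders = rng.choice(pop_size, size=3, replace=False)
            return contenders[np.argmax(scores[contenders])]
        parents = pop[[tournament() for _ in range(pop_size)]]
        mask = rng.random((pop_size, n_zones)) < 0.5            # uniform crossover
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        children += rng.normal(0.0, sigma, children.shape)      # Gaussian mutation
        pop = np.clip(children, 0.0, demand)
    return pop[np.argmax([fitness(ind) for ind in pop])]

best = genetic_search()
print(f"allocated {best.sum():.1f} mm of the available {budget:.1f} mm")
```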

  6. 15 CFR 335.3 - Applications to receive allocation.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... WORSTED WOOL FABRIC § 335.3 Applications to receive allocation. (a) In each year prior to a Tariff Rate... behalf, provided the applicant owned the fabric at the time it was cut and sewn. The application must... of the application, an applicant must have woven in the United States worsted wool fabrics...

  7. 15 CFR 335.3 - Applications to receive allocation.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... WORSTED WOOL FABRIC § 335.3 Applications to receive allocation. (a) In each year prior to a Tariff Rate... behalf, provided the applicant owned the fabric at the time it was cut and sewn. The application must... of the application, an applicant must have woven in the United States worsted wool fabrics...

  8. Parental Perceptions of Child Care Quality in Centre-Based and Home-Based Settings: Associations with External Quality Ratings

    ERIC Educational Resources Information Center

    Lehrer, Joanne S.; Lemay, Lise; Bigras, Nathalie

    2015-01-01

    The current study examined how parental perceptions of child care quality were related to external quality ratings and considered how parental perceptions of quality varied according to child care context (home-based or centre-based settings). Parents of 179 4-year-old children who attended child care centres (n = 141) and home-based settings…

  9. FPGA-based voltage and current dual drive system for high frame rate electrical impedance tomography.

    PubMed

    Khan, Shadab; Manwaring, Preston; Borsic, Andrea; Halter, Ryan

    2015-04-01

    Electrical impedance tomography (EIT) is used to image the electrical property distribution of a tissue under test. An EIT system comprises complex hardware and software modules, which are typically designed for a specific application. Upgrading these modules is a time-consuming process, and requires rigorous testing to ensure proper functioning of new modules with the existing ones. To this end, we developed a modular and reconfigurable data acquisition (DAQ) system using National Instruments' (NI) hardware and software modules, which offer inherent compatibility over generations of hardware and software revisions. The system can be configured to use up to 32-channels. This EIT system can be used to interchangeably apply current or voltage signal, and measure the tissue response in a semi-parallel fashion. A novel signal averaging algorithm, and 512-point fast Fourier transform (FFT) computation block was implemented on the FPGA. FFT output bins were classified as signal or noise. Signal bins constitute a tissue's response to a pure or mixed tone signal. Signal bins' data can be used for traditional applications, as well as synchronous frequency-difference imaging. Noise bins were used to compute noise power on the FPGA. Noise power represents a metric of signal quality, and can be used to ensure proper tissue-electrode contact. Allocation of these computationally expensive tasks to the FPGA reduced the required bandwidth between PC, and the FPGA for high frame rate EIT. In 16-channel configuration, with a signal-averaging factor of 8, the DAQ frame rate at 100 kHz exceeded 110 frames/s, and signal-to-noise ratio exceeded 90 dB across the spectrum. Reciprocity error was found to be for frequencies up to 1 MHz. Static imaging experiments were performed on a high-conductivity inclusion placed in a saline filled tank; the inclusion was clearly localized in the reconstructions obtained for both absolute current and voltage mode data. PMID:25376037
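
    A rough sketch of the bin-classification step described above, run on synthetic data; the 512-point FFT length matches the abstract, but the tone frequency, sampling rate, averaging, and the simple threshold rule are assumptions rather than the FPGA's actual logic.

```python
# Sketch: classify FFT bins of a measurement frame as signal or noise, then use
# the noise bins to compute a noise-power quality metric.
import numpy as np

fs = 100_000.0                               # hypothetical sampling rate (Hz)
n = 512                                      # FFT length, as in the abstract
t = np.arange(n) / fs
frame = np.sin(2 * np.pi * 12_500 * t) + 0.01 * np.random.randn(n)  # tone on bin 64 + noise

spectrum = np.fft.rfft(frame)
power = np.abs(spectrum) ** 2 / n
signal_bins = power > 100 * np.median(power)   # simple threshold rule (assumption)
noise_power = power[~signal_bins].sum()

print("signal bins:", np.flatnonzero(signal_bins))
print(f"noise power metric: {noise_power:.3e}")
```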

  11. A Systems Approach for Allocating Educational Space.

    ERIC Educational Resources Information Center

    Florida Univ., Gainesville. Center for Community Needs Assessment.

    A computer simulation model for allocating facilities and physical space is presented as a means of optimally allocating available educational resources. The model allows the decisionmaker to change specific program allocations, system parameters, and other controllable variables in order to determine the effects, both cost and utility, of these…

  12. 49 CFR 262.5 - Allocation requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Allocation requirements. 262.5 Section 262.5... IMPROVEMENT PROJECTS § 262.5 Allocation requirements. At least fifty percent of all grant funds awarded under... than $20,000,000 each. Designated, high-priority projects will be excluded from this allocation...

  13. 15 CFR 923.110 - Allocation formula.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Allocation formula. 923.110 Section... MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS Allocation of Section 306 Program Administration Grants § 923.110 Allocation formula. (a) As required by subsection 306(a), the Secretary may make...

  14. 25 CFR 39.902 - Allocation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Allocation. 39.902 Section 39.902 Indians BUREAU OF... Maintenance and Minor Repair Fund § 39.902 Allocation. (a) Interim Maintenance and Minor Repair funds shall be... determining school allocations shall be taken from the facilities inventory maintained by the Division...

  15. 24 CFR 945.203 - Allocation plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Allocation plan. 945.203 Section... FAMILIES Application and Approval Procedures § 945.203 Allocation plan. (a) Applicable terminology. (1) As used in this section, the terms “initial allocation plan” refers to the PHA's first submission of...

  16. 24 CFR 594.15 - Allocation amounts.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 3 2010-04-01 2010-04-01 false Allocation amounts. 594.15 Section... DEVELOPMENT COMMUNITY FACILITIES JOHN HEINZ NEIGHBORHOOD DEVELOPMENT PROGRAM Funding Allocation and Criteria § 594.15 Allocation amounts. (a) Amounts and match requirement. HUD will make grants, in the form...

  17. 42 CFR 412.332 - Payment based on the hospital-specific rate.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Payment based on the hospital-specific rate. 412... HUMAN SERVICES MEDICARE PROGRAM PROSPECTIVE PAYMENT SYSTEMS FOR INPATIENT HOSPITAL SERVICES Prospective Payment System for Inpatient Hospital Capital Costs Determination of Transition Period Payment Rates...

  18. The Working Memory Rating Scale: A Classroom-Based Behavioral Assessment of Working Memory

    ERIC Educational Resources Information Center

    Alloway, Tracy Packiam; Gathercole, Susan Elizabeth; Kirkwood, Hannah; Elliott, Julian

    2009-01-01

    The aim of the present study was to investigate the potential of the Working Memory Rating Scale (WMRS), an observer-based rating scale that reflects behavioral difficulties of children with poor working memory. The findings indicate good internal reliability and adequate psychometric properties for use as a screening tool by teachers. Higher…

  19. 42 CFR 412.332 - Payment based on the hospital-specific rate.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... HUMAN SERVICES MEDICARE PROGRAM PROSPECTIVE PAYMENT SYSTEMS FOR INPATIENT HOSPITAL SERVICES Prospective Payment System for Inpatient Hospital Capital Costs Determination of Transition Period Payment Rates for Capital-Related Costs § 412.332 Payment based on the hospital-specific rate. The payment amount for...

  20. 42 CFR 412.312 - Payment based on the Federal rate.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 412.312 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... System for Inpatient Hospital Capital Costs Basic Methodology for Determining the Federal Rate for Capital-Related Costs § 412.312 Payment based on the Federal rate. (a) General. The payment amount...

  1. 42 CFR 412.312 - Payment based on the Federal rate.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 412.312 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... System for Inpatient Hospital Capital Costs Basic Methodology for Determining the Federal Rate for Capital-Related Costs § 412.312 Payment based on the Federal rate. (a) General. The payment amount...

  2. Latent Profile Analysis of Sixth Graders Based on Teacher Ratings: Association with School Dropout

    ERIC Educational Resources Information Center

    Orpinas, Pamela; Raczynski, Katherine; Peters, Jaclyn Wetherington; Colman, Laura; Bandalos, Deborah

    2015-01-01

    The goal of this study was to identify meaningful groups of sixth graders with common characteristics based on teacher ratings of assets and maladaptive behaviors, describe dropout rates for each group, and examine the validity of these groups using students' self-reports. The sample consisted of racially diverse students (n = 675) attending sixth…

  3. 42 CFR 412.332 - Payment based on the hospital-specific rate.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... HUMAN SERVICES MEDICARE PROGRAM PROSPECTIVE PAYMENT SYSTEMS FOR INPATIENT HOSPITAL SERVICES Prospective Payment System for Inpatient Hospital Capital Costs Determination of Transition Period Payment Rates for Capital-Related Costs § 412.332 Payment based on the hospital-specific rate. The payment amount for...

  4. 42 CFR 412.312 - Payment based on the Federal rate.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 412.312 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... System for Inpatient Hospital Capital Costs Basic Methodology for Determining the Federal Rate for Capital-Related Costs § 412.312 Payment based on the Federal rate. (a) General. The payment amount...

  5. 42 CFR 412.332 - Payment based on the hospital-specific rate.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... HUMAN SERVICES MEDICARE PROGRAM PROSPECTIVE PAYMENT SYSTEMS FOR INPATIENT HOSPITAL SERVICES Prospective Payment System for Inpatient Hospital Capital Costs Determination of Transition Period Payment Rates for Capital-Related Costs § 412.332 Payment based on the hospital-specific rate. The payment amount for...

  6. 42 CFR 412.332 - Payment based on the hospital-specific rate.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... HUMAN SERVICES MEDICARE PROGRAM PROSPECTIVE PAYMENT SYSTEMS FOR INPATIENT HOSPITAL SERVICES Prospective Payment System for Inpatient Hospital Capital Costs Determination of Transition Period Payment Rates for Capital-Related Costs § 412.332 Payment based on the hospital-specific rate. The payment amount for...

  7. 42 CFR 412.312 - Payment based on the Federal rate.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 412.312 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... System for Inpatient Hospital Capital Costs Basic Methodology for Determining the Federal Rate for Capital-Related Costs § 412.312 Payment based on the Federal rate. (a) General. The payment amount...

  8. Cosmogenic Ne-21 Production Rates in H-Chondrites Based on Cl-36 - Ar-36 Ages

    NASA Technical Reports Server (NTRS)

    Leya, I.; Graf, Th.; Nishiizumi, K.; Guenther, D.; Wieler, R.

    2000-01-01

    We measured Ne-21 production rates in 14 H-chondrites in good agreement with model calculations. The production rates are based on Ne-21 concentrations measured on bulk samples or the non-magnetic fraction and Cl-36 - Ar-36 ages determined from the metal phase.

  9. Can attentional theory explain the inverse base rate effect? Comment on Kruschke (2001).

    PubMed

    Winman, Anders; Wennerholm, Pia; Juslin, Peter

    2003-11-01

    In J. K. Kruschke's (2001; see record 2001-18940-005) study, it is argued that attentional theory is the sole satisfactory explanation of the inverse base rate effect and that eliminative inference (P. Juslin, P. Wennerholm, & A. Winman, 2001; see record 2001-07828-016) plays no role in the phenomenon. In this comment, the authors demonstrate that, in contrast to the central tenets of attentional theory, (a) rapid attention shifts as implemented in ADIT decelerate learning in the inverse base-rate task and (b) the claim that the inverse base-rate effect is directly caused by an attentional asymmetry is refuted by data. It is proposed that a complete account of the inverse base-rate effect needs to integrate attention effects with inference rules that are flexibly used for both induction and elimination. PMID:14622069

  10. When learning order affects sensitivity to base rates: challenges for theories of causal learning.

    PubMed

    Reips, Ulf-Dietrich; Waldmann, Michael R

    2008-01-01

    In three experiments we investigated whether two procedures of acquiring knowledge about the same causal structure, predictive learning (from causes to effects) versus diagnostic learning (from effects to causes), would lead to different base-rate use in diagnostic judgments. Results showed that learners are capable of incorporating base-rate information in their judgments regardless of the direction in which the causal structure is learned. However, this only holds true for relatively simple scenarios. When complexity was increased, base rates were only used after diagnostic learning, but were largely neglected after predictive learning. It could be shown that this asymmetry is not due to a failure of encoding base rates in predictive learning because participants in all conditions were fairly good at reporting them. The findings present challenges for all theories of causal learning.

  11. Communication patterns and allocation strategies.

    SciTech Connect

    Leung, Vitus Joseph; Mache, Jens Wolfgang; Bunde, David P.

    2004-01-01

    Motivated by observations about job runtimes on the CPlant system, we use a trace-driven microsimulator to begin characterizing the performance of different classes of allocation algorithms on jobs with different communication patterns in space-shared parallel systems with mesh topology. We show that relative performance varies considerably with communication pattern. The Paging strategy using the Hilbert space-filling curve and the Best Fit heuristic performed best across several communication patterns.

  12. Minority Transportation Expenditure Allocation Model

    SciTech Connect

    Vyas, Anant D.; Santini, Danilo J.; Marik, Sheri K.

    1993-04-12

    MITRAM (Minority TRansportation expenditure Allocation Model) can project various transportation related attributes of minority (Black and Hispanic) and majority (white) populations. The model projects vehicle ownership, vehicle miles of travel, workers, new car and on-road fleet fuel economy, amount and share of household income spent on gasoline, and household expenditures on public transportation and taxis. MITRAM predicts reactions to sustained fuel price changes for up to 10 years after the change.

  13. Resource allocation to reproduction in animals.

    PubMed

    Kooijman, Sebastiaan A L M; Lika, Konstadia

    2014-11-01

    The standard Dynamic Energy Budget (DEB) model assumes that a fraction κ of mobilised reserve is allocated to somatic maintenance plus growth, while the rest is allocated to maturity maintenance plus maturation (in embryos and juveniles) or reproduction (in adults). All DEB parameters have been estimated for 276 animal species from most large phyla and all chordate classes. The goodness of fit is generally excellent. We compared the estimated values of κ with those that would maximise reproduction in fully grown adults with abundant food. Only 13% of these species show a reproduction rate close to the maximum possible (assuming that κ can be controlled), another 4% have κ lower than the optimal value, and 83% have κ higher than the optimal value. Strong empirical support hence exists for the conclusion that reproduction is generally not maximised. We also compared the parameters of the wild chicken with those of races selected for meat and egg production and found that the latter indeed maximise reproduction in terms of κ, while surface-specific assimilation was not affected by selection. We suggest that small values of κ relate to the down-regulation of maximum body size, and large values to the down-regulation of reproduction. We briefly discuss the ecological context for these findings.

  14. Equitable fund allocation, an economical approach for sustainable waste load allocation.

    PubMed

    Ashtiani, Elham Feizi; Niksokhan, Mohammad Hossein; Jamshidi, Shervin

    2015-08-01

    This research aims to study a novel approach for waste load allocation (WLA) to meet environmental, economical, and equity objectives, simultaneously. For this purpose, based on a simulation-optimization model developed for Haraz River in north of Iran, the waste loads are allocated according to discharge permit market. The non-dominated solutions are initially achieved through multiobjective particle swarm optimization (MOPSO). Here, the violation of environmental standards based on dissolved oxygen (DO) versus biochemical oxidation demand (BOD) removal costs is minimized to find economical total maximum daily loads (TMDLs). This can save 41% in total abatement costs in comparison with the conventional command and control policy. The BOD discharge permit market then increases the revenues to 45%. This framework ensures that the environmental limits are fulfilled but the inequity index is rather high (about 4.65). For instance, the discharge permit buyer may not be satisfied about the equity of WLA. Consequently, it is recommended that a third party or institution should be in charge of reallocating the funds. It means that the polluters which gain benefits by unfair discharges should pay taxes (or funds) to compensate the losses of other polluters. This intends to reduce the costs below the required values of the lowest inequity index condition. These compensations of equitable fund allocation (EFA) may help to reduce the dissatisfactions and develop WLA policies. It is concluded that EFA in integration with water quality trading (WQT) is a promising approach to meet the objectives.

  15. 34 CFR 300.816 - Allocations to LEAs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... DISABILITIES Preschool Grants for Children with Disabilities § 300.816 Allocations to LEAs. (a) Base payments... serving children with disabilities now being served by the new LEA, among the new LEA and affected LEAs based on the relative numbers of children with disabilities ages three through five currently...

  16. 34 CFR 300.816 - Allocations to LEAs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... DISABILITIES Preschool Grants for Children with Disabilities § 300.816 Allocations to LEAs. (a) Base payments... serving children with disabilities now being served by the new LEA, among the new LEA and affected LEAs based on the relative numbers of children with disabilities ages three through five currently...

  17. 15 CFR 700.21 - Application for priority rating authority.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... BASE REGULATIONS DEFENSE PRIORITIES AND ALLOCATIONS SYSTEM Industrial Priorities for Energy Programs... energy supplies, a person may request priority rating authority for scarce, critical, and essential... installation, repair, or maintenance of equipment) by submitting a request to the Department of Energy....

  18. It takes six to boogie: allocating cadaver kidneys in Eurotransplant.

    PubMed

    Doxiadis, Ilias I N; Smits, Jacqueline M A; Persijn, Guido G; Frei, Ulrich; Claas, Frans H J

    2004-02-27

    In March 1996, a new allocation point system for cadaver kidneys, the Eurotransplant (ET) Kidney Allocation System (KAS), was introduced in ET, the first multinational organ exchange organization. The aims of ETKAS were to reduce average and maximum waiting time, to allow patients with rare human leukocyte antigen (HLA) phenotypes or combinations to receive an "optimal" offer, to keep the exchange rates between the participating countries balanced, and finally to keep optimal graft survival, by means of HLA matching. Elderly patients and highly sensitized patients profit in addition from special programs, the ET Senior Program and the Acceptable Mismatch Program, respectively. All kidneys are offered to the pool and are allocated according to the degree of HLA matching, mismatch probability, waiting time, distance from the donor center, and balance between the countries participating in ET. A summary of 6 years' experience with the ETKAS is presented in this article.

  19. Quantifying and understanding reproductive allocation schedules in plants.

    PubMed

    Wenk, Elizabeth Hedi; Falster, Daniel S

    2015-12-01

    A plant's reproductive allocation (RA) schedule describes the fraction of surplus energy allocated to reproduction as it increases in size. While theorists use RA schedules as the connection between life history and energy allocation, little is known about RA schedules in real vegetation. Here we review what is known about RA schedules for perennial plants using studies that either directly quantified RA or collected data from which the shape of an RA schedule can be inferred. We also briefly review theoretical models describing factors by which variation in RA may arise. We identified 34 studies from which aspects of an RA schedule could be inferred. Within those, RA schedules varied considerably across species: some species abruptly shift all resources from growth to reproduction; most others gradually shift resources into reproduction, but under a variety of graded schedules. Available data indicate that the maximum fraction of energy allocated to reproduction ranges from 0.1 to 1 and that shorter-lived species tend to have higher initial RA and increase their RA more quickly than do longer-lived species. Overall, our findings indicate that little data exist about RA schedules in perennial plants. Available data suggest a wide range of schedules across species. Collection of more data on RA schedules would enable a tighter integration between observation and a variety of models predicting optimal energy allocation, plant growth rates, and biogeochemical cycles. PMID:27069603

  1. Environmental flow allocation and statistics calculator

    USGS Publications Warehouse

    Konrad, Christopher P.

    2011-01-01

    The Environmental Flow Allocation and Statistics Calculator (EFASC) is a computer program that calculates hydrologic statistics based on a time series of daily streamflow values. EFASC will calculate statistics for daily streamflow in an input file or will generate synthetic daily flow series from an input file based on rules for allocating and protecting streamflow and then calculate statistics for the synthetic time series. The program reads dates and daily streamflow values from input files. The program writes statistics out to a series of worksheets and text files. Multiple sites can be processed in series as one run. EFASC is written in Microsoft® Visual Basic© for Applications and implemented as a macro in Microsoft® Office Excel 2007. EFASC is intended as a research tool for users familiar with computer programming. The code for EFASC is provided so that it can be modified for specific applications. All users should review how output statistics are calculated and recognize that the algorithms may not comply with conventions used to calculate streamflow statistics published by the U.S. Geological Survey.
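
    As an orientation to the kind of computation such a tool performs, the following is a minimal Python sketch that derives two common daily-streamflow statistics (annual mean flow and the annual minimum 7-day average) from a list of dated flow values; the choice of statistics, function names, and demo data are illustrative assumptions and are not taken from EFASC's code.

      # Minimal sketch (not EFASC itself): two common statistics from a daily
      # streamflow series. The statistics chosen here are assumptions for
      # illustration only.
      from datetime import date, timedelta
      from statistics import mean

      def seven_day_min(flows):
          """Minimum of the 7-day moving average of daily flows."""
          if len(flows) < 7:
              return None
          return min(mean(flows[i:i + 7]) for i in range(len(flows) - 6))

      def annual_stats(records):
          """records: iterable of (date, flow). Returns {year: (mean flow, min 7-day avg)}."""
          by_year = {}
          for day, flow in records:
              by_year.setdefault(day.year, []).append(flow)
          return {yr: (mean(qs), seven_day_min(qs)) for yr, qs in by_year.items()}

      if __name__ == "__main__":
          start = date(2010, 1, 1)
          demo = [(start + timedelta(days=i), 10.0 + (i % 5)) for i in range(60)]
          print(annual_stats(demo))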

  2. Dose rate properties of NIPAM-based x-ray CT polymer gel dosimeters

    NASA Astrophysics Data System (ADS)

    Jirasek, A.; Johnston, H.; Hilts, M.

    2015-06-01

    In this work we investigate radiation dose rate dependencies of N-isopropylacrylamide (NIPAM) based polymer gel dosimeters (PGDs) used in conjunction with x-ray computed tomography imaging for radiotherapy dose verification. We define four primary forms of dose rate variation: constant mean dose rate where beam on and beam off times both vary, variable mean dose rate where beam on time varies, variable mean dose rate where beam off time varies, and machine dose rate (MU min-1). We utilize both small (20 mL) vials and large volume (1L) gel containers to identify and characterize dose rate dependence in NIPAM PGDs. Results indicate that all investigated constant and variable mean dose rates had a negligible effect on PGD dose response, with the exception of machine dose rates (100-600 MU min-1), which produced variations in dose response significantly lower than previously reported. Explanations of the reduced variability in dose response are given. It is also shown that NIPAM PGD dose response is not affected by variations in dose rate that may occur in modulated treatment deliveries. Finally, compositional changes in NIPAM PGDs are investigated as potential mitigating strategies for dose rate-dependent response variability.

  3. A model-based technique for predicting pilot opinion ratings for large commercial transports

    NASA Technical Reports Server (NTRS)

    Levison, W. H.

    1982-01-01

    A model-based technique for predicting pilot opinion ratings is described. Features of this procedure, which is based on the optimal-control model for pilot/vehicle systems, include (1) capability to treat "unconventional" aircraft dynamics, (2) a relatively free-form pilot model, (3) a simple scalar metric for attentional workload, and (4) a straightforward manner of proceeding from descriptions of the flight task environment and requirements to a prediction of pilot opinion rating. The method was able to provide a good match to a set of pilot opinion ratings obtained in a manned simulation study of large commercial aircraft in landing approach.

  4. A model-based technique for predicting pilot opinion ratings for large commercial transports

    NASA Technical Reports Server (NTRS)

    Levison, W. H.

    1980-01-01

    A model-based technique for predicting pilot opinion ratings is described. Features of this procedure, which is based on the optimal-control model for pilot/vehicle systems, include (1) capability to treat 'unconventional' aircraft dynamics, (2) a relatively free-form pilot model, (3) a simple scalar metric for attentional workload, and (4) a straightforward manner of proceeding from descriptions of the flight task environment and requirements to a prediction of pilot opinion rating. The method is able to provide a good match to a set of pilot opinion ratings obtained in a manned simulation study of large commercial aircraft in landing approach.

  5. The Relationship Between Hospital Value-Based Purchasing Program Scores and Hospital Bond Ratings.

    PubMed

    Rangnekar, Anooja; Johnson, Tricia; Garman, Andrew; O'Neil, Patricia

    2015-01-01

    Tax-exempt hospitals and health systems often borrow long-term debt to fund capital investments. Lenders use bond ratings as a standard metric to assess whether to lend funds to a hospital. Credit rating agencies have historically relied on financial performance measures and a hospital's ability to service debt obligations to determine bond ratings. With the growth in pay-for-performance-based reimbursement models, rating agencies are expanding their hospital bond rating criteria to include hospital utilization and value-based purchasing (VBP) measures. In this study, we evaluated the relationship between the Hospital VBP domains--Clinical Process of Care, Patient Experience of Care, Outcome, and Medicare Spending per Beneficiary (MSPB)--and hospital bond ratings. Given the historical focus on financial performance, we hypothesized that hospital bond ratings are not associated with any of the Hospital VBP domains. This was a retrospective, cross-sectional study of all hospitals that were rated by Moody's for fiscal year 2012 and participated in the Centers for Medicare & Medicaid Services' VBP program as of January 2014 (N = 285). Of the 285 hospitals in the study, 15% had been assigned a bond rating of Aa, and 46% had been assigned an A rating. Using a binary logistic regression model, we found an association between MSPB only and bond ratings, after controlling for other VBP and financial performance scores; however, MSPB did not improve the overall predictive accuracy of the model. Inclusion of VBP scores in the methodology used to determine hospital bond ratings is likely to affect hospital bond ratings in the near term. PMID:26554267

  6. Allocating Variability and Reserve Requirements (Presentation)

    SciTech Connect

    Kirby, B.; King, J.; Milligan, M.

    2011-10-01

    This presentation describes how you could conceivably allocate variability and reserve requirements, including how to allocate aggregation benefits. Conclusions of this presentation are: (1) Aggregation provides benefits because individual requirements are not 100% correlated; (2) Method needed to allocate reduced requirement among participants; (3) Differences between allocation results are subtle - (a) Not immediately obvious which method is 'better'; (b) Many are numerically 'correct', they sum to the physical requirement; (c) Many are not 'fair', Results depend on sub-aggregation and/or the order individuals are included; and (4) Vector allocation method is simple and fair.

  7. Efficient Resource Allocation for Multiclass Services in Multiuser OFDM Systems

    NASA Astrophysics Data System (ADS)

    Lee, Jae Soong; Lee, Jae Young; Lee, Soobin; Lee, Hwang Soo

    Although each application has its own quality of service (QoS) requirements, the resource allocation for multiclass services has not been studied adequately in multiuser orthogonal frequency division multiplexing (OFDM) systems. In this paper, a total transmit power minimization problem for downlink transmission is examined while satisfying multiclass services consisting of different data rates and target bit-error rates (BER). Lagrangian relaxation is used to find an optimal subcarrier allocation criterion in the context of subcarrier time-sharing by all users. We suggest an iterative algorithm using this criterion to find the upper and lower bounds of optimal power consumption. We also propose a prioritized subcarrier allocation (PSA) algorithm that provides low computation cost and performance very close to that of the iterative algorithm. The PSA algorithm employs a subcarrier selection order (SSO) in order to decide which user takes its best subcarrier first over other users. The SSO is determined by the data rates, channel gain, and target BER of each user. The proposed algorithms are simulated with various QoS parameters and the fading channel model. Furthermore, resource allocation is performed not only subcarrier by subcarrier but also frequency block by frequency block (comprising several subcarriers). These extensive simulation environments provide a more complete assessment of the proposed algorithms. Simulation results show that the proposed algorithms significantly outperform existing algorithms in terms of total transmit power consumption.
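
    The abstract does not spell out the PSA steps, so the following Python sketch shows only the general flavor of a prioritized greedy subcarrier allocation: users take turns, in an assumed priority order, claiming their best remaining subcarrier until their rate targets are met. The priority rule, the fixed bits-per-subcarrier loading, and the SNR-gap power formula are assumptions, not the paper's algorithm.

      # Illustrative greedy subcarrier allocation (an assumption-based sketch,
      # not the paper's PSA algorithm). Each user repeatedly takes its best
      # unassigned subcarrier, in a priority order, until its rate target
      # (bits per OFDM symbol) is met.
      def greedy_allocate(rate_targets, gains, bits_per_subcarrier=2):
          """rate_targets: {user: bits}; gains: {user: [channel gain per subcarrier]}."""
          n_sub = len(next(iter(gains.values())))
          free = set(range(n_sub))
          alloc = {u: [] for u in rate_targets}
          power = {u: 0.0 for u in rate_targets}
          remaining = dict(rate_targets)
          # Assumed priority: users needing many bits relative to their best channel go first.
          order = sorted(rate_targets, key=lambda u: -rate_targets[u] / max(gains[u]))
          while free and any(r > 0 for r in remaining.values()):
              for u in order:
                  if remaining[u] <= 0 or not free:
                      continue
                  k = max(free, key=lambda s: gains[u][s])   # user's best free subcarrier
                  free.remove(k)
                  alloc[u].append(k)
                  # Transmit power to carry the chosen bits on this subcarrier (SNR-gap form).
                  power[u] += (2 ** bits_per_subcarrier - 1) / gains[u][k]
                  remaining[u] -= bits_per_subcarrier
          return alloc, power

      if __name__ == "__main__":
          gains = {"A": [0.9, 0.3, 0.5, 0.8], "B": [0.4, 0.7, 0.6, 0.2]}
          print(greedy_allocate({"A": 4, "B": 4}, gains))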

  8. PRESSURE AND TEMPERATURE DEPENDENT DEFLAGRATION RATE MEASUREMENTS OF LLM-105 AND TATB BASED EXPLOSIVES

    SciTech Connect

    Glascoe, E A; Tan, N; Koerner, J; Lorenz, K T; Maienschein, J L

    2009-11-10

    The pressure dependent deflagration rates of LLM-105 and TATB based formulations were measured in the LLNL high pressure strand burner. The role of binder amount, explosive type, and thermal damage and their effects on the deflagration rate will be discussed. Two different formulations of LLM-105 and three formulations of TATB were studied and results indicate that binder amount and type play a minor role in the deflagration behavior. This is in sharp contrast to the HMX based formulations which strongly depend on binder amount and type. The effect of preheating these samples was considerably more dramatic. In the case of LLM-105, preheating the sample appears to have little effect on the deflagration rate. In contrast, preheating TATB formulations causes the deflagration rate to accelerate and become erratic. The thermal and mechanical properties of these formulations will be discussed in the context of their pressure and temperature dependent deflagration rates.

  9. Reliability of file-based retrospective ratings of psychopathy with the PCL-R.

    PubMed

    Grann, M; Långström, N; Tengström, A; Stålenheim, E G

    1998-06-01

    A rapidly emerging consensus recognizes Hare's Psychopathy Checklist-Revised (PCL-R; Hare, 1991) as the most valid and useful instrument to assess psychopathy (Fulero, 1995; Stone, 1995). We compared independent clinical PCL-R ratings of 40 forensic adult male criminal offenders to retrospective file-only ratings. File-based PCL-R ratings, in comparison to the clinical ratings, yielded categorical psychopathy diagnoses with a sensitivity of .57 and a specificity of .96. The intraclass correlation (ICC) of the total scores as estimated by ICC(2,1) was .88, and was markedly better on Factor 2, ICC(2,1) = .89, than on Factor 1, ICC(2,1) = .69. The findings support the belief that for research purposes, file-only PCL-R ratings based on Swedish forensic psychiatric investigation records can be made with good alternate-form reliability.

  10. Mathematical modeling of high-rate Anammox UASB reactor based on granular packing patterns.

    PubMed

    Tang, Chong-Jian; He, Rui; Zheng, Ping; Chai, Li-Yuan; Min, Xiao-Bo

    2013-04-15

    A novel mathematical model was developed to estimate the volumetric nitrogen conversion rates of a high-rate Anammox UASB reactor based on the packing patterns of granular sludge. A series of relationships among granular packing density, sludge concentration, hydraulic retention time and volumetric conversion rate were constructed to correlate Anammox reactor performance with granular packing patterns. It was suggested that the Anammox granules packed as the equivalent simple cubic pattern in high-rate UASB reactor with packing density of 50-55%, which not only accommodated a high concentration of sludge inside the reactor, but also provided large pore volume, thus prolonging the actual substrate conversion time. Results also indicated that it was necessary to improve Anammox reactor performance by enhancing substrate loading when sludge concentration was higher than 37.8 gVSS/L. The established model was carefully calibrated and verified, and it well simulated the performance of granule-based high-rate Anammox UASB reactor.

  11. Mass weighted urn design--A new randomization algorithm for unequal allocations.

    PubMed

    Zhao, Wenle

    2015-07-01

    Unequal allocations have been used in clinical trials motivated by ethical, efficiency, or feasibility concerns. Commonly used permuted block randomization faces a tradeoff between effective imbalance control with a small block size and accurate allocation target with a large block size. Few other unequal allocation randomization designs have been proposed in the literature, and their applications in real trials have hardly ever been reported, partly due to their complexity in implementation compared to the permuted block randomization. Proposed in this paper is the mass weighted urn design, in which the number of balls in the urn equals the number of treatments, and remains unchanged during the study. The chance of a ball being randomly selected is proportional to the mass of the ball. After each treatment assignment, a part of the mass of the selected ball is re-distributed to all balls based on the target allocation ratio. This design allows any desired optimal unequal allocations to be accurately targeted without approximation, and provides a consistent imbalance control throughout the allocation sequence. The statistical properties of this new design are evaluated with the Euclidean distance between the observed treatment distribution and the desired treatment distribution as the treatment imbalance measure; and the Euclidean distance between the conditional allocation probability and the target allocation probability as the allocation predictability measure. Computer simulation results are presented comparing the mass weighted urn design with other randomization designs currently available for unequal allocations.
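
    Because the abstract describes the mechanism concretely, a small Python simulation sketch is included below; the initial total mass, the fixed redistribution amount, and the clamping of negative masses at selection time are illustrative assumptions rather than the paper's exact formulas.

      # Sketch of a mass-weighted-urn style randomization. One ball per
      # treatment; selection probability is proportional to ball mass; after
      # each assignment, one unit of mass leaves the selected ball and flows
      # back to all balls in proportion to the target allocation ratio, so the
      # total mass stays constant. The parameter `alpha` (initial total mass)
      # and the clamping of negative masses are assumptions for illustration.
      import random
      from collections import Counter

      def mass_weighted_urn(target_ratio, n_subjects, alpha=4.0, seed=0):
          """target_ratio: {treatment: target weight}; returns the assignment sequence."""
          rng = random.Random(seed)
          total = sum(target_ratio.values())
          ratio = {t: w / total for t, w in target_ratio.items()}
          mass = {t: alpha * p for t, p in ratio.items()}   # start at the target split
          assignments = []
          for _ in range(n_subjects):
              treatments = list(mass)
              weights = [max(mass[t], 0.0) for t in treatments]   # clamp for sampling
              pick = rng.choices(treatments, weights=weights)[0]
              assignments.append(pick)
              mass[pick] -= 1.0                          # selected ball gives up mass...
              for t in mass:
                  mass[t] += ratio[t]                    # ...returned per the target ratio
          return assignments

      if __name__ == "__main__":
          seq = mass_weighted_urn({"A": 2, "B": 1}, n_subjects=30)
          print(Counter(seq))                            # roughly 2:1 in favor of A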

  12. Using hierarchical P-median problem for public school allocation

    NASA Astrophysics Data System (ADS)

    Nasir, Noryanti; Shariff, S. Sarifah Radiah

    2014-09-01

    Students' orientation from primary to secondary schools is a vital process and needs to be considered as a yearly problem by the District Education Office of the Ministry of Education Malaysia. The allocation of students to the right schools becomes complicated due to several constraints, such as the capacity of seats in secondary schools, widespread demand areas, and the preferences of parents. Considering all the constraints, this study proposes the application of a location model to allocate students to the right school. The allocation of a student from primary to secondary school is based on the total number of students, the availability of seats at both the primary and secondary schools, and the distance from the student's home to the nearest primary and secondary schools. The problem is modelled as a hierarchical P-median problem, as there are two hierarchical levels in this orientation. An alternative Genetic Algorithm (GA) based heuristic is applied to solve the problem, and results are compared with those from Excel Solver.
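
    As a toy illustration of the assignment step only (not the GA or the full two-level hierarchical P-median model), the Python sketch below assigns each student to the nearest secondary school that still has seats; all coordinates, capacities, and names are made up.

      # Toy capacity-constrained nearest-school assignment. Students whose
      # nearest school is farthest away are placed first, so pupils with few
      # nearby options are less likely to be crowded out. All data are made up.
      import math

      def assign_students(students, schools, capacity):
          """students/schools: {name: (x, y)}; capacity: {school: seats}."""
          seats_left = dict(capacity)
          assignment = {}
          def nearest_dist(pos):
              return min(math.dist(pos, s) for s in schools.values())
          for name in sorted(students, key=lambda n: -nearest_dist(students[n])):
              options = sorted(schools, key=lambda s: math.dist(students[name], schools[s]))
              for school in options:
                  if seats_left[school] > 0:
                      assignment[name] = school
                      seats_left[school] -= 1
                      break
          return assignment

      if __name__ == "__main__":
          students = {"s1": (0, 0), "s2": (1, 0), "s3": (5, 5)}
          schools = {"North": (0, 1), "South": (5, 4)}
          print(assign_students(students, schools, {"North": 2, "South": 2}))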

  13. 40 CFR 96.53 - Recordation of NOX allowance allocations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS (CONTINUED) NOX BUDGET TRADING PROGRAM AND CAIR NOX AND SO2 TRADING PROGRAMS FOR STATE... allocated to an allocation set-aside. (c) Serial numbers for allocated NOX allowances. When allocating...

  14. 40 CFR 96.53 - Recordation of NOX allowance allocations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROGRAMS (CONTINUED) NOX BUDGET TRADING PROGRAM AND CAIR NOX AND SO2 TRADING PROGRAMS FOR STATE... allocated to an allocation set-aside. (c) Serial numbers for allocated NOX allowances. When allocating...

  15. 45 CFR 402.31 - Determination of allocations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ASSISTANCE GRANTS State Allocations § 402.31 Determination of allocations. (a) Allocation formula. Allocations will be computed according to a formula using the following factors and weights: (1) 50...

  16. Parameter allocation of parallel array bistable stochastic resonance and its application in communication systems

    NASA Astrophysics Data System (ADS)

    Liu, Jian; Wang, You-Guo; Zhai, Qi-Qing; Liu, Jin

    2016-10-01

    In this paper, we propose a parameter allocation scheme in a parallel array bistable stochastic resonance-based communication system (P-BSR-CS) to improve the performance of weak binary pulse amplitude modulated (BPAM) signal transmissions. The optimal parameter allocation policy of the P-BSR-CS is provided to minimize the bit error rate (BER) and maximize the channel capacity (CC) under the adiabatic approximation condition. On this basis, we further derive the best parameter selection theorem in realistic communication scenarios via variable transformation. Specifically, the P-BSR structure design not only brings the robustness of parameter selection optimization, where the optimal parameter pair is not fixed but variable in quite a wide range, but also produces outstanding system performance. Theoretical analysis and simulation results indicate that in the P-BSR-CS the proposed parameter allocation scheme yields considerable performance improvement, particularly in very low signal-to-noise ratio (SNR) environments. Project supported by the National Natural Science Foundation of China (Grant No. 61179027), the Qinglan Project of Jiangsu Province of China (Grant No. QL06212006), and the University Postgraduate Research and Innovation Project of Jiangsu Province (Grant Nos. KYLX15_0829, KYLX15_0831).

  17. Sample size allocation for food item radiation monitoring and safety inspection.

    PubMed

    Seto, Mayumi; Uriu, Koichiro

    2015-03-01

    The objective of this study is to identify a procedure for determining sample size allocation for food radiation inspections of more than one food item to minimize the potential risk to consumers of internal radiation exposure. We consider a simplified case of food radiation monitoring and safety inspection in which a risk manager is required to monitor two food items, milk and spinach, in a contaminated area. Three protocols for food radiation monitoring with different sample size allocations were assessed by simulating random sampling and inspections of milk and spinach in a conceptual monitoring site. Distributions of ¹³¹I and radiocesium concentrations were determined in reference to ¹³¹I and radiocesium concentrations detected in Fukushima prefecture, Japan, for March and April 2011. The results of the simulations suggested that a protocol that allocates sample size to milk and spinach based on the estimation of ¹³¹I and radiocesium concentrations using the apparent decay rate constants sequentially calculated from past monitoring data can most effectively minimize the potential risks of internal radiation exposure.
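
    The idea sketched in the abstract, projecting current concentrations from past monitoring with apparent decay-rate constants and then splitting the total number of samples in proportion to the estimated exposure contribution, can be written out as follows; the decay constants, intake figures, and the proportional allocation rule are assumed for illustration and are not the study's protocol.

      # Illustrative sketch (assumed numbers throughout): project current
      # concentrations with an apparent exponential decay constant, then split
      # a fixed total sample size between food items in proportion to their
      # estimated ingestion contribution.
      import math

      def projected_conc(past_conc, decay_per_day, days_elapsed):
          """Apparent-decay projection of a radionuclide concentration (Bq/kg)."""
          return past_conc * math.exp(-decay_per_day * days_elapsed)

      def allocate_samples(total_samples, items):
          """items: {name: (past conc Bq/kg, apparent decay /day, days since, intake kg/day)}."""
          exposure = {n: projected_conc(c, k, d) * intake
                      for n, (c, k, d, intake) in items.items()}
          total = sum(exposure.values())
          return {n: round(total_samples * e / total) for n, e in exposure.items()}

      if __name__ == "__main__":
          items = {
              "milk":    (200.0, 0.08, 10, 0.20),
              "spinach": (900.0, 0.12, 10, 0.05),
          }
          print(allocate_samples(60, items))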

  18. 77 FR 60984 - World Digital Innovations; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-05

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission World Digital Innovations; Supplemental Notice That Initial Market-Based... above-referenced proceeding, of World Digital Innovations' application for market-based rate...

  19. 77 FR 77073 - Carson Cogeneration Company; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-31

    ... Energy Regulatory Commission Carson Cogeneration Company; Supplemental Notice That Initial Market-Based... above-referenced proceeding, of Carson Cogeneration Company's application for market-based rate... clicking on the appropriate link in the above list. They are also available for review in the...

  20. Exploring Latent Class Based on Growth Rates in Number Sense Ability

    ERIC Educational Resources Information Center

    Kim, Dongil; Shin, Jaehyun; Lee, Kijyung

    2013-01-01

    The purpose of this study was to explore latent class based on growth rates in number sense ability by using latent growth class modeling (LGCM). LGCM is one of the noteworthy methods for identifying growth patterns of the progress monitoring within the response to intervention framework in that it enables us to analyze latent sub-groups based not…

  1. Sub-Optimal Allocation of Time in Sequential Movements

    PubMed Central

    Wu, Shih-Wei; Dal Martello, Maria F.; Maloney, Laurence T.

    2009-01-01

    The allocation of limited resources such as time or energy is a core problem that organisms face when planning complex actions. Most previous research concerning planning of movement has focused on the planning of single, isolated movements. Here we investigated the allocation of time in a pointing task where human subjects attempted to touch two targets in a specified order to earn monetary rewards. Subjects were required to complete both movements within a limited time but could freely allocate the available time between the movements. The time constraint presents an allocation problem to the subjects: the more time spent on one movement, the less time is available for the other. In different conditions we assigned different rewards to the two tokens. How the subject allocated time between movements affected their expected gain on each trial. We also varied the angle between the first and second movements and the length of the second movement. Based on our results, we developed and tested a model of speed-accuracy tradeoff for sequential movements. Using this model we could predict the time allocation that would maximize the expected gain of each subject in each experimental condition. We compared human performance with predicted optimal performance. We found that all subjects allocated time sub-optimally, spending more time than they should on the first movement even when the reward of the second target was five times larger than the first. We conclude that the movement planning system fails to maximize expected reward in planning sequences of as few as two movements and discuss possible interpretations drawn from economic theory. PMID:20011047
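
    The optimization the subjects face can be made concrete with a small worked sketch: split a fixed time budget between two reaches so that expected reward is maximal, given that the probability of hitting a target grows with the time devoted to it. The saturating hit-probability curve and its parameters below are assumptions, not the authors' fitted speed-accuracy model.

      # Worked sketch of the two-movement time-allocation problem under an
      # assumed speed-accuracy trade-off (more movement time -> higher hit
      # probability). The curve and parameters are illustrative assumptions.
      import math

      def p_hit(t, t_half=0.25):
          """Assumed saturating accuracy curve for a movement of duration t seconds."""
          return 1.0 - math.exp(-t / t_half)

      def expected_gain(t1, total_time, reward1, reward2):
          return reward1 * p_hit(t1) + reward2 * p_hit(total_time - t1)

      def best_split(total_time, reward1, reward2, steps=1000):
          """Grid-search the first-movement time that maximizes expected gain."""
          candidates = (i * total_time / steps for i in range(steps + 1))
          return max(candidates, key=lambda t1: expected_gain(t1, total_time, reward1, reward2))

      if __name__ == "__main__":
          t_total = 0.8                       # seconds available for both movements
          for r2 in (1.0, 5.0):               # second target worth 1x or 5x the first
              t1 = best_split(t_total, 1.0, r2)
              print(f"reward ratio 1:{r2:.0f} -> optimal first-movement time {t1:.2f} s of {t_total} s")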

  2. Molecule-based approach for computing chemical-reaction rates in upper atmosphere hypersonic flows.

    SciTech Connect

    Gallis, Michail A.; Bond, Ryan Bomar; Torczynski, John Robert

    2009-08-01

    This report summarizes the work completed during FY2009 for the LDRD project 09-1332 'Molecule-Based Approach for Computing Chemical-Reaction Rates in Upper-Atmosphere Hypersonic Flows'. The goal of this project was to apply a recently proposed approach for the Direct Simulation Monte Carlo (DSMC) method to calculate chemical-reaction rates for high-temperature atmospheric species. The new DSMC model reproduces measured equilibrium reaction rates without using any macroscopic reaction-rate information. Since it uses only molecular properties, the new model is inherently able to predict reaction rates for arbitrary nonequilibrium conditions. DSMC non-equilibrium reaction rates are compared to Park's phenomenological non-equilibrium reaction-rate model, the predominant model for hypersonic-flow-field calculations. For near-equilibrium conditions, Park's model is in good agreement with the DSMC-calculated reaction rates. For far-from-equilibrium conditions, corresponding to a typical shock layer, the difference between the two models can exceed 10 orders of magnitude. The DSMC predictions are also found to be in very good agreement with measured and calculated non-equilibrium reaction rates. Extensions of the model to reactions typically found in combustion flows and ionizing reactions are also found to be in very good agreement with available measurements, offering strong evidence that this is a viable and reliable technique to predict chemical reaction rates.

  3. 12 CFR Appendix C to Part 567 - Risk-Based Capital Requirements-Internal-Ratings-Based and Advanced Measurement Approaches

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Risk-Based Capital Requirements-Internal-Ratings-Based and Advanced Measurement Approaches C Appendix C to Part 567 Banks and Banking OFFICE OF... Exposure Report); (iii) Is a subsidiary of a depository institution that uses 12 CFR part 3, appendix C,...

  4. ESTIMATION OF THE RATE OF VOC EMISSIONS FROM SOLVENT-BASED INDOOR COATING MATERIALS BASED ON PRODUCT FORMULATION

    EPA Science Inventory

    Two computational methods are proposed for estimation of the emission rate of volatile organic compounds (VOCs) from solvent-based indoor coating materials based on the knowledge of product formulation. The first method utilizes two previously developed mass transfer models with ...

  5. Results of Propellant Mixing Variable Study Using Precise Pressure-Based Burn Rate Calculations

    NASA Technical Reports Server (NTRS)

    Stefanski, Philip L.

    2014-01-01

    A designed experiment was conducted in which three mix processing variables (pre-curative addition mix temperature, pre-curative addition mixing time, and mixer speed) were varied to estimate their effects on within-mix propellant burn rate variability. The chosen discriminator for the experiment was the 2-inch diameter by 4-inch long (2x4) Center-Perforated (CP) ballistic evaluation motor. Motor nozzle throat diameters were sized to produce a common targeted chamber pressure. Initial data analysis did not show a statistically significant effect. Because propellant burn rate must be directly related to chamber pressure, a method was developed that showed statistically significant effects on chamber pressure (either maximum or average) by adjustments to the process settings. Burn rates were calculated from chamber pressures and these were then normalized to a common pressure for comparative purposes. The pressure-based method of burn rate determination showed significant reduction in error when compared to results obtained from the Brooks' modification of the propellant web-bisector burn rate determination method. Analysis of effects using burn rates calculated by the pressure-based method showed a significant correlation of within-mix burn rate dispersion to mixing duration and the quadratic of mixing duration. The findings were confirmed in a series of mixes that examined the effects of mixing time on burn rate variation, which yielded the same results.
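
    The report does not state which pressure-rate relation was used; one common choice for putting burn rates measured at different chamber pressures on a common footing is Saint-Robert's power law r = a * P**n, and the Python sketch below applies that assumed relation to made-up data.

      # Hedged illustration of pressure-based burn-rate normalization using the
      # Saint-Robert power law r = a * P**n. The exponent and the data below
      # are assumptions; the report's actual normalization is not reproduced.
      def normalize_burn_rate(rate, pressure, ref_pressure, n=0.35):
          """Scale a burn rate measured at `pressure` to a common `ref_pressure`."""
          return rate * (ref_pressure / pressure) ** n

      if __name__ == "__main__":
          # (measured burn rate in/s, average chamber pressure psi) for three motors
          measurements = [(0.42, 950.0), (0.45, 1025.0), (0.40, 900.0)]
          ref = 1000.0
          for r, p in measurements:
              r_norm = normalize_burn_rate(r, p, ref)
              print(f"{r:.3f} in/s at {p:.0f} psi -> {r_norm:.3f} in/s at {ref:.0f} psi")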

  6. 10 CFR 217.53 - Types of allocation orders.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Types of allocation orders. 217.53 Section 217.53 Energy DEPARTMENT OF ENERGY OIL ENERGY PRIORITIES AND ALLOCATIONS SYSTEM Allocation Actions § 217.53 Types of allocation orders. There are three types of allocation orders available for communicating allocation...

  7. Resource allocation using constraint propagation

    NASA Technical Reports Server (NTRS)

    Rogers, John S.

    1990-01-01

    The concept of constraint propagation is discussed. Performance increases are possible with careful application of these constraint mechanisms. The degree of performance increase is related to the interdependence of the different activities' resource usage. Although this method of applying constraints to activities and resources is often beneficial, it is no panacea for the computational difficulties experienced by dynamic resource allocation and scheduling problems. A combined effort toward execution optimization in all areas of the system during development, together with the selection of an appropriate development environment, remains the best way to produce an efficient system.

  8. Modeling of Rate-Dependent Hysteresis Using a GPO-Based Adaptive Filter.

    PubMed

    Zhang, Zhen; Ma, Yaopeng

    2016-02-06

    A novel generalized play operator-based (GPO-based) nonlinear adaptive filter is proposed to model rate-dependent hysteresis nonlinearity for smart actuators. In the proposed filter, the input signal vector consists of the output of a tapped delay line. GPOs with various thresholds are used to construct a nonlinear network and are connected to the input signals. The output signal of the filter is composed of a linear combination of signals from the output of GPOs. The least-mean-square (LMS) algorithm is used to adjust the weights of the nonlinear filter. The modeling results of four adaptive filter methods are compared: GPO-based adaptive filter, Volterra filter, backlash filter and linear adaptive filter. Moreover, a phenomenological operator-based model, the rate-dependent generalized Prandtl-Ishlinskii (RDGPI) model, is compared to the proposed adaptive filter. The various rate-dependent modeling methods are applied to model the rate-dependent hysteresis of a giant magnetostrictive actuator (GMA). It is shown from the modeling results that the GPO-based adaptive filter can describe the rate-dependent hysteresis nonlinearity of the GMA more accurately and effectively.
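
    The filter structure described above can be sketched compactly; two simplifications are flagged as assumptions: a classical symmetric play operator stands in for the generalized play operator (whose envelope functions are not given in the abstract), and the tapped delay line is reduced to a single tap. LMS adapts the output weights of the operator bank.

      # Sketch of an operator-bank adaptive filter: a bank of play operators
      # with different thresholds feeds a linear combiner whose weights are
      # adapted by LMS. The classical play operator used here is a simplified
      # stand-in for the generalized play operator of the paper.
      def play_operator(x_seq, r):
          """Classical play (backlash) operator with threshold r."""
          y, out = 0.0, []
          for x in x_seq:
              y = max(x - r, min(x + r, y))
              out.append(y)
          return out

      def lms_fit(x_seq, d_seq, thresholds, mu=0.05, passes=50):
          """Fit the combiner weights of a play-operator bank to d_seq via LMS."""
          bank = [play_operator(x_seq, r) for r in thresholds]
          w = [0.0] * len(thresholds)
          for _ in range(passes):
              for k, d in enumerate(d_seq):
                  feats = [b[k] for b in bank]
                  err = d - sum(wi * f for wi, f in zip(w, feats))
                  w = [wi + mu * err * f for wi, f in zip(w, feats)]
          return w

      if __name__ == "__main__":
          import math
          x = [math.sin(0.05 * k) for k in range(400)]
          d = play_operator(x, 0.3)          # toy hysteretic "plant" to identify
          print([round(wi, 3) for wi in lms_fit(x, d, thresholds=[0.1, 0.3, 0.5])])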

  9. The base rate principle and the fairness principle in social judgment.

    PubMed

    Cao, Jack; Banaji, Mahzarin R

    2016-07-01

    Meet Jonathan and Elizabeth. One person is a doctor and the other is a nurse. Who is the doctor? When nothing else is known, the base rate principle favors Jonathan to be the doctor and the fairness principle favors both individuals equally. However, when individuating facts reveal who is actually the doctor, base rates and fairness become irrelevant, as the facts make the correct answer clear. In three experiments, explicit and implicit beliefs were measured before and after individuating facts were learned. These facts were either stereotypic (e.g., Jonathan is the doctor, Elizabeth is the nurse) or counterstereotypic (e.g., Elizabeth is the doctor, Jonathan is the nurse). Results showed that before individuating facts were learned, explicit beliefs followed the fairness principle, whereas implicit beliefs followed the base rate principle. After individuating facts were learned, explicit beliefs correctly aligned with stereotypic and counterstereotypic facts. Implicit beliefs, however, were immune to counterstereotypic facts and continued to follow the base rate principle. Having established the robustness and generality of these results, a fourth experiment verified that gender stereotypes played a causal role: when both individuals were male, explicit and implicit beliefs alike correctly converged with individuating facts. Taken together, these experiments demonstrate that explicit beliefs uphold fairness and incorporate obvious and relevant facts, but implicit beliefs uphold base rates and appear relatively impervious to counterstereotypic facts. PMID:27325760

  11. System and method for memory allocation in a multiclass memory system

    DOEpatents

    Loh, Gabriel; Meswani, Mitesh; Ignatowski, Michael; Nutter, Mark

    2016-06-28

    A system for memory allocation in a multiclass memory system includes a processor coupleable to a plurality of memories sharing a unified memory address space, and a library store to store a library of software functions. The processor identifies a type of a data structure in response to a memory allocation function call to the library for allocating memory to the data structure. Using the library, the processor allocates portions of the data structure among multiple memories of the multiclass memory system based on the type of the data structure.
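
    The record above is a patent abstract rather than an algorithm description; the Python sketch below only illustrates the general idea of a library-level allocator that places a data structure in one of several memory classes based on its declared type. The pool names, type names, and placement policy are assumptions, not the patented interface.

      # Minimal sketch of type-aware allocation across memory classes in a
      # unified address space. The policy table, pool names, and fallback rule
      # are illustrative assumptions.
      POLICY = {
          "hash_table":   "fast",       # latency-sensitive structures -> in-package memory
          "frame_buffer": "bandwidth",  # streaming structures -> high-bandwidth memory
          "log_buffer":   "capacity",   # rarely touched structures -> bulk DRAM
      }

      class MultiClassAllocator:
          def __init__(self, pool_sizes):
              self.free = dict(pool_sizes)            # bytes remaining per memory class

          def malloc(self, nbytes, structure_type):
              pool = POLICY.get(structure_type, "capacity")
              if self.free.get(pool, 0) < nbytes:     # fall back when the preferred pool is full
                  pool = "capacity"
              self.free[pool] -= nbytes
              return pool

      if __name__ == "__main__":
          alloc = MultiClassAllocator({"fast": 1 << 20, "bandwidth": 1 << 24, "capacity": 1 << 30})
          print(alloc.malloc(64 * 1024, "hash_table"))          # -> fast
          print(alloc.malloc(8 * 1024 * 1024, "frame_buffer"))  # -> bandwidth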

  12. Effect of ECAP-based choice of stimulation rate on speech-perception performance

    PubMed Central

    Bournique, Jennifer L.; Hughes, Michelle L.; Baudhuin, Jacquelyn L.; Goehring, Jenny L.

    2012-01-01

    Objectives The objective determination of an optimal stimulation rate for CI users could save time and take the uncertainty out of choosing a rate based on patient preference. Electrically evoked compound action potential (ECAP) temporal response patterns vary across stimulation rates and cochlear regions, and could be useful in objectively predicting an optimal rate. Given that only one rate of stimulation can be used for current CI devices, we propose two potential ways to investigate whether a rate that produces stochastic ECAP responses (termed stochastic rate) can be used to predict an optimal stimulation rate. The first approach follows that of Hochmair et al. (2003), which compared performance across three cochlear regions using limited electrode sets. This approach, which has inherent limitations, may provide insight into the effects of region-specific stochastic rates on performance. The second, more direct approach is to compare speech perception for full-array maps that each employs a stochastic rate from a different region of the cochlea. Using both of these methods in a set of two acute experiments, the goal of the present study was to assess the effects of stochastic rate on speech perception. Design Speech-perception stimuli included the Hearing in Noise Test (HINT sentences), Consonant-Nucleus-Consonant (CNC) phonemes, and Iowa Medial Consonants. For Experiment 1, 22 ears in 20 CI recipients were tested in three map conditions (basal-only, middle-only, and apical-only electrode sets) using the subject’s daily-use stimulation rate to first explore the level of performance possible with region-specific maps. A one-way repeated-measures analysis of variance (RM ANOVA) was used to examine the effect of electrode region on performance. A subset of nine subjects was tested with three additional maps (basal-only, middle-only, and apical-only electrode sets) using the region-specific stochastic rate, as measured in a previous study. A two-way RM ANOVA was

  13. Carbon allocation under light and nitrogen resource gradients in two model marine phytoplankton(1).

    PubMed

    Bittar, Thais B; Lin, Yajuan; Sassano, Lara R; Wheeler, Benjamin J; Brown, Susan L; Cochlan, William P; Johnson, Zackary I

    2013-06-01

    Marine phytoplankton have conserved elemental stoichiometry, but there can be significant deviations from this Redfield ratio. Moreover, phytoplankton allocate reduced carbon (C) to different biochemical pools based on nutritional status and light availability, adding complexity to this relationship. This allocation influences physiology, ecology, and biogeochemistry. Here, we present results on the physiological and biochemical properties of two evolutionarily distinct model marine phytoplankton, a diatom (cf. Staurosira sp. Ehrenberg) and a chlorophyte (Chlorella sp. M. Beijerinck) grown under light and nitrogen resource gradients to characterize how carbon is allocated under different energy and substrate conditions. We found that nitrogen (N)-replete growth rate increased monotonically with light until it reached a threshold intensity (~200 μmol photons · m⁻² · s⁻¹). For Chlorella sp., the nitrogen quota (pg · μm⁻³) was greatest below this threshold, beyond which it was reduced by the effect of N-stress, while for Staurosira sp. there was no trend. Both species maintained constant maximum quantum yield of photosynthesis (mol C · mol photons⁻¹) over the range of light and N-gradients studied (although each species used different photophysiological strategies). In both species, C:chl a (g · g⁻¹) increased as a function of light and N-stress, while C:N (mol · mol⁻¹) and relative neutral lipid:C (rel. lipid · g⁻¹) were most strongly influenced by N-stress above the threshold light intensity. These results demonstrated that the interaction of substrate (N-availability) and energy gradients influenced C-allocation, and that general patterns of biochemical responses may be conserved among phytoplankton; they provided a framework for predicting phytoplankton biochemical composition in ecological, biogeochemical, or biotechnological applications.

  14. Raman spectroscopy for the characterization of the polymerization rate in an acrylamide-based photopolymer

    NASA Astrophysics Data System (ADS)

    Jallapuram, Raghavendra; Naydenova, Izabela; Byrne, Hugh J.; Martin, Suzanne; Howard, Robert; Toal, Vincent

    2008-01-01

    Investigations of polymerization rates in an acrylamide-based photopolymer are presented. The polymerization rate for acrylamide and methylenebisacrylamide was determined by monitoring the changes in the characteristic vibrational peaks at 1284 and 1607 cm-1 corresponding to the bending mode of the CH bond and the C=C double bonds of acrylamide and in the characteristic peak at 1629 cm-1 corresponding to the carbon-carbon double bond of methylenebisacrylamide using Raman spectroscopy. To study the dependence of the polymerization rate on intensity and to find the dependence parameter, the polymerization rate constant is measured at different intensities. A comparison with a commercially available photopolymer shows that the polymerization rate in this photopolymer is much faster.

  15. Consumer-Resource Dynamics: Quantity, Quality, and Allocation

    PubMed Central

    Getz, Wayne M.; Owen-Smith, Norman

    2011-01-01

    Background The dominant paradigm for modeling the complexities of interacting populations and food webs is a system of coupled ordinary differential equations in which the state of each species, population, or functional trophic group is represented by an aggregated numbers-density or biomass-density variable. Here, using the metaphysiological approach to model consumer-resource interactions, we formulate a two-state paradigm that represents each population or group in a food web in terms of both its quantity and quality. Methodology and Principal Findings The formulation includes an allocation function controlling the relative proportion of extracted resources to increasing quantity versus elevating quality. Since lower quality individuals senesce more rapidly than higher quality individuals, an optimal allocation proportion exists and we derive an expression for how this proportion depends on population parameters that determine the senescence rate, the per-capita mortality rate, and the effects of these rates on the dynamics of the quality variable. We demonstrate that oscillations do not arise in our model from quantity-quality interactions alone, but require consumer-resource interactions across trophic levels that can be stabilized through judicious resource allocation strategies. Analysis and simulations provide compelling arguments for the necessity of populations to evolve quality-related dynamics in the form of maternal effects, storage or other appropriate structures. They also indicate that resource allocation switching between investments in abundance versus quality provide a powerful mechanism for promoting the stability of consumer-resource interactions in seasonally forcing environments. Conclusions/Significance Our simulations show that physiological inefficiencies associated with this switching can be favored by selection due to the diminished exposure of inefficient consumers to strong oscillations associated with the well-known paradox of

  16. Modified Direct Insertion/Cancellation Method Based Sample Rate Conversion for Software Defined Radio

    NASA Astrophysics Data System (ADS)

    Bostamam, Anas Muhamad; Sanada, Yukitoshi; Minami, Hideki

    In this paper, a new fractional sample rate conversion (SRC) scheme based on a direct insertion/cancellation scheme is proposed. This scheme is suitable for signals that are sampled at a high sample rate and converted to a lower sample rate. The direct insertion/cancellation scheme may achieve low-complexity and lower power consumption as compared to the other SRC techniques. However, the direct insertion/cancellation technique suffers from large aliasing and distortion. The aliasing from an adjacent channel interferes the desired signal and degrades the performance. Therefore, a modified direct insertion/cancellation scheme is proposed in order to realize high performance resampling.
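
    The abstract positions the work relative to plain direct insertion/cancellation resampling; the Python sketch below shows only that baseline idea (keeping the input sample nearest each output-rate instant when lowering the rate), not the modified algorithm the paper proposes.

      # Plain direct-cancellation downsampling sketch: keep the input sample
      # nearest to each output sampling instant. This is the baseline scheme,
      # not the authors' modified algorithm, and it exhibits the aliasing the
      # paper sets out to reduce.
      def cancel_downsample(samples, fs_in, fs_out):
          """Resample `samples` from rate fs_in down to fs_out by sample dropping."""
          assert fs_out <= fs_in, "direct cancellation only lowers the sample rate"
          n_out = int(len(samples) * fs_out / fs_in)
          step = fs_in / fs_out
          picks = (int(round(i * step)) for i in range(n_out))
          return [samples[j] for j in picks if j < len(samples)]

      if __name__ == "__main__":
          ramp = list(range(20))                                    # a ramp sampled at 20 "Hz"
          print(cancel_downsample(ramp, fs_in=20.0, fs_out=13.0))   # 13 of 20 samples kept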

  17. A Novel Microfluidic Flow Rate Detection Method Based on Surface Plasmon Resonance Temperature Imaging

    PubMed Central

    Deng, Shijie; Wang, Peng; Liu, Shengnan; Zhao, Tianze; Xu, Shanzhi; Guo, Mingjiang; Yu, Xinglong

    2016-01-01

    A novel microfluidic flow rate detection method based on surface plasmon resonance (SPR) temperature imaging is proposed. The measurement is performed by space-resolved SPR imaging of the flow induced temperature variations. Theoretical simulations and analysis were performed to demonstrate a proof of concept using this approach. Experiments were implemented and results showed that water flow rates within a wide range of tens to hundreds of μL/min could be detected. The flow rate sensor is resistant to disturbances and can be easily integrated into microfluidic lab-on-chip systems. PMID:27347960

  18. Survival rate of breast cancer patients in Malaysia: a population-based study.

    PubMed

    Abdullah, Nor Aini; Wan Mahiyuddin, Wan Rozita; Muhammad, Nor Asiah; Ali, Zainudin Mohamad; Ibrahim, Lailanor; Ibrahim Tamim, Nor Saleha; Mustafa, Amal Nasir; Kamaluddin, Muhammad Amir

    2013-01-01

    Breast cancer is the most common cancer among Malaysian women. Other than hospital-based results, there are no documented population-based survival rates of Malaysian women for breast cancers. This population-based retrospective cohort study was therefore conducted. Data were obtained from the Health Informatics Centre, Ministry of Health Malaysia, the National Cancer Registry and the National Registration Department for the period from 1st Jan 2000 to 31st December 2005. Cases were captured by ICD-10 and linked to death certificates to identify the status. Only complete data were analysed. Survival time was calculated from the estimated date of diagnosis to the date of death or date of loss to follow-up. Observed survival rates were estimated by the Kaplan-Meier method using SPSS Statistical Software version 17. A total of 10,230 complete data sets were analysed. The mean age at diagnosis was 50.6 years old. The overall 5-year survival rate was 49% with a median survival time of 68.1 months. Indian women had a higher survival rate of 54% compared to Chinese women (49%) and Malays (45%). The overall 5-year survival rate of breast cancer patients among Malaysian women was still low for the cohort of 2000 to 2005 as compared to survival rates in developed nations. Therefore, it is necessary to enhance the strategies for early detection and intervention.
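
    For reference, observed survival rates in such a study come from the Kaplan-Meier product-limit estimator; a compact Python sketch is given below with made-up follow-up data, not the registry data.

      # Compact Kaplan-Meier estimator, included only to illustrate the method
      # named in the abstract. The toy follow-up data are made up.
      def kaplan_meier(times, events):
          """times: follow-up (months); events: 1 = died, 0 = censored.
          Returns [(time, survival probability)] at each death time."""
          order = sorted(range(len(times)), key=lambda i: times[i])
          at_risk, surv, curve = len(times), 1.0, []
          i = 0
          while i < len(order):
              t = times[order[i]]
              deaths = censored = 0
              while i < len(order) and times[order[i]] == t:   # handle ties at time t
                  deaths += events[order[i]]
                  censored += 1 - events[order[i]]
                  i += 1
              if deaths:
                  surv *= 1.0 - deaths / at_risk
                  curve.append((t, round(surv, 3)))
              at_risk -= deaths + censored
          return curve

      if __name__ == "__main__":
          follow_up = [6, 13, 21, 30, 31, 46, 60, 68, 72, 80]
          died      = [1,  1,  0,  1,  0,  1,  1,  0,  0,  0]
          print(kaplan_meier(follow_up, died))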

  19. Allocation of cadaver kidneys: new pressures, new solutions.

    PubMed

    Braun, W E

    1994-09-01

    Equitable allocation of human cadaver kidneys is complex and challenging, both from the ethical and scientific points of view. It is based on the principles of distributive justice and medical utility. However, the optimal application of ethical principles will require further resolution of medical issues that currently focus on the number of transplants for a single patient, six antigen matches, lesser degrees of HLA matching, marginal recipients, various positive cross-match situations, and cold ischemia time. New HLA matching techniques and enhanced computer organ allocation systems have the potential to surmount racial differences and increase significantly the number of compatible renal allografts.

  20. Simulation of biochemical reactions with time-dependent rates by the rejection-based algorithm

    SciTech Connect

    Thanh, Vo Hong; Priami, Corrado

    2015-08-07

    We address the problem of simulating biochemical reaction networks with time-dependent rates and propose a new algorithm based on our rejection-based stochastic simulation algorithm (RSSA) [Thanh et al., J. Chem. Phys. 141(13), 134116 (2014)]. Selecting the next reaction firing with our time-dependent RSSA (tRSSA) is computationally efficient, and the generated trajectory is exact thanks to the rejection-based mechanism. We benchmark tRSSA on different biological systems with varying forms of reaction rates to demonstrate its applicability and efficiency. We show that for nontrivial cases the selection of reaction firings in existing algorithms introduces approximations, because the integration of reaction rates is computationally demanding and simplifying assumptions are introduced. The selection of the next reaction firing by our approach is easier while preserving exactness.
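    The rejection mechanism for time-dependent rates is essentially thinning: candidate firing times are drawn from a process with a constant bounding propensity and accepted with probability a(t)/a_max. The sketch below shows that generic idea for a single reaction; it is an assumed illustration of the principle, not the tRSSA implementation.

```python
import math
import random

def next_firing(a, a_max, t, t_end):
    """Sample the next firing time of a reaction with time-dependent propensity a(t)
    by thinning/rejection, given an upper bound a_max >= a(t) on [t, t_end].
    Generic sketch of the rejection idea, not tRSSA itself."""
    while t < t_end:
        t += random.expovariate(a_max)          # candidate time from the bounding process
        if t >= t_end:
            return None
        if random.random() <= a(t) / a_max:     # accept with probability a(t)/a_max
            return t
    return None

# Example: sinusoidally modulated reaction rate
rate = lambda t: 1.0 + 0.5 * math.sin(t)
print(next_firing(rate, 1.5, 0.0, 50.0))
```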

  1. Integrating base rate data in violence risk assessments at capital sentencing.

    PubMed

    Cunningham, M D; Reidy, T J

    1998-01-01

    Prediction of violence in capital sentencing has been controversial. In the absence of a scientific basis for risk assessment, mental health professionals offering opinions in the capital sentencing context are prone to errors. Actuarial or group statistical data, known as base rates, have proven far superior to other methods for reducing predictive errors in many contexts, including risk assessment. Actuarial follow-up data on violent recidivism of capital murderers in prison and post release have been compiled and analyzed to demonstrate available base rates for use by mental health experts conducting risk assessments pertaining to capital sentencing. This paper also reviews various methods for individualizing the application of base rates to specific cases.
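    A small worked example shows why base rates matter: under Bayes' rule, even a moderately accurate clinical prediction of violence has a low positive predictive value when the underlying base rate is low. The base-rate, sensitivity, and specificity figures below are hypothetical and chosen only for illustration.

```python
def positive_predictive_value(base_rate, sensitivity, specificity):
    """Bayes' rule: probability of violence given a 'high risk' prediction."""
    true_pos = sensitivity * base_rate
    false_pos = (1.0 - specificity) * (1.0 - base_rate)
    return true_pos / (true_pos + false_pos)

# Hypothetical figures: a 10% base rate of serious prison violence and a predictor
# with 75% sensitivity and specificity yield a PPV of only about 25%.
print(positive_predictive_value(0.10, 0.75, 0.75))   # ~0.25
```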

  2. Fuzzy entropy based motion artifact detection and pulse rate estimation for fingertip photoplethysmography.

    PubMed

    Paradkar, Neeraj; Chowdhury, Shubhajit Roy

    2014-01-01

    The paper presents a fingertip photoplethysmography (PPG) based technique to estimate the pulse rate of the subject. The PPG signal obtained from a pulse oximeter is used for the analysis. The input samples are corrupted with motion artifacts due to minor motion of the subjects. An entropy measure of the input samples is used to detect the motion artifacts and estimate the pulse rate. A three-step methodology is adopted to identify and classify signal peaks as true systolic peaks or artifacts. The CapnoBase and CSL Benchmark databases are used to evaluate the technique; pulse rate estimation achieved a positive predictive value and sensitivity of 99.84% and 99.32%, respectively, for CapnoBase, and 98.83% and 98.84% for the CSL database.
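    The sketch below illustrates the general shape of such a pipeline: compute an entropy measure per window, discard windows whose entropy suggests motion artifact, and estimate pulse rate from the surviving peaks. It uses plain Shannon entropy and a naive peak picker as stand-ins for the paper's fuzzy entropy and peak classification, and the window length and threshold are assumed values.

```python
import numpy as np

def window_entropy(x, bins=16):
    """Shannon entropy of the amplitude histogram of one PPG window
    (a simple stand-in for the fuzzy entropy used in the paper)."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def pulse_rate(ppg, fs, win_s=4.0, ent_thresh=3.5):
    """Estimate pulse rate (BPM) from windows whose entropy suggests no motion artifact."""
    win = int(win_s * fs)
    rates = []
    for start in range(0, len(ppg) - win, win):
        seg = ppg[start:start + win]
        if window_entropy(seg) > ent_thresh:        # high entropy -> likely artifact, skip window
            continue
        peaks = [i for i in range(1, win - 1)
                 if seg[i] > seg[i - 1] and seg[i] >= seg[i + 1]]
        if len(peaks) > 1:
            rates.append(60.0 * fs / np.mean(np.diff(peaks)))
    return float(np.mean(rates)) if rates else None
```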

  3. Smart LED allocation scheme for efficient multiuser visible light communication networks.

    PubMed

    Sewaiwar, Atul; Tiwari, Samrat Vikramaditya; Chung, Yeon Ho

    2015-05-18

    In multiuser bidirectional visible light communication (VLC), a large number of LEDs or an LED array needs to be allocated in an efficient manner to ensure sustainable data rate and link quality. Moreover, to support an increasing or decreasing number of users in the network, the LED allocation must be performed dynamically. In this paper, a novel smart LED allocation scheme for efficient multiuser VLC networks is presented. The proposed scheme allocates RGB LEDs to multiple users in a dynamic and efficient fashion while satisfying illumination requirements in an indoor environment. The smart LED array, composed of RGB LEDs, is divided into sectors according to the locations of the users; the allocated sectors then concentrate optical power toward the users for efficient and reliable data transmission. An algorithm for the dynamic allocation of the LEDs is also presented. Simulations were performed to verify the effective resource allocation of the proposed scheme. It is found that the scheme provides the effect of optical beamforming toward individual users, increasing the collective power concentration of the optical signals on the desired users and resulting in significantly increased data rates, while ensuring sufficient illumination in a multiuser VLC environment.
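    A minimal way to picture the sector idea is to assign each LED in the ceiling array to its nearest user, as in the sketch below. The real scheme additionally balances illumination and data-rate constraints, so this nearest-user rule and the coordinates in the example are only illustrative assumptions.

```python
import math

def allocate_leds(led_positions, user_positions):
    """Assign each LED in the ceiling array to the nearest user, forming per-user
    sectors (a simple nearest-user rule, not the paper's full allocation algorithm)."""
    sectors = {u: [] for u in range(len(user_positions))}
    for led_id, led_xy in enumerate(led_positions):
        nearest = min(range(len(user_positions)),
                      key=lambda u: math.dist(led_xy, user_positions[u]))
        sectors[nearest].append(led_id)
    return sectors

# A 4x4 LED grid and two users at hypothetical room coordinates (metres)
leds = [(x, y) for x in range(4) for y in range(4)]
print(allocate_leds(leds, [(0.5, 0.5), (3.0, 2.5)]))
```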

  4. Optimal Allocation of Node Capacity in Cascade-Robustness Networks.

    PubMed

    Chen, Zhen; Zhang, Jun; Du, Wen-Bo; Lordan, Oriol; Tang, Jiangjun

    2015-01-01

    The robustness of large-scale critical infrastructures, which can be modeled as complex networks, is of great significance. One of the most important means to enhance robustness is to optimize the allocation of resources. Traditional allocation of resources is mainly based on topology information, which is neither realistic nor systematic. In this paper, we build a framework for searching for the most favorable pattern of node capacity allocation to reduce vulnerability to cascading failures at a low cost. A nonlinear, multi-objective optimization model is proposed and tackled using a particle swarm optimization (PSO) algorithm. It is found that the network becomes more robust and economical when less capacity is left on the heavily loaded nodes, and that the optimized network is better at resisting noise. Our work is helpful in designing robust and economical networks. PMID:26496705
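    As an illustration of the optimization machinery, the sketch below runs a plain particle swarm over a toy capacity-allocation objective that trades total installed capacity against an overload penalty. The objective, node loads, and PSO parameters are assumptions for demonstration, not the multi-objective model of the paper.

```python
import numpy as np

def pso(cost, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Plain particle swarm optimisation; 'cost' maps a capacity vector to a scalar."""
    lo, hi = bounds
    rng = np.random.default_rng(0)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([cost(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Toy objective: node loads are fixed; penalise both total installed capacity
# (economic cost) and any capacity below load (vulnerability to cascades).
loads = np.array([1.0, 2.0, 4.0, 8.0])
cost = lambda c: c.sum() + 100.0 * np.maximum(loads - c, 0).sum()
best_capacity, best_cost = pso(cost, dim=4, bounds=(0.0, 12.0))
```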

  6. Ka-band (32 GHz) allocations for deep space

    NASA Technical Reports Server (NTRS)

    Degroot, N. F.

    1987-01-01

    At the 1979 World Administrative Radio Conference, two new bands were allocated for deep space telecommunications: 31.8 to 32.3 GHz, space-to-Earth, and 34.2 to 34.7 GHz, Earth-to-space. These bands provide an opportunity for further development of the Deep Space Network and its support of deep space research. The history of the process by which JPL/NASA developed the rationale, technical background, and statement of requirements for the bands is discussed. Based on this work, United States proposals to the conference included the bands, and subsequent U.S. and NASA participation in the conference led to successful allocations for deep space telecommunications in the 30 GHz region of the spectrum. A detailed description of the allocations is included.

  7. Latent IBP Compound Dirichlet Allocation.

    PubMed

    Archambeau, Cedric; Lakshminarayanan, Balaji; Bouchard, Guillaume

    2015-02-01

    We introduce the four-parameter IBP compound Dirichlet process (ICDP), a stochastic process that generates sparse non-negative vectors with a potentially unbounded number of entries. If we repeatedly sample from the ICDP we can generate sparse matrices with an infinite number of columns and power-law characteristics. We apply the four-parameter ICDP to sparse nonparametric topic modelling to account for the very large number of topics present in large text corpora and the power-law distribution of the vocabulary of natural languages. The model, which we call latent IBP compound Dirichlet allocation (LIDA), allows for power-law distributions both in the number of topics summarising the documents and in the number of words defining each topic. It can be interpreted as a sparse variant of the hierarchical Pitman-Yor process when applied to topic modelling. We derive an efficient and simple collapsed Gibbs sampler closely related to the collapsed Gibbs sampler of latent Dirichlet allocation (LDA), making the model applicable in a wide range of domains. Our nonparametric Bayesian topic model compares favourably to the widely used hierarchical Dirichlet process and its heavy-tailed version, the hierarchical Pitman-Yor process, on benchmark corpora. Experiments demonstrate that accounting for the power-law distribution of real data is beneficial and that sparsity provides more interpretable results. PMID:26353244
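    For orientation, the collapsed Gibbs update for standard LDA (the finite, non-power-law model that LIDA extends) is compact enough to sketch. The code below is that textbook sampler with assumed hyperparameters; it is not the LIDA sampler derived in the paper.

```python
import numpy as np

def lda_gibbs(docs, V, K, alpha=0.1, beta=0.01, iters=200, seed=0):
    """Collapsed Gibbs sampler for standard LDA; docs is a list of word-id lists,
    V the vocabulary size, K the fixed number of topics."""
    rng = np.random.default_rng(seed)
    z = [rng.integers(K, size=len(d)) for d in docs]            # topic assignments
    ndk, nkw, nk = np.zeros((len(docs), K)), np.zeros((K, V)), np.zeros(K)
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            ndk[d, z[d][i]] += 1; nkw[z[d][i], w] += 1; nk[z[d][i]] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1      # remove current assignment
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                k = rng.choice(K, p=p / p.sum())                # resample topic for this word
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return ndk, nkw
```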

  8. The case to abandon human leukocyte antigen matching for kidney allocation: would it be wise to throw out the baby with the bathwater?

    PubMed

    Hiesse, Christian; Pessione, Fabienne; Houssin, Didier

    2004-02-27

    Since major histocompatibility (MHC) antigen matching was introduced in the early 1970s as the key factor determining kidney transplant allocation, several studies, mainly arising from organ-sharing organizations in the United States and Europe, have debated this complex issue. The first fundamental concern is the interaction of human leukocyte antigen matching with other transplant outcome risk factors, for example, prolongation of ischemia and matching for age. Much concordant data advocate restraining MHC antigen-based allocation in terms of space and time limits. The second fundamental concern is the balancing of the advantages of better antigen matching in terms of improved graft survival and the improved transplantation rate in immunologically high-risk patients with the major drawback of inequitable access for ethnic minorities and patients with rare MHC haplotypes. These issues have led to considering renewed kidney allocation rules, discarding human leukocyte antigen matching from algorithms, or modifying the specificity allocation level by using cross-reactive group matching or class II MHC antigen matching. The evolving concepts in the field of histocompatibility support the need for periodically updated, flexible, and hybrid allocation systems, as designed in France by the Etablissement français des Greffes.

  9. Assessment of heart rate variability based on mobile device for planning physical activity

    NASA Astrophysics Data System (ADS)

    Svirin, I. S.; Epishina, E. V.; Voronin, V. V.; Semenishchev, E. A.; Solodova, E. N.; Nabilskaya, N. V.

    2015-05-01

    In this paper we present a method for the functional analysis of the human heart based on electrocardiography (ECG) signals. The approach uses the apparatus of analytical and differential geometry together with correlation and regression analysis. The ECG contains information on the current condition of the cardiovascular system as well as on pathological changes in the heart. Mathematical processing of heart rate variability yields a large set of mathematical and statistical characteristics. These characteristics of the heart rate are used in research settings to study the physiological changes that determine an individual's functional state. The proposed method is implemented for current Android- and iOS-based mobile devices.
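    Simple time-domain HRV statistics are often the first output of such mobile analyses. The sketch below computes a few standard ones (SDNN, RMSSD, pNN50) from an RR-interval series; it is a generic illustration, not the geometric/regression method described above, and the RR values are hypothetical.

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Basic time-domain heart-rate-variability statistics from RR intervals in
    milliseconds (not the geometric/regression analysis of the paper)."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    return {
        "mean_hr_bpm": 60000.0 / rr.mean(),
        "sdnn_ms": rr.std(ddof=1),                        # overall variability
        "rmssd_ms": np.sqrt(np.mean(diff ** 2)),          # short-term variability
        "pnn50_pct": 100.0 * np.mean(np.abs(diff) > 50),  # share of large beat-to-beat changes
    }

# Hypothetical RR series (ms) extracted from a mobile ECG recording
print(hrv_time_domain([812, 790, 805, 821, 798, 776, 802, 815]))
```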

  10. A GEM-based thermal neutron detector for high counting rate applications

    NASA Astrophysics Data System (ADS)

    Perelli Cippo, E.; Croci, G.; Muraro, A.; Menelle, A.; Albani, G.; Cavenago, M.; Cazzaniga, C.; Claps, G.; Grosso, G.; Murtas, F.; Rebai, M.; Tardocchi, M.; Gorini, G.

    2015-10-01

    Among the neutron detector systems proposed as possible substitutes for ³He tubes, GEM-based detectors have shown appealing characteristics when coupled with suitable neutron-converter cathodes. In this paper, we present the results of a GEM-based neutron detector in a high-flux environment (the ORPHÉE reactor in Saclay), especially in terms of maximum rate capability and linearity. The recorded data show that the detector can handle neutron counting rates on the order of 50 × 10⁶ counts/(s·cm²) while maintaining reasonable linearity and showing no sign of instability.

  11. Resource Allocation Procedure at Queensland University: A Dynamic Modelling Project.

    ERIC Educational Resources Information Center

    Galbraith, Peter L.; Carss, Brian W.

    A structural reorganization of the University of Queensland, Australia, was undertaken to promote efficient resource management, and a resource allocation model was developed to aid in policy evaluation and planning. The operation of the restructured system was based on creating five resource groups to manage the distribution of academic resources…

  12. Personnel Resource Allocation Strategies in a Time of Fiscal Stress

    ERIC Educational Resources Information Center

    Hausner, Larry Joseph, III

    2013-01-01

    The purpose of this study was to collect and analyze school level data related to the allocation of resources, and to determine how those resources are used to increase student achievement in the Hampton School District. The study was based on an analysis of one school district located in Los Angeles County in Southern California. All of the…

  13. 18 CFR 801.3 - Allocations, diversions, withdrawals and release.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... COMMISSION GENERAL POLICIES § 801.3 Allocations, diversions, withdrawals and release. (a) The extremes in..., or withdrawals of water be based on the common law principles of riparian rights which entitles..., municipal, agricultural, or industrial uses in excess of such quantities as the Commission may prescribe...

  14. Evaluating Student Allocation in the Portuguese Public Higher Education System

    ERIC Educational Resources Information Center

    Portela, Miguel; Areal, Nelson; Sa, Carla; Alexandre, Fernando; Cerejeira, Joao; Carvalho, Ana; Rodrigues, Artur

    2008-01-01

    This paper characterizes and evaluates the student allocation in the Portuguese public higher education system. It describes the supply and demand sides of the system by looking at the "numerus clausus" across areas of study and institutions, institutions' degree of diversity, and performance and adjustment indicators based on students' revealed…

  15. Rates and spatial variations of soil erosion in Europe: A study based on erosion plot data

    NASA Astrophysics Data System (ADS)

    Cerdan, O.; Govers, G.; Le Bissonnais, Y.; Van Oost, K.; Poesen, J.; Saby, N.; Gobin, A.; Vacca, A.; Quinton, J.; Auerswald, K.; Klik, A.; Kwaad, F. J. P. M.; Raclot, D.; Ionita, I.; Rejman, J.; Rousseva, S.; Muxart, T.; Roxo, M. J.; Dostal, T.

    2010-10-01

    An extensive database of short- to medium-term erosion rates measured on erosion plots in Europe under natural rainfall was compiled from the literature. Statistical analysis confirmed the dominant influence of land use and cover on soil erosion rates. Sheet and rill erosion rates are highest on bare soil; vineyards show the second highest soil losses, followed by other arable lands (spring crops, orchards and winter crops). Land with a permanent vegetation cover (shrubs, grassland and forest) is characterised by soil losses that are generally more than an order of magnitude lower than those on arable land. Disturbance of permanent vegetation by fire leads to temporarily higher erosion rates, but these are still lower than those measured on arable land. We also noticed important regional differences in erosion rates. Erosion rates are generally much lower in the Mediterranean than in other areas of Europe; this is mainly attributed to the high soil stoniness in the Mediterranean. Measured erosion rates on arable and bare land were related to topography (slope steepness and length) and soil texture, while this was not the case for plots with a permanent land cover. We attribute this to a fundamental difference in runoff generation and sediment transfer between land cover types. On the basis of these results we calculated mean sheet and rill erosion rates for the European area covered by the CORINE database: estimated rill and interrill erosion rates are ca. 1.2 t ha⁻¹ year⁻¹ for the whole CORINE area and ca. 3.6 t ha⁻¹ year⁻¹ for arable land. These estimates are much lower than some earlier estimates, which were based on the erroneous extrapolation of small datasets. High erosion rates occur in areas dominated by vineyards, in the hilly loess areas of West and Central Europe, and in the agricultural areas located in the piedmont zones of the major European mountain ranges.
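    The regional estimates are essentially area-weighted means: plot-derived rates per land-cover class are weighted by the share of each class in the CORINE land-cover database. The snippet below shows the arithmetic with made-up shares and rates, so the resulting number does not reproduce the figures quoted above.

```python
# Hypothetical land-cover shares of a region and plot-derived mean erosion rates
# (t ha-1 yr-1); the CORINE-style estimate is the area-weighted mean.
shares = {"arable": 0.25, "vineyard": 0.02, "grassland": 0.35, "forest": 0.30, "bare": 0.08}
rates = {"arable": 3.6, "vineyard": 10.0, "grassland": 0.3, "forest": 0.1, "bare": 15.0}

mean_rate = sum(shares[c] * rates[c] for c in shares)
print(round(mean_rate, 2))  # area-weighted erosion rate in t ha-1 yr-1
```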

  16. A Heuristic Optimal Discrete Bit Allocation Algorithm for Margin Maximization in DMT Systems

    NASA Astrophysics Data System (ADS)

    Zhu, Li-Ping; Yao, Yan; Zhou, Shi-Dong; Dong, Shi-Wei

    2007-12-01

    A heuristic optimal discrete bit allocation algorithm is proposed for solving the margin maximization problem in discrete multitone (DMT) systems. Starting from an initial equal-power bit distribution, the proposed algorithm employs a multistage bit rate allocation scheme to meet the target rate. If the total bit rate is far from the target rate, a multiple-bits loading procedure is used to obtain a bit allocation close to the target rate. When close to the target rate, a parallel bit-loading procedure is used to achieve the target rate; this is computationally more efficient than the conventional greedy bit-loading algorithm. Finally, the bit distribution at the target rate is checked: if it is efficient, it is also the optimal solution; otherwise, the optimal bit distribution can be obtained with only a few bit swaps. Simulation results using the standard asymmetric digital subscriber line (ADSL) test loops show that the proposed algorithm is efficient for practical DMT transmissions.
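    For contrast with the proposed multistage method, the conventional greedy (Hughes-Hartogs style) bit-loading baseline it is compared against can be sketched in a few lines. The SNR-gap value, per-tone bit cap, and channel-to-noise ratios below are assumed example figures.

```python
import numpy as np

def greedy_bit_loading(cnr, target_bits, gamma=9.8, max_bits=15):
    """Conventional greedy bit loading: each step adds one bit to the subchannel
    requiring the least extra power.  This is the baseline the proposed multistage
    algorithm is compared against, not that algorithm itself."""
    cnr = np.asarray(cnr, dtype=float)
    bits = np.zeros(len(cnr), dtype=int)
    # Power needed to carry b bits on a subchannel with channel-to-noise ratio g:
    power = lambda b, g: gamma * (2.0 ** b - 1.0) / g
    for _ in range(target_bits):
        inc = np.array([power(b + 1, g) - power(b, g) if b < max_bits else np.inf
                        for b, g in zip(bits, cnr)])
        bits[np.argmin(inc)] += 1            # load the cheapest additional bit
    return bits

# Example: 8 subchannels with hypothetical channel-to-noise ratios, 20-bit target
print(greedy_bit_loading([30, 25, 20, 18, 15, 12, 10, 8], target_bits=20))
```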

  17. Making Time for Literacy: Teacher Knowledge and Time Allocation in Instructional Planning

    ERIC Educational Resources Information Center

    Spear-Swerling, Louise; Zibulsky, Jamie

    2014-01-01

    This study examined how K-5 general and special educators (N = 102) would choose to allocate time in a 2-h language arts block if they could do so as they wished, and how these choices related to their knowledge base for reading instruction. Preferences for time allocation were assessed through an open grid on which participants listed…

  18. Bringing the Budget Back into Academic Work Allocation Models: A Management Perspective

    ERIC Educational Resources Information Center

    Robertson, Michael; Germov, John

    2015-01-01

    Issues surrounding increasingly constrained resources and reducing levels of sector-based funding require consideration of a different Academic Work Allocation Model (AWAM) approach. Evidence from the literature indicates that an effective work allocation model is founded on the principles of equity and transparency in the distribution and…

  19. Effect of Peripheral Communication Pace on Attention Allocation in a Dual-Task Situation

    NASA Astrophysics Data System (ADS)

    Gueddana, Sofiane; Roussel, Nicolas

    Peripheral displays allow continuous awareness of information while performing other activities. Monitoring such a display while performing a central task has a cognitive cost that depends on its perceptual salience and the distraction it causes, i.e. the amount of attention it attracts away from the user’s primary action. This paper considers the particular case of peripheral displays for interpersonal communication. It reports on an experiment that studied the effect of peripheral communication pace on subjects’ allocation of attention in a dual-task situation: a snapshot-based peripheral monitoring task in which participants had to assess the presence of a remote person, and a central text-correcting task against the clock. Our results show that adding the peripheral task caused a drop in the success rate of the central task. As the pace of snapshots increased, the success rate on the peripheral task decreased; on the central task the success rate remained the same, but failures to reply in time occurred more frequently. These results suggest that the increase in snapshot pace led participants to change their strategy on the central task and allocate more attention to the peripheral one, not enough to maintain peripheral performance but also not to the point where it would affect central performance. Overall, our work suggests that peripheral communication pace subtly influences attention allocation in dual-task situations. We conclude by discussing how control over information pace could help users of communication systems adjust their local distraction as well as the attention they draw from remote users.

  20. Computational models and resource allocation for supercomputers

    NASA Technical Reports Server (NTRS)

    Mauney, Jon; Agrawal, Dharma P.; Harcourt, Edwin A.; Choe, Young K.; Kim, Sukil

    1989-01-01

    There are several different architectures used in supercomputers, with differing computational models. These different models present a variety of resource allocation problems that must be solved. The computational needs of a program must be cast in terms of the computational model supported by the supercomputer, and this must be done in a way that makes effective use of the machine's resources. This is the resource allocation problem. The computational models of available supercomputers and the associated resource allocation techniques are surveyed. It is shown that many problems and solutions appear repeatedly in very different computing environments. Some case studies are presented, showing concrete computational models and the allocation strategies used.