Sample records for rule-based thresholding

  1. DTFP-Growth: Dynamic Threshold-Based FP-Growth Rule Mining Algorithm Through Integrating Gene Expression, Methylation, and Protein-Protein Interaction Profiles.

    PubMed

    Mallik, Saurav; Bhadra, Tapas; Mukherji, Ayan

    2018-04-01

    Association rule mining is an important technique for identifying interesting relationships between gene pairs in a biological data set. Earlier methods generally work on a single biological data set and, in most cases, apply a single minimum support cutoff globally, i.e., across all genesets/itemsets. To overcome this limitation, in this paper we propose a dynamic threshold-based FP-growth rule mining algorithm that integrates gene expression, methylation, and protein-protein interaction profiles based on weighted shortest distance to find novel associations among different pairs of genes in multi-view data sets. For this purpose, we introduce three new thresholds, namely Distance-based Variable/Dynamic Supports (DVS), Distance-based Variable Confidences (DVC), and Distance-based Variable Lifts (DVL), for each rule by integrating the co-expression, co-methylation, and protein-protein interactions existing in the multi-omics data set. We develop the proposed algorithm utilizing these three novel multiple-threshold measures. In the proposed algorithm, the values of DVS, DVC, and DVL are computed for each rule separately, and it is then verified whether the support, confidence, and lift of each evolved rule are greater than or equal to the corresponding individual DVS, DVC, and DVL values, respectively. If all three conditions hold for a rule, the rule is treated as a resultant rule. One major advantage of the proposed method compared with related state-of-the-art methods is that it considers both the quantitative and interactive significance among all pairwise genes belonging to each rule. Moreover, the proposed method generates fewer rules, takes less running time, and provides greater biological significance for the resultant top-ranking rules than previous methods.
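
    The acceptance test in this abstract reduces to comparing each rule's support, confidence, and lift against its own three thresholds. Below is a minimal Python sketch of that final filtering step; the Rule structure, gene names, and all numeric values are hypothetical, and the per-rule DVS/DVC/DVL values are taken as given rather than derived from the omics-distance computation the paper describes.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Rule:
        antecedent: frozenset
        consequent: frozenset
        support: float
        confidence: float
        lift: float
        dvs: float  # rule-specific Distance-based Variable Support threshold
        dvc: float  # rule-specific Distance-based Variable Confidence threshold
        dvl: float  # rule-specific Distance-based Variable Lift threshold

    def accept(rule: Rule) -> bool:
        """Retain a rule only if support, confidence, and lift each meet
        their rule-specific dynamic thresholds."""
        return (rule.support >= rule.dvs
                and rule.confidence >= rule.dvc
                and rule.lift >= rule.dvl)

    rules = [
        Rule(frozenset({"TP53"}), frozenset({"MDM2"}), 0.42, 0.81, 1.9, 0.30, 0.75, 1.5),
        Rule(frozenset({"BRCA1"}), frozenset({"RAD51"}), 0.12, 0.55, 1.1, 0.20, 0.70, 1.4),
    ]
    resultant = [r for r in rules if accept(r)]
    print(len(resultant), "rule(s) retained")  # -> 1 rule(s) retained
    ```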

  2. Detailed statistical assessment of the characteristics of the ESMO Magnitude of Clinical Benefit Scale (ESMO-MCBS) threshold rules.

    PubMed

    Dafni, Urania; Karlis, Dimitris; Pedeli, Xanthi; Bogaerts, Jan; Pentheroudakis, George; Tabernero, Josep; Zielinski, Christoph C; Piccart, Martine J; de Vries, Elisabeth G E; Latino, Nicola Jane; Douillard, Jean-Yves; Cherny, Nathan I

    2017-01-01

    The European Society for Medical Oncology (ESMO) has developed the ESMO Magnitude of Clinical Benefit Scale (ESMO-MCBS), a tool to assess the magnitude of clinical benefit from new cancer therapies. Grading is guided by a dual rule comparing the relative benefit (RB) and the absolute benefit (AB) achieved by the therapy to prespecified threshold values. The ESMO-MCBS v1.0 dual rule evaluates the RB of an experimental treatment based on the lower limit of the 95%CI (LL95%CI) for the hazard ratio (HR) along with an AB threshold. This dual rule addresses two goals: inclusiveness, not unfairly penalising experimental treatments from trials designed with adequate power targeting clinically meaningful relative benefit; and discernment, penalising trials designed to detect a small inconsequential benefit. Based on 50 000 simulations of plausible trial scenarios, the sensitivity and specificity of the LL95%CI rule and the ESMO-MCBS dual rule are examined, along with the robustness of their characteristics across reasonable power levels and ranges of targeted and true HRs. The per cent acceptance of maximal preliminary grade is compared with that of other dual rules based on point estimate (PE) thresholds for RB. For particularly small or particularly large studies, the observed benefit needs to be relatively big for the ESMO-MCBS dual rule to be satisfied and the maximal grade awarded. Compared with approaches that evaluate RB using PE thresholds, simulations demonstrate that the MCBS approach better exhibits the desired behaviour, achieving the goals of both inclusiveness and discernment. RB assessment using the LL95%CI for HR rather than a PE threshold has two advantages: it diminishes the probability of excluding big-benefit positive studies from achieving due credit and, when combined with the AB assessment, it increases the probability of downgrading a trial with a statistically significant but clinically insignificant observed benefit.
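
    A rough sketch of the dual-rule logic in Python, assuming the standard approximation var(log HR) ~ 4/events for a balanced trial; the RB and AB threshold values below are illustrative placeholders, not the published ESMO-MCBS grade boundaries.

    ```python
    import math

    def hr_lower_limit(hr_hat: float, events: int) -> float:
        """Lower limit of the 95% CI for a hazard ratio, using the common
        approximation that log(HR) is normal with variance ~ 4/events."""
        se = math.sqrt(4.0 / events)
        return math.exp(math.log(hr_hat) - 1.96 * se)

    def dual_rule(hr_hat, events, abs_gain_months,
                  rb_threshold=0.65, ab_threshold=3.0):
        """Relative benefit judged on the lower CI limit of the HR,
        absolute benefit on the observed gain (thresholds illustrative)."""
        return (hr_lower_limit(hr_hat, events) <= rb_threshold
                and abs_gain_months >= ab_threshold)

    # A trial with HR 0.70 on 400 events and a 4.2-month absolute gain:
    print(dual_rule(hr_hat=0.70, events=400, abs_gain_months=4.2))  # True
    ```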

  3. Detailed statistical assessment of the characteristics of the ESMO Magnitude of Clinical Benefit Scale (ESMO-MCBS) threshold rules

    PubMed Central

    Dafni, Urania; Karlis, Dimitris; Pedeli, Xanthi; Bogaerts, Jan; Pentheroudakis, George; Tabernero, Josep; Zielinski, Christoph C; Piccart, Martine J; de Vries, Elisabeth G E; Latino, Nicola Jane; Douillard, Jean-Yves; Cherny, Nathan I

    2017-01-01

    Background The European Society for Medical Oncology (ESMO) has developed the ESMO Magnitude of Clinical Benefit Scale (ESMO-MCBS), a tool to assess the magnitude of clinical benefit from new cancer therapies. Grading is guided by a dual rule comparing the relative benefit (RB) and the absolute benefit (AB) achieved by the therapy to prespecified threshold values. The ESMO-MCBS v1.0 dual rule evaluates the RB of an experimental treatment based on the lower limit of the 95%CI (LL95%CI) for the hazard ratio (HR) along with an AB threshold. This dual rule addresses two goals: inclusiveness, not unfairly penalising experimental treatments from trials designed with adequate power targeting clinically meaningful relative benefit; and discernment, penalising trials designed to detect a small inconsequential benefit. Methods Based on 50 000 simulations of plausible trial scenarios, the sensitivity and specificity of the LL95%CI rule and the ESMO-MCBS dual rule are examined, along with the robustness of their characteristics across reasonable power levels and ranges of targeted and true HRs. The per cent acceptance of maximal preliminary grade is compared with that of other dual rules based on point estimate (PE) thresholds for RB. Results For particularly small or particularly large studies, the observed benefit needs to be relatively big for the ESMO-MCBS dual rule to be satisfied and the maximal grade awarded. Compared with approaches that evaluate RB using PE thresholds, simulations demonstrate that the MCBS approach better exhibits the desired behaviour, achieving the goals of both inclusiveness and discernment. Conclusions RB assessment using the LL95%CI for HR rather than a PE threshold has two advantages: it diminishes the probability of excluding big-benefit positive studies from achieving due credit and, when combined with the AB assessment, it increases the probability of downgrading a trial with a statistically significant but clinically insignificant observed benefit. PMID:29067214

  4. 77 FR 38729 - Alternate Tonnage Threshold for Oil Spill Response Vessels

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-29

    ...The Coast Guard is establishing an alternate size threshold based on the measurement system established under the International Convention on Tonnage Measurement of Ships, 1969, for oil spill response vessels, which are properly certificated under 46 CFR chapter I, subchapter L. The present size threshold of 500 gross register tons is based on the U.S. regulatory measurement system. This final rule provides an alternative for owners and operators of offshore supply vessels that may result in an increase in oil spill response capacity and capability. This final rule adopts, without change, the interim rule amending 46 CFR part 126 published in the Federal Register on Monday, December 12, 2011.

  5. 78 FR 4032 - Prompt Corrective Action, Requirements for Insurance, and Promulgation of NCUA Rules and Regulations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-18

    ... interest rate risk requirements. The amended IRPS increases the asset threshold that identifies credit... asset threshold used to define a ``complex'' credit union for determining whether risk-based net worth... or credit unions) with assets of $50 million or less from interest rate risk rule requirements. To...

  6. 77 FR 36149 - Disclosure Requirements and Prohibitions Concerning Franchising

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-18

    ... FTC announces revised monetary thresholds for three exemptions from the Franchise Rule. FTC is... July 1, 2012. FOR FURTHER INFORMATION CONTACT: Craig Tregillus, Franchise Rule Coordinator, Division of... and Prohibitions Concerning Franchising'' (Franchise Rule or Rule) \\1\\ provides three exemptions based...

  7. Recognition ROCS Are Curvilinear--Or Are They? On Premature Arguments against the Two-High-Threshold Model of Recognition

    ERIC Educational Resources Information Center

    Broder, Arndt; Schutz, Julia

    2009-01-01

    Recent reviews of recognition receiver operating characteristics (ROCs) claim that their curvilinear shape rules out threshold models of recognition. However, the shape of ROCs based on confidence ratings is not diagnostic to refute threshold models, whereas ROCs based on experimental bias manipulations are. Also, fitting predicted frequencies to…

  8. A principled approach to setting optimal diagnostic thresholds: where ROC and indifference curves meet.

    PubMed

    Irwin, R John; Irwin, Timothy C

    2011-06-01

    Making clinical decisions on the basis of diagnostic tests is an essential feature of medical practice and the choice of the decision threshold is therefore crucial. A test's optimal diagnostic threshold is the threshold that maximizes expected utility. It is given by the product of the prior odds of a disease and a measure of the importance of the diagnostic test's sensitivity relative to its specificity. Choosing this threshold is the same as choosing the point on the Receiver Operating Characteristic (ROC) curve whose slope equals this product. We contend that a test's likelihood ratio is the canonical decision variable and contrast diagnostic thresholds based on likelihood ratio with two popular rules of thumb for choosing a threshold. The two rules are appealing because they have clear graphical interpretations, but they yield optimal thresholds only in special cases. The optimal rule can be given similar appeal by presenting indifference curves, each of which shows a set of equally good combinations of sensitivity and specificity. The indifference curve is tangent to the ROC curve at the optimal threshold. Whereas ROC curves show what is feasible, indifference curves show what is desirable. Together they show what should be chosen. Copyright © 2010 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
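
    The slope criterion can be checked numerically: sweep candidate thresholds, compute expected utility from sensitivity and specificity, and take the maximiser. A sketch under assumed normal marker distributions and invented utilities; the slope formula shown for comparison is one common parameterisation of the rule quoted above.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    diseased = rng.normal(2.0, 1.0, 500)    # assumed marker distributions
    healthy = rng.normal(0.0, 1.0, 2000)
    prevalence = 0.1                        # assumed prior probability of disease
    U_TP, U_FN, U_TN, U_FP = 1.0, -3.0, 0.1, -0.5   # illustrative utilities

    thresholds = np.linspace(-3, 5, 801)
    sens = np.array([(diseased >= t).mean() for t in thresholds])
    spec = np.array([(healthy < t).mean() for t in thresholds])

    # Expected utility of calling "diseased" above each threshold.
    eu = (prevalence * (sens * U_TP + (1 - sens) * U_FN)
          + (1 - prevalence) * (spec * U_TN + (1 - spec) * U_FP))
    print(f"utility-maximising threshold ~ {thresholds[eu.argmax()]:.2f}")

    # At the optimum the ROC slope equals the prior odds of non-disease times
    # the ratio of net utilities (one common form of the product rule).
    slope = ((1 - prevalence) / prevalence) * (U_TN - U_FP) / (U_TP - U_FN)
    print(f"target ROC slope ~ {slope:.2f}")
    ```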

  9. Optimal Sequential Rules for Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…

  10. 78 FR 75629 - Self-Regulatory Organizations; Miami International Securities Exchange LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... Effectiveness of a Proposed Rule Change To Amend the MIAX Fee Schedule December 6, 2013. Pursuant to the... Priority Customer Rebate Program (the ``Program'') to (i) lower the volume thresholds of the four highest... thresholds in a month as described below. The volume thresholds are calculated based on the customer average...

  11. Regression Discontinuity for Causal Effect Estimation in Epidemiology.

    PubMed

    Oldenburg, Catherine E; Moscoe, Ellen; Bärnighausen, Till

    Regression discontinuity analyses can generate estimates of the causal effects of an exposure when a continuously measured variable is used to assign the exposure to individuals based on a threshold rule. Individuals just above the threshold are expected to be similar in their distribution of measured and unmeasured baseline covariates to individuals just below the threshold, resulting in exchangeability. At the threshold exchangeability is guaranteed if there is random variation in the continuous assignment variable, e.g., due to random measurement error. Under exchangeability, causal effects can be identified at the threshold. The regression discontinuity intention-to-treat (RD-ITT) effect on an outcome can be estimated as the difference in the outcome between individuals just above (or below) versus just below (or above) the threshold. This effect is analogous to the ITT effect in a randomized controlled trial. Instrumental variable methods can be used to estimate the effect of exposure itself utilizing the threshold as the instrument. We review the recent epidemiologic literature reporting regression discontinuity studies and find that while regression discontinuity designs are beginning to be utilized in a variety of applications in epidemiology, they are still relatively rare, and analytic and reporting practices vary. Regression discontinuity has the potential to greatly contribute to the evidence base in epidemiology, in particular on the real-life and long-term effects and side-effects of medical treatments that are provided based on threshold rules - such as treatments for low birth weight, hypertension or diabetes.
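
    A minimal simulation of the RD-ITT estimate described above: a continuous assignment variable, a threshold rule for the exposure, and a local linear fit on each side of the threshold (standard RD practice). The threshold, bandwidth, and effect size are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5000
    x = rng.normal(140, 15, n)          # assignment variable (e.g., blood pressure)
    treated = x >= 140                  # threshold rule assigns the exposure
    y = 0.3 * x - 5.0 * treated + rng.normal(0, 10, n)  # true effect: -5.0

    def fit_at_cutoff(mask):
        """Local linear fit; returns the predicted outcome at x = 140."""
        X = np.column_stack([np.ones(mask.sum()), x[mask] - 140])
        beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
        return beta[0]

    near = np.abs(x - 140) <= 5.0       # bandwidth around the threshold
    rd_itt = fit_at_cutoff(near & treated) - fit_at_cutoff(near & ~treated)
    print(f"RD-ITT estimate ~ {rd_itt:.2f} (true effect -5.0)")
    ```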

  12. 16 CFR 801.20 - Acquisitions subsequent to exceeding threshold.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... threshold. 801.20 Section 801.20 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS... § 801.20 Acquisitions subsequent to exceeding threshold. Acquisitions meeting the criteria of section 7A... may have met or exceeded a notification threshold before the effective date of these rules; or (c) The...

  13. 16 CFR 801.20 - Acquisitions subsequent to exceeding threshold.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... threshold. 801.20 Section 801.20 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS... § 801.20 Acquisitions subsequent to exceeding threshold. Acquisitions meeting the criteria of section 7A... may have met or exceeded a notification threshold before the effective date of these rules; or (c) The...

  14. 16 CFR 801.20 - Acquisitions subsequent to exceeding threshold.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... threshold. 801.20 Section 801.20 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS... § 801.20 Acquisitions subsequent to exceeding threshold. Acquisitions meeting the criteria of section 7A... may have met or exceeded a notification threshold before the effective date of these rules; or (c) The...

  15. Larger than Life's Extremes: Rigorous Results for Simplified Rules and Speculation on the Phase Boundaries

    NASA Astrophysics Data System (ADS)

    Evans, Kellie Michele

    Larger than Life (LtL) is a four-parameter family of two-dimensional cellular automata that generalizes John Conway's Game of Life (Life) to large neighborhoods and general birth and survival thresholds. LtL was proposed by David Griffeath in the early 1990s to explore whether Life might be a clue to a critical phase point in the threshold-range scaling limit. The LtL family of rules includes Life as well as a rich set of two-dimensional rules, some of which exhibit dynamics vastly different from Life. In this chapter we present rigorous results and conjectures about the ergodic classifications of several sets of "simplified" LtL rules, each of which has a property that makes the rule easier to analyze. For example, these include symmetric rules such as the threshold voter automaton and the anti-voter automaton, monotone rules such as the threshold growth models, and others. We also provide qualitative results and speculation about LtL rules on various phase boundaries and summarize results and open questions about our favorite "Life-like" LtL rules.
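
    One LtL generation is a neighborhood count followed by interval thresholds for birth and survival. A sketch of a single step; the parameter values loosely follow the often-quoted "Bugs" rule and should be treated as illustrative rather than canonical.

    ```python
    import numpy as np
    from scipy.signal import convolve2d

    def ltl_step(grid, r=5, birth=(34, 45), survive=(34, 58)):
        """One Larger-than-Life step: count live cells in the (2r+1)^2
        box neighborhood (self included) and apply interval thresholds."""
        kernel = np.ones((2 * r + 1, 2 * r + 1))
        n = convolve2d(grid, kernel, mode="same", boundary="wrap")
        born = (grid == 0) & (n >= birth[0]) & (n <= birth[1])
        keep = (grid == 1) & (n >= survive[0]) & (n <= survive[1])
        return (born | keep).astype(int)

    rng = np.random.default_rng(5)
    g = (rng.random((100, 100)) < 0.35).astype(int)   # random initial soup
    for _ in range(10):
        g = ltl_step(g)
    print(g.sum(), "live cells after 10 steps")
    ```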

  16. 76 FR 77128 - Alternate Tonnage Threshold for Oil Spill Response Vessels

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-12

    ...The Coast Guard is establishing an alternate size threshold based on the measurement system established under the International Convention on Tonnage Measurement of Ships, 1969, for Oil Spill Response Vessels (OSRVs), which are properly certificated under 46 CFR subchapter L. The present size threshold of 500 gross registered tons is based on the U.S. regulatory measurement system. This rule provides an alternative for owners and operators of offshore supply vessels (OSVs) that may result in an increase in oil spill response capacity and capability.

  17. Evaluation of a multi-arm multi-stage Bayesian design for phase II drug selection trials - an example in hemato-oncology.

    PubMed

    Jacob, Louis; Uvarova, Maria; Boulet, Sandrine; Begaj, Inva; Chevret, Sylvie

    2016-06-02

    Multi-Arm Multi-Stage designs aim at comparing several new treatments to a common reference, in order to select or drop any treatment arm to move forward when such evidence already exists based on interim analyses. We redesigned a Bayesian adaptive design initially proposed for dose-finding, focusing our interest on the comparison of multiple experimental drugs to a control on a binary criterion measure. We redesigned a phase II clinical trial that randomly allocates patients across three (one control and two experimental) treatment arms to assess dropping decision rules. We were interested in dropping any arm due to futility, either based on the historical control rate (first rule) or on comparison across arms (second rule), and in stopping an experimental arm due to its ability to reach a sufficient response rate (third rule), using the difference of response probabilities in Bayes binomial trials between the treated and control as a measure of treatment benefit. Simulations were then conducted to investigate the decision operating characteristics under a variety of plausible scenarios, as a function of the decision thresholds. Our findings suggest that one experimental treatment was less effective than the control and could have been dropped from the trial based on a sample of approximately 20 instead of 40 patients. In the simulation study, stopping decisions were reached sooner for the first rule than for the second rule, with close mean estimates of response rates and small bias. According to the decision threshold, the mean sample size to detect the required 0.15 absolute benefit ranged from 63 to 70 (rule 3) with false negative rates of less than 2 % (rule 1) up to 6 % (rule 2). In contrast, detecting a 0.15 inferiority in response rates required a sample size ranging on average from 23 to 35 (rules 1 and 2, respectively) with a false positive rate ranging from 3.6 to 0.6 % (rule 3). Adaptive trial design is a good way to improve clinical trials. It allows removing ineffective drugs and reducing the trial sample size, while maintaining unbiased estimates. Decision thresholds can be set according to predefined fixed error decision rates. ClinicalTrials.gov Identifier: NCT01342692.
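
    One way to implement a futility rule of this kind is with conjugate Beta posteriors and Monte Carlo draws, the usual machinery of a Bayes binomial trial. The response counts, the 0.15 benefit margin, and the 0.10 decision threshold below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def prob_benefit(resp_exp, n_exp, resp_ctl, n_ctl, delta=0.15, draws=100_000):
        """Posterior P(p_exp - p_ctl > delta) under independent Beta(1, 1)
        priors, estimated by Monte Carlo."""
        p_exp = rng.beta(1 + resp_exp, 1 + n_exp - resp_exp, draws)
        p_ctl = rng.beta(1 + resp_ctl, 1 + n_ctl - resp_ctl, draws)
        return (p_exp - p_ctl > delta).mean()

    # Interim look after 20 patients per arm: drop the arm for futility if
    # the posterior probability of a 0.15 absolute benefit is too small.
    p = prob_benefit(resp_exp=6, n_exp=20, resp_ctl=8, n_ctl=20)
    print(f"P(benefit > 0.15) = {p:.3f} ->", "drop arm" if p < 0.10 else "continue")
    ```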

  18. 76 FR 41839 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-15

    ... Effectiveness of Proposed Rule Change Relating to PAR Official Fees in Volatility Index Options July 7, 2011... threshold tiers for the assessment of PAR Official Fees in Volatility Index Options classes based on the..., 2011 to establish volume threshold tiers for the assessment of PAR Official Fees in Volatility Index...

  19. Medicare Program; Explanation of FY 2004 Outlier Fixed-Loss Threshold as Required by Court Rulings. Clarification.

    PubMed

    2016-01-22

    In accordance with court rulings in cases that challenge the federal fiscal year (FY) 2004 outlier fixed-loss threshold rulemaking, this document provides further explanation of certain methodological choices made in the FY 2004 fixed-loss threshold determination.

  20. Mate choice when males are in patches: optimal strategies and good rules of thumb.

    PubMed

    Hutchinson, John M C; Halupka, Konrad

    2004-11-07

    In standard mate-choice models, females encounter males sequentially and decide whether to inspect the quality of another male or to accept a male already inspected. What changes when males are clumped in patches and there is a significant cost to travel between patches? We use stochastic dynamic programming to derive optimum strategies under various assumptions. With zero costs to returning to a male in the current patch, the optimal strategy accepts males above a quality threshold which is constant whenever one or more males in the patch remain uninspected; this threshold drops when inspecting the last male in the patch, so returns may occur only then and are never to a male in a previously inspected patch. With non-zero within-patch return costs, such a two-threshold rule still performs extremely well, but a more gradual decline in acceptance threshold is optimal. Inability to return at all need not decrease performance by much. The acceptance threshold should also decline if it gets harder to discover the last males in a patch. Optimal strategies become more complex when mean male quality varies systematically between patches or years, and females estimate this in a Bayesian manner through inspecting male qualities. It can then be optimal to switch patch before inspecting all males on a patch, or, exceptionally, to return to an earlier patch. We compare performance of various rules of thumb in these environments and in ones without a patch structure. A two-threshold rule performs excellently, as do various simplifications of it. The best-of-N rule outperforms threshold rules only in non-patchy environments with between-year quality variation. The cutoff rule performs poorly.

  1. 77 FR 5700 - Approval and Promulgation of Implementation Plans; New Hampshire: Prevention of Significant...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ... appropriate emission thresholds for determining which new stationary sources and modification projects become... affects major stationary sources in New Hampshire that have GHG emissions above the thresholds established... higher thresholds in the Tailoring Rule, EPA published a final rule on December 30, 2010, narrowing its...

  2. 77 FR 59139 - Prompt Corrective Action, Requirements for Insurance, and Promulgation of NCUA Rules and Regulations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-26

    ... threshold is used to define a ``complex'' credit union for determining whether risk-based net worth... credit union (FICU) is subject to certain interest rate risk rule requirements. \\1\\ IRPS 03-2, 68 FR... multiple applications, while avoiding undue risk to the National Credit Union Share Insurance Fund (NCUSIF...

  3. Predicting missing values in a home care database using an adaptive uncertainty rule method.

    PubMed

    Konias, S; Gogou, G; Bamidis, P D; Vlahavas, I; Maglaveras, N

    2005-01-01

    Contemporary literature illustrates an abundance of adaptive algorithms for mining association rules. However, most of this literature is unable to deal with peculiarities, such as missing values and dynamic data creation, that are frequently encountered in fields like medicine. This paper proposes an uncertainty rule method that uses an adaptive threshold for filling missing values in newly added records. A new approach for mining uncertainty rules and filling missing values is proposed, which is in turn particularly suitable for dynamic databases, like the ones used in home care systems. In this study, a new data mining method named FiMV (Filling Missing Values) is illustrated based on the mined uncertainty rules. Uncertainty rules have quite a similar structure to association rules and are extracted by an algorithm proposed in previous work, namely AURG (Adaptive Uncertainty Rule Generation). The main target was to implement an appropriate method for recovering missing values in a dynamic database, where new records are continuously added, without needing to specify any kind of thresholds beforehand. The method was applied to a home care monitoring system database. Multiple missing values for each record's attributes (at rates of 5-20%, in 5% increments) were randomly introduced into the initial dataset. FiMV demonstrated 100% completion rates with over 90% success in each case, while usual approaches, where all records with missing values are ignored or thresholds are required, experienced significantly reduced completion and success rates. It is concluded that the proposed method is appropriate for the data-cleaning step of the Knowledge Discovery in Databases process. This step, which bears heavily on the output quality of any data mining technique, can improve the quality of the mined information.

  4. Predicting the threshold of pulse-train electrical stimuli using a stochastic auditory nerve model: the effects of stimulus noise.

    PubMed

    Xu, Yifang; Collins, Leslie M

    2004-04-01

    The incorporation of low levels of noise into an electrical stimulus has been shown to improve auditory thresholds in some human subjects (Zeng et al., 2000). In this paper, thresholds for noise-modulated pulse-train stimuli are predicted utilizing a stochastic neural-behavioral model of ensemble fiber responses to bi-phasic stimuli. The neural refractory effect is described using a Markov model for a noise-free pulse-train stimulus and a closed-form solution for the steady-state neural response is provided. For noise-modulated pulse-train stimuli, a recursive method using the conditional probability is utilized to track the neural responses to each successive pulse. A neural spike count rule has been presented for both threshold and intensity discrimination under the assumption that auditory perception occurs via integration over a relatively long time period (Bruce et al., 1999). An alternative approach originates from the hypothesis of the multilook model (Viemeister and Wakefield, 1991), which argues that auditory perception is based on several shorter time integrations and may suggest an NofM model for prediction of pulse-train threshold. This motivates analyzing the neural response to each individual pulse within a pulse train, which is considered to be the brief look. A logarithmic rule is hypothesized for pulse-train threshold. Predictions from the multilook model are shown to match trends in psychophysical data for noise-free stimuli that are not always matched by the long-time integration rule. Theoretical predictions indicate that threshold decreases as noise variance increases. Theoretical models of the neural response to pulse-train stimuli not only reduce computational overhead but also facilitate utilization of signal detection theory and are easily extended to multichannel psychophysical tasks.

  5. 40 CFR 52.1072 - Conditional approval.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... to provide MOBILE6-based mobile source emission budgets and adopted measures sufficient to achieve... threshold to 25 tons per year. (9) Revises Reasonably Available Control Technology (RACT) rules to include...

  6. 78 FR 38420 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-26

    ... Proposed Rule Change Amending NYSE Rule 1000 To Increase the Price Threshold for Those Securities... threshold for those securities ineligible for automatic executions from $1,000.00 or more to $10,000.00 or...). Specifically, the Exchange believes that increasing the dollar threshold for high-priced securities would...

  7. 40 CFR 52.473 - Conditional approval.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-progress plan to provide MOBILE6-based mobile source emission budgets and adopted measures sufficient to... threshold to 25 tons per year. (9) Revises Reasonably Available Control Technology (RACT) rules to include...

  8. Automated implementation of rule-based expert systems with neural networks for time-critical applications

    NASA Technical Reports Server (NTRS)

    Ramamoorthy, P. A.; Huang, Song; Govind, Girish

    1991-01-01

    In fault diagnosis, control and real-time monitoring, both timing and accuracy are critical for operators or machines to reach proper solutions or appropriate actions. Expert systems are becoming more popular in the manufacturing community for dealing with such problems. In recent years, neural networks have revived and their applications have spread to many areas of science and engineering. A method of using neural networks to implement rule-based expert systems for time-critical applications is discussed here. This method can convert a given rule-based system into a neural network with fixed weights and thresholds. The rules governing the translation are presented along with some examples. We also present the results of automated machine implementation of such networks from the given rule-base. This significantly simplifies the translation process to neural network expert systems from conventional rule-based systems. Results comparing the performance of the proposed approach based on neural networks vs. the classical approach are given. The possibility of very large scale integration (VLSI) realization of such neural network expert systems is also discussed.
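
    The translation the authors describe can be illustrated for a single conjunctive rule: positive literals get weight +1, negated literals weight -1, and the unit's threshold equals the number of positive literals, so the unit fires exactly when the rule does. A hedged sketch with hypothetical rule contents:

    ```python
    def make_unit(pos, neg):
        """Compile 'IF all(pos) AND none(neg) THEN fire' into a threshold unit."""
        weights = {**{p: 1.0 for p in pos}, **{n: -1.0 for n in neg}}
        threshold = float(len(pos))
        def unit(facts: set) -> bool:
            s = sum(w for name, w in weights.items() if name in facts)
            return s >= threshold
        return unit

    rule = make_unit(pos=("high_temp", "high_pressure"), neg=("valve_open",))
    print(rule({"high_temp", "high_pressure"}))                 # True: fires
    print(rule({"high_temp", "high_pressure", "valve_open"}))   # False: inhibited
    ```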

  9. Drug pricing and control of health expenditures: a comparison between a proportional decision rule and a cost-per-QALY rule.

    PubMed

    Gandjour, Afschin

    2015-01-01

    In Germany, the Institute for Quality and Efficiency in Health Care (IQWiG) makes recommendations for reimbursement prices of drugs on the basis of a proportional relationship between costs and health benefits. This paper analyzed the potential of IQWiG's decision rule to control health expenditures and used a cost-per-quality-adjusted life year (QALY) rule as a comparison. A literature search was conducted, and a theoretical model of health expenditure growth was built. The literature search shows that the median incremental cost-effectiveness ratio of German cost-effectiveness analyses was €7650 per QALY gained, thus yielding a much lower threshold cost-effectiveness ratio for IQWiG's rule than an absolute rule at €30 000 per QALY. The theoretical model shows that IQWiG's rule is able to contain the long-term growth of health expenditures under the conservative assumption that future health increases at a constant absolute rate and that the threshold incremental cost-effectiveness ratio increases at a smaller rate than health expenditures. In contrast, an absolute rule offers the potential for manufacturers to raise drug prices in response to the threshold, thus resulting in an initial spike in expenditures. Results suggest that IQWiG's proportional rule will lead to lower drug prices and a slower growth of health expenditures than an absolute cost-effectiveness threshold at €30 000 per QALY. This finding is surprising, as IQWiG's rule, in contrast to a cost-per-QALY rule, does not start from a fixed budget. Copyright © 2014 John Wiley & Sons, Ltd.

  10. Cooperation and charity in spatial public goods game under different strategy update rules

    NASA Astrophysics Data System (ADS)

    Li, Yixiao; Jin, Xiaogang; Su, Xianchuang; Kong, Fansheng; Peng, Chengbin

    2010-03-01

    Human cooperation can be influenced by other human behaviors, and recent years have witnessed a flourishing of studies on the coevolution of cooperation and punishment, yet the common behavior of charity is seldom considered in game-theoretical models. In this article, we investigate the coevolution of altruistic cooperation and egalitarian charity in the spatial public goods game, considering charity as the behavior of reducing inter-individual payoff differences. In our model, in each generation of the evolution, individuals first play games and accumulate payoff benefits, and then each egalitarian makes a charity donation by payoff transfer in its neighborhood. To study the individual-level evolutionary dynamics, we adopt different strategy update rules and investigate their effects on charity and cooperation. These rules can be classified into two global rules: the random selection rule, in which individuals randomly update strategies, and the threshold selection rule, where only those with payoffs below a threshold update strategies. Simulation results show that random selection enhances the cooperation level, while threshold selection lowers the threshold of the multiplication factor needed to maintain cooperation. When charity is considered, it is incapable of promoting cooperation under random selection, whereas it promotes cooperation under threshold selection. Interestingly, the evolution of charity strongly depends on the dispersion of payoff acquisitions of the population, which agrees with previous results. Our work may shed light on understanding human egalitarianism.

  11. Defining operating rules for mitigation of drought effects on water supply systems

    NASA Astrophysics Data System (ADS)

    Rossi, G.; Caporali, E.; Garrote, L.; Federici, G. V.

    2012-04-01

    Reservoirs play a pivotal role in the regulation and management of water supply systems, especially during drought periods. Optimizing reservoir releases under drought mitigation rules is particularly important. The hydrologic state of the system is evaluated by defining threshold values expressed in probabilistic terms. Risk deficit curves are used to reduce the ensemble of possible rules for simulation. Threshold values can be linked to specific actions in an operational context at different levels of severity, i.e., normal, pre-alert, alert and emergency scenarios. A simplified model of the water resources system is built to evaluate the threshold values and the management rules. The threshold values are defined considering the probability of satisfying a given fraction of the demand in a certain time horizon, and are validated with a long-term simulation that takes into account the characteristics of the evaluated system. The threshold levels determine curves that define reservoir releases as a function of existing storage volume. A demand reduction is associated with each threshold level. The rules to manage the system in drought conditions, the threshold levels and the reductions are optimized using long-term simulations with different hypothesized states of the system. Synthetic flow sequences with the same statistical properties as the historical ones are produced to evaluate the system behaviour. The performance of different reduction values and threshold curves is evaluated using different objective functions and performance indices. The methodology is applied to the Firenze-Prato-Pistoia urban area in central Tuscany, Italy. The demand centres considered are Firenze and Bagno a Ripoli, which according to the 2001 ISTAT census have a total of 395,000 inhabitants.

  12. Improving Sector Hash Carving with Rule-Based and Entropy-Based Non-Probative Block Filters

    DTIC Science & Technology

    2015-03-01

    0x20 exceeds the histogram rule’s threshold of 256 instances of a single 4-byte value. The 0x20 bytes are part of an Extensible Metadata Platform (XMP...block consists of data separated by NULL bytes of padding. The histogram rule is triggered for the block because the block contains more than 256 4...sdhash can reduce the rate of false positive matches. After characteristic features have been selected, the features are hashed using SHA-1, which creates

  13. Utility of Decision Rules for Transcutaneous Bilirubin Measurements

    PubMed Central

    Burgos, Anthony E.; Flaherman, Valerie; Chung, Esther K.; Simpson, Elizabeth A.; Goyal, Neera K.; Von Kohorn, Isabelle; Dhepyasuwan, Niramol

    2016-01-01

    BACKGROUND: Transcutaneous bilirubin (TcB) meters are widely used for screening newborns for jaundice, with a total serum bilirubin (TSB) measurement indicated when the TcB value is classified as “positive” by using a decision rule. The goal of our study was to assess the clinical utility of 3 recommended TcB screening decision rules. METHODS: Paired TcB/TSB measurements were collected at 34 newborn nursery sites. At 27 sites (sample 1), newborns were routinely screened with a TcB measurement. For sample 2, sites that typically screen with TSB levels also obtained a TcB measurement for the study. Three decision rules to define a positive TcB measurement were evaluated: ≥75th percentile on the Bhutani nomogram, 70% of the phototherapy level, and within 3 mg/dL of the phototherapy threshold. The primary outcome was a TSB level at/above the phototherapy threshold. The rate of false-negative TcB screens and percentage of blood draws avoided were calculated for each decision rule. RESULTS: For sample 1, data were analyzed on 911 paired TcB-TSB measurements from a total of 8316 TcB measurements. False-negative rates were <10% with all decision rules; none identified all 31 newborns with a TSB level at/above the phototherapy threshold. The percentage of blood draws avoided ranged from 79.4% to 90.7%. In sample 2, each rule correctly identified all 8 newborns with TSB levels at/above the phototherapy threshold. CONCLUSIONS: Although all of the decision rules can be used effectively to screen newborns for jaundice, each will “miss” some infants with a TSB level at/above the phototherapy threshold. PMID:27244792
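
    The three screening rules are concrete enough to state directly in code. A sketch, with the age-specific nomogram percentile and the phototherapy level supplied by the caller; the example values are invented.

    ```python
    def tcb_positive(tcb, nomogram_p75, photo_level, rule="p75"):
        """Flag a transcutaneous bilirubin (TcB) screen as positive, i.e.
        requiring a confirmatory TSB draw. All values in mg/dL."""
        if rule == "p75":      # at/above the 75th percentile for age (Bhutani)
            return tcb >= nomogram_p75
        if rule == "70pct":    # at/above 70% of the phototherapy level
            return tcb >= 0.70 * photo_level
        if rule == "within3":  # within 3 mg/dL of the phototherapy threshold
            return tcb >= photo_level - 3.0
        raise ValueError(rule)

    # A 48-hour-old infant: TcB 9.8, nomogram P75 = 9.5, phototherapy level 15.
    for r in ("p75", "70pct", "within3"):
        print(r, tcb_positive(9.8, nomogram_p75=9.5, photo_level=15.0, rule=r))
    # -> p75 True, 70pct False, within3 False: the rules can disagree.
    ```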

  14. Explosive site percolation with a product rule.

    PubMed

    Choi, Woosik; Yook, Soon-Hyung; Kim, Yup

    2011-08-01

    We study site percolation under an Achlioptas process with a product rule in a two-dimensional square lattice. From measurement of the cluster size distribution P(s), we find that P(s) has a very robust power-law regime followed by a stable hump near the transition threshold. Based on careful analysis of the P(s) distribution, we show that the transition should be discontinuous. The existence of a hysteresis loop in the order parameter also verifies that the transition is discontinuous in two dimensions. Moreover, we show that the nature of the transition under the product rule is not the same as under a sum rule in two dimensions.
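
    For readers unfamiliar with Achlioptas processes: at each step two candidate links are examined and the one whose endpoint clusters have the smaller size product is kept. The sketch below implements the graph (bond) analogue with union-find for brevity; the paper itself studies the site version on a two-dimensional square lattice.

    ```python
    import random

    class DSU:
        """Union-find with union by size."""
        def __init__(self, n):
            self.parent, self.size = list(range(n)), [1] * n
        def find(self, x):
            while self.parent[x] != x:
                self.parent[x] = self.parent[self.parent[x]]  # path halving
                x = self.parent[x]
            return x
        def union(self, a, b):
            ra, rb = self.find(a), self.find(b)
            if ra == rb:
                return
            if self.size[ra] < self.size[rb]:
                ra, rb = rb, ra
            self.parent[rb] = ra
            self.size[ra] += self.size[rb]

    n = 10_000
    dsu, rng = DSU(n), random.Random(0)

    def size_product(edge):
        return dsu.size[dsu.find(edge[0])] * dsu.size[dsu.find(edge[1])]

    for _ in range(n):
        e1 = (rng.randrange(n), rng.randrange(n))
        e2 = (rng.randrange(n), rng.randrange(n))
        dsu.union(*min((e1, e2), key=size_product))  # product rule choice

    print("largest cluster:", max(dsu.size[dsu.find(i)] for i in range(n)))
    ```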

  15. Wavelet tree structure based speckle noise removal for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Yuan, Xin; Liu, Xuan; Liu, Yang

    2018-02-01

    We report a new speckle noise removal algorithm in optical coherence tomography (OCT). Though wavelet domain thresholding algorithms have demonstrated superior advantages in suppressing noise magnitude and preserving image sharpness in OCT, the wavelet tree structure has not been investigated in previous applications. In this work, we propose an adaptive wavelet thresholding algorithm via exploiting the tree structure in wavelet coefficients to remove the speckle noise in OCT images. The threshold for each wavelet band is adaptively selected following a special rule to retain the structure of the image across different wavelet layers. Our results demonstrate that the proposed algorithm outperforms conventional wavelet thresholding, with significant advantages in preserving image features.
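
    The tree-structured rule itself is not reproduced here, but the per-band adaptive wavelet thresholding it builds on can be sketched with PyWavelets, using a BayesShrink-style threshold as an assumed stand-in for the paper's band-selection rule.

    ```python
    import numpy as np
    import pywt  # PyWavelets

    def denoise(img, wavelet="db4", level=3):
        coeffs = pywt.wavedec2(img, wavelet, level=level)
        # Robust noise-scale estimate from the finest diagonal band (MAD).
        sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
        out = [coeffs[0]]               # keep the approximation band as-is
        for bands in coeffs[1:]:
            shrunk = []
            for band in bands:
                # BayesShrink-style adaptive threshold for this band.
                sigma_x = np.sqrt(max(band.var() - sigma**2, 1e-12))
                shrunk.append(pywt.threshold(band, sigma**2 / sigma_x, mode="soft"))
            out.append(tuple(shrunk))
        return pywt.waverec2(out, wavelet)

    noisy = 1.0 + np.random.default_rng(3).normal(0, 0.2, (128, 128))
    print(denoise(noisy).shape)
    ```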

  16. Comparison of two insulin assays for first-phase insulin release in type 1 diabetes prediction and prevention studies

    PubMed Central

    Mahon, Jeffrey L.; Beam, Craig A.; Marcovina, Santica M.; Boulware, David C.; Palmer, Jerry P.; Winter, William E.; Skyler, Jay S.; Krischer, Jeffrey P.

    2018-01-01

    Background Detection of below-threshold first-phase insulin release or FPIR (1 + 3 minute insulin concentrations during an intravenous glucose tolerance test [IVGTT]) is important in type 1 diabetes prediction and prevention studies including the TrialNet Oral Insulin Prevention Trial. We assessed whether an insulin immunoenzymometric assay (IEMA) could replace the less practical but current standard of a radioimmunoassay (RIA) for FPIR. Methods One hundred thirty-three islet autoantibody positive relatives of persons with type 1 diabetes underwent 161 IVGTTs. Insulin concentrations were measured by both assays in 1056 paired samples. A rule classifying FPIR (below-threshold, above-threshold, uncertain) by the IEMA was derived and validated against FPIR by the RIA. Results The insulin IEMA-based rule accurately classified below- and above-threshold FPIRs by the RIA in 110/161 (68%) IVGTTs, but was uncertain in 51/161 (32%) tests for which FPIR by RIA is needed. An uncertain FPIR by the IEMA was more likely among below-threshold vs above-threshold FPIRs by the RIA (64% [30/47] vs. 18% [21/114], respectively; p < 0.05). Conclusions An insulin IEMA for FPIR in subjects at risk for type 1 diabetes accurately determined below- and above-threshold FPIRs in 2/3 of tests relative to the current standard of the insulin RIA, but could not reliably classify the remaining FPIRs. TrialNet is limiting the insulin RIA for FPIR to the latter given the practical advantages of the more specific IEMA. PMID:21843518

  17. The asymmetry of U.S. monetary policy: Evidence from a threshold Taylor rule with time-varying threshold values

    NASA Astrophysics Data System (ADS)

    Zhu, Yanli; Chen, Haiqiang

    2017-05-01

    In this paper, we revisit the issue of whether U.S. monetary policy is asymmetric by estimating a forward-looking threshold Taylor rule with quarterly data from 1955 to 2015. To capture potential heterogeneity in the regime-shift mechanism under different economic conditions, we modify the threshold model by treating the threshold value as a latent variable following an autoregressive (AR) dynamic process. We use the unemployment rate as the threshold variable and separate the sample into two periods: expansion periods and recession periods. Our findings support that U.S. monetary policy operations are asymmetric across these two regimes. More precisely, the monetary authority tends to implement an active Taylor rule with a weaker response to the inflation gap (the deviation of inflation from its target) and a stronger response to the output gap (the deviation of output from its potential level) in recession periods. The threshold value, interpreted as the targeted unemployment rate of the monetary authorities, exhibits significant time-varying properties, confirming the conjecture that policy makers may adjust their reference point for the unemployment rate to reflect their assessment of the health of the general economy.
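
    The regime-switching structure is compact to write down. A toy version with a fixed unemployment threshold and invented response coefficients; the paper estimates the coefficients and lets the threshold itself follow a latent autoregressive process.

    ```python
    def taylor_rate(inflation_gap, output_gap, unemployment, u_star,
                    r_neutral=2.0, pi_target=2.0):
        """Threshold Taylor rule: the policy response switches regime when
        unemployment crosses the threshold u_star (coefficients invented)."""
        if unemployment > u_star:   # recession regime
            a_pi, a_y = 0.8, 1.0    # weaker on inflation, stronger on output
        else:                       # expansion regime
            a_pi, a_y = 1.5, 0.5
        return r_neutral + pi_target + a_pi * inflation_gap + a_y * output_gap

    # Same gaps, different regimes -> different prescribed policy rates.
    print(taylor_rate(0.5, -2.0, unemployment=7.5, u_star=6.0))  # recession
    print(taylor_rate(0.5, -2.0, unemployment=4.5, u_star=6.0))  # expansion
    ```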

  18. 16 CFR 801.20 - Acquisitions subsequent to exceeding threshold.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Acquisitions subsequent to exceeding threshold. 801.20 Section 801.20 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS, STATEMENTS AND INTERPRETATIONS UNDER THE HART-SCOTT-RODINO ANTITRUST IMPROVEMENTS ACT OF 1976 COVERAGE RULES § 801.20 Acquisitions subsequent to exceeding...

  19. 16 CFR 801.20 - Acquisitions subsequent to exceeding threshold.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Acquisitions subsequent to exceeding threshold. 801.20 Section 801.20 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS, STATEMENTS AND INTERPRETATIONS UNDER THE HART-SCOTT-RODINO ANTITRUST IMPROVEMENTS ACT OF 1976 COVERAGE RULES § 801.20 Acquisitions subsequent to exceeding...

  20. A novel association rule mining approach using TID intermediate itemset.

    PubMed

    Aqra, Iyad; Herawan, Tutut; Abdul Ghani, Norjihan; Akhunzada, Adnan; Ali, Akhtar; Bin Razali, Ramdan; Ilahi, Manzoor; Raymond Choo, Kim-Kwang

    2018-01-01

    Designing an efficient association rule mining (ARM) algorithm for multilevel knowledge-based transactional databases that is appropriate for real-world deployment is of paramount concern. However, dynamic decision making that needs to modify the threshold, either to minimize or maximize the output knowledge, forces extant state-of-the-art algorithms to rescan the entire database; the process incurs heavy computation cost and is not feasible for real-time applications. This paper efficiently addresses the problem of dynamic threshold updating. It contributes a novel ARM approach that creates an intermediate itemset and applies a threshold to extract categorical frequent itemsets under diverse threshold values, improving overall efficiency because the whole database no longer needs to be scanned. After the entire itemset structure is built, the real support can be obtained without rebuilding it (e.g., itemset TID lists are intersected to obtain the actual support). Moreover, the algorithm supports extracting many frequent itemsets according to a pre-determined minimum support with an independent purpose. Additionally, experimental results demonstrate that the proposed approach can be deployed in any mining system in a fully parallel mode, increasing the efficiency of the real-time association rule discovery process. The proposed approach outperforms the extant state of the art, reducing computation cost, increasing accuracy, and producing all possible itemsets.
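
    The core trick, intersecting per-item transaction-ID (TID) lists so that support can be re-evaluated under any threshold without rescanning the transactions, fits in a few lines. A sketch on a toy four-transaction database:

    ```python
    from functools import reduce

    transactions = {
        1: {"a", "b", "c"},
        2: {"a", "c"},
        3: {"b", "c", "d"},
        4: {"a", "b", "c", "d"},
    }

    # Build the intermediate structure once: item -> set of transaction IDs.
    tidlists = {}
    for tid, items in transactions.items():
        for item in items:
            tidlists.setdefault(item, set()).add(tid)

    def support(itemset):
        """Actual support via TID-list intersection; no database rescan."""
        return len(reduce(set.intersection, (tidlists[i] for i in itemset)))

    for minsup in (2, 3):   # re-query under different thresholds
        frequent = [s for s in ({"a", "c"}, {"b", "d"}, {"a", "b", "c"})
                    if support(s) >= minsup]
        print(minsup, frequent)
    ```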

  1. A novel association rule mining approach using TID intermediate itemset

    PubMed Central

    Ali, Akhtar; Bin Razali, Ramdan; Ilahi, Manzoor; Raymond Choo, Kim-Kwang

    2018-01-01

    Designing an efficient association rule mining (ARM) algorithm for multilevel knowledge-based transactional databases that is appropriate for real-world deployment is of paramount concern. However, dynamic decision making that needs to modify the threshold, either to minimize or maximize the output knowledge, forces extant state-of-the-art algorithms to rescan the entire database; the process incurs heavy computation cost and is not feasible for real-time applications. This paper efficiently addresses the problem of dynamic threshold updating. It contributes a novel ARM approach that creates an intermediate itemset and applies a threshold to extract categorical frequent itemsets under diverse threshold values, improving overall efficiency because the whole database no longer needs to be scanned. After the entire itemset structure is built, the real support can be obtained without rebuilding it (e.g., itemset TID lists are intersected to obtain the actual support). Moreover, the algorithm supports extracting many frequent itemsets according to a pre-determined minimum support with an independent purpose. Additionally, experimental results demonstrate that the proposed approach can be deployed in any mining system in a fully parallel mode, increasing the efficiency of the real-time association rule discovery process. The proposed approach outperforms the extant state of the art, reducing computation cost, increasing accuracy, and producing all possible itemsets. PMID:29351287

  2. Development of an Alert System to Detect Drug Interactions with Herbal Supplements using Medical Record Data.

    PubMed

    Archer, Melissa; Proulx, Joshua; Shane-McWhorter, Laura; Bray, Bruce E; Zeng-Treitler, Qing

    2014-01-01

    While potential medication-to-medication interaction alerting engines exist in many clinical applications, few systems exist to automatically alert on potential medication-to-herbal-supplement interactions. We have developed a preliminary knowledge base and rule-based alerting engine that detects 259 potential interactions between 9 supplements, 62 cardiac medications, and 19 drug classes. The rules engine takes into consideration 12 patient risk factors and 30 interaction warning signs to help determine which of three alert levels to assign to each potential interaction. A formative evaluation was conducted with two clinicians to set initial thresholds for each alert level. Additional work is planned to add more supplement interactions, risk factors, and warning signs, as well as to continue setting and adjusting the inputs and thresholds for each potential interaction.

  3. Vehicle tracking using fuzzy-based vehicle detection window with adaptive parameters

    NASA Astrophysics Data System (ADS)

    Chitsobhuk, Orachat; Kasemsiri, Watjanapong; Glomglome, Sorayut; Lapamonpinyo, Pipatphon

    2018-04-01

    In this paper, a fuzzy-based vehicle tracking system is proposed. The proposed system consists of two main processes: vehicle detection and vehicle tracking. In the first process, the Gradient-based Adaptive Threshold Estimation (GATE) algorithm is adopted to provide a suitable threshold value for Sobel edge detection. The estimated threshold adapts to the diverse illumination conditions encountered throughout the day, which yields better vehicle detection performance than a fixed user-defined threshold. In the second process, this paper proposes a novel vehicle tracking algorithm, Fuzzy-based Vehicle Analysis (FBA), to reduce false tracking estimates caused by the uneven edges of large vehicles and by vehicles changing lanes. The FBA algorithm employs the average edge density together with the Horizontal Moving Edge Detection (HMED) algorithm, adopting fuzzy rule-based logic to rectify the tracking. The experimental results demonstrate that the proposed system achieves high vehicle detection accuracy of about 98.22% with a low false detection rate of about 3.92%.
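
    One simple form of gradient-based adaptive thresholding can be sketched as follows; this is an illustrative stand-in driven by gradient-magnitude statistics, not a reproduction of the paper's GATE estimator, and the factor k is invented.

    ```python
    import numpy as np
    from scipy import ndimage

    def adaptive_edge_threshold(frame, k=1.5):
        """Binary edge map with a threshold set from the statistics of the
        Sobel gradient magnitude, so it adapts as illumination changes."""
        gx = ndimage.sobel(frame.astype(float), axis=1)
        gy = ndimage.sobel(frame.astype(float), axis=0)
        mag = np.hypot(gx, gy)
        t = mag.mean() + k * mag.std()   # brighter scenes raise the threshold
        return mag >= t

    frame = np.random.default_rng(6).integers(0, 256, (120, 160))
    edges = adaptive_edge_threshold(frame)
    print(edges.mean())  # fraction of pixels marked as edges
    ```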

  4. Self-Associations Influence Task-Performance through Bayesian Inference

    PubMed Central

    Bengtsson, Sara L.; Penny, Will D.

    2013-01-01

    The way we think about ourselves impacts greatly on our behavior. This paper describes a behavioral study and a computational model that shed new light on this important area. Participants were primed “clever” and “stupid” using a scrambled sentence task, and we measured the effect on response time and error-rate on a rule-association task. First, we observed a confirmation bias effect in that associations to being “stupid” led to a gradual decrease in performance, whereas associations to being “clever” did not. Second, we observed that the activated self-concepts selectively modified attention toward one’s performance. There was an early to late double dissociation in RTs in that primed “clever” resulted in RT increase following error responses, whereas primed “stupid” resulted in RT increase following correct responses. We propose a computational model of subjects’ behavior based on the logic of the experimental task that involves two processes; memory for rules and the integration of rules with subsequent visual cues. The model incorporates an adaptive decision threshold based on Bayes rule, whereby decision thresholds are increased if integration was inferred to be faulty. Fitting the computational model to experimental data confirmed our hypothesis that priming affects the memory process. This model explains both the confirmation bias and double dissociation effects and demonstrates that Bayesian inferential principles can be used to study the effect of self-concepts on behavior. PMID:23966937

  5. Estimation of the diagnostic threshold accounting for decision costs and sampling uncertainty.

    PubMed

    Skaltsa, Konstantina; Jover, Lluís; Carrasco, Josep Lluís

    2010-10-01

    Medical diagnostic tests are used to classify subjects as non-diseased or diseased. The classification rule usually consists of classifying subjects using the values of a continuous marker that is dichotomised by means of a threshold. Here, the optimum threshold estimate is found by minimising a cost function that accounts for both decision costs and sampling uncertainty. The cost function is optimised either analytically in a normal distribution setting or empirically in a free-distribution setting when the underlying probability distributions of diseased and non-diseased subjects are unknown. Inference of the threshold estimates is based on approximate analytically standard errors and bootstrap-based approaches. The performance of the proposed methodology is assessed by means of a simulation study, and the sample size required for a given confidence interval precision and sample size ratio is also calculated. Finally, a case example based on previously published data concerning the diagnosis of Alzheimer's patients is provided in order to illustrate the procedure.
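
    Setting aside the sampling-uncertainty component (which the paper handles with approximate analytic standard errors and the bootstrap), the decision-cost part of the optimisation is easy to reproduce in the normal-distribution setting. The distribution parameters, prevalence, and costs below are illustrative.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Assumed normal setting: marker ~ N(0, 1) in non-diseased subjects and
    # N(1.5, 1.2) in diseased subjects; false negatives cost more than
    # false positives.
    mu0, sd0, mu1, sd1 = 0.0, 1.0, 1.5, 1.2
    prevalence, c_fn, c_fp = 0.2, 5.0, 1.0

    def expected_cost(t):
        fnr = norm.cdf(t, mu1, sd1)         # diseased classified negative
        fpr = 1 - norm.cdf(t, mu0, sd0)     # non-diseased classified positive
        return prevalence * c_fn * fnr + (1 - prevalence) * c_fp * fpr

    grid = np.linspace(-3, 5, 2001)
    t_opt = grid[np.argmin([expected_cost(t) for t in grid])]
    print(f"cost-minimising threshold ~ {t_opt:.2f}")
    ```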

  6. A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks

    PubMed Central

    Alemi, Alireza; Baldassi, Carlo; Brunel, Nicolas; Zecchina, Riccardo

    2015-01-01

    Understanding the theoretical foundations of how memories are encoded and retrieved in neural populations is a central challenge in neuroscience. A popular theoretical scenario for modeling memory function is the attractor neural network scenario, whose prototype is the Hopfield model. The model simplicity and the locality of the synaptic update rules come at the cost of a poor storage capacity, compared with the capacity achieved with perceptron learning algorithms. Here, by transforming the perceptron learning rule, we present an online learning rule for a recurrent neural network that achieves near-maximal storage capacity without an explicit supervisory error signal, relying only upon locally accessible information. The fully-connected network consists of excitatory binary neurons with plastic recurrent connections and non-plastic inhibitory feedback stabilizing the network dynamics; the memory patterns to be memorized are presented online as strong afferent currents, producing a bimodal distribution for the neuron synaptic inputs. Synapses corresponding to active inputs are modified as a function of the value of the local fields with respect to three thresholds. Above the highest threshold, and below the lowest threshold, no plasticity occurs. In between these two thresholds, potentiation/depression occurs when the local field is above/below an intermediate threshold. We simulated and analyzed a network of binary neurons implementing this rule and measured its storage capacity for different sizes of the basins of attraction. The storage capacity obtained through numerical simulations is shown to be close to the value predicted by analytical calculations. We also measured the dependence of capacity on the strength of external inputs. Finally, we quantified the statistics of the resulting synaptic connectivity matrix, and found that both the fraction of zero weight synapses and the degree of symmetry of the weight matrix increase with the number of stored patterns. PMID:26291608
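
    A schematic reading of the update rule in code: plasticity applies only to synapses with active inputs, is gated off outside the two outer thresholds, and its sign is set by the intermediate threshold. The thresholds, learning rate, and sizes are invented, and the sketch omits the recurrent dynamics and inhibitory feedback the abstract describes.

    ```python
    import numpy as np

    def three_threshold_update(w, x, h, th_low, th_mid, th_high, eta=0.01):
        """One plasticity step for a single neuron.
        w: afferent weights; x: binary input pattern; h: local field.
        No plasticity above th_high or below th_low; in between, potentiate
        if h > th_mid, otherwise depress, on active synapses only."""
        if th_low < h < th_high:
            sign = 1.0 if h > th_mid else -1.0
            w = w + eta * sign * x
        return w

    rng = np.random.default_rng(4)
    w = rng.normal(0.0, 0.1, 100)
    x = (rng.random(100) < 0.3).astype(float)   # pattern to be memorised
    h = w @ x + 2.0   # strong afferent current during pattern presentation
    w = three_threshold_update(w, x, h, th_low=0.0, th_mid=1.5, th_high=3.0)
    print(w[:5])
    ```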

  7. A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks.

    PubMed

    Alemi, Alireza; Baldassi, Carlo; Brunel, Nicolas; Zecchina, Riccardo

    2015-08-01

    Understanding the theoretical foundations of how memories are encoded and retrieved in neural populations is a central challenge in neuroscience. A popular theoretical scenario for modeling memory function is the attractor neural network scenario, whose prototype is the Hopfield model. The model simplicity and the locality of the synaptic update rules come at the cost of a poor storage capacity, compared with the capacity achieved with perceptron learning algorithms. Here, by transforming the perceptron learning rule, we present an online learning rule for a recurrent neural network that achieves near-maximal storage capacity without an explicit supervisory error signal, relying only upon locally accessible information. The fully-connected network consists of excitatory binary neurons with plastic recurrent connections and non-plastic inhibitory feedback stabilizing the network dynamics; the memory patterns to be memorized are presented online as strong afferent currents, producing a bimodal distribution for the neuron synaptic inputs. Synapses corresponding to active inputs are modified as a function of the value of the local fields with respect to three thresholds. Above the highest threshold, and below the lowest threshold, no plasticity occurs. In between these two thresholds, potentiation/depression occurs when the local field is above/below an intermediate threshold. We simulated and analyzed a network of binary neurons implementing this rule and measured its storage capacity for different sizes of the basins of attraction. The storage capacity obtained through numerical simulations is shown to be close to the value predicted by analytical calculations. We also measured the dependence of capacity on the strength of external inputs. Finally, we quantified the statistics of the resulting synaptic connectivity matrix, and found that both the fraction of zero weight synapses and the degree of symmetry of the weight matrix increase with the number of stored patterns.

  8. Quantum-Secret-Sharing Scheme Based on Local Distinguishability of Orthogonal Seven-Qudit Entangled States

    NASA Astrophysics Data System (ADS)

    Liu, Cheng-Ji; Li, Zhi-Hui; Bai, Chen-Ming; Si, Meng-Meng

    2018-02-01

    The concept of judgment space was proposed by Wang et al. (Phys. Rev. A 95, 022320, 2017) and used to study some important properties of quantum entangled states based on local distinguishability. In this study, we construct 15 kinds of seven-qudit quantum entangled states in the sense of permutation, calculate their judgment space and propose a distinguishability rule to make the judgment space clearer. Based on this rule, we study the local distinguishability of the 15 kinds of seven-qudit quantum entangled states and then propose a (k, n) threshold quantum secret sharing scheme. Finally, we analyze the security of the scheme.

  9. Comparison of two insulin assays for first-phase insulin release in type 1 diabetes prediction and prevention studies.

    PubMed

    Mahon, Jeffrey L; Beam, Craig A; Marcovina, Santica M; Boulware, David C; Palmer, Jerry P; Winter, William E; Skyler, Jay S; Krischer, Jeffrey P

    2011-11-20

    Detection of below-threshold first-phase insulin release or FPIR (1+3 minute insulin concentrations during an intravenous glucose tolerance test [IVGTT]) is important in type 1 diabetes prediction and prevention studies including the TrialNet Oral Insulin Prevention Trial. We assessed whether an insulin immunoenzymometric assay (IEMA) could replace the less practical but current standard of a radioimmunoassay (RIA) for FPIR. One hundred thirty-three islet autoantibody positive relatives of persons with type 1 diabetes underwent 161 IVGTTs. Insulin concentrations were measured by both assays in 1056 paired samples. A rule classifying FPIR (below-threshold, above-threshold, uncertain) by the IEMA was derived and validated against FPIR by the RIA. The insulin IEMA-based rule accurately classified below- and above-threshold FPIRs by the RIA in 110/161 (68%) IVGTTs, but was uncertain in 51/161 (32%) tests for which FPIR by RIA is needed. An uncertain FPIR by the IEMA was more likely among below-threshold vs above-threshold FPIRs by the RIA (64% [30/47] vs. 18% [21/114], respectively; p<0.05). An insulin IEMA for FPIR in subjects at risk for type 1 diabetes accurately determined below- and above-threshold FPIRs in 2/3 of tests relative to the current standard of the insulin RIA, but could not reliably classify the remaining FPIRs. TrialNet is limiting the insulin RIA for FPIR to the latter given the practical advantages of the more specific IEMA. Copyright © 2011 Elsevier B.V. All rights reserved.
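
    The derived classification rule itself is not printed in the abstract, but its three-zone structure is simple; a minimal sketch with hypothetical cutoffs standing in for the derived ones:

```python
def classify_fpir_iema(fpir_iema, low_cut=60.0, high_cut=100.0):
    """Three-zone call on an IEMA-measured FPIR (sum of the 1- and 3-minute
    insulin concentrations).  Both cutoffs are hypothetical placeholders for
    the rule derived in the study; an 'uncertain' result still requires the
    RIA-based FPIR."""
    if fpir_iema < low_cut:
        return "below-threshold"
    if fpir_iema > high_cut:
        return "above-threshold"
    return "uncertain"

print(classify_fpir_iema(45.0))   # below-threshold
print(classify_fpir_iema(80.0))   # uncertain -> fall back to the RIA
```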

  10. Time series regression-based pairs trading in the Korean equities market

    NASA Astrophysics Data System (ADS)

    Kim, Saejoon; Heo, Jun

    2017-07-01

    Pairs trading is an instance of statistical arbitrage that relies on heavy quantitative data analysis to profit by capitalising low-risk trading opportunities provided by anomalies of related assets. A key element in pairs trading is the rule by which open and close trading triggers are defined. This paper investigates the use of time series regression to define the rule which has previously been identified with fixed threshold-based approaches. Empirical results indicate that our approach may yield significantly increased excess returns compared to ones obtained by previous approaches on large capitalisation stocks in the Korean equities market.
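
    A minimal sketch of the idea, assuming a rolling ordinary-least-squares regression of one leg on the other with z-scored residuals as the trigger signal; the window length and z cutoffs are illustrative, and the paper's exact trigger definition may differ:

```python
import numpy as np

def regression_triggers(pa, pb, window=60, z_open=2.0, z_close=0.5):
    """Generate open/close trading triggers for the pair (pa, pb) from a
    rolling regression of pa on pb.  Open when the regression residual's
    z-score is large, close when it reverts toward zero."""
    pa, pb = np.asarray(pa, float), np.asarray(pb, float)
    signals = []
    for t in range(window, len(pa)):
        x, y = pb[t - window:t], pa[t - window:t]
        beta, alpha = np.polyfit(x, y, 1)             # rolling hedge ratio and intercept
        resid = y - (alpha + beta * x)
        z = (pa[t] - (alpha + beta * pb[t])) / resid.std()
        if abs(z) > z_open:
            signals.append((t, "open", float(np.sign(z))))
        elif abs(z) < z_close:
            signals.append((t, "close", 0.0))
    return signals
```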

  11. A Multinomial Model for Identifying Significant Pure-Tone Threshold Shifts

    ERIC Educational Resources Information Center

    Schlauch, Robert S.; Carney, Edward

    2007-01-01

    Purpose: Significant threshold differences on retest for pure-tone audiometry are often evaluated by application of ad hoc rules, such as a shift in a pure-tone average or in 2 adjacent frequencies that exceeds a predefined amount. Rules that are so derived do not consider the probability of observing a particular audiogram. Methods: A general…

  12. 77 FR 19126 - Defense Federal Acquisition Regulation Supplement: New Threshold for Peer Reviews of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... Federal Acquisition Regulation Supplement: New Threshold for Peer Reviews of Noncompetitive Contracts... Regulation Supplement (DFARS) to modify the threshold for noncompetitive contract peer reviews. DATES...-6088. SUPPLEMENTARY INFORMATION: I. Background This final rule reduces the threshold for DoD peer...

  13. Is cooperation viable in mobile organisms? Simple Walk Away rule favors the evolution of cooperation in groups

    PubMed Central

    Aktipis, C. Athena

    2011-01-01

    The evolution of cooperation through partner choice mechanisms is often thought to involve relatively complex cognitive abilities. Using agent-based simulations I model a simple partner choice rule, the ‘Walk Away’ rule, where individuals stay in groups that provide higher returns (by virtue of having more cooperators), and ‘Walk Away’ from groups providing low returns. Implementing this conditional movement rule in a public goods game leads to a number of interesting findings: 1) cooperators have a selective advantage when thresholds are high, corresponding to low tolerance for defectors, 2) high thresholds lead to high initial rates of movement and low final rates of movement (after selection), and 3) as cooperation is selected, the population undergoes a spatial transition from high migration (and many small, ephemeral groups) to low migration (and large, stable groups). These results suggest that the very simple ‘Walk Away’ rule of leaving uncooperative groups can favor the evolution of cooperation, and that cooperation can evolve in populations in which individuals are able to move in response to local social conditions. A diverse array of organisms are able to leave degraded physical or social environments. The ubiquitous nature of conditional movement suggests that ‘Walk Away’ dynamics may play an important role in the evolution of social behavior in both cognitively complex and cognitively simple organisms. PMID:21666771
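
    A minimal sketch of the rule under an assumed linear public-goods payoff (the paper's exact payoff parameterization may differ):

```python
from dataclasses import dataclass

@dataclass
class Group:
    cooperators: int
    size: int
    benefit: float = 3.0   # public-good multiplier (illustrative)

def walks_away(group: Group, threshold: float) -> bool:
    """'Walk Away' rule: leave whenever the per-capita return falls below
    the tolerance threshold (a high threshold means low tolerance for
    defectors); otherwise stay."""
    per_capita_return = group.benefit * group.cooperators / group.size
    return per_capita_return < threshold

# A group of 10 with 4 cooperators returns 1.2 per head, so an agent with
# threshold 1.5 leaves, while one with threshold 1.0 stays.
print(walks_away(Group(cooperators=4, size=10), threshold=1.5))  # True
print(walks_away(Group(cooperators=4, size=10), threshold=1.0))  # False
```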

  14. The variance of length of stay and the optimal DRG outlier payments.

    PubMed

    Felder, Stefan

    2009-09-01

    Prospective payment schemes in health care often include supply-side insurance for cost outliers. In hospital reimbursement, prospective payments for patient discharges, based on their classification into diagnosis related groups (DRGs), are complemented by outlier payments for long-stay patients. The outlier scheme fixes the length of stay (LOS) threshold, constraining the profit risk of the hospitals. In most DRG systems, this threshold increases with the standard deviation of the LOS distribution. The present paper addresses the adequacy of this DRG outlier threshold rule for risk-averse hospitals with preferences depending on the expected value and the variance of profits. It first shows that the optimal threshold solves the hospital's tradeoff between higher profit risk and lower premium loading payments. It then demonstrates, for normally distributed truncated LOS, that the optimal outlier threshold in fact decreases as the standard deviation increases.
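
    For contrast with the paper's finding, a sketch of the conventional rule it examines, in which the outlier threshold grows with the standard deviation of LOS; the multiplier k is illustrative, as real DRG systems use scheme-specific constants:

```python
import statistics

def conventional_outlier_threshold(los_sample, k=2.0):
    """Long-stay outlier threshold under the conventional rule:
    mean LOS plus k standard deviations."""
    return statistics.mean(los_sample) + k * statistics.stdev(los_sample)

# Days of stay for one DRG (illustrative data).
print(conventional_outlier_threshold([3, 5, 4, 8, 6, 12, 5, 4]))
```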

  15. hERG blocking potential of acids and zwitterions characterized by three thresholds for acidity, size and reactivity.

    PubMed

    Nikolov, Nikolai G; Dybdahl, Marianne; Jónsdóttir, Svava Ó; Wedebye, Eva B

    2014-11-01

    Ionization is a key factor in hERG K(+) channel blocking, and acids and zwitterions are known to be less probable hERG blockers than bases and neutral compounds. However, a considerable number of acidic compounds block hERG, and the physico-chemical attributes which discriminate acidic blockers from acidic non-blockers have not been fully elucidated. We propose a rule for prediction of hERG blocking by acids and zwitterionic ampholytes based on thresholds for only three descriptors related to acidity, size and reactivity. The training set of 153 acids and zwitterionic ampholytes was predicted with a concordance of 91% by a decision tree based on the rule. Two external validations were performed with sets of 35 and 48 observations, respectively, both showing concordances of 91%. In addition, a global QSAR model of hERG blocking was constructed based on a large diverse training set of 1374 chemicals covering all ionization classes, externally validated showing high predictivity and compared to the decision tree. The decision tree was found to be superior for the acids and zwitterionic ampholytes classes. Copyright © 2014 Elsevier Ltd. All rights reserved.
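
    A decision-tree sketch of the rule's shape; the branch logic and all three cutoffs are hypothetical placeholders, since the abstract names the descriptor types (acidity, size, reactivity) but not the derived values:

```python
def predicted_herg_blocker(pka, size, reactivity,
                           pka_cut=6.0, size_cut=500.0, react_cut=0.5):
    """Sketch of a three-threshold decision tree for acids and zwitterionic
    ampholytes.  All cutoffs and the branch order are hypothetical; the
    paper derives the actual rule from its 153-compound training set."""
    if pka < pka_cut:                       # strongly acidic, fully ionized
        return False                        # unlikely blocker
    if size < size_cut and reactivity < react_cut:
        return False                        # small and unreactive: unlikely blocker
    return True
```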

  16. 77 FR 76839 - Home Mortgage Disclosure (Regulation C): Adjustment To Asset-Size Exemption Threshold

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-31

    ...The Bureau of Consumer Financial Protection (Bureau) is publishing a final rule amending the official commentary that interprets the requirements of the Bureau's Regulation C (Home Mortgage Disclosure) to reflect a change in the asset-size exemption threshold for banks, savings associations, and credit unions based on the annual percentage change in the Consumer Price Index for Urban Wage Earners and Clerical Workers (CPI-W). The exemption threshold is adjusted to increase to $42 million from $41 million. The adjustment is based on the 2.23 percent increase in the average of the CPI-W for the 12-month period ending in November 2012. Therefore, banks, savings associations, and credit unions with assets of $42 million or less as of December 31, 2012, are exempt from collecting data in 2013.
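
    The adjustment itself is one line of arithmetic; a sketch assuming the threshold is rounded to the nearest million dollars (the notice states only the resulting figure):

```python
old_threshold = 41_000_000        # asset-size exemption threshold for 2012
cpi_w_increase = 0.0223           # 2.23% rise in the average CPI-W, year to Nov 2012

raw = old_threshold * (1 + cpi_w_increase)          # 41,914,300
new_threshold = round(raw / 1_000_000) * 1_000_000  # round to nearest $1 million (assumed)
print(new_threshold)                                # 42000000 -> the $42 million threshold
```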

  17. 78 FR 79285 - Home Mortgage Disclosure (Regulation C): Adjustment to Asset-Size Exemption Threshold

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-30

    ...The Bureau of Consumer Financial Protection (Bureau) is publishing a final rule amending the official commentary that interprets the requirements of the Bureau's Regulation C (Home Mortgage Disclosure) to reflect a change in the asset-size exemption threshold for banks, savings associations, and credit unions based on the annual percentage change in the Consumer Price Index for Urban Wage Earners and Clerical Workers (CPI-W). The exemption threshold is adjusted to increase to $43 million from $42 million. The adjustment is based on the 1.4 percent increase in the average of the CPI-W for the 12-month period ending in November 2013. Therefore, banks, savings associations, and credit unions with assets of $43 million or less as of December 31, 2013, are exempt from collecting data in 2014.

  18. Does Teacher Certification Program Lead to Better Quality Teachers? Evidence from Indonesia

    ERIC Educational Resources Information Center

    Kusumawardhani, Prita Nurmalia

    2017-01-01

    This paper examines the impact of the teacher certification program in Indonesia in 2007 and 2008 on student and teacher outcomes. I create a rule-based instrumental variable from discontinuities arising from the assignment mechanism of teachers into the certification program. The thresholds are determined empirically. The study applies a two-sample…

  19. Paper Circuits: A Tangible, Low Threshold, Low Cost Entry to Computational Thinking

    ERIC Educational Resources Information Center

    Lee, Victor R.; Recker, Mimi

    2018-01-01

    In this paper, we propose that paper circuitry provides a productive space for exploring aspects of computational thinking, an increasingly critical 21st-century skill for all students. We argue that the creation and operation of paper circuits involve learning about computational concepts such as rule-based constraints, operations, and defined…

  20. 76 FR 65555 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-21

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-65583; File No. SR-ISE-2011-68] Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and Immediate Effectiveness of Proposed Rule Change to Amend the Volume Threshold for Tier-Based Rebates for Qualified Contingent Cross Orders and Solicitation Orders Executed...

  1. 76 FR 77279 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-12

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-65898; File No. SR-ISE-2011-78] Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and Immediate Effectiveness of Proposed Rule Change to Amend the Threshold Levels for Tier-Based Rebates for Qualified Contingent Cross Orders and Solicitation Orders Executed...

  2. Optimal Control Strategy Design Based on Dynamic Programming for a Dual-Motor Coupling-Propulsion System

    PubMed Central

    Zhang, Shuo; Zhang, Chengning; Han, Guangwei; Wang, Qinghui

    2014-01-01

    A dual-motor coupling-propulsion electric bus (DMCPEB) is modeled, and its optimal control strategy is studied in this paper. The necessary dynamic features of energy loss for the subsystems are modeled. A dynamic programming (DP) technique is applied to find the optimal control strategy, including the upshift threshold, downshift threshold, and power split ratio between the main motor and auxiliary motor. Improved control rules are extracted from the DP-based control solution, forming near-optimal control strategies. Simulation results demonstrate that a significant reduction in the energy loss of the running dual-motor coupling-propulsion system (DMCPS) is realized without increasing the frequency of mode switching. PMID:25540814

  3. Optimal control strategy design based on dynamic programming for a dual-motor coupling-propulsion system.

    PubMed

    Zhang, Shuo; Zhang, Chengning; Han, Guangwei; Wang, Qinghui

    2014-01-01

    A dual-motor coupling-propulsion electric bus (DMCPEB) is modeled, and its optimal control strategy is studied in this paper. The necessary dynamic features of energy loss for the subsystems are modeled. A dynamic programming (DP) technique is applied to find the optimal control strategy, including the upshift threshold, downshift threshold, and power split ratio between the main motor and auxiliary motor. Improved control rules are extracted from the DP-based control solution, forming near-optimal control strategies. Simulation results demonstrate that a significant reduction in the energy loss of the running dual-motor coupling-propulsion system (DMCPS) is realized without increasing the frequency of mode switching.

  4. Cost-effectiveness thresholds: pros and cons.

    PubMed

    Bertram, Melanie Y; Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-12-01

    Cost-effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost-effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost-effectiveness thresholds allow cost-effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization's Commission on Macroeconomics in Health suggested cost-effectiveness thresholds based on multiples of a country's per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this - in addition to uncertainty in the modelled cost-effectiveness ratios - can lead to the wrong decision on how to spend health-care resources. Cost-effectiveness information should be used alongside other considerations - e.g. budget impact and feasibility considerations - in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost-effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair.

  5. 77 FR 27550 - Federal Acquisition Regulation; Revision of Cost Accounting Standards Threshold

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-10

    ...] RIN 9000-AM25 Federal Acquisition Regulation; Revision of Cost Accounting Standards Threshold AGENCY... Federal Acquisition Regulation (FAR) to revise the threshold for applicability of cost accounting standards in order to implement a recent rule of the Cost Accounting Standards Board and statutory...

  6. The string-junction picture of multiquark states: an update

    NASA Astrophysics Data System (ADS)

    Rossi, G. C.; Veneziano, G.

    2016-06-01

    We recall and update, both theoretically and phenomenologically, our (nearly) forty-year-old proposal of a string-junction as a necessary complement to the conventional classification of hadrons based just on their quark-antiquark constituents. In that proposal, single (though in general metastable) hadronic states are associated with "irreducible" gauge-invariant operators consisting of Wilson lines (visualized as strings of color flux tubes) that may either end on a quark or an antiquark, or annihilate in triplets at a junction $J$ or an anti-junction $\overline{J}$. For the junction-free sector (ordinary $q\overline{q}$ mesons and glueballs) the picture is supported by large-$N$ (number of colors) considerations as well as by a lattice strong-coupling expansion. Both imply the famous OZI rule suppressing quark-antiquark annihilation diagrams. For hadrons with $J$ and/or $\overline{J}$ constituents the same expansions support our proposal, including its generalization of the OZI rule to the suppression of $J$-$\overline{J}$ annihilation diagrams. Such a rule implies that hadrons with junctions are "mesophobic" and thus unusually narrow if they are below threshold for decaying into as many baryons as their total number of junctions (two for a tetraquark, three for a pentaquark). Experimental support for our claim, based on the observation that narrow multiquark states typically lie below (well above) the relevant baryonic (mesonic) thresholds, will be presented.

  7. Wavelet analysis techniques applied to removing varying spectroscopic background in calibration model for pear sugar content

    NASA Astrophysics Data System (ADS)

    Liu, Yande; Ying, Yibin; Lu, Huishan; Fu, Xiaping

    2005-11-01

    A new method is proposed to eliminate varying background and noise simultaneously for multivariate calibration of Fourier transform near-infrared (FT-NIR) spectral signals. An ideal spectrum signal prototype was constructed based on the FT-NIR spectra used for fruit sugar content measurement. The performances of wavelet-based threshold de-noising approaches with different combinations of wavelet base functions were compared. Three families of wavelet base functions (Daubechies, Symlets, and Coiflets) were applied to assess the performance of the wavelet bases and threshold selection rules in a series of experiments. The experimental results show that the best de-noising performance is reached with the Daubechies 4 or Symlet 4 wavelet base functions. Based on the optimized parameters, wavelet regression models for the sugar content of pear were also developed and resulted in a smaller prediction error than a traditional Partial Least Squares Regression (PLSR) model.
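
    A sketch of this kind of wavelet threshold de-noising with the Daubechies 4 base the study found best, using the PyWavelets package; the universal threshold with soft shrinkage is a common default assumed here, since the abstract does not state which selection rule won:

```python
import numpy as np
import pywt

def denoise_spectrum(signal, wavelet="db4", level=4):
    """Decompose, shrink the detail coefficients, reconstruct."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise scale from finest details
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))    # universal threshold (assumed)
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)

# Usage on a synthetic noisy spectrum.
rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 8 * np.pi, 1024))
denoised = denoise_spectrum(clean + 0.1 * rng.standard_normal(1024))
```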

  8. 77 FR 27545 - Federal Acquisition Regulation; Federal Acquisition Circular 2005-59; Introduction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-10

    ... Revision of Cost 2012-003 Chambers. Accounting Standards Threshold. SUPPLEMENTARY INFORMATION: Summaries... substantial number of small entities. Item III--Revision of Cost Accounting Standards Threshold (FAR Case 2012-003) This final rule revises the cost accounting standards (CAS) threshold in order to implement in...

  9. 77 FR 60907 - Approval and Promulgation of Implementation Plans; Vermont: Prevention of Significant...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-05

    ...) program to establish appropriate emission thresholds for determining which new stationary sources and.... This action affects major stationary sources in Vermont that have GHG emissions above the thresholds... of GHG, and do not limit PSD applicability to GHGs to the higher thresholds in the Tailoring Rule...

  10. 76 FR 35721 - Consumer Leasing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-20

    ...The Board is publishing a final rule amending the staff commentary that interprets the requirements of Regulation M, which implements the Consumer Leasing Act (CLA). Effective July 21, 2011, the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank Act) amends the CLA by increasing the threshold for exempt consumer leases from $25,000 to $50,000. In addition, the Dodd-Frank Act requires that this threshold be adjusted annually by any annual percentage increase in the Consumer Price Index for Urban Wage Earners and Clerical Workers (CPI-W). Accordingly, based on the annual percentage increase in the CPI-W as of June 1, 2011, the Board is adjusting the exemption threshold from $50,000 to $51,800, effective January 1, 2012. Because the Dodd-Frank Act also requires similar adjustments in the Truth in Lending Act's threshold for exempt consumer credit transactions, the Board is making similar amendments to Regulation Z elsewhere in today's Federal Register.

  11. 76 FR 35722 - Truth in Lending

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-20

    ...The Board is publishing a final rule amending the staff commentary that interprets the requirements of Regulation Z, which implements the Truth in Lending Act (TILA). Effective July 21, 2011, the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd- Frank Act) amends TILA by increasing the threshold for exempt consumer credit transactions from $25,000 to $50,000. In addition, the Dodd- Frank Act requires that this threshold be adjusted annually by any annual percentage increase in the Consumer Price Index for Urban Wage Earners and Clerical Workers (CPI-W). Accordingly, based on the annual percentage increase in the CPI-W as of June 1, 2011, the Board is adjusting the exemption threshold from $50,000 to $51,800, effective January 1, 2012. Because the Dodd-Frank Act also requires similar adjustments in the Consumer Leasing Act's threshold for exempt consumer leases, the Board is making similar amendments to Regulation M elsewhere in today's Federal Register.

  12. Knowledge-based tracking algorithm

    NASA Astrophysics Data System (ADS)

    Corbeil, Allan F.; Hawkins, Linda J.; Gilgallon, Paul F.

    1990-10-01

    This paper describes the Knowledge-Based Tracking (KBT) algorithm for which a real-time flight test demonstration was recently conducted at Rome Air Development Center (RADC). In KBT processing, the radar signal in each resolution cell is thresholded at a lower than normal setting to detect low RCS targets. This lower threshold produces a larger than normal false alarm rate. Therefore, additional signal processing, including spectral filtering, CFAR and knowledge-based acceptance testing, is performed to eliminate some of the false alarms. TSC's knowledge-based Track-Before-Detect (TBD) algorithm is then applied to the data from each azimuth sector to detect target tracks. In this algorithm, tentative track templates are formed for each threshold crossing and knowledge-based association rules are applied to the range, Doppler, and azimuth measurements from successive scans. Lastly, an M-association out of N-scan rule is used to declare a detection. This scan-to-scan integration enhances the probability of target detection while maintaining an acceptably low output false alarm rate. For a real-time demonstration of the KBT algorithm, the L-band radar in the Surveillance Laboratory (SL) at RADC was used to illuminate a small Cessna 310 test aircraft. The received radar signal was digitized and processed by an ST-100 Array Processor and VAX computer network in the lab. The ST-100 performed all of the radar signal processing functions, including Moving Target Indicator (MTI) pulse cancelling, FFT Doppler filtering, and CFAR detection. The VAX computers performed the remaining range-Doppler clustering, beamsplitting and TBD processing functions. The KBT algorithm provided a 9.5 dB improvement relative to single scan performance with a nominal real time delay of less than one second between illumination and display.
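
    The final declaration step is a plain M-out-of-N scan rule; a sketch with illustrative m and n, since the abstract does not give the demonstration's values:

```python
def declare_detection(track_hits, m=3, n=5):
    """Declare a detection when at least m of the last n scans produced a
    threshold crossing that associated with the tentative track.
    track_hits is a per-scan sequence of 0/1 association flags."""
    return sum(track_hits[-n:]) >= m

print(declare_detection([1, 0, 1, 1, 0, 1, 1]))  # last 5 scans give 4 hits -> True
```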

  13. Multispectral processing based on groups of resolution elements

    NASA Technical Reports Server (NTRS)

    Richardson, W.; Gleason, J. M.

    1975-01-01

    Several nine-point rules are defined and compared with previously studied rules. One of the rules performed well in boundary areas, but with reduced efficiency in field interiors; another combined best performance on field interiors with good sensitivity to boundary detail. The basic threshold gradient and some modifications were investigated as a means of boundary point detection. The hypothesis testing methods of closed-boundary formation were also tested and evaluated. An analysis of the boundary detection problem was initiated, employing statistical signal detection and parameter estimation techniques to analyze various formulations of the problem. These formulations permit the atmospheric and sensor system effects on the data to be thoroughly analyzed. Various boundary features and necessary assumptions can also be investigated in this manner.

  14. Measurement of the lowest dosage of phenobarbital that can produce drug discrimination in rats

    PubMed Central

    Overton, Donald A.; Stanwood, Gregg D.; Patel, Bhavesh N.; Pragada, Sreenivasa R.; Gordon, M. Kathleen

    2009-01-01

    Rationale: Accurate measurement of the threshold dosage of phenobarbital that can produce drug discrimination (DD) may improve our understanding of the mechanisms and properties of such discrimination. Objectives: Compare three methods for determining the threshold dosage for phenobarbital (D) versus no drug (N) DD. Methods: Rats learned a D versus N DD in 2-lever operant training chambers. A titration scheme was employed to increase or decrease dosage at the end of each 18-day block of sessions depending on whether the rat had achieved criterion accuracy during the sessions just completed. Three criterion rules were employed, all based on average percent drug lever responses during initial links of the last 6 D and 6 N sessions of a block. The criteria were: D%>66 and N%<33; D%>50 and N%<50; (D%-N%)>33. Two squads of rats were trained, one immediately after the other. Results: All rats discriminated drug versus no drug. In most rats, dosage decreased to low levels and then oscillated near the minimum level required to maintain criterion performance. The lowest discriminated dosage significantly differed under the three criterion rules. The squad that was trained second may have benefited by partially duplicating the lever choices of the previous squad. Conclusions: The lowest discriminated dosage is influenced by the criterion of discriminative control that is employed, and is higher than the absolute threshold at which discrimination entirely disappears. Threshold estimations closer to absolute threshold can be obtained when criteria are employed that are permissive, and that allow rats to maintain lever preferences. PMID:19082992
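
    The three criterion rules are fully specified above and reduce to a few comparisons; a direct transcription:

```python
def meets_criterion(d_pct, n_pct, rule):
    """Check one of the three criterion rules from the abstract against the
    average percent drug-lever responding on the last 6 D and 6 N sessions."""
    if rule == 1:
        return d_pct > 66 and n_pct < 33
    if rule == 2:
        return d_pct > 50 and n_pct < 50
    if rule == 3:
        return (d_pct - n_pct) > 33
    raise ValueError("rule must be 1, 2, or 3")

print(meets_criterion(70, 40, rule=2))  # True: D% > 50 and N% < 50
print(meets_criterion(70, 40, rule=3))  # False: separation is only 30 points
```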

  15. 77 FR 27551 - Federal Acquisition Regulation; Federal Acquisition Circular 2005-59; Small Entity Compliance Guide

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-10

    .... Colombia. III Revision of Cost 2012-003 Chambers. Accounting Standards Threshold. SUPPLEMENTARY INFORMATION... economic impact on a substantial number of small entities. Item III--Revision of Cost Accounting Standards Threshold (FAR Case 2012-003) This final rule revises the cost accounting standards (CAS) threshold in order...

  16. 76 FR 59899 - Approval and Promulgation of Air Quality Implementation Plans; Indiana; Prevention of Significant...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... Significant Deterioration (PSD) program to establish appropriate emission thresholds for determining which new... emissions above the thresholds established in the PSD regulations. DATES: This final rule is effective on... of GHG, and that do not limit PSD applicability to GHGs to the higher thresholds in the Tailoring...

  17. 75 FR 32845 - Consultative Examination-Annual Onsite Review of Medical Providers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-10

    .... ACTION: Final rules. SUMMARY: We are revising the threshold billing amount that triggers annual on-site... titles II and XVI of the Social Security Act (Act). The revision will raise the threshold amount to reflect the increase in billing amounts since we first established the threshold amount in 1991. We expect...

  18. A self-adaptive algorithm for traffic sign detection in motion image based on color and shape features

    NASA Astrophysics Data System (ADS)

    Zhang, Ka; Sheng, Yehua; Gong, Zhijun; Ye, Chun; Li, Yongqiang; Liang, Cheng

    2007-06-01

    As an important sub-system of intelligent transportation systems (ITS), the detection and recognition of traffic signs from mobile images is becoming one of the hot spots in the international ITS research field. Considering the problem of automatic traffic sign detection in motion images, a new self-adaptive algorithm for traffic sign detection based on color and shape features is proposed in this paper. Firstly, global statistical color features of different images are computed based on statistical theory. Secondly, self-adaptive thresholds and special segmentation rules for image segmentation are designed according to these global color features. Then, for red, yellow and blue traffic signs, the color image is segmented into three binary images by these thresholds and rules. Thirdly, if the number of white pixels in a segmented binary image exceeds the filtering threshold, the binary image is further filtered. Fourthly, the method of gray-value projection is used to confirm the top, bottom, left and right boundaries of candidate traffic sign regions in the segmented binary image. Lastly, if the shape feature of a candidate region satisfies that of a real traffic sign, the candidate region is confirmed as a detected traffic sign region. The new algorithm was applied to actual motion images of natural scenes taken by a CCD camera of the mobile photogrammetry system in Nanjing at different times. The experimental results show that the algorithm is not only simple, robust and adaptive to natural scene images, but also reliable and fast in real traffic sign detection.
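
    A sketch of the self-adaptive flavor of segmentation for the red channel, with an assumed redness measure and multiplier; the paper designs separate thresholds and segmentation rules for red, yellow, and blue signs:

```python
import numpy as np

def adaptive_red_mask(image_rgb, k=1.5):
    """Derive a red-segmentation threshold from the image's own global
    colour statistics instead of a fixed cutoff, then return a binary
    mask of candidate sign pixels."""
    img = image_rgb.astype(np.float32)
    redness = img[..., 0] - 0.5 * (img[..., 1] + img[..., 2])
    thresh = redness.mean() + k * redness.std()   # global statistics set the threshold
    return redness > thresh
```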

  19. 31 CFR 205.5 - What are the thresholds for major Federal assistance programs?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... AND PROCEDURES FOR EFFICIENT FEDERAL-STATE FUNDS TRANSFERS Rules Applicable to Federal Assistance..., a State receiving $1 billion in Federal Assistance would use Table A to learn that its threshold...

  20. 31 CFR 205.5 - What are the thresholds for major Federal assistance programs?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... AND PROCEDURES FOR EFFICIENT FEDERAL-STATE FUNDS TRANSFERS Rules Applicable to Federal Assistance..., a State receiving $1 billion in Federal Assistance would use Table A to learn that its threshold...

  1. 31 CFR 205.5 - What are the thresholds for major Federal assistance programs?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... AND PROCEDURES FOR EFFICIENT FEDERAL-STATE FUNDS TRANSFERS Rules Applicable to Federal Assistance..., a State receiving $1 billion in Federal Assistance would use Table A to learn that its threshold...

  2. 31 CFR 205.5 - What are the thresholds for major Federal assistance programs?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... AND PROCEDURES FOR EFFICIENT FEDERAL-STATE FUNDS TRANSFERS Rules Applicable to Federal Assistance..., a State receiving $1 billion in Federal Assistance would use Table A to learn that its threshold...

  3. Cost–effectiveness thresholds: pros and cons

    PubMed Central

    Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-01-01

    Abstract Cost–effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost–effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost–effectiveness thresholds allow cost–effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization’s Commission on Macroeconomics in Health suggested cost–effectiveness thresholds based on multiples of a country’s per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this – in addition to uncertainty in the modelled cost–effectiveness ratios – can lead to the wrong decision on how to spend health-care resources. Cost–effectiveness information should be used alongside other considerations – e.g. budget impact and feasibility considerations – in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost–effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair. PMID:27994285

  4. 2002 Toxic Chemical Release Inventory Report for the Emergency Planning and Community Right-to-Know Act of 1986, Title III, Section 313

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. Stockton

    2003-11-01

    For reporting year 2002, Los Alamos National Laboratory (LANL or the Laboratory) submitted Form R reports for lead compounds and mercury as required under the Emergency Planning and Community Right-to-Know Act (EPCRA), Section 313. No other EPCRA Section 313 chemicals were used in 2002 above the reportable thresholds. This document was prepared to provide a description of the evaluation of EPCRA Section 313 chemical usage and threshold determinations for LANL for calendar year 2002 as well as provide background information about the data included on the Form R reports. Section 313 of EPCRA specifically requires facilities to submit a Toxic Chemical Release Inventory report (Form R) to the U.S. Environmental Protection Agency (EPA) and state agencies if the owners and operators manufacture, process, or otherwise use any of the listed toxic chemicals above listed threshold quantities. EPA compiles this data in the Toxic Release Inventory database. Form R reports for each chemical over threshold quantities must be submitted on or before July 1 each year and must cover activities that occurred at the facility during the previous year. In 1999 EPA promulgated a final rule on Persistent Bioaccumulative Toxics (PBTs). This rule added several chemicals to the EPCRA Section 313 list of toxic chemicals and established lower reporting thresholds for these and other PBT chemicals that were already reportable under EPCRA Section 313. These lower thresholds became applicable in reporting year 2000. In 2001, EPA expanded the PBT rule to include a lower reporting threshold for lead and lead compounds. Facilities that manufacture, process, or otherwise use more than 100 lb of lead or lead compounds must submit a Form R.

  5. 2006 Toxic Chemical Release Inventory Report for the Emergency Planning and Community Right-to-Know Act of 1986, Title III, Section 313

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ecology and Air Quality Group

    2007-12-12

    For reporting year 2006, Los Alamos National Laboratory (LANL or the Laboratory) submitted Form R reports for lead as required under the Emergency Planning and Community Right-to-Know Act (EPCRA) Section 313. No other EPCRA Section 313 chemicals were used in 2006 above the reportable thresholds. This document was prepared to provide a description of the evaluation of EPCRA Section 313 chemical use and threshold determinations for LANL for calendar year 2006, as well as to provide background information about data included on the Form R reports. Section 313 of EPCRA specifically requires facilities to submit a Toxic Chemical Release Inventory Report (Form R) to the U.S. Environmental Protection Agency (EPA) and state agencies if the owners and operators manufacture, process, or otherwise use any of the listed toxic chemicals above listed threshold quantities. EPA compiles this data in the Toxic Release Inventory database. Form R reports for each chemical over threshold quantities must be submitted on or before July 1 each year and must cover activities that occurred at the facility during the previous year. In 1999, EPA promulgated a final rule on persistent bioaccumulative toxics (PBTs). This rule added several chemicals to the EPCRA Section 313 list of toxic chemicals and established lower reporting thresholds for these and other PBT chemicals that were already reportable. These lower thresholds became applicable in reporting year 2000. In 2001, EPA expanded the PBT rule to include a lower reporting threshold for lead and lead compounds. Facilities that manufacture, process, or otherwise use more than 100 lb of lead or lead compounds must submit a Form R.

  6. 2008 Toxic Chemical Release Inventory Report for the Emergency Planning and Community Right-to-Know Act of 1986, Title III, Section 313

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ecology and Air Quality Group

    2009-10-01

    For reporting year 2008, Los Alamos National Laboratory (LANL) submitted a Form R report for lead as required under the Emergency Planning and Community Right-to-Know Act (EPCRA) Section 313. No other EPCRA Section 313 chemicals were used in 2008 above the reportable thresholds. This document was prepared to provide a description of the evaluation of EPCRA Section 313 chemical use and threshold determinations for LANL for calendar year 2008, as well as to provide background information about data included on the Form R reports. Section 313 of EPCRA specifically requires facilities to submit a Toxic Chemical Release Inventory Report (Form R) to the U.S. Environmental Protection Agency (EPA) and state agencies if the owners and operators manufacture, process, or otherwise use any of the listed toxic chemicals above listed threshold quantities. EPA compiles this data in the Toxic Release Inventory database. Form R reports for each chemical over threshold quantities must be submitted on or before July 1 each year and must cover activities that occurred at the facility during the previous year. In 1999, EPA promulgated a final rule on persistent bioaccumulative toxics (PBTs). This rule added several chemicals to the EPCRA Section 313 list of toxic chemicals and established lower reporting thresholds for these and other PBT chemicals that were already reportable. These lower thresholds became applicable in reporting year 2000. In 2001, EPA expanded the PBT rule to include a lower reporting threshold for lead and lead compounds. Facilities that manufacture, process, or otherwise use more than 100 lb of lead or lead compounds must submit a Form R.

  7. Final Rule: Community Right-To-Know Reporting Requirements Federal Register Notice

    EPA Pesticide Factsheets

    Final reporting thresholds and threshold planning quantity (TPQ) for extremely hazardous substances (EHS) and non-EHS hazardous chemicals, required under Emergency Planning and Community Right-to-Know Act, Superfund Amendments and Reauthorization Act.

  8. Autonomous Flight Safety System September 27, 2005, Aircraft Test

    NASA Technical Reports Server (NTRS)

    Simpson, James C.

    2005-01-01

    This report describes the first aircraft test of the Autonomous Flight Safety System (AFSS). The test was conducted on September 27, 2005, near Kennedy Space Center (KSC) using a privately-owned single-engine plane and evaluated the performance of several basic flight safety rules using real-time data onboard a moving aerial vehicle. This test follows the first road test of AFSS conducted in February 2005 at KSC. AFSS is a joint KSC and Wallops Flight Facility (WFF) project that is in its third phase of development. AFSS is an independent subsystem intended for use with Expendable Launch Vehicles that uses tracking data from redundant onboard sensors to autonomously make flight termination decisions using software-based rules implemented on redundant flight processors. The goals of this project are to increase capabilities by allowing launches from locations that do not have or cannot afford extensive ground-based range safety assets, to decrease range costs, and to decrease reaction time for special situations. The mission rules are configured for each operation by the responsible Range Safety authorities and can be loosely grouped into four major categories: Parameter Threshold Violations; Physical Boundary Violations, based on present position and the instantaneous impact point (TIP); Gate Rules, static and dynamic; and a Green-Time Rule. Examples of each of these rules were evaluated during this aircraft test.

  9. Self-Tuning Threshold Method for Real-Time Gait Phase Detection Based on Ground Contact Forces Using FSRs.

    PubMed

    Tang, Jing; Zheng, Jianbin; Wang, Yang; Yu, Lie; Zhan, Enqi; Song, Qiuzhi

    2018-02-06

    This paper presents a novel methodology for detecting the gait phase of human walking on level ground. The previous threshold method (TM) sets a threshold to divide the ground contact forces (GCFs) into on-ground and off-ground states. However, previous methods for gait phase detection show no adaptability to different people and different walking speeds. Therefore, this paper presents a self-tuning triple threshold algorithm (STTTA) that calculates adjustable thresholds to adapt to human walking. Two force sensitive resistors (FSRs) were placed on the ball and heel to measure GCFs. Three thresholds (i.e., high-threshold, middle-threshold, and low-threshold) were used to search out the maximum and minimum GCFs for the self-adjustment of the thresholds. The high-threshold was the main threshold used to divide the GCFs into on-ground and off-ground statuses. Then, the gait phases were obtained through the gait phase detection algorithm (GPDA), which provides the rules that determine the calculations for the STTTA. Finally, the STTTA's reliability was determined by comparing its results with those of the Mariani method (referenced as the timing analysis module, TAM) and the Lopez-Meyer method. Experimental results show that the proposed method can be used to detect gait phases in real time and achieves high reliability compared with the previous methods in the literature. In addition, the proposed method exhibits strong adaptability to different wearers walking at different walking speeds.
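
    A sketch of the two ingredients, with an assumed ball/heel-to-phase mapping and an assumed form for the self-tuning step; the paper's GPDA rules and tuning details differ in specifics:

```python
def retune_threshold(gcf_window, fraction=0.3):
    """Self-tuning step (assumed form): place the main threshold a fixed
    fraction of the way between the recent minimum and maximum GCF."""
    lo, hi = min(gcf_window), max(gcf_window)
    return lo + fraction * (hi - lo)

def gait_phase(ball_gcf, heel_gcf, high_threshold):
    """Map the on-ground/off-ground state of the two FSRs to a coarse
    gait phase (illustrative labels, not the paper's exact rule set)."""
    ball = ball_gcf > high_threshold
    heel = heel_gcf > high_threshold
    if heel and not ball:
        return "heel-strike"
    if heel and ball:
        return "flat-foot"
    if ball:
        return "push-off"
    return "swing"

print(gait_phase(ball_gcf=2.0, heel_gcf=25.0, high_threshold=10.0))  # heel-strike
```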

  10. Prefrontal rTMS for treating depression: location and intensity results from the OPT-TMS multi-site clinical trial.

    PubMed

    Johnson, Kevin A; Baig, Mirza; Ramsey, Dave; Lisanby, Sarah H; Avery, David; McDonald, William M; Li, Xingbao; Bernhardt, Elisabeth R; Haynor, David R; Holtzheimer, Paul E; Sackeim, Harold A; George, Mark S; Nahas, Ziad

    2013-03-01

    Motor cortex localization and motor threshold determination often guide Transcranial Magnetic Stimulation (TMS) placement and intensity settings for non-motor brain stimulation. However, anatomic variability results in variability of placement and effective intensity. Post-study analysis of the OPT-TMS Study reviewed both the final positioning and the effective intensity of stimulation (accounting for relative prefrontal scalp-cortex distances). We acquired MRI scans of 185 patients in a multi-site trial of left prefrontal TMS for depression. Scans had marked motor sites (localized with TMS) and marked prefrontal sites (5 cm anterior of motor cortex by the "5 cm rule"). Based on a visual determination made before the first treatment, TMS therapy occurred either at the 5 cm location or was adjusted 1 cm forward. Stimulation intensity was 120% of resting motor threshold. The "5 cm rule" would have placed stimulation in premotor cortex for 9% of patients, which was reduced to 4% with adjustments. We did not find a statistically significant effect of positioning on remission, but no patients with premotor stimulation achieved remission (0/7). Effective stimulation ranged from 93 to 156% of motor threshold, and no seizures were induced across this range. Patients experienced remission with effective stimulation intensity ranging from 93 to 146% of motor threshold, and we did not find a significant effect of effective intensity on remission. Our data indicates that individualized positioning methods are useful to reduce variability in placement. Stimulation at 120% of motor threshold, unadjusted for scalp-cortex distances, appears safe for a broad range of patients. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Discrimination thresholds of normal and anomalous trichromats: Model of senescent changes in ocular media density on the Cambridge Colour Test

    PubMed Central

    Shinomori, Keizo; Panorgias, Athanasios; Werner, John S.

    2017-01-01

    Age-related changes in chromatic discrimination along dichromatic confusion lines were measured with the Cambridge Colour Test (CCT). One hundred and sixty-two individuals (16 to 88 years old) with normal Rayleigh matches were the major focus of this paper. An additional 32 anomalous trichromats classified by their Rayleigh matches were also tested. All subjects were screened to rule out abnormalities of the anterior and posterior segments. Thresholds on all three chromatic vectors measured with the CCT showed age-related increases. Protan and deutan vector thresholds increased linearly with age while the tritan vector threshold was described with a bilinear model. Analysis and modeling demonstrated that the nominal vectors of the CCT are shifted by senescent changes in ocular media density, and a method for correcting the CCT vectors is demonstrated. A correction for these shifts indicates that classification among individuals of different ages is unaffected. New vector thresholds for elderly observers and for all age groups are suggested based on calculated tolerance limits. PMID:26974943

  12. 77 FR 69735 - Consumer Leasing (Regulation M)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-21

    ...The Board and the Bureau are publishing final rules amending the official interpretations and commentary for the agencies' regulations that implement the Consumer Leasing Act (CLA). Effective July 21, 2011, the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank Act) amended the CLA by increasing the threshold for exempt consumer leases from $25,000 to $50,000 and requiring that, on or after December 31, 2011, this threshold be adjusted annually by any annual percentage increase in the Consumer Price Index for Urban Wage Earners and Clerical Workers (CPI-W). Accordingly, the exemption threshold was adjusted to $51,800 effective January 1, 2012. Based on the annual percentage increase in the CPI-W as of June 1, 2012, the Board and the Bureau are adjusting the exemption threshold from $51,800 to $53,000, effective January 1, 2013. Because the Dodd-Frank Act also requires similar adjustments in the Truth in Lending Act's threshold for exempt consumer credit transactions, the Board and the Bureau are making similar amendments to each of their respective regulations implementing the Truth in Lending Act elsewhere in the Federal Register.

  13. 77 FR 69736 - Truth in Lending (Regulation Z)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-21

    ...The Board and the Bureau are publishing final rules amending the official interpretations and commentary for the agencies' regulations that implement the Truth in Lending Act (TILA). Effective July 21, 2011, the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank Act) amended TILA by increasing the threshold for exempt consumer credit transactions from $25,000 to $50,000 and requiring that, on or after December 31, 2011, this threshold be adjusted annually by any annual percentage increase in the Consumer Price Index for Urban Wage Earners and Clerical Workers (CPI- W). Accordingly, the exemption threshold was adjusted to $51,800 effective January 1, 2012. Based on the annual percentage increase in the CPI-W as of June 1, 2012, the Board and the Bureau are adjusting the exemption threshold from $51,800 to $53,000, effective January 1, 2013. Because the Dodd-Frank Act also requires similar adjustments in the Consumer Leasing Act's threshold for exempt consumer leases, the Board and the Bureau are making similar amendments to each of their respective regulations implementing the Consumer Leasing Act elsewhere in the Federal Register.

  14. Pulse oximeter based mobile biotelemetry application.

    PubMed

    Işik, Ali Hakan; Güler, Inan

    2012-01-01

    Quality and features of tele-homecare are improved by information and communication technologies. In this context, a pulse oximeter-based mobile biotelemetry application was developed. With this application, patients can measure their own oxygen saturation and heart rate at home with a Bluetooth pulse oximeter. The Bluetooth virtual serial port protocol is used to send the test results from the pulse oximeter to a smart phone. These data are converted into XML and transmitted to a remote web server database via the smart phone; GPRS, WLAN or 3G can be used for transmission. A rule-based algorithm is used in the decision-making process. By default, the threshold value for oxygen saturation is 80, and the heart rate threshold values are 40 and 150, respectively. If the patient's heart rate is outside its threshold values or the oxygen saturation is below its threshold value, an emergency SMS is sent to the doctor, who can then direct an ambulance to the patient. These threshold values can be changed by the doctor for each patient. The conversion of the evaluated data into the SMS XML template is done on the web server. Another important component of the application is web-based monitoring of the pulse oximeter data. The web page provides access to all patient data, so doctors can follow their patients and send e-mail related to the evaluation of the disease. In addition, patients can follow their own data on this page. Eight patients took part in the procedure. It is believed that the developed application will facilitate pulse oximeter-based measurement from anywhere and at any time.
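
    The decision step described above reduces to a small rule; a sketch using the abstract's default thresholds, with per-patient overrides as arguments:

```python
def needs_emergency_sms(spo2, heart_rate, spo2_min=80, hr_min=40, hr_max=150):
    """Rule-based decision from the abstract: alert when oxygen saturation
    falls below its threshold or heart rate leaves the allowed band.
    Defaults are the abstract's values; the doctor may override per patient."""
    return spo2 < spo2_min or not (hr_min <= heart_rate <= hr_max)

print(needs_emergency_sms(spo2=78, heart_rate=95))    # True: SpO2 below 80
print(needs_emergency_sms(spo2=96, heart_rate=120))   # False: both in range
```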

  15. 76 FR 9517 - Uniform National Threshold Entered Employment Rate for Veterans

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-18

    ...The Veterans' Employment and Training Service (VETS) of the Department of Labor (the Department) is proposing a rule to implement a uniform national threshold entered employment rate for veterans applicable to State employment service delivery systems. The Department undertakes this rulemaking in accordance with the Jobs for Veterans Act, which requires the Department to implement that threshold rate by regulation.

  16. Optimization of a matched-filter receiver for frequency hopping code acquisition in jamming

    NASA Astrophysics Data System (ADS)

    Pawlowski, P. R.; Polydoros, A.

    A matched-filter receiver for frequency hopping (FH) code acquisition is optimized when either partial-band tone jamming or partial-band Gaussian noise jamming is present. The receiver is matched to a segment of the FH code sequence, sums hard per-channel decisions to form a test, and uses multiple tests to verify acquisition. The length of the matched filter and the number of verification tests are fixed. Optimization is then choosing thresholds to maximize performance based upon the receiver's degree of knowledge about the jammer ('side-information'). Four levels of side-information are considered, ranging from none to complete. The latter level results in a constant-false-alarm-rate (CFAR) design. At each level, performance sensitivity to threshold choice is analyzed. Robust thresholds are chosen to maximize performance as the jammer varies its power distribution, resulting in simple design rules which aid threshold selection. Performance results, which show that optimum distributions for the jammer power over the total FH bandwidth exist, are presented.

  17. Repeated-Sprint Sequences During Female Soccer Matches Using Fixed and Individual Speed Thresholds.

    PubMed

    Nakamura, Fábio Y; Pereira, Lucas A; Loturco, Irineu; Rosseti, Marcelo; Moura, Felipe A; Bradley, Paul S

    2017-07-01

    Nakamura, FY, Pereira, LA, Loturco, I, Rosseti, M, Moura, FA, and Bradley, PS. Repeated-sprint sequences during female soccer matches using fixed and individual speed thresholds. J Strength Cond Res 31(7): 1802-1810, 2017 - The main objective of this study was to characterize the occurrence of single sprint and repeated-sprint sequences (RSS) during elite female soccer matches, using fixed (20 km·h⁻¹) and individually based speed thresholds (>90% of the mean speed from a 20-m sprint test). Eleven elite female soccer players from the same team participated in the study. All players performed a 20-m linear sprint test, and were assessed in up to 10 official matches using Global Positioning System technology. Magnitude-based inferences were used to test for meaningful differences. Results revealed that irrespective of adopting fixed or individual speed thresholds, female players produced only a few RSS during matches (2.3 ± 2.4 sequences using the fixed threshold and 3.3 ± 3.0 sequences using the individually based threshold), with most sequences comprising just 2 sprints. Additionally, central defenders performed fewer sprints (10.2 ± 4.1) than other positions (fullbacks: 28.1 ± 5.5; midfielders: 21.9 ± 10.5; forwards: 31.9 ± 11.1; with the differences being likely to almost certainly associated with effect sizes ranging from 1.65 to 2.72), and sprinting ability declined in the second half. The data do not support the notion that RSS occurs frequently during soccer matches in female players, irrespective of using fixed or individual speed thresholds to define sprint occurrence. However, repeated-sprint ability development cannot be ruled out from soccer training programs because of its association with match-related performance.
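
    A sketch of how sprints and RSS might be extracted from a sampled speed trace; the minimum bout duration and the 60 s recovery window are assumptions for illustration, since the abstract does not state them:

```python
def sprint_bouts(speeds, threshold, dt=0.1, min_duration=1.0):
    """Return (start_s, end_s) bouts where speed stays at or above the
    threshold (fixed 20 km/h, or >90% of the 20-m test speed) for at
    least min_duration seconds.  speeds is sampled every dt seconds."""
    bouts, start = [], None
    for i, v in enumerate(speeds):
        if v >= threshold and start is None:
            start = i
        elif v < threshold and start is not None:
            if (i - start) * dt >= min_duration:
                bouts.append((start * dt, i * dt))
            start = None
    if start is not None and (len(speeds) - start) * dt >= min_duration:
        bouts.append((start * dt, len(speeds) * dt))
    return bouts

def repeated_sprint_sequences(bouts, max_recovery=60.0):
    """Group sprints into RSS: two or more sprints, each separated from
    the previous one by at most max_recovery seconds (assumed window)."""
    sequences, current = [], []
    for bout in bouts:
        if current and bout[0] - current[-1][1] > max_recovery:
            if len(current) >= 2:
                sequences.append(current)
            current = []
        current.append(bout)
    if len(current) >= 2:
        sequences.append(current)
    return sequences
```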

  18. Criterion learning in rule-based categorization: Simulation of neural mechanism and new data

    PubMed Central

    Helie, Sebastien; Ell, Shawn W.; Filoteo, J. Vincent; Maddox, W. Todd

    2015-01-01

    In perceptual categorization, rule selection consists of selecting one or several stimulus-dimensions to be used to categorize the stimuli (e.g., categorize lines according to their length). Once a rule has been selected, criterion learning consists of defining how stimuli will be grouped using the selected dimension(s) (e.g., if the selected rule is line length, define ‘long’ and ‘short’). Very little is known about the neuroscience of criterion learning, and most existing computational models do not provide a biological mechanism for this process. In this article, we introduce a new model of rule learning called Heterosynaptic Inhibitory Criterion Learning (HICL). HICL includes a biologically-based explanation of criterion learning, and we use new category-learning data to test key aspects of the model. In HICL, rule selective cells in prefrontal cortex modulate stimulus-response associations using pre-synaptic inhibition. Criterion learning is implemented by a new type of heterosynaptic error-driven Hebbian learning at inhibitory synapses that uses feedback to drive cell activation above/below thresholds representing ionic gating mechanisms. The model is used to account for new human categorization data from two experiments showing that: (1) changing rule criterion on a given dimension is easier if irrelevant dimensions are also changing (Experiment 1), and (2) showing that changing the relevant rule dimension and learning a new criterion is more difficult, but also facilitated by a change in the irrelevant dimension (Experiment 2). We conclude with a discussion of some of HICL’s implications for future research on rule learning. PMID:25682349

  20. Convergence of decision rules for value-based pricing of new innovative drugs.

    PubMed

    Gandjour, Afschin

    2015-04-01

    Given the high costs of innovative new drugs, most European countries have introduced policies for price control, in particular value-based pricing (VBP) and international reference pricing. The purpose of this study is to describe how profit-maximizing manufacturers would optimally adjust their launch sequence to these policies and how VBP countries may best respond. In deciding on the launch sequence, a manufacturer must weigh a tradeoff between price and sales volume in any given country, as well as the effect of the price in a VBP country on prices in international reference pricing countries. Based on the manufacturer's rationale, it is best for VBP countries in Europe to implicitly collude in the long term and set cost-effectiveness thresholds at the level of the VBP country with the lowest acceptable threshold. This way, international reference pricing countries would also converge towards the lowest acceptable threshold in Europe.

  1. A simple threshold rule is sufficient to explain sophisticated collective decision-making.

    PubMed

    Robinson, Elva J H; Franks, Nigel R; Ellis, Samuel; Okuda, Saki; Marshall, James A R

    2011-01-01

    Decision-making animals can use slow-but-accurate strategies, such as making multiple comparisons, or opt for simpler, faster strategies to find a 'good enough' option. Social animals make collective decisions about many group behaviours including foraging and migration. The key to the collective choice lies with individual behaviour. We present a case study of a collective decision-making process (house-hunting ants, Temnothorax albipennis), in which a previously proposed decision strategy involved both quality-dependent hesitancy and direct comparisons of nests by scouts. An alternative possible decision strategy is that scouting ants use a very simple quality-dependent threshold rule to decide whether to recruit nest-mates to a new site or search for alternatives. We use analytical and simulation modelling to demonstrate that this simple rule is sufficient to explain empirical patterns from three studies of collective decision-making in ants, and can account parsimoniously for apparent comparison by individuals and apparent hesitancy (recruitment latency) effects, when available nests differ strongly in quality. This highlights the need to carefully design experiments to detect individual comparison. We present empirical data strongly suggesting that best-of-n comparison is not used by individual ants, although individual sequential comparisons are not ruled out. However, by using a simple threshold rule, decision-making groups are able to effectively compare options, without relying on any form of direct comparison of alternatives by individuals. This parsimonious mechanism could promote collective rationality in group decision-making.
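
    Because the proposed rule is so simple, it can be sketched directly; a toy Python simulation under assumed Gaussian assessment noise (all names and values hypothetical) shows how a colony can in effect compare nests without any individual doing so:

        import random

        def scout_recruits(nest_quality, threshold, noise_sd=0.1):
            """Quality-dependent threshold rule: recruit iff the noisy perceived
            quality of the visited nest meets the scout's acceptance threshold."""
            return nest_quality + random.gauss(0.0, noise_sd) >= threshold

        def colony_choice(qualities, n_scouts=100, threshold=0.5, trials=1000):
            """Fraction of trials in which each nest attracts the most recruiters,
            although no scout ever directly compares two nests."""
            wins = [0] * len(qualities)
            for _ in range(trials):
                recruits = [sum(scout_recruits(q, threshold) for _ in range(n_scouts))
                            for q in qualities]
                wins[recruits.index(max(recruits))] += 1
            return [w / trials for w in wins]

        print(colony_choice([0.4, 0.6]))  # the better nest wins almost every trial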

  2. 75 FR 75911 - Adjustment of Monetary Threshold for Reporting Rail Equipment Accidents/Incidents for Calendar...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-07

    ...This rule increases the rail equipment accident/incident reporting threshold from $9,200 to $9,400 for certain railroad accidents/incidents involving property damage that occur during calendar year 2011. This action is needed to ensure that FRA's reporting requirements reflect cost increases that have occurred since the reporting threshold was last computed in December of 2009.

  3. 76 FR 72850 - Adjustment of Monetary Threshold for Reporting Rail Equipment Accidents/Incidents for Calendar...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-28

    ...This rule increases the rail equipment accident/incident reporting threshold from $9,400 to $9,500 for certain railroad accidents/incidents involving property damage that occur during calendar year 2012. This action is needed to ensure that FRA's reporting requirements reflect cost increases that have occurred since the reporting threshold was last published in December of 2010.

  4. 77 FR 71354 - Adjustment of Monetary Threshold for Reporting Rail Equipment Accidents/Incidents for Calendar...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-30

    ...This rule increases the rail equipment accident/incident reporting threshold from $9,500 to $9,900 for certain railroad accidents/incidents involving property damage that occur during calendar year 2013. This action is needed to ensure that FRA's reporting requirements reflect cost increases that have occurred since the reporting threshold was last published in November of 2011.

  5. 77 FR 14225 - Prevention of Significant Deterioration and Title V Greenhouse Gas Tailoring Rule Step 3, GHG...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-08

    ... thresholds. \\2\\ 75 FR 31559. As we committed to do in the Tailoring Rule, we have been exploring a variety of... Corporate Average Fuel Economy Standards; Final Rule,'' 75 FR 25,324 (May 7, 2010) (the Light-duty Vehicle... announced a plan to explore streamlining techniques that could make the permitting programs more efficient...

  6. 76 FR 67315 - Supplemental Nutrition Assistance Program: Quality Control Error Tolerance Threshold

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-01

    ...This direct final rule is amending the Quality Control (QC) review error threshold in our regulations from $25.00 to $50.00. The purpose of raising the QC error threshold is to make permanent the temporary threshold change that was required by the American Recovery and Reinvestment Act of 2009. This change does not have an impact on the public. The QC system measures the accuracy of the eligibility system for the Supplemental Nutrition Assistance Program (SNAP).

  7. 78 FR 71468 - Rules Relating to Additional Medicare Tax

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    ... Rules Relating to Additional Medicare Tax AGENCY: Internal Revenue Service (IRS), Treasury. ACTION... Insurance Tax on income above threshold amounts (``Additional Medicare Tax''), as added by the Affordable... to the implementation of Additional Medicare Tax, including the requirement to withhold Additional...

  8. 78 FR 6272 - Rules Relating to Additional Medicare Tax; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-30

    ... Rules Relating to Additional Medicare Tax; Correction AGENCY: Internal Revenue Service (IRS), Treasury... regulations are relating to Additional Hospital Insurance Tax on income above threshold amounts (``Additional Medicare Tax''), as added by the Affordable Care Act. Specifically, these proposed regulations provide...

  9. 78 FR 70193 - Consumer Leasing (Regulation M)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-25

    ...The Board and the Bureau are publishing final rules amending the official interpretations and commentary for the agencies' regulations that implement the Consumer Leasing Act (CLA). The Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank Act) amended the CLA by requiring that the dollar threshold for exempt consumer leases be adjusted annually by any annual percentage increase in the Consumer Price Index for Urban Wage Earners and Clerical Workers (CPI-W). Based on the annual percentage increase in the CPI-W as of June 1, 2013, the Board and the Bureau are adjusting the exemption threshold to $53,500, effective January 1, 2014. Because the Dodd-Frank Act also requires similar adjustments in the Truth in Lending Act's threshold for exempt consumer credit transactions, the Board and the Bureau are making similar amendments to each of their respective regulations implementing the Truth in Lending Act elsewhere in the Federal Register.

  10. 78 FR 70194 - Truth in Lending (Regulation Z)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-25

    ...The Board and the Bureau are publishing final rules amending the official interpretations and commentary for the agencies' regulations that implement the Truth in Lending Act (TILA). The Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank Act) amended TILA by requiring that the dollar threshold for exempt consumer credit transactions be adjusted annually by any annual percentage increase in the Consumer Price Index for Urban Wage Earners and Clerical Workers (CPI-W). Based on the annual percentage increase in the CPI-W as of June 1, 2013, the Board and the Bureau are adjusting the exemption threshold to $53,500, effective January 1, 2014. Because the Dodd-Frank Act also requires similar adjustments in the Consumer Leasing Act's threshold for exempt consumer leases, the Board and the Bureau are making similar amendments to each of their respective regulations implementing the Consumer Leasing Act elsewhere in the Federal Register.
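
    The adjustment arithmetic itself is mechanical. A small Python sketch, assuming the round-to-the-nearest-$100 convention these exemption thresholds use and an illustrative CPI-W increase (the exact published percentage is not reproduced here):

        def adjust_threshold(prior_dollars, cpi_w_increase):
            """Apply the annual CPI-W percentage increase, then round to the
            nearest $100 (assumed rounding convention for this threshold)."""
            return round(prior_dollars * (1.0 + cpi_w_increase) / 100.0) * 100

        print(adjust_threshold(53_000, 0.0095))  # ~0.95% increase -> 53500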

  11. Medicare Program: Hospital Outpatient Prospective Payment and Ambulatory Surgical Center Payment Systems and Quality Reporting Programs; Organ Procurement Organization Reporting and Communication; Transplant Outcome Measures and Documentation Requirements; Electronic Health Record (EHR) Incentive Programs; Payment to Nonexcepted Off-Campus Provider-Based Department of a Hospital; Hospital Value-Based Purchasing (VBP) Program; Establishment of Payment Rates Under the Medicare Physician Fee Schedule for Nonexcepted Items and Services Furnished by an Off-Campus Provider-Based Department of a Hospital. Final rule with comment period and interim final rule with comment period.

    PubMed

    2016-11-14

    This final rule with comment period revises the Medicare hospital outpatient prospective payment system (OPPS) and the Medicare ambulatory surgical center (ASC) payment system for CY 2017 to implement applicable statutory requirements and changes arising from our continuing experience with these systems. In this final rule with comment period, we describe the changes to the amounts and factors used to determine the payment rates for Medicare services paid under the OPPS and those paid under the ASC payment system. In addition, this final rule with comment period updates and refines the requirements for the Hospital Outpatient Quality Reporting (OQR) Program and the ASC Quality Reporting (ASCQR) Program. Further, in this final rule with comment period, we are making changes to tolerance thresholds for clinical outcomes for solid organ transplant programs; to Organ Procurement Organizations (OPOs) definitions, outcome measures, and organ transport documentation; and to the Medicare and Medicaid Electronic Health Record Incentive Programs. We also are removing the HCAHPS Pain Management dimension from the Hospital Value-Based Purchasing (VBP) Program. In addition, we are implementing section 603 of the Bipartisan Budget Act of 2015 relating to payment for certain items and services furnished by certain off-campus provider-based departments of a provider. In this document, we also are issuing an interim final rule with comment period to establish the Medicare Physician Fee Schedule payment rates for the nonexcepted items and services billed by a nonexcepted off-campus provider-based department of a hospital in accordance with the provisions of section 603.

  12. 77 FR 12930 - Federal Acquisition Regulation: Socioeconomic Program Parity

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... on May 6, 2011, reinstating the Rule of Two. C. Sole Source Dollar Thresholds Vary Among the... all socioeconomic programs had the same sole source dollar threshold. Response: The sole source dollar... business socioeconomic contracting program to utilize. D. Sole Source Authority Under the SDVOSB Program...

  13. Final Rule: Extremely Hazardous Substance List and Threshold Planning Quantities; Emergency Planning and Release Notification Requirements (52 FR 13378)

    EPA Pesticide Factsheets

    April 22, 1987: This FR established the list of extremely hazardous substances (EHSs) and their threshold planning quantities (TPQs). It also codified reporting and notification requirements for facilities with EHSs. Do not use for current compliance purposes.

  14. Prediction of vesicoureteral reflux after a first febrile urinary tract infection in children: validation of a clinical decision rule.

    PubMed

    Leroy, S; Marc, E; Adamsbaum, C; Gendrel, D; Bréart, G; Chalumeau, M

    2006-03-01

    To test the reproducibility of a highly sensitive clinical decision rule proposed to predict vesicoureteral reflux (VUR) after a first febrile urinary tract infection in children. This rule combines clinical (family history of uropathology, male gender, young age), biological (raised C reactive protein), and radiological (urinary tract dilation on renal ultrasound) predictors in a score, and provides 100% sensitivity. A retrospective hospital based cohort study included all children, 1 month to 4 years old, with a first febrile urinary tract infection. The sensitivities and specificities of the rule at the two previously proposed score thresholds (≤0 and ≤5) to predict, respectively, all-grade or grade ≥3 VUR, were calculated. A total of 149 children were included. VUR prevalence was 25%. The rule yielded 100% sensitivity and 3% specificity for all-grade VUR, and 93% sensitivity and 13% specificity for grade ≥3 VUR. Some methodological weaknesses explain this lack of reproducibility. The reproducibility of the previously proposed decision rule was poor and its potential contribution to the clinical management of children with febrile urinary tract infection seems modest.
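
    A small Python sketch of evaluating such a score-based rule at a cutoff, on toy data (reading the ≤0 and ≤5 thresholds as "predict VUR when the score exceeds the cutoff" is an assumption of this sketch):

        def rule_performance(scores, has_vur, cutoff):
            """Sensitivity and specificity of 'predict VUR if score > cutoff'."""
            tp = sum(s > cutoff and v for s, v in zip(scores, has_vur))
            fn = sum(s <= cutoff and v for s, v in zip(scores, has_vur))
            tn = sum(s <= cutoff and not v for s, v in zip(scores, has_vur))
            fp = sum(s > cutoff and not v for s, v in zip(scores, has_vur))
            return tp / (tp + fn), tn / (tn + fp)

        scores = [-2, 1, 4, 7, 0, 6]                    # toy score values
        vur = [False, True, True, True, False, False]
        print(rule_performance(scores, vur, cutoff=0))  # screen for all-grade VUR
        print(rule_performance(scores, vur, cutoff=5))  # screen for grade >= 3 VUR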

  16. 75 FR 51509 - Self-Regulatory Organizations; Notice of Filing of a Proposed Rule Change by the NASDAQ Stock...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-20

    ... necessary to have a Professional designation rule as is commonplace in the industry, particularly where NOS... number over a calendar month will prevent gaming of the 390 order threshold.\\25\\ \\23\\ 390 orders is equal...

  17. Probe-specific mixed-model approach to detect copy number differences using multiplex ligation-dependent probe amplification (MLPA)

    PubMed Central

    González, Juan R; Carrasco, Josep L; Armengol, Lluís; Villatoro, Sergi; Jover, Lluís; Yasui, Yutaka; Estivill, Xavier

    2008-01-01

    Background The MLPA method is a potentially useful semi-quantitative method to detect copy number alterations in targeted regions. In this paper, we propose a normalization procedure based on a non-linear mixed model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed model. This method establishes a threshold by using different tolerance intervals that accommodate the specific random-error variability observed in each test sample. Results Through simulation studies we have shown that our proposed method outperforms two existing methods that are based on simple threshold rules or iterative regression. We have illustrated the method using a controlled MLPA assay in which targeted regions vary in copy number in individuals suffering from disorders such as Prader-Willi, DiGeorge, or autism, and it showed the best performance. Conclusion Using the proposed mixed model, we are able to determine thresholds to decide whether a region is altered. These thresholds are specific to each individual and incorporate experimental variability, resulting in improved sensitivity and specificity, as the examples with real data reveal. PMID:18522760
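
    A toy Python sketch of sample-specific thresholding in this spirit, substituting a plain mean ± k·sd band derived from the sample's own control probes for the paper's mixed-model tolerance intervals (probe names and the value of k are illustrative):

        import statistics

        def flag_altered(test_ratios, control_ratios, k=3.0):
            """Derive the decision band from this sample's control-probe
            variability, then call each test probe 'loss', 'gain', or 'normal'."""
            mu = statistics.mean(control_ratios)
            sd = statistics.stdev(control_ratios)
            lo, hi = mu - k * sd, mu + k * sd
            return {probe: 'loss' if r < lo else 'gain' if r > hi else 'normal'
                    for probe, r in test_ratios.items()}

        controls = [1.02, 0.98, 1.01, 0.97, 1.03]          # normalized control ratios
        print(flag_altered({'P4': 0.55, 'P5': 1.47}, controls))  # P4 loss, P5 gain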

  18. A model-based approach for the scattering-bar printing avoidance

    NASA Astrophysics Data System (ADS)

    Du, Yaojun; Li, Liang; Zhang, Jingjing; Shao, Feng; Zuniga, Christian; Deng, Yunfei

    2018-03-01

    As semiconductor manufacturing advances to smaller technology nodes, scattering bars (SBs) are more crucial than ever to ensure good on-wafer printability of line-space and hole patterns. Main patterns with small pitches require a very narrow PV (process variation) band, so a delicate SB-addition scheme is needed to maintain a sufficient PW (process window) for semi-isolated and isolated patterns. In general, SBs that are wider, longer, and closer to the main feature are more effective in enhancing printability; on the other hand, they are also more likely to print on the wafer, resulting in undesired defects that can transfer to subsequent processes. In this work, we have developed a model-based approach for scattering-bar printing avoidance (SPA). A specially designed optical model was tuned on a broad range of test patterns containing variations of CDs and SB placements, showing both printing and non-printing scattering bars. A printing threshold is then obtained to check for extra printing of SBs. The accuracy of this threshold is verified with pre-designed test patterns. The printing threshold associated with our novel SPA model allows us to set up a proper SB rule.

  19. 78 FR 7828 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-04

    ...-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule Change Amending the NYSE Arca Options Fee Schedule To Revise Qualification Thresholds for Tiered Customer...'') to revise the qualification thresholds for tiered Customer posting credits for electronic executions...

  20. 77 FR 31050 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-24

    ...-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule... threshold qualifications and corresponding rates applicable to Option Trading Permit (``OTP'') Holder and... restructure the threshold qualifications and corresponding rates applicable to OTP Holder and OTP Firm...

  1. 76 FR 79545 - Cost Accounting Standards: Change to the CAS Applicability Threshold for the Inflation Adjustment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-22

    ... Cost Accounting Standards: Change to the CAS Applicability Threshold for the Inflation Adjustment to... Federal Procurement Policy, Cost Accounting Standards Board. ACTION: Final rule. SUMMARY: The Office of Federal Procurement Policy (OFPP), Cost Accounting Standards (CAS) Board (Board), has adopted, without...

  2. 26 CFR 1.162-29 - Influencing legislation.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... proposed regulation increasing the threshold value of commercial and residential real estate transactions... all day-care providers. Agency B in State X is charged with writing rules to implement the statute... rules that S recommends Agency B adopt to implement the statute on licensing of day-care providers...

  3. 26 CFR 1.162-29 - Influencing legislation.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... proposed regulation increasing the threshold value of commercial and residential real estate transactions... all day-care providers. Agency B in State X is charged with writing rules to implement the statute... rules that S recommends Agency B adopt to implement the statute on licensing of day-care providers...

  4. 26 CFR 1.162-29 - Influencing legislation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... proposed regulation increasing the threshold value of commercial and residential real estate transactions... all day-care providers. Agency B in State X is charged with writing rules to implement the statute... rules that S recommends Agency B adopt to implement the statute on licensing of day-care providers...

  5. 26 CFR 1.162-29 - Influencing legislation.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... proposed regulation increasing the threshold value of commercial and residential real estate transactions... all day-care providers. Agency B in State X is charged with writing rules to implement the statute... rules that S recommends Agency B adopt to implement the statute on licensing of day-care providers...

  6. Analysis of correlation between pediatric asthma exacerbation and exposure to pollutant mixtures with association rule mining.

    PubMed

    Toti, Giulia; Vilalta, Ricardo; Lindner, Peggy; Lefer, Barry; Macias, Charles; Price, Daniel

    2016-11-01

    Traditional studies on the effects of outdoor pollution on asthma have been criticized for questionable statistical validity and inefficacy in exploring the effects of multiple air pollutants, alone and in combination. Association rule mining (ARM), a method that is easily interpretable and suitable for analyzing the effects of multiple exposures, could be of use, but the traditional interest metrics of support and confidence need to be replaced with metrics that focus on risk variations caused by different exposures. We present an ARM-based methodology that produces rules associated with relevant odds ratios and limits the number of final rules even at very low support levels (0.5%), thanks to post-pruning criteria that limit rule redundancy and control for statistical significance. The methodology has been applied to a case-crossover study to explore the effects of multiple air pollutants on the risk of asthma in pediatric subjects. We identified 27 rules with interesting odds ratios among the more than 10,000 rules having the required support. The only rule including a single chemical is exposure to ozone on the day preceding the reported asthma attack (OR=1.14). The 26 combinatory rules highlight the limitations of air quality policies based on single-pollutant thresholds and suggest that exposure to mixtures of chemicals is more harmful, with odds ratios as high as 1.54 (associated with the combination day0 SO2, day0 NO, day0 NO2, day1 PM). The proposed method can be used to analyze risk variations caused by single and multiple exposures. The method is reliable and requires fewer assumptions on the data than parametric approaches. Rules including more than one pollutant highlight interactions that deserve further investigation, while helping to limit the search field. Copyright © 2016 Elsevier B.V. All rights reserved.
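
    A minimal Python sketch of odds-ratio-based rule screening with a Woolf confidence interval on an unmatched 2x2 table (the paper analyzes matched case-crossover data, so this is illustrative only; variable names are hypothetical):

        import math

        def odds_ratio_ci(a, b, c, d, z=1.96):
            """OR and 95% CI for a 2x2 table: a = exposed cases, b = exposed
            controls, c = unexposed cases, d = unexposed controls.
            (Add 0.5 to every cell first if any count is zero.)"""
            or_ = (a * d) / (b * c)
            se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
            return or_, or_ * math.exp(-z * se), or_ * math.exp(z * se)

        def keep_rule(a, b, c, d, n_total, min_support=0.005):
            """Post-pruning in the spirit of the paper: require minimum support
            and a statistically significant OR (95% CI excluding 1)."""
            or_, lo, hi = odds_ratio_ci(a, b, c, d)
            return a / n_total >= min_support and lo > 1.0

        print(keep_rule(a=40, b=60, c=200, d=500, n_total=800))  # -> True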

  7. Use HypE to Hide Association Rules by Adding Items

    PubMed Central

    Cheng, Peng; Lin, Chun-Wei; Pan, Jeng-Shyang

    2015-01-01

    During business collaboration, partners may benefit from sharing data. People may use data mining tools to discover useful relationships from shared data. However, some relationships are sensitive to the data owners, who may wish to conceal them before sharing. In this paper, we address this problem in the form of association rule hiding. A hiding method based on evolutionary multi-objective optimization (EMO) is proposed, which performs the hiding task by selectively inserting items into the database to decrease the confidence of sensitive rules below specified thresholds. The side effects generated during the hiding process are taken as optimization goals to be minimized. HypE, a recently proposed EMO algorithm, is utilized to identify promising transactions for modification so as to minimize side effects. Results on real datasets demonstrate that the proposed method can effectively perform sanitization with less damage to the non-sensitive knowledge in most cases. PMID:26070130

  8. Finite-width Laplace sum rules for 0-+ pseudoscalar glueball in the instanton vacuum model

    NASA Astrophysics Data System (ADS)

    Wang, Feng; Chen, Junlong; Liu, Jueping

    2015-10-01

    The correlation function of the 0-+ pseudoscalar glueball current is calculated based on the semiclassical expansion for quantum chromodynamics (QCD) in the instanton liquid background. Besides taking into account the pure classical contribution from instantons and the perturbative one, we calculate the contribution arising from the interaction (or interference) between instantons and the quantum gluon fields, which is infrared free and more important than the pure perturbative one. Instead of the usual zero-width approximation for the resonances, the Breit-Wigner form with a correct threshold behavior for the spectral function of the finite-width resonance is adopted. The properties of the 0-+ pseudoscalar glueball are investigated via a family of QCD Laplace sum rules. Consistency between the subtracted and unsubtracted sum rules is very well justified. The values of the mass, decay width, and coupling constants for the 0-+ resonance in which the glueball fraction is dominant are obtained.

  9. Computer vision for general purpose visual inspection: a fuzzy logic approach

    NASA Astrophysics Data System (ADS)

    Chen, Y. H.

    In automatic visual industrial inspection, computer vision systems have been widely used. Such systems are often application specific and therefore require domain knowledge for a successful implementation. Since visual inspection can be viewed as a decision-making process, it is argued that the integration of fuzzy logic analysis with computer vision systems provides a practical approach to general purpose visual inspection applications. This paper describes the development of an integrated fuzzy-rule-based automatic visual inspection system. Domain knowledge about a particular application is represented as a set of fuzzy rules. From the status of predefined fuzzy variables, the set of fuzzy rules is defuzzified to give the inspection results. A practical application, the inspection of IC marks (often in the form of English characters and a company logo), is demonstrated and shows more consistent results than a conventional thresholding method.

  10. Gravitropic responses of the Avena coleoptile in space and on clinostats. I. Gravitropic response thresholds

    NASA Technical Reports Server (NTRS)

    Brown, A. H.; Chapman, D. K.; Johnsson, A.; Heathcote, D.

    1995-01-01

    We conducted a series of gravitropic experiments on Avena coleoptiles in the weightlessness environment of Spacelab. The purpose was to test the threshold stimulus, the reciprocity rule, and autotropic reactions for a range of g-force stimulations of different intensities and durations. The tests avoided the potentially complicating effects of earth's gravity and the interference from clinostat ambiguities. Using slow-speed centrifuges, coleoptiles received transversal accelerations in the hypogravity range between 0.1 and 1.0 g over periods that ranged from 2 to 130 min. All responses that occurred in weightlessness were compared to clinostat experiments on earth using the same apparatus. Characteristic gravitropic response patterns of Avena were not substantially different from those observed in ground-based experiments. Gravitropic presentation times were extrapolated. The threshold at 1.0 g was less than 1 min (shortest stimulation time 2 min), in agreement with values obtained on the ground. The least stimulus tested, 0.1 g for 130 min, produced a significant response. Therefore the absolute threshold for a gravitropic response is less than 0.1 g.

  11. The value of EHR-based assessment of physician competency: An investigative effort with internal medicine physicians.

    PubMed

    Venta, Kimberly; Baker, Erin; Fidopiastis, Cali; Stanney, Kay

    2017-12-01

    The purpose of this study was to investigate the potential of developing an EHR-based model of physician competency, named the Skill Deficiency Evaluation Toolkit for Eliminating Competency-loss Trends (Skill-DETECT), which presents the opportunity to use EHR-based models to inform the selection of Continuing Medical Education (CME) opportunities specifically targeted at maintaining proficiency. The IBM Explorys platform provided outpatient Electronic Health Records (EHRs) representing 76 physicians with over 5000 patients combined. These data were used to develop the Skill-DETECT model, a predictive hybrid model composed of a rule-based model, a logistic regression model, and a thresholding model, which predicts cognitive clinical skill deficiencies in internal medicine physicians. A three-phase approach was then used to statistically validate model performance. Subject Matter Expert (SME) panel reviews resulted in a 100% overall approval rate for the rule-based model. The areas under the receiver-operating characteristic curves, calculated for each logistic regression model, were between 0.76 and 0.92, indicating excellent performance. Normality, skewness, and kurtosis were determined and confirmed that the distribution of values output from the thresholding model was unimodal and peaked, which supports effectiveness and generalizability. The validation confirmed that the Skill-DETECT model has a strong ability to evaluate EHR data and support the identification of internal medicine cognitive clinical skills that are deficient or have a higher likelihood of becoming deficient and thus require remediation, which will allow both physicians and medical organizations to fine-tune training efforts. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. 78 FR 35082 - Self-Regulatory Organizations; Chicago Stock Exchange, Inc.; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-11

    ... Organizations; Chicago Stock Exchange, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule... designated threshold.\\5\\ In addition, the Exchange adopted security-type specific parameter values, such as..., Threshold Away Amount, Minimum Duration and N mult , will be made through proposed fee filings pursuant to...

  13. 76 FR 23732 - Margin Requirements for Uncleared Swaps for Swap Dealers and Major Swap Participants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-28

    ... these statutory bounds the Commission has endeavored to limit costs appropriately. For example, as... Commission anticipates that the prudential regulators will publicly post their proposed rules on their Web... containing a threshold below which the CSE was not required to post initial margin, i.e., zero thresholds...

  14. Bayesian regression discontinuity designs: incorporating clinical knowledge in the causal analysis of primary care data.

    PubMed

    Geneletti, Sara; O'Keeffe, Aidan G; Sharples, Linda D; Richardson, Sylvia; Baio, Gianluca

    2015-07-10

    The regression discontinuity (RD) design is a quasi-experimental design that estimates the causal effects of a treatment by exploiting naturally occurring treatment rules. It can be applied in any context where a particular treatment or intervention is administered according to a pre-specified rule linked to a continuous variable. Such thresholds are common in primary care drug prescription, where the RD design can be used to estimate the causal effect of medication in the general population. Such results can then be contrasted with those obtained from randomised controlled trials (RCTs) and inform prescription policy and guidelines based on a more realistic and less expensive context. In this paper we focus on statins, a class of cholesterol-lowering drugs; however, the methodology can be applied to many other drugs provided these are prescribed in accordance with pre-determined guidelines. Current guidelines in the UK state that statins should be prescribed to patients with 10-year cardiovascular disease risk scores in excess of 20%. If we consider patients whose risk scores are close to the 20% threshold, we find that there is an element of random variation in both the risk score itself and its measurement. We can therefore consider the threshold as a randomising device that assigns statin prescription to individuals just above the threshold and withholds it from those just below. Thus, we are effectively replicating the conditions of an RCT in the area around the threshold, removing or at least mitigating confounding. We frame the RD design in the language of conditional independence, which clarifies the assumptions necessary to apply an RD design to data and makes the links with instrumental variables clear. We also have context-specific knowledge about the expected sizes of the effects of statin prescription and are thus able to incorporate this into Bayesian models by formulating informative priors on our causal parameters. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
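
    A bare-bones Python sketch of the local contrast at the threshold, as a simple difference in mean outcomes within a narrow bandwidth (the paper instead embeds the comparison in a Bayesian model with informative priors; the bandwidth value here is arbitrary):

        def rd_contrast(risk_scores, outcomes, cutoff=0.20, bandwidth=0.02):
            """Compare patients just above the prescription threshold (treated
            under the rule) with those just below (untreated)."""
            above = [y for r, y in zip(risk_scores, outcomes)
                     if cutoff <= r < cutoff + bandwidth]
            below = [y for r, y in zip(risk_scores, outcomes)
                     if cutoff - bandwidth <= r < cutoff]
            return sum(above) / len(above) - sum(below) / len(below)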

  15. Sparse image reconstruction for molecular imaging.

    PubMed

    Ting, Michael; Raich, Raviv; Hero, Alfred O

    2009-06-01

    The application that motivates this paper is molecular imaging at the atomic level. When discretized at subatomic distances, the volume is inherently sparse. Noiseless measurements from an imaging technology can be modeled by convolution of the image with the system point spread function (psf). Such is the case with magnetic resonance force microscopy (MRFM), an emerging technology in which imaging of an individual tobacco mosaic virus was recently demonstrated with nanometer resolution. We also consider additive white Gaussian noise (AWGN) in the measurements. Many prior works on sparse estimators have focused on the case where the system matrix H has low coherence; in our application, however, H is the convolution matrix for the system psf, and a typical convolution matrix has high coherence. This paper, therefore, does not assume a low-coherence H. A discrete-continuous form of the Laplacian and atom at zero (LAZE) p.d.f. used by Johnstone and Silverman is formulated, and two sparse estimators are derived by maximizing the joint p.d.f. of the observation and image conditioned on the hyperparameters. A thresholding rule that generalizes the hard and soft thresholding rules appears in the course of the derivation. This so-called hybrid thresholding rule, when used in the iterative thresholding framework, gives rise to the hybrid estimator, a generalization of the lasso. Estimates of the hyperparameters for the lasso and hybrid estimator are obtained via Stein's unbiased risk estimate (SURE). A numerical study with a Gaussian psf and two sparse images shows that the hybrid estimator outperforms the lasso.
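
    The hard and soft rules that the hybrid generalizes are easy to state; a short Python sketch (the interpolated rule at the end is a generic illustration of spanning the two, not the paper's LAZE-derived hybrid):

        def hard_threshold(x, t):
            """Keep x unchanged if |x| > t, else set it to zero."""
            return x if abs(x) > t else 0.0

        def soft_threshold(x, t):
            """Shrink x toward zero by t (the lasso rule)."""
            sign = 1.0 if x > 0 else -1.0
            return sign * (abs(x) - t) if abs(x) > t else 0.0

        def interpolated_threshold(x, t, a):
            """Generic rule spanning soft (a=1) and hard (a=0) shrinkage;
            shown only to convey how one rule can generalize both."""
            sign = 1.0 if x > 0 else -1.0
            return x - a * t * sign if abs(x) > t else 0.0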

  16. Merit-Based Incentive Payment System: Meaningful Changes in the Final Rule Brings Cautious Optimism.

    PubMed

    Manchikanti, Laxmaiah; Helm Ii, Standiford; Calodney, Aaron K; Hirsch, Joshua A

    2017-01-01

    The Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) eliminated the flawed Sustainable Growth Rate (SGR) formula - a longstanding crucial issue of concern for health care providers and Medicare beneficiaries. MACRA also included a quality improvement program entitled "The Merit-Based Incentive Payment System, or MIPS." The proposed rule for MIPS sought to streamline existing federal quality efforts and therefore merged 4 distinct programs into one. Three existing programs, meaningful use (MU), the Physician Quality Reporting System (PQRS), and the value-based payment (VBP) system, were merged, with the addition of a Clinical Improvement Activity category. The proposed rule also renamed MU as Advancing Care Information, or ACI. ACI contributes 25% of the composite score of the four programs, PQRS contributes 50%, and the VBP system, which deals with resource use or cost, contributes 10%. The newest category, Improvement Activities or IA, contributes 15% to the composite score. The proposed rule also created what it called a design incentive that drives movement toward delivery system reform principles with the inclusion of Advanced Alternative Payment Models (APMs). Following the release of the proposed rule, the medical community, as well as Congress, provided substantial input to the Centers for Medicare and Medicaid Services (CMS), expressing their concern. The American Society of Interventional Pain Physicians (ASIPP) focused on 3 important aspects: delaying the implementation, providing a 3-month performance period, and providing the ability to submit meaningful quality measures in a timely and economic manner. The final rule accepted many of the comments from various organizations, including several of those specifically emphasized by ASIPP, with acceptance of a 3-month reporting period, as well as the ability to submit non-MIPS measures to improve real quality and make the system meaningful. CMS also provided a mechanism for physicians to avoid penalties for non-reporting by reporting on just a single patient. In summary, CMS has provided substantial flexibility with mechanisms to avoid penalties: reporting for 90 continuous days, increasing the low-volume threshold, changing the reporting burden and data thresholds, and improving coordination between performance categories. The final rule has made MIPS more meaningful with bonuses for exceptional performance, the ability to report for 90 days, and the ability to report on 50% of patients in 2017 and 60% of patients in 2018. The final rule also reduced the quality measures to 6, including only one outcome or high-priority measure, with elimination of the cross-cutting measure requirement. In addition, the final rule reduced the burden of ACI, improved the coordination of performance, and reduced the improvement activities burden from 60 points to 40 points. Multiple concerns remain regarding the reduction in scoring for quality improvement in future years, the increase in the proportion of MIPS scoring for resource use utilizing a flawed, claims-based methodology, and the continuation of the disproportionate importance of ACI, an expensive program that can be onerous for providers and in many ways has not lived up to its promise. Key words: Medicare Access and CHIP Reauthorization Act of 2015, merit-based incentive payment system, quality performance measures, resource use, improvement activities, advancing care information performance category.

  17. Policy Tree Optimization for Adaptive Management of Water Resources Systems

    NASA Astrophysics Data System (ADS)

    Herman, J. D.; Giuliani, M.

    2016-12-01

    Water resources systems must cope with irreducible uncertainty in supply and demand, requiring policy alternatives capable of adapting to a range of possible future scenarios. Recent studies have developed adaptive policies based on "signposts" or "tipping points", which are threshold values of indicator variables that signal a change in policy. However, there remains a need for a general method to optimize the choice of indicators and their threshold values in a way that is easily interpretable for decision makers. Here we propose a conceptual framework and computational algorithm to design adaptive policies as a tree structure (i.e., a hierarchical set of logical rules) using a simulation-optimization approach based on genetic programming. We demonstrate the approach using Folsom Reservoir, California as a case study, in which operating policies must balance the risk of both floods and droughts. Given a set of feature variables, such as reservoir level, inflow observations and forecasts, and time of year, the resulting policy defines the conditions under which flood control and water supply hedging operations should be triggered. Importantly, the tree-based rule sets are easy to interpret for decision making, and can be compared to historical operating policies to understand the adaptations needed under possible climate change scenarios. Several remaining challenges are discussed, including the empirical convergence properties of the method, and extensions to irreversible decisions such as infrastructure. Policy tree optimization, and corresponding open-source software, provide a generalizable, interpretable approach to designing adaptive policies under uncertainty for water resources systems.
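
    A minimal Python sketch of evaluating such a tree of indicator-threshold rules (indicator names, threshold values, and actions are hypothetical, not the optimized Folsom policy):

        from dataclasses import dataclass
        from typing import Union

        @dataclass
        class Node:
            indicator: str            # feature to test, e.g., reservoir storage
            threshold: float          # split value for the rule
            low: Union['Node', str]   # branch when indicator < threshold
            high: Union['Node', str]  # branch otherwise; leaves are action names

        def evaluate(node, state):
            """Walk the rule tree for the current system state; return an action."""
            while isinstance(node, Node):
                node = node.low if state[node.indicator] < node.threshold else node.high
            return node

        policy = Node('storage_frac', 0.4, 'hedge_supply',
                      Node('inflow_forecast', 0.8, 'normal_ops', 'flood_release'))
        print(evaluate(policy, {'storage_frac': 0.55, 'inflow_forecast': 0.9}))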

  18. Statistical Short-Range Guidance for Peak Wind Speed Forecasts at Edwards Air Force Base, CA

    NASA Technical Reports Server (NTRS)

    Dreher, Joseph G.; Crawford, Winifred; Lafosse, Richard; Hoeth, Brian; Burns, Kerry

    2009-01-01

    The peak winds near the surface are an important forecast element for space shuttle landings. As defined in the Flight Rules (FR), there are peak wind thresholds that cannot be exceeded in order to ensure the safety of the shuttle during landing operations. The National Weather Service Spaceflight Meteorology Group (SMG) is responsible for weather forecasts for all shuttle landings and is required to issue surface average and 10-minute peak wind speed forecasts. SMG forecasters indicate that peak winds are a challenging parameter to forecast. To alleviate the difficulty in making such wind forecasts, the Applied Meteorology Unit (AMU) developed a PC-based graphical user interface (GUI) for displaying peak wind climatology and probabilities of exceeding peak wind thresholds for the Shuttle Landing Facility (SLF) at Kennedy Space Center (KSC; Lambert 2003). However, the shuttle occasionally may land at Edwards Air Force Base (EAFB) in southern California when weather conditions at KSC in Florida are not acceptable, so SMG forecasters requested that a similar tool be developed for EAFB.

  19. An Adaptive Deghosting Method in Neural Network-Based Infrared Detectors Nonuniformity Correction

    PubMed Central

    Li, Yiyang; Jin, Weiqi; Zhu, Jin; Zhang, Xu; Li, Shuo

    2018-01-01

    The problems of the neural network-based nonuniformity correction algorithm for infrared focal plane arrays mainly concern slow convergence speed and ghosting artifacts. In general, the more stringent the inhibition of ghosting, the slower the convergence speed. The factors that affect these two problems are the estimated desired image and the learning rate. In this paper, we propose a learning rate rule that combines adaptive threshold edge detection and a temporal gate. Through the noise estimation algorithm, the adaptive spatial threshold is related to the residual nonuniformity noise in the corrected image. The proposed learning rate is used to effectively and stably suppress ghosting artifacts without slowing down the convergence speed. The performance of the proposed technique was thoroughly studied with infrared image sequences with both simulated nonuniformity and real nonuniformity. The results show that the deghosting performance of the proposed method is superior to that of other neural network-based nonuniformity correction algorithms and that the convergence speed is equivalent to the tested deghosting methods. PMID:29342857

  1. MLESAC Based Localization of Needle Insertion Using 2D Ultrasound Images

    NASA Astrophysics Data System (ADS)

    Xu, Fei; Gao, Dedong; Wang, Shan; Zhanwen, A.

    2018-04-01

    In 2D ultrasound images of ultrasound-guided percutaneous needle insertions, it is difficult to determine the positions of the needle axis and tip because of artifacts and other noise. In this work speckle is regarded as the noise of the ultrasound image, and a novel algorithm is presented to detect the needle in a 2D ultrasound image. Firstly, a wavelet soft-thresholding technique based on the BayesShrink rule is used to denoise the speckle. Secondly, Otsu's thresholding method and morphological operations are applied to pre-process the ultrasound image. Finally, the needle is localized in the 2D ultrasound image using the maximum likelihood estimation sample consensus (MLESAC) algorithm. The experimental results show that the proposed algorithm validly estimates the position of the needle axis and tip in ultrasound images. This work is expected to be useful in path planning and robot-assisted needle insertion procedures.
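
    Otsu's method, used here in pre-processing, is compact enough to sketch from its textbook definition (a generic histogram-based implementation, not the authors' code):

        def otsu_threshold(hist):
            """Pick the gray level maximizing between-class variance for a
            histogram given as a list of per-level pixel counts."""
            total = sum(hist)
            sum_all = sum(level * count for level, count in enumerate(hist))
            best_level, best_var, w0, sum0 = 0, -1.0, 0, 0.0
            for level, count in enumerate(hist):
                w0 += count                # pixels in the background class
                if w0 == 0:
                    continue
                w1 = total - w0            # pixels in the foreground class
                if w1 == 0:
                    break
                sum0 += level * count
                mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
                between = w0 * w1 * (mu0 - mu1) ** 2
                if between > best_var:
                    best_var, best_level = between, level
            return best_level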

  2. 78 FR 57669 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-19

    ... Change Proposing To Modify the Manner in Which It Calculates Volume, Liquidity and Quoting Thresholds... Arca'') filed with the Securities and Exchange Commission (the ``Commission'') the proposed rule change... organization. The Commission is publishing this notice to solicit comments on the proposed rule change from...

  3. On the Design of a Fuzzy Logic-Based Control System for Freeze-Drying Processes.

    PubMed

    Fissore, Davide

    2016-12-01

    This article is focused on the design of a fuzzy logic-based control system to optimize a drug freeze-drying process. The goal of the system is to keep product temperature as close as possible to the threshold value of the formulation being processed, without trespassing it, in such a way that product quality is not jeopardized and the sublimation flux is maximized. The method relies on the measurement of product temperature and on a set of rules obtained through process simulation, with the goal of arriving at a single rule set valid for products with very different characteristics. Input variables are the difference between the temperature of the product and the threshold value, the difference between the temperature of the heating fluid and that of the product, and the rate of change of product temperature. The output variables are the variation of the temperature of the heating fluid and the pressure in the drying chamber. The effect of the starting values of the input variables and of the control interval has been investigated, resulting in the optimal configuration of the control system. An experimental investigation in a pilot-scale freeze-dryer was carried out to validate the proposed system. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  4. Receptive field optimisation and supervision of a fuzzy spiking neural network.

    PubMed

    Glackin, Cornelius; Maguire, Liam; McDaid, Liam; Sayers, Heather

    2011-04-01

    This paper presents a supervised training algorithm that implements fuzzy reasoning on a spiking neural network. Neuron selectivity is facilitated using receptive fields that enable individual neurons to be responsive to certain spike train firing rates and behave in a similar manner as fuzzy membership functions. The connectivity of the hidden and output layers in the fuzzy spiking neural network (FSNN) is representative of a fuzzy rule base. Fuzzy C-Means clustering is utilised to produce clusters that represent the antecedent part of the fuzzy rule base and aid classification of the feature data. Suitable cluster widths are determined using two strategies: subjective thresholding and evolutionary thresholding. The former technique typically results in compact solutions in terms of the number of neurons and is shown to be particularly suited to small data sets. In the latter technique a pool of cluster candidates is generated using Fuzzy C-Means clustering and then a genetic algorithm is employed to select the most suitable clusters and to specify cluster widths. In both scenarios, the network is supervised but learning only occurs locally as in the biological case. The advantages and disadvantages of the network topology for the Fisher Iris and Wisconsin Breast Cancer benchmark classification tasks are demonstrated and directions of current and future work are discussed. Copyright © 2010 Elsevier Ltd. All rights reserved.

  5. Summary of water body extraction methods based on ZY-3 satellite

    NASA Astrophysics Data System (ADS)

    Zhu, Yu; Sun, Li Jian; Zhang, Chuan Yin

    2017-12-01

    Extracting water bodies from remote sensing images is one of the main means of water information extraction. Because of their spectral characteristics, many methods cannot be applied to ZY-3 satellite imagery. To address this problem, we summarize the extraction methods suitable for ZY-3 and analyze their results. Based on the characteristics of these results, a method combining a water index (WI) with a single-band threshold and a method of texture filtering based on probability statistics are explored. In addition, the advantages and disadvantages of all methods are compared, providing a reference for research on water extraction from images. The conclusions are as follows. 1) NIR has higher sensitivity to water; consequently, when the surface reflectance in the study area is less similar to water, a single-band threshold or multi-band operation can achieve good results. 2) Compared with the water index and HIS optimal index methods, rule-based object extraction, which takes into account not only the spectral information of water but also spatial and textural constraints, can achieve better extraction, yet the image segmentation process is time-consuming and defining the rules requires domain knowledge. 3) Combining spectral relationships with a water index can eliminate shadow interference to a certain extent. When there are few small water bodies, or small water bodies are not considered, texture filtering based on probability statistics can effectively reduce noise in the result and largely avoid confusing shadows or paddy fields with water.
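
    As an example of the water index plus single-band threshold family discussed above, a minimal Python sketch using McFeeters' NDWI = (Green - NIR) / (Green + NIR), with water taken as NDWI above a cutoff near 0 (band handling is simplified to per-pixel reflectance lists):

        def ndwi_water_mask(green, nir, cutoff=0.0):
            """Label a pixel as water when its NDWI exceeds the cutoff."""
            mask = []
            for g, n in zip(green, nir):
                ndwi = (g - n) / (g + n) if (g + n) != 0 else 0.0
                mask.append(ndwi > cutoff)
            return mask

        print(ndwi_water_mask(green=[0.30, 0.08], nir=[0.05, 0.30]))  # [True, False]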

  6. 78 FR 69177 - Ownership and Control Reports, Forms 102/102S, 40/40S, and 71

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-18

    ... that comprise each special account; requiring the reporting of certain omnibus account information on..., information regarding the owners and controllers of volume threshold accounts reported on Form 102B and that... introducing a new information collection for omnibus volume threshold accounts in New Form 71.\\11\\ The rules...

  7. 76 FR 64983 - Self-Regulatory Organizations; NYSE Amex LLC; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-19

    ...-Regulatory Organizations; NYSE Amex LLC; Notice of Filing and Immediate Effectiveness of Proposed Rule Change... currently count toward both the $350,000 cap and the 3,500,000 thresholds, but are not themselves capped... standard fee schedule, with the exception of transactions that exceed the fee cap threshold for Specialists...

  8. 16 CFR 802.21 - Acquisitions of voting securities not meeting or exceeding greater notification threshold (as...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Acquisitions of voting securities not meeting or exceeding greater notification threshold (as adjusted). 802.21 Section 802.21 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS, STATEMENTS AND INTERPRETATIONS UNDER THE HART-SCOTT-RODINO ANTITRUST IMPROVEMENTS ACT OF 1976...

  9. 16 CFR 802.21 - Acquisitions of voting securities not meeting or exceeding greater notification threshold (as...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Acquisitions of voting securities not meeting or exceeding greater notification threshold (as adjusted). 802.21 Section 802.21 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS, STATEMENTS AND INTERPRETATIONS UNDER THE HART-SCOTT-RODINO ANTITRUST IMPROVEMENTS ACT OF 1976...

  10. Rethinking the Clinically Based Thresholds of TransCelerate BioPharma for Risk-Based Monitoring.

    PubMed

    Zink, Richard C; Dmitrienko, Anastasia; Dmitrienko, Alex

    2018-01-01

    The quality of data from clinical trials has received a great deal of attention in recent years. Of central importance is the need to protect the well-being of study participants and maintain the integrity of final analysis results. However, traditional approaches to assessing data quality have come under increased scrutiny as providing little benefit for the substantial cost. Numerous regulatory guidance documents and industry position papers have described risk-based approaches to identify quality and safety issues. In particular, the position paper of TransCelerate BioPharma recommends defining risk thresholds to assess safety and quality risks based on past clinical experience. This exercise can be extremely time-consuming, and the resulting thresholds may only be relevant to a particular therapeutic area, patient population, or clinical site population. In addition, predefined thresholds cannot account for safety or quality issues where the underlying rate of observing a particular problem may change over the course of a clinical trial, and they often do not consider varying patient exposure. In this manuscript, we appropriate rules commonly utilized for funnel plots to define a traffic-light system for risk indicators based on statistical criteria that consider the duration of patient follow-up. Further, we describe how these methods can be adapted to assess changing risk over time. Finally, we illustrate numerous graphical approaches to summarize and communicate risk, and we discuss hybrid clinical-statistical approaches that allow for the assessment of risk at sites with low patient enrollment. We illustrate the aforementioned methodologies with a clinical trial in patients with schizophrenia. Funnel plots are a flexible graphical technique that can form the basis for a risk-based strategy to assess data integrity while considering site sample size, patient exposure, and changing risk across time.
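
    A short Python sketch of funnel-plot control limits for a site-level event proportion, using a normal approximation at roughly 99.8% coverage (z ≈ 3.09); the manuscript's exact limits and traffic-light cutoffs may differ:

        import math

        def funnel_limits(p_bar, n, z=3.09):
            """Control limits around the study-wide rate p_bar for a site with
            exposure n; limits tighten as n grows, as in a funnel plot."""
            half_width = z * math.sqrt(p_bar * (1.0 - p_bar) / n)
            return max(0.0, p_bar - half_width), min(1.0, p_bar + half_width)

        lo, hi = funnel_limits(p_bar=0.15, n=40)        # 15% study-wide event rate
        site_rate = 14 / 40                             # 14 events in 40 patients
        print('red' if not lo <= site_rate <= hi else 'green')  # -> red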

  11. 75 FR 43892 - Approval and Promulgation of Implementation Plans; New York Prevention of Significant...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... proposal related to the Tailoring Rule thresholds. III. What is EPA's analysis of New York's NSR rule...) Program. New York's amendments to Part 201 revise the definition for ``major stationary source or major... Parts 200 and 201, including the new or revised definitions are consistent with Federal guidance, EPA is...

  12. 75 FR 77051 - Rules Implementing Amendments to the Investment Advisers Act of 1940

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-10

    ...The Securities and Exchange Commission is proposing new rules and rule amendments under the Investment Advisers Act of 1940 to implement provisions of the Dodd-Frank Wall Street Reform and Consumer Protection Act. These rules and rule amendments are designed to give effect to provisions of Title IV of the Dodd-Frank Act that, among other things, increase the statutory threshold for registration by investment advisers with the Commission, require advisers to hedge funds and other private funds to register with the Commission, and require reporting by certain investment advisers that are exempt from registration. In addition, we are proposing rule amendments, including amendments to the Commission's pay-to-play rule, that address a number of other changes to the Advisers Act made by the Dodd-Frank Act.

  13. 77 FR 15436 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-15

    ... Effectiveness of a Proposed Rule Change To Amend Its Fees Schedule March 9, 2012. Pursuant to Section 19(b)(1... customer transaction (the threshold for all index options is set at 5,000 contracts other than S&P 500 index options, for which the threshold is 10,000 contracts). The Exchange offers the Discount in order...

  14. Kuramoto model with uniformly spaced frequencies: Finite-N asymptotics of the locking threshold.

    PubMed

    Ottino-Löffler, Bertrand; Strogatz, Steven H

    2016-06-01

    We study phase locking in the Kuramoto model of coupled oscillators in the special case where the number of oscillators, N, is large but finite, and the oscillators' natural frequencies are evenly spaced on a given interval. In this case, stable phase-locked solutions are known to exist if and only if the frequency interval is narrower than a certain critical width, called the locking threshold. For infinite N, the exact value of the locking threshold was calculated 30 years ago; however, the leading corrections to it for finite N have remained unsolved analytically. Here we derive an asymptotic formula for the locking threshold when N≫1. The leading correction to the infinite-N result scales like either N^{-3/2} or N^{-1}, depending on whether the frequencies are evenly spaced according to a midpoint rule or an end-point rule. These scaling laws agree with numerical results obtained by Pazó [D. Pazó, Phys. Rev. E 72, 046211 (2005)PLEEE81539-375510.1103/PhysRevE.72.046211]. Moreover, our analysis yields the exact prefactors in the scaling laws, which also match the numerics.
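
    The scaling claim can be probed numerically. Below is a rough sketch, assuming the standard mean-field form of the Kuramoto model with coupling K = 1, a forward-Euler integrator, and a crude numerical locking criterion; the tolerances and integration horizon are illustrative, and classification can be unreliable very close to the threshold where convergence slows down:

    ```python
    import numpy as np

    def is_locked(gamma, N=101, K=1.0, rule="midpoint", t_max=500.0, dt=0.02):
        """Integrate the mean-field Kuramoto model with N evenly spaced natural
        frequencies on [-gamma, gamma] and report whether a phase-locked state
        is reached (all instantaneous frequencies numerically equal)."""
        k = np.arange(N)
        if rule == "midpoint":
            omega = -gamma + 2.0 * gamma * (k + 0.5) / N   # midpoint spacing
        else:
            omega = -gamma + 2.0 * gamma * k / (N - 1)     # end-point spacing
        theta = np.zeros(N)
        for _ in range(int(t_max / dt)):
            z = np.exp(1j * theta).mean()                  # complex order parameter
            theta += dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - theta))
        z = np.exp(1j * theta).mean()
        dtheta = omega + K * np.abs(z) * np.sin(np.angle(z) - theta)
        return np.ptp(dtheta) < 1e-6

    def locking_threshold(N=101, rule="midpoint", lo=0.0, hi=1.0, iters=30):
        """Bisect on the frequency half-width gamma to locate the locking threshold."""
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if is_locked(mid, N=N, rule=rule) else (lo, mid)
        return 0.5 * (lo + hi)

    print(locking_threshold(N=51))  # approaches pi/4 ~ 0.7854 for K = 1 as N grows
    ```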

  15. Symmetrical and asymmetrical outcomes of leader anger expression: A qualitative study of army personnel

    PubMed Central

    Lindebaum, Dirk; Jordan, Peter J; Morris, Lucy

    2016-01-01

    Recent studies have highlighted the utility of anger at work, suggesting that anger can have positive outcomes. Using the Dual Threshold Model, we assess the positive and negative consequences of anger expressions at work and focus on the conditions under which expressions of anger crossing the impropriety threshold are perceived as productive or counterproductive by observers or targets of that anger. To explore this phenomenon, we conducted a phenomenological study (n = 20) to probe the lived experiences of followers (as observers and targets) associated with anger expressions by military leaders. The nature of task (e.g. the display rules prescribed for combat situations) emerged as one condition under which the crossing of the impropriety threshold leads to positive outcomes of anger expressions. Our data reveal tensions between emotional display rules and emotional display norms in the military, thereby fostering paradoxical attitudes toward anger expression and its consequences among followers. Within this paradoxical space, anger expressions have both positive (asymmetrical) and negative (symmetrical) consequences. We place our findings in the context of the Dual Threshold Model, discuss the practical implications of our research and offer avenues for future studies. PMID:26900171

  16. Symmetrical and asymmetrical outcomes of leader anger expression: A qualitative study of army personnel.

    PubMed

    Lindebaum, Dirk; Jordan, Peter J; Morris, Lucy

    2016-02-01

    Recent studies have highlighted the utility of anger at work, suggesting that anger can have positive outcomes. Using the Dual Threshold Model, we assess the positive and negative consequences of anger expressions at work and focus on the conditions under which expressions of anger crossing the impropriety threshold are perceived as productive or counterproductive by observers or targets of that anger. To explore this phenomenon, we conducted a phenomenological study (n = 20) to probe the lived experiences of followers (as observers and targets) associated with anger expressions by military leaders. The nature of task (e.g. the display rules prescribed for combat situations) emerged as one condition under which the crossing of the impropriety threshold leads to positive outcomes of anger expressions. Our data reveal tensions between emotional display rules and emotional display norms in the military, thereby fostering paradoxical attitudes toward anger expression and its consequences among followers. Within this paradoxical space, anger expressions have both positive (asymmetrical) and negative (symmetrical) consequences. We place our findings in the context of the Dual Threshold Model, discuss the practical implications of our research and offer avenues for future studies.

  17. NASA Safety and Health (Short Form). Final rule

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This final rule adopts with changes the interim rule published in the Federal Register on April 5, 2001 (65 FR 18051-18053), which amended the NASA FAR Supplement to implement a Safety and Health (Short Form) clause to address safety and occupational health in all NASA contracts above the micro-purchase threshold where the existing Safety and Health clause did not apply, and amended other safety and health clauses to be consistent with the new NASA Safety and Health (Short Form) clause.

  18. Detecting, Monitoring, and Reporting Possible Adverse Drug Events Using an Arden-Syntax-based Rule Engine.

    PubMed

    Fehre, Karsten; Plössnig, Manuela; Schuler, Jochen; Hofer-Dückelmann, Christina; Rappelsberger, Andrea; Adlassnig, Klaus-Peter

    2015-01-01

    The detection of adverse drug events (ADEs) is an important aspect of improving patient safety. The iMedication system employs predefined triggers associated with significant events in a patient's clinical data to automatically detect possible ADEs. We defined four clinically relevant conditions: hyperkalemia, hyponatremia, renal failure, and over-anticoagulation. These are some of the most relevant ADEs in internal medicine and geriatric wards. For each patient, ADE risk scores for all four conditions are calculated and compared against a threshold, and each case is judged to be monitored or reported. A ward-based cockpit view summarizes the results.

  19. Top Level Space Cost Methodology (TLSCM)

    DTIC Science & Technology

    1997-12-02

    Software ... 6. ACEIT ... C. Ground Rules and Assumptions ... D. Typical Life Cycle Cost Distribution ... E. Methodologies ... 1. Cost/budget Threshold ... 2. Analogy...which is based on real-time Air Force and space programs. Ref. (25:2-8, 2-9) 6. ACEIT: Automated Cost Estimating Integrated Tools (ACEIT), Tecolote...Research, Inc. There is a way to use the ACEIT cost program to get a print-out of an expanded WBS. Therefore, find someone that has ACEIT experience and

  20. System and method for embedding emotion in logic systems

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A. (Inventor)

    2012-01-01

    A system, method, and computer-readable media for creating a stable synthetic neural system. The method includes training an intellectual choice-driven synthetic neural system (SNS), training an emotional rule-driven SNS by generating emotions from rules, incorporating the rule-driven SNS into the choice-driven SNS through an evolvable interface, and balancing the emotional SNS and the intellectual SNS to achieve stability in a nontrivial autonomous environment with a Stability Algorithm for Neural Entities (SANE). Generating emotions from rules can include coding the rules into the rule-driven SNS in a self-consistent way. Training the emotional rule-driven SNS can occur during a training stage in parallel with training the choice-driven SNS. The training stage can include a self-assessment loop which measures performance characteristics of the rule-driven SNS against core genetic code. The method uses a stability threshold to measure stability of the incorporated rule-driven SNS and choice-driven SNS using SANE.

  1. Controlling false-negative errors in microarray differential expression analysis: a PRIM approach.

    PubMed

    Cole, Steve W; Galic, Zoran; Zack, Jerome A

    2003-09-22

    Theoretical considerations suggest that current microarray screening algorithms may fail to detect many true differences in gene expression (Type II analytic errors). We assessed 'false negative' error rates in differential expression analyses by conventional linear statistical models (e.g. t-test), microarray-adapted variants (e.g. SAM, Cyber-T), and a novel strategy based on hold-out cross-validation. The latter approach employs the machine-learning algorithm Patient Rule Induction Method (PRIM) to infer minimum thresholds for reliable change in gene expression from Boolean conjunctions of fold-induction and raw fluorescence measurements. Monte Carlo analyses based on four empirical data sets show that conventional statistical models and their microarray-adapted variants overlook more than 50% of genes showing significant up-regulation. Conjoint PRIM prediction rules recover approximately twice as many differentially expressed transcripts while maintaining strong control over false-positive (Type I) errors. As a result, experimental replication rates increase and total analytic error rates decline. RT-PCR studies confirm that gene inductions detected by PRIM but overlooked by other methods represent true changes in mRNA levels. PRIM-based conjoint inference rules thus represent an improved strategy for high-sensitivity screening of DNA microarrays. Freestanding JAVA application at http://microarray.crump.ucla.edu/focus
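
    The conjunctive form of a PRIM-derived rule is simple to apply once the cutoffs have been learned. A minimal sketch, with placeholder cutoff values rather than those inferred in the paper:

    ```python
    import numpy as np

    def call_differential(fold_change, raw_fluorescence,
                          min_fold=1.8, min_raw=250.0):
        """Apply a PRIM-style conjunctive rule: a transcript is called
        differentially expressed only if BOTH its fold-induction and its raw
        fluorescence exceed their learned minimum thresholds."""
        fold_change = np.asarray(fold_change, dtype=float)
        raw_fluorescence = np.asarray(raw_fluorescence, dtype=float)
        return (fold_change >= min_fold) & (raw_fluorescence >= min_raw)

    # Example: three probes; only the first satisfies both conjuncts
    print(call_differential([2.1, 2.5, 1.2], [300.0, 90.0, 400.0]))
    # -> [ True False False]
    ```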

  2. 75 FR 51126 - Self-Regulatory Organizations; Notice of Filing and Immediate Effectiveness of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-18

    ... accordance with Rule 6.24. \\6\\ The term OTP refers to an Options Trading Permit issued by the Exchange for... Ex-by-Ex threshold, coupled with the dramatic increase in option trading volume from 2003 to 2009... and the increase in trading, the existing deadline for submitting CEAs to the Exchange is problematic...

  3. 78 FR 11795 - Minimum Technical Standards for Class II Gaming Systems and Equipment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-20

    ... threshold amount. The Commission increased that amount in the MICS from $1,000,000 to $3,000,000. The... considered to be small entities for the purposes of the Regulatory Flexibility Act. Small Business Regulatory Enforcement Fairness Act The proposed rule is not a major rule under 5 U.S.C. 804(2), the Small Business...

  4. 78 FR 2249 - Magnuson-Stevens Act Provisions; Fisheries of the Northeastern United States; Northeast...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-10

    ...This action reopens the comment period for an Acadian redfish-related proposed rule that was published on November 8, 2012. The original comment period closed on November 23, 2012. This action clarifies a bycatch threshold incorrectly explained in the proposed rule. The public comment period is being reopened to solicit additional public comment on this correction.

  5. Spontaneous Subarachnoid Hemorrhage: A Systematic Review and Meta-Analysis Describing the Diagnostic Accuracy of History, Physical Exam, Imaging, and Lumbar Puncture with an Exploration of Test Thresholds

    PubMed Central

    Carpenter, Christopher R.; Hussain, Adnan M.; Ward, Michael J.; Zipfel, Gregory J.; Fowler, Susan; Pines, Jesse M.; Sivilotti, Marco L.A.

    2016-01-01

    Background Spontaneous subarachnoid hemorrhage (SAH) is a rare, but serious etiology of headache. The diagnosis of SAH is especially challenging in alert, neurologically intact patients, as missed or delayed diagnosis can be catastrophic. Objectives To perform a diagnostic accuracy systematic review and meta-analysis of history, physical examination, cerebrospinal fluid (CSF) tests, computed tomography (CT), and clinical decision rules for spontaneous SAH. A secondary objective was to delineate probability of disease thresholds for imaging and lumbar puncture (LP). Methods PUBMED, EMBASE, SCOPUS, and research meeting abstracts were searched up to June 2015 for studies of emergency department (ED) patients with acute headache clinically concerning for spontaneous SAH. QUADAS-2 was used to assess study quality and, when appropriate, meta-analysis was conducted using random effects models. Outcomes were sensitivity, specificity, positive (LR+) and negative (LR−) likelihood ratios. To identify test- and treatment-thresholds, we employed the Pauker-Kassirer method with Bernstein test-indication curves using the summary estimates of diagnostic accuracy. Results A total of 5,022 publications were identified, of which 122 underwent full-text review; 22 studies were included (average SAH prevalence 7.5%). Diagnostic studies differed in assessment of history and physical exam findings, CT technology, analytical techniques used to identify xanthochromia, and criterion standards for SAH. Study quality by QUADAS-2 was variable; however, most had a relatively low risk of bias. A history of neck pain (LR+ 4.1 [95% CI 2.2-7.6]) and neck stiffness on physical exam (LR+ 6.6 [4.0-11.0]) were the individual findings most strongly associated with SAH. Combinations of findings may rule out SAH, yet promising clinical decision rules await external validation. Non-contrast cranial CT within 6 hours of headache onset accurately ruled-in (LR+ 230 [6-8700]) and ruled-out SAH (LR− 0.01 [0-0.04]); CT beyond 6 hours had a LR− of 0.07 [0.01-0.61]. CSF analyses had lower diagnostic accuracy, whether using red blood cell (RBC) count or xanthochromia. At a threshold RBC count of 1,000 × 10⁶/L, the LR+ was 5.7 [1.4-23] and LR− 0.21 [0.03-1.7]. Using the pooled estimates of diagnostic accuracy and testing risks and benefits, we estimate LP only benefits CT negative patients when the pre-LP probability of SAH is on the order of 5%, which corresponds to a pre-CT probability greater than 20%. Conclusions Less than one in ten headache patients concerning for SAH are ultimately diagnosed with SAH in recent studies. While certain symptoms and signs increase or decrease the likelihood of SAH, no single characteristic is sufficient to rule-in or rule-out SAH. Within 6 hours of symptom onset, non-contrast cranial CT is highly accurate, while a negative CT beyond 6 hours substantially reduces the likelihood of SAH. LP appears to benefit relatively few patients within a narrow pre-test probability range. With improvements in CT technology and an expanding body of evidence, test-thresholds for LP may become more precise, obviating the need for a post-CT LP in more acute headache patients. Existing SAH clinical decision rules await external validation, but offer the potential to identify subsets most likely to benefit from post-CT LP, angiography, or no further testing. PMID:27306497
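
    The threshold arguments above rest on updating a disease probability with a likelihood ratio in odds form. A small illustration using the reported estimates (a generic Bayes calculation, not the full Pauker-Kassirer threshold analysis):

    ```python
    def post_test_probability(pretest_p: float, lr: float) -> float:
        """Update a disease probability with a likelihood ratio via odds form:
        post-odds = pre-odds * LR; probability = odds / (1 + odds)."""
        odds = pretest_p / (1.0 - pretest_p)
        post_odds = odds * lr
        return post_odds / (1.0 + post_odds)

    # With the pooled 7.5% SAH prevalence, a negative CT within 6 h (LR- = 0.01)
    print(round(post_test_probability(0.075, 0.01), 5))   # ~0.00081
    # A negative CT beyond 6 h (LR- = 0.07) leaves more residual risk
    print(round(post_test_probability(0.075, 0.07), 5))   # ~0.00564
    ```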

  6. Evaluating links between forest harvest and stream temperature threshold exceedances: the value of spatial and temporal data

    Treesearch

    Jeremiah D. Groom; Sherri L. Johnson; Joshua D. Seeds; George G. Ice

    2017-01-01

    We present the results of a replicated before-after-control-impact study on 33 streams to test the effectiveness of riparian rules for private and State forests at meeting temperature criteria in streams in western Oregon. Many states have established regulatory temperature thresholds, referred to as numeric criteria, to protect cold-water fishes such as salmon and...

  7. Testing decision rules for categorizing species' extinction risk to help develop quantitative listing criteria for the U.S. Endangered Species Act.

    PubMed

    Regan, Tracey J; Taylor, Barbara L; Thompson, Grant G; Cochrane, Jean Fitts; Ralls, Katherine; Runge, Michael C; Merrick, Richard

    2013-08-01

    Lack of guidance for interpreting the definitions of endangered and threatened in the U.S. Endangered Species Act (ESA) has resulted in case-by-case decision making leaving the process vulnerable to being considered arbitrary or capricious. Adopting quantitative decision rules would remedy this but requires the agency to specify the relative urgency concerning extinction events over time, cutoff risk values corresponding to different levels of protection, and the importance given to different types of listing errors. We tested the performance of 3 sets of decision rules that use alternative functions for weighting the relative urgency of future extinction events: a threshold rule set, which uses a decision rule of x% probability of extinction over y years; a concave rule set, where the relative importance of future extinction events declines exponentially over time; and a shoulder rule set that uses a sigmoid shape function, where relative importance declines slowly at first and then more rapidly. We obtained decision cutoffs by interviewing several biologists and then emulated the listing process with simulations that covered a range of extinction risks typical of ESA listing decisions. We evaluated performance of the decision rules under different data quantities and qualities on the basis of the relative importance of misclassification errors. Although there was little difference between the performance of alternative decision rules for correct listings, the distribution of misclassifications differed depending on the function used. Misclassifications for the threshold and concave listing criteria resulted in more overprotection errors, particularly as uncertainty increased, whereas errors for the shoulder listing criteria were more symmetrical. We developed and tested the framework for quantitative decision rules for listing species under the U.S. ESA. If policy values can be agreed on, use of this framework would improve the implementation of the ESA by increasing transparency and consistency. Conservation Biology © 2013 Society for Conservation Biology No claim to original US government works.
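
    The three rule sets differ only in the function used to weight the relative urgency of an extinction occurring t years in the future. A sketch of the three shapes described above; the functional forms and parameter values are illustrative assumptions:

    ```python
    import math

    def threshold_weight(t: float, horizon: float = 100.0) -> float:
        """Threshold rule: only extinctions within the y-year horizon count."""
        return 1.0 if t <= horizon else 0.0

    def concave_weight(t: float, decay: float = 0.03) -> float:
        """Concave rule: importance of future extinctions declines exponentially."""
        return math.exp(-decay * t)

    def shoulder_weight(t: float, midpoint: float = 60.0, steepness: float = 0.1) -> float:
        """Shoulder rule: sigmoid shape, declining slowly at first, then rapidly."""
        return 1.0 / (1.0 + math.exp(steepness * (t - midpoint)))

    for t in (10, 50, 100, 200):
        print(t, threshold_weight(t), round(concave_weight(t), 3),
              round(shoulder_weight(t), 3))
    ```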

  8. pH-dependent electron-transport properties of carbon nanotubes.

    PubMed

    Back, Ju Hee; Shim, Moonsub

    2006-11-30

    Carbon nanotube electrochemical transistors integrated with microfluidic channels are utilized to examine the effects of aqueous electrolyte solutions on the electron-transport properties of single isolated carbon nanotubes. In particular, pH and concentration of supporting inert electrolytes are examined. A systematic threshold voltage shift with pH is observed while the transconductance and subthreshold swing remain independent of pH and concentration. Decreasing pH leads to a negative shift of the threshold voltage, indicating that protonation does not lead to hole doping. Changing the type of contact metal does not alter the observed pH response. The pH-dependent charging of SiO2 substrate is ruled out as the origin based on measurements with suspended nanotube transistors. Increasing the ionic strength leads to reduced pH response. Contributions from possible surface chargeable chemical groups are considered.

  9. Faults Discovery By Using Mined Data

    NASA Technical Reports Server (NTRS)

    Lee, Charles

    2005-01-01

    Fault discovery in complex systems consists of model-based reasoning, fault tree analysis, rule-based inference methods, and other approaches. Model-based reasoning builds models for the systems either by mathematical formulation or by experimental modeling. Fault tree analysis shows the possible causes of a system malfunction by enumerating the suspect components and their respective failure modes that may have induced the problem. Rule-based inference builds the model from expert knowledge. These models and methods have one thing in common: they presume certain prior conditions. Complex systems often use fault trees to analyze faults. Fault diagnosis, when an error occurs, is performed by engineers and analysts through extensive examination of all data gathered during the mission. The International Space Station (ISS) control center operates on the data fed back from the system, and decisions are made based on threshold values by using fault trees. Since those decision-making tasks are safety critical and must be done promptly, the engineers who manually analyze the data face a time challenge. To automate this process, this paper presents an approach that uses decision trees to discover faults from data in real time and to capture the contents of fault trees as the initial state of the trees.

  10. An Improved Compressive Sensing and Received Signal Strength-Based Target Localization Algorithm with Unknown Target Population for Wireless Local Area Networks.

    PubMed

    Yan, Jun; Yu, Kegen; Chen, Ruizhi; Chen, Liang

    2017-05-30

    In this paper a two-phase compressive sensing (CS) and received signal strength (RSS)-based target localization approach is proposed to improve position accuracy by dealing with the unknown target population and the effect of grid dimensions on position error. In the coarse localization phase, by formulating target localization as a sparse signal recovery problem, grids with recovery vector components greater than a threshold are chosen as the candidate target grids. In the fine localization phase, by partitioning each candidate grid, the target position in a grid is iteratively refined by using the minimum residual error rule and the least-squares technique. When all the candidate target grids are iteratively partitioned and the measurement matrix is updated, the recovery vector is re-estimated. Threshold-based detection is employed again to determine the target grids and hence the target population. As a consequence, both the target population and the position estimation accuracy can be significantly improved. Simulation results demonstrate that the proposed approach achieves the best accuracy among all the algorithms compared.
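
    The coarse-phase decision reduces to thresholding the recovered sparse vector; a minimal sketch of that step, assuming a recovery vector x_hat has already been produced by a CS reconstruction algorithm:

    ```python
    import numpy as np

    def candidate_grids(x_hat: np.ndarray, threshold: float) -> np.ndarray:
        """Coarse localization: grids whose recovery-vector component exceeds
        the detection threshold become candidate target grids; their count is
        the estimate of the (initially unknown) target population."""
        return np.flatnonzero(np.abs(x_hat) > threshold)

    # Example: a recovered 16-grid vector with two strong components
    x_hat = np.zeros(16)
    x_hat[3], x_hat[11] = 0.9, 0.7
    print(candidate_grids(x_hat, threshold=0.5))  # -> [ 3 11], i.e. 2 targets
    ```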

  11. 75 FR 81003 - Rate Increase Disclosure and Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-23

    ...This document contains proposed regulations implementing the rules for health insurance issuers regarding the disclosure and review of unreasonable premium increases under section 2794 of the Public Health Service Act. The proposed rule would establish a rate review program to ensure that all rate increases that meet or exceed an established threshold are reviewed by a State or HHS to determine whether the rate increases are unreasonable.

  12. Defining ADHD symptom persistence in adulthood: optimizing sensitivity and specificity.

    PubMed

    Sibley, Margaret H; Swanson, James M; Arnold, L Eugene; Hechtman, Lily T; Owens, Elizabeth B; Stehli, Annamarie; Abikoff, Howard; Hinshaw, Stephen P; Molina, Brooke S G; Mitchell, John T; Jensen, Peter S; Howard, Andrea L; Lakes, Kimberley D; Pelham, William E

    2017-06-01

    Longitudinal studies of children diagnosed with ADHD report widely ranging ADHD persistence rates in adulthood (5-75%). This study documents how information source (parent vs. self-report), method (rating scale vs. interview), and symptom threshold (DSM vs. norm-based) influence reported ADHD persistence rates in adulthood. Five hundred seventy-nine children were diagnosed with DSM-IV ADHD-Combined Type at baseline (ages 7.0-9.9 years), and 289 classmates served as a local normative comparison group (LNCG); 476 and 241 of these, respectively, were evaluated in adulthood (mean age = 24.7 years). Parent and self-reports of symptoms and impairment on rating scales and structured interviews were used to investigate ADHD persistence in adulthood. Persistence rates were higher when using parent rather than self-reports, structured interviews rather than rating scales (for self-report but not parent report), and a norm-based (NB) threshold of 4 symptoms rather than DSM criteria. Receiver-Operating Characteristics (ROC) analyses revealed that sensitivity and specificity were optimized by combining parent and self-reports on a rating scale and applying a NB threshold. The interview format optimizes young adult self-reporting when parent reports are not available. However, the combination of parent and self-reports from rating scales, using an 'or' rule and a NB threshold, optimized the balance between sensitivity and specificity. With this definition, 60% of the ADHD group demonstrated symptom persistence and 41% met both symptom and impairment criteria in adulthood. © 2016 Association for Child and Adolescent Mental Health.

  13. Characterising bias in regulatory risk and decision analysis: An analysis of heuristics applied in health technology appraisal, chemicals regulation, and climate change governance.

    PubMed

    MacGillivray, Brian H

    2017-08-01

    In many environmental and public health domains, heuristic methods of risk and decision analysis must be relied upon, either because problem structures are ambiguous, reliable data is lacking, or decisions are urgent. This introduces an additional source of uncertainty beyond model and measurement error - uncertainty stemming from relying on inexact inference rules. Here we identify and analyse heuristics used to prioritise risk objects, to discriminate between signal and noise, to weight evidence, to construct models, to extrapolate beyond datasets, and to make policy. Some of these heuristics are based on causal generalisations, yet can misfire when these relationships are presumed rather than tested (e.g. surrogates in clinical trials). Others are conventions designed to confer stability to decision analysis, yet which may introduce serious error when applied ritualistically (e.g. significance testing). Some heuristics can be traced back to formal justifications, but only subject to strong assumptions that are often violated in practical applications. Heuristic decision rules (e.g. feasibility rules) in principle act as surrogates for utility maximisation or distributional concerns, yet in practice may neglect costs and benefits, be based on arbitrary thresholds, and be prone to gaming. We highlight the problem of rule-entrenchment, where analytical choices that are in principle contestable are arbitrarily fixed in practice, masking uncertainty and potentially introducing bias. Strategies for making risk and decision analysis more rigorous include: formalising the assumptions and scope conditions under which heuristics should be applied; testing rather than presuming their underlying empirical or theoretical justifications; using sensitivity analysis, simulations, multiple bias analysis, and deductive systems of inference (e.g. directed acyclic graphs) to characterise rule uncertainty and refine heuristics; adopting "recovery schemes" to correct for known biases; and basing decision rules on clearly articulated values and evidence, rather than convention. Copyright © 2017. Published by Elsevier Ltd.

  14. Threshold cascades with response heterogeneity in multiplex networks

    NASA Astrophysics Data System (ADS)

    Lee, Kyu-Min; Brummitt, Charles D.; Goh, K.-I.

    2014-12-01

    Threshold cascade models have been used to describe the spread of behavior in social networks and cascades of default in financial networks. In some cases, these networks may have multiple kinds of interactions, such as distinct types of social ties or distinct types of financial liabilities; furthermore, nodes may respond in different ways to influence from their neighbors of multiple types. To start to capture such settings in a stylized way, we generalize a threshold cascade model to a multiplex network in which nodes follow one of two response rules: some nodes activate when, in at least one layer, a large enough fraction of neighbors is active, while the other nodes activate when, in all layers, a large enough fraction of neighbors is active. Varying the fractions of nodes following either rule facilitates or inhibits cascades. Near the inhibition regime, global cascades appear discontinuously as the network density increases; however, the cascade grows more slowly over time. This behavior suggests a way in which various collective phenomena in the real world could appear abruptly yet slowly.
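
    A compact simulation of the two response rules on a two-layer multiplex network; the Erdos-Renyi layers, threshold value, seed count, and the fraction of "any-layer" nodes are illustrative assumptions:

    ```python
    import random

    def random_layer(n: int, avg_deg: float) -> list:
        """Erdos-Renyi-style layer represented as adjacency sets."""
        p = avg_deg / (n - 1)
        adj = [set() for _ in range(n)]
        for i in range(n):
            for j in range(i + 1, n):
                if random.random() < p:
                    adj[i].add(j); adj[j].add(i)
        return adj

    def cascade(layers, rule, phi=0.2, seeds=5, max_steps=1000):
        """Threshold cascade on a multiplex network. rule[i] is 'any' (node i
        activates when the active-neighbor fraction reaches phi in at least
        one layer) or 'all' (it must reach phi in every layer)."""
        n = len(layers[0])
        active = set(random.sample(range(n), seeds))
        for _ in range(max_steps):
            newly = set()
            for i in set(range(n)) - active:
                fracs = [len(adj[i] & active) / len(adj[i]) if adj[i] else 0.0
                         for adj in layers]
                ok = max(fracs) >= phi if rule[i] == "any" else min(fracs) >= phi
                if ok:
                    newly.add(i)
            if not newly:
                break
            active |= newly
        return len(active) / n

    n = 1000
    layers = [random_layer(n, 6.0), random_layer(n, 6.0)]
    rule = ["any" if random.random() < 0.5 else "all" for _ in range(n)]
    print("final active fraction:", cascade(layers, rule))
    ```

    Sweeping the fraction of "any" versus "all" nodes in this toy model is the knob that, per the abstract, facilitates or inhibits global cascades.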

  15. The research of "blind" spot in the LVQ network

    NASA Astrophysics Data System (ADS)

    Guo, Zhanjie; Nan, Shupo; Wang, Xiaoli

    2017-04-01

    Competitive neural networks are now widely used in pattern recognition, classification, and other applications, and show clear advantages over traditional clustering methods. Nevertheless, competitive neural networks are still inadequate in several respects and need further improvement. Based on the Learning Vector Quantization (LVQ) network proposed by Kohonen [1], this paper addresses the issue of large training error that arises when there are "blind" spots in a network, by introducing threshold-value learning rules, and implements the approach in Matlab.

  16. Widely tunable semiconductor lasers with three interferometric arms.

    PubMed

    Su, Guan-Lin; Wu, Ming C

    2017-09-04

    We present a comprehensive study of a new three-branch widely tunable semiconductor laser based on a self-imaging, lossless multi-mode interference (MMI) coupler. We have developed a general theoretical framework that is applicable to all types of interferometric lasers. Our analysis showed that the three-branch laser offers high side-mode suppression ratios (SMSRs) while maintaining a wide tuning range and a low threshold modal gain of the lasing mode. We also present the design rules for tuning over the dense wavelength-division multiplexing grid over the C-band.

  17. UIC at TREC 2008 Blog Track

    DTIC Science & Technology

    2008-11-01

    T or more words, where T is a threshold that is empirically set to 300 in the experiment. The second rule aims to remove pornographic documents...Some blog documents are embedded with pornographic words to attract search traffic. We identify a list of pornographic words. Given a blog document, all...document, this document is considered pornographic spam, and is discarded. The third rule removes documents written in foreign languages. We count the

  18. FAF-Drugs2: free ADME/tox filtering tool to assist drug discovery and chemical biology projects.

    PubMed

    Lagorce, David; Sperandio, Olivier; Galons, Hervé; Miteva, Maria A; Villoutreix, Bruno O

    2008-09-24

    Drug discovery and chemical biology are exceedingly complex and demanding enterprises. In recent years there has been increasing awareness of the importance of predicting/optimizing the absorption, distribution, metabolism, excretion, and toxicity (ADMET) properties of small chemical compounds throughout the search process rather than only at the final stages. Fast methods for evaluating ADMET properties of small molecules often involve applying a set of simple empirical rules (educated guesses), and as such, property profiling of compound collections can be performed in silico. Clearly, these rules cannot assess the full complexity of the human body but can provide valuable information and assist decision-making. This paper presents FAF-Drugs2, a free adaptable tool for ADMET filtering of electronic compound collections. FAF-Drugs2 is a command-line utility program, written in Python and based on the open-source chemistry toolkit OpenBabel, which performs various physicochemical calculations and identifies key functional groups as well as some toxic and unstable molecules/functional groups. In addition to filtered collections, FAF-Drugs2 can provide, via Gnuplot, several distribution diagrams of major physicochemical properties of the screened compound libraries. We have developed FAF-Drugs2 to facilitate compound collection preparation, prior to (or after) experimental screening or virtual screening computations. Users can select various filtering thresholds and add rules as needed for a given project. As it stands, FAF-Drugs2 implements numerous filtering rules (23 physicochemical rules and 204 substructure-searching rules) that can be easily tuned.
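
    The rule-based filtering idea can be illustrated with a toy filter over precomputed physicochemical properties. The thresholds below are placeholders in the spirit of such rules, not the rules FAF-Drugs2 actually implements:

    ```python
    # Hypothetical thresholds for illustration; FAF-Drugs2's own rules differ.
    RULES = {
        "mw":   lambda v: v <= 500.0,   # molecular weight
        "logp": lambda v: v <= 5.0,     # lipophilicity
        "hbd":  lambda v: v <= 5,       # H-bond donors
        "hba":  lambda v: v <= 10,      # H-bond acceptors
    }

    def passes_filter(props: dict):
        """Apply each threshold rule to a compound's precomputed properties
        and return the verdict plus the list of violated rules."""
        violations = [name for name, ok in RULES.items() if not ok(props[name])]
        return (not violations, violations)

    compound = {"mw": 431.9, "logp": 5.7, "hbd": 1, "hba": 6}
    ok, why = passes_filter(compound)
    print(ok, why)  # -> False ['logp']
    ```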

  19. THE EFFECTS OF SPATIAL SMOOTHING ON SOLAR MAGNETIC HELICITY PARAMETERS AND THE HEMISPHERIC HELICITY SIGN RULE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ocker, Stella Koch; Petrie, Gordon, E-mail: socker@oberlin.edu, E-mail: gpetrie@nso.edu

    The hemispheric preference for negative/positive helicity to occur in the northern/southern solar hemisphere provides clues to the causes of twisted, flaring magnetic fields. Previous studies on the hemisphere rule may have been affected by seeing from atmospheric turbulence. Using Hinode/SOT-SP data spanning 2006–2013, we studied the effects of two spatial smoothing tests that imitate atmospheric seeing: noise reduction by ignoring pixel values weaker than the estimated noise threshold, and Gaussian spatial smoothing. We studied in detail the effects of atmospheric seeing on the helicity distributions across various field strengths for active regions (ARs) NOAA 11158 and NOAA 11243, in addition to studying the average helicities of 179 ARs with and without smoothing. We found that, rather than changing trends in the helicity distributions, spatial smoothing modified existing trends by reducing random noise and by regressing outliers toward the mean, or removing them altogether. Furthermore, the average helicity parameter values of the 179 ARs did not conform to the hemisphere rule: independent of smoothing, the weak-vertical-field values tended to be negative in both hemispheres, and the strong-vertical-field values tended to be positive, especially in the south. We conclude that spatial smoothing does not significantly affect the overall statistics for space-based data, and thus seeing from atmospheric turbulence seems not to have significantly affected previous studies' ground-based results on the hemisphere rule.

  20. Automatic Fault Recognition of Photovoltaic Modules Based on Statistical Analysis of Uav Thermography

    NASA Astrophysics Data System (ADS)

    Kim, D.; Youn, J.; Kim, C.

    2017-08-01

    As a malfunctioning PV (photovoltaic) cell has a higher temperature than adjacent normal cells, we can detect it easily with a thermal infrared sensor. However, it would be time-consuming to inspect large-scale PV power plants with a hand-held thermal infrared sensor. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm uses statistical analysis of the thermal intensity (surface temperature) characteristics of each PV module, with the mean intensity and standard deviation of each panel serving as parameters for fault diagnosis. One characteristic of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels is not applicable in the fault detection algorithm. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules from individual arrays automatically. The performance of the proposed algorithm was tested on three sample images; this verified a detection accuracy for defective panels of 97% or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection than a global detection rule.
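
    A minimal sketch of such a local detection rule, assuming per-module mean intensities for one array and a sensitivity factor k (both illustrative):

    ```python
    import numpy as np

    def faulty_modules(panel_means: np.ndarray, k: float = 2.0) -> np.ndarray:
        """Local detection rule: within one PV array, flag modules whose mean
        thermal intensity exceeds the array mean by more than k standard
        deviations (hot modules indicate possible faults)."""
        mu = panel_means.mean()
        sigma = panel_means.std(ddof=1)
        return np.flatnonzero(panel_means > mu + k * sigma)

    # Example: 10 modules in one array; module 7 runs hot
    means = np.array([31.2, 30.8, 31.0, 31.4, 30.9, 31.1, 31.3, 36.5, 31.0, 30.7])
    print(faulty_modules(means))  # -> [7]
    ```

    Because the statistics are computed per array rather than over the whole plant, the distance-dependent temperature bias described above cancels out, which is the point of the local rule.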

  1. Mapping sea ice leads with a coupled numeric/symbolic system

    NASA Technical Reports Server (NTRS)

    Key, J.; Schweiger, A. J.; Maslanik, J. A.

    1990-01-01

    A method is presented which facilitates the detection and delineation of leads with single-channel Landsat data by coupling numeric and symbolic procedures. The procedure consists of three steps: (1) using the dynamic threshold method, an image is mapped to a lead/no-lead binary image; (2) the likelihood that fragments are real leads is examined with a set of numeric rules; and (3) pairs of objects are examined geometrically and merged where possible. The processing ends when all fragments have been merged and statistical characteristics determined, leaving a map of valid lead objects that summarizes useful physical information about the lead complexes. Direct implementation of domain knowledge and rapid prototyping are two benefits of the rule-based system. The approach is found to be more successfully applied to mid- and high-level processing, and the system can retrieve statistics about sea-ice leads as well as detect the leads.

  2. [Rules and regulations applying to incidents in radiotherapy].

    PubMed

    Lohr, F; Baus, W; Vorwerk, H; Schlömp, B; André, L; Georg, D; Hodapp, N

    2012-07-01

    Radiotherapy is an essential and reliable element of the treatment armamentarium in oncology. Numerous rules, regulations, and protocols minimize the associated risks. It can, however, never be excluded that errors in the treatment delivery chain result in inadequate tumor doses or unnecessary damage to organs at risk. A legal framework governs the management of such incidents. The most important European and North American regulations are reported. Various directives issued by the European Union are differently implemented nationally. This applies particularly to the characterization of incidents that must be reported to authorities. Reporting thresholds, audit systems, and the extent of the integration of voluntary reporting systems vary. Radiotherapy incidents are dealt with differently on an international level. Changes are to be expected based on the European Basic Safety Standards Directive that is currently being prepared and will have to be implemented nationally in due course.

  3. Is it possible to claim or refute sputum eosinophils ≥ 3% in asthmatics with sufficient accuracy using biomarkers?

    PubMed

    Demarche, Sophie F; Schleich, Florence N; Paulus, Virginie A; Henket, Monique A; Van Hees, Thierry J; Louis, Renaud E

    2017-07-03

    The concept of asthma inflammatory phenotypes has proved to be important in predicting response to inhaled corticosteroids. Induced sputum, which has been pivotal in the development of the concept of inflammatory phenotypes, is however not widely available. Several studies have proposed to use surrogate exhaled or blood biomarkers, like fractional exhaled nitric oxide (FENO), blood eosinophils and total serum immunoglobulin E (IgE). However, taken alone, each of these biomarkers has moderate accuracy to identify sputum eosinophilia. Here, we propose a new approach based on the likelihood ratio to study which thresholds of these biomarkers, taken alone or in combination, were able to rule in or rule out sputum eosinophils ≥3%. We showed in a large population of 869 asthmatics that combining FENO, blood eosinophils and total serum IgE could accurately predict sputum eosinophils ≥ or <3% in 58% of our population.

  4. PO-07 - Excluding pulmonary embolism in cancer patients using the Wells rule and age-adjusted D-dimer testing: an individual patient data meta-analysis.

    PubMed

    van Es, N; van der Hulle, T; van Es, J; den Exter, P L; Douma, R A; Goekoop, R J; Mos, I C M; Garcia, J G; Kamphuisen, P W; Huisman, M V; Klok, F A; Büller, H R; Bossuyt, P M

    2016-04-01

    Among patients with clinically suspected pulmonary embolism (PE), imaging and anticoagulant treatment can be safely withheld in approximately one-third of patients based on the combination of a "PE unlikely" Wells score and a D-dimer below the age-adjusted threshold. The clinical utility of this diagnostic approach in cancer patients is less clear. To evaluate the efficiency and failure rate of the original and simplified Wells rules in combination with age-adjusted D-dimer testing in patients with active cancer. Individual patient data were used from 6 large prospective studies in which the diagnostic management of PE was guided by the original Wells rule and D-dimer testing. Study physicians classified patients as having active cancer if they had new, recurrent, or progressive cancer (excluding basal-cell or squamous-cell skin carcinoma), or cancer requiring treatment in the last 6 months. We evaluated the dichotomous Wells rule and its simplified version. The efficiency of the algorithm was defined as the proportion of patients with a "PE unlikely" Wells score and a negative age-adjusted D-dimer, defined by a D-dimer below the threshold of a patient's age times 10 μg/L in patients aged ≥51 years. A diagnostic failure was defined as a patient with a "PE unlikely" Wells score and negative age-adjusted D-dimer who had symptomatic venous thromboembolism during 3 months of follow-up. A one-stage random effects meta-analysis was performed to estimate the efficiency and failure. The dataset comprised 938 patients with active cancer with a mean age of 63 years. The most frequent cancer types were breast (13%), gastrointestinal tract (11%), and lung (8%). The type of cancer was not specified in 42%. The pooled PE prevalence was 29% (95% CI 25-32). PE could be excluded in 122 patients based on a "PE unlikely" Wells score and a negative age-adjusted D-dimer (efficiency 13%; 95% CI 11-15). Two of 122 patients were diagnosed with non-fatal symptomatic venous thromboembolism during follow-up (failure rate 1.5%; 95% CI 0.13-14.8). The simplified Wells score in combination with a negative age-adjusted D-dimer had an efficiency of 3.9% (95% CI 2.0-7.6) and a failure rate of 2.4% (95% CI 0.3-15). Among cancer patients with clinically suspected PE, imaging and anticoagulant treatment can be withheld in 1 out of every 8 patients by the original Wells rule and age-adjusted D-dimer testing. The simplified Wells rule was neither efficient nor safe in this population. © 2016 Elsevier Ltd. All rights reserved.
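
    The exclusion rule evaluated above is directly codable. A sketch in which the age-adjusted threshold follows the abstract (age × 10 μg/L for patients aged ≥51 years), while the 500 μg/L cutoff for younger patients is a conventional assumption not stated in the abstract:

    ```python
    def pe_ruled_out(wells_unlikely: bool, ddimer_ug_l: float, age: int) -> bool:
        """PE can be excluded without imaging when the Wells score is
        'PE unlikely' AND the D-dimer is below the age-adjusted threshold
        (age x 10 ug/L if age >= 51; a conventional 500 ug/L otherwise)."""
        threshold = age * 10.0 if age >= 51 else 500.0  # 500 is an assumed default
        return wells_unlikely and ddimer_ug_l < threshold

    print(pe_ruled_out(wells_unlikely=True, ddimer_ug_l=620.0, age=68))  # True (620 < 680)
    print(pe_ruled_out(wells_unlikely=True, ddimer_ug_l=620.0, age=45))  # False (620 >= 500)
    ```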

  5. High Dimensional Classification Using Features Annealed Independence Rules.

    PubMed

    Fan, Jianqing; Fan, Yingying

    2008-01-01

    Classification using high-dimensional features arises frequently in many contemporary statistical studies such as tumor classification using microarray or other high-throughput data. The impact of dimensionality on classifications is largely poorly understood. In a seminal paper, Bickel and Levina (2004) show that the Fisher discriminant performs poorly due to diverging spectra and they propose to use the independence rule to overcome the problem. We first demonstrate that even for the independence classification rule, classification using all the features can be as bad as the random guessing due to noise accumulation in estimating population centroids in high-dimensional feature space. In fact, we demonstrate further that almost all linear discriminants can perform as bad as the random guessing. Thus, it is paramountly important to select a subset of important features for high-dimensional classification, resulting in Features Annealed Independence Rules (FAIR). The conditions under which all the important features can be selected by the two-sample t-statistic are established. The choice of the optimal number of features, or equivalently, the threshold value of the test statistics are proposed based on an upper bound of the classification error. Simulation studies and real data analysis support our theoretical results and demonstrate convincingly the advantage of our new classification procedure.
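
    A compact sketch of the FAIR recipe: keep features whose absolute two-sample t-statistic clears a threshold, then classify with the independence (diagonal-covariance) rule. The threshold value here is a placeholder rather than the paper's error-bound-optimal choice:

    ```python
    import numpy as np

    def fair_fit(X0, X1, t_cut=2.0):
        """Keep features whose absolute two-sample t-statistic exceeds t_cut,
        then build an independence-rule (diagonal-covariance) classifier."""
        m0, m1 = X0.mean(0), X1.mean(0)
        v0, v1 = X0.var(0, ddof=1), X1.var(0, ddof=1)
        t = (m1 - m0) / np.sqrt(v0 / len(X0) + v1 / len(X1))
        keep = np.abs(t) > t_cut
        pooled = ((len(X0) - 1) * v0 + (len(X1) - 1) * v1) / (len(X0) + len(X1) - 2)
        return m0, m1, pooled, keep

    def fair_predict(x, m0, m1, var, keep):
        """Assign to the class whose centroid is nearer in the variance-scaled
        (diagonal) metric, using only the selected features."""
        d0 = np.sum((x[keep] - m0[keep]) ** 2 / var[keep])
        d1 = np.sum((x[keep] - m1[keep]) ** 2 / var[keep])
        return int(d1 < d0)

    rng = np.random.default_rng(0)
    X0 = rng.normal(0.0, 1.0, (30, 200))
    X1 = rng.normal(0.0, 1.0, (30, 200)); X1[:, :5] += 1.5  # 5 informative features
    model = fair_fit(X0, X1)
    x_new = rng.normal(0.0, 1.0, 200); x_new[:5] += 1.5
    print(fair_predict(x_new, *model))  # expected: 1 (drawn from class 1)
    ```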

  6. Wavelet methodology to improve single unit isolation in primary motor cortex cells

    PubMed Central

    Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A.

    2016-01-01

    The proper isolation of action potentials recorded extracellularly from neural tissue is an active area of research in the fields of neuroscience and biomedical signal processing. This paper presents an isolation methodology for neural recordings using the wavelet transform (WT), a statistical thresholding scheme, and the principal component analysis (PCA) algorithm. The effectiveness of five different mother wavelets was investigated: biorthogonal, Daubechies, discrete Meyer, symmetric, and Coifman; along with three different wavelet coefficient thresholding schemes: fixed-form threshold, Stein's unbiased estimate of risk, and minimax; and two different thresholding rules: soft and hard thresholding. The signal quality was evaluated using three different statistical measures: mean-squared error, root mean square, and signal-to-noise ratio. The clustering quality was evaluated using two different statistical measures: isolation distance and L-ratio. This research shows that the selection of the mother wavelet has a strong influence on the clustering and isolation of single-unit neural activity, with the Daubechies 4 wavelet and minimax thresholding scheme performing the best. PMID:25794461
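
    A minimal denoising sketch in that spirit, assuming the PyWavelets package: a Daubechies 4 decomposition, a fixed-form (universal) threshold estimated from the finest-scale coefficients, and the soft-thresholding rule (the paper's preferred combination paired Daubechies 4 with the minimax scheme):

    ```python
    import numpy as np
    import pywt  # PyWavelets, assumed installed

    def wavelet_denoise(signal: np.ndarray, wavelet: str = "db4", level: int = 4):
        """Soft-threshold the detail coefficients of a discrete wavelet
        decomposition using the fixed-form (universal) threshold
        sigma * sqrt(2 ln N), with sigma estimated from the finest scale."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # robust noise estimate
        thr = sigma * np.sqrt(2.0 * np.log(len(signal)))    # universal threshold
        denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                  for c in coeffs[1:]]
        return pywt.waverec(denoised, wavelet)[: len(signal)]

    t = np.linspace(0.0, 1.0, 1024)
    noisy = np.sin(8 * np.pi * t) + 0.3 * np.random.randn(t.size)
    clean = wavelet_denoise(noisy)
    ```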

  7. 75 FR 14669 - Regulation of Fuels and Fuel Additives: Changes to Renewable Fuel Standard Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-26

    ...Under the Clean Air Act Section 211(o), as amended by the Energy Independence and Security Act of 2007 (EISA), the Environmental Protection Agency is required to promulgate regulations implementing changes to the Renewable Fuel Standard program. The revised statutory requirements specify the volumes of cellulosic biofuel, biomass-based diesel, advanced biofuel, and total renewable fuel that must be used in transportation fuel. This action finalizes the regulations that implement the requirements of EISA, including the cellulosic, biomass-based diesel, advanced biofuel, and renewable fuel standards that will apply to all gasoline and diesel produced or imported in 2010. The final regulations make a number of changes to the current Renewable Fuel Standard program while retaining many elements of the compliance and trading system already in place. This final rule also implements the revised statutory definitions and criteria, most notably the new greenhouse gas emission thresholds for renewable fuels and new limits on renewable biomass feedstocks. This rulemaking marks the first time that greenhouse gas emission performance is being applied in a regulatory context for a nationwide program. As mandated by the statute, our greenhouse gas emission assessments consider the full lifecycle emission impacts of fuel production from both direct and indirect emissions, including significant emissions from land use changes. In carrying out our lifecycle analysis we have taken steps to ensure that the lifecycle estimates are based on the latest and most up-to-date science. The lifecycle greenhouse gas assessments reflected in this rulemaking represent significant improvements in analysis based on information and data received since the proposal. However, we also recognize that lifecycle GHG assessment of biofuels is an evolving discipline and will continue to revisit our lifecycle analyses in the future as new information becomes available. EPA plans to ask the National Academy of Sciences for assistance as we move forward. Based on current analyses we have determined that ethanol from corn starch will be able to comply with the required greenhouse gas (GHG) threshold for renewable fuel. Similarly, biodiesel can be produced to comply with the 50% threshold for biomass-based diesel, sugarcane with the 50% threshold for advanced biofuel and multiple cellulosic-based fuels with their 60% threshold. Additional fuel pathways have also been determined to comply with their thresholds. The assessment for this rulemaking also indicates the increased use of renewable fuels will have important environmental, energy and economic impacts for our Nation.

  8. Cable Overheating Risk Warning Method Based on Impedance Parameter Estimation in Distribution Network

    NASA Astrophysics Data System (ADS)

    Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao

    2017-05-01

    Cable overheating reduces the cable insulation level, accelerates insulation aging, and can even cause short-circuit faults. Identifying and warning of cable overheating risk is therefore necessary for distribution network operators. A cable overheating risk warning method based on impedance parameter estimation is proposed in this paper to improve the safety and reliability of distribution network operation. Firstly, a cable impedance estimation model is established using the least-squares method on data from the distribution SCADA system to improve the accuracy of impedance parameter estimation. Secondly, the threshold value of cable impedance is calculated from historical data, and the forecast value of cable impedance is calculated from future forecasting data from the distribution SCADA system. Thirdly, a library of cable overheating risk warning rules is established; the cable impedance forecast value is calculated and the rate of change of impedance analyzed, and the overheating risk of the cable line is then flagged against the rules library according to the relationship between impedance variation and line temperature rise. The method is simulated in the paper. The simulation results show that the method can identify the impedance and forecast the temperature rise of cable lines in a distribution network accurately. The overheating risk warnings can provide a decision basis for operation, maintenance, and repair.
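
    A minimal sketch of the least-squares step under a deliberately simplified model in which the voltage drop along the cable is purely resistive; the real method estimates complex impedance from SCADA measurements, and the threshold rule here is illustrative:

    ```python
    import numpy as np

    def estimate_resistance(current_a: np.ndarray, v_drop_v: np.ndarray) -> float:
        """Least-squares fit of v_drop = R * I from paired SCADA samples
        (simplified single-parameter model of the cable impedance)."""
        A = current_a.reshape(-1, 1)
        r, *_ = np.linalg.lstsq(A, v_drop_v, rcond=None)
        return float(r[0])

    def overheating_warning(r_forecast: float, r_history: np.ndarray, k: float = 3.0) -> bool:
        """Warn when the forecast resistance exceeds a threshold derived from
        history (mean + k standard deviations), since conductor resistance
        rises with temperature."""
        return r_forecast > r_history.mean() + k * r_history.std(ddof=1)

    I = np.array([100.0, 150.0, 180.0, 210.0])
    V = 0.052 * I + np.random.normal(0.0, 0.05, I.size)  # synthetic measurements
    print(estimate_resistance(I, V))                      # ~0.052 ohm
    ```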

  9. Improving the Projections of Vegetation Biogeography by Integrating Climate Envelope Models and Dynamic Global Vegetation Models

    NASA Astrophysics Data System (ADS)

    Case, M. J.; Kim, J. B.

    2015-12-01

    Assessing changes in vegetation is increasingly important for conservation planning in the face of climate change. Dynamic global vegetation models (DGVMs) are important tools for assessing such changes. DGVMs have been applied at regional scales to create projections of range expansions and contractions of plant functional types. Many DGVMs use a number of algorithms to determine the biogeography of plant functional types. One such DGVM, MC2, uses a series of decision trees based on bioclimatic thresholds, while others, such as LPJ, use constraining emergent properties with a limited set of bioclimatic threshold-based rules. Although both approaches have been used widely, we demonstrate that these biogeography outputs perform poorly at continental scales when compared to existing potential vegetation maps. Specifically, we found that with MC2, the algorithm for determining leaf physiognomy is too simplistic to capture arid and semi-arid vegetation in much of the western U.S., as is the algorithm for determining the broadleaf and needleleaf mix in the Southeast. With LPJ, we found that the bioclimatic thresholds used to allow seedling establishment are too broad and fail to capture regional-scale biogeography of the plant functional types. In response, we demonstrate a new approach to determining the biogeography of plant functional types by integrating the climatic thresholds produced for individual tree species by a series of climate envelope models with the biogeography algorithms of MC2 and LPJ. Using this approach, we find that MC2 and LPJ perform considerably better when compared to potential vegetation maps.

  10. Patient Protection and Affordable Care Act; health insurance market rules. Final rule.

    PubMed

    2013-02-27

    This final rule implements provisions related to fair health insurance premiums, guaranteed availability, guaranteed renewability, single risk pools, and catastrophic plans, consistent with title I of the Patient Protection and Affordable Care Act, as amended by the Health Care and Education Reconciliation Act of 2010, referred to collectively as the Affordable Care Act. The final rule clarifies the approach used to enforce the applicable requirements of the Affordable Care Act with respect to health insurance issuers and group health plans that are non-federal governmental plans. This final rule also amends the standards for health insurance issuers and states regarding reporting, utilization, and collection of data under the federal rate review program, and revises the timeline for states to propose state-specific thresholds for review and approval by the Centers for Medicare & Medicaid Services (CMS).

  11. Spontaneous Subarachnoid Hemorrhage: A Systematic Review and Meta-analysis Describing the Diagnostic Accuracy of History, Physical Examination, Imaging, and Lumbar Puncture With an Exploration of Test Thresholds.

    PubMed

    Carpenter, Christopher R; Hussain, Adnan M; Ward, Michael J; Zipfel, Gregory J; Fowler, Susan; Pines, Jesse M; Sivilotti, Marco L A

    2016-09-01

    Spontaneous subarachnoid hemorrhage (SAH) is a rare, but serious etiology of headache. The diagnosis of SAH is especially challenging in alert, neurologically intact patients, as missed or delayed diagnosis can be catastrophic. The objective was to perform a diagnostic accuracy systematic review and meta-analysis of history, physical examination, cerebrospinal fluid (CSF) tests, computed tomography (CT), and clinical decision rules for spontaneous SAH. A secondary objective was to delineate probability of disease thresholds for imaging and lumbar puncture (LP). PubMed, Embase, Scopus, and research meeting abstracts were searched up to June 2015 for studies of emergency department patients with acute headache clinically concerning for spontaneous SAH. QUADAS-2 was used to assess study quality and, when appropriate, meta-analysis was conducted using random effects models. Outcomes were sensitivity, specificity, and positive (LR+) and negative (LR-) likelihood ratios. To identify test and treatment thresholds, we employed the Pauker-Kassirer method with Bernstein test indication curves using the summary estimates of diagnostic accuracy. A total of 5,022 publications were identified, of which 122 underwent full-text review; 22 studies were included (average SAH prevalence = 7.5%). Diagnostic studies differed in assessment of history and physical examination findings, CT technology, analytical techniques used to identify xanthochromia, and criterion standards for SAH. Study quality by QUADAS-2 was variable; however, most had a relatively low risk of biases. A history of neck pain (LR+ = 4.1; 95% confidence interval [CI] = 2.2 to 7.6) and neck stiffness on physical examination (LR+ = 6.6; 95% CI = 4.0 to 11.0) were the individual findings most strongly associated with SAH. Combinations of findings may rule out SAH, yet promising clinical decision rules await external validation. Noncontrast cranial CT within 6 hours of headache onset accurately ruled in (LR+ = 230; 95% CI = 6 to 8,700) and ruled out SAH (LR- = 0.01; 95% CI = 0 to 0.04); CT beyond 6 hours had a LR- of 0.07 (95% CI = 0.01 to 0.61). CSF analyses had lower diagnostic accuracy, whether using red blood cell (RBC) count or xanthochromia. At a threshold RBC count of 1,000 × 10(6) /L, the LR+ was 5.7 (95% CI = 1.4 to 23) and LR- was 0.21 (95% CI = 0.03 to 1.7). Using the pooled estimates of diagnostic accuracy and testing risks and benefits, we estimate that LP only benefits CT-negative patients when the pre-LP probability of SAH is on the order of 5%, which corresponds to a pre-CT probability greater than 20%. Less than one in 10 headache patients concerning for SAH are ultimately diagnosed with SAH in recent studies. While certain symptoms and signs increase or decrease the likelihood of SAH, no single characteristic is sufficient to rule in or rule out SAH. Within 6 hours of symptom onset, noncontrast cranial CT is highly accurate, while a negative CT beyond 6 hours substantially reduces the likelihood of SAH. LP appears to benefit relatively few patients within a narrow pretest probability range. With improvements in CT technology and an expanding body of evidence, test thresholds for LP may become more precise, obviating the need for a post-CT LP in more acute headache patients. Existing SAH clinical decision rules await external validation, but offer the potential to identify subsets most likely to benefit from post-CT LP, angiography, or no further testing. © 2016 by the Society for Academic Emergency Medicine.

  12. 27 The DiPEP (Diagnosis of PE in Pregnancy) study: can clinical assessment, d-dimer or chest x-ray be used to select pregnant or postpartum women with suspected PE for diagnostic imaging?

    PubMed

    Goodacre, Steve; Horspool, Kimberley; Nelson-Piercy, Catherine; Knight, Marian; Shephard, Neil; Lecky, Fiona; Thomas, Steven; Hunt, Beverley; Fuller, Gordon

    2017-12-01

    To determine whether clinical features (in the form of a clinical decision rule) or d-dimer can be used to select pregnant or postpartum women with suspected PE for diagnostic imaging. Observational cohort study augmented with additional cases. Consultant-led maternity units participating in the UK Obstetric Surveillance System (UKOSS) and emergency departments and maternity units at eleven prospectively recruiting sites. 198 pregnant or postpartum women with diagnosed PE identified through UKOSS and 324 pregnant or postpartum women with suspected PE from prospectively recruiting sites. Data were collected relating to clinical features, elements of clinical decision rules, d-dimer measurements, diagnostic imaging, treatment for PE and adverse outcomes. Women were classified as having or not having PE on the basis of diagnostic imaging, treatment and subsequent adverse outcomes. Primary analysis was limited to women with conclusive diagnostic imaging. Secondary analyses included women with clinically diagnosed or ruled out PE. The primary analysis included 181 women with PE and 259 without. Most clinical features showed no association with PE. The only exceptions were number of previous pregnancies over 24 weeks (p=0.017), no varicose veins (p=0.045), no recent long haul travel (p=0.006), recent surgery including caesarean section (p=0.001), increased temperature (p=0.003), low oxygen saturation (p<0.001), PE-related chest x-ray abnormality (p=0.01) and other chest x-ray abnormality (p=0.001). Clinical decision rules had areas under the receiver operating characteristic curve ranging from 0.577 to 0.732. No clinically useful threshold for decision-making was identified for any rule. The sensitivities and specificities of d-dimer were 88.4% and 8.8% using the standard laboratory threshold and 69.8% and 32.8% using a pregnancy-specific threshold. Clinical decision rules, d-dimer and chest x-ray should not be used to select pregnant or postpartum women with suspected PE for diagnostic imaging. © 2017, Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
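    The reported sensitivities and specificities explain the negative conclusion: converted to likelihood ratios, d-dimer at either threshold barely moves the pretest probability. A quick check (a sketch; the conversion formulas are standard and the numbers are the study's):

        def likelihood_ratios(sens, spec):
            """Positive and negative likelihood ratios from sensitivity and specificity."""
            return sens / (1 - spec), (1 - sens) / spec

        print(likelihood_ratios(0.884, 0.088))  # standard threshold: ~ (0.97, 1.32)
        print(likelihood_ratios(0.698, 0.328))  # pregnancy-specific: ~ (1.04, 0.92)

    Both LR+ and LR- lie close to 1, i.e. essentially uninformative, consistent with the recommendation not to use d-dimer for imaging selection in this population.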

  13. Applicability of the Effective-Medium Approximation to Heterogeneous Aerosol Particles.

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Dlugach, Janna M.; Liu, Li

    2016-01-01

    The effective-medium approximation (EMA) is based on the assumption that a heterogeneous particle can have a homogeneous counterpart possessing similar scattering and absorption properties. We analyze the numerical accuracy of the EMA by comparing superposition T-matrix computations for spherical aerosol particles filled with numerous randomly distributed small inclusions and Lorenz-Mie computations based on the Maxwell-Garnett mixing rule. We verify numerically that the EMA can indeed be realized for inclusion size parameters smaller than a threshold value. The threshold size parameter depends on the refractive-index contrast between the host and inclusion materials and quite often does not exceed several tenths, especially in calculations of the scattering matrix and the absorption cross section. As the inclusion size parameter approaches the threshold value, the scattering-matrix errors of the EMA start to grow with increasing host size parameter and/or number of inclusions. We confirm, in particular, the existence of the effective-medium regime in the important case of dust aerosols with hematite or air-bubble inclusions, although then the large refractive-index contrast necessitates inclusion size parameters of the order of a few tenths. Irrespective of the highly restricted conditions of applicability of the EMA, our results provide further evidence that the effective-medium regime must be a direct corollary of the macroscopic Maxwell equations under specific assumptions.
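    For readers unfamiliar with the Maxwell-Garnett mixing rule used as the reference here, a minimal sketch of the standard formula for spherical inclusions follows; the numeric values are illustrative assumptions, not the paper's cases.

        def maxwell_garnett(eps_host, eps_incl, f):
            """Maxwell-Garnett effective permittivity for a volume fraction f of
            spherical inclusions (eps_incl) embedded in a host medium (eps_host)."""
            d = eps_incl - eps_host
            num = eps_incl + 2 * eps_host + 2 * f * d
            den = eps_incl + 2 * eps_host - f * d
            return eps_host * num / den

        # e.g. a dust-like host with a 5% volume fraction of air-bubble inclusions
        m_host, m_incl = 1.55 + 0.001j, 1.0 + 0j
        eps_eff = maxwell_garnett(m_host**2, m_incl**2, 0.05)
        print(eps_eff**0.5)  # effective refractive index passed to a Lorenz-Mie code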

  14. Zseq: An Approach for Preprocessing Next-Generation Sequencing Data.

    PubMed

    Alkhateeb, Abedalrhman; Rueda, Luis

    2017-08-01

    Next-generation sequencing technology generates a huge number of reads (short sequences), which contain a vast amount of genomic data. The sequencing process, however, comes with artifacts. Preprocessing of sequences is mandatory for further downstream analysis. We present Zseq, a linear method that identifies the most informative genomic sequences and reduces the number of biased sequences, sequence duplications, and ambiguous nucleotides. Zseq finds the complexity of the sequences by counting the number of unique k-mers in each sequence as its corresponding score and also takes into account other factors, such as ambiguous nucleotides or high GC-content percentage in k-mers. Based on a z-score threshold, Zseq sweeps through the sequences again and filters out those with a z-score less than the user-defined threshold. The Zseq algorithm is able to provide a better mapping rate; it reduces the number of ambiguous bases significantly in comparison with other methods. Evaluation of the filtered reads has been conducted by aligning the reads and assembling the transcripts using the reference genome as well as de novo assembly. The assembled transcripts show a better discriminative ability to separate cancer and normal samples in comparison with another state-of-the-art method. Moreover, de novo assembled transcripts from the reads filtered by Zseq have longer genomic sequences than those from other tested methods. A procedure for estimating the cutoff threshold using labeling rules is also introduced, with promising results.
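    The core of the filtering step is easy to sketch: score each read by its unique k-mer count, standardize the scores, and keep reads at or above a z-score cutoff. This is a simplified illustration (names are ours; the real Zseq additionally penalizes ambiguous bases and extreme GC content):

        import math

        def kmer_complexity(seq, k=8):
            """Number of unique k-mers in a read, used as its complexity score."""
            return len({seq[i:i + k] for i in range(len(seq) - k + 1)})

        def zseq_filter(reads, z_cutoff=-0.5, k=8):
            """Keep reads whose complexity z-score meets the user-defined cutoff."""
            scores = [kmer_complexity(r, k) for r in reads]
            mean = sum(scores) / len(scores)
            sd = math.sqrt(sum((s - mean) ** 2 for s in scores) / len(scores)) or 1.0
            return [r for r, s in zip(reads, scores) if (s - mean) / sd >= z_cutoff]

        reads = ["ACGTACGTACGT", "AAAAAAAAAAAA", "ACGGTCAGTCCA"]
        print(zseq_filter(reads))  # drops the low-complexity homopolymer read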

  15. Thermal behavior of Charmonium in the vector channel from QCD sum rules

    NASA Astrophysics Data System (ADS)

    Dominguez, C. A.; Loewe, M.; Rojas, J. C.; Zhang, Y.

    2010-11-01

    The thermal evolution of the hadronic parameters of charmonium in the vector channel, i.e. the J/Ψ resonance mass, coupling (leptonic decay constant), total width, and continuum threshold are analyzed in the framework of thermal Hilbert moment QCD sum rules. The continuum threshold s0 has the same behavior as in all other hadronic channels, i.e. it decreases with increasing temperature until the PQCD threshold s0 = 4m_Q^2 is reached at T≃1.22Tc (m_Q is the charm quark mass). The other hadronic parameters behave in a very different way from those of light-light and heavy-light quark systems. The J/Ψ mass is essentially constant in a wide range of temperatures, while the total width grows with temperature up to T≃1.04Tc, beyond which it decreases sharply with increasing T. The resonance coupling is also initially constant, beginning to increase monotonically around T≃Tc. This behavior of the total width and of the leptonic decay constant is a strong indication that the J/Ψ resonance might survive beyond the critical temperature for deconfinement, in agreement with some recent lattice QCD results.

  16. Biomarkers in Acute Heart Failure – Cardiac And Kidney

    PubMed Central

    2015-01-01

    Natriuretic peptides (NP) are well-validated aids in the diagnosis of acute decompensated heart failure (ADHF). In acute presentations, both brain natriuretic peptide (BNP) and N-terminal of the prohormone brain natriuretic peptide (NT-proBNP) offer high sensitivity (>90 %) and negative predictive values (>95 %) for ruling out ADHF at thresholds of 100 and 300 pg/ml, respectively. Plasma NP levels rise with age. For added rule-in performance, age-adjusted thresholds (450 pg/ml for under 50 years, 900 pg/ml for 50–75 years and 1,800 pg/ml for those >75 years) can be applied to NT-proBNP results. Test performance (specificity and accuracy but not sensitivity) is clearly reduced by renal dysfunction and atrial fibrillation. Obesity offsets the threshold downwards (to ~50 pg/ml for BNP), but overall discrimination is preserved. Reliable markers for impending acute kidney injury in ADHF constitute an unmet need: candidates such as kidney injury molecule-1 and neutrophil gelatinase-associated lipocalin have failed to perform sufficiently well, and new possibilities, including the cell cycle markers insulin-like growth factor binding protein 7 and tissue inhibitor of metalloproteinases type 2, remain the subject of research. PMID:28785442
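    The rule-in logic described above reduces to a simple threshold lookup. A sketch (function name ours, cutoffs as quoted in the text):

        def ntprobnp_rule_in(age_years, ntprobnp_pg_ml):
            """Age-adjusted NT-proBNP rule-in cutoffs for ADHF."""
            if age_years < 50:
                cutoff = 450
            elif age_years <= 75:
                cutoff = 900
            else:
                cutoff = 1800
            return ntprobnp_pg_ml >= cutoff

        print(ntprobnp_rule_in(62, 1100))  # True: above the 900 pg/ml cutoff for ages 50-75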

  17. 40 CFR 52.1679 - EPA-approved New York State regulations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... approval; no action taken on provisions that may require PSD permits for sources of greenhouse gas (GHG) emissions with emissions below the thresholds identified in EPA's final PSD and Title V GHG Tailoring Rule...

  18. RMP Guidance for Warehouses - Introduction

    EPA Pesticide Factsheets

    If you handle, manufacture, use, or store any of the toxic and flammable substances listed in 40 CFR Part 68 above the specified threshold quantities in a process, you are required under the risk management program rule to develop and implement a risk management program.

  19. Mapping of High Value Crops Through AN Object-Based Svm Model Using LIDAR Data and Orthophoto in Agusan del Norte Philippines

    NASA Astrophysics Data System (ADS)

    Candare, Rudolph Joshua; Japitana, Michelle; Cubillas, James Earl; Ramirez, Cherry Bryan

    2016-06-01

    This research describes the methods involved in the mapping of different high value crops in Agusan del Norte Philippines using LiDAR. This project is part of the Phil-LiDAR 2 Program, which aims to conduct a nationwide resource assessment using LiDAR. Because of the high resolution data involved, the methodology described here utilizes object-based image analysis and optimal features from LiDAR data and orthophotos. Object-based classification was primarily done by developing rule-sets in eCognition. Several features from the LiDAR data and orthophotos were used in the development of rule-sets for classification. Generally, classes of objects cannot be separated by simple thresholds on different features, making it difficult to develop a rule-set. To resolve this problem, the image-objects were subjected to Support Vector Machine learning. SVMs have gained popularity because of their ability to generalize well given a limited number of training samples. However, SVMs also suffer from parameter assignment issues that can significantly affect the classification results. More specifically, the regularization parameter C in linear SVM has to be optimized through cross validation to increase the overall accuracy. After performing the segmentation in eCognition, the optimization procedure as well as the extraction of the equations of the hyper-planes was done in Matlab. The learned hyper-planes separating one class from another in the multi-dimensional feature-space can be thought of as super-features, which were then used in developing the classifier rule set in eCognition. In this study, we report an overall classification accuracy of greater than 90% in different areas.
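    The idea of turning learned hyper-planes into "super-features" can be sketched in a few lines: the signed distance to a linear SVM's hyper-plane is a single scalar on which a simple threshold (> 0) reproduces the SVM decision, so it can be exported as a rule. Toy data below; the actual work extracted the hyper-plane equations in Matlab and re-implemented them as eCognition rule-sets.

        import numpy as np
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(3, 1, (50, 4))])
        y = np.array([0] * 50 + [1] * 50)          # two object classes

        svm = LinearSVC(C=1.0).fit(X, y)           # C tuned by cross-validation in practice
        w, b = svm.coef_[0], svm.intercept_[0]

        super_feature = X @ w + b                  # signed distance to the hyper-plane
        rule_labels = (super_feature > 0).astype(int)
        print((rule_labels == svm.predict(X)).all())  # True: threshold rule == SVM decision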

  20. 75 FR 80675 - Home Mortgage Disclosure

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-23

    ...The Board is publishing a final rule amending the staff commentary that interprets the requirements of Regulation C (Home Mortgage Disclosure). The staff commentary is amended to increase the asset-size exemption threshold for depository institutions based on the annual percentage change in the Consumer Price Index for Urban Wage Earners and Clerical Workers (CPIW). The adjustment from $39 million to $40 million reflects the increase of that index by 2.21 percent during the twelve-month period ending in November 2010. Thus, depository institutions with assets of $40 million or less as of December 31, 2010 are exempt from collecting data in 2011.
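    The adjustment mechanics reduce to scaling the old threshold by the CPI-W change and rounding to the nearest million; a one-line check (a sketch of the arithmetic as described, not of Regulation C's exact rounding provisions):

        old_threshold_m = 39                     # millions of dollars
        cpi_change = 0.0221                      # CPI-W change, year ending Nov 2010
        print(round(old_threshold_m * (1 + cpi_change)))  # 40 (39.86 rounds to $40 million)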

  1. Discriminating between Graduates and Failure in the USAF Medical Laboratory Specialist School: An Explorative Approach.

    DTIC Science & Technology

    1981-12-01

    occurred on the Introversion Scale of the MMPI. A review of the use of psychological tests on MT's was accomplished by Driver and Feeley [1974...programs, Gondek [1981] has recommended that the best procedure for variable inclusion when using a stepwise procedure is to use the threshold default...values supplied by the package, since no simple rules exist for determining entry or removal thresholds for partial F's, tolerance statistics, or any of

  2. Occupational injury and illness recording and reporting requirements. Final rule.

    PubMed

    2002-07-01

    The Occupational Safety and Health Administration (OSHA) is revising the hearing loss recording provisions of the Occupational Injury and Illness Recording and Reporting Requirements rule published January 19, 2001 (66 FR 5916-6135), scheduled to take effect on January 1, 2003 (66 FR 52031-52034). This final rule revises the criteria for recording hearing loss cases in several ways, including requiring the recording of Standard Threshold Shifts (10 dB shifts in hearing acuity) that have resulted in a total hearing level of 25 dB above audiometric zero, averaged over the frequencies of 2000, 3000, and 4000 Hz, beginning in 2003.
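    Read literally, the revised criteria combine a 10 dB average shift with a 25 dB absolute hearing level, both averaged over 2000, 3000, and 4000 Hz. A schematic check of that reading (our interpretation, not regulatory guidance):

        def recordable_hearing_loss(baseline_db, current_db):
            """Both criteria: an STS (>= 10 dB average shift) plus a total average
            hearing level of >= 25 dB above audiometric zero, at 2, 3 and 4 kHz."""
            freqs = (2000, 3000, 4000)
            shift = sum(current_db[f] - baseline_db[f] for f in freqs) / 3
            level = sum(current_db[f] for f in freqs) / 3
            return shift >= 10 and level >= 25

        baseline = {2000: 5, 3000: 10, 4000: 15}
        current = {2000: 20, 3000: 25, 4000: 30}
        print(recordable_hearing_loss(baseline, current))  # True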

  3. Thresholds for the cost-effectiveness of interventions: alternative approaches.

    PubMed

    Marseille, Elliot; Larson, Bruce; Kazi, Dhruv S; Kahn, James G; Rosen, Sydney

    2015-02-01

    Many countries use the cost-effectiveness thresholds recommended by the World Health Organization's Choosing Interventions that are Cost-Effective project (WHO-CHOICE) when evaluating health interventions. This project sets the threshold for cost-effectiveness at a cost per disability-adjusted life-year (DALY) averted of less than three times the country's annual gross domestic product (GDP) per capita. Highly cost-effective interventions are defined by a threshold per DALY averted of one times the annual GDP per capita. We argue that reliance on these thresholds reduces the value of cost-effectiveness analyses and makes such analyses too blunt to be useful for most decision-making in the field of public health. Use of these thresholds has little theoretical justification, skirts the difficult but necessary ranking of the relative values of locally-applicable interventions and omits any consideration of what is truly affordable. The WHO-CHOICE thresholds set such a low bar for cost-effectiveness that very few interventions with evidence of efficacy can be ruled out. The thresholds have little value in assessing the trade-offs that decision-makers must confront. We present alternative approaches for applying cost-effectiveness criteria to choices in the allocation of health-care resources.
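    The criticised thresholds themselves are trivial to state in code, which is part of the authors' point: the classification depends on nothing but two multiples of GDP per capita. A sketch (function name ours):

        def who_choice_class(cost_per_daly, gdp_per_capita):
            """WHO-CHOICE labels based on cost per DALY averted."""
            if cost_per_daly < gdp_per_capita:
                return "highly cost-effective"
            if cost_per_daly < 3 * gdp_per_capita:
                return "cost-effective"
            return "not cost-effective"

        print(who_choice_class(2500, 1000))  # 'cost-effective' at 2.5x GDP per DALY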

  4. A directed matched filtering algorithm (DMF) for discriminating hydrothermal alteration zones using the ASTER remote sensing data

    NASA Astrophysics Data System (ADS)

    Fereydooni, H.; Mojeddifar, S.

    2017-09-01

    This study introduced a different procedure to implement the matched filtering (MF) algorithm on the ASTER images to obtain the distribution map of alteration minerals in the northwestern part of the Kerman Cenozoic Magmatic Arc (KCMA). This region contains many areas with porphyry copper mineralization such as Meiduk, Abdar, Kader, Godekolvari, Iju, Serenu, Chahfiroozeh and Parkam. Also argillization, sericitization and propylitization are the most common types of hydrothermal alteration in the area. Matched filtering results were provided for alteration minerals with a matched filtering score, called the MF image. To identify the pixels which contain only one material (endmember), an appropriate threshold value should be applied to the MF image. The chosen threshold classifies an MF image into background and target pixels. This article argues that the current thresholding process (the choice of a threshold) leads to misclassification in the MF image. To address the issue, this paper introduced the directed matched filtering (DMF) algorithm, in which a spectral signature-based filter (SSF) is used instead of the thresholding process. SSF is a user-defined rule package which contains numerical descriptions of the spectral reflectance of alteration minerals: for each mineral, the relevant spectral bands are bounded by upper and lower limits in the SSF filter. SSF was developed for chlorite, kaolinite, alunite, and muscovite minerals to map alteration zones. The validation showed, first, that selecting a contiguous range of MF values could not identify desirable results and, second, that an unexpectedly considerable number of pure pixels was observed at MF scores below the threshold value. Also, the comparison between DMF results and field studies showed an accuracy of 88.51%.
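    Conceptually, the SSF replaces a single cut on the MF score with a per-mineral package of band-range rules. A minimal sketch of that idea with NumPy boolean masks; the band names and limits below are hypothetical placeholders, not the paper's calibrated values.

        import numpy as np

        SSF = {"kaolinite": {"b4": (0.30, 0.55), "b6": (0.10, 0.25)}}  # hypothetical limits

        def ssf_mask(bands, mineral):
            """Pixels whose reflectance falls inside every band range for the mineral."""
            mask = np.ones_like(next(iter(bands.values())), dtype=bool)
            for band, (lo, hi) in SSF[mineral].items():
                mask &= (bands[band] >= lo) & (bands[band] <= hi)
            return mask

        bands = {"b4": np.random.rand(100, 100), "b6": np.random.rand(100, 100)}
        print(ssf_mask(bands, "kaolinite").sum())  # pixels passing the rule package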

  5. Verification of the H2O Linelists with Theoretically Developed Tools

    NASA Technical Reports Server (NTRS)

    Ma, Qiancheng; Tipping, R.; Lavrentieva, N. N.; Dudaryonok, A. S.

    2013-01-01

    Two basic rules (i.e., the pair identity and the smooth variation rules) resulting from the properties of the energy levels and wave functions of H2O states govern how the spectroscopic parameters vary with the H2O lines within the individually defined groups of lines. With these rules, for those lines involving high j states in the same groups, variations of all their spectroscopic parameters (i.e., the transition frequency, intensity, pressure broadened half-width, pressure-induced shift, and temperature exponent) can be well monitored. Thus, the rules can serve as simple and effective tools to screen the H2O spectroscopic data listed in the HITRAN database and verify the latter's accuracies. By checking violations of the rules occurring among the data within the individual groups, possible errors can be picked up, and possible missing lines in the linelist whose intensities are above the intensity threshold can be identified. We have used these rules to check the accuracies of the spectroscopic parameters and the completeness of the linelists for several important H2O vibrational bands. Based on our results, the line frequencies in HITRAN 2008 are consistently accurate. For the line intensity, we have found that there are a substantial number of lines whose intensity values are questionable. With respect to other parameters, many mistakes have been found. The above claims are consistent with the well-known fact that values of these parameters in HITRAN contain larger uncertainties. Furthermore, supplements of the missing line list, consisting of line assignments and positions, can be developed from the screening results.

  6. Stochastic prey arrivals and crab spider giving-up times: simulations of spider performance using two simple "rules of thumb".

    PubMed

    Kareiva, Peter; Morse, Douglass H; Eccleston, Jill

    1989-03-01

    We compared the patch-choice performances of an ambush predator, the crab spider Misumena vatia (Thomisidae) hunting on common milkweed Asclepias syriaca (Asclepiadaceae) umbels, with two stochastic rule-of-thumb simulation models: one that employed a threshold giving-up time and one that assumed a fixed probability of moving. Adult female Misumena were placed on milkweed plants with three umbels, each with markedly different numbers of flower-seeking prey. Using a variety of visitation regimes derived from observed visitation patterns of insect prey, we found that decreases in among-umbel variance in visitation rates or increases in overall mean visitation rates reduced the "clarity of the optimum" (the difference in the yield obtained as foraging behavior changes), both locally and globally. Yield profiles from both models were extremely flat or jagged over a wide range of prey visitation regimes; thus, optimal and "next-best" strategies differed only modestly over large parts of the "foraging landscape". Although optimal yields from fixed-probability simulations were one-third to one-half those obtained from threshold simulations, spiders appear to depart umbels in accordance with the fixed-probability rule.

  7. Enhancement mechanism of the additional absorbent on the absorption of the absorbing composite using a type-based mixing rule

    NASA Astrophysics Data System (ADS)

    Xu, Yonggang; Yuan, Liming; Zhang, Deyuan

    2016-04-01

    A silicone rubber composite filled with carbonyl iron particles and four different carbonous materials (carbon black, graphite, carbon fiber or multi-walled carbon nanotubes) was prepared using a two-roller mixer. The complex permittivity and permeability were measured using a vector network analyzer at frequencies of 2-18 GHz. Then a type-based mixing rule distinguishing dielectric and magnetic absorbents was proposed to reveal the mechanism enhancing the permittivity and permeability. The enhancement effect lies in the decreased percolation threshold and the change of the pending parameter as the carbonous materials were added. The reflection loss (RL) result showed that the added carbonous materials enhanced the absorption in the lower frequency range, the RL decrement value being about 2 dB at 4-5 GHz with a thickness of 1 mm. All the added carbonous materials reinforced the shielding effectiveness (SE) of the composites. The maximum increment value of the SE was about 3.23 dB at 0.5 mm thickness and 4.65 dB at 1 mm. The added carbonous materials could be effective additives for enhancing the absorption and shielding properties of the absorbers.

  8. Reverse engineering the gap gene network of Drosophila melanogaster.

    PubMed

    Perkins, Theodore J; Jaeger, Johannes; Reinitz, John; Glass, Leon

    2006-05-01

    A fundamental problem in functional genomics is to determine the structure and dynamics of genetic networks based on expression data. We describe a new strategy for solving this problem and apply it to recently published data on early Drosophila melanogaster development. Our method is orders of magnitude faster than current fitting methods and allows us to fit different types of rules for expressing regulatory relationships. Specifically, we use our approach to fit models using a smooth nonlinear formalism for modeling gene regulation (gene circuits) as well as models using logical rules based on activation and repression thresholds for transcription factors. Our technique also allows us to infer regulatory relationships de novo or to test network structures suggested by the literature. We fit a series of models to test several outstanding questions about gap gene regulation, including regulation of and by hunchback and the role of autoactivation. Based on our modeling results and validation against the experimental literature, we propose a revised network structure for the gap gene system. Interestingly, some relationships in standard textbook models of gap gene regulation appear to be unnecessary for or even inconsistent with the details of gap gene expression during wild-type development.

  9. Identification of Subgroups of Women with Carpal Tunnel Syndrome with Central Sensitization.

    PubMed

    Fernández-de-Las-Peñas, César; Fernández-Muñoz, Juan J; Navarro-Pardo, Esperanza; da-Silva-Pocinho, Ricardo F; Ambite-Quesada, Silvia; Pareja, Juan A

    2016-09-01

    Identification of subjects with different sensitization mechanisms can help to identify better therapeutic strategies for carpal tunnel syndrome (CTS). The aim of the current study was to identify subgroups of women with CTS with different levels of sensitization. A total of 223 women with CTS were recruited. Self-reported variables included pain intensity, function, disability, and depression. Pressure pain thresholds (PPT) were assessed bilaterally over median, ulnar, and radial nerves, C5-C6 joint, carpal tunnel, and tibialis anterior to assess widespread pressure pain hyperalgesia. Heat (HPT) and cold (CPT) pain thresholds were also bilaterally assessed over the carpal tunnel and the thenar eminence to determine thermal pain hyperalgesia. Pinch grip force between the thumb and the remaining fingers was calculated to determine motor assessment. Subgroups were determined according to the status on a previous clinical prediction rule: PPT over the affected C5-C6 joint < 137 kPa, HPT on affected carpal tunnel < 39.6°C, and general health > 66 points. The ANOVA showed that women within group 1 (positive rule, n = 60) exhibited greater bilateral widespread pressure hyperalgesia (P < 0.001) and lower bilateral thermal pain thresholds (P < 0.001) than those within group 2 (negative rule, n = 162). Women in group 1 also exhibited higher depression than those in group 2 (P = 0.023). No differences in self-reported variables were observed. This study showed that a clinical prediction rule originally developed for identifying women with CTS who are likely to respond favorably to manual physical therapy was able to identify women exhibiting higher widespread pressure hypersensitivity and thermal hyperalgesia. This subgroup of women with CTS exhibiting higher sensitization may need specific therapeutic programs. © 2016 American Academy of Pain Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
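    The prediction rule used to form the subgroups is a conjunction of three cutoffs and is straightforward to encode (a sketch; argument names are ours):

        def cts_rule_positive(ppt_c5c6_kpa, hpt_carpal_c, general_health):
            """Positive when all three criteria of the clinical prediction rule hold."""
            return ppt_c5c6_kpa < 137 and hpt_carpal_c < 39.6 and general_health > 66

        print(cts_rule_positive(120.0, 38.5, 70))  # True -> group 1 (positive rule)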

  10. Simulation-based model checking approach to cell fate specification during Caenorhabditis elegans vulval development by hybrid functional Petri net with extension.

    PubMed

    Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru

    2009-04-27

    Model checking approaches were applied to biological pathway validations around 2003. Recently, Fisher et al. have proved the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete and state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be an indispensable part of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. Our key motivation in this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the use of the model checking approach. A novel method of modeling and simulating biological systems with the use of model checking is proposed based on hybrid functional Petri net with extension (HFPNe) as the framework dealing with both discrete and continuous events. Firstly, we construct a quantitative VPC fate model with 1761 components by using HFPNe. Secondly, we employ two major biological fate determination rules - Rule I and Rule II - in the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully done by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. Hybrid lineages, however, are hard to capture with a discrete model, because they occur when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986. Our simulation results suggest that Rule I, which cannot be applied with qualitative model checking, is more reasonable than Rule II, owing to the high coverage of predicted fate patterns (except for the genotype of lin-15ko; lin-12ko double mutants). Further insights are also suggested. The quantitative simulation-based model checking approach is a useful means of providing valuable biological insights and a better understanding of biological systems and observation data that may be hard to capture with the qualitative approach.

  11. Price of anarchy is maximized at the percolation threshold.

    PubMed

    Skinner, Brian

    2015-05-01

    When many independent users try to route traffic through a network, the flow can easily become suboptimal as a consequence of congestion of the most efficient paths. The degree of this suboptimality is quantified by the so-called price of anarchy (POA), but so far there are no general rules for when to expect a large POA in a random network. Here I address this question by introducing a simple model of flow through a network with randomly placed congestible and incongestible links. I show that the POA is maximized precisely when the fraction of congestible links matches the percolation threshold of the lattice. Both the POA and the total cost demonstrate critical scaling near the percolation threshold.

  12. 76 FR 27677 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-12

    ...] Index and other specified exchange traded products. See ISE Rule 2100(c)(13). See Securities... that this threshold differentiation is appropriate because lower priced securities tend to be more...

  13. 78 FR 60381 - Amendments to the 2013 Mortgage Rules Under the Equal Credit Opportunity Act (Regulation B), Real...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-01

    ...This final rule amends some of the final mortgage rules issued by the Bureau of Consumer Financial Protection (Bureau) in January 2013. These amendments focus primarily on loss mitigation procedures under Regulation X's servicing provisions, amounts counted as loan originator compensation to retailers of manufactured homes and their employees for purposes of applying points and fees thresholds under the Home Ownership and Equity Protection Act and the Ability-to-Repay rules in Regulation Z, exemptions available to creditors that operate predominantly in "rural or underserved" areas for various purposes under the mortgage regulations, application of the loan originator compensation rules to bank tellers and similar staff, and the prohibition on creditor-financed credit insurance. The Bureau also is adjusting the effective dates for certain provisions of the loan originator compensation rules. In addition, the Bureau is adopting technical and wording changes for clarification purposes to Regulations B, X, and Z.

  14. Efficient mining of association rules for the early diagnosis of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Chaves, R.; Górriz, J. M.; Ramírez, J.; Illán, I. A.; Salas-Gonzalez, D.; Gómez-Río, M.

    2011-09-01

    In this paper, a novel technique based on association rules (ARs) is presented in order to find relations among activated brain areas in single photon emission computed tomography (SPECT) imaging. In this sense, the aim of this work is to discover associations among attributes which characterize the perfusion patterns of normal subjects and to make use of them for the early diagnosis of Alzheimer's disease (AD). Firstly, voxel-as-feature-based activation estimation methods are used to find the tridimensional activated brain regions of interest (ROIs) for each patient. Secondly, these ROIs serve as input for mining ARs among activation blocks, with a minimum support and confidence, using a set of controls. In this context, support and confidence measures are related to the proportion of functional areas which are singularly and mutually activated across the brain. Finally, we perform image classification by comparing the number of ARs verified by each subject under test to a given threshold that depends on the number of previously mined rules. Several classification experiments were carried out in order to evaluate the proposed methods using a SPECT database that consists of 41 controls (NOR) and 56 AD patients labeled by trained physicians. The proposed methods were validated by means of the leave-one-out cross validation strategy, yielding up to 94.87% classification accuracy, thus outperforming recently developed methods for computer aided diagnosis of AD.
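    The final classification step, counting how many mined rules a test subject verifies against a threshold, can be sketched as follows (the rule representation and the 90% fraction are illustrative assumptions, not the paper's exact criterion):

        def classify_by_rules(subject_rois, mined_rules, fraction=0.9):
            """Label NOR if the subject verifies at least `fraction` of the control rules."""
            verified = sum(1 for antecedent, consequent in mined_rules
                           if antecedent <= subject_rois and consequent <= subject_rois)
            return "NOR" if verified >= fraction * len(mined_rules) else "AD"

        rules = [({"roi1"}, {"roi2"}), ({"roi2"}, {"roi3"})]  # (antecedent, consequent) ROI sets
        print(classify_by_rules({"roi1", "roi2", "roi3"}, rules))  # NOR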

  15. Nodule Detection in a Lung Region that's Segmented with Using Genetic Cellular Neural Networks and 3D Template Matching with Fuzzy Rule Based Thresholding

    PubMed Central

    Osman, Onur; Ucan, Osman N.

    2008-01-01

    Objective The purpose of this study was to develop a new method for automated lung nodule detection in serial section CT images, using the characteristics of the 3D appearance of the nodules that distinguish them from the vessels. Materials and Methods Lung nodules were detected in four steps. First, to reduce the number of regions of interest (ROIs) and the computation time, the lung regions of the CTs were segmented using Genetic Cellular Neural Networks (G-CNN). Then, for each lung region, ROIs were specified using an 8-directional search, and +1 or -1 values were assigned to each voxel. The 3D ROI image was obtained by combining all the 2-dimensional (2D) ROI images. A 3D template was created to find the nodule-like structures on the 3D ROI image. Convolution of the 3D ROI image with the proposed template strengthens the shapes that are similar to those of the template and weakens the other ones. Finally, fuzzy rule based thresholding was applied to select the final ROIs. To test the system's efficiency, we used 16 cases with a total of 425 slices, which were taken from the Lung Image Database Consortium (LIDC) dataset. Results The computer aided diagnosis (CAD) system achieved 100% sensitivity with 13.375 FPs per case when the nodule thickness was greater than or equal to 5.625 mm. Conclusion Our results indicate that the detection performance of our algorithm is satisfactory, and this may well improve the performance of computer-aided detection of lung nodules. PMID:18253070

  16. Notice and Supplemental Determination for Renewable Fuels Produced Under the Final Renewable Fuel Standard Program from Canola Oil

    EPA Pesticide Factsheets

    This rule finalizes the determination that canola oil biodiesel meets the lifecycle greenhouse gas (GHG) emission reduction threshold of 50% required by the Energy Independence and Security Act of 2007 (EISA).

  17. Fact Sheet: Clean Air Act Section 112(r): Accidental Release Prevention / Risk Management Plan Rule

    EPA Pesticide Factsheets

    EPA is required to publish regulations and guidance for chemical accident prevention at facilities that pose the greatest risk of harm from accidental releases of regulated flammable and toxic substances above threshold quantities.

  18. 77 FR 19741 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-02

    ... will be attractive to all investors and benefit all market participants. The thresholds in the... such rule change if it appears to the Commission that such action is necessary or appropriate in the...

  19. Wavelet methodology to improve single unit isolation in primary motor cortex cells.

    PubMed

    Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A

    2015-05-15

    The proper isolation of action potentials recorded extracellularly from neural tissue is an active area of research in the fields of neuroscience and biomedical signal processing. This paper presents an isolation methodology for neural recordings using the wavelet transform (WT), a statistical thresholding scheme, and the principal component analysis (PCA) algorithm. The effectiveness of five different mother wavelets was investigated: biorthogonal, Daubechies, discrete Meyer, symmetric, and Coifman; along with three different wavelet coefficient thresholding schemes: fixed form threshold, Stein's unbiased estimate of risk, and minimax; and two different thresholding rules: soft and hard thresholding. The signal quality was evaluated using three different statistical measures: mean-squared error, root-mean squared, and signal to noise ratio. The clustering quality was evaluated using two different statistical measures: isolation distance and L-ratio. This research shows that the selection of the mother wavelet has a strong influence on the clustering and isolation of single unit neural activity, with the Daubechies 4 wavelet and minimax thresholding scheme performing the best. Copyright © 2015. Published by Elsevier B.V.
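    As an illustration of the thresholding stage in such a pipeline, here is a minimal wavelet soft-thresholding sketch using the fixed-form (universal) threshold with a Daubechies 4 wavelet; it assumes the PyWavelets package, and the minimax scheme the authors found best differs only in how the threshold value is computed.

        import numpy as np
        import pywt  # assumes PyWavelets is installed

        def denoise(signal, wavelet="db4", level=4):
            """Soft-threshold the detail coefficients with the universal threshold."""
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise scale estimate
            thr = sigma * np.sqrt(2 * np.log(len(signal)))      # fixed-form threshold
            coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[:len(signal)]

        x = np.sin(np.linspace(0, 8 * np.pi, 1024)) + 0.3 * np.random.randn(1024)
        print(np.std(x - denoise(x)))  # magnitude of the removed noise component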

  20. How insurance affects altruistic provision in threshold public goods games.

    PubMed

    Zhang, Jianlei; Zhang, Chunyan; Cao, Ming

    2015-03-13

    The occurrence and maintenance of cooperative behaviors in public goods systems have attracted great research attention across multiple disciplines. A threshold public goods game requires a minimum amount of contributions to be collected from a group of individuals for provision to occur. Here we extend the common binary-strategy combination of cooperation and defection by adding a third strategy, called insured cooperation, which corresponds to buying an insurance covering the potential loss resulting from an unsuccessful public goods game. In particular, only contributing agents can opt to be insured, which reduces the amount of the potential loss they face. Theoretical computations suggest that when agents face the potential aggregate risk in threshold public goods games, more contributions occur with increasing compensation from insurance. Moreover, permitting the adoption of insurance significantly enhances individual contributions and facilitates provision, especially when the required threshold is high. This work also relates the strategy competition outcomes to different allocation rules once the resulting contributions exceed the threshold point in populations nested within a dilemma.

  1. Constraints on the FRB rate at 700-900 MHz

    NASA Astrophysics Data System (ADS)

    Connor, Liam; Lin, Hsiu-Hsien; Masui, Kiyoshi; Oppermann, Niels; Pen, Ue-Li; Peterson, Jeffrey B.; Roman, Alexander; Sievers, Jonathan

    2016-07-01

    Estimating the all-sky rate of fast radio bursts (FRBs) has been difficult due to small-number statistics and the fact that they are seen by disparate surveys in different regions of the sky. In this paper we provide limits for the FRB rate at 800 MHz based on the only burst detected at frequencies below 1.4 GHz, FRB 110523. We discuss the difficulties in rate estimation, particularly in providing an all-sky rate above a single fluence threshold. We find an implied rate between 700 and 900 MHz that is consistent with the rate at 1.4 GHz, scaling to 6.4^{+29.5}_{-5.0} × 10^3 sky^{-1} d^{-1} for an HTRU-like survey. This is promising for upcoming experiments below a GHz like CHIME and UTMOST, for which we forecast detection rates. Given 110523's discovery at 32σ with nothing weaker detected, down to the threshold of 8σ, we find consistency with a Euclidean flux distribution but disfavour steep distributions, ruling out γ > 2.2.

  2. Online Sensor Fault Detection Based on an Improved Strong Tracking Filter

    PubMed Central

    Wang, Lijuan; Wu, Lifeng; Guan, Yong; Wang, Guohui

    2015-01-01

    We propose a method for online sensor fault detection that is based on an improved strong tracking filter, the strong tracking cubature Kalman filter (STCKF). The cubature rule is used to estimate states, improving the accuracy of the estimates in the nonlinear case. A residual, the difference between an estimated value and the true value, is regarded as a signal that carries fault information. A threshold is set at a reasonable level and compared with the residuals to determine whether or not the sensor is faulty. The proposed method requires only a nominal plant model and uses the STCKF to estimate the original state vector. The effectiveness of the algorithm is verified by simulation on a drum-boiler model. PMID:25690553
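    The detection step itself reduces to thresholding the filter residuals; a minimal sketch (the STCKF that produces the estimates is out of scope here, and the numbers are illustrative):

        def detect_fault(measurements, estimates, threshold):
            """Indices where the residual (measurement minus estimate) exceeds the threshold."""
            return [t for t, (z, x) in enumerate(zip(measurements, estimates))
                    if abs(z - x) > threshold]

        z = [1.00, 1.02, 0.98, 2.50, 1.01]   # sensor readings with one faulty spike
        x = [1.00, 1.01, 1.00, 1.00, 1.00]   # filter state estimates
        print(detect_fault(z, x, threshold=0.5))  # [3]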

  3. Excusing exclusion: Accounting for rule-breaking and sanctions in a Swedish methadone clinic.

    PubMed

    Petersson, Frida J M

    2013-11-01

    Methadone maintenance treatment has been subjected to much debate and controversy in Sweden during the last decades. Thresholds for getting access are high and control policies strict within the programmes. This article analyses how professionals working in a Swedish methadone clinic discuss and decide on appropriate responses to clients' rule-breaking behaviour. The research data consist of field notes from observations of three interprofessional team meetings where different clients' illicit drug use is discussed. A micro-sociological approach and accounts analysis were applied to the data. During their decision-oriented talk at the meetings, the professionals account for: (1) sanctions, (2) nonsanction, (3) mildness. In accounting for (2) and (3), they also account for clients' rule-breaking behaviour. Analysis shows how these ways of accounting are concerned with locating blame and responsibility for the act in question, as well as with constructing excuses and justifications for the clients and for the professionals themselves. In general, these results demonstrate that maintenance treatment in everyday professional decision-making, far from being a neutral evidence-based practice, involves a substantial amount of professional discretion and moral judgement. Sanctions are chosen according to the way in which a deviation from the rules is explained and, in doing so, a certain behaviour is deemed to be serious, dangerous and unacceptable - or excusable. Copyright © 2013 Elsevier B.V. All rights reserved.

  4. Gravitropic responses of the Avena coleoptile in space and on clinostats. II. Is reciprocity valid?

    NASA Technical Reports Server (NTRS)

    Johnsson, A.; Brown, A. H.; Chapman, D. K.; Heathcote, D.; Karlsson, C.

    1995-01-01

    Experiments were undertaken to determine if the reciprocity rule is valid for gravitropic responses of oat coleoptiles in the acceleration region below 1 g. The rule predicts that the gravitropic response should be proportional to the product of the applied acceleration and the stimulation time. Seedlings were cultivated on 1 g centrifuges and transferred to test centrifuges to apply a transverse g-stimulation. Since responses occurred in microgravity, the uncertainties about the validity of clinostat simulation of weightlessness were avoided. Plants at two stages of coleoptile development were tested. Plant responses were obtained using time-lapse video recordings that were analyzed after the flight. Stimulus intensities and durations were varied and ranged from 0.1 to 1.0 g and from 2 to 130 min, respectively. For threshold g-doses the reciprocity rule was obeyed. The threshold dose was of the order of 55 g s and 120 g s, respectively, for the two groups of plants investigated. Reciprocity was also studied for bending responses ranging from just above the detectable level to about 10 degrees. The validity of the rule could not be confirmed for higher g-doses, chiefly because the data were more variable. It was investigated whether the uniformity of the overall response data increased when the gravitropic dose was defined as (g^m × t) with m-values different from unity. This was not the case, and the reciprocity concept is therefore valid also in the hypogravity region. The concept of gravitropic dose, the product of the transverse acceleration and the stimulation time, is also well-defined in the acceleration region studied. With the same hardware, tests were done on earth where responses occurred on clinostats. The results did not contradict the reciprocity rule, but scatter in the data was large.

  5. Total and partial photoneutron cross sections for Pb isotopes

    NASA Astrophysics Data System (ADS)

    Kondo, T.; Utsunomiya, H.; Goriely, S.; Daoutidis, I.; Iwamoto, C.; Akimune, H.; Okamoto, A.; Yamagata, T.; Kamata, M.; Itoh, O.; Toyokawa, H.; Lui, Y.-W.; Harada, H.; Kitatani, F.; Hilaire, S.; Koning, A. J.

    2012-07-01

    Using quasimonochromatic laser-Compton scattering γ rays, total photoneutron cross sections were measured for 206,207,208Pb near neutron threshold with a high-efficiency 4π neutron detector. Partial E1 and M1 photoneutron cross sections along with total cross sections were determined for 207,208Pb at four energies near threshold by measuring anisotropies in photoneutron emission with linearly polarized γ rays. The E1 strength dominates over the M1 strength in the neutron channel, where E1 photoneutron cross sections show extra strength of the pygmy dipole resonance in 207,208Pb near the neutron threshold corresponding to 0.32%-0.42% of the Thomas-Reiche-Kuhn sum rule. Several μ_N^2 units of B(M1)↑ strength were observed in 207,208Pb just above neutron threshold, which correspond to an M1 cross section less than 10% of the total photoneutron cross section.

  6. High-wafer-yield, high-performance vertical cavity surface-emitting lasers

    NASA Astrophysics Data System (ADS)

    Li, Gabriel S.; Yuen, Wupen; Lim, Sui F.; Chang-Hasnain, Constance J.

    1996-04-01

    Vertical cavity surface emitting lasers (VCSELs) with very low threshold current and voltage of 340 μA and 1.5 V are achieved. The wafers are grown by molecular beam epitaxy with a highly accurate, low-cost and versatile pre-growth calibration technique. One-hundred percent VCSEL wafer yield is obtained. Low threshold current is achieved with a native oxide confined structure with excellent current confinement. Single transverse mode operation with a stable, predetermined polarization direction up to 18 times threshold is also achieved, due to the stable index guiding provided by the structure. This is the highest value reported to date for VCSELs. We have established that p-contact annealing in these devices is crucial for low voltage operation, contrary to the general belief. Uniform doping in the mirrors also appears not to be inferior to complicated doping engineering. With these design rules, very low threshold voltage VCSELs are achieved with very simple growth and fabrication steps.

  7. RMP Guidance for Warehouses - Chapter 1: General Applicability

    EPA Pesticide Factsheets

    Helps you determine if you are subject to Part 68, the risk management program rule. It covers you if you are the owner/operator of a stationary source, that has more than a threshold quantity, of a regulated substance, in a process.

  8. RMP Guidance for Chemical Distributors - Chapter 1: General Applicability

    EPA Pesticide Factsheets

    The Risk Management Program rule covers you if you are: the owner/operator of a stationary source, that has more than a threshold quantity, of a regulated substance, in a process. Follow the flowchart, definitions, and Q & A's to determine applicability.

  9. General RMP Guidance - Chapter 1: General Applicability

    EPA Pesticide Factsheets

    Part 68, the risk management program rule, covers you if you are the owner or operator of a stationary source (facility), that has more than a threshold quantity, of a regulated toxic or flammable substance(e.g., ammonia or chlorine), in a process.

  10. 78 FR 13405 - Patient Protection and Affordable Care Act; Health Insurance Market Rules; Rate Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-27

    ...This final rule implements provisions related to fair health insurance premiums, guaranteed availability, guaranteed renewability, single risk pools, and catastrophic plans, consistent with title I of the Patient Protection and Affordable Care Act, as amended by the Health Care and Education Reconciliation Act of 2010, referred to collectively as the Affordable Care Act. The final rule clarifies the approach used to enforce the applicable requirements of the Affordable Care Act with respect to health insurance issuers and group health plans that are non-federal governmental plans. This final rule also amends the standards for health insurance issuers and states regarding reporting, utilization, and collection of data under the federal rate review program, and revises the timeline for states to propose state-specific thresholds for review and approval by the Centers for Medicare & Medicaid Services (CMS).

  11. Variability of blood alcohol content (BAC) determinations: the role of measurement uncertainty, significant figures, and decision rules for compliance assessment in the frame of a multiple BAC threshold law.

    PubMed

    Zamengo, Luca; Frison, Giampietro; Tedeschi, Gianpaola; Frasson, Samuela; Zancanaro, Flavio; Sciarrone, Rocco

    2014-10-01

    The measurement of blood-alcohol content (BAC) is a crucial analytical determination required to assess if an offence (e.g. driving under the influence of alcohol) has been committed. For various reasons, results of forensic alcohol analysis are often challenged by the defence. As a consequence, measurement uncertainty becomes a critical topic when assessing compliance with specification limits for forensic purposes. The aims of this study were: (1) to investigate major sources of variability for BAC determinations; (2) to estimate measurement uncertainty for routine BAC determinations; (3) to discuss the role of measurement uncertainty in compliance assessment; (4) to set decision rules for a multiple BAC threshold law, as provided in the Italian Highway Code; (5) to address the topic of the zero-alcohol limit from the forensic toxicology point of view; and (6) to discuss the role of significant figures and rounding errors on measurement uncertainty and compliance assessment. Measurement variability was investigated by the analysis of data collected from real cases and internal quality control. The contribution of both pre-analytical and analytical processes to measurement variability was considered. The resulting expanded measurement uncertainty was 8.0%. Decision rules for the multiple BAC threshold Italian law were set by adopting a guard-banding approach. 0.1 g/L was chosen as cut-off level to assess compliance with the zero-alcohol limit. The role of significant figures and rounding errors in compliance assessment was discussed by providing examples which stressed the importance of these topics for forensic purposes. Copyright © 2014 John Wiley & Sons, Ltd.
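    A guard-banding decision rule of the kind discussed subtracts the expanded uncertainty from the measured value before comparing it against the legal limit. A simplified sketch using the 8.0% expanded uncertainty reported above (the paper's actual decision rules are more detailed):

        def bac_compliance(measured_g_l, limit_g_l, rel_u_expanded=0.080):
            """Assert a violation only if the guard-banded BAC still exceeds the limit."""
            guarded = measured_g_l * (1 - rel_u_expanded)
            return "above limit" if guarded > limit_g_l else "cannot assert violation"

        print(bac_compliance(0.54, 0.5))  # 0.497 g/L -> cannot assert violation
        print(bac_compliance(0.56, 0.5))  # 0.515 g/L -> above limit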

  12. Finite-width Laplacian sum rules for 2++ tensor glueball in the instanton vacuum model

    NASA Astrophysics Data System (ADS)

    Chen, Junlong; Liu, Jueping

    2017-01-01

    The more carefully defined and more appropriate 2++ tensor glueball current is an SU_c(3) gauge-invariant, symmetric, traceless, and conserved Lorentz-irreducible tensor. After Lorentz decomposition, the invariant amplitude of the correlation function is abstracted and calculated based on the semiclassical expansion for quantum chromodynamics (QCD) in the instanton liquid background. In addition to taking the perturbative contribution into account, we calculate the contribution arising from the interaction (or the interference) between instantons and the quantum gluon fields, which is infrared free. Instead of the usual zero-width approximation for the resonances, the Breit-Wigner form with a correct threshold behavior for the spectral function of the finite-width three resonances is adopted. The properties of the 2++ tensor glueball are investigated via a family of the QCD Laplacian sum rules for the invariant amplitude. The values of the mass, decay width, and coupling constants for the 2++ resonance in which the glueball fraction is dominant are obtained.

  13. Effective application of improved profit-mining algorithm for the interday trading model.

    PubMed

    Hsieh, Yu-Lung; Yang, Don-Lin; Wu, Jungpin

    2014-01-01

    Many real world applications of association rule mining from large databases help users make better decisions. However, they do not work well in financial markets at this time. In addition to a high profit, an investor also looks for a low risk trading with a better rate of winning. The traditional approach of using minimum confidence and support thresholds needs to be changed. Based on an interday model of trading, we proposed effective profit-mining algorithms which provide investors with profit rules including information about profit, risk, and winning rate. Since profit-mining in the financial market is still in its infant stage, it is important to detail the inner working of mining algorithms and illustrate the best way to apply them. In this paper we go into details of our improved profit-mining algorithm and showcase effective applications with experiments using real world trading data. The results show that our approach is practical and effective with good performance for various datasets.

  15. [The diagnostic and the exclusion scores for pulmonary embolism].

    PubMed

    Junod, A

    2015-05-27

    Several clinical scores for the diagnosis of pulmonary embolism (PE) have been published. The most popular ones are the Wells score and the revised Geneva score; simplified versions of both exist and have been validated. The two scores have common properties, but a major difference of the Wells score is the inclusion of a feature based on clinical judgment. These two scores, in combination with D-dimer measurement, have been used to rule out PE. An important improvement in this process has recently taken place with the use of an adjustable, age-dependent D-dimer threshold for patients over 50 years.
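    The age-dependent adjustment referred to is commonly implemented as age x 10 ug/L (FEU) for patients over 50; a sketch under that assumption, since the abstract itself does not spell out the formula:

        def d_dimer_cutoff_ug_l(age_years, standard=500):
            """Age-adjusted D-dimer cutoff: age x 10 ug/L above 50 years, else standard."""
            return age_years * 10 if age_years > 50 else standard

        print(d_dimer_cutoff_ug_l(45))  # 500
        print(d_dimer_cutoff_ug_l(72))  # 720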

  16. A hierarchical classification method for finger knuckle print recognition

    NASA Astrophysics Data System (ADS)

    Kong, Tao; Yang, Gongping; Yang, Lu

    2014-12-01

    Finger knuckle print has recently been seen as an effective biometric technique. In this paper, we propose a hierarchical classification method for finger knuckle print recognition, which is rooted in traditional score-level fusion methods. In the proposed method, we first take the Gabor feature as the basic feature for finger knuckle print recognition, and then a new decision rule is defined based on a predefined threshold. Finally, the secondary feature, speeded-up robust features (SURF), is applied to those users who cannot be recognized by the basic feature. Extensive experiments are performed to evaluate the proposed method, and experimental results show that it can achieve a promising performance.

  17. Congestion control for a fair packet delivery in WSN: from a complex system perspective.

    PubMed

    Aguirre-Guerrero, Daniela; Marcelín-Jiménez, Ricardo; Rodriguez-Colina, Enrique; Pascoe-Chalke, Michael

    2014-01-01

    In this work, we propose that packets travelling across a wireless sensor network (WSN) can be seen as the active agents that make up a complex system, just like a bird flock or a fish school, for instance. From this perspective, the tools and models that have been developed to study this kind of system have been applied in order to create a distributed congestion control based on a set of simple rules programmed at the nodes of the WSN. Our results show that it is possible to adapt the carried traffic to the network capacity, even under stressing conditions. Also, the network performance degrades smoothly when the traffic goes beyond a threshold set by the proposed self-organized control. In contrast, without any control, the network collapses before this threshold. The use of the proposed solution provides an effective strategy to address some of the common problems found in WSN deployment by providing a fair packet delivery. In addition, network congestion is mitigated using adaptive traffic mechanisms based on a satisfaction parameter assessed by each packet, which has an impact on the global satisfaction of the traffic carried by the WSN.

  18. Evaluation of enhanced sanctions for higher BACs : summary of states' laws

    DOT National Transportation Integrated Search

    2001-03-01

    Twenty-nine states have a statute, regulation, or rule that provides for additional or more severe sanctions for driving under the influence (DUI) offenders with a "high" BAC. States vary in terms of the high-BAC threshold, which ranges from .15 to ....

  19. 76 FR 34630 - Approval and Promulgation of Implementation Plans; New Hampshire: Prevention of Significant...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-14

    ... Promulgation of Implementation Plans; New Hampshire: Prevention of Significant Deterioration; Greenhouse Gas... greenhouse gas (GHG) emissions. This rule clarifies the applicable thresholds in the New Hampshire SIP... Significant Deterioration Provisions Concerning Greenhouse Gas Emitting-Sources in State Implementation Plans...

  20. Meta‐analysis of test accuracy studies using imputation for partial reporting of multiple thresholds

    PubMed Central

    Deeks, J.J.; Martin, E.C.; Riley, R.D.

    2017-01-01

    Introduction For tests reporting continuous results, primary studies usually provide test performance at multiple but often different thresholds. This creates missing data when performing a meta‐analysis at each threshold. A standard meta‐analysis (no imputation [NI]) ignores such missing data. A single imputation (SI) approach was recently proposed to recover missing threshold results. Here, we propose a new method that performs multiple imputation of the missing threshold results using discrete combinations (MIDC). Methods The new MIDC method imputes missing threshold results by randomly selecting from the set of all possible discrete combinations which lie between the results for 2 known bounding thresholds. Imputed and observed results are then synthesised at each threshold. This is repeated multiple times, and the multiple pooled results at each threshold are combined using Rubin's rules to give final estimates. We compared the NI, SI, and MIDC approaches via simulation. Results Both imputation methods outperform the NI method in simulations. There was generally little difference in the SI and MIDC methods, but the latter was noticeably better in terms of estimating the between‐study variances and generally gave better coverage, due to slightly larger standard errors of pooled estimates. Given selective reporting of thresholds, the imputation methods also reduced bias in the summary receiver operating characteristic curve. Simulations demonstrate the imputation methods rely on an equal threshold spacing assumption. A real example is presented. Conclusions The SI and, in particular, MIDC methods can be used to examine the impact of missing threshold results in meta‐analysis of test accuracy studies. PMID:29052347
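
    The final pooling step named above, Rubin's rules, combines per-imputation estimates by averaging the point estimates and adding within- and between-imputation variance; a generic sketch follows (variable names are illustrative, not the authors' code).

    ```python
    # Rubin's rules: q[i] is the estimate and u[i] its variance from imputation i.
    import math

    def rubins_rules(q, u):
        m = len(q)
        qbar = sum(q) / m                                # pooled point estimate
        w = sum(u) / m                                   # within-imputation variance
        b = sum((qi - qbar) ** 2 for qi in q) / (m - 1)  # between-imputation variance
        t = w + (1 + 1 / m) * b                          # total variance
        return qbar, math.sqrt(t)

    est, se = rubins_rules(q=[0.82, 0.79, 0.85, 0.80], u=[0.004, 0.005, 0.004, 0.006])
    print(f"pooled={est:.3f}, SE={se:.3f}")
    ```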

  1. The Rule of Five for Non-Oral Routes of Drug Delivery: Ophthalmic, Inhalation and Transdermal

    PubMed Central

    Choy, Young Bin; Prausnitz, Mark R.

    2011-01-01

    The Rule of Five predicts suitability of drug candidates, but was developed primarily using orally administered drugs. Here, we test whether the Rule of Five predicts drugs for delivery via non-oral routes, specifically ophthalmic, inhalation and transdermal. We assessed 111 drugs approved by FDA for those routes of administration and found that >98% of current non-oral drugs have physicochemical properties within the limits of the Rule of Five. However, given the inherent bias in the dataset, this analysis was not able to assess whether drugs with properties outside those limits are poor candidates. Indeed, further analysis indicates that drugs well outside the Rule of Five limits, including hydrophilic macromolecules, can be delivered by inhalation. In contrast, drugs currently administered across skin fall within more stringent limits than predicted by the Rule of Five, but new transdermal delivery technologies may make these constraints obsolete by dramatically increasing skin permeability. The Rule of Five does appear to apply well to ophthalmic delivery. We conclude that although current non-oral drugs mostly have physicochemical properties within the Rule of Five thresholds, the Rule of Five should not be used to predict non-oral drug candidates, especially for inhalation and transdermal routes. PMID:20967491
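
    For reference, the Rule of Five screen itself is easy to state in code: molecular weight ≤ 500 Da, logP ≤ 5, H-bond donors ≤ 5, and H-bond acceptors ≤ 10, with poor absorption expected when more than one limit is violated. The limits below are the published rule; the example values are illustrative.

    ```python
    # Lipinski Rule of Five check using the commonly cited limits.
    def passes_rule_of_five(mw, logp, h_donors, h_acceptors, max_violations=1):
        violations = sum([mw > 500, logp > 5, h_donors > 5, h_acceptors > 10])
        return violations <= max_violations

    # Caffeine-like property values (approximate, for illustration only)
    print(passes_rule_of_five(mw=194.2, logp=-0.07, h_donors=0, h_acceptors=6))
    ```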

  2. Impact of OSHA final rule--recording hearing loss: an analysis of an industrial audiometric dataset.

    PubMed

    Rabinowitz, Peter M; Slade, Martin; Dixon-Ernst, Christine; Sircar, Kanta; Cullen, Mark

    2003-12-01

    The 2003 Occupational Safety and Health Administration (OSHA) Occupational Injury and Illness Recording and Reporting Final Rule changed the definition of recordable work-related hearing loss. We performed a study of the Alcoa Inc. audiometric database to evaluate the impact of this new rule. The 2003 rule increased the rate of potentially recordable hearing loss events from 0.2% to 1.6% per year. A total of 68.6% of potentially recordable cases had American Academy of Audiology/American Medical Association (AAO/AMA) hearing impairment at the time of recordability. On average, recordable loss occurred after onset of impairment, whereas the non-age-corrected 10-dB standard threshold shift (STS) usually preceded impairment. The OSHA Final Rule will significantly increase recordable cases of occupational hearing loss. The new case definition is usually accompanied by AAO/AMA hearing impairment. Other, more sensitive metrics should therefore be used for early detection and prevention of hearing loss.
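
    For readers unfamiliar with the STS metric referenced above: OSHA defines a standard threshold shift as an average change of 10 dB or more at 2000, 3000, and 4000 Hz relative to the baseline audiogram. A sketch of that test follows (the age-correction step is omitted for brevity).

    ```python
    # OSHA standard threshold shift (STS) check at 2, 3 and 4 kHz.
    STS_FREQS = (2000, 3000, 4000)

    def is_sts(baseline_db, current_db):
        shift = sum(current_db[f] - baseline_db[f] for f in STS_FREQS) / len(STS_FREQS)
        return shift >= 10.0

    baseline = {2000: 10, 3000: 15, 4000: 20}
    current = {2000: 20, 3000: 30, 4000: 30}
    print(is_sts(baseline, current))  # True: mean shift is about 11.7 dB
    ```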

  3. Body size phenology in a regional bee fauna: a temporal extension of Bergmann's rule.

    PubMed

    Osorio-Canadas, Sergio; Arnan, Xavier; Rodrigo, Anselm; Torné-Noguera, Anna; Molowny, Roberto; Bosch, Jordi

    2016-12-01

    Bergmann's rule originally described a positive relationship between body size and latitude in warm-blooded animals. Larger animals, with a smaller surface/volume ratio, are better enabled to conserve heat in cooler climates (thermoregulatory hypothesis). Studies on endothermic vertebrates have provided support for Bergmann's rule, whereas studies on ectotherms have yielded conflicting results. If the thermoregulatory hypothesis is correct, negative relationships between body size and temperature should occur in temporal in addition to geographical gradients. To explore this possibility, we analysed seasonal activity patterns in a bee fauna comprising 245 species. In agreement with our hypothesis of a different relationship for large (endothermic) and small (ectothermic) species, we found that species larger than 27.81 mg (dry weight) followed Bergmann's rule, whereas species below this threshold did not. Our results represent a temporal extension of Bergmann's rule and indicate that body size and thermal physiology play an important role in structuring community phenology. © 2016 John Wiley & Sons Ltd/CNRS.

  4. Diagnostic Accuracy of History, Physical Examination, Laboratory Tests, and Point-of-care Ultrasound for Pediatric Acute Appendicitis in the Emergency Department: A Systematic Review and Meta-analysis.

    PubMed

    Benabbas, Roshanak; Hanna, Mark; Shah, Jay; Sinert, Richard

    2017-05-01

    Acute appendicitis (AA) is the most common surgical emergency in children. Accurate and timely diagnosis is crucial but challenging due to atypical presentations and the inherent difficulty of obtaining a reliable history and physical examination in younger children. The aim of this study was to determine the utility of history, physical examination, laboratory tests, the Pediatric Appendicitis Score (PAS), and Emergency Department Point-of-Care Ultrasound (ED-POCUS) in the diagnosis of AA in ED pediatric patients. We performed a systematic review and meta-analysis and used a test-treatment threshold model to identify diagnostic findings that could rule in/out AA and obviate the need for further imaging studies, specifically computed tomography (CT) scan, magnetic resonance imaging (MRI), and radiology department ultrasound (RUS). We searched PubMed, EMBASE, and SCOPUS up to October 2016 for studies on ED pediatric patients with abdominal pain. The Quality Assessment Tool for Diagnostic Accuracy Studies (QUADAS-2) was used to evaluate the quality and applicability of included studies. Positive and negative likelihood ratios (LR+ and LR-) for diagnostic modalities were calculated and, when appropriate, data were pooled using Meta-DiSc. Based on the available literature on the test characteristics of different imaging modalities and applying the Pauker-Kassirer method, we developed a test-treatment threshold model. Twenty-one studies were included, encompassing 8,605 patients with a weighted AA prevalence of 39.2%. Studies had variable quality using the QUADAS-2 tool, with most studies at high risk of partial verification bias. We divided studies based on their inclusion criteria into two groups of "undifferentiated abdominal pain" and abdominal pain "suspected of AA." In patients with undifferentiated abdominal pain, history of "pain migration to right lower quadrant (RLQ)" (LR+ = 4.81, 95% confidence interval [CI] = 3.59-6.44) and presence of "cough/hop pain" in the physical examination (LR+ = 7.64, 95% CI = 5.94-9.83) were most strongly associated with AA. In patients suspected of AA, none of the history or laboratory findings were strongly associated with AA. Rovsing's sign was the physical examination finding most strongly associated with AA (LR+ = 3.52, 95% CI = 2.65-4.68). Among different PAS cutoff points, PAS ≥ 9 (LR+ = 5.26, 95% CI = 3.34-8.29) was most associated with AA. None of the history, physical examination, or laboratory test findings, or PAS alone could rule in or rule out AA in patients with undifferentiated abdominal pain or those suspected of AA. ED-POCUS had an LR+ of 9.24 (95% CI = 6.24-13.28) and LR- of 0.17 (95% CI = 0.09-0.30). Using our test-treatment threshold model, positive ED-POCUS could rule in AA without the use of CT and MRI, but negative ED-POCUS could not rule out AA. Presence of AA is more likely in patients with undifferentiated abdominal pain migrating to the RLQ or when cough/hop pain is present in the physical examination. Once AA is suspected, no single history, physical examination, or laboratory finding, or score attained on the PAS can eliminate the need for imaging studies. Operating characteristics of ED-POCUS are similar to those reported for RUS in the literature for the diagnosis of AA. In ED patients suspected of AA, a positive ED-POCUS is diagnostic and obviates the need for CT or MRI, while a negative ED-POCUS is not enough to rule out AA. © 2017 by the Society for Academic Emergency Medicine.
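
    The likelihood-ratio arithmetic behind these statements is worth making explicit: pretest probability is converted to odds, multiplied by the LR, and converted back. Using the review's own numbers (prevalence 39.2%, ED-POCUS LR+ 9.24 and LR- 0.17):

    ```python
    # Posttest probability from pretest probability and a likelihood ratio.
    def posttest_probability(pretest_p, lr):
        odds = pretest_p / (1 - pretest_p)
        post_odds = odds * lr
        return post_odds / (1 + post_odds)

    print(posttest_probability(0.392, 9.24))  # ~0.86 after a positive ED-POCUS
    print(posttest_probability(0.392, 0.17))  # ~0.10 after a negative ED-POCUS
    ```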

  5. Assessment of the 2016 National Institute for Health and Care Excellence high-sensitivity troponin rule-out strategy

    PubMed Central

    Greenslade, Jaimi; Cullen, Louise; Than, Martin; Kendall, Jason; Body, Richard; Parsonage, William A; Khattab, Ahmed

    2018-01-01

    Objective We aimed to evaluate the limit of detection of high-sensitivity troponin (hs-cTn) and Thrombolysis In Myocardial Infarction (TIMI) score combination rule-out strategy suggested within the 2016 National Institute for Health and Care Excellence (NICE) Chest Pain of Recent Onset guidelines and establish the optimal TIMI score threshold for clinical use. Methods A pooled analysis of adult patients presenting to the emergency department with chest pain and a non-ischaemic ECG, recruited into six prospective studies, from Australia, New Zealand and the UK. We evaluated the sensitivity of TIMI score thresholds from 0 to 2 alongside hs-cTnT or hs-cTnI for the primary outcome of major adverse cardiac events within 30 days. Results Data were available for 3159 patients for hs-cTnT and 4532 for hs-cTnI; of these, 376 (11.9%) and 445 (9.8%) had major adverse cardiac events, respectively. Using a TIMI score of 0, the sensitivity for the primary outcome was 99.5% (95% CI 98.1% to 99.9%) alongside hs-cTnT and 98.9% (95% CI 97.4% to 99.6%) alongside hs-cTnI, identifying 17.9% and 21.0% of patients as low risk, respectively. For a TIMI score ≤1, sensitivity was 98.9% (95% CI 97.3% to 99.7%) alongside hs-cTnT and 98.4% (95% CI 96.8% to 99.4%) alongside hs-cTnI, identifying 28.1% and 35.7% as low risk, respectively. For TIMI ≤2, meta-sensitivity was <98% with either assay. Conclusions Our findings support the rule-out strategy suggested by NICE. The TIMI score threshold suggested for clinical use is 0. The proportion of patients identified as low risk (18%–21%) and suitable for early discharge using this threshold may be sufficient to encourage change of practice. Trial registration numbers ADAPT observational study/IMPACT intervention trial ACTRN12611001069943. ADAPT-ADP randomised controlled trial ACTRN12610000766011. EDACS-ADP randomised controlled trial ACTRN12613000745741. TRUST observational study ISRCTN no. 21109279. PMID:28864718

  6. 76 FR 66882 - Approval and Promulgation of State Implementation Plans; Missouri: Prevention of Significant...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-28

    ... Promulgation of State Implementation Plans; Missouri: Prevention of Significant Deterioration; Greenhouse Gas...) relating to regulation of Greenhouse Gases (GHGs) under Missouri's Prevention of Significant Deterioration... the GHG emission thresholds established in EPA's ``PSD and Title V Greenhouse Gas Tailoring Final Rule...

  7. 77 FR 51697 - Telemarketing Sales Rule Fees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-27

    ... percent threshold, the fees will change for fiscal year 2013. Second, to determine how much the fees... will not establish or alter any record keeping, reporting, or third-party disclosure requirements... Registry may not participate in any arrangement to share the cost of accessing the registry, including any...

  8. 78 FR 53642 - Telemarketing Sales Rule Fees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-30

    ... percent threshold, the fees will change for fiscal year 2014. Second, to determine how much the fees... Amended TSR and will not establish or alter any record keeping, reporting, or third-party disclosure... not participate in any arrangement to share the cost of accessing the registry, including any...

  9. A Visual Detection Learning Model

    NASA Technical Reports Server (NTRS)

    Beard, Bettina L.; Ahumada, Albert J., Jr.; Trejo, Leonard (Technical Monitor)

    1998-01-01

    Our learning model has memory templates representing the target-plus-noise and noise-alone stimulus sets. The best correlating template determines the response. The correlations and the feedback participate in the additive template updating rule. The model can predict the relative thresholds for detection in random, fixed and twin noise.
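
    A toy sketch of the template-matching-with-feedback idea (array sizes, learning rate, and labels are invented for illustration):

    ```python
    # Best-correlating template decides the response; feedback nudges the
    # correct template toward the stimulus (additive updating rule).
    import numpy as np

    rng = np.random.default_rng(0)
    templates = {"target+noise": rng.normal(size=64), "noise": rng.normal(size=64)}

    def respond_and_learn(stimulus, true_label, lr=0.05):
        corrs = {k: float(np.dot(v, stimulus)) / (np.linalg.norm(v) * np.linalg.norm(stimulus))
                 for k, v in templates.items()}
        response = max(corrs, key=corrs.get)    # best-correlating template wins
        templates[true_label] += lr * stimulus  # feedback-driven additive update
        return response

    print(respond_and_learn(rng.normal(size=64), "noise"))
    ```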

  10. Steroid treatment of posttraumatic anosmia.

    PubMed

    Jiang, Rong-San; Wu, Shang-Heng; Liang, Kai-Li; Shiao, Jiun-Yih; Hsin, Chung-Han; Su, Mao-Chang

    2010-10-01

    The objective of this study was to treat posttraumatic anosmia with oral steroids and evaluate the effect. One hundred sixteen posttraumatic patients whose olfactory thresholds were -1.0 by the phenyl ethyl alcohol threshold test were enrolled at our department. They were treated with a course of high-dose steroids and followed up for at least 3 months. During the latter period of this study, magnetic resonance imaging was performed to measure the volumes of the olfactory bulbs and to detect subfrontal lobe damage. Among them, 19 (16.4%) patients' olfactory thresholds improved after steroid treatment, but the other 97 patients' thresholds did not change. The incidences of loss of consciousness and intracranial hemorrhage after head injury, the rates of admission and craniotomy, the intervals between head injury and steroid treatment, the volumes of the olfactory bulbs, and the incidences of subfrontal lobe damage were not significantly different between patients whose thresholds improved and those whose thresholds did not. However, patients with olfactory improvement were significantly younger than those who remained unchanged. Our study showed that oral steroid treatment might improve olfactory acuity in some patients with posttraumatic anosmia, but the possibility of spontaneous recovery cannot be ruled out.

  11. Dissociative recombination of O2(+), NO(+) and N2(+)

    NASA Technical Reports Server (NTRS)

    Guberman, S. L.

    1983-01-01

    A new L(2) approach for the calculation of the threshold molecular capture width needed for the determination of DR cross sections was developed. The widths are calculated with Fermi's golden rule by substituting Rydberg orbitals for the free-electron continuum Coulomb orbital. It is shown that the calculated width converges exponentially as the effective principal quantum number of the Rydberg orbital increases. The threshold capture width is then easily obtained. Since atmospheric recombination involves very low-energy electrons, the threshold capture widths are essential to the calculation of DR cross sections for the atmospheric species studied here. The approach described makes use of bound-state computer codes already in use. A program that collects width matrix elements over CI wavefunctions for the initial and final states is described.

  12. Morphology-based three-dimensional segmentation of coronary artery tree from CTA scans

    NASA Astrophysics Data System (ADS)

    Banh, Diem Phuc T.; Kyprianou, Iacovos S.; Paquerault, Sophie; Myers, Kyle J.

    2007-03-01

    We developed an algorithm based on a rule-based threshold framework to segment the coronary arteries from computed tomography angiography (CTA) data. Computerized segmentation of the coronary arteries is a challenging procedure due to the presence of diverse anatomical structures surrounding the heart on cardiac CTA data. The proposed algorithm incorporates various levels of image processing and organ information, including region, connectivity, and morphology operations. It consists of three successive stages. The first stage involves the extraction of the three-dimensional scaffold of the heart envelope. This stage is semiautomatic, requiring a reader to review the CTA scans and manually select points along the heart envelope in slices. These points are further processed using a surface spline-fitting technique to automatically generate the heart envelope. The second stage consists of segmenting the left heart chambers and coronary arteries using grayscale threshold, size, and connectivity criteria. This is followed by applying morphology operations to further detach the left and right coronary arteries from the aorta. In the final stage, the 3D vessel tree is reconstructed and labeled using an Isolated Connected Threshold technique. The algorithm was developed and tested on a patient coronary artery CTA that was graciously shared by the Department of Radiology of the Massachusetts General Hospital. The test showed that our method consistently segmented the vessels above 79% of the maximum gray-level and automatically extracted 55 of the 58 coronary segments that can be seen on the CTA scan by a reader. These results are an encouraging step toward our objective of generating high-resolution models of the male and female heart that will subsequently be used as phantoms for medical imaging system optimization studies.
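
    A rough sketch of the threshold/connectivity/morphology stages using scikit-image on a 3-D volume; parameters and the stand-in data are illustrative, not the authors' implementation:

    ```python
    # Grayscale threshold -> morphology -> connected components, in 3-D.
    import numpy as np
    from skimage.measure import label
    from skimage.morphology import ball, binary_opening, remove_small_objects

    def segment_vessels(volume, gray_threshold, min_voxels=500):
        mask = volume >= gray_threshold        # grayscale threshold stage
        mask = binary_opening(mask, ball(1))   # detach thin spurious bridges
        mask = remove_small_objects(mask, min_voxels)
        return label(mask, connectivity=1)     # connected-component labeling

    volume = np.random.rand(64, 64, 64)        # stand-in for a CTA volume
    print(segment_vessels(volume, gray_threshold=0.99).max(), "components")
    ```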

  13. The best-interests standard as threshold, ideal, and standard of reasonableness.

    PubMed

    Kopelman, L M

    1997-06-01

    The best-interests standard is a widely used ethical, legal, and social basis for policy and decision-making involving children and other incompetent persons. It is under attack, however, as self-defeating, individualistic, unknowable, vague, dangerous, and open to abuse. The author defends this standard by identifying its employment, first, as a threshold for intervention and judgment (as in child abuse and neglect rulings), second, as an ideal to establish policies or prima facie duties, and, third, as a standard of reasonableness. Criticisms of the best-interests standard are reconsidered after clarifying these different meanings.

  14. A stimulus-dependent spike threshold is an optimal neural coder

    PubMed Central

    Jones, Douglas L.; Johnson, Erik C.; Ratnam, Rama

    2015-01-01

    A neural code based on sequences of spikes can consume a significant portion of the brain's energy budget. Thus, energy considerations would dictate that spiking activity be kept as low as possible. However, a high spike-rate improves the coding and representation of signals in spike trains, particularly in sensory systems. These are competing demands, and selective pressure has presumably worked to optimize coding by apportioning a minimum number of spikes so as to maximize coding fidelity. The mechanisms by which a neuron generates spikes while maintaining a fidelity criterion are not known. Here, we show that a signal-dependent neural threshold, similar to a dynamic or adapting threshold, optimizes the trade-off between spike generation (encoding) and fidelity (decoding). The threshold mimics a post-synaptic membrane (a low-pass filter) and serves as an internal decoder. Further, it sets the average firing rate (the energy constraint). The decoding process provides an internal copy of the coding error to the spike-generator which emits a spike when the error equals or exceeds a spike threshold. When optimized, the trade-off leads to a deterministic spike firing-rule that generates optimally timed spikes so as to maximize fidelity. The optimal coder is derived in closed-form in the limit of high spike-rates, when the signal can be approximated as a piece-wise constant signal. The predicted spike-times are close to those obtained experimentally in the primary electrosensory afferent neurons of weakly electric fish (Apteronotus leptorhynchus) and pyramidal neurons from the somatosensory cortex of the rat. We suggest that KCNQ/Kv7 channels (underlying the M-current) are good candidates for the decoder. They are widely coupled to metabolic processes and do not inactivate. We conclude that the neural threshold is optimized to generate an energy-efficient and high-fidelity neural code. PMID:26082710
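
    The encode/decode loop described here can be sketched in a few lines: the neuron maintains a leaky (low-pass) internal copy of its output and fires whenever the coding error reaches the threshold. All parameters below are illustrative.

    ```python
    # Error-threshold spike coder with an internal low-pass decoder.
    import numpy as np

    def threshold_coder(signal, dt=1e-3, tau=0.02, theta=0.05):
        decoded, spike_times = 0.0, []
        for i, s in enumerate(signal):
            decoded *= 1 - dt / tau      # post-synaptic-membrane-like leak
            if s - decoded >= theta:     # coding error reaches the threshold
                spike_times.append(i * dt)
                decoded += theta         # internal copy of the emitted spike
        return spike_times

    t = np.arange(0, 1, 1e-3)
    print(len(threshold_coder(0.5 + 0.4 * np.sin(2 * np.pi * 2 * t))), "spikes")
    ```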

  15. A simple rule for the evolution of contingent cooperation in large groups

    PubMed Central

    Schonmann, Roberto H.; Boyd, Robert

    2016-01-01

    Humans cooperate in large groups of unrelated individuals, and many authors have argued that such cooperation is sustained by contingent reward and punishment. However, such sanctioning systems can also stabilize a wide range of behaviours, including mutually deleterious behaviours. Moreover, it is very likely that large-scale cooperation is derived in the human lineage. Thus, understanding the evolution of mutually beneficial cooperative behaviour requires knowledge of when strategies that support such behaviour can increase when rare. Here, we derive a simple formula that gives the relatedness necessary for contingent cooperation in n-person iterated games to increase when rare. This rule applies to a wide range of pay-off functions and assumes that the strategies supporting cooperation are based on the presence of a threshold fraction of cooperators. This rule suggests that modest levels of relatedness are sufficient for invasion by strategies that make cooperation contingent on previous cooperation by a small fraction of group members. In contrast, only high levels of relatedness allow the invasion by strategies that require near universal cooperation. In order to derive this formula, we introduce a novel methodology for studying evolution in group structured populations including local and global group-size regulation and fluctuations in group size. PMID:26729938

  16. EPA’s Stage 2 Disinfection Byproducts Rules (DBPR) and Northern Kentucky Water: An Economic and Scientific Review

    PubMed Central

    Henry, Hugh

    2013-01-01

    Implementation of EPA’s Stage 2 Disinfection Byproducts Rules (DBPR) in Northern Kentucky will cause a water rate increase of over 25%. Hence a review was undertaken, considering both economics and science in the context of President Obama’s 2009 scientific integrity directive. The rules purport to avoid up to 0.49% of new bladder cancers by reducing the levels of DBPs in drinking water – a benefit so small that failure to implement will not cause unreasonable risk to health (URTH). It suggests at most one Northern Kentucky death avoided over 17 years for a cost of $136,000,000 ($1700 per household). Even this small benefit is probably overstated. EPA finds no “causal link” between DBPs and bladder cancer, and EPA acknowledges problems with the epidemiological data used in their calculation: the data appear contradictory and inconsistent, may be skewed toward “positive” results, and suggest different cancer sites than animal studies. Two similar international agencies disagree with EPA’s conclusions. The science is based on the Linear No Threshold (LNT) dose response model for DBPs, but this may not be the correct model. 83% of EPA’s epidemiological data show a statistical possibility that low levels of DBPs might be beneficial or have no effect. PMID:24298228

  17. EPA's Stage 2 Disinfection Byproducts Rules (DBPR) and Northern Kentucky Water: An Economic and Scientific Review.

    PubMed

    Henry, Hugh

    2013-01-01

    Implementation of EPA's Stage 2 Disinfection Byproducts Rules (DBPR) in Northern Kentucky will cause a water rate increase of over 25%. Hence a review was undertaken, considering both economics and science in the context of President Obama's 2009 scientific integrity directive. The rules purport to avoid up to 0.49% of new bladder cancers by reducing the levels of DBPs in drinking water - a benefit so small that failure to implement will not cause unreasonable risk to health (URTH). It suggests at most one Northern Kentucky death avoided over 17 years for a cost of $136,000,000 ($1700 per household). Even this small benefit is probably overstated. EPA finds no "causal link" between DBPs and bladder cancer, and EPA acknowledges problems with the epidemiological data used in their calculation: the data appear contradictory and inconsistent, may be skewed toward "positive" results, and suggest different cancer sites than animal studies. Two similar international agencies disagree with EPA's conclusions. The science is based on the Linear No Threshold (LNT) dose response model for DBPs, but this may not be the correct model. 83% of EPA's epidemiological data show a statistical possibility that low levels of DBPs might be beneficial or have no effect.

  18. Validation of an association rule mining-based method to infer associations between medications and problems.

    PubMed

    Wright, A; McCoy, A; Henkin, S; Flaherty, M; Sittig, D

    2013-01-01

    In a prior study, we developed methods for automatically identifying associations between medications and problems using association rule mining on a large clinical data warehouse and validated these methods at a single site, which used a self-developed electronic health record. To demonstrate the generalizability of these methods, we validated them at an external site. We received data on medications and problems for 263,597 patients from the University of Texas Health Science Center at Houston Faculty Practice, an ambulatory practice that uses the Allscripts Enterprise commercial electronic health record product. We then conducted association rule mining to identify associated pairs of medications and problems and characterized these associations with five measures of interestingness: support, confidence, chi-square, interest, and conviction, and compared the top-ranked pairs to a gold standard. A total of 25,088 medication-problem pairs were identified that exceeded our confidence and support thresholds. An analysis of the top 500 pairs according to each measure of interestingness showed a high degree of accuracy for highly ranked pairs. The same technique was successfully employed at the University of Texas, and accuracy was comparable to our previous results. Top associations included many medications that are highly specific for a particular problem, as well as a large number of common, accurate medication-problem pairs that reflect practice patterns.
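
    The interestingness measures named above reduce to simple ratios over co-occurrence counts. A sketch for one medication-to-problem pair (the counts are invented; chi-square is omitted for brevity):

    ```python
    # Support, confidence, interest (lift) and conviction from plain counts.
    def rule_measures(n_both, n_med, n_prob, n_total):
        support = n_both / n_total
        confidence = n_both / n_med
        p_prob = n_prob / n_total
        interest = confidence / p_prob  # a.k.a. lift
        conviction = (1 - p_prob) / (1 - confidence) if confidence < 1 else float("inf")
        return support, confidence, interest, conviction

    # e.g. metformin -> diabetes in a toy cohort of 100,000 patients
    print(rule_measures(n_both=900, n_med=1000, n_prob=5000, n_total=100000))
    ```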

  19. 75 FR 18607 - Mandatory Reporting of Greenhouse Gases: Petroleum and Natural Gas Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-12

    ... any of the following methods: Federal eRulemaking Portal: http://www.regulations.gov . Follow the... the Source Category D. Selection of Reporting Threshold E. Selection of Proposed Monitoring Methods F... rule and the monitoring methods proposed. This section then provides a brief summary of, and rationale...

  20. 76 FR 14570 - Federal Acquisition Regulation; Trade Agreements Thresholds

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-16

    ... application of the World Trade Organization Government Procurement Agreement and the Free Trade Agreements, as... Parts 22, 25, and 52 Government procurement. Dated: March 4, 2011. Millisa Gary, Acting Director, Office... (NASA). ACTION: Final rule. SUMMARY: DoD, GSA, and NASA have adopted as final, without change, an...

  1. 77 FR 11744 - Approval and Promulgation of Implementation Plans; Tennessee: Prevention of Significant...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-28

    ... New Source Review (NSR) Prevention of Significant Deterioration (PSD) program. Specifically, the SIP... modification projects become subject to Tennessee's PSD permitting requirements for GHG emissions. This rule... thresholds in the Tennessee SIP for GHG PSD requirements. EPA is approving Tennessee's January 11, 2012, SIP...

  2. 76 FR 59334 - Approval and Promulgation of Air Quality Implementation Plans; New Mexico; Albuquerque/Bernalillo...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-26

    ... Prevention of Significant Deterioration (PSD) program to establish appropriate emission thresholds for... County's PSD permitting requirements for their greenhouse gas (GHG) emissions. Due to the SIP Narrowing Rule, 75 FR 82536, starting on January 2, 2011, the approved Albuquerque/Bernalillo County SIP's PSD...

  3. PROPORTIONAL REBATE, RANDOM FULL-REBATES, AND WINNER-TAKE-ALL RULES IN PROVIDING A THRESHOLD PUBLIC GOOD. (R825307)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  4. 77 FR 54382 - Revisions of Five California Clean Air Act Title V Operating Permits Programs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-05

    ... Five California Clean Air Act Title V Operating Permits Programs AGENCY: Environmental Protection... Permits (Title V) programs of the Monterey Bay Unified Air Pollution Control District (MBUAPCD), San Luis... thresholds in EPA's Tailoring Rule, which have not been previously subject...

  5. 78 FR 17612 - Rules Relating to Additional Medicare Tax; Hearing Cancellation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-22

    ...This document cancels a public hearing on proposed regulations under sections 3101(b), 3102, 3202(a), 1401(b), 6205, and 6402 of the Internal Revenue Code; relating to the Additional Hospital Insurance Tax on income above threshold amounts as added by the Affordable Care Act.

  6. Evidence-based Diagnostics: Adult Septic Arthritis

    PubMed Central

    Carpenter, Christopher R.; Schuur, Jeremiah D.; Everett, Worth W.; Pines, Jesse M.

    2011-01-01

    Background Acutely swollen or painful joints are common complaints in the emergency department (ED). Septic arthritis in adults is a challenging diagnosis, but prompt differentiation of a bacterial etiology is crucial to minimize morbidity and mortality. Objectives The objective was to perform a systematic review describing the diagnostic characteristics of history, physical examination, and bedside laboratory tests for nongonococcal septic arthritis. A secondary objective was to quantify test and treatment thresholds using derived estimates of sensitivity and specificity, as well as best-evidence diagnostic and treatment risks and anticipated benefits from appropriate therapy. Methods Two electronic search engines (PUBMED and EMBASE) were used in conjunction with a selected bibliography and scientific abstract hand search. Inclusion criteria included adult trials of patients presenting with monoarticular complaints if they reported sufficient detail to reconstruct partial or complete 2 × 2 contingency tables for experimental diagnostic test characteristics using an acceptable criterion standard. Evidence was rated by two investigators using the Quality Assessment Tool for Diagnostic Accuracy Studies (QUADAS). When more than one similarly designed trial existed for a diagnostic test, meta-analysis was conducted using a random effects model. Interval likelihood ratios (LRs) were computed when possible. To illustrate one method to quantify theoretical points in the probability of disease whereby clinicians might cease testing altogether and either withhold treatment (test threshold) or initiate definitive therapy in lieu of further diagnostics (treatment threshold), an interactive spreadsheet was designed and sample calculations were provided based on research estimates of diagnostic accuracy, diagnostic risk, and therapeutic risk/benefits. Results The prevalence of nongonococcal septic arthritis in ED patients with a single acutely painful joint is approximately 27% (95% confidence interval [CI] = 17% to 38%). With the exception of joint surgery (positive likelihood ratio [+LR] = 6.9) or skin infection overlying a prosthetic joint (+LR = 15.0), history, physical examination, and serum tests do not significantly alter posttest probability. Serum inflammatory markers such as white blood cell (WBC) counts, erythrocyte sedimentation rate (ESR), and C-reactive protein (CRP) are not useful acutely. The interval LR for synovial white blood cell (sWBC) counts of 0 × 10⁹–25 × 10⁹/L was 0.33; for 25 × 10⁹–50 × 10⁹/L, 1.06; for 50 × 10⁹–100 × 10⁹/L, 3.59; and exceeding 100 × 10⁹/L, infinity. Synovial lactate may be useful to rule in or rule out the diagnosis of septic arthritis with a +LR ranging from 2.4 to infinity, and negative likelihood ratio (−LR) ranging from 0 to 0.46. Rapid polymerase chain reaction (PCR) of synovial fluid may identify the causative organism within 3 hours. Based on 56% sensitivity and 90% specificity for sWBC counts of >50 × 10⁹/L in conjunction with best-evidence estimates for diagnosis-related risk and treatment-related risk/benefit, the arthrocentesis test threshold is 5%, with a treatment threshold of 39%. Conclusions Recent joint surgery or cellulitis overlying a prosthetic hip or knee were the only findings on history or physical examination that significantly alter the probability of nongonococcal septic arthritis. Extreme values of sWBC (>50 × 10⁹/L) can increase, but not decrease, the probability of septic arthritis. Future ED-based diagnostic trials are needed to evaluate the role of clinical gestalt and the efficacy of nontraditional synovial markers such as lactate. PMID:21843213

  7. Evidence-based diagnostics: adult septic arthritis.

    PubMed

    Carpenter, Christopher R; Schuur, Jeremiah D; Everett, Worth W; Pines, Jesse M

    2011-08-01

    Acutely swollen or painful joints are common complaints in the emergency department (ED). Septic arthritis in adults is a challenging diagnosis, but prompt differentiation of a bacterial etiology is crucial to minimize morbidity and mortality. The objective was to perform a systematic review describing the diagnostic characteristics of history, physical examination, and bedside laboratory tests for nongonococcal septic arthritis. A secondary objective was to quantify test and treatment thresholds using derived estimates of sensitivity and specificity, as well as best-evidence diagnostic and treatment risks and anticipated benefits from appropriate therapy. Two electronic search engines (PUBMED and EMBASE) were used in conjunction with a selected bibliography and scientific abstract hand search. Inclusion criteria included adult trials of patients presenting with monoarticular complaints if they reported sufficient detail to reconstruct partial or complete 2 × 2 contingency tables for experimental diagnostic test characteristics using an acceptable criterion standard. Evidence was rated by two investigators using the Quality Assessment Tool for Diagnostic Accuracy Studies (QUADAS). When more than one similarly designed trial existed for a diagnostic test, meta-analysis was conducted using a random effects model. Interval likelihood ratios (LRs) were computed when possible. To illustrate one method to quantify theoretical points in the probability of disease whereby clinicians might cease testing altogether and either withhold treatment (test threshold) or initiate definitive therapy in lieu of further diagnostics (treatment threshold), an interactive spreadsheet was designed and sample calculations were provided based on research estimates of diagnostic accuracy, diagnostic risk, and therapeutic risk/benefits. The prevalence of nongonococcal septic arthritis in ED patients with a single acutely painful joint is approximately 27% (95% confidence interval [CI] = 17% to 38%). With the exception of joint surgery (positive likelihood ratio [+LR] = 6.9) or skin infection overlying a prosthetic joint (+LR = 15.0), history, physical examination, and serum tests do not significantly alter posttest probability. Serum inflammatory markers such as white blood cell (WBC) counts, erythrocyte sedimentation rate (ESR), and C-reactive protein (CRP) are not useful acutely. The interval LR for synovial white blood cell (sWBC) counts of 0 × 10(9)-25 × 10(9)/L was 0.33; for 25 × 10(9)-50 × 10(9)/L, 1.06; for 50 × 10(9)-100 × 10(9)/L, 3.59; and exceeding 100 × 10(9)/L, infinity. Synovial lactate may be useful to rule in or rule out the diagnosis of septic arthritis with a +LR ranging from 2.4 to infinity, and negative likelihood ratio (-LR) ranging from 0 to 0.46. Rapid polymerase chain reaction (PCR) of synovial fluid may identify the causative organism within 3 hours. Based on 56% sensitivity and 90% specificity for sWBC counts of >50 × 10(9)/L in conjunction with best-evidence estimates for diagnosis-related risk and treatment-related risk/benefit, the arthrocentesis test threshold is 5%, with a treatment threshold of 39%. Recent joint surgery or cellulitis overlying a prosthetic hip or knee were the only findings on history or physical examination that significantly alter the probability of nongonococcal septic arthritis. Extreme values of sWBC (>50 × 10(9)/L) can increase, but not decrease, the probability of septic arthritis. Future ED-based diagnostic trials are needed to evaluate the role of clinical gestalt and the efficacy of nontraditional synovial markers such as lactate. © 2011 by the Society for Academic Emergency Medicine.

  8. Decision Tree Repository and Rule Set Based Mingjiang River Estuarine Wetlands Classification

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Li, X.; Xiao, W.

    2018-05-01

    The increasing urbanization and industrialization have led to wetland losses in the estuarine area of the Mingjiang River over the past three decades, and increasing attention has been given to producing wetland inventories using remote sensing and GIS technology. Because of inconsistent training sites and training samples, traditional pixel-based image classification methods cannot achieve comparable results across different organizations, whereas object-oriented image classification techniques show great potential to solve this problem, and Landsat moderate-resolution remote sensing images are widely used to fulfill this requirement. First, standardized atmospheric correction and spectrally high-fidelity texture feature enhancement were conducted before implementing the object-oriented wetland classification method in eCognition. Second, we performed the multi-scale segmentation procedure, taking the scale, hue, shape, compactness, and smoothness of the image into account to obtain appropriate parameters; using a region-merging algorithm that proceeds upward from the single-pixel level, the optimal texture segmentation scale for each feature type was confirmed. The segmented objects were then used as classification units to calculate spectral information such as the mean, maximum, minimum, brightness, and normalized values; spatial features such as the area, length, tightness, and shape rule of each image object; and texture features such as the mean, variance, and entropy of image objects, all serving as classification features of the training samples. Based on reference images and field-survey sampling points, typical training samples were selected uniformly and randomly for each class of ground object, and the value ranges of the spectral, texture, and spatial characteristics of each class in each feature layer were used to create the decision tree repository. Finally, with the help of high-resolution reference images, a random sampling method was used for the field investigation, achieving an overall accuracy of 90.31% with a Kappa coefficient of 0.88. The classification method based on decision tree threshold values and the rule set developed from the repository outperforms the results obtained from the traditional methodology. Our decision-tree-repository and rule-set-based object-oriented classification technique is an effective method for producing comparable and consistent wetland data sets.
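
    In the same spirit, a decision tree repository boils down to ordered threshold rules over per-object features; a toy sketch follows (class names, features, and thresholds are invented for illustration):

    ```python
    # First-match rule-set classifier over per-object feature dictionaries.
    RULES = [
        ("open water", lambda f: f["ndvi"] < 0.0 and f["brightness"] < 60),
        ("mudflat",    lambda f: 0.0 <= f["ndvi"] < 0.2 and f["entropy"] < 2.5),
        ("marsh",      lambda f: f["ndvi"] >= 0.2 and f["area_px"] > 50),
    ]

    def classify_object(features, rules=RULES, default="unclassified"):
        for label, predicate in rules:  # first matching threshold rule wins
            if predicate(features):
                return label
        return default

    print(classify_object({"ndvi": 0.35, "brightness": 90, "entropy": 3.1,
                           "area_px": 120}))  # "marsh"
    ```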

  9. Convergence between DSM-IV-TR and DSM-5 diagnostic models for personality disorder: evaluation of strategies for establishing diagnostic thresholds.

    PubMed

    Morey, Leslie C; Skodol, Andrew E

    2013-05-01

    The Personality and Personality Disorders Work Group for the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) recommended substantial revisions to the personality disorders (PDs) section of DSM-IV-TR, proposing a hybrid categorical-dimensional model that represented PDs as combinations of core personality dysfunctions and various configurations of maladaptive personality traits. Although the DSM-5 Task Force endorsed the proposal, the Board of Trustees of the American Psychiatric Association (APA) did not, placing the Work Group's model in DSM-5 Section III ("Emerging Measures and Models") with other concepts thought to be in need of additional research. This paper documents the impact of using this alternative model in a national sample of 337 patients as described by clinicians familiar with their cases. In particular, the analyses focus on alternative strategies considered by the Work Group for deriving decision rules, or diagnostic thresholds, with which to assign categorical diagnoses. Results demonstrate that diagnostic rules could be derived that yielded appreciable correspondence between DSM-IV-TR and proposed DSM-5 PD diagnoses, a correspondence greater than that observed in the transition between DSM-III and DSM-III-R PDs. The approach also represents the most comprehensive attempt to date to provide conceptual and empirical justification for diagnostic thresholds utilized within the DSM PDs.

  10. Local-duality QCD sum rules for strong isospin breaking in the decay constants of heavy-light mesons.

    PubMed

    Lucha, Wolfgang; Melikhov, Dmitri; Simula, Silvano

    2018-01-01

    We discuss the leptonic decay constants of heavy-light mesons by means of Borel QCD sum rules in the local-duality (LD) limit of infinitely large Borel mass parameter. In this limit, for an appropriate choice of the invariant structures in the QCD correlation functions, all vacuum-condensate contributions vanish and all nonperturbative effects are contained in only one quantity, the effective threshold. We study properties of the LD effective thresholds in the limits of large heavy-quark mass [Formula: see text] and small light-quark mass [Formula: see text]. In the heavy-quark limit, we clarify the role played by the radiative corrections in the effective threshold for reproducing the pQCD expansion of the decay constants of pseudoscalar and vector mesons. We show that the dependence of the meson decay constants on [Formula: see text] arises predominantly (at the level of 70-80%) from the calculable [Formula: see text]-dependence of the perturbative spectral densities. Making use of the lattice QCD results for the decay constants of nonstrange and strange pseudoscalar and vector heavy mesons, we obtain solid predictions for the decay constants of heavy-light mesons as functions of [Formula: see text] in the range from a few to 100 MeV and evaluate the corresponding strong isospin-breaking effects: [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text].

  11. A forecast-based STDP rule suitable for neuromorphic implementation.

    PubMed

    Davies, S; Galluppi, F; Rast, A D; Furber, S B

    2012-08-01

    Artificial neural networks increasingly involve spiking dynamics to permit greater computational efficiency. This becomes especially attractive for on-chip implementation using dedicated neuromorphic hardware. However, both spiking neural networks and neuromorphic hardware have historically found difficulties in implementing efficient, effective learning rules. The best-known spiking neural network learning paradigm is Spike Timing Dependent Plasticity (STDP) which adjusts the strength of a connection in response to the time difference between the pre- and post-synaptic spikes. Approaches that relate learning features to the membrane potential of the post-synaptic neuron have emerged as possible alternatives to the more common STDP rule, with various implementations and approximations. Here we use a new type of neuromorphic hardware, SpiNNaker, which represents the flexible "neuromimetic" architecture, to demonstrate a new approach to this problem. Based on the standard STDP algorithm with modifications and approximations, a new rule, called STDP TTS (Time-To-Spike) relates the membrane potential with the Long Term Potentiation (LTP) part of the basic STDP rule. Meanwhile, we use the standard STDP rule for the Long Term Depression (LTD) part of the algorithm. We show that on the basis of the membrane potential it is possible to make a statistical prediction of the time needed by the neuron to reach the threshold, and therefore the LTP part of the STDP algorithm can be triggered when the neuron receives a spike. In our system these approximations allow efficient memory access, reducing the overall computational time and the memory bandwidth required. The improvements here presented are significant for real-time applications such as the ones for which the SpiNNaker system has been designed. We present simulation results that show the efficacy of this algorithm using one or more input patterns repeated over the whole time of the simulation. On-chip results show that the STDP TTS algorithm allows the neural network to adapt and detect the incoming pattern with improvements both in the reliability of, and the time required for, consistent output. Through the approximations we suggest in this paper, we introduce a learning rule that is easy to implement both in event-driven simulators and in dedicated hardware, reducing computational complexity relative to the standard STDP rule. Such a rule offers a promising solution, complementary to standard STDP evaluation algorithms, for real-time learning using spiking neural networks in time-critical applications. Copyright © 2012 Elsevier Ltd. All rights reserved.
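
    A compressed sketch of the idea: LTD keeps the usual pair-based exponential form, while LTP fires at presynaptic spike arrival using a membrane-potential-based forecast of the time to the next postsynaptic spike. The constants and the linear forecast below are illustrative assumptions, not SpiNNaker code.

    ```python
    # STDP with a time-to-spike (TTS) forecast replacing the LTP wait.
    import math

    A_PLUS, A_MINUS, TAU_MS = 0.01, 0.012, 20.0

    def forecast_time_to_spike(v, v_thresh=-50.0, slope_ms_per_mv=2.0):
        """Statistical guess (ms) of time until threshold, from membrane voltage."""
        return max(0.0, (v_thresh - v) * slope_ms_per_mv)

    def stdp_tts_update(w, v_post, last_post_spike_ms, t_pre_ms):
        dt_pred = forecast_time_to_spike(v_post)  # LTP via the forecast
        w += A_PLUS * math.exp(-dt_pred / TAU_MS)
        if last_post_spike_ms is not None and t_pre_ms > last_post_spike_ms:
            # standard STDP LTD term from the most recent postsynaptic spike
            w -= A_MINUS * math.exp(-(t_pre_ms - last_post_spike_ms) / TAU_MS)
        return w

    print(stdp_tts_update(w=0.5, v_post=-55.0, last_post_spike_ms=90.0, t_pre_ms=100.0))
    ```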

  12. Loop Mirror Laser Neural Network with a Fast Liquid-Crystal Display

    NASA Astrophysics Data System (ADS)

    Mos, Evert C.; Schleipen, Jean J. H. B.; de Waardt, Huug; Khoe, Djan G. D.

    1999-07-01

    In our laser neural network (LNN), all-optical threshold action is obtained by applying controlled optical feedback to a laser diode. Here an extended experimental LNN is presented with as many as 32 neurons and 12 inputs. In the setup we use a fast liquid-crystal display to implement an optical matrix-vector multiplier. This display, based on ferroelectric liquid-crystal material, enables us to present 125 training examples per second to the LNN. To maximize the optical feedback efficiency of the setup, a loop mirror is introduced. We use a delta-rule learning algorithm to train the network to perform a number of functions oriented toward the application area of telecommunication data switching.
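
    The delta rule used for training is one line per weight: w ← w + η(target − output)x. A single-neuron sketch (names and values are illustrative):

    ```python
    # Delta-rule update for a single linear neuron.
    def delta_rule_step(w, x, target, eta=0.1):
        output = sum(wi * xi for wi, xi in zip(w, x))
        return [wi + eta * (target - output) * xi for wi, xi in zip(w, x)]

    w = [0.0, 0.0]
    for _ in range(50):
        w = delta_rule_step(w, x=[1.0, 0.5], target=1.0)
    print(w)  # converges toward weights whose output is ~1.0
    ```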

  13. How parents perceive screen viewing in their 5-6 year old child within the context of their own screen viewing time: a mixed-methods study.

    PubMed

    Thompson, Janice L; Sebire, Simon J; Kesten, Joanna M; Zahra, Jesmond; Edwards, Mark; Solomon-Moore, Emma; Jago, Russell

    2017-06-01

    Few studies have examined parental perceptions of their child's screen-viewing (SV) within the context of parental SV time. This study qualitatively examined parents' perceptions of their 5-6-year-old child's SV within the context of their own quantitatively measured SV. A mixed-methods design employed semi-structured telephone interviews, demographic and SV questionnaires, and objectively measured physical activity and sedentary time. Deductive content analysis was used to explore parents' perceptions of, and concerns about, their child's SV, and their management of their child's SV. Comparisons were made between parent-child dyads reporting low (<2-h per day) versus high SV time. Fifty-three parents were interviewed (94.3% mothers), with 52 interviews analysed. Fifteen parent-child dyads (28.8%) exceeded the 2-h SV threshold on both weekdays and weekend days; 5 parent-child dyads (9.6%) did not exceed this threshold. The remaining 32 dyads reported a combination of parent or child exceeding/not exceeding the SV threshold on either weekdays or weekend days. Three main themes distinguished the 15 parent-child dyads exceeding the SV threshold from the 5 dyads that did not: 1) parents' personal SV-related views and behaviours; 2) the family SV environment; and 3) setting SV rules and limits. Parents in the dyads not exceeding the SV threshold prioritized and engaged with their children in non-SV behaviours for relaxation, set limits around their own and their child's SV-related behaviours, and described an environment supportive of physical activity. Parents in the dyads exceeding the SV threshold were more likely to prioritise SV as a shared family activity, and described a less structured SV environment with minimal rule setting, influenced by their child's need for relaxation time. The majority of parents in this study who exceeded the SV threshold expressed minimal concern and a relaxed approach to managing SV for themselves and their child(ren), suggesting a need to raise awareness amongst these parents about the time they spend engaging in SV. Parents may understand their SV-related parenting practices more clearly if they are encouraged to examine their own SV behaviours. Interventions designed to create environments that are less supportive of SV, with more structured approaches to SV parenting strategies, are warranted.

  14. Minimum duration of actigraphy-defined nocturnal awakenings necessary for morning recall.

    PubMed

    Winser, Michael A; McBean, Amanda L; Montgomery-Downs, Hawley E

    2013-07-01

    Healthy adults awaken between each sleep cycle approximately 5 times each night but generally do not remember all of these awakenings in the morning. A rule of thumb has arisen in the sleep field that approximately 5 min of continuous wakefulness are required to form a memory for an awakening. However, few studies have examined memory for these sleep-wake transitions and none have done so in the home, while participants follow their normal routine. Self-report and actigraphy were used in the participant's home environment to determine the minimum duration of an awakening necessary for morning recall for each of the 39 healthy adults. Recall thresholds ranged from 30 to 600 s with a mean of 259 s (4 min 19 s) and were negatively associated with sleep efficiency but not significantly associated with total sleep time, age, income, or education. There also was a sex by cohabitation interaction, with single men having lower thresholds than single women and cohabiting participants, which was explained by higher sleep efficiency in noncohabitating men. Large individual differences suggest that many factors may influence recall threshold. Our preliminary study is the first to calculate the duration of wakefulness necessary for morning recall of nocturnal awakenings and the first to use a field-based design, allowing for the study of habitual sleep patterns at the participant's home. Further study is needed to explore if recall thresholds calculated using actigraphy can be validated against polysomnography (PSG) or be used to guide potential treatments. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Establishment of a standard operating procedure for predicting the time of calving in cattle

    PubMed Central

    Sauter-Louis, Carola; Braunert, Anna; Lange, Dorothee; Weber, Frank; Zerbe, Holm

    2011-01-01

    Precise calving monitoring is essential for minimizing the effects of dystocia in cows and calves. We conducted two studies in healthy cows that compared seven clinical signs (broad pelvic ligament relaxation, vaginal secretion, udder hyperplasia, udder edema, teat filling, tail relaxation, and vulva edema) alone and in combination in order to predict the time of parturition. The relaxation of the broad pelvic ligaments combined with teat filling gave the best values for predicting either calving or no calving within 12 h. For the proposed parturition score (PS), a threshold of 4 PS points was identified below which calving within the next 12 h could be ruled out with a probability of 99.3% in cows (95.5% in heifers). Above this threshold, intermittent calving monitoring every 3 h and a progesterone rapid blood test (PRBT) would be recommended. By combining the PS and PRBT (if PS ≥ 4), the prediction of calving within the next 12 h improved from 14.9% to 53.1%, and the probability of ruling out calving was 96.8%. The PRBT was compared to the results of an enzyme immunoassay (sensitivity, 90.2%; specificity, 74.9%). The standard operating procedure developed in this study that combines the PS and PRBT will enable veterinarians to rule out or predict calving within a 12 h period in cows with high accuracy under field conditions. PMID:21586878
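
    The resulting decision flow is simple enough to state directly; a sketch with the thresholds from the abstract (the PRBT interpretation step is an assumption for illustration):

    ```python
    # PS < 4 rules out calving within 12 h; PS >= 4 triggers 3-hourly checks + PRBT.
    from typing import Optional

    def calving_decision(ps_points: int,
                         prbt_low_progesterone: Optional[bool] = None) -> str:
        if ps_points < 4:
            return "calving within 12 h ruled out (99.3% probability in cows)"
        if prbt_low_progesterone is None:
            return "monitor every 3 h and run PRBT"
        return ("calving within 12 h likely" if prbt_low_progesterone
                else "calving within 12 h unlikely")

    print(calving_decision(2))
    print(calving_decision(5, prbt_low_progesterone=True))
    ```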

  16. Multiobjective hedging rules for flood water conservation

    NASA Astrophysics Data System (ADS)

    Ding, Wei; Zhang, Chi; Cai, Ximing; Li, Yu; Zhou, Huicheng

    2017-03-01

    Flood water conservation can be beneficial for water uses especially in areas with water stress but also can pose additional flood risk. The potential of flood water conservation is affected by many factors, especially decision makers' preference for water conservation and reservoir inflow forecast uncertainty. This paper discusses the individual and joint effects of these two factors on the trade-off between flood control and water conservation, using a multiobjective, two-stage reservoir optimal operation model. It is shown that hedging between current water conservation and future flood control exists only when forecast uncertainty or decision makers' preference is within a certain range, beyond which, hedging is trivial and the multiobjective optimization problem is reduced to a single objective problem with either flood control or water conservation. Different types of hedging rules are identified with different levels of flood water conservation preference, forecast uncertainties, acceptable flood risk, and reservoir storage capacity. Critical values of decision preference (represented by a weight) and inflow forecast uncertainty (represented by standard deviation) are identified. These inform reservoir managers with a feasible range of their preference to water conservation and thresholds of forecast uncertainty, specifying possible water conservation within the thresholds. The analysis also provides inputs for setting up an optimization model by providing the range of objective weights and the choice of hedging rule types. A case study is conducted to illustrate the concepts and analyses.

  17. Computerized detection of unruptured aneurysms in MRA images: reduction of false positives using anatomical location features

    NASA Astrophysics Data System (ADS)

    Uchiyama, Yoshikazu; Gao, Xin; Hara, Takeshi; Fujita, Hiroshi; Ando, Hiromichi; Yamakawa, Hiroyasu; Asano, Takahiko; Kato, Hiroki; Iwama, Toru; Kanematsu, Masayuki; Hoshi, Hiroaki

    2008-03-01

    The detection of unruptured aneurysms is a major subject in magnetic resonance angiography (MRA). However, their accurate detection is often difficult because of the overlap between the aneurysm and adjacent vessels on maximum intensity projection images. The purpose of this study is to develop a computerized method for the detection of unruptured aneurysms in order to assist radiologists in image interpretation. The vessel regions were first segmented using gray-level thresholding and a region growing technique. The gradient concentration (GC) filter was then employed for the enhancement of the aneurysms. The initial candidates were identified in the GC image using a gray-level threshold. For the elimination of false positives (FPs), we determined shape features and an anatomical location feature. Finally, rule-based schemes and quadratic discriminant analysis were employed along with these features for distinguishing between the aneurysms and the FPs. The sensitivity for the detection of unruptured aneurysms was 90.0% with 1.52 FPs per patient. Our computerized scheme can be useful in assisting radiologists in the detection of unruptured aneurysms in MRA images.
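
    The first two stages of this pipeline, gray-level thresholding followed by a region-growing step, can be sketched with scipy. The threshold value is a hypothetical placeholder, the connected-component step is a simple proxy for region growing, and the GC filter and classifiers are not reproduced here.

        import numpy as np
        from scipy import ndimage

        def segment_vessels(volume, threshold=300.0, seed=None):
            """Gray-level thresholding, then keep the bright connected component
            containing the seed (or the largest one) as the grown vessel region."""
            mask = volume > threshold
            labels, n = ndimage.label(mask)
            if n == 0:
                return mask                       # nothing above threshold
            if seed is not None:
                return labels == labels[seed]     # region grown from the seed voxel
            sizes = ndimage.sum(mask, labels, range(1, n + 1))
            return labels == (int(np.argmax(sizes)) + 1)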

  18. Short-term memory of TiO2-based electrochemical capacitors: empirical analysis with adoption of a sliding threshold

    NASA Astrophysics Data System (ADS)

    Lim, Hyungkwang; Kim, Inho; Kim, Jin-Sang; Hwang, Cheol Seong; Jeong, Doo Seok

    2013-09-01

    Chemical synapses are important components of the large-scale neural network in the hippocampus of the mammalian brain, and a change in their weight is thought to underlie learning and memory. Thus, the realization of artificial chemical synapses is of crucial importance in achieving artificial neural networks that emulate the brain's functionalities to some extent. This kind of research is often referred to as neuromorphic engineering. In this study, we report short-term memory behaviours of electrochemical capacitors (ECs) utilizing a TiO2 mixed ionic-electronic conductor and various reactive electrode materials, e.g., Ti, Ni, and Cr. The experiments showed that the potentiation behaviours did not represent unlimited growth of synaptic weight. Instead, the behaviours exhibited limited synaptic weight growth that can be understood by means of an empirical equation similar to the Bienenstock-Cooper-Munro rule, employing a sliding threshold. The observed potentiation behaviours were analysed using the empirical equation, and the differences between the different ECs were parameterized.
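
    For reference, a minimal sketch of the standard Bienenstock-Cooper-Munro rule with its sliding threshold, against which the potentiation behaviour is analysed above; this is the textbook form, not the authors' fitted empirical equation, and all parameters are illustrative.

        import numpy as np

        def bcm_step(w, x, theta, eta=0.05, tau_theta=1.0, dt=0.01):
            """One Euler step of the BCM rule:
            dw/dt = eta * x * y * (y - theta); dtheta/dt = (y**2 - theta) / tau_theta."""
            y = w * x                                   # linear postsynaptic response
            w = w + dt * eta * x * y * (y - theta)
            theta = theta + dt * (y ** 2 - theta) / tau_theta
            return w, theta

        w, theta = 0.5, 0.1
        for _ in range(20000):
            w, theta = bcm_step(w, x=1.0, theta=theta)
        # Weight growth saturates as the threshold slides up to meet the response,
        # mirroring the limited synaptic-weight growth reported above.
        print(w, theta)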

  19. Global Mittag-Leffler stability and synchronization analysis of fractional-order quaternion-valued neural networks with linear threshold neurons.

    PubMed

    Yang, Xujun; Li, Chuandong; Song, Qiankun; Chen, Jiyang; Huang, Junjian

    2018-05-04

    This paper addresses the stability and synchronization problems of fractional-order quaternion-valued neural networks (FQVNNs) with linear threshold neurons. On account of the non-commutativity of quaternion multiplication resulting from the Hamilton rules, the FQVNN models are separated into four real-valued neural network (RVNN) models. Consequently, the dynamic analysis of FQVNNs can be realized by investigating the real-valued ones. Based on the method of M-matrix, the existence and uniqueness of the equilibrium point of the FQVNNs are obtained without detailed proof. Afterwards, several sufficient criteria ensuring the global Mittag-Leffler stability for the unique equilibrium point of the FQVNNs are derived by applying the Lyapunov direct method, the theory of fractional differential equations, the theory of matrix eigenvalues, and some inequality techniques. In addition, global Mittag-Leffler synchronization for the drive-response models of the addressed FQVNNs is investigated explicitly. Finally, simulation examples are designed to verify the feasibility and validity of the theoretical results. Copyright © 2018 Elsevier Ltd. All rights reserved.
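
    For reference, the Hamilton rules that force the four-fold real decomposition, and the one-parameter Mittag-Leffler function that names the stability notion (standard definitions, not the paper's specific criteria):

        \[
        i^2 = j^2 = k^2 = ijk = -1, \qquad ij = -ji = k, \quad jk = -kj = i, \quad ki = -ik = j,
        \]
        so a quaternion-valued state \(x = x^{(0)} + i\,x^{(1)} + j\,x^{(2)} + k\,x^{(3)}\) separates into four coupled real-valued systems, and
        \[
        E_\alpha(z) = \sum_{k=0}^{\infty} \frac{z^k}{\Gamma(\alpha k + 1)}, \qquad 0 < \alpha < 1 .
        \]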

  20. Rectification of graphene self-switching diodes: First-principles study

    NASA Astrophysics Data System (ADS)

    Ghaziasadi, Hassan; Jamasb, Shahriar; Nayebi, Payman; Fouladian, Majid

    2018-05-01

    First-principles calculations based on self-consistent-charge density-functional tight-binding were performed to investigate the electrical properties and rectification behavior of graphene self-switching diodes (GSSD). The devices comprise two structures, called CG-GSSD and DG-GSSD, whose side gates are metallic or semiconducting depending on whether they have single or double hydrogen edge functionalization. We relaxed the devices and calculated I-V curves, transmission spectra, and maximum rectification ratios. We found that the DG-MSM devices are more favorable and more stable, and also have better maximum rectification ratios and currents. Moreover, by changing the side gate widths and their behavior from semiconductor to metal, the threshold voltages under forward bias changed from +1.2 V to +0.3 V, and the maximum currents obtained ranged from 1.12 μA to 10.50 μA. Finally, the MSM and SSS types of all devices have the minimum and maximum values of threshold voltage and maximum rectification ratio, but the 769-DG devices do not obey this rule.

  1. Partial Photoneutron Cross Sections for 207,208Pb

    NASA Astrophysics Data System (ADS)

    Kondo, T.; Utsunomiya, H.; Goriely, S.; Iwamoto, C.; Akimune, H.; Yamagata, T.; Toyokawa, H.; Harada, H.; Kitatani, F.; Lui, Y.-W.; Hilaire, S.; Koning, A. J.

    2014-05-01

    Using linearly-polarized laser-Compton scattering γ-rays, partial E1 and M1 photoneutron cross sections along with total cross sections were determined for 207,208Pb at four energies near the neutron threshold by measuring anisotropies in photoneutron emission. Separately, total photoneutron cross sections were measured for 207,208Pb with a high-efficiency 4π neutron detector. The partial cross section measurement provides direct evidence for the presence of the pygmy dipole resonance (PDR) in 207,208Pb in the vicinity of the neutron threshold. The strength of the PDR amounts to 0.32%-0.42% of the Thomas-Reiche-Kuhn sum rule. Several μ_N^2 units of B(M1)↑ strength were observed in 207,208Pb just above the neutron threshold, corresponding to M1 cross sections of less than 10% of the total photoneutron cross sections.
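
    For context, the classical Thomas-Reiche-Kuhn sum rule against which the PDR strength is quoted above, in its standard form for a nucleus with Z protons, N neutrons, and mass number A:

        \[
        \int_0^{\infty} \sigma_{\gamma\mathrm{abs}}(E)\, dE \;\simeq\; \frac{2\pi^2 e^2 \hbar}{m_N c}\,\frac{NZ}{A} \;\approx\; 60\,\frac{NZ}{A}\ \mathrm{MeV\,mb}.
        \]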

  2. Designing adaptive operating rules for a large multi-purpose reservoir

    NASA Astrophysics Data System (ADS)

    Geressu, Robel; Rougé, Charles; Harou, Julien

    2017-04-01

    Reservoirs whose live storage capacity is large compared with annual inflow have "memory", i.e., their storage levels contain information about past inflows and reservoir operations. Such "long-memory" reservoirs can be found in basins in dry regions such as the Nile River Basin in Africa, the Colorado River Basin in the US, or river basins in Western and Central Asia. There the effects of a dry year have the potential to impact reservoir levels and downstream releases for several subsequent years, prompting tensions in transboundary basins. Yet, current operation rules in those reservoirs do not reflect this by integrating past climate history and release decisions among the factors that influence operating decisions. This work proposes and demonstrates an adaptive reservoir operating rule that explicitly accounts for the recent history of release decisions, and not only current storage level and near-term inflow forecasts. This implies adding long-term (e.g., multiyear) objectives to the existing short-term (e.g., annual) ones. We apply these operating rules to the Grand Ethiopian Renaissance Dam, a large reservoir under construction on the Blue Nile River. Energy generation has to be balanced with the imperative of releasing enough water in low flow years (e.g., the minimum 1, 2 or 3 year cumulative flow) to avoid tensions with the downstream countries, Sudan and Egypt. Maximizing the minimum multi-year releases is of particular interest in the Nile context, to minimize the impact on the performance of the large High Aswan Dam in Egypt. Objectives include maximizing the average and minimum annual energy generation and maximizing the minimum annual, two-year and three-year cumulative releases. The system model is tested using 30 stochastically generated streamflow series. One can then derive adaptive release rules depending on the value of one- and two-year total releases with respect to thresholds. There are then three sets of release rules for the reservoir, depending on whether one or both thresholds are not met, versus only one set with a non-adaptive rule. Multi-objective evolutionary algorithms (MOEAs) are used to obtain the Pareto front, i.e., non-dominated adaptive and non-adaptive operating rule sets. Implementing adaptive rules is found to improve the trade-offs between energy generation criteria and minimum release targets. Compared with non-adaptive operations, an adaptive operating policy shows an increase of around 3 and 10 billion cubic meters in the minimum 1- and 3-year cumulative releases for the same average annual energy generation.
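
    A minimal sketch of the adaptive policy structure described above: the applicable rule set is selected by comparing recent one- and two-year cumulative releases against thresholds. The three-set structure is from the abstract; the threshold values and set names are hypothetical placeholders.

        def select_rule_set(release_1yr, release_2yr, t1=50.0, t2=95.0):
            """Pick one of three release-rule sets from recent release history
            (volumes in billion cubic meters; thresholds are hypothetical)."""
            if release_1yr >= t1 and release_2yr >= t2:
                return "standard rules"             # both thresholds met
            if release_1yr < t1 and release_2yr < t2:
                return "strong compensation rules"  # both missed: raise releases
            return "mild compensation rules"        # exactly one threshold missed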

  3. Estimating economic thresholds for site-specific weed control using manual weed counts and sensor technology: an example based on three winter wheat trials.

    PubMed

    Keller, Martina; Gutjahr, Christoph; Möhring, Jens; Weis, Martin; Sökefeld, Markus; Gerhards, Roland

    2014-02-01

    Precision experimental design uses the natural heterogeneity of agricultural fields and combines sensor technology with linear mixed models to estimate the effect of weeds, soil properties and herbicide on yield. These estimates can be used to derive economic thresholds. Three field trials are presented using the precision experimental design in winter wheat. Weed densities were determined by manual sampling and bi-spectral cameras; yield and soil properties were mapped. Galium aparine, other broad-leaved weeds and Alopecurus myosuroides reduced yield by 17.5, 1.2 and 12.4 kg ha⁻¹ plant⁻¹ m² in one trial. The determined thresholds for site-specific weed control with independently applied herbicides were 4, 48 and 12 plants m⁻², respectively. Spring drought reduced yield effects of weeds considerably in one trial, since water became yield limiting. A negative herbicide effect on the crop was negligible, except in one trial, in which the herbicide mixture tended to reduce yield by 0.6 t ha⁻¹. Bi-spectral cameras for weed counting were of limited use and still need improvement. Nevertheless, large weed patches were correctly identified. The current paper presents a new approach to conducting field trials and deriving decision rules for weed control in farmers' fields. © 2013 Society of Chemical Industry.
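
    The economic threshold follows from equating the cost of control with the value of the avoided yield loss. A sketch using the Galium aparine yield-loss coefficient quoted above; the herbicide cost and grain price are hypothetical, chosen here so the result reproduces the reported 4 plants m⁻².

        def economic_threshold(control_cost, grain_price, yield_loss_per_plant):
            """Weed density (plants/m^2) at which herbicide cost equals the value
            of the avoided yield loss. Units: control_cost EUR/ha, grain_price
            EUR/kg, yield_loss_per_plant kg/ha per (plant/m^2)."""
            return control_cost / (grain_price * yield_loss_per_plant)

        # Galium aparine: 17.5 kg/ha per plant/m^2 (from the abstract);
        # the cost and price below are hypothetical.
        print(economic_threshold(control_cost=14.0, grain_price=0.20,
                                 yield_loss_per_plant=17.5))   # -> 4.0 plants/m^2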

  4. Textual and visual content-based anti-phishing: a Bayesian approach.

    PubMed

    Zhang, Haijun; Liu, Gang; Chow, Tommy W S; Liu, Wenyin

    2011-10-01

    A novel framework using a Bayesian approach for content-based phishing web page detection is presented. Our model takes into account textual and visual contents to measure the similarity between the protected web page and suspicious web pages. A text classifier, an image classifier, and an algorithm fusing the results from the classifiers are introduced. An outstanding feature of this paper is the exploration of a Bayesian model to estimate the matching threshold. This is required in the classifier for determining the class of the web page and identifying whether the web page is phishing or not. In the text classifier, the naive Bayes rule is used to calculate the probability that a web page is phishing. In the image classifier, the earth mover's distance is employed to measure the visual similarity, and our Bayesian model is designed to determine the threshold. In the data fusion algorithm, Bayes' theorem is used to synthesize the classification results from textual and visual content. The effectiveness of our proposed approach was examined in a large-scale dataset collected from real phishing cases. Experimental results demonstrated that the text classifier and the image classifier we designed deliver promising results, the fusion algorithm outperforms either of the individual classifiers, and our model can be adapted to different phishing cases. © 2011 IEEE
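
    A minimal sketch of the fusion idea: combining the two classifiers' phishing probabilities by Bayes' rule under a conditional-independence assumption. This is a generic naive fusion, not necessarily the paper's exact algorithm.

        def fuse(p_text, p_image, prior=0.5):
            """Combine two posteriors P(phishing | evidence) into one, assuming
            the two evidence sources are conditionally independent given the class."""
            prior_odds = prior / (1.0 - prior)
            lr_text = (p_text / (1.0 - p_text)) / prior_odds    # likelihood ratios
            lr_image = (p_image / (1.0 - p_image)) / prior_odds
            post_odds = prior_odds * lr_text * lr_image
            return post_odds / (1.0 + post_odds)

        print(fuse(0.9, 0.7))   # ~0.95: stronger evidence than either classifier alone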

  5. Visual detection following retinal damage: predictions of an inhomogeneous retino-cortical model

    NASA Astrophysics Data System (ADS)

    Arnow, Thomas L.; Geisler, Wilson S.

    1996-04-01

    A model of human visual detection performance has been developed, based on available anatomical and physiological data for the primate visual system. The inhomogeneous retino-cortical (IRC) model computes detection thresholds by comparing simulated neural responses to target patterns with responses to a uniform background of the same luminance. The model incorporates human ganglion cell sampling distributions; macaque monkey ganglion cell receptive field properties; macaque cortical cell contrast nonlinearities; and an optimal decision rule based on ideal observer theory. Spatial receptive field properties of cortical neurons were not included. Two parameters were allowed to vary while minimizing the squared error between predicted and observed thresholds. One parameter was decision efficiency, the other was the relative strength of the ganglion-cell center and surround. The latter was only allowed to vary within a small range consistent with known physiology. Contrast sensitivity was measured for sine-wave gratings as a function of spatial frequency, target size and eccentricity. Contrast sensitivity was also measured for an airplane target as a function of target size, with and without artificial scotomas. The results of these experiments, as well as contrast sensitivity data from the literature, were compared to predictions of the IRC model. Predictions were reasonably good for grating and airplane targets.

  6. Improving Fall Detection Using an On-Wrist Wearable Accelerometer

    PubMed Central

    Chira, Camelia; González, Víctor M.; de la Cal, Enrique

    2018-01-01

    Fall detection is a very important challenge that affects both elderly people and their carers. Improvements in fall detection would reduce the aid response time. This research focuses on a method for fall detection with a sensor placed on the wrist. Falls are detected using a published threshold-based solution, although a study on threshold tuning has been carried out. The feature extraction is extended in order to balance the dataset for the minority class. Alternative models have been analyzed to reduce the computational constraints so the solution can be embedded in smartphones or smart wristbands. Several published datasets have been used in the Materials and Methods section. Although these datasets do not include data from real falls of elderly people, a complete comparison study of fall-related datasets shows statistical differences between simulated falls and real falls from participants suffering from impairment diseases. Given the obtained results, rule-based systems represent a promising research line as they perform similarly to neural networks, but with a reduced computational cost. Furthermore, support vector machines performed with a high specificity. However, further research to validate the proposal in real on-line scenarios is needed. Furthermore, a slight improvement should be made to reduce the number of false alarms. PMID:29701721
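
    The core of a threshold-based detector of the kind tuned above fits in a few lines: an impact peak on the acceleration magnitude followed by near-stillness. The threshold values and timing are hypothetical placeholders; real detectors add posture and frequency features.

        import numpy as np

        def detect_falls(acc, fs=50, impact_g=2.5, still_g=0.15, still_s=1.0):
            """Return impact times (s) where |acc| exceeds impact_g and the signal
            stays within still_g of 1 g for still_s seconds, one second later.
            acc: (N, 3) array of accelerations in units of g."""
            mag = np.linalg.norm(acc, axis=1)
            win = int(still_s * fs)
            falls = []
            for i in np.where(mag > impact_g)[0]:
                after = mag[i + fs : i + fs + win]   # skip 1 s, then check stillness
                if after.size == win and np.all(np.abs(after - 1.0) < still_g):
                    falls.append(i / fs)
            return falls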

  7. Predicting dynamic range and intensity discrimination for electrical pulse-train stimuli using a stochastic auditory nerve model: the effects of stimulus noise.

    PubMed

    Xu, Yifang; Collins, Leslie M

    2005-06-01

    This work investigates dynamic range and intensity discrimination for electrical pulse-train stimuli that are modulated by noise using a stochastic auditory nerve model. Based on a hypothesized monotonic relationship between loudness and the number of spikes elicited by a stimulus, theoretical prediction of the uncomfortable level has previously been determined by comparing spike counts to a fixed threshold, N_ucl. However, no specific rule for determining N_ucl has been suggested. Our work determines the uncomfortable level based on the excitation pattern of the neural response in a normal ear. The number of fibers corresponding to the portion of the basilar membrane driven by a stimulus at an uncomfortable level in a normal ear is related to N_ucl at an uncomfortable level of the electrical stimulus. Intensity discrimination limens are predicted using signal detection theory via the probability mass function of the neural response and via experimental simulations. The results show that the uncomfortable level for pulse-train stimuli increases slightly as noise level increases. Combining this with our previous threshold predictions, we hypothesize that the dynamic range for noise-modulated pulse-train stimuli should increase with additive noise. However, since our predictions indicate that intensity discrimination under noise degrades, overall intensity coding performance may not improve significantly.

  8. 75 FR 53169 - Federal Acquisition Regulation; Federal Acquisition Circular 2005-45; Small Entity Compliance Guide

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-30

    ... Contract Act, and trade agreements thresholds. The Councils have also used the same methodology to adjust... Business Regulatory Enforcement Fairness Act of 1996. It consists of a summary of rules appearing in...-008 Davis. and Reinvestment Act of 2009 (the Recovery Act)-- Buy American Requirements for...

  9. 77 FR 20756 - Implementation of the Local Community Radio Act of 2010; Revision of Service and Eligibility...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-06

    ... the technical report entitled 'Experimental Measurements of the Third-Adjacent Channel Impacts of Low... rules designed to prevent any predicted interference. 31. We propose to adopt a basic threshold test. This test is designed to closely track the interference standard developed by Mitre, without...

  10. 75 FR 55385 - Self-Regulatory Organizations; EDGX Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-10

    ...-Regulatory Organizations; EDGX Exchange, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule... guarantee limits the increase in a Member's execution costs associated with failing to meet the volume thresholds of other exchanges and ECNs while a Member is in the process of migrating volumes from one...

  11. 78 FR 69900 - Self-Regulatory Organizations; National Stock Exchange, Inc.; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-21

    ... the efficiency and cost-effectiveness of trading on the Exchange. Amended Rebate for Double Play... Organizations; National Stock Exchange, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule... to adjust the volume thresholds that must be met before an ETP Holder can be eligible to pay the...

  12. 75 FR 66680 - Defense Federal Acquisition Regulation Supplement; Trade Agreements-New Thresholds (DFARS 2009-D040)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-29

    ... Trade Organization Government Procurement Agreement and the Free Trade Agreements, as determined by the... of Subjects in 48 CFR Part 225 Government procurement. Ynette R. Shelkin, Editor, Defense Acquisition...: DoD is adopting as final, without change, the interim rule that amended the Defense Federal...

  13. 78 FR 8329 - Federal Housing Administration (FHA): Hospital Mortgage Insurance Program-Refinancing Hospital Loans

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-05

    ... rehabilitation, and equipment purchases. However, when the credit markets became more restrictive in 2008... than 20 percent eligible to be used for construction and/or equipment. The rule establishes threshold..., renovations, and/or equipment to be financed with mortgage proceeds and how those repairs, renovations, and/or...

  14. 77 FR 13952 - Federal Acquisition Regulation; United States-Korea Free Trade Agreement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-07

    ...DoD, GSA, and NASA are issuing an interim rule amending the Federal Acquisition Regulation (FAR) to implement the United States- Korea Free Trade Agreement. The Republic of Korea is already party to the World Trade Organization Government Procurement Agreement, but this trade agreement implements a lower procurement threshold.

  15. Chiral corrections to the Adler-Weisberger sum rule

    NASA Astrophysics Data System (ADS)

    Beane, Silas R.; Klco, Natalie

    2016-12-01

    The Adler-Weisberger sum rule for the nucleon axial-vector charge, g_A, offers a unique signature of chiral symmetry and its breaking in QCD. Its derivation relies on both algebraic aspects of chiral symmetry, which guarantee the convergence of the sum rule, and dynamical aspects of chiral symmetry breaking, as exploited using chiral perturbation theory, which allow the rigorous inclusion of explicit chiral symmetry breaking effects due to light-quark masses. The original derivations obtained the sum rule in the chiral limit and, without the benefit of chiral perturbation theory, made various attempts at extrapolating to nonvanishing pion masses. In this paper, the leading, universal, chiral corrections to the chiral-limit sum rule are obtained. Using PDG data, a recent parametrization of the pion-nucleon total cross sections in the resonance region given by the SAID group, as well as recent Roy-Steiner equation determinations of subthreshold amplitudes, threshold parameters, and correlated low-energy constants, the Adler-Weisberger sum rule is confronted with experimental data. With uncertainty estimates associated with the cross-section parametrization, the Goldberger-Treiman discrepancy, and the truncation of the sum rule at O(M_π^4) in the chiral expansion, this work finds g_A = 1.248 ± 0.010 ± 0.007 ± 0.013.
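
    For reference, the Goldberger-Treiman relation whose violation defines the discrepancy entering the error budget above (standard definitions, with M_N the nucleon mass, f_π the pion decay constant, and g_{πN} the pion-nucleon coupling):

        \[
        g_{\pi N}\, f_\pi = g_A\, M_N \ \ \text{(exact in the chiral limit)}, \qquad
        \Delta_{GT} \equiv 1 - \frac{g_A\, M_N}{f_\pi\, g_{\pi N}} .
        \]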

  16. Catastrophic Disruption Threshold and Maximum Deflection from Kinetic Impact

    NASA Astrophysics Data System (ADS)

    Cheng, A. F.

    2017-12-01

    The use of a kinetic impactor to deflect an asteroid on a collision course with Earth was described in the NASA Near-Earth Object Survey and Deflection Analysis of Alternatives (2007) as the most mature approach for asteroid deflection and mitigation. The NASA DART mission will demonstrate asteroid deflection by kinetic impact at the Potentially Hazardous Asteroid 65803 Didymos in October 2022. The kinetic impactor approach is considered to be applicable with warning times of 10 years or more and with hazardous asteroid diameters of 400 m or less. In principle, a larger kinetic impactor bringing greater kinetic energy could cause a larger deflection, but input of excessive kinetic energy will cause catastrophic disruption of the target, leaving possibly large fragments still on a collision course with Earth. Thus the catastrophic disruption threshold limits the maximum deflection from a kinetic impactor. An often-cited rule of thumb states that the maximum deflection is 0.1 times the escape velocity before the target will be disrupted. It turns out this rule of thumb does not work well. A comparison to numerical simulation results shows that a similar rule applies in the gravity limit, for large targets of more than 300 m, where the maximum deflection is roughly the escape velocity at momentum enhancement factor β = 2. In the gravity limit, the rule of thumb corresponds to pure momentum coupling (μ = 1/3), but simulations find a slightly different scaling, μ = 0.43. In the smaller target size range to which kinetic impactors would apply, the catastrophic disruption limit is strength-controlled. A DART-like impactor will not disrupt any target asteroid down to sizes significantly smaller than 50 m, below which a hazardous object would in any case not penetrate the atmosphere unless it is unusually strong.
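
    The rule-of-thumb comparison can be made concrete with the standard momentum-transfer estimate Δv = β·(m/M)·U for a head-on impact. The asteroid and impactor parameters below are hypothetical (a 300 m rubble pile and a DART-like spacecraft), not mission values.

        import numpy as np

        G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2

        def deflection(m_imp, u_imp, m_ast, beta=2.0):
            """Along-track speed change from a kinetic impact; beta is the
            momentum enhancement factor (beta = 1 is pure momentum transfer)."""
            return beta * m_imp * u_imp / m_ast

        def escape_velocity(m_ast, radius):
            return np.sqrt(2.0 * G * m_ast / radius)

        radius, rho = 150.0, 2000.0                   # hypothetical 300 m target
        m_ast = rho * 4.0 / 3.0 * np.pi * radius**3
        dv = deflection(m_imp=600.0, u_imp=6000.0, m_ast=m_ast)
        print(dv, 0.1 * escape_velocity(m_ast, radius))   # dv far below 0.1 v_esc here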

  17. Effects of inspections in small world social networks with different contagion rules

    NASA Astrophysics Data System (ADS)

    Muñoz, Francisco; Nuño, Juan Carlos; Primicerio, Mario

    2015-08-01

    We study the way the structure of social links determines the effects of random inspections on a population formed by two types of individuals, e.g. tax-payers and tax-evaders (free riders). It is assumed that inspections occur on a longer timescale than the population relaxation time and, therefore, a unique initial inspection is performed on a population that is completely formed by tax-evaders. Moreover, inspected tax-evaders become tax-payers forever. The social network is modeled as a Watts-Strogatz Small World whose topology can be tuned in terms of a parameter p ∈ [0, 1] from regular (p = 0) to random (p = 1). Two local contagion rules are considered: (i) a continuous one that takes the proportion of neighbors to determine the next status of an individual (node) and (ii) a discontinuous one (threshold rule) that assumes a minimum number of neighbors to modify the current state. In the former case, irrespective of the inspection intensity ν, the equilibrium population is always formed by tax-payers. In the mean field approach, we obtain the characteristic time of convergence as a function of ν and p. For the threshold contagion rule, we show that the response of the population to the intensity of inspections ν is a function of the structure of the social network p and the willingness of the individuals to change their state, r. It is shown that sharp transitions occur at critical values of ν that depend on p and r. We discuss these results within the context of tax evasion and fraud, where the strategies of inspection could be of major relevance.
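
    A minimal sketch of the threshold-contagion variant on a Watts-Strogatz network, using networkx; the parameter values are illustrative, and the continuous (proportion-based) rule and the mean-field analysis are not reproduced.

        import random
        import networkx as nx

        def run_threshold_contagion(n=1000, k=6, p=0.1, nu=0.05, r=2, steps=50):
            """One inspection wave converts a fraction nu of evaders to payers;
            afterwards an evader becomes a payer iff at least r neighbours are
            payers (discontinuous threshold rule). Returns the payer fraction."""
            G = nx.watts_strogatz_graph(n, k, p)
            payer = {v: False for v in G}
            for v in random.sample(list(G), int(nu * n)):
                payer[v] = True
            for _ in range(steps):
                # synchronous update: the comprehension reads the old dict
                payer = {v: payer[v] or sum(payer[u] for u in G.neighbors(v)) >= r
                         for v in G}
            return sum(payer.values()) / n

        print(run_threshold_contagion(p=0.0), run_threshold_contagion(p=1.0))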

  18. How mechanisms of perceptual decision-making affect the psychometric function

    PubMed Central

    Gold, Joshua I.; Ding, Long

    2012-01-01

    Psychometric functions are often interpreted in the context of Signal Detection Theory, which emphasizes a distinction between sensory processing and non-sensory decision rules in the brain. This framework has helped to relate perceptual sensitivity to the “neurometric” sensitivity of sensory-driven neural activity. However, perceptual sensitivity, as interpreted via Signal Detection Theory, is based on not just how the brain represents relevant sensory information, but also how that information is read out to form the decision variable to which the decision rule is applied. Here we discuss recent advances in our understanding of this readout process and describe its effects on the psychometric function. In particular, we show that particular aspects of the readout process can have specific, identifiable effects on the threshold, slope, upper asymptote, time dependence, and choice dependence of psychometric functions. To illustrate these points, we emphasize studies of perceptual learning that have identified changes in the readout process that can lead to changes in these aspects of the psychometric function. We also discuss methods that have been used to distinguish contributions of the sensory representation versus its readout to psychophysical performance. PMID:22609483
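
    The aspects listed above map directly onto the parameters of the standard psychometric function (a generic parameterization, not tied to any one study):

        \[
        \Psi(x) = \gamma + (1 - \gamma - \lambda)\, F(x;\, \alpha, \beta),
        \]
        where x is stimulus strength, γ the guess rate (lower asymptote), λ the lapse rate (setting the upper asymptote 1 − λ), α the threshold, and β the slope of the inner function F (e.g., a Weibull or cumulative Gaussian).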

  19. Modeling pedestrian shopping behavior using principles of bounded rationality: model comparison and validation

    NASA Astrophysics Data System (ADS)

    Zhu, Wei; Timmermans, Harry

    2011-06-01

    Models of geographical choice behavior have been predominantly based on rational choice models, which assume that decision makers are utility-maximizers. Rational choice models may be less appropriate as behavioral models when modeling decisions in complex environments in which decision makers may simplify the decision problem using heuristics. Pedestrian behavior in shopping streets is an example. We therefore propose a modeling framework for pedestrian shopping behavior incorporating principles of bounded rationality. We extend three classical heuristic rules (the conjunctive, disjunctive and lexicographic rules) by introducing threshold heterogeneity. The proposed models are implemented using data on pedestrian behavior in Wang Fujing Street, the city center of Beijing, China. The models are estimated and compared with multinomial logit models and mixed logit models. Results show that the heuristic models are the best for all the decisions that are modeled. Validation tests are carried out through multi-agent simulation by comparing simulated spatio-temporal agent behavior with the observed pedestrian behavior. The predictions of the heuristic models are slightly better than those of the multinomial logit models.
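
    The three classical heuristics that the authors extend can be sketched as decision rules over attribute values; the attribute names and cutoffs below are hypothetical, and the paper's threshold heterogeneity would make the cutoffs vary across individuals.

        def conjunctive(attrs, cutoffs):
            """Accept only if every attribute clears its cutoff."""
            return all(attrs[a] >= c for a, c in cutoffs.items())

        def disjunctive(attrs, cutoffs):
            """Accept if any attribute clears its cutoff."""
            return any(attrs[a] >= c for a, c in cutoffs.items())

        def lexicographic(options, priority, tie_gap=0.0):
            """Screen options on the most important attribute first; move to the
            next attribute only among (near-)tied options."""
            for a in priority:
                best = max(o[a] for o in options)
                options = [o for o in options if best - o[a] <= tie_gap]
                if len(options) == 1:
                    break
            return options[0]

        shop = {"attractiveness": 0.7, "proximity": 0.4}
        cuts = {"attractiveness": 0.5, "proximity": 0.5}
        print(conjunctive(shop, cuts), disjunctive(shop, cuts))   # False True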

  20. Severe Weather Forecast Decision Aid

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III; Wheeler, Mark M.; Short, David A.

    2005-01-01

    This report presents a 15-year climatological study of severe weather events and related severe weather atmospheric parameters. Data sources included local forecast rules, archived sounding data, Cloud-to-Ground Lightning Surveillance System (CGLSS) data, surface and upper air maps, and two severe weather event databases covering east-central Florida. The local forecast rules were used to set threat assessment thresholds for stability parameters that were derived from the sounding data. The severe weather events databases were used to identify days with reported severe weather and the CGLSS data was used to differentiate between lightning and non-lightning days. These data sets provided the foundation for analyzing the stability parameters and synoptic patterns that were used to develop an objective tool to aid in forecasting severe weather events. The period of record for the analysis was May - September, 1989 - 2003. The results indicate that there are certain synoptic patterns more prevalent on days with severe weather and some of the stability parameters are better predictors of severe weather days based on locally tuned threat values. The results also revealed the stability parameters that did not display any skill related to severe weather days. An interactive web-based Severe Weather Decision Aid was developed to assist the duty forecaster by providing a level of objective guidance based on the analysis of the stability parameters, CGLSS data, and synoptic-scale dynamics. The tool will be tested and evaluated during the 2005 warm season.

  1. Cocaine Promotes Coincidence Detection and Lowers Induction Threshold during Hebbian Associative Synaptic Potentiation in Prefrontal Cortex.

    PubMed

    Ruan, Hongyu; Yao, Wei-Dong

    2017-01-25

    Addictive drugs usurp neural plasticity mechanisms that normally serve reward-related learning and memory, primarily by evoking changes in glutamatergic synaptic strength in the mesocorticolimbic dopamine circuitry. Here, we show that repeated cocaine exposure in vivo does not alter synaptic strength in the mouse prefrontal cortex during an early period of withdrawal, but instead modifies a Hebbian quantitative synaptic learning rule by broadening the temporal window and lowering the induction threshold for spike-timing-dependent LTP (t-LTP). After repeated, but not single, daily cocaine injections, t-LTP in layer V pyramidal neurons is induced at +30 ms, a normally ineffective timing interval for t-LTP induction in saline-exposed mice. This cocaine-induced, extended-timing t-LTP lasts for ∼1 week after terminating cocaine and is accompanied by an increased susceptibility to potentiation by fewer pre-post spike pairs, indicating a reduced t-LTP induction threshold. Basal synaptic strength and the maximal attainable t-LTP magnitude remain unchanged after cocaine exposure. We further show that the cocaine facilitation of t-LTP induction is caused by sensitized D1-cAMP/protein kinase A dopamine signaling in pyramidal neurons, which then pathologically recruits voltage-gated L-type Ca2+ channels that synergize with GluN2A-containing NMDA receptors to drive t-LTP at extended timing. Our results illustrate a mechanism by which cocaine, acting on a key neuromodulation pathway, modifies the coincidence detection window during Hebbian plasticity to facilitate associative synaptic potentiation in prefrontal excitatory circuits. By modifying rules that govern activity-dependent synaptic plasticity, addictive drugs can derail the experience-driven neural circuit remodeling process important for executive control of reward and addiction. It is believed that addictive drugs often render an addict's brain reward system hypersensitive, leaving the individual more susceptible to relapse. We found that repeated cocaine exposure alters a Hebbian associative synaptic learning rule that governs activity-dependent synaptic plasticity in the mouse prefrontal cortex, characterized by a broader temporal window and a lower threshold for spike-timing-dependent LTP (t-LTP), a cellular form of learning and memory. This rule change is caused by cocaine-exacerbated D1-cAMP/protein kinase A dopamine signaling in pyramidal neurons that in turn pathologically recruits L-type Ca2+ channels to facilitate coincidence detection during t-LTP induction. Our study provides novel insights on how cocaine, even with only brief exposure, may prime neural circuits for subsequent experience-dependent remodeling that may underlie certain addictive behavior. Copyright © 2017 the authors 0270-6474/17/370986-12$15.00/0.
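
    A minimal sketch of how the reported changes would enter a spike-timing rule: a wider LTP timing window and fewer pre-post pairings needed for induction. The exponential form and every parameter below are generic illustrations, not values fitted to the data above.

        import numpy as np

        def t_ltp(dt_ms, window_ms=10.0, n_pairs=60, pairs_needed=60, a_plus=0.01):
            """Potentiation from pre->post pairings at interval dt_ms (> 0);
            returns 0 below the induction threshold (too few pairings)."""
            if n_pairs < pairs_needed:
                return 0.0
            return n_pairs * a_plus * np.exp(-dt_ms / window_ms)

        control = t_ltp(30.0)                                   # narrow window
        cocaine = t_ltp(30.0, window_ms=40.0, pairs_needed=30)  # broader, lower threshold
        print(control, cocaine)   # +30 ms pairing: weak in control, substantial here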

  2. Impact of OSHA Final Rule—Recording Hearing Loss: An Analysis of an Industrial Audiometric Dataset

    PubMed Central

    Rabinowitz, Peter M.; Slade, Martin; Dixon-Ernst, Christine; Sircar, Kanta; Cullen, Mark

    2013-01-01

    The 2003 Occupational Safety and Health Administration (OSHA) Occupational Injury and Illness Recording and Reporting Final Rule changed the definition of recordable work-related hearing loss. We performed a study of the Alcoa Inc. audiometric database to evaluate the impact of this new rule. The 2003 rule increased the rate of potentially recordable hearing loss events from 0.2% to 1.6% per year. A total of 68.6% of potentially recordable cases had American Academy of Audiology/American Medical Association (AAO/AMA) hearing impairment at the time of recordability. On average, recordable loss occurred after onset of impairment, whereas the non-age-corrected 10-dB standard threshold shift (STS) usually preceded impairment. The OSHA Final Rule will significantly increase recordable cases of occupational hearing loss. The new case definition is usually accompanied by AAO/AMA hearing impairment. Other, more sensitive metrics should therefore be used for early detection and prevention of hearing loss. PMID:14665813

  3. Color edges extraction using statistical features and automatic threshold technique: application to the breast cancer cells.

    PubMed

    Ben Chaabane, Salim; Fnaiech, Farhat

    2014-01-23

    Color image segmentation has so far been applied in many areas; hence, many different techniques have recently been developed and proposed. In the medical imaging area, image segmentation can assist doctors in following up a patient's disease from processed breast cancer images. The main objective of this work is to rebuild and enhance each cell from the three component images provided by an input image. Starting from an initial segmentation obtained using statistical features and histogram threshold techniques, the resulting segmentation can accurately represent incomplete and touching cells and enhance them. This offers real help to doctors, as the cells become clear and easy to count. A novel method for color edge extraction based on statistical features and an automatic threshold is presented. The traditional edge detector, based on the first- and second-order neighborhood describing the relationship between the current pixel and its neighbors, is extended to the statistical domain. Hence, color edges in an image are obtained by combining the statistical features and the automatic threshold techniques. Finally, on the obtained color edges with specific primitive color, a combination rule is used to integrate the edge results over the three color components. Breast cancer cell images were used to evaluate the performance of the proposed method both quantitatively and qualitatively. Hence, a visual and a numerical assessment based on the probability of correct classification (PC), the false classification (Pf), and the classification accuracy (Sens(%)) are presented and compared with existing techniques. The proposed method shows its superiority in the detection of points which really belong to the cells, and also in the ease of counting the number of processed cells. Computer simulations highlight that the proposed method substantially enhances the segmented image, with smaller error rates than other existing algorithms under the same settings (patterns and parameters). Moreover, it provides high classification accuracy, reaching a rate of 97.94%. Additionally, the segmentation method may be extended to other medical imaging types having similar properties.
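
    One widely used automatic threshold of the kind applied at this stage is Otsu's method, which maximizes the between-class variance of the gray-level histogram. A self-contained sketch of that generic technique, applied per color component before the combination rule; the paper's statistical-feature variant is not reproduced here.

        import numpy as np

        def otsu_threshold(channel):
            """Return the gray level maximizing between-class variance (Otsu).
            Assumes 8-bit gray levels (values in 0..255)."""
            hist, _ = np.histogram(channel.ravel(), bins=256, range=(0, 256))
            p = hist / hist.sum()
            omega = np.cumsum(p)                    # class-0 probability
            mu = np.cumsum(p * np.arange(256))      # class-0 cumulative mean
            mu_t = mu[-1]
            with np.errstate(divide="ignore", invalid="ignore"):
                sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
            return int(np.nanargmax(sigma_b))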

  4. Mammogram segmentation using maximal cell strength updation in cellular automata.

    PubMed

    Anitha, J; Peter, J Dinesh

    2015-08-01

    Breast cancer is the most frequently diagnosed type of cancer among women. Mammogram is one of the most effective tools for early detection of the breast cancer. Various computer-aided systems have been introduced to detect the breast cancer from mammogram images. In a computer-aided diagnosis system, detection and segmentation of breast masses from the background tissues is an important issue. In this paper, an automatic segmentation method is proposed to identify and segment the suspicious mass regions of mammogram using a modified transition rule named maximal cell strength updation in cellular automata (CA). In coarse-level segmentation, the proposed method performs an adaptive global thresholding based on the histogram peak analysis to obtain the rough region of interest. An automatic seed point selection is proposed using gray-level co-occurrence matrix-based sum average feature in the coarse segmented image. Finally, the method utilizes CA with the identified initial seed point and the modified transition rule to segment the mass region. The proposed approach is evaluated over the dataset of 70 mammograms with mass from mini-MIAS database. Experimental results show that the proposed approach yields promising results to segment the mass region in the mammograms with the sensitivity of 92.25% and accuracy of 93.48%.
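
    The cellular-automaton step can be sketched generically: each pixel carries a label and a strength, and a neighbour conquers it when its strength, attenuated by image similarity, exceeds the pixel's own. This is a simplified GrowCut-style automaton with toroidal borders for brevity; the paper's maximal-cell-strength transition rule itself is not reproduced.

        import numpy as np

        def ca_grow(image, seeds, iters=50):
            """Generic CA region growing. image: 2D floats in [0, 1];
            seeds: dict {(row, col): label}, labels > 0; 0 means unlabeled."""
            lab = np.zeros(image.shape, dtype=int)
            strength = np.zeros(image.shape)
            for (r, c), l in seeds.items():
                lab[r, c], strength[r, c] = l, 1.0
            for _ in range(iters):
                for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nb_lab = np.roll(lab, shift, axis=(0, 1))
                    nb_str = np.roll(strength, shift, axis=(0, 1))
                    nb_img = np.roll(image, shift, axis=(0, 1))
                    attack = (1.0 - np.abs(image - nb_img)) * nb_str
                    win = attack > strength
                    lab[win], strength[win] = nb_lab[win], attack[win]
            return lab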

  5. Computer-Aided Diagnostic (CAD) Scheme by Use of Contralateral Subtraction Technique

    NASA Astrophysics Data System (ADS)

    Nagashima, Hiroyuki; Harakawa, Tetsumi

    We developed a computer-aided diagnostic (CAD) scheme for detection of subtle image findings of acute cerebral infarction in brain computed tomography (CT) by using a contralateral subtraction technique. In our computerized scheme, the lateral inclination of the image was first corrected automatically by rotating and shifting. The contralateral subtraction image was then derived by subtracting the reversed image from the original image. Initial candidates for acute cerebral infarctions were identified using multiple-thresholding and image filtering techniques. As the first step in removing false positive candidates, fourteen image features were extracted for each of the initial candidates. Halfway candidates were detected by applying a rule-based test with these image features. At the second step, five image features were extracted using the overlapping scale with halfway candidates in the slice of interest and the upper/lower slice images. Finally, acute cerebral infarction candidates were detected by applying a rule-based test with the five image features. The sensitivity of detection for 74 training cases was 97.4% with 3.7 false positives per image. The performance of the CAD scheme for 44 testing cases was similar to that for the training cases. Our CAD scheme using the contralateral subtraction technique can reveal suspected image findings of acute cerebral infarctions in CT images.
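
    The central step, after the inclination correction, is just a left-right flip and a subtraction; a minimal sketch:

        import numpy as np

        def contralateral_subtraction(slice_img):
            """Subtract the left-right mirrored slice from the original so that
            roughly symmetric anatomy cancels and asymmetric findings (such as an
            early infarction) stand out. Assumes the midline has already been
            centred by the rotation/shift correction described above."""
            mirrored = slice_img[:, ::-1]
            return slice_img.astype(float) - mirrored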

  6. Revisiting the Procedures for the Vector Data Quality Assurance in Practice

    NASA Astrophysics Data System (ADS)

    Erdoğan, M.; Torun, A.; Boyacı, D.

    2012-07-01

    Immense use of topographical data in spatial data visualization, business GIS (Geographic Information Systems) solutions and applications, and mobile and location-based services has forced topo-data providers to create standard, up-to-date and complete data sets in a sustainable frame. Data quality has been studied and researched for more than two decades. There have been countless references on its semantics, its conceptual and logical representations, and many applications on spatial databases and GIS. However, there is a gap between research and practice in the sense of spatial data quality, which increases the costs and decreases the efficiency of data production. Spatial data quality is well known by academia and industry, but usually in different contexts. The research on spatial data quality has stated several issues having practical use, such as descriptive information, metadata, fulfillment of spatial relationships among data, integrity measures, geometric constraints, etc. The industry and data producers realize them in three stages: pre-, co- and post-data capturing. The pre-data capturing stage covers semantic modelling, data definition, cataloguing, modelling, data dictionary and schema creation processes. The co-data capturing stage covers general rules of spatial relationships, data- and model-specific rules such as topologic and model building relationships, geometric thresholds, data extraction guidelines, and object-object, object-belonging class, object-non-belonging class, and class-class relationships to be taken into account during data capturing. The post-data capturing stage covers specified QC (quality check) benchmarks and checking compliance with general and specific rules. The vector data quality criteria differ between the views of producers and users, but these criteria are generally driven by the needs, expectations and feedback of the users. This paper presents a practical method which closes the gap between theory and practice. Turning spatial data quality concepts into applications requires the conceptual, logical and, most importantly, physical existence of the data model, rules and knowledge of realization in the form of geo-spatial data. The applicable metrics and thresholds are determined on this concrete base. This study discusses the application of geo-spatial data quality issues and QA (quality assurance) and QC procedures in topographic data production. First, we introduce the MGCP (Multinational Geospatial Co-production Program) data profile of the NATO (North Atlantic Treaty Organization) DFDD (DGIWG Feature Data Dictionary), the requirements of the data owner, the view of data producers for both data capturing and QC, and finally QA to fulfil user needs. Then, our practical and new approach, which divides quality into three phases, is introduced. Finally, the implementation of our approach to accomplish metrics, measures and thresholds of quality definitions is discussed. In this paper, especially geometry and semantics quality and the quality control procedures that can be performed by the producers are discussed. Some applicable best practices that we experienced regarding techniques of quality control and regulations that define the objectives and data production procedures are given in the final remarks. These quality control procedures should include visual checks over the source data, captured vector data and printouts, some automatic checks that can be performed by software, and some semi-automatic checks involving interaction with quality control personnel. Finally, these quality control procedures should ensure the geometric, semantic, attribution and metadata quality of vector data.

  7. Environmental statistics and optimal regulation.

    PubMed

    Sivak, David A; Thomson, Matt

    2014-09-01

    Any organism is embedded in an environment that changes over time. The timescale for and statistics of environmental change, the precision with which the organism can detect its environment, and the costs and benefits of particular protein expression levels all will affect the suitability of different strategies (such as constitutive expression or graded response) for regulating protein levels in response to environmental inputs. We propose a general framework (here specifically applied to the enzymatic regulation of metabolism in response to changing concentrations of a basic nutrient) to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, respectively, and the costs associated with enzyme production. We use this framework to address three fundamental questions: (i) when a cell should prefer thresholding to a graded response; (ii) when there is a fitness advantage to implementing a Bayesian decision rule; and (iii) when retaining memory of the past provides a selective advantage. We specifically find that: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones.
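
    A sketch of question (ii): with a two-state environment, a noisy measurement, and linear costs and benefits, the Bayesian decision rule collapses to a threshold on the posterior. All distributions and payoff values are hypothetical.

        from scipy.stats import norm

        def posterior_high(m, prior_high=0.5, mu_lo=0.0, mu_hi=1.0, sigma=0.5):
            """P(nutrient high | measurement m) for a two-state environment
            observed through Gaussian measurement noise."""
            l_hi = norm.pdf(m, mu_hi, sigma) * prior_high
            l_lo = norm.pdf(m, mu_lo, sigma) * (1.0 - prior_high)
            return l_hi / (l_hi + l_lo)

        def express_enzyme(m, benefit=2.0, cost=1.0):
            """Express iff expected benefit exceeds cost: a posterior threshold."""
            return posterior_high(m) * benefit > cost

        print(express_enzyme(0.2), express_enzyme(0.9))   # False True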

  8. Why cost-effectiveness should trump (clinical) effectiveness: the ethical economics of the South West quadrant.

    PubMed

    Dowie, Jack

    2004-05-01

    In many health decision making situations there is a requirement that the effectiveness of interventions, usually their 'clinical' effectiveness, be established, as well as their cost-effectiveness. Often indeed this is effectively a prior requirement for their cost-effectiveness being investigated. If, however, one accepts the ethical argument for using a threshold incremental cost-effectiveness ratio (ICER) for interventions that are more effective but more costly (i.e. fall in the NE quadrant of the cost-effectiveness plane), one should apply the same decision rule in the SW quadrant, where the intervention is less effective but less costly. This implication is present in most standard treatments of cost-effectiveness analysis, including recent stochastic versions, and had gone relatively unquestioned within the discipline until the recent suggestion that the ICER threshold might be 'kinked'. A kinked threshold would, O'Brien et al. argue, better reflect the asymmetrical individual preferences found in empirical studies of consumers' willingness to pay and willingness to accept, and justify different decision rules in the NE and SW quadrants. We reject the validity of such asymmetric preferences in the context of public health care decisions and consider and counter the two main 'ethical' objections that probably underlie the asymmetry in this case: the objection to 'taking away' and the objection to being required to undergo treatment that is less effective than no treatment at all. Copyright 2004 John Wiley & Sons, Ltd.
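
    The symmetric rule argued for above can be written with one threshold λ applied in both quadrants of the cost-effectiveness plane (standard notation, with ΔC and ΔE the incremental cost and effectiveness); a kinked threshold would instead use a larger λ in the SW quadrant:

        \[
        \mathrm{ICER} = \frac{\Delta C}{\Delta E}, \qquad
        \text{adopt if}\quad
        \begin{cases}
        \mathrm{ICER} < \lambda, & \Delta E > 0,\ \Delta C > 0 \ \ \text{(NE quadrant)},\\
        \mathrm{ICER} > \lambda, & \Delta E < 0,\ \Delta C < 0 \ \ \text{(SW quadrant)}.
        \end{cases}
        \]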

  9. Influence of crisp values on the object-based data extraction procedure from LiDAR data

    NASA Astrophysics Data System (ADS)

    Tomljenovic, Ivan; Rousell, Adam

    2014-05-01

    Nowadays a plethora of approaches attempt to automate the process of object extraction from LiDAR data. However, the majority of these methods require the fusion of the LiDAR dataset with other information such as photogrammetric imagery. The approach that has been used as the basis for this paper is a novel method which makes use of human knowledge and the CNL modelling language to automatically extract buildings solely from LiDAR point cloud data in a transferable method. A number of rules are implemented to generate an artificial intelligence algorithm which is used for the object extraction. Although the single dataset method has been found to successfully extract building footprints from the point cloud dataset, at this initial stage it has one restriction that may limit its effectiveness: a number of the rules used are based on crisp boundary values. If, for example, the slope of the ground surface is used as a rule for determining objects, then the slope value of a pixel would be assessed to determine if it is suitable for a building structure. This check would be performed by identifying whether the slope value is less than or greater than a threshold value. However, in reality such a crisp classification process is likely not a true reflection of real world scenarios. For example, using the crisp methods a difference of 1° in slope could result in one region in a dataset being deemed suitable and its neighboring region being seen as not suitable. It is likely, however, that there is in reality little difference in the actual suitability of these two neighboring regions. A more suitable classification process may be the use of fuzzy set theory, whereby each region is seen as having a degree of membership to a number of sets (or classifications). In the above example, the two regions would likely be seen as having very similar membership values to the different sets, although this is obviously dependent on factors such as the extent of each region. The purpose of this study is to identify to what extent the use of explicit boundary values affects the overall building footprint dataset extracted. By performing the analysis multiple times using differing threshold values for rules, it is possible to compare the resultant datasets and thus identify the impact of using such classification procedures. If a significant difference is found between the resultant datasets, this would highlight that the use of such crisp methods in the extraction processes may not be optimal and that a future enhancement to the method would be to consider the use of fuzzy classification methods.
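
    The contrast between the crisp rule and a fuzzy alternative is easy to make concrete; a sketch with a hypothetical slope rule, where both the cutoff and the transition width are invented for illustration.

        def crisp_suitable(slope_deg, cutoff=10.0):
            """Crisp rule: one degree either side of the cutoff flips the class."""
            return slope_deg <= cutoff

        def fuzzy_suitable(slope_deg, cutoff=10.0, width=4.0):
            """Trapezoidal membership: suitability in [0, 1] with a linear
            transition of +/- width/2 around the cutoff."""
            lo, hi = cutoff - width / 2.0, cutoff + width / 2.0
            if slope_deg <= lo:
                return 1.0
            if slope_deg >= hi:
                return 0.0
            return (hi - slope_deg) / width

        print(crisp_suitable(9.5), crisp_suitable(10.5))   # True False
        print(fuzzy_suitable(9.5), fuzzy_suitable(10.5))   # 0.625 0.375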

  10. Near-threshold neutral pion electroproduction at high momentum transfers and generalized form factors

    NASA Astrophysics Data System (ADS)

    Khetarpal, P.; Stoler, P.; Aznauryan, I. G.; Kubarovsky, V.; Adhikari, K. P.; Adikaram, D.; Aghasyan, M.; Amaryan, M. J.; Anderson, M. D.; Anefalos Pereira, S.; Anghinolfi, M.; Avakian, H.; Baghdasaryan, H.; Ball, J.; Baltzell, N. A.; Battaglieri, M.; Batourine, V.; Bedlinskiy, I.; Biselli, A. S.; Bono, J.; Boiarinov, S.; Briscoe, W. J.; Brooks, W. K.; Burkert, V. D.; Carman, D. S.; Celentano, A.; Charles, G.; Cole, P. L.; Contalbrigo, M.; Crede, V.; D'Angelo, A.; Dashyan, N.; De Vita, R.; De Sanctis, E.; Deur, A.; Djalali, C.; Doughty, D.; Dugger, M.; Dupre, R.; Egiyan, H.; El Alaoui, A.; El Fassi, L.; Eugenio, P.; Fedotov, G.; Fegan, S.; Fersch, R.; Fleming, J. A.; Fradi, A.; Gabrielyan, M. Y.; Garçon, M.; Gevorgyan, N.; Gilfoyle, G. P.; Giovanetti, K. L.; Girod, F. X.; Goetz, J. T.; Gohn, W.; Golovatch, E.; Gothe, R. W.; Griffioen, K. A.; Guegan, B.; Guidal, M.; Guo, L.; Hafidi, K.; Hakobyan, H.; Hanretty, C.; Harrison, N.; Hicks, K.; Ho, D.; Holtrop, M.; Hyde, C. E.; Ilieva, Y.; Ireland, D. G.; Ishkhanov, B. S.; Isupov, E. L.; Jo, H. S.; Joo, K.; Keller, D.; Khandaker, M.; Kim, A.; Kim, W.; Klein, F. J.; Koirala, S.; Kubarovsky, A.; Kuleshov, S. V.; Kvaltine, N. D.; Lewis, S.; Livingston, K.; Lu, H. Y.; MacGregor, I. J. D.; Mao, Y.; Martinez, D.; Mayer, M.; McKinnon, B.; Meyer, C. A.; Mineeva, T.; Mirazita, M.; Mokeev, V.; Montgomery, R. A.; Moutarde, H.; Munevar, E.; Munoz Camacho, C.; Nadel-Turonski, P.; Nasseripour, R.; Niccolai, S.; Niculescu, G.; Niculescu, I.; Osipenko, M.; Ostrovidov, A. I.; Pappalardo, L. L.; Paremuzyan, R.; Park, K.; Park, S.; Pasyuk, E.; Phelps, E.; Phillips, J. J.; Pisano, S.; Pogorelko, O.; Pozdniakov, S.; Price, J. W.; Procureur, S.; Protopopescu, D.; Puckett, A. J. R.; Raue, B. A.; Ricco, G.; Rimal, D.; Ripani, M.; Rosner, G.; Rossi, P.; Sabatié, F.; Saini, M. S.; Salgado, C.; Saylor, N. A.; Schott, D.; Schumacher, R. A.; Seder, E.; Seraydaryan, H.; Sharabian, Y. G.; Smith, G. D.; Sober, D. I.; Sokhan, D.; Stepanyan, S. S.; Stepanyan, S.; Strakovsky, I. I.; Strauch, S.; Taiuti, M.; Tang, W.; Taylor, C. E.; Tkachenko, S.; Ungaro, M.; Vernarsky, B.; Voskanyan, H.; Voutier, E.; Walford, N. K.; Weinstein, L. B.; Weygand, D. P.; Wood, M. H.; Zachariou, N.; Zhang, J.; Zhao, Z. W.; Zonta, I.

    2013-04-01

    We report the measurement of near-threshold neutral pion electroproduction cross sections and the extraction of the associated structure functions on the proton in the kinematic range Q^2 from 2 to 4.5 GeV^2 and W from 1.08 to 1.16 GeV. These measurements allow us to access the dominant pion-nucleon s-wave multipoles E_0+ and S_0+ in the near-threshold region. In the light-cone sum-rule framework (LCSR), these multipoles are related to the generalized form factors G_1^π0p(Q^2) and G_2^π0p(Q^2). The data are compared to these generalized form factors and the results for G_1^π0p(Q^2) are found to be in good agreement with the LCSR predictions, but the level of agreement with G_2^π0p(Q^2) is poor.

  11. The Evolution of Utilizing Manual Throttles to Avoid Excessively Low LH2 NPSP at the SSME Inlet

    NASA Technical Reports Server (NTRS)

    Henfling, Rick

    2011-01-01

    In the late 1970s, years before the Space Shuttle flew its maiden voyage, it was understood that low liquid hydrogen (LH2) Net Positive Suction Pressure (NPSP) at the inlet to the Space Shuttle Main Engine (SSME) could have adverse effects on engine operation. A number of failures within both the External Tank (ET) and the Orbiter Main Propulsion System (MPS) could result in a low LH2 NPSP condition, which at extremely low levels can result in cavitation of SSME turbomachinery. Operational workarounds were developed to take advantage of the onboard crew's ability to manually throttle down the SSMEs (via the Pilot's Speedbrake/Throttle Controller), which alleviated the low LH2 NPSP condition. Manually throttling the SSME to a lower power level resulted in an increase in NPSP, mainly due to the reduction in frictional flow losses while at the lower throttle setting. Early in the Space Shuttle Program's history, the relevant Flight Rule for the Booster flight controllers in Mission Control did not distinguish between ET and Orbiter MPS failures and the same crew action was taken for both. However, after a review of all Booster operational techniques following the Challenger disaster in the late 1980s, it was determined that manually throttling the SSME to a lower power was only effective for Orbiter MPS failures and the Flight Rule was updated to reflect this change. The Flight Rule and associated crew actions initially called for a single throttle step to minimum power level when a low threshold for NPSP was met. As engineers refined their understanding of the NPSP requirements for the SSME (through a robust testing program), the operational techniques evolved to take advantage of the additional capabilities. This paper will examine the evolution of the Flight Rule and associated procedure and how increases in knowledge about the SSME and the Space Shuttle vehicle as a whole have helped shape their development. What once was a single throttle step when NPSP decreased to a certain threshold has now become three throttle steps, each occurring at a lower NPSP threshold. Additionally, the procedure, which for early Space Shuttle missions required a Return-to-Launch-Site abort, now results in a nominal Main Engine Cut Off and no loss of mission objectives.

  12. The Evolution of Utilizing Manual Throttling to Avoid Excessively Low LH2 NPSP at the SSME Inlet

    NASA Technical Reports Server (NTRS)

    Henfling, Rick

    2010-01-01

    In the late 1970s, years before the Space Shuttle flew its maiden voyage, it was understood that low liquid hydrogen (LH2) Net Positive Suction Pressure (NPSP) at the inlet to the Space Shuttle Main Engine (SSME) could have adverse effects on engine operation. A number of failures within both the External Tank (ET) and the Orbiter Main Propulsion System (MPS) could result in a low LH2 NPSP condition, which at extremely low levels can result in cavitation of SSME turbomachinery. Operational workarounds were developed to take advantage of the onboard crew's ability to manually throttle down the SSMEs (via the Pilot's Speedbrake/Throttle Controller), which alleviated the low LH2 NPSP condition. Manually throttling the SSME to a lower power level resulted in an increase in NPSP, mainly due to the reduction in frictional flow losses while at the lower throttle setting. Early in the Space Shuttle Program's history, the relevant Flight Rule for the Booster flight controllers in Mission Control did not distinguish between ET and Orbiter MPS failures, and the same crew action was taken for both. However, after a review of all Booster operational techniques following the Challenger disaster in the late 1980s, it was determined that manually throttling the SSME to a lower power level was only effective for Orbiter MPS failures, and the Flight Rule was updated to reflect this change. The Flight Rule and associated crew actions initially called for a single throttle step to minimum power level when a low threshold for NPSP was met. As engineers refined their understanding of the NPSP requirements for the SSME (through a robust testing program), the operational techniques evolved to take advantage of the additional capabilities. This paper will examine the evolution of the Flight Rule and associated procedure, and how increases in knowledge about the SSME and the Space Shuttle vehicle as a whole have helped shape their development. What once was a single throttle step when NPSP decreased to a certain low threshold has now become three throttle steps, each occurring at a lower NPSP threshold. Additionally, the procedure, which for early Space Shuttle missions required a Return-to-Launch-Site abort, now results in a nominal Main Engine Cut Off and no loss of mission objectives.

  13. Feasibility study: refinement of the TTC concept by additional rules based on in silico and experimental data.

    PubMed

    Hauge-Nilsen, Kristin; Keller, Detlef

    2015-01-01

    Starting from a single generic limit value, the threshold of toxicological concern (TTC) concept has been further developed over the years, e.g., by including differentiated structural classes according to the rules of Cramer et al. (Food Chem Toxicol 16: 255-276, 1978). In practice, the refined TTC concept of Munro et al. (Food Chem Toxicol 34: 829-867, 1996) is often applied. The purpose of this work was to explore the possibility of refining the concept by introducing additional structure-activity relationships and available toxicity data. Computer modeling was performed using the OECD Toolbox. No observed (adverse) effect level (NO(A)EL) data of 176 substances were collected in a basic data set. New subgroups were created applying the following criteria: extended Cramer rules, low bioavailability, low acute toxicity, no protein binding affinity, and consideration of predicted liver metabolism. The highest TTC limit value of 236 µg/kg/day was determined for a subgroup that combined the criteria "no protein binding affinity" and "predicted liver metabolism." This value was approximately eight times higher than the original Cramer class 1 limit value of 30 µg/kg/day. The results of this feasibility study indicate that inclusion of the proposed criteria may lead to improved TTC values. Thereby, the applicability of the TTC concept in risk assessment could be extended which could reduce the need to perform animal tests.

  14. 78 FR 75644 - Self-Regulatory Organizations; BOX Options Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... Organizations; BOX Options Exchange LLC; Notice of Filing and Immediate Effectiveness of a Proposed Rule Change... thresholds required to achieve each volume tier and will issue an information circular to inform Participants... calculation is pro-competitive and will result in lower total costs to Participants, a positive...

  15. "Think Like a Lawyer" Using a Legal Reasoning Grid and Criterion-Referenced Assessment Rubric on IRAC (Issue, Rule, Application, Conclusion)

    ERIC Educational Resources Information Center

    Burton, Kelley

    2017-01-01

    The Australian Learning and Teaching Council's Bachelor of Laws Learning and Teaching Academic Standards Statement identified "thinking skills" as one of the six threshold learning outcomes for a Bachelor of Laws Program, which reinforced the significance of learning, teaching and assessing "thinking skills" in law schools…

  16. 77 FR 30359 - Defense Federal Acquisition Regulation Supplement: New Free Trade Agreement With Colombia (DFARS...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-22

    ... (see FAR 25.408). II. Discussion and Analysis This interim rule adds Colombia to the definition of...,000 for construction. Because the Colombia FTA construction threshold of $7,777,000 is the same as the... states that acquisitions that do not exceed $150,000 (with some exceptions) are automatically reserved...

  17. 78 FR 52679 - Safety Standard for Cigarette Lighters; Adjusted Customs Value for Cigarette Lighters

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-26

    ... Customs Value for Cigarette Lighters AGENCY: Consumer Product Safety Commission. ACTION: Final rule... refillable lighters that use butane or similar fuels and have a Customs Value or ex-factory price below a threshold value (initially set at $2.00 in 1993). The standard provides that the initial $2.00 value adjusts...

  18. 15 CFR 801.11 - Rules and regulations for the BE-80, Benchmark Survey of Financial Services Transactions Between...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... companies that own or influence, and are principally engaged in making management decisions for these firms... intermediary, and who had transactions (either sales or purchases) directly with unaffiliated foreign persons... providers or intermediaries. Because the $3,000,000 threshold applies separately to sales and purchases, the...

  19. 77 FR 56739 - Federal Acquisition Regulation; United States-Korea Free Trade Agreement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-13

    ...DoD, GSA, and NASA are adopting as final, without change, an interim rule amending the Federal Acquisition Regulation (FAR) to implement the United States-Korea Free Trade Agreement. The Republic of Korea is already party to the World Trade Organization Government Procurement Agreement, but this trade agreement implements a lower procurement threshold.

  20. A cell-based study on pedestrian acceleration and overtaking in a transfer station corridor

    NASA Astrophysics Data System (ADS)

    Ji, Xiangfeng; Zhou, Xuemei; Ran, Bin

    2013-04-01

    Pedestrian speed in a transfer station corridor is higher than usual, and some pedestrians can even be observed running. In this paper, pedestrians are divided into two categories: aggressive and conservative. Aggressive pedestrians weaving their way through the crowd in the corridor are the object of this study. During recent decades, much attention has been paid to pedestrian behaviors such as overtaking (and deceleration) and collision avoidance, and that work is continued here. After analyzing the characteristics of pedestrian flow in a transfer station corridor, a cell-based model is presented that includes acceleration (and deceleration) and overtaking analysis. Acceleration (and deceleration) in the corridor is fixed according to Newton's law, and the speed calculated with a kinematic formula is then discretized into cells based on fuzzy logic. After the speed is updated, overtaking is analyzed explicitly, based on the updated speed and force, in contrast to rule-based models, which we herein call implicit ones. In the overtaking analysis, a threshold value determining the overtaking direction is introduced. The model in this paper is thus a two-step one: the first step updates the speed, i.e., the number of cells the pedestrian can traverse in one time interval, and the second step analyzes the overtaking. Finally, a comparison among rule-based cellular automata, the model in this paper, and data in HCM 2000 demonstrates that our model achieves a reasonable simulation of acceleration (and deceleration) and overtaking among pedestrians.
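
    As a toy illustration of the two-step structure just described (and only that; the paper's fuzzy membership functions, force terms, and calibrated cell size are not reproduced here), a schematic Python sketch with invented parameter values:

    def cells_per_step(v, a, dt=0.5, cell=0.4):
        # step 1 (schematic): kinematic speed update, then discretize the
        # advance into whole cells; the paper uses fuzzy membership
        # functions rather than simple rounding
        v_new = v + a * dt
        return v_new, round(v_new * dt / cell)

    def overtake_side(gap_left, gap_right, threshold=2):
        # step 2 (schematic): overtake only when one side offers at least
        # `threshold` more free cells than the other; otherwise stay in lane
        if abs(gap_left - gap_right) < threshold:
            return 'stay'
        return 'left' if gap_left > gap_right else 'right'

    v, n_cells = cells_per_step(v=1.6, a=0.4)
    print(n_cells, overtake_side(gap_left=5, gap_right=2))   # 2 left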

  1. Modeling the interannual variability of microbial quality metrics of irrigation water in a Pennsylvania stream.

    PubMed

    Hong, Eun-Mi; Shelton, Daniel; Pachepsky, Yakov A; Nam, Won-Ho; Coppock, Cary; Muirhead, Richard

    2017-02-01

    Knowledge of the microbial quality of irrigation waters is extremely limited. For this reason, the US FDA has promulgated the Produce Rule, mandating the testing of irrigation water sources for many farms. The rule requires the collection and analysis of at least 20 water samples over two to four years to adequately evaluate the quality of water intended for produce irrigation. The objective of this work was to evaluate the effect of interannual weather variability on surface water microbial quality. We used the Soil and Water Assessment Tool model to simulate E. coli concentrations in the Little Cove Creek; this is a perennial creek located in an agricultural watershed in south-eastern Pennsylvania. The model performance was evaluated using the US FDA regulatory microbial water quality metrics of geometric mean (GM) and the statistical threshold value (STV). Using the 90-year time series of weather observations, we simulated and randomly sampled the time series of E. coli concentrations. We found that weather conditions of a specific year may strongly affect the evaluation of microbial quality and that the long-term assessment of microbial water quality may be quite different from the evaluation based on short-term observations. The variations in microbial concentrations and water quality metrics were affected by location, wetness of the hydrological years, and seasonality, with 15.7-70.1% of samples exceeding the regulatory threshold. The results of this work demonstrate the value of using modeling to design and evaluate monitoring protocols to assess the microbial quality of water used for produce irrigation. Copyright © 2016 Elsevier Ltd. All rights reserved.
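
    The two regulatory metrics named above can be computed directly from a sample set. A minimal sketch assuming the usual Produce Safety Rule formulas, with GM as the geometric mean and STV as the modeled 90th percentile of a lognormal fit; the sample values are invented:

    import numpy as np

    def gm_stv(cfu_per_100ml):
        # log10-transform; GM is the geometric mean, STV the modeled 90th
        # percentile of a lognormal fit (z at the 90th percentile ~ 1.282)
        logs = np.log10(np.asarray(cfu_per_100ml, dtype=float))
        gm = 10 ** logs.mean()
        stv = 10 ** (logs.mean() + 1.282 * logs.std(ddof=1))
        return gm, stv

    samples = [120, 80, 300, 45, 210, 150, 95, 60, 400, 130,
               75, 180, 55, 260, 110, 90, 140, 70, 320, 100]
    gm, stv = gm_stv(samples)
    print(f"GM = {gm:.0f} CFU/100 mL, STV = {stv:.0f} CFU/100 mL")
    # the FDA criteria compare these to 126 (GM) and 410 (STV) CFU/100 mL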

  2. Demonstration of an anti-hyperalgesic effect of a novel pan-Trk inhibitor PF-06273340 in a battery of human evoked pain models.

    PubMed

    Loudon, Peter; Siebenga, Pieter; Gorman, Donal; Gore, Katrina; Dua, Pinky; van Amerongen, Guido; Hay, Justin L; Groeneveld, Geert Jan; Butt, Richard P

    2018-02-01

    Inhibitors of nerve growth factor (NGF) reduce pain in several chronic pain indications. NGF signals through tyrosine kinase receptors of the tropomyosin-related kinase (Trk) family and the unrelated p75 receptor. PF-06273340 is a small molecule inhibitor of Trks A, B and C that reduces pain in nonclinical models, and the present study aimed to investigate the pharmacodynamics of this first-in-class molecule in humans. A randomized, double-blind, single-dose, placebo- and active-controlled five-period crossover study was conducted in healthy human subjects (NCT02260947). Subjects received five treatments: PF-06273340 50 mg, PF-06273340 400 mg, pregabalin 300 mg, ibuprofen 600 mg and placebo. The five primary endpoints were the pain detection threshold for the thermal pain tests and the pain tolerance threshold for the cold pressor, electrical stair and pressure pain tests. The trial had predefined decision rules based on 95% confidence that the PF-06273340 effect was better than that of placebo. Twenty subjects entered the study, with 18 completing all five periods. The high dose of PF-06273340 met the decision rules on the ultraviolet (UV) B skin thermal pain endpoint [least squares (LS) mean vs. placebo: 1.13, 95% confidence interval: 0.64-1.61], but not on the other four primary endpoints. The low dose did not meet the decision criteria for any of the five primary endpoints. Pregabalin (cold pressor and electrical stair tests) and ibuprofen (UVB thermal pain) showed significant analgesic effects on expected endpoints. The study demonstrated, for the first time, the translation of nonclinical effects into man in an inflammatory pain analgesic pharmacodynamic endpoint using a pan-Trk inhibitor. © 2017 The British Pharmacological Society.

  3. Twofold processing for denoising ultrasound medical images.

    PubMed

    Kishore, P V V; Kumar, K V V; Kumar, D Anil; Prasad, M V D; Goutham, E N D; Rahul, R; Krishna, C B S Vamsi; Sandeep, Y

    2015-01-01

    Ultrasound medical (US) imaging non-invasively pictures the inside of a human body for disease diagnostics. Speckle noise attacks ultrasound images, degrading their visual quality. A twofold processing algorithm is proposed in this work to reduce this multiplicative speckle noise. The first fold uses block-based thresholding, both hard (BHT) and soft (BST), on pixels in the wavelet domain with 8, 16, 32 and 64 non-overlapping block sizes. This first-fold process reduces speckle effectively but also blurs the object of interest. The second fold then restores object boundaries and texture with adaptive wavelet fusion. Restoration of the degraded object in the block-thresholded US image is carried out through wavelet-coefficient fusion of the object in the original US image and the block-thresholded US image. Fusion rules and wavelet decomposition levels are made adaptive for each block using gradient histograms with the normalized differential mean (NDF), to introduce the highest level of contrast between the denoised pixels and the object pixels in the resultant image. The proposed twofold methods are thus named adaptive NDF block fusion with hard and soft thresholding (ANBF-HT and ANBF-ST). The results indicate visual quality improvement to an interesting level with the proposed twofold processing, where the first fold removes noise and the second fold restores object properties. Peak signal to noise ratio (PSNR), normalized cross correlation coefficient (NCC), edge strength (ES), image quality index (IQI) and structural similarity index (SSIM) measure the quantitative quality of the twofold processing technique. Validation of the proposed method is done by comparison with anisotropic diffusion (AD), total variational filtering (TVF) and empirical mode decomposition (EMD) for enhancement of US images. The US images are provided by AMMA hospital radiology labs at Vijayawada, India.
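
    A minimal sketch of the first fold only (block-based wavelet thresholding) using PyWavelets; the per-block universal threshold and block size are illustrative stand-ins for the paper's settings, and the adaptive NDF fusion stage is omitted:

    import numpy as np
    import pywt  # PyWavelets

    def block_threshold(subband, block=16, mode='soft'):
        # per-block robust noise estimate and universal threshold
        out = subband.copy()
        for i in range(0, subband.shape[0], block):
            for j in range(0, subband.shape[1], block):
                b = subband[i:i + block, j:j + block]
                sigma = np.median(np.abs(b)) / 0.6745
                t = sigma * np.sqrt(2.0 * np.log(b.size))
                out[i:i + block, j:j + block] = pywt.threshold(b, t, mode=mode)
        return out

    def denoise(image, wavelet='db4', level=2, block=16, mode='soft'):
        coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
        coeffs = [coeffs[0]] + [tuple(block_threshold(d, block, mode) for d in lvl)
                                for lvl in coeffs[1:]]
        return pywt.waverec2(coeffs, wavelet)

    noisy = np.random.default_rng(0).gamma(2.0, 0.5, (128, 128))  # speckle-like
    clean = denoise(noisy)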

  4. Auditory brainstem response latency in forward masking, a marker of sensory deficits in listeners with normal hearing thresholds

    PubMed Central

    Mehraei, Golbarg; Gallardo, Andreu Paredes; Shinn-Cunningham, Barbara G.; Dau, Torsten

    2017-01-01

    In rodent models, acoustic exposure too modest to elevate hearing thresholds can nonetheless cause auditory nerve fiber deafferentation, interfering with the coding of supra-threshold sound. Low-spontaneous rate nerve fibers, important for encoding acoustic information at supra-threshold levels and in noise, are more susceptible to degeneration than high-spontaneous rate fibers. The change in auditory brainstem response (ABR) wave-V latency with noise level has been shown to be associated with auditory nerve deafferentation. Here, we measured ABR in a forward masking paradigm and evaluated wave-V latency changes with increasing masker-to-probe intervals. In the same listeners, behavioral forward masking detection thresholds were measured. We hypothesized that 1) auditory nerve fiber deafferentation increases forward masking thresholds and increases wave-V latency and 2) a preferential loss of low-SR fibers results in a faster recovery of wave-V latency as the slow contribution of these fibers is reduced. Results showed that in young audiometrically normal listeners, a larger change in wave-V latency with increasing masker-to-probe interval was related to a greater effect of a preceding masker behaviorally. Further, the amount of wave-V latency change with masker-to-probe interval was positively correlated with the rate of change in forward masking detection thresholds. Although we cannot rule out central contributions, these findings are consistent with the hypothesis that auditory nerve fiber deafferentation occurs in humans and may predict how well individuals can hear in noisy environments. PMID:28159652

  5. Adaptive time-sequential binary sensing for high dynamic range imaging

    NASA Astrophysics Data System (ADS)

    Hu, Chenhui; Lu, Yue M.

    2012-06-01

    We present a novel image sensor for high dynamic range imaging. The sensor performs an adaptive one-bit quantization at each pixel, with the pixel output switched from 0 to 1 only if the number of photons reaching that pixel is greater than or equal to a quantization threshold. With an oracle knowledge of the incident light intensity, one can pick an optimal threshold (for that light intensity) and the corresponding Fisher information contained in the output sequence follows closely that of an ideal unquantized sensor over a wide range of intensity values. This observation suggests the potential gains one may achieve by adaptively updating the quantization thresholds. As the main contribution of this work, we propose a time-sequential threshold-updating rule that asymptotically approaches the performance of the oracle scheme. With every threshold mapped to a number of ordered states, the dynamics of the proposed scheme can be modeled as a parametric Markov chain. We show that the frequencies of different thresholds converge to a steady-state distribution that is concentrated around the optimal choice. Moreover, numerical experiments show that the theoretical performance measures (Fisher information and Cramér-Rao bounds) can be achieved by a maximum likelihood estimator, which is guaranteed to find the globally optimal solution due to the concavity of the log-likelihood functions. Compared with conventional image sensors and the strategy that utilizes a constant single-photon threshold considered in previous work, the proposed scheme attains orders of magnitude improvement in terms of sensor dynamic range.
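
    A small simulation conveys the flavor of threshold adaptation, though with an assumed up/down update rule rather than the paper's scheme; pushing P(output = 1) toward 1/2 parks the threshold near the Poisson median, i.e. close to the incident intensity:

    import numpy as np

    def simulate(lam=20.0, frames=2000, q0=1, qmax=200, seed=0):
        # one-bit pixel: output 1 iff the photon count reaches threshold q;
        # assumed rule (not the paper's): raise q after a '1', lower after a '0'
        rng = np.random.default_rng(seed)
        q, history = q0, []
        for _ in range(frames):
            bit = rng.poisson(lam) >= q
            q = min(q + 1, qmax) if bit else max(q - 1, 1)
            history.append(q)
        return np.array(history)

    traj = simulate()
    print("threshold settles near", traj[-200:].mean())   # hovers around lam = 20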

  6. Computational mate choice: theory and empirical evidence.

    PubMed

    Castellano, Sergio; Cadeddu, Giorgia; Cermelli, Paolo

    2012-06-01

    The present review is based on the thesis that mate choice results from information-processing mechanisms governed by computational rules and that, to understand how females choose their mates, we should identify the sources of information and how they are used to make decisions. We describe mate choice as a three-step computational process and for each step we present theories and review empirical evidence. The first step is a perceptual process. It describes the acquisition of evidence, that is, how females use multiple cues and signals to assign an attractiveness value to prospective mates (the preference function hypothesis). The second step is a decisional process. It describes the construction of the decision variable (DV), which integrates evidence (private information by direct assessment), priors (public information), and value (perceived utility) of prospective mates into a quantity that is used by a decision rule (DR) to produce a choice. We make the assumption that females are optimal Bayesian decision makers and we derive a formal model of DV that can explain the effects of preference functions, mate copying, social context, and females' state and condition on the patterns of mate choice. The third step of mating decision is a deliberative process that depends on the DRs. We identify two main categories of DRs (absolute and comparative rules), and review the normative models of mate sampling tactics associated with them. We highlight the limits of the normative approach and present a class of computational models (sequential-sampling models) that are based on the assumption that DVs accumulate noisy evidence over time until a decision threshold is reached. These models force us to rethink the dichotomy between comparative and absolute decision rules, between discrimination and recognition, and even between rational and irrational choice. Since they have a robust biological basis, we think they may represent a useful theoretical tool for behavioural ecologists interested in integrating proximate and ultimate causes of mate choice. Copyright © 2012 Elsevier B.V. All rights reserved.
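
    The sequential-sampling idea in the third step reduces to a compact simulation: noisy evidence accumulates until an accept or reject bound is hit. Drift, noise, and threshold values below are arbitrary:

    import numpy as np

    rng = np.random.default_rng(1)

    def sequential_choice(drift=0.1, noise=1.0, threshold=10.0, dt=1.0, tmax=5000):
        # accumulate noisy evidence about a prospective mate until the DV
        # crosses +threshold (accept) or -threshold (reject)
        dv, t = 0.0, 0
        while abs(dv) < threshold and t < tmax:
            dv += drift * dt + noise * np.sqrt(dt) * rng.normal()
            t += 1
        return ('accept' if dv >= threshold else 'reject'), t

    outcomes = [sequential_choice() for _ in range(1000)]
    accept_rate = np.mean([c == 'accept' for c, _ in outcomes])
    mean_rt = np.mean([t for _, t in outcomes])
    print(f"accept rate {accept_rate:.2f}, mean decision time {mean_rt:.0f} steps")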

  7. Process service quality evaluation based on Dempster-Shafer theory and support vector machine.

    PubMed

    Pei, Feng-Que; Li, Dong-Bo; Tong, Yi-Fei; He, Fei

    2017-01-01

    Human involvement influences traditional service quality evaluations, leading to low accuracy, poor reliability and weak predictability. This paper proposes a method employing a support vector machine (SVM) and Dempster-Shafer evidence theory to evaluate the service quality of a production process by handling a high number of input features with a small sample data set; the method is called SVMs-DS. Features that can affect production quality are extracted by a large number of sensors. Preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, the basic probability assignments (BPAs) are constructed, which can support the evaluation in both a qualitative and a quantitative way. The process service quality evaluation results are validated with Dempster's rules; the decision threshold to resolve conflicting results is generated from the three SVM models. A case study is presented to demonstrate the effectiveness of the SVMs-DS method.
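
    A minimal sketch of the fusion step: class-wise confidences from hypothetical SVM outputs are cast as basic probability assignments over a two-element frame and combined with Dempster's rule; the conflict mass K is the quantity a decision threshold would monitor:

    from itertools import product

    def dempster(m1, m2):
        # Dempster's rule over frozenset focal elements; masses on empty
        # intersections accumulate into the conflict K and are renormalized
        combined, conflict = {}, 0.0
        for (a, p), (b, q) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + p * q
            else:
                conflict += p * q
        return {k: v / (1.0 - conflict) for k, v in combined.items()}, conflict

    OK, BAD = frozenset({'ok'}), frozenset({'faulty'})
    ANY = OK | BAD                                  # ignorance mass
    m_svm1 = {OK: 0.6, BAD: 0.1, ANY: 0.3}          # hypothetical BPAs built
    m_svm2 = {OK: 0.5, BAD: 0.2, ANY: 0.3}          # from two SVM confidences
    fused, K = dempster(m_svm1, m_svm2)
    print(fused, "conflict:", round(K, 3))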

  8. Moral empiricism and the bias for act-based rules.

    PubMed

    Ayars, Alisabeth; Nichols, Shaun

    2017-10-01

    Previous studies on rule learning show a bias in favor of act-based rules, which prohibit intentionally producing an outcome but not merely allowing the outcome. Nichols, Kumar, Lopez, Ayars, and Chan (2016) found that exposure to a single sample violation in which an agent intentionally causes the outcome was sufficient for participants to infer that the rule was act-based. One explanation is that people have an innate bias to think rules are act-based. We suggest an alternative empiricist account: since most rules that people learn are act-based, people form an overhypothesis (Goodman, 1955) that rules are typically act-based. We report three studies that indicate that people can use information about violations to form overhypotheses about rules. In study 1, participants learned either three "consequence-based" rules that prohibited allowing an outcome or three "act-based" rules that prohibited producing the outcome; in a subsequent learning task, we found that participants who had learned three consequence-based rules were more likely to think that the new rule prohibited allowing an outcome. In study 2, we presented participants with either 1 consequence-based rule or 3 consequence-based rules, and we found that those exposed to 3 such rules were more likely to think that a new rule was also consequence-based. Thus, in both studies, it seems that learning 3 consequence-based rules generates an overhypothesis to expect new rules to be consequence-based. In a final study, we used a more subtle manipulation. We exposed participants to examples of act-based or accident-based (strict liability) laws and then had them learn a novel rule. We found that participants who were exposed to the accident-based laws were more likely to think a new rule was accident-based. The fact that participants' bias for act-based rules can be shaped by evidence from other rules supports the idea that the bias for act-based rules might be acquired as an overhypothesis from the preponderance of act-based rules. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. 78 FR 4725 - Escrow Requirements Under the Truth in Lending Act (Regulation Z)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-22

    ...The Bureau of Consumer Financial Protection (Bureau) is publishing a final rule that amends Regulation Z (Truth in Lending) to implement certain amendments to the Truth in Lending Act made by the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank Act). Regulation Z currently requires creditors to establish escrow accounts for higher-priced mortgage loans secured by a first lien on a principal dwelling. The rule implements statutory changes made by the Dodd-Frank Act that lengthen the time for which a mandatory escrow account established for a higher-priced mortgage loan must be maintained. The rule also exempts certain transactions from the statute's escrow requirement. The primary exemption applies to mortgage transactions extended by creditors that operate predominantly in rural or underserved areas, originate a limited number of first-lien covered transactions, have assets below a certain threshold, and do not maintain escrow accounts on mortgage obligations they currently service.

  10. 77 FR 30087 - Air Quality Designations for the 2008 Ozone National Ambient Air Quality Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-21

    ...This rule establishes initial air quality designations for most areas in the United States, including areas of Indian country, for the 2008 primary and secondary national ambient air quality standards (NAAQS) for ozone. The designations for several counties in Illinois, Indiana, and Wisconsin that the EPA is considering for inclusion in the Chicago nonattainment area will be designated in a subsequent action, no later than May 31, 2012. Areas designated as nonattainment are also being classified by operation of law according to the severity of their air quality problems. The classification categories are Marginal, Moderate, Serious, Severe, and Extreme. The EPA is establishing the air quality thresholds that define the classifications in a separate rule that the EPA is signing and publishing in the Federal Register on the same schedule as these designations. In accordance with that separate rule, six nonattainment areas in California are being reclassified to a higher classification.

  11. Epidemic spreading on preferred degree adaptive networks.

    PubMed

    Jolad, Shivakumar; Liu, Wenjia; Schmittmann, B; Zia, R K P

    2012-01-01

    We study the standard SIS model of epidemic spreading on networks where individuals have a fluctuating number of connections around a preferred degree κ. Using very simple rules for forming such preferred degree networks, we find some unusual statistical properties not found in familiar Erdős-Rényi or scale-free networks. By letting κ depend on the fraction of infected individuals, we model the behavioral changes in response to how the extent of the epidemic is perceived. In our models, the behavioral adaptations can be either 'blind' or 'selective', depending on whether a node adapts by cutting or adding links to randomly chosen partners or selectively, based on the state of the partner. For a frozen preferred network, we find that the infection threshold follows the heterogeneous mean field result λc/μ = ⟨κ⟩/⟨κ²⟩ and the phase diagram matches the predictions of the annealed adjacency matrix (AAM) approach. With 'blind' adaptations, although the epidemic threshold remains unchanged, the infection level is substantially affected, depending on the details of the adaptation. The 'selective' adaptive SIS models are the most interesting. Both the threshold and the level of infection change, controlled not only by how the adaptations are implemented but also by how often the nodes cut/add links (compared to the time scales of the epidemic spreading). A simple mean field theory is presented for the selective adaptations which captures the qualitative and some of the quantitative features of the infection phase diagram.
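
    The quoted mean-field threshold is easy to evaluate from a degree sequence; the Poisson degrees below are only a stand-in for an actual preferred degree network:

    import numpy as np

    def sis_threshold(degrees, mu=1.0):
        # heterogeneous mean-field result: lambda_c / mu = <k> / <k^2>
        k = np.asarray(degrees, dtype=float)
        return mu * k.mean() / (k ** 2).mean()

    rng = np.random.default_rng(0)
    degrees = rng.poisson(20, size=100000)   # stand-in degree fluctuations
    print(round(float(sis_threshold(degrees)), 4))
    # for Poisson(kappa), <k^2> = kappa^2 + kappa, so lambda_c ~ mu/(kappa + 1)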

  12. 78 FR 66661 - Restrictions on Sales of Assets of a Covered Financial Company by the Federal Deposit Insurance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-06

    ... notwithstanding the passage of any amount of time. The approach to determine whether a person has participated in... comparatively similar to the approach under Part 340. In the proposed rule, the dollar threshold for a... company's assets are being sold. The FDIC believes adopting this more stringent approach is consistent...

  13. 78 FR 26836 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-08

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-69503; File No. SR-NYSEArca-2013-44] Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule Change Amending Its Schedule of Fees and Charges for Exchange Services To Amend Step Up Tier 2 to Reduce the Volume Threshold Requirements Needed To...

  14. Evaluating Alerting and Guidance Performance of a UAS Detect-And-Avoid System

    NASA Technical Reports Server (NTRS)

    Lee, Seung Man; Park, Chunki; Thipphavong, David P.; Isaacson, Douglas R.; Santiago, Confesor

    2016-01-01

    A key challenge to the routine, safe operation of unmanned aircraft systems (UAS) is the development of detect-and-avoid (DAA) systems to aid the UAS pilot in remaining "well clear" of nearby aircraft. The goal of this study is to investigate the effect of alerting criteria and pilot response delay on the safety and performance of UAS DAA systems in the context of routine civil UAS operations in the National Airspace System (NAS). A NAS-wide fast-time simulation study was conducted to assess UAS DAA system performance with a large number of encounters and a broad set of DAA alerting and guidance system parameters. Three attributes of the DAA system were controlled as independent variables in the study to conduct trade-off analyses: UAS trajectory prediction method (dead-reckoning vs. intent-based), alerting time threshold (related to predicted time to loss of well clear, or LoWC), and alerting distance threshold (related to predicted Horizontal Miss Distance, or HMD). A set of metrics based on signal detection theory, such as the percentages of true positive, false positive, and missed alerts, together with analysis methods utilizing Receiver Operating Characteristic (ROC) curves, was proposed to evaluate the safety and performance of DAA alerting and guidance systems and aid development of DAA system performance standards. The effect of pilot response delay on the performance of DAA systems was evaluated using a DAA alerting and guidance model and a pilot model developed to support this study. A total of 18 fast-time simulations were conducted with nine different DAA alerting threshold settings and two different trajectory prediction methods, using recorded radar traffic from current Visual Flight Rules (VFR) operations, supplemented with DAA-equipped UAS traffic based on mission profiles modeling future UAS operations. Results indicate the DAA alerting distance threshold has a greater effect on DAA system performance than the DAA alerting time threshold or the ownship trajectory prediction method. Further analysis of the alert lead time (the time in advance of predicted loss of well clear at which a DAA alert is first issued) indicated a strong positive correlation between alert lead time and DAA system performance (i.e. the ability of the UAS pilot to maneuver the unmanned aircraft to remain well clear). While bigger distance thresholds had beneficial effects on alert lead time and missed alert rate, they also generated a higher rate of false alerts. In the design and development of DAA alerting and guidance systems, therefore, the positive and negative effects of false alerts and missed alerts should be carefully considered to achieve acceptable alerting system performance by balancing false and missed alerts. The results and methodology presented in this study are expected to help stakeholders, policymakers and standards committees define the appropriate setting of DAA system parameter thresholds for UAS that ensure safety while minimizing operational impacts to the NAS and equipage requirements for its users before DAA operational performance standards can be finalized.
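
    A minimal sketch of the signal-detection bookkeeping: each simulated encounter is scored by whether an alert was issued and whether well clear was actually going to be lost, and one (time, distance) threshold setting then maps to a point on an ROC-style plot (the encounter counts are invented):

    def alert_metrics(encounters):
        # encounters: iterable of (alerted, needed_alert) booleans, where
        # needed_alert is the ground truth that well clear would be lost
        tp = sum(a and n for a, n in encounters)
        fp = sum(a and not n for a, n in encounters)
        fn = sum((not a) and n for a, n in encounters)
        tn = sum((not a) and (not n) for a, n in encounters)
        tpr = tp / (tp + fn) if (tp + fn) else 0.0    # alerts issued when needed
        fpr = fp / (fp + tn) if (fp + tn) else 0.0    # nuisance alerts
        missed = fn / (tp + fn) if (tp + fn) else 0.0
        return tpr, fpr, missed

    enc = ([(True, True)] * 90 + [(False, True)] * 10 +
           [(True, False)] * 40 + [(False, False)] * 860)
    print(alert_metrics(enc))   # sweeping the thresholds traces the ROC curve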

  15. A generalized linear integrate-and-fire neural model produces diverse spiking behaviors.

    PubMed

    Mihalaş, Stefan; Niebur, Ernst

    2009-03-01

    For simulations of neural networks, there is a trade-off between the size of the network that can be simulated and the complexity of the model used for individual neurons. In this study, we describe a generalization of the leaky integrate-and-fire model that produces a wide variety of spiking behaviors while still being analytically solvable between firings. For different parameter values, the model produces spiking or bursting, tonic, phasic or adapting responses, depolarizing or hyperpolarizing after potentials and so forth. The model consists of a diagonalizable set of linear differential equations describing the time evolution of membrane potential, a variable threshold, and an arbitrary number of firing-induced currents. Each of these variables is modified by an update rule when the potential reaches threshold. The variables used are intuitive and have biological significance. The model's rich behavior does not come from the differential equations, which are linear, but rather from complex update rules. This single-neuron model can be implemented using algorithms similar to the standard integrate-and-fire model. It is a natural match with event-driven algorithms for which the firing times are obtained as a solution of a polynomial equation.
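
    A minimal Euler-integration sketch in the spirit of the model: linear subthreshold dynamics for the potential V, a moving threshold Theta and two spike-induced currents, with all four variables modified by update rules at a spike. Parameter values are illustrative, not the published sets:

    import numpy as np

    def glif(I_ext, dt=0.01, C=1.0, G=0.5, E_L=-70.0,
             a=0.01, b=0.1, theta_inf=-50.0, theta_reset=-60.0, V_reset=-70.0,
             k=(0.2, 2.0), R=(1.0, 1.0), A=(2.0, -1.0)):
        # linear ODEs between spikes; the rich behavior comes from the resets
        V, Theta, I1, I2, spikes = E_L, theta_inf, 0.0, 0.0, []
        for step, Ie in enumerate(I_ext):
            V += dt * ((Ie + I1 + I2 - G * (V - E_L)) / C)
            Theta += dt * (a * (V - E_L) - b * (Theta - theta_inf))
            I1 += dt * (-k[0] * I1)
            I2 += dt * (-k[1] * I2)
            if V >= Theta:                         # spike: apply update rules
                spikes.append(step * dt)
                I1, I2 = R[0] * I1 + A[0], R[1] * I2 + A[1]
                V, Theta = V_reset, max(theta_reset, Theta)
        return spikes

    spikes = glif(np.full(5000, 15.0))             # constant drive, 50 time units
    print(len(spikes), "spikes under constant drive")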

  16. A Generalized Linear Integrate-and-Fire Neural Model Produces Diverse Spiking Behaviors

    PubMed Central

    Mihalaş, Ştefan; Niebur, Ernst

    2010-01-01

    For simulations of neural networks, there is a trade-off between the size of the network that can be simulated and the complexity of the model used for individual neurons. In this study, we describe a generalization of the leaky integrate-and-fire model that produces a wide variety of spiking behaviors while still being analytically solvable between firings. For different parameter values, the model produces spiking or bursting, tonic, phasic or adapting responses, depolarizing or hyperpolarizing after potentials and so forth. The model consists of a diagonalizable set of linear differential equations describing the time evolution of membrane potential, a variable threshold, and an arbitrary number of firing-induced currents. Each of these variables is modified by an update rule when the potential reaches threshold. The variables used are intuitive and have biological significance. The model’s rich behavior does not come from the differential equations, which are linear, but rather from complex update rules. This single-neuron model can be implemented using algorithms similar to the standard integrate-and-fire model. It is a natural match with event-driven algorithms for which the firing times are obtained as a solution of a polynomial equation. PMID:18928368

  17. Detection algorithm for glass bottle mouth defect by continuous wavelet transform based on machine vision

    NASA Astrophysics Data System (ADS)

    Qian, Jinfang; Zhang, Changjiang

    2014-11-01

    An efficient algorithm based on the continuous wavelet transform combined with prior knowledge, which can be used to detect defects of the glass bottle mouth, is proposed. Firstly, under the condition of a ball integral light source, a perfect glass bottle mouth image is obtained by a Japanese Computar camera through an IEEE-1394b interface. A single-threshold method based on the gray level histogram is used to obtain the binary image of the glass bottle mouth. In order to efficiently suppress noise, a moving average filter is employed to smooth the histogram of the original glass bottle mouth image. The continuous wavelet transform is then applied to accurately determine the segmentation threshold. Mathematical morphology operations are used to get the normal binary bottle mouth mask. A glass bottle to be inspected is moved to the detection zone by a conveyor belt. Both the bottle mouth image and its binary image are obtained by the above method. The binary image is multiplied with the normal bottle mask to obtain a region of interest. Four parameters (number of connected regions, coordinates of the centroid position, diameter of the inner circle, and area of the annular region) can be computed from the region of interest. Glass bottle mouth detection rules are designed from these four parameters so as to accurately detect and identify the defect conditions of a glass bottle. Finally, glass bottles from the Coca-Cola Company are used to verify the proposed algorithm. The experimental results show that the proposed algorithm can accurately detect the defect conditions of the glass bottles, with a detection accuracy of 98%.
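
    A minimal sketch of the segmentation and rule-parameter stage using OpenCV; Otsu's method stands in for the paper's wavelet-refined histogram threshold, and the test image is synthetic:

    import cv2
    import numpy as np

    def inspect_mouth(gray, normal_mask, min_area=50):
        # segment, clean up with morphological opening, and restrict the
        # result to the normal-mouth mask (the region of interest)
        _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
        roi = cv2.bitwise_and(binary, normal_mask)
        # rule parameters: region count, centroids, annular area; the inner
        # circle diameter could be read off a distance transform (omitted)
        n, _, stats, centroids = cv2.connectedComponentsWithStats(roi)
        regions = [i for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] >= min_area]
        area = int(sum(stats[i, cv2.CC_STAT_AREA] for i in regions))
        return len(regions), centroids[regions], area

    gray = np.full((200, 200), 40, np.uint8)
    cv2.circle(gray, (100, 100), 80, 200, thickness=20)   # bright mouth ring
    mask = np.zeros_like(gray)
    cv2.circle(mask, (100, 100), 80, 255, thickness=30)   # normal-mouth mask
    print(inspect_mouth(gray, mask))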

  18. Environmental Statistics and Optimal Regulation

    PubMed Central

    2014-01-01

    Any organism is embedded in an environment that changes over time. The timescale for and statistics of environmental change, the precision with which the organism can detect its environment, and the costs and benefits of particular protein expression levels all will affect the suitability of different strategies (such as constitutive expression or graded response) for regulating protein levels in response to environmental inputs. We propose a general framework, here specifically applied to the enzymatic regulation of metabolism in response to changing concentrations of a basic nutrient, to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, respectively, and the costs associated with enzyme production. We use this framework to address three fundamental questions: (i) when a cell should prefer thresholding to a graded response; (ii) when there is a fitness advantage to implementing a Bayesian decision rule; and (iii) when retaining memory of the past provides a selective advantage. We specifically find that: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones. PMID:25254493
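
    A toy numerical rendering of questions (i) and (ii): a posterior from one noisy nutrient reading feeds a decision rule, and the convexity of the enzyme cost determines whether the optimal response is a threshold or graded. All distributions and costs are invented:

    import numpy as np

    def posterior_high(x, prior=0.5, mu_low=0.0, mu_high=1.0, sigma=0.5):
        # posterior that the nutrient is plentiful, from one noisy reading x
        like = lambda m: np.exp(-0.5 * ((x - m) / sigma) ** 2)
        return prior * like(mu_high) / (prior * like(mu_high)
                                        + (1 - prior) * like(mu_low))

    def optimal_expression(x, benefit=1.0, cost=0.4, convex_cost=False):
        # expected payoff p*benefit*e - cost(e): a linear cost makes the
        # optimum bang-bang (a threshold on the posterior); a convex cost
        # c*e^2 makes it graded, illustrating the convexity point
        p = posterior_high(x)
        if convex_cost:
            return min(1.0, p * benefit / (2.0 * cost))  # argmax of p*b*e - c*e**2
        return 1.0 if p * benefit > cost else 0.0

    for x in (-0.5, 0.2, 0.8, 1.5):
        print(f"x={x:+.1f}  threshold={optimal_expression(x):.0f}  "
              f"graded={optimal_expression(x, convex_cost=True):.2f}")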

  19. An evaluation of inferential procedures for adaptive clinical trial designs with pre-specified rules for modifying the sample size.

    PubMed

    Levin, Gregory P; Emerson, Sarah C; Emerson, Scott S

    2014-09-01

    Many papers have introduced adaptive clinical trial methods that allow modifications to the sample size based on interim estimates of treatment effect. There has been extensive commentary on type I error control and efficiency considerations, but little research on estimation after an adaptive hypothesis test. We evaluate the reliability and precision of different inferential procedures in the presence of an adaptive design with pre-specified rules for modifying the sampling plan. We extend group sequential orderings of the outcome space based on the stage at stopping, likelihood ratio statistic, and sample mean to the adaptive setting in order to compute median-unbiased point estimates, exact confidence intervals, and P-values uniformly distributed under the null hypothesis. The likelihood ratio ordering is found to yield shorter confidence intervals on average and to produce higher probabilities of P-values below important thresholds than alternative approaches. The bias-adjusted mean demonstrates the lowest mean squared error among candidate point estimates. A conditional error-based approach in the literature has the benefit of being the only method that accommodates unplanned adaptations. We compare the performance of this and other methods in order to quantify the cost of failing to plan ahead in settings where adaptations could realistically be pre-specified at the design stage. We find the cost to be meaningful for all designs and treatment effects considered, and to be substantial for designs frequently proposed in the literature. © 2014, The International Biometric Society.

  20. A New Automatic Method of Urban Areas Mapping in East Asia from LANDSAT Data

    NASA Astrophysics Data System (ADS)

    XU, R.; Jia, G.

    2012-12-01

    Cities, as places where human activities are concentrated, account for a small percentage of global land cover but are frequently cited as chief causes of, and solutions to, climate, biogeochemistry, and hydrology processes at local, regional, and global scales. Accompanying uncontrolled economic growth, urban sprawl has been attributed to the accelerating integration of East Asia into the world economy and has involved dramatic changes in urban form and land use. To understand the impact of urban extent on biogeophysical processes, reliable mapping of built-up areas is particularly essential in East Asian cities, which are characterized by smaller patches, greater fragility, and a lower fraction of natural cover within the urban landscape than cities in the West. Segmentation of urban land from other land-cover types using remote sensing imagery can be done by standard classification processes as well as by a logic rule calculation based on spectral indices and their derivations. Efforts to establish such a logic rule that requires no threshold, enabling automatic mapping, are highly worthwhile. Existing automatic methods are reviewed, and then a proposed approach is introduced, including the calculation of the new index and the improved logic rule. Following this, existing automatic methods as well as the proposed approach are compared in a common context. Afterwards, the proposed approach is tested separately in cities of large, medium, and small scale in East Asia, selected from different LANDSAT images. The results are promising, as the approach can efficiently segment urban areas, even in the presence of more complex eastern cities. Key words: Urban extraction; Automatic Method; Logic Rule; LANDSAT images; East Asia. [Figure: the proposed approach applied to extraction of urban built-up areas in Guangzhou, China]
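
    For flavor, a common threshold-free logic rule of this general kind (not the paper's new index, which is not reproduced in the abstract) marks a pixel as built-up when the built-up index exceeds the vegetation index:

    import numpy as np

    def urban_mask(red, nir, swir):
        # built-up where NDBI - NDVI > 0; no scene-specific threshold needed
        ndvi = (nir - red) / (nir + red + 1e-9)
        ndbi = (swir - nir) / (swir + nir + 1e-9)
        return ndbi > ndvi

    # toy 2x2 scene, bands scaled to reflectance
    red  = np.array([[0.10, 0.25], [0.12, 0.30]])
    nir  = np.array([[0.45, 0.30], [0.50, 0.28]])
    swir = np.array([[0.20, 0.40], [0.22, 0.45]])
    print(urban_mask(red, nir, swir))   # True where built-up dominates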

  1. How mechanisms of perceptual decision-making affect the psychometric function.

    PubMed

    Gold, Joshua I; Ding, Long

    2013-04-01

    Psychometric functions are often interpreted in the context of Signal Detection Theory, which emphasizes a distinction between sensory processing and non-sensory decision rules in the brain. This framework has helped to relate perceptual sensitivity to the "neurometric" sensitivity of sensory-driven neural activity. However, perceptual sensitivity, as interpreted via Signal Detection Theory, is based on not just how the brain represents relevant sensory information, but also how that information is read out to form the decision variable to which the decision rule is applied. Here we discuss recent advances in our understanding of this readout process and describe its effects on the psychometric function. In particular, we show that particular aspects of the readout process can have specific, identifiable effects on the threshold, slope, upper asymptote, time dependence, and choice dependence of psychometric functions. To illustrate these points, we emphasize studies of perceptual learning that have identified changes in the readout process that can lead to changes in these aspects of the psychometric function. We also discuss methods that have been used to distinguish contributions of the sensory representation versus its readout to psychophysical performance. Copyright © 2012 Elsevier Ltd. All rights reserved.
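
    As a concrete reference for the parameters discussed above, a standard Weibull psychometric function for a two-alternative forced-choice task; this is a common parameterization, not one specific to the paper:

    import numpy as np

    def psychometric(x, alpha=1.0, beta=3.0, gamma=0.5, lam=0.02):
        # alpha: threshold; beta: slope; gamma: guess rate (0.5 in 2AFC);
        # lam: lapse rate, so 1 - lam is the upper asymptote
        return gamma + (1.0 - gamma - lam) * (1.0 - np.exp(-(x / alpha) ** beta))

    x = np.linspace(0.2, 3.0, 8)
    print(np.round(psychometric(x), 3))
    # readout changes map onto these parameters: e.g. suboptimal pooling of
    # the sensory representation raises alpha, while lapses depress the asymptote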

  2. Recruitment and Consolidation of Cell Assemblies for Words by Way of Hebbian Learning and Competition in a Multi-Layer Neural Network

    PubMed Central

    Garagnani, Max; Wennekers, Thomas; Pulvermüller, Friedemann

    2009-01-01

    Current cognitive theories postulate either localist representations of knowledge or fully overlapping, distributed ones. We use a connectionist model that closely replicates known anatomical properties of the cerebral cortex and neurophysiological principles to show that Hebbian learning in a multi-layer neural network leads to memory traces (cell assemblies) that are both distributed and anatomically distinct. Taking the example of word learning based on action-perception correlation, we document mechanisms underlying the emergence of these assemblies, especially (i) the recruitment of neurons and consolidation of connections defining the kernel of the assembly along with (ii) the pruning of the cell assembly’s halo (consisting of very weakly connected cells). We found that, whereas a learning rule mapping covariance led to significant overlap and merging of assemblies, a neurobiologically grounded synaptic plasticity rule with fixed LTP/LTD thresholds produced minimal overlap and prevented merging, exhibiting competitive learning behaviour. Our results are discussed in light of current theories of language and memory. As simulations with neurobiologically realistic neural networks demonstrate here spontaneous emergence of lexical representations that are both cortically dispersed and anatomically distinct, both localist and distributed cognitive accounts receive partial support. PMID:20396612
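
    A schematic contrast between the two learning rules discussed (a sketch with invented thresholds and rates, not the simulator's actual equations):

    import numpy as np

    def covariance_update(w, pre, post, eta=0.01):
        # covariance rule: correlated patterns potentiate broadly, which in
        # the paper produced overlapping, merging assemblies
        return w + eta * np.outer(post - post.mean(), pre - pre.mean())

    def fixed_threshold_update(w, pre, post, theta_ltp=0.8, theta_ltd=0.3,
                               d_ltp=0.01, d_ltd=0.005):
        # fixed LTP/LTD thresholds: potentiate only strongly co-active pairs,
        # depress presynaptically active pairs with weak postsynaptic drive;
        # this is the competitive behaviour that kept assemblies distinct
        pre_on = pre > theta_ltp
        ltp = np.outer(post > theta_ltp, pre_on)
        ltd = np.outer(post < theta_ltd, pre_on)
        return np.clip(w + d_ltp * ltp - d_ltd * ltd, 0.0, 1.0)

    rng = np.random.default_rng(0)
    w = rng.uniform(0.0, 0.1, size=(50, 50))
    for _ in range(200):
        pattern = (rng.random(50) < 0.1).astype(float)   # sparse 'word' pattern
        w = fixed_threshold_update(w, pattern, pattern)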

  3. Recruitment and Consolidation of Cell Assemblies for Words by Way of Hebbian Learning and Competition in a Multi-Layer Neural Network.

    PubMed

    Garagnani, Max; Wennekers, Thomas; Pulvermüller, Friedemann

    2009-06-01

    Current cognitive theories postulate either localist representations of knowledge or fully overlapping, distributed ones. We use a connectionist model that closely replicates known anatomical properties of the cerebral cortex and neurophysiological principles to show that Hebbian learning in a multi-layer neural network leads to memory traces (cell assemblies) that are both distributed and anatomically distinct. Taking the example of word learning based on action-perception correlation, we document mechanisms underlying the emergence of these assemblies, especially (i) the recruitment of neurons and consolidation of connections defining the kernel of the assembly along with (ii) the pruning of the cell assembly's halo (consisting of very weakly connected cells). We found that, whereas a learning rule mapping covariance led to significant overlap and merging of assemblies, a neurobiologically grounded synaptic plasticity rule with fixed LTP/LTD thresholds produced minimal overlap and prevented merging, exhibiting competitive learning behaviour. Our results are discussed in light of current theories of language and memory. As simulations with neurobiologically realistic neural networks demonstrate here spontaneous emergence of lexical representations that are both cortically dispersed and anatomically distinct, both localist and distributed cognitive accounts receive partial support.

  4. A universal hybrid decision tree classifier design for human activity classification.

    PubMed

    Chien, Chieh; Pottie, Gregory J

    2012-01-01

    A system that reliably classifies daily life activities can contribute to more effective and economical treatments for patients with chronic conditions or undergoing rehabilitative therapy. We propose a universal hybrid decision tree classifier for this purpose. The tree classifier can flexibly implement different decision rules at its internal nodes, and can be adapted from a population-based model when supplemented by training data for individuals. The system was tested using seven subjects each monitored by 14 triaxial accelerometers. Each subject performed fourteen different activities typical of daily life. Using leave-one-out cross validation, our decision tree produced average classification accuracies of 89.9%. In contrast, the MATLAB personalized tree classifiers using Gini's diversity index as the split criterion followed by optimally tuning the thresholds for each subject yielded 69.2%.
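
    A minimal sketch of the hybrid idea: each internal node carries its own decision rule as a plain callable, so a population-level rule can later be swapped for a subject-tuned one without changing the tree shape. Feature names are hypothetical:

    class Node:
        # leaf if label is set; otherwise route by this node's own rule
        def __init__(self, rule=None, children=None, label=None):
            self.rule, self.children, self.label = rule, children or {}, label

        def classify(self, x):
            if self.label is not None:
                return self.label
            return self.children[self.rule(x)].classify(x)

    # toy two-level tree over accelerometer-derived features
    still = Node(label='sitting')
    moving = Node(rule=lambda x: x['step_rate'] > 1.5,
                  children={True: Node(label='running'),
                            False: Node(label='walking')})
    root = Node(rule=lambda x: x['trunk_energy'] > 0.2,
                children={True: moving, False: still})
    print(root.classify({'trunk_energy': 0.5, 'step_rate': 2.0}))   # running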

  5. 75 FR 22706 - Defense Federal Acquisition Regulation Supplement; Service Contract Surveillance (DFARS Case 2008...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-30

    ...DoD is issuing a final rule amending the Defense Federal Acquisition Regulation Supplement (DFARS) to ensure that the requirement for a quality assurance surveillance plan is addressed for each contract with a dollar value above the simplified acquisition threshold, and that contracts for services have appropriate performance management or surveillance plans prepared for the work being performed under the contract.

  6. Searching for a policy to manage water stored in reservoir dams to support low flows. 1. Development of an alternating annual inflows-deficits model

    NASA Astrophysics Data System (ADS)

    Bocquillon, C.; Masson, J. M.

    1983-01-01

    Lack of water supply during periods of deficient flow affects the economic potential of the great river valleys, which are the most developed areas in the country. Reservoir dams built in the upper stream catchments store excess flow and provide controlled releases in the dry season. Capital costs of construction and the consequences of failures justify a thorough study of operating rules. The low flows and the conditional variability of water availability call for carry-over procedures (reservoir capacity is sometimes greater than the mean available water). It is not possible to predict the future sequence of flows; thus the carry-over rule is a statistical decision-making tool. The flow data are only one of the very many possible sources of information, but the analysis of flow data provides statistical measures to generate long series of synthetic inflows associated with summer deficits. A simplification has been introduced by choosing only the values which are absolutely necessary for optimal management research: available water volumes and reserve volumes for a flow threshold. Yearly alternating periods of excess and deficiency of water are defined by the values above and below a threshold of flow discharge at a gauging location named the "objective point", where the reservoir effects are to be estimated. Yearly periods are described by water volumes, either inflows into reservoirs or deficits below various thresholds of summer flow discharge. Marginal and conditional probability distributions of these volumes, and the physical laws which mark their bounds and relationships, were estimated on the basis of 31 years of daily flow records. The synthetic series simulated for 1000 years was compared to records of historical levels (since 1863). Extreme events, such as sequences of dry years, have return periods of comparable magnitude. This synthetic series has a statistical character similar to that of the short historical series and makes the analysis of operating rules possible.

  7. Capillary rise and swelling in cellulose sponges

    NASA Astrophysics Data System (ADS)

    Ha, Jonghyun; Kim, Jungchul; Kim, Ho-Young

    2015-11-01

    A cellulose sponge, which is a mundane example of a porous hydrophilic structure, can absorb and hold a significant amount of liquid. We present the results of an experimental and theoretical investigation of the dynamics of the capillary imbibition of various aqueous solutions in a sponge that swells at the same time. We find that the rate of water rise against the resistance caused by gravitational and viscous effects deviates from Washburn's rule beyond a certain threshold height. We rationalize the novel power law of the rise height versus time by combining Darcy's law with a hygroscopic swelling equation, and also predict the threshold height. The scaling law constructed through this work agrees well with the experimental results, shedding light on the physics of capillary flow in deforming porous media.
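
    For reference, the rigid-pore baseline that the swelling sponge departs from: Washburn's balance of capillary pressure against viscous drag (with gravity) in an effective-pore picture. This is a textbook sketch, not the paper's swelling-coupled law:

    \[
      \frac{dh}{dt} = \frac{r^{2}}{8\mu h}\left(\frac{2\gamma\cos\theta}{r} - \rho g h\right),
      \qquad
      h(t) \approx \sqrt{\frac{\gamma r \cos\theta}{2\mu}\,t} \quad (\text{early times}),
      \qquad
      h_{\max} = \frac{2\gamma\cos\theta}{\rho g r},
    \]

    where r is the effective pore radius, γ the surface tension, θ the contact angle, μ the viscosity and ρ the liquid density; the threshold height reported in the paper marks where swelling-induced changes in the pore structure make the rise deviate from this h ∝ √t law.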

  8. Discontinuous non-equilibrium phase transition in a threshold Schloegl model for autocatalysis: Generic two-phase coexistence and metastability

    DOE PAGES

    Wang, Chi-Jen; Liu, Da-Jiang; Evans, James W.

    2015-04-28

    Threshold versions of Schloegl's model on a lattice, which involve autocatalytic creation and spontaneous annihilation of particles, can provide a simple prototype for discontinuous non-equilibrium phase transitions. These models are equivalent to so-called threshold contact processes. A discontinuous transition between populated and vacuum states can occur selecting a threshold of N ≥ 2 for the minimum number, N, of neighboring particles enabling autocatalytic creation at an empty site. Fundamental open questions remain given the lack of a thermodynamic framework for analysis. For a square lattice with N = 2, we show that phase coexistence occurs not at a unique value but for a finite range of particle annihilation rate (the natural control parameter). This generic two-phase coexistence also persists when perturbing the model to allow spontaneous particle creation. Such behavior contrasts both the Gibbs phase rule for thermodynamic systems and also previous analysis for this model. We find metastability near the transition corresponding to a non-zero effective line tension, also contrasting previously suggested critical behavior. As a result, mean-field type analysis, extended to treat spatially heterogeneous states, further elucidates model behavior.
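
    A toy, discrete-time rendition of the N = 2 threshold rule on a periodic square lattice (synchronous updates and a made-up initial density, so it illustrates the dynamics rather than reproducing the paper's continuous-time model):

    import numpy as np

    def threshold_schloegl(p_annihilate, L=64, steps=400, N=2, seed=0):
        # an empty site fills only if >= N of its four neighbours are filled;
        # filled sites empty with probability p_annihilate per step
        rng = np.random.default_rng(seed)
        s = rng.random((L, L)) < 0.75          # start in the populated phase
        for _ in range(steps):
            nbrs = (np.roll(s, 1, 0).astype(int) + np.roll(s, -1, 0)
                    + np.roll(s, 1, 1) + np.roll(s, -1, 1))
            birth = (~s) & (nbrs >= N)
            death = s & (rng.random((L, L)) < p_annihilate)
            s = (s | birth) & ~death
        return float(s.mean())

    for p in (0.1, 0.3, 0.5):                  # sweep the control parameter
        print(p, round(threshold_schloegl(p), 3))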

  9. Discontinuous non-equilibrium phase transition in a threshold Schloegl model for autocatalysis: Generic two-phase coexistence and metastability

    NASA Astrophysics Data System (ADS)

    Wang, Chi-Jen; Liu, Da-Jiang; Evans, James W.

    2015-04-01

    Threshold versions of Schloegl's model on a lattice, which involve autocatalytic creation and spontaneous annihilation of particles, can provide a simple prototype for discontinuous non-equilibrium phase transitions. These models are equivalent to so-called threshold contact processes. A discontinuous transition between populated and vacuum states can occur selecting a threshold of N ≥ 2 for the minimum number, N, of neighboring particles enabling autocatalytic creation at an empty site. Fundamental open questions remain given the lack of a thermodynamic framework for analysis. For a square lattice with N = 2, we show that phase coexistence occurs not at a unique value but for a finite range of particle annihilation rate (the natural control parameter). This generic two-phase coexistence also persists when perturbing the model to allow spontaneous particle creation. Such behavior contrasts both the Gibbs phase rule for thermodynamic systems and also previous analysis for this model. We find metastability near the transition corresponding to a non-zero effective line tension, also contrasting previously suggested critical behavior. Mean-field type analysis, extended to treat spatially heterogeneous states, further elucidates model behavior.

  10. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification which became a necessity for OPC a decade ago, we see the same trend in MDP today. Simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask check, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose margin related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU-acceleration for geometry processing, and give examples of mask check results and performance data. GPU-acceleration is necessary to make simulation-based mask MDP verification acceptable.

  11. A New Approach to Threshold Attribute Based Signatures

    DTIC Science & Technology

    2011-01-01

    Inspired by developments in attribute-based encryption and signatures, there has recently been a spurt of progress in the direction of threshold attribute-based signatures (t-ABS). In this work we propose a novel approach to constructing threshold attribute-based signatures inspired by ring signatures. Threshold attribute-based signatures, defined by a (t, n) threshold predicate, ensure that the signer holds at least t out of a specified set of n attributes.

  12. 77 FR 52977 - Regulatory Capital Rules: Advanced Approaches Risk-Based Capital Rule; Market Risk Capital Rule

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ... Corporation 12 CFR Parts 324, 325 Regulatory Capital Rules: Advanced Approaches Risk-Based Capital Rule... 325 RIN 3064-AD97 Regulatory Capital Rules: Advanced Approaches Risk-Based Capital Rule; Market Risk... the agencies' current capital rules. In this NPR (Advanced Approaches and Market Risk NPR) the...

  13. Exploration of SWRL Rule Bases through Visualization, Paraphrasing, and Categorization of Rules

    NASA Astrophysics Data System (ADS)

    Hassanpour, Saeed; O'Connor, Martin J.; Das, Amar K.

    Rule bases are increasingly being used as repositories of knowledge content on the Semantic Web. As the size and complexity of these rule bases increases, developers and end users need methods of rule abstraction to facilitate rule management. In this paper, we describe a rule abstraction method for Semantic Web Rule Language (SWRL) rules that is based on lexical analysis and a set of heuristics. Our method results in a tree data structure that we exploit in creating techniques to visualize, paraphrase, and categorize SWRL rules. We evaluate our approach by applying it to several biomedical ontologies that contain SWRL rules, and show how the results reveal rule patterns within the rule base. We have implemented our method as a plug-in tool for Protégé-OWL, the most widely used ontology modeling software for the Semantic Web. Our tool can allow users to rapidly explore content and patterns in SWRL rule bases, enabling their acquisition and management.

  14. The game of making decisions under uncertainty: How sure must one be?

    NASA Astrophysics Data System (ADS)

    Werner, Micha; Verkade, Jan; Wetterhall, Fredrik; van Andel, Schalk-Jan; Ramos, Maria-Helena

    2016-04-01

    Probabilistic hydrometeorological forecasting is now widely accepted to be more skillful than deterministic forecasting, and is increasingly being integrated into operational practice. Provided they are reliable and unbiased, probabilistic forecasts have the advantage that they give the decision maker not only the forecast value but also the uncertainty associated with that prediction. Though that information provides more insight, it leaves the forecaster/decision maker with the challenge of deciding at what probability of a threshold being exceeded the decision to act should be taken. According to cost-loss theory, that probability should be related to the impact of the threshold being exceeded. However, it is not entirely clear how easy it is for decision makers to follow that rule, even when the impact of a threshold being exceeded and the actions to choose from are known. To continue the tradition in the "Ensemble Hydrometeorological Forecast" session, we will address the challenge of making decisions based on probabilistic forecasts through a game to be played with the audience. We will explore how the decisions made differ depending on the known impacts of the forecasted events. Participants will be divided into a number of groups with differing levels of impact, and will be faced with a number of forecast situations. They will be asked to make decisions and record the consequences of those decisions. A discussion of the differences in the decisions made will be presented at the end of the game, with a fuller analysis later posted on the HEPEX web site blog (www.hepex.org).
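
    The cost-loss rule referenced above has a compact form: with protection cost C and avoidable loss L, acting is worthwhile whenever the exceedance probability is at least C/L. A minimal sketch in Python (the function and numbers are illustrative):

    def should_act(p_exceed, cost, loss):
        # Classic cost-loss rule: protect when p >= C/L.
        return p_exceed >= cost / loss

    # Protection costs 10, an unprotected event costs 100: act when p >= 0.1.
    print(should_act(0.25, cost=10, loss=100))  # True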

  15. Extreme D'Hondt and round-off effects in voting computations

    NASA Astrophysics Data System (ADS)

    Konstantinov, M. M.; Pelova, G. B.

    2015-11-01

    D'Hondt (or Jefferson) method and Hare-Niemeyer (or Hamilton) method are widely used worldwide for seat allocation in proportional systems. Everything seems to be well known in this area. However, this is not the case. For example, the D'Hondt method can violate the quota rule from above, but this effect has not been analyzed as a function of the number of parties and/or the threshold used. Also, allocation methods are often implemented automatically as computer codes in machine arithmetic, in the belief that following the IEEE standards for double-precision binary arithmetic guarantees correct results. Unfortunately, correct results may fail to be obtained not only in double-precision arithmetic (usually producing 15-16 true decimal digits) but for any relative precision of the underlying binary machine arithmetic. This paper deals with the following new issues: (i) finding conditions (the threshold in particular) under which D'Hondt seat allocation maximally violates the quota rule, and (ii) analyzing the possible influence of rounding errors in the automatic implementation of the Hare-Niemeyer method in machine arithmetic. Concerning the first issue, it is known that the maximal deviation of the D'Hondt allocation from the upper quota for the Bulgarian proportional system (240 MPs and a 4% barrier) is 5. This fact was established in 1991. A classical treatment of voting issues is the monograph [1], while electoral problems specific to Bulgaria have been treated in [2, 4]. The effect of thresholds on extreme seat allocations is also analyzed in [3]. Finally, we would like to stress that voting theory may sometimes be mathematically trivial but always has great political impact. This is a strong motivation for further investigations in this area.
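
    For readers unfamiliar with the divisor method, the Python sketch below allocates seats by D'Hondt with an electoral threshold (the party names, vote counts, and 4% barrier are illustrative). Comparing the result with each party's exact quota (votes/total × seats) is the natural way to probe quota-rule violations.

    def dhondt(votes, seats, threshold=0.04):
        # Highest-averages allocation: each seat goes to the party with the
        # largest quotient v / (s + 1); parties under the threshold are excluded.
        total = sum(votes.values())
        eligible = {p: v for p, v in votes.items() if v / total >= threshold}
        alloc = {p: 0 for p in eligible}
        for _ in range(seats):
            winner = max(eligible, key=lambda p: eligible[p] / (alloc[p] + 1))
            alloc[winner] += 1
        return alloc

    print(dhondt({"A": 340_000, "B": 280_000, "C": 160_000, "D": 60_000}, seats=7))
    # {'A': 3, 'B': 3, 'C': 1, 'D': 0}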

  16. On Decision-Making Among Multiple Rule-Bases in Fuzzy Control Systems

    NASA Technical Reports Server (NTRS)

    Tunstel, Edward; Jamshidi, Mo

    1997-01-01

    Intelligent control of complex multi-variable systems can be a challenge for single fuzzy rule-based controllers. This class of problems can often be managed with less difficulty by distributing intelligent decision-making amongst a collection of rule-bases. Such an approach requires that a mechanism be chosen to ensure goal-oriented interaction between the multiple rule-bases. In this paper, a hierarchical rule-based approach is described. Decision-making mechanisms based on generalized concepts from single-rule-based fuzzy control are described. Finally, the effects of different aggregation operators on multi-rule-base decision-making are examined in a navigation control problem for mobile robots.

  17. Role of biomarkers in the management of antibiotic therapy: an expert panel review II: clinical use of biomarkers for initiation or discontinuation of antibiotic therapy

    PubMed Central

    2013-01-01

    Biomarker-guided initiation of antibiotic therapy has been studied in four conditions: acute pancreatitis, lower respiratory tract infection (LRTI), meningitis, and sepsis in the ICU. In pancreatitis with suspected infected necrosis, initiating antibiotics best relies on fine-needle aspiration and demonstration of infected material. We suggest that PCT be measured to help predict infection; however, available data are insufficient to decide on initiating antibiotics based on PCT levels. In adult patients suspected of community-acquired LRTI, we suggest withholding antibiotic therapy when the serum PCT level is low (<0.25 ng/mL); in patients having nosocomial LRTI, data are insufficient to recommend initiating therapy based on a single PCT level or even repeated measurements. For children with suspected bacterial meningitis, we recommend using a decision rule as an aid to therapeutic decisions, such as the Bacterial Meningitis Score or the Meningitest®; a single PCT level ≥0.5 ng/mL also may be used, but false negatives may occur. In adults with suspected bacterial meningitis, we suggest integrating serum PCT measurements into a clinical decision rule to help distinguish between viral and bacterial meningitis, using a 0.5 ng/mL threshold. For ICU patients suspected of community-acquired infection, we do not recommend using a threshold serum PCT value to help the decision to initiate antibiotic therapy; data are insufficient to recommend using PCT serum kinetics for the decision to initiate antibiotic therapy in patients suspected of ICU-acquired infection. In children, CRP can probably be used to help discontinue therapy, although the evidence is limited. In adults, antibiotic discontinuation can be based on an algorithm using repeated PCT measurements. In non-immunocompromised out- or inpatients treated for RTI, antibiotics can be discontinued if the PCT level at day 3 is <0.25 ng/mL or has decreased by >80-90%, whether or not microbiological documentation has been obtained. For ICU patients who have nonbacteremic sepsis from a known site of infection, antibiotics can be stopped if the PCT level at day 3 is <0.5 ng/mL or has decreased by >80% relative to the highest level recorded, irrespective of the severity of the infectious episode; in bacteremic patients, a minimal duration of therapy of 5 days is recommended. PMID:23830525
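
    The discontinuation thresholds quoted above are simple enough to express as a decision helper. The sketch below is illustrative only, restating the abstract's cutoffs; it is not clinical guidance, and the function name and the simplification to a single 80% decrease criterion are assumptions.

    def may_stop_antibiotics(pct_day3, pct_peak, icu_nonbacteremic_sepsis=False):
        # RTI rule: day-3 PCT < 0.25 ng/mL or a >80-90% decrease (80% used here).
        # ICU nonbacteremic sepsis rule: day-3 PCT < 0.5 ng/mL or a >80% decrease
        # from the highest recorded level.
        decrease = 1.0 - pct_day3 / pct_peak
        if icu_nonbacteremic_sepsis:
            return pct_day3 < 0.5 or decrease > 0.80
        return pct_day3 < 0.25 or decrease > 0.80

    print(may_stop_antibiotics(pct_day3=0.2, pct_peak=2.5))  # True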

  18. Mode Selection Rules for a Two-Delay System with Positive and Negative Feedback Loops

    NASA Astrophysics Data System (ADS)

    Takahashi, Kin'ya; Kobayashi, Taizo

    2018-04-01

    The mode selection rules for a two-delay system, which has negative feedback with a short delay time t1 and positive feedback with a long delay time t2, are studied numerically and theoretically. We find two types of mode selection rules depending on the strength of the negative feedback. When the strength of the negative feedback |α1| (α1 < 0) is sufficiently small compared with that of the positive feedback α2 (> 0), the (2m + 1)-th harmonic oscillation is well sustained in a neighborhood of t1/t2 = even/odd, i.e., the relevant condition. In a neighborhood of the irrelevant condition given by t1/t2 = odd/even or t1/t2 = odd/odd, higher harmonic oscillations are observed. However, if |α1| is slightly less than α2, a different mode selection rule works, where the condition t1/t2 = odd/even is relevant and the conditions t1/t2 = odd/odd and t1/t2 = even/odd are irrelevant. These mode selection rules differ from that of the normal two-delay system with two positive feedback loops, where t1/t2 = odd/odd is relevant and the others are irrelevant. The two types of mode selection rules are induced by individually different mechanisms controlling the Hopf bifurcation, i.e., the Hopf bifurcation controlled by the "boosted bifurcation process" and by the "anomalous bifurcation process", which occur for |α1| below and above the threshold value αth, respectively.

  19. Improving Cyber-Security of Smart Grid Systems via Anomaly Detection and Linguistic Domain Knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ondrej Linda; Todd Vollmer; Milos Manic

    The planned large scale deployment of smart grid network devices will generate a large amount of information exchanged over various types of communication networks. The implementation of these critical systems will require appropriate cyber-security measures. A network anomaly detection solution is considered in this work. In common network architectures multiple communication streams are simultaneously present, making it difficult to build an anomaly detection solution for the entire system. In addition, common anomaly detection algorithms require specification of a sensitivity threshold, which inevitably leads to a tradeoff between false-positive and false-negative rates. In order to alleviate these issues, this paper proposes a novel anomaly detection architecture. The designed system applies the previously developed network security cyber-sensor method to individual selected communication streams, allowing accurate normal network behavior models to be learned. Furthermore, the developed system dynamically adjusts the sensitivity threshold of each anomaly detection algorithm based on domain knowledge about the specific network system. It is proposed to model this domain knowledge using Interval Type-2 Fuzzy Logic rules, which linguistically describe the relationship between various features of the network communication and the possibility of a cyber attack. The proposed method was tested on an experimental smart grid system, demonstrating enhanced cyber-security.

  20. Numerical simulation study on thermal response of PBX 9501 to low velocity impact

    NASA Astrophysics Data System (ADS)

    Lou, Jianfeng; Zhou, Tingting; Zhang, Yangeng; Zhang, Xiaoli

    2017-01-01

    Impact sensitivity of solid high explosives, an important index in evaluating the safety and performance of explosives, is an important concern in handling, storage, and shipping procedures. Low-velocity impacts, such as those involved in traffic accidents or drops of charge pieces, pose a great threat to both bare explosives and shell charges. The Steven test is an effective tool for studying the relative sensitivity of various explosives. In this paper, we built a numerical simulation method for the Steven test that accounts for mechanical, thermal, and chemical properties, based on a thermo-mechanically coupled material model. In the model, the stress-strain relationship is described by a dynamic plasticity model, the impact-induced thermal response of the explosive is depicted by an isotropic thermal material model, the chemical reaction of the explosive is described by an Arrhenius reaction rate law, and the effects of heating and melting on the mechanical and thermal properties of the materials are also taken into account. For the standard Steven test, the thermal and mechanical response of PBX 9501 at various impact velocities was numerically analyzed, and the threshold velocity for explosive initiation was obtained, in good agreement with experimental results. In addition, the effect of the confinement condition of the test device on the threshold velocity was explored.

  1. 78 FR 803 - Revisions to Electric Reliability Organization Definition of Bulk Electric System and Rules of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-04

    ...In this Final Rule, pursuant to section 215 of the Federal Power Act, the Federal Energy Regulatory Commission (Commission) approves modifications to the currently-effective definition of "bulk electric system" developed by the North American Electric Reliability Corporation (NERC), the Commission-certified Electric Reliability Organization. The Commission finds that the modified definition of "bulk electric system" removes language allowing for regional discretion in the currently-effective bulk electric system definition and establishes a bright-line threshold that includes all facilities operated at or above 100 kV. The modified definition also identifies specific categories of facilities and configurations as inclusions and exclusions to provide clarity in the definition of "bulk electric system." In this Final Rule, the Commission also approves: NERC's revisions to its Rules of Procedure, which create an exception process to add elements to, or remove elements from, the definition of "bulk electric system" on a case-by-case basis; NERC's form entitled "Detailed Information To Support an Exception Request" that entities will use to support requests for exception from the "bulk electric system" definition; and NERC's implementation plan for the revised "bulk electric system" definition.

  2. Online Phase Detection Using Wearable Sensors for Walking with a Robotic Prosthesis

    PubMed Central

    Goršič, Maja; Kamnik, Roman; Ambrožič, Luka; Vitiello, Nicola; Lefeber, Dirk; Pasquini, Guido; Munih, Marko

    2014-01-01

    This paper presents a gait phase detection algorithm for providing feedback in walking with a robotic prosthesis. The algorithm utilizes the output signals of a wearable wireless sensory system incorporating sensorized shoe insoles and inertial measurement units attached to body segments. The principle of detecting transitions between gait phases is based on heuristic threshold rules, dividing a steady-state walking stride into four phases. For the evaluation of the algorithm, experiments with three amputees, walking with the robotic prosthesis and wearable sensors, were performed. Results show a high rate of successful detection for all four phases (the average success rate across all subjects >90%). A comparison of the proposed method to an off-line trained algorithm using hidden Markov models reveals a similar performance achieved without the need for learning dataset acquisition and previous model training. PMID:24521944
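
    Heuristic threshold rules of the kind described above can be written as a small finite-state machine. The sketch below cycles through four assumed phase names using hysteresis thresholds on normalized heel and toe insole loads; the sensor set, phase names, and threshold values are illustrative, not the paper's exact rules.

    def next_phase(phase, heel, toe, on=0.2, off=0.05):
        # Advance a four-phase gait cycle; 'on'/'off' hysteresis thresholds
        # are applied to normalized insole load signals in [0, 1].
        if phase == "swing" and heel > on:
            return "loading"      # heel strike
        if phase == "loading" and toe > on:
            return "mid_stance"   # foot flat
        if phase == "mid_stance" and heel < off:
            return "push_off"     # heel off
        if phase == "push_off" and toe < off:
            return "swing"        # toe off
        return phase              # otherwise stay in the current phase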

  3. A seasonal forecast scheme for the Inner Mongolia spring drought part-II: a logical reasoning evidence-based method for spring predictions

    NASA Astrophysics Data System (ADS)

    Gao, Tao; Wulan, Wulan; Yu, Xiao; Yang, Zelong; Gao, Jing; Hua, Weiqi; Yang, Peng; Si, Yaobing

    2018-05-01

    Spring precipitation is the predominant factor that controls meteorological drought in Inner Mongolia (IM), China. This study used the anomaly percentage of spring precipitation (PAP) as a drought index to measure spring drought. A scheme for forecasting seasonal drought was designed based on evidence of spring drought occurrence and the speculative reasoning methods introduced in computer artificial intelligence theory. Forecast signals with sufficient lead time for predictions of spring drought were extracted from eight crucial areas of the oceans and the 500-hPa geopotential height. Using standardized values, these signals were synthesized into three items of spring drought evidence (SDE) depending on their primary effects on three major atmospheric circulation components of spring precipitation in IM: the western Pacific subtropical high, the North Polar vortex, and the East Asian trough. Thresholds for the SDE were determined following numerical analyses of the influential factors. Furthermore, five logical reasoning rules for distinguishing the occurrence of SDE were designed after examining all possible combined cases. The degree of confidence in the rules was determined based on estimates of their prior probabilities. Then, an optimized logical reasoning scheme was identified for judging the possibility of spring drought. The scheme was successful in hindcast predictions of 11 of the 16 spring droughts (accuracy: 68.8%) that occurred during 1960-2009. Moreover, the accuracy ratio for the same period was 82.0% for the binary distinction between drought (PAP ≤ -20%) and no drought (PAP > -20%). Predictions for the recent 6-year period (2010-2015) demonstrated successful outcomes.

  4. The organization of societal conflicts by pavement ants Tetramorium caespitum: an agent-based model of amine-mediated decision making

    PubMed Central

    Hoover, Kevin M.; Bubak, Andrew N.; Law, Isaac J.; Yaeger, Jazmine D. W.; Renner, Kenneth J.; Swallow, John G.; Greene, Michael J.

    2016-01-01

    Abstract Ant colonies self-organize to solve complex problems despite the simplicity of an individual ant’s brain. Pavement ant Tetramorium caespitum colonies must solve the problem of defending the territory that they patrol in search of energetically rich forage. When members of 2 colonies randomly interact at the territory boundary a decision to fight occurs when: 1) there is a mismatch in nestmate recognition cues and 2) each ant has a recent history of high interaction rates with nestmate ants. Instead of fighting, some ants will decide to recruit more workers from the nest to the fighting location, and in this way a positive feedback mediates the development of colony wide wars. In ants, the monoamines serotonin (5-HT) and octopamine (OA) modulate many behaviors associated with colony organization and in particular behaviors associated with nestmate recognition and aggression. In this article, we develop and explore an agent-based model that conceptualizes how individual changes in brain concentrations of 5-HT and OA, paired with a simple threshold-based decision rule, can lead to the development of colony wide warfare. Model simulations do lead to the development of warfare with 91% of ants fighting at the end of 1 h. When conducting a sensitivity analysis, we determined that uncertainty in monoamine concentration signal decay influences the behavior of the model more than uncertainty in the decision-making rule or density. We conclude that pavement ant behavior is consistent with the detection of interaction rate through a single timed interval rather than integration of multiple interactions. PMID:29491915
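
    The two-condition decision rule described above (a cue mismatch plus a recent history of high nestmate interaction rates) can be sketched compactly; the window length, rate threshold, and action names below are illustrative assumptions.

    def decide(cue_mismatch, interaction_times, now, window_s=60.0, rate_thr=0.5):
        # Fight only when cues mismatch AND the nestmate-interaction rate
        # over the last window_s seconds exceeds rate_thr (per second).
        recent = [t for t in interaction_times if now - t <= window_s]
        rate = len(recent) / window_s
        if cue_mismatch and rate > rate_thr:
            return "fight"
        return "recruit" if cue_mismatch else "ignore"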

  5. The organization of societal conflicts by pavement ants Tetramorium caespitum: an agent-based model of amine-mediated decision making.

    PubMed

    Hoover, Kevin M; Bubak, Andrew N; Law, Isaac J; Yaeger, Jazmine D W; Renner, Kenneth J; Swallow, John G; Greene, Michael J

    2016-06-01

    Ant colonies self-organize to solve complex problems despite the simplicity of an individual ant's brain. Pavement ant Tetramorium caespitum colonies must solve the problem of defending the territory that they patrol in search of energetically rich forage. When members of 2 colonies randomly interact at the territory boundary a decision to fight occurs when: 1) there is a mismatch in nestmate recognition cues and 2) each ant has a recent history of high interaction rates with nestmate ants. Instead of fighting, some ants will decide to recruit more workers from the nest to the fighting location, and in this way a positive feedback mediates the development of colony wide wars. In ants, the monoamines serotonin (5-HT) and octopamine (OA) modulate many behaviors associated with colony organization and in particular behaviors associated with nestmate recognition and aggression. In this article, we develop and explore an agent-based model that conceptualizes how individual changes in brain concentrations of 5-HT and OA, paired with a simple threshold-based decision rule, can lead to the development of colony wide warfare. Model simulations do lead to the development of warfare with 91% of ants fighting at the end of 1 h. When conducting a sensitivity analysis, we determined that uncertainty in monoamine concentration signal decay influences the behavior of the model more than uncertainty in the decision-making rule or density. We conclude that pavement ant behavior is consistent with the detection of interaction rate through a single timed interval rather than integration of multiple interactions.

  6. A Pilot Study of the Snap & Sniff Threshold Test.

    PubMed

    Jiang, Rong-San; Liang, Kai-Li

    2018-05-01

    The Snap & Sniff® Threshold Test (S&S) has been recently developed to determine the olfactory threshold. The aim of this study was to further evaluate the validity and test-retest reliability of the S&S. The olfactory thresholds of 120 participants were determined using both the Smell Threshold Test (STT) and the S&S. The participants included 30 normosmic volunteers and 90 patients (60 hyposmic, 30 anosmic). The normosmic participants were retested using the STT and S&S at an intertest interval of at least 1 day. The mean olfactory threshold determined with the S&S was -6.76 for the normosmic participants, -3.79 for the hyposmic patients, and -2 for the anosmic patients. The olfactory thresholds were significantly different across the 3 groups (P < .001). S&S-based and STT-based olfactory thresholds were correlated weakly in the normosmic group (correlation coefficient = 0.162, P = .391) but more strongly correlated in the patient groups (hyposmic: correlation coefficient = 0.376, P = .003; anosmic: correlation coefficient = 1.0). The test-retest correlation for the S&S-based olfactory thresholds was 0.384 (P = .036). Based on validity and test-retest reliability, we concluded that the S&S is a proper test for olfactory thresholds.

  7. Mimic expert judgement through automated procedure for selecting rainfall events responsible for shallow landslide: A statistical approach to validation

    NASA Astrophysics Data System (ADS)

    Giovanna, Vessia; Luca, Pisano; Carmela, Vennari; Mauro, Rossi; Mario, Parise

    2016-01-01

    This paper proposes an automated method for the selection of rainfall data (duration, D, and cumulated rainfall, E) responsible for shallow landslide initiation. The method mimics an expert identifying D and E from rainfall records through a manual procedure whose rules are applied according to his/her judgement. The comparison between the two methods is based on 300 D-E pairs drawn from temporal rainfall data series recorded in a 30-day time lag before landslide occurrence. Statistical tests, applied to the D and E samples as both paired and independent values to verify whether they belong to the same population, show that the automated procedure is able to replicate the pairs drawn by expert judgement. Furthermore, a criterion based on cumulative distribution functions (CDFs) is proposed to select, among the six pairs drawn by the coded procedure, the D-E pair most closely related to the expert one for tracing the empirical rainfall threshold line.

  8. Finding new pathway-specific regulators by clustering method using threshold standard deviation based on DNA chip data of Streptomyces coelicolor.

    PubMed

    Yang, Yung-Hun; Kim, Ji-Nu; Song, Eunjung; Kim, Eunjung; Oh, Min-Kyu; Kim, Byung-Gee

    2008-09-01

    In order to identify the regulators involved in antibiotic production or time-specific cellular events, the messenger ribonucleic acid (mRNA) expression data of two gene clusters, the actinorhodin (ACT) and undecylprodigiosin (RED) biosynthetic genes, were clustered with known mRNA expression data of regulators from S. coelicolor using a filtering method based on standard deviation and clustering analysis. The result identified five regulators, including two well-known ones, namely SCO3579 (WlbA) and SCO6722 (SsgD). Using overexpression and deletion of the regulator genes, we were able to identify two regulators, i.e., SCO0608 and SCO6808, playing roles as repressors of antibiotic production and sporulation. This approach can be easily applied to mapping out new regulators related to any interesting target gene clusters showing characteristic expression patterns. The result can also be used to provide insightful information on the selection rules among a large number of regulators.

  9. Limits on Momentum-Dependent Asymmetric Dark Matter with CRESST-II.

    PubMed

    Angloher, G; Bento, A; Bucci, C; Canonica, L; Defay, X; Erb, A; Feilitzsch, F V; Ferreiro Iachellini, N; Gorla, P; Gütlein, A; Hauff, D; Jochum, J; Kiefer, M; Kluck, H; Kraus, H; Lanfranchi, J-C; Loebell, J; Münster, A; Pagliarone, C; Petricca, F; Potzel, W; Pröbst, F; Reindl, F; Schäffner, K; Schieck, J; Schönert, S; Seidel, W; Stodolsky, L; Strandhagen, C; Strauss, R; Tanzke, A; Trinh Thi, H H; Türkoğlu, C; Uffinger, M; Ulrich, A; Usherov, I; Wawoczny, S; Willers, M; Wüstrich, M; Zöller, A

    2016-07-08

    The usual assumption in direct dark matter searches is to consider only the spin-dependent or spin-independent scattering of dark matter particles. However, especially in models with light dark matter particles of O(GeV/c²) mass, operators which carry additional powers of the momentum transfer q² can become dominant. One such model based on asymmetric dark matter has been invoked to overcome discrepancies in helioseismology, and an indication was found for a particle with a preferred mass of 3 GeV/c² and a cross section of 10⁻³⁷ cm². Recent data from the CRESST-II experiment, which uses cryogenic detectors based on CaWO₄ to search for nuclear recoils induced by dark matter particles, are used to constrain these momentum-dependent models. The detector's low energy threshold of 307 eV for nuclear recoils allows us to rule out the proposed best-fit value quoted above.

  10. Threshold Assessment of Gear Diagnostic Tools on Flight and Test Rig Data

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Mosher, Marianne; Huff, Edward M.

    2003-01-01

    A method for defining thresholds for vibration-based algorithms that provides the minimum number of false alarms while maintaining sensitivity to gear damage was developed. This analysis focused on two vibration-based gear damage detection algorithms, FM4 and MSA. The method was developed using vibration data collected during surface fatigue tests performed in a spur gearbox rig. The thresholds were defined based on damage progression during tests with damage. The false-alarm rates of these thresholds were then evaluated on spur gear tests without damage. Next, the same thresholds were applied to flight data from an OH-58 helicopter transmission. Results showed that thresholds defined in test rigs can be used to define thresholds in flight that correctly classify the transmission operation as normal.
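
    A common way to set such thresholds is from the metric's distribution on damage-free data: picking a high quantile bounds the false-alarm rate while leaving headroom below the values reached during damage progression. The quantile and the synthetic baseline below are illustrative assumptions (FM4, a normalized-kurtosis metric, has a nominal value near 3 for an undamaged gear mesh).

    import numpy as np

    def define_threshold(healthy_metric, quantile=0.999):
        # Alarm threshold at a high quantile of the no-damage distribution.
        return float(np.quantile(healthy_metric, quantile))

    rng = np.random.default_rng(1)
    baseline = rng.normal(3.0, 0.2, size=10_000)   # synthetic healthy FM4 values
    print(f"alarm if FM4 > {define_threshold(baseline):.2f}")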

  11. RuleMonkey: software for stochastic simulation of rule-based models

    PubMed Central

    2010-01-01

    Background The system-level dynamics of many molecular interactions, particularly protein-protein interactions, can be conveniently represented using reaction rules, which can be specified using model-specification languages, such as the BioNetGen language (BNGL). A set of rules implicitly defines a (bio)chemical reaction network. The reaction network implied by a set of rules is often very large, and as a result, generation of the network implied by rules tends to be computationally expensive. Moreover, the cost of many commonly used methods for simulating network dynamics is a function of network size. Together these factors have limited application of the rule-based modeling approach. Recently, several methods for simulating rule-based models have been developed that avoid the expensive step of network generation. The cost of these "network-free" simulation methods is independent of the number of reactions implied by rules. Software implementing such methods is now needed for the simulation and analysis of rule-based models of biochemical systems. Results Here, we present a software tool called RuleMonkey, which implements a network-free method for simulation of rule-based models that is similar to Gillespie's method. The method is suitable for rule-based models that can be encoded in BNGL, including models with rules that have global application conditions, such as rules for intramolecular association reactions. In addition, the method is rejection free, unlike other network-free methods that introduce null events, i.e., steps in the simulation procedure that do not change the state of the reaction system being simulated. We verify that RuleMonkey produces correct simulation results, and we compare its performance against DYNSTOC, another BNGL-compliant tool for network-free simulation of rule-based models. We also compare RuleMonkey against problem-specific codes implementing network-free simulation methods. Conclusions RuleMonkey enables the simulation of rule-based models for which the underlying reaction networks are large. It is typically faster than DYNSTOC for benchmark problems that we have examined. RuleMonkey is freely available as a stand-alone application http://public.tgen.org/rulemonkey. It is also available as a simulation engine within GetBonNie, a web-based environment for building, analyzing and sharing rule-based models. PMID:20673321

  12. Masking Misfit in Confirmatory Factor Analysis by Increasing Unique Variances: A Cautionary Note on the Usefulness of Cutoff Values of Fit Indices

    ERIC Educational Resources Information Center

    Heene, Moritz; Hilbert, Sven; Draxler, Clemens; Ziegler, Matthias; Buhner, Markus

    2011-01-01

    Fit indices are widely used in order to test the model fit for structural equation models. In a highly influential study, Hu and Bentler (1999) showed that certain cutoff values for these indices could be derived, which, over time, has led to the reification of these suggested thresholds as "golden rules" for establishing the fit or other aspects…

  13. Exploration of the association rules mining technique for the signal detection of adverse drug events in spontaneous reporting systems.

    PubMed

    Wang, Chao; Guo, Xiao-Jing; Xu, Jin-Fang; Wu, Cheng; Sun, Ya-Lin; Ye, Xiao-Fei; Qian, Wei; Ma, Xiu-Qiang; Du, Wen-Min; He, Jia

    2012-01-01

    The detection of signals of adverse drug events (ADEs) has increased because of the use of data mining algorithms in spontaneous reporting systems (SRSs). However, different data mining algorithms have different traits and conditions for application. The objective of our study was to explore the application of association rule (AR) mining in ADE signal detection and to compare its performance with that of other algorithms. Monte Carlo simulation was applied to generate drug-ADE reports randomly according to the characteristics of SRS datasets. One thousand simulated datasets were mined by AR and the other algorithms. On average, 108,337 reports were generated by the Monte Carlo simulation. Based on the predefined criterion that 10% of the drug-ADE combinations were true signals, with RR equal to 10, 4.9, 1.5, and 1.2, AR detected, on average, 284 suspected associations with a minimum support of 3 and a minimum lift of 1.2. The area under the receiver operating characteristic (ROC) curve for AR was 0.788, which was equivalent to that shown for the other algorithms. Additionally, AR was applied to reports submitted to the Shanghai SRS in 2009. Five hundred seventy combinations were detected using AR from 24,297 SRS reports, and they were compared with recognized ADEs identified by clinical experts and various other sources. AR appears to be an effective method for ADE signal detection, both in simulated and real SRS datasets. The limitations of this method exposed in our study, i.e., non-uniform threshold settings and redundant rules, require further research.
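
    The three quantities gating each rule, support, confidence, and lift, are simple counts over the report database. A minimal sketch with toy drug-ADE reports (the item names and cutoffs are illustrative):

    def rule_stats(transactions, antecedent, consequent):
        # Support, confidence and lift of the rule A -> C over sets of items.
        n = len(transactions)
        n_a = sum(antecedent <= t for t in transactions)
        n_c = sum(consequent <= t for t in transactions)
        n_ac = sum((antecedent | consequent) <= t for t in transactions)
        support = n_ac / n
        confidence = n_ac / n_a if n_a else 0.0
        lift = confidence / (n_c / n) if n_c else 0.0
        return support, confidence, lift

    reports = [{"drugX", "rash"}, {"drugX", "nausea"}, {"drugX", "rash"},
               {"drugY", "rash"}, {"drugY"}]
    s, c, l = rule_stats(reports, {"drugX"}, {"rash"})
    print(f"support={s:.2f} confidence={c:.2f} lift={l:.2f}")
    # A rule would be flagged only if, e.g., support*n >= 3 and lift >= 1.2.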

  14. Optimal setups for forced-choice staircases with fixed step sizes.

    PubMed

    García-Pérez, M A

    2000-01-01

    Forced-choice staircases with fixed step sizes are used in a variety of formats whose relative merits have never been studied. This paper presents a comparative study aimed at determining their optimal format. Factors included in the study were the up/down rule, the length (number of reversals), and the size of the steps. The study also addressed the issue of whether a protocol involving three staircases running for N reversals each (with a subsequent average of the estimates provided by each individual staircase) has better statistical properties than an alternative protocol involving a single staircase running for 3N reversals. In all cases the size of a step up was different from that of a step down, in the appropriate ratio determined by García-Pérez (Vision Research, 1998, 38, 1861-1881). The results of a simulation study indicate that (a) there are no conditions in which the 1-down/1-up rule is advisable; (b) different combinations of up/down rule and number of reversals appear equivalent in terms of precision and cost; (c) using a single long staircase with 3N reversals is more efficient than running three staircases with N reversals each; (d) to avoid bias and attain sufficient accuracy, threshold estimates should be based on at least 30 reversals; and (e) to avoid excessive cost and imprecision, the size of the step up should be between 2/3 and 3/3 the (known or presumed) spread of the psychometric function. An empirical study with human subjects confirmed the major characteristics revealed by the simulations.
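
    The trade-offs above are easy to reproduce in simulation. The sketch below runs a 1-up/2-down fixed-step staircase against a logistic 2AFC observer; the psychometric function, step sizes, and starting point are illustrative assumptions rather than the paper's exact setup.

    import numpy as np

    rng = np.random.default_rng(2)

    def run_staircase(true_thr=0.0, spread=1.5, start=3.0, n_reversals=30,
                      step_up=0.4, step_down=0.3):
        x, n_correct, last_dir, reversals = start, 0, 0, []
        while len(reversals) < n_reversals:
            p = 0.5 + 0.5 / (1.0 + np.exp(-(x - true_thr) / spread))  # 2AFC observer
            if rng.random() < p:                  # correct response
                n_correct += 1
                if n_correct < 2:
                    continue                      # need 2 correct to step down
                n_correct, direction = 0, -1
                x -= step_down
            else:                                 # any error steps up
                n_correct, direction = 0, 1
                x += step_up
            if last_dir and direction != last_dir:
                reversals.append(x)               # record a reversal
            last_dir = direction
        return float(np.mean(reversals))          # threshold estimate

    print("estimated threshold:", round(run_staircase(), 2))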

  15. An Evaluation of Performance Thresholds in Nursing Home Pay-for-Performance.

    PubMed

    Werner, Rachel M; Skira, Meghan; Konetzka, R Tamara

    2016-12-01

    Performance thresholds are commonly used in pay-for-performance (P4P) incentives, where providers receive a bonus payment for achieving a prespecified target threshold. Such designs may produce discontinuous incentives: providers just below the threshold have the strongest incentive to improve, while providers either far below or far above the threshold have little incentive. We investigate the effect of performance thresholds on provider response in the setting of nursing home P4P. Data sources were the Minimum Data Set (MDS) and Online Survey, Certification, and Reporting (OSCAR) datasets. We use a difference-in-differences design to test for changes in nursing home performance in three states that implemented threshold-based P4P (Colorado, Georgia, and Oklahoma) versus three comparator states (Arizona, Tennessee, and Arkansas) between 2006 and 2009. We find that those farthest below the threshold (i.e., the worst-performing nursing homes) had the largest improvements under threshold-based P4P, while those farthest above the threshold worsened. This effect did not vary with the percentage of Medicaid residents in a nursing home. Threshold-based P4P may provide perverse incentives for nursing homes above the performance threshold, but we do not find evidence to support concerns about the effects of performance thresholds on low-performing nursing homes. © Health Research and Educational Trust.

  16. Risk indicators for water supply systems for a drought Decision Support System in central Tuscany (Italy)

    NASA Astrophysics Data System (ADS)

    Rossi, Giuseppe; Garrote, Luis; Caporali, Enrica

    2010-05-01

    Identifying the occurrence, the extent and the magnitude of a drought can be delicate, requiring detection of depletions of supplies and increases in demand. Drought indices, particularly the meteorological ones, can describe the onset and the persistency of droughts, especially in natural systems. However, they have to be used cautiously when applied to water supply systems. They show little correlation with water shortage situations, since water storage, as well as demand fluctuation, plays an important role in water resources management. For that reason a more dynamic indicator relating supply and demand is required in order to identify situations when there is risk of water shortage. In water supply systems there is great variability in the natural water resources and also in the demands. These quantities can only be defined probabilistically. This great variability is handled by defining threshold values, expressed in probabilistic terms, that measure the hydrologic state of the system. They can trigger specific actions in an operational context at different levels of severity, such as the normal, pre-alert, alert and emergency scenarios. They can simplify the decision-making required during stressful periods and can help mitigate the impacts of drought by clearly defining the conditions requiring actions. The threshold values are defined considering the probability of satisfying a given fraction of the demand in a certain time horizon, and are calibrated through discussion with water managers. A simplified model of the water resources system is built to evaluate the threshold values and the management rules. The threshold values are validated with a long-term simulation that takes into account the characteristics of the evaluated system. The levels and volumes in the different reservoirs are simulated using 20-30-year time series. The critical situations are assessed month by month in order to evaluate optimal management rules during the year and avoid conditions of total water shortage. The methodology is applied to the urban area Firenze-Prato-Pistoia in central Tuscany, Italy. The catchment of the investigated area has a surface of 1231 km² and, according to the 2001 ISTAT census, 945,972 inhabitants.

  17. Automated microaneurysm detection in diabetic retinopathy using curvelet transform

    NASA Astrophysics Data System (ADS)

    Ali Shah, Syed Ayaz; Laude, Augustinus; Faye, Ibrahima; Tang, Tong Boon

    2016-10-01

    Microaneurysms (MAs) are known to be the early signs of diabetic retinopathy (DR). An automated MA detection system based on curvelet transform is proposed for color fundus image analysis. Candidates of MA were extracted in two parallel steps. In step one, blood vessels were removed from preprocessed green band image and preliminary MA candidates were selected by local thresholding technique. In step two, based on statistical features, the image background was estimated. The results from the two steps allowed us to identify preliminary MA candidates which were also present in the image foreground. A collection set of features was fed to a rule-based classifier to divide the candidates into MAs and non-MAs. The proposed system was tested with Retinopathy Online Challenge database. The automated system detected 162 MAs out of 336, thus achieved a sensitivity of 48.21% with 65 false positives per image. Counting MA is a means to measure the progression of DR. Hence, the proposed system may be deployed to monitor the progression of DR at early stage in population studies.
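
    Local thresholding of the kind used in step one can be sketched with a windowed mean and standard deviation; the window size, the factor k, and the dark-lesion criterion below are illustrative assumptions, not the paper's exact parameters.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def ma_candidates(green, win=25, k=1.0):
        # Keep pixels that are more than k local standard deviations darker
        # than their neighborhood mean (MAs appear as small dark spots).
        g = green.astype(float)
        mean = uniform_filter(g, win)
        std = np.sqrt(np.maximum(uniform_filter(g ** 2, win) - mean ** 2, 0.0))
        return g < mean - k * std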

  18. Automated microaneurysm detection in diabetic retinopathy using curvelet transform.

    PubMed

    Ali Shah, Syed Ayaz; Laude, Augustinus; Faye, Ibrahima; Tang, Tong Boon

    2016-10-01

    Microaneurysms (MAs) are known to be the early signs of diabetic retinopathy (DR). An automated MA detection system based on curvelet transform is proposed for color fundus image analysis. Candidates of MA were extracted in two parallel steps. In step one, blood vessels were removed from preprocessed green band image and preliminary MA candidates were selected by local thresholding technique. In step two, based on statistical features, the image background was estimated. The results from the two steps allowed us to identify preliminary MA candidates which were also present in the image foreground. A collection set of features was fed to a rule-based classifier to divide the candidates into MAs and non-MAs. The proposed system was tested with Retinopathy Online Challenge database. The automated system detected 162 MAs out of 336, thus achieved a sensitivity of 48.21% with 65 false positives per image. Counting MA is a means to measure the progression of DR. Hence, the proposed system may be deployed to monitor the progression of DR at early stage in population studies.

  19. Dynamic Task Optimization in Remote Diabetes Monitoring Systems.

    PubMed

    Suh, Myung-Kyung; Woodbridge, Jonathan; Moin, Tannaz; Lan, Mars; Alshurafa, Nabil; Samy, Lauren; Mortazavi, Bobak; Ghasemzadeh, Hassan; Bui, Alex; Ahmadi, Sheila; Sarrafzadeh, Majid

    2012-09-01

    Diabetes is the seventh leading cause of death in the United States, but careful symptom monitoring can prevent adverse events. A real-time patient monitoring and feedback system is one of the solutions to help patients with diabetes and their healthcare professionals monitor health-related measurements and provide dynamic feedback. However, data-driven methods to dynamically prioritize and generate tasks are not well investigated in the domain of remote health monitoring. This paper presents a wireless health project (WANDA) that leverages sensor technology and wireless communication to monitor the health status of patients with diabetes. The WANDA dynamic task management function applies data analytics in real time to discretize continuous features, using data clustering and association rule mining techniques to manage the sliding window size dynamically and to prioritize required user tasks. The developed algorithm minimizes the number of daily action items required by patients with diabetes using association rules that satisfy minimum support, confidence, and conditional probability thresholds. Each of these tasks maximizes information gain, thereby improving the overall level of patient adherence and satisfaction. Experimental results from applying EM-based clustering and Apriori algorithms show that the developed algorithm can predict further events with higher confidence levels and reduce the number of user tasks by up to 76.19%.

  20. Dynamic Task Optimization in Remote Diabetes Monitoring Systems

    PubMed Central

    Suh, Myung-kyung; Woodbridge, Jonathan; Moin, Tannaz; Lan, Mars; Alshurafa, Nabil; Samy, Lauren; Mortazavi, Bobak; Ghasemzadeh, Hassan; Bui, Alex; Ahmadi, Sheila; Sarrafzadeh, Majid

    2016-01-01

    Diabetes is the seventh leading cause of death in the United States, but careful symptom monitoring can prevent adverse events. A real-time patient monitoring and feedback system is one of the solutions to help patients with diabetes and their healthcare professionals monitor health-related measurements and provide dynamic feedback. However, data-driven methods to dynamically prioritize and generate tasks are not well investigated in the domain of remote health monitoring. This paper presents a wireless health project (WANDA) that leverages sensor technology and wireless communication to monitor the health status of patients with diabetes. The WANDA dynamic task management function applies data analytics in real time to discretize continuous features, using data clustering and association rule mining techniques to manage the sliding window size dynamically and to prioritize required user tasks. The developed algorithm minimizes the number of daily action items required by patients with diabetes using association rules that satisfy minimum support, confidence, and conditional probability thresholds. Each of these tasks maximizes information gain, thereby improving the overall level of patient adherence and satisfaction. Experimental results from applying EM-based clustering and Apriori algorithms show that the developed algorithm can predict further events with higher confidence levels and reduce the number of user tasks by up to 76.19%. PMID:27617297

  1. Chaotic Signal Denoising Based on Hierarchical Threshold Synchrosqueezed Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Wang, Wen-Bo; Jing, Yun-yu; Zhao, Yan-chao; Zhang, Lian-Hua; Wang, Xiang-Li

    2017-12-01

    In order to overcome the shortcomings of the single-threshold synchrosqueezed wavelet transform (SWT) denoising method, an adaptive hierarchical-threshold SWT chaotic signal denoising method is proposed. Firstly, a new SWT threshold function is constructed based on Stein's unbiased risk estimate, which is twice continuously differentiable. Then, using the new threshold function, a thresholding process based on the minimum mean square error is implemented, and the optimal estimate of each layer's threshold in SWT chaotic denoising is obtained. The experimental results on a simulated chaotic signal and measured sunspot signals show that the proposed method can filter the noise of a chaotic signal well, and the intrinsic chaotic characteristics of the original signal can be recovered very well. Compared with the EEMD denoising method and the single-threshold SWT denoising method, the proposed method obtains better denoising results for chaotic signals.
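
    The hierarchical (per-layer) idea can be illustrated with classical wavelet-denoising ingredients: estimate the noise level of each layer robustly and threshold that layer with its own cutoff. The MAD-based universal threshold and the soft-thresholding rule below are standard textbook choices, not the paper's SURE-based function.

    import numpy as np

    def level_threshold(coeffs):
        # Robust noise estimate via the median absolute deviation, combined
        # with the universal threshold sigma * sqrt(2 log n) for this layer.
        sigma = np.median(np.abs(coeffs)) / 0.6745
        return sigma * np.sqrt(2.0 * np.log(coeffs.size))

    def soft(w, thr):
        return np.sign(w) * np.maximum(np.abs(w) - thr, 0.0)

    # Threshold each layer of a (hypothetical) SWT decomposition separately.
    rng = np.random.default_rng(3)
    layers = [rng.normal(0.0, 0.1, 256) for _ in range(5)]
    denoised = [soft(w, level_threshold(w)) for w in layers]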

  2. An endorsement-based approach to student modeling for planner-controlled intelligent tutoring systems

    NASA Technical Reports Server (NTRS)

    Murray, William R.

    1990-01-01

    An approach is described to student modeling for intelligent tutoring systems based on an explicit representation of the tutor's beliefs about the student and the arguments for and against those beliefs (called endorsements). A lexicographic comparison of arguments, sorted according to evidence reliability, provides a principled means of determining those beliefs that are considered true, false, or uncertain. Each of these beliefs is ultimately justified by underlying assessment data. The endorsement-based approach to student modeling is particularly appropriate for tutors controlled by instructional planners. These tutors place greater demands on a student model than opportunistic tutors. Numerical calculi approaches are less well-suited because it is difficult to correctly assign numbers for evidence reliability and rule plausibility. It may also be difficult to interpret final results and provide suitable combining functions. When numeric measures of uncertainty are used, arbitrary numeric thresholds are often required for planning decisions. Such an approach is inappropriate when robust context-sensitive planning decisions must be made. A TMS-based implementation of the endorsement-based approach to student modeling is presented, this approach is compared to alternatives, and a project history is provided describing the evolution of this approach.

  3. Percolation in suspensions of hard nanoparticles: From spheres to needles

    NASA Astrophysics Data System (ADS)

    Schilling, Tanja; Miller, Mark A.; van der Schoot, Paul

    2015-09-01

    We investigate geometric percolation and scaling relations in suspensions of nanorods, covering the entire range of aspect ratios from spheres to extremely slender needles. A new version of connectedness percolation theory is introduced and tested against specialised Monte Carlo simulations. The theory accurately predicts percolation thresholds for aspect ratios of rod length to width as low as 10. The percolation threshold for rod-like particles of aspect ratios below 1000 deviates significantly from the inverse aspect ratio scaling prediction, thought to be valid in the limit of infinitely slender rods and often used as a rule of thumb for nanofibres in composite materials. Hence, most fibres that are currently used as fillers in composite materials cannot be regarded as practically infinitely slender for the purposes of percolation theory. Comparing percolation thresholds of hard rods and new benchmark results for ideal rods, we find that i) for large aspect ratios, they differ by a factor that is inversely proportional to the connectivity distance between the hard cores, and ii) they approach the slender rod limit differently.

  4. Altruism in multiplayer snowdrift games with threshold and punishment

    NASA Astrophysics Data System (ADS)

    Zhang, Chunyan; Liu, Zhongxin; Sun, Qinglin; Chen, Zengqiang

    2015-09-01

    The puzzle of cooperation attracts broad concern in the scientific community. Here we adopt an extra mechanism of punishment in the framework of a threshold multiplayer snowdrift game employed as the scenario for the cooperation problem. Two scenarios are considered: defectors suffer punishment regardless of the game results, or defectors incur punishment only when the game fails. We show by analysis that, under this assumption, punishing free riders can significantly influence the evolutionary outcomes, and that the results are driven by the specific components of the punishing rule. In particular, punishing defectors always, not only when the game fails, can be more effective for maintaining public cooperation in multi-player systems. Intriguingly, larger thresholds of the game provide a more favorable scenario for the coexistence of cooperators and defectors over a broad range of parameter values. Further, cooperators are best supported by large punishment of defectors, and then dominate and stabilize in the population, under the premise that defectors always incur punishment regardless of whether the game ends successfully or not.

  5. A rule-based software test data generator

    NASA Technical Reports Server (NTRS)

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests showing that even the primitive rule-based test data generation prototype is significantly better than random data generation are performed. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.

  6. Rule groupings: An approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Knowledge-based expert systems are playing an increasingly important role in NASA space and aircraft systems. However, many of NASA's software applications are life- or mission-critical and knowledge-based systems do not lend themselves to the traditional verification and validation techniques for highly reliable software. Rule-based systems lack the control abstractions found in procedural languages. Hence, it is difficult to verify or maintain such systems. Our goal is to automatically structure a rule-based system into a set of rule-groups having a well-defined interface to other rule-groups. Once a rule base is decomposed into such 'firewalled' units, studying the interactions between rules would become more tractable. Verification-aid tools can then be developed to test the behavior of each such rule-group. Furthermore, the interactions between rule-groups can be studied in a manner similar to integration testing. Such efforts will go a long way towards increasing our confidence in the expert-system software. Our research efforts address the feasibility of automating the identification of rule groups, in order to decompose the rule base into a number of meaningful units.

  7. Methods for automatic trigger threshold adjustment

    DOEpatents

    Welch, Benjamin J; Partridge, Michael E

    2014-03-18

    Methods are presented for adjusting trigger threshold values to compensate for drift in the quiescent level of a signal monitored for initiating a data recording event, thereby avoiding false triggering conditions. Initial threshold values are periodically adjusted by re-measuring the quiescent signal level, and adjusting the threshold values by an offset computation based upon the measured quiescent signal level drift. Re-computation of the trigger threshold values can be implemented on time based or counter based criteria. Additionally, a qualification width counter can be utilized to implement a requirement that a trigger threshold criterion be met a given number of times prior to initiating a data recording event, further reducing the possibility of a false triggering situation.
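
    The patent's two ingredients, offset re-computation from the measured quiescent drift and a qualification width, can be sketched as follows (the function names and the strictly-greater comparison are illustrative assumptions).

    def adjust_thresholds(hi, lo, quiescent_old, quiescent_new):
        # Shift both trigger thresholds by the measured drift of the
        # quiescent level; call periodically or on a counter criterion.
        drift = quiescent_new - quiescent_old
        return hi + drift, lo + drift

    def qualified_trigger(samples, hi, width=3):
        # Require the criterion to hold for `width` consecutive samples
        # before declaring a trigger, suppressing one-sample glitches.
        run = 0
        for s in samples:
            run = run + 1 if s > hi else 0
            if run >= width:
                return True
        return False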

  8. Determination of prospective displacement-based gate threshold for respiratory-gated radiation delivery from retrospective phase-based gate threshold selected at 4D CT simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vedam, S.; Archambault, L.; Starkschall, G.

    2007-11-15

    Four-dimensional (4D) computed tomography (CT) imaging has found increasing importance in the localization of tumor and surrounding normal structures throughout the respiratory cycle. Based on such tumor motion information, it is possible to identify the appropriate phase interval for respiratory gated treatment planning and delivery. Such a gating phase interval is determined retrospectively based on tumor motion from internal tumor displacement. However, respiratory-gated treatment is delivered prospectively based on motion determined predominantly from an external monitor. Therefore, the simulation gate threshold determined from the retrospective phase interval selected for gating at 4D CT simulation may not correspond to the delivery gate threshold that is determined from the prospective external monitor displacement at treatment delivery. The purpose of the present work is to establish a relationship between the thresholds for respiratory gating determined at CT simulation and treatment delivery, respectively. One hundred fifty external respiratory motion traces, from 90 patients, with and without audio-visual biofeedback, are analyzed. Two respiratory phase intervals, 40%-60% and 30%-70%, are chosen for respiratory gating from the 4D CT-derived tumor motion trajectory. From residual tumor displacements within each such gating phase interval, a simulation gate threshold is defined based on (a) the average and (b) the maximum respiratory displacement within the phase interval. The duty cycle for prospective gated delivery is estimated from the proportion of external monitor displacement data points within both the selected phase interval and the simulation gate threshold. The delivery gate threshold is then determined iteratively to match the above determined duty cycle. The magnitude of the difference between such gate thresholds determined at simulation and treatment delivery is quantified in each case. Phantom motion tests yielded coincidence of simulation and delivery gate thresholds to within 0.3%. For patient data analysis, differences between simulation and delivery gate thresholds are reported as a fraction of the total respiratory motion range. For the smaller phase interval, the differences between simulation and delivery gate thresholds are 8±11% and 14±21% with and without audio-visual biofeedback, respectively, when the simulation gate threshold is determined based on the mean respiratory displacement within the 40%-60% gating phase interval. For the longer phase interval, corresponding differences are 4±7% and 8±15% with and without audio-visual biofeedback, respectively. Alternatively, when the simulation gate threshold is determined based on the maximum average respiratory displacement within the gating phase interval, greater differences between simulation and delivery gate thresholds are observed. A relationship between retrospective simulation gate threshold and prospective delivery gate threshold for respiratory gating is established and validated for regular and nonregular respiratory motion. Using this relationship, the delivery gate threshold can be reliably estimated at the time of 4D CT simulation, thereby improving the accuracy and efficiency of respiratory-gated radiation delivery.

  9. Determination of prospective displacement-based gate threshold for respiratory-gated radiation delivery from retrospective phase-based gate threshold selected at 4D CT simulation.

    PubMed

    Vedam, S; Archambault, L; Starkschall, G; Mohan, R; Beddar, S

    2007-11-01

    Four-dimensional (4D) computed tomography (CT) imaging has found increasing importance in the localization of tumor and surrounding normal structures throughout the respiratory cycle. Based on such tumor motion information, it is possible to identify the appropriate phase interval for respiratory gated treatment planning and delivery. Such a gating phase interval is determined retrospectively based on tumor motion from internal tumor displacement. However, respiratory-gated treatment is delivered prospectively based on motion determined predominantly from an external monitor. Therefore, the simulation gate threshold determined from the retrospective phase interval selected for gating at 4D CT simulation may not correspond to the delivery gate threshold that is determined from the prospective external monitor displacement at treatment delivery. The purpose of the present work is to establish a relationship between the thresholds for respiratory gating determined at CT simulation and treatment delivery, respectively. One hundred fifty external respiratory motion traces, from 90 patients, with and without audio-visual biofeedback, are analyzed. Two respiratory phase intervals, 40%-60% and 30%-70%, are chosen for respiratory gating from the 4D CT-derived tumor motion trajectory. From residual tumor displacements within each such gating phase interval, a simulation gate threshold is defined based on (a) the average and (b) the maximum respiratory displacement within the phase interval. The duty cycle for prospective gated delivery is estimated from the proportion of external monitor displacement data points within both the selected phase interval and the simulation gate threshold. The delivery gate threshold is then determined iteratively to match the above determined duty cycle. The magnitude of the difference between such gate thresholds determined at simulation and treatment delivery is quantified in each case. Phantom motion tests yielded coincidence of simulation and delivery gate thresholds to within 0.3%. For patient data analysis, differences between simulation and delivery gate thresholds are reported as a fraction of the total respiratory motion range. For the smaller phase interval, the differences between simulation and delivery gate thresholds are 8 +/- 11% and 14 +/- 21% with and without audio-visual biofeedback, respectively, when the simulation gate threshold is determined based on the mean respiratory displacement within the 40%-60% gating phase interval. For the longer phase interval, corresponding differences are 4 +/- 7% and 8 +/- 15% with and without audiovisual biofeedback, respectively. Alternatively, when the simulation gate threshold is determined based on the maximum average respiratory displacement within the gating phase interval, greater differences between simulation and delivery gate thresholds are observed. A relationship between retrospective simulation gate threshold and prospective delivery gate threshold for respiratory gating is established and validated for regular and nonregular respiratory motion. Using this relationship, the delivery gate threshold can be reliably estimated at the time of 4D CT simulation, thereby improving the accuracy and efficiency of respiratory-gated radiation delivery.
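
    The iterative duty-cycle matching described above can be illustrated in closed form: the delivery gate threshold is the displacement quantile that reproduces the duty cycle observed at simulation. A minimal Python sketch, assuming a hypothetical 1-D external-monitor trace with per-sample phase labels; neither the data nor the software here is the authors':

      import numpy as np

      def delivery_gate_threshold(displacement, in_phase_interval, sim_threshold):
          """Estimate the prospective delivery gate threshold.

          The target duty cycle is the fraction of samples lying both inside
          the retrospectively selected phase interval and within the
          simulation gate threshold; the delivery threshold is then the
          displacement quantile that reproduces that duty cycle.
          """
          gated = in_phase_interval & (displacement <= sim_threshold)
          duty_cycle = gated.mean()
          return np.quantile(displacement, duty_cycle)

      # Hypothetical breathing trace: 4 s period, phase 0 = peak inhale,
      # phase 50% = end-exhale (displacement minimum).
      t = np.linspace(0.0, 60.0, 3000)
      disp = 0.5 * (1.0 + np.cos(2.0 * np.pi * t / 4.0))
      disp += 0.02 * np.random.default_rng(0).normal(size=t.size)
      phase = (t % 4.0) / 4.0                            # 0..1 within each cycle
      in_interval = (phase >= 0.4) & (phase <= 0.6)      # 40%-60% interval
      print(delivery_gate_threshold(disp, in_interval, sim_threshold=0.2))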

  10. 75 FR 4745 - Approval and Promulgation of Implementation Plans, State of California, San Joaquin Valley...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-29

    ...Under section 110(k)(6) of the Clean Air Act, EPA is proposing to correct our May 2004 final approval of revisions to the San Joaquin Valley Unified Air Pollution Control District portion of the California State Implementation Plan. EPA is also proposing to take action on three amended District rules, one of which was submitted on March 7, 2008 and the other two of which were submitted on March 17, 2009. Two of the submitted rules reflect revisions to approved District rules that provide for review of new and modified stationary sources (``new source review'' or NSR) within the District, and the third reflects revisions to an approved District rule that provides a mechanism by which existing stationary sources may be exempt from the requirement to secure a Federally-mandated operating permit. The NSR rule revisions relate to exemptions from permitting and from offsets for certain agricultural operations, to the establishment of NSR applicability and offset thresholds consistent with a classification of ``extreme'' nonattainment for the ozone standard, and to the implementation of EPA's NSR Reform Rules. With respect to the revised District NSR rules, EPA is proposing a limited approval and limited disapproval because, although the changes would strengthen the SIP, there are deficiencies in enforceability that prevent full approval. With respect to the operating permit rule, EPA is proposing a full approval. Lastly, EPA is proposing to rescind certain obsolete permitting requirements from the District portion of the California plan. If EPA were to finalize the limited approval and limited disapproval action, as proposed, then a sanctions clock, and EPA's obligation to promulgate a Federal implementation plan, would be triggered because certain revisions to the District rules that are the subject of this action are required under anti-backsliding principles established for the transition from the 1-hour to the 8-hour ozone standard.

  11. Clinical Utility of Risk Models to Refer Patients with Adnexal Masses to Specialized Oncology Care: Multicenter External Validation Using Decision Curve Analysis.

    PubMed

    Wynants, Laure; Timmerman, Dirk; Verbakel, Jan Y; Testa, Antonia; Savelli, Luca; Fischerova, Daniela; Franchi, Dorella; Van Holsbeke, Caroline; Epstein, Elisabeth; Froyman, Wouter; Guerriero, Stefano; Rossi, Alberto; Fruscio, Robert; Leone, Francesco Pg; Bourne, Tom; Valentin, Lil; Van Calster, Ben

    2017-09-01

Purpose: To evaluate the utility of preoperative diagnostic models for ovarian cancer based on ultrasound and/or biomarkers for referring patients to specialized oncology care. The investigated models were RMI, ROMA, and 3 models from the International Ovarian Tumor Analysis (IOTA) group [LR2, ADNEX, and the Simple Rules risk score (SRRisk)]. Experimental Design: A secondary analysis of prospectively collected data from 2 cross-sectional cohort studies was performed to externally validate diagnostic models. A total of 2,763 patients (2,403 in dataset 1 and 360 in dataset 2) from 18 centers (11 oncology centers and 7 nononcology hospitals) in 6 countries participated. Excised tissue was histologically classified as benign or malignant. The clinical utility of the preoperative diagnostic models was assessed with net benefit (NB) at a range of risk thresholds (5%-50% risk of malignancy) to refer patients to specialized oncology care. We visualized results with decision curves and generated bootstrap confidence intervals. Results: The prevalence of malignancy was 41% in dataset 1 and 40% in dataset 2. For thresholds up to 10% to 15%, RMI and ROMA had a lower NB than referring all patients. SRRisk and ADNEX demonstrated the highest NB. At a threshold of 20%, the NBs of ADNEX, SRRisk, and RMI were 0.348, 0.350, and 0.270, respectively. Results by menopausal status and type of center (oncology vs. nononcology) were similar. Conclusions: All tested IOTA methods, especially ADNEX and SRRisk, are clinically more useful than RMI and ROMA to select patients with adnexal masses for specialized oncology care. Clin Cancer Res; 23(17); 5082-90. ©2017 American Association for Cancer Research (AACR).
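
    Net benefit at a risk threshold p_t is conventionally computed as NB = TP/n - (FP/n) * p_t / (1 - p_t), so the decision-curve comparison above can be reproduced in a few lines. A sketch with hypothetical predicted risks and outcomes, not the IOTA datasets:

      import numpy as np

      def net_benefit(y_true, risk, threshold):
          """Net benefit of referring patients with predicted risk >= threshold."""
          refer = risk >= threshold
          n = y_true.size
          tp = np.sum(refer & (y_true == 1))
          fp = np.sum(refer & (y_true == 0))
          return tp / n - (fp / n) * threshold / (1.0 - threshold)

      def net_benefit_refer_all(y_true, threshold):
          """Comparator strategy: refer every patient."""
          prevalence = y_true.mean()
          return prevalence - (1.0 - prevalence) * threshold / (1.0 - threshold)

      y = np.array([1, 0, 1, 1, 0, 0, 0, 1])                  # 1 = malignant
      r = np.array([0.9, 0.3, 0.6, 0.8, 0.1, 0.4, 0.2, 0.7])  # predicted risks
      print(net_benefit(y, r, 0.20), net_benefit_refer_all(y, 0.20))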

  12. Is there a kink in consumers' threshold value for cost-effectiveness in health care?

    PubMed

    O'Brien, Bernie J; Gertsen, Kirsten; Willan, Andrew R; Faulkner, Lisa A

    2002-03-01

A reproducible observation is that consumers' willingness-to-accept (WTA) monetary compensation to forgo a program is greater than their stated willingness-to-pay (WTP) for the same benefit. Several explanations exist, including the psychological principle that the utility of losses weighs heavier than that of gains. We sought to quantify the WTP-WTA disparity from the published literature and explore its implications for accept-reject thresholds in cost-effectiveness analysis in the south-west quadrant of the cost-effectiveness plane (less effect, less cost). We reviewed published studies (health and non-health) to estimate the ratio of WTA to WTP for the same program benefit for each study and to determine whether WTA is consistently greater than WTP in the literature. WTA/WTP ratios were greater than unity for every study we reviewed. The ratios ranged from 3.2 to 89.4 for environmental studies (n=7), 1.9 to 6.4 for health care studies (n=2), 1.1 to 3.6 for safety studies (n=4) and 1.3 to 2.6 for experimental studies (n=7). Given that WTA is greater than WTP based on individual preferences, should not the societal preferences used to determine cost-effectiveness thresholds reflect this disparity? The current convention in cost-effectiveness analysis is that any given accept-reject criterion (e.g. $50k/QALY gained) is symmetric - a straight line through the origin of the cost-effectiveness plane. The WTA-WTP evidence suggests a downward 'kink' through the origin for the south-west quadrant, such that the 'selling price' of a QALY is greater than the 'buying price'. The possibility of 'kinky' cost-effectiveness decision rules and the size of the kink merit further exploration. Copyright 2002 John Wiley & Sons, Ltd.
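
    A sketch of what a 'kinked' accept-reject rule could look like, assuming a buying-price threshold (WTP per QALY) and a WTA/WTP ratio greater than one applied only in the south-west quadrant; both parameter values are illustrative assumptions, not estimates from the paper:

      def accept(delta_cost, delta_qaly, wtp=50_000.0, wta_wtp_ratio=2.0):
          """Kinked cost-effectiveness decision rule (illustrative).

          North-east quadrant (more effect, more cost): accept if the cost per
          QALY gained is at most the WTP threshold. South-west quadrant (less
          effect, less cost): accept only if the savings per QALY forgone are
          at least WTA = wta_wtp_ratio * WTP, i.e. the 'selling price' of a
          QALY exceeds the 'buying price'.
          """
          if delta_qaly >= 0 and delta_cost <= 0:
              return True                       # dominant: cheaper, no worse
          if delta_qaly <= 0 and delta_cost >= 0:
              return False                      # dominated: costlier, no better
          if delta_qaly > 0:                    # north-east quadrant
              return delta_cost / delta_qaly <= wtp
          # south-west quadrant: both deltas negative
          return delta_cost / delta_qaly >= wta_wtp_ratio * wtp

      print(accept(40_000.0, 1.0))     # True: $40k per QALY gained
      print(accept(-60_000.0, -1.0))   # False: saves only $60k per QALY forgone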

  13. Objectives, Budgets, Thresholds, and Opportunity Costs-A Health Economics Approach: An ISPOR Special Task Force Report [4].

    PubMed

    Danzon, Patricia M; Drummond, Michael F; Towse, Adrian; Pauly, Mark V

    2018-02-01

    The fourth section of our Special Task Force report focuses on a health plan or payer's technology adoption or reimbursement decision, given the array of technologies, on the basis of their different values and costs. We discuss the role of budgets, thresholds, opportunity costs, and affordability in making decisions. First, we discuss the use of budgets and thresholds in private and public health plans, their interdependence, and connection to opportunity cost. Essentially, each payer should adopt a decision rule about what is good value for money given their budget; consistent use of a cost-per-quality-adjusted life-year threshold will ensure the maximum health gain for the budget. In the United States, different public and private insurance programs could use different thresholds, reflecting the differing generosity of their budgets and implying different levels of access to technologies. In addition, different insurance plans could consider different additional elements to the quality-adjusted life-year metric discussed elsewhere in our Special Task Force report. We then define affordability and discuss approaches to deal with it, including consideration of disinvestment and related adjustment costs, the impact of delaying new technologies, and comparative cost effectiveness of technologies. Over time, the availability of new technologies may increase the amount that populations want to spend on health care. We then discuss potential modifiers to thresholds, including uncertainty about the evidence used in the decision-making process. This article concludes by discussing the application of these concepts in the context of the pluralistic US health care system, as well as the "excess burden" of tax-financed public programs versus private programs. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  14. Efficient Discovery of De-identification Policies Through a Risk-Utility Frontier

    PubMed Central

    Xia, Weiyi; Heatherly, Raymond; Ding, Xiaofeng; Li, Jiuyong; Malin, Bradley

    2014-01-01

Modern information technologies enable organizations to capture large quantities of person-specific data while providing routine services. Many organizations hope, or are legally required, to share such data for secondary purposes (e.g., validation of research findings) in a de-identified manner. In previous work, it was shown that de-identification policy alternatives could be modeled on a lattice, which could be searched for policies that met a prespecified risk threshold (e.g., likelihood of re-identification). However, the search was limited in several ways. First, its definition of utility was syntactic - based on the level of the lattice - and not semantic - based on the actual changes induced in the resulting data. Second, the threshold may not be known in advance. The goal of this work is to build the optimal set of policies that trade off between privacy risk (R) and utility (U), which we refer to as an R-U frontier. To model this problem, we introduce a semantic definition of utility, based on information theory, that is compatible with the lattice representation of policies. To solve the problem, we initially build a set of policies that define a frontier. We then use a probability-guided heuristic to search the lattice for policies likely to update the frontier. To demonstrate the effectiveness of our approach, we perform an empirical analysis with the Adult dataset of the UCI Machine Learning Repository. We show that our approach can construct a frontier closer to optimal than competitive approaches by searching a smaller number of policies. In addition, we show that a frequently followed de-identification policy (i.e., the Safe Harbor standard of the HIPAA Privacy Rule) is suboptimal in comparison to the frontier discovered by our approach. PMID:25520961

  15. Efficient Discovery of De-identification Policies Through a Risk-Utility Frontier.

    PubMed

    Xia, Weiyi; Heatherly, Raymond; Ding, Xiaofeng; Li, Jiuyong; Malin, Bradley

    2013-01-01

Modern information technologies enable organizations to capture large quantities of person-specific data while providing routine services. Many organizations hope, or are legally required, to share such data for secondary purposes (e.g., validation of research findings) in a de-identified manner. In previous work, it was shown that de-identification policy alternatives could be modeled on a lattice, which could be searched for policies that met a prespecified risk threshold (e.g., likelihood of re-identification). However, the search was limited in several ways. First, its definition of utility was syntactic - based on the level of the lattice - and not semantic - based on the actual changes induced in the resulting data. Second, the threshold may not be known in advance. The goal of this work is to build the optimal set of policies that trade off between privacy risk (R) and utility (U), which we refer to as an R-U frontier. To model this problem, we introduce a semantic definition of utility, based on information theory, that is compatible with the lattice representation of policies. To solve the problem, we initially build a set of policies that define a frontier. We then use a probability-guided heuristic to search the lattice for policies likely to update the frontier. To demonstrate the effectiveness of our approach, we perform an empirical analysis with the Adult dataset of the UCI Machine Learning Repository. We show that our approach can construct a frontier closer to optimal than competitive approaches by searching a smaller number of policies. In addition, we show that a frequently followed de-identification policy (i.e., the Safe Harbor standard of the HIPAA Privacy Rule) is suboptimal in comparison to the frontier discovered by our approach.
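
    The frontier itself is simply the set of policies not dominated in (risk, utility). A minimal sketch of extracting it from scored candidates, abstracting away how re-identification risk and the information-theoretic utility are computed; the paper's probability-guided lattice search is not reproduced here:

      def pareto_frontier(policies):
          """Return the (risk, utility) pairs not dominated by any other.

          A policy dominates another if it has no higher risk and no lower
          utility, with at least one strict inequality; lower risk and
          higher utility are better.
          """
          frontier = []
          # Sort by increasing risk, breaking ties by decreasing utility.
          for risk, utility in sorted(policies, key=lambda p: (p[0], -p[1])):
              if not frontier or utility > frontier[-1][1]:
                  frontier.append((risk, utility))
          return frontier

      candidates = [(0.10, 0.2), (0.20, 0.5), (0.20, 0.4), (0.35, 0.5), (0.50, 0.9)]
      print(pareto_frontier(candidates))   # [(0.1, 0.2), (0.2, 0.5), (0.5, 0.9)]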

  16. Comparison of bedside screening methods for frailty assessment in older adult trauma patients in the emergency department.

    PubMed

    Shah, Sachita P; Penn, Kevin; Kaplan, Stephen J; Vrablik, Michael; Jablonowski, Karl; Pham, Tam N; Reed, May J

    2018-04-14

Frailty is linked to poor outcomes in older patients. We prospectively compared the utility of the picture-based Clinical Frailty Scale (CFS9), clinical assessments, and ultrasound muscle measurements against the reference FRAIL scale in older adult trauma patients in the emergency department (ED). We recruited a convenience sample of adults 65 yrs. or older with blunt trauma and injury severity scores <9. We queried subjects (or surrogates) on the FRAIL scale, and compared this to: physician-based and subject/surrogate-based CFS9; mid-upper arm circumference (MUAC) and grip strength; and ultrasound (US) measures of muscle thickness (limbs and abdominal wall). We derived optimal diagnostic thresholds and calculated performance metrics for each comparison using sensitivity, specificity, predictive values, and area under receiver operating characteristic curves (AUROC). Fifteen of 65 patients were frail by FRAIL scale (23%). CFS9 performed well when assessed by subject/surrogate (AUROC 0.91 [95% CI 0.84-0.98]) or physician (AUROC 0.77 [95% CI 0.63-0.91]). Optimal thresholds for both physician and subject/surrogate were CFS9 of 4 or greater. If both physician and subject/surrogate provided scores <4, sensitivity and negative predictive value were 90.0% (54.1-99.5%) and 95.0% (73.1-99.7%), respectively. Grip strength and MUAC were not predictors. US measures that combined biceps and quadriceps thickness showed an AUROC of 0.75 compared to the reference standard. The ED needs rapid, validated tools to screen for frailty. The CFS9 has excellent negative predictive value in ruling out frailty. Ultrasound of combined biceps and quadriceps has modest concordance as an alternative in trauma patients who cannot provide a history. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Behavior of a stochastic SIR epidemic model with saturated incidence and vaccination rules

    NASA Astrophysics Data System (ADS)

    Zhang, Yue; Li, Yang; Zhang, Qingling; Li, Aihua

    2018-07-01

In this paper, the threshold behavior of a susceptible-infected-recovered (SIR) epidemic model with stochastic perturbation is investigated. We first show that the system has a unique global positive solution for any positive initial value. Random effects may lead to disease extinction under a simple condition. Subsequently, a sufficient condition for persistence of the disease in the mean is established. Finally, some numerical simulations are carried out to confirm the analytical results.
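
    A sketch of simulating such a stochastically perturbed SIR system by the Euler-Maruyama method; the saturated incidence form beta*S*I/(1 + alpha*I), the placement of the noise on the infected compartment, and all parameter values are illustrative assumptions rather than the authors' exact model:

      import numpy as np

      def stochastic_sir(beta=0.9, alpha=0.5, gamma=0.3, mu=0.1, sigma=0.2,
                         s0=0.9, i0=0.1, r0=0.0, dt=0.01, steps=20_000, seed=0):
          """Euler-Maruyama path of an SIR model with saturated incidence
          and multiplicative noise on the infected compartment."""
          rng = np.random.default_rng(seed)
          s, i, r = s0, i0, r0
          path = np.empty((steps, 3))
          for k in range(steps):
              incidence = beta * s * i / (1.0 + alpha * i)
              dw = rng.normal(0.0, np.sqrt(dt))        # Brownian increment
              s += (mu - incidence - mu * s) * dt
              i += (incidence - (gamma + mu) * i) * dt + sigma * i * dw
              r += (gamma * i - mu * r) * dt
              i = max(i, 0.0)                          # keep the state non-negative
              path[k] = (s, i, r)
          return path

      print(stochastic_sir()[-1])   # final (S, I, R); larger sigma favors extinction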

  18. 76 FR 33170 - Defense Federal Acquisition Regulation Supplement; Inclusion of Option Amounts in Limitations on...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-08

    ...DoD is issuing this final rule amending the Defense Federal Acquisition Regulation Supplement (DFARS) to implement section 826 of the National Defense Authorization Act for Fiscal Year 2011. Section 826 amended the DoD pilot program for transition to follow-on contracting after use of other transaction authority, to establish that the threshold limitation of $50 million for contracts and subcontracts under the program includes the dollar value of all options.

  19. Wettability behavior of water droplet on organic-polluted fused quartz surfaces of pillar-type nanostructures applying molecular dynamics simulation

    NASA Astrophysics Data System (ADS)

    Chen, Jiaxuan; Chen, Wenyang; Xie, Yajing; Wang, Zhiguo; Qin, Jianbo

    2017-02-01

Molecular dynamics (MD) simulation is applied to study the wettability of water clusters of different scales adsorbed on organic-polluted fused quartz (FQ) surfaces with different surface structures. The wettability of the water clusters is studied under the effect of the organic pollutant. Under the combined influence of pillar height and interval, a stair-step Wenzel-Cassie transition critical line is obtained by analyzing the stable state of water clusters on the different surface structures. The results also show that when the pillar interval and the pillar height are each held constant, the changing rules are exactly opposite; these are termed the "waterfall" rules. The substrate models of water clusters in the Cassie-Baxter state that lie in the vicinity of the critical line are chosen to analyze the relationship between the HI ratio (the pillar height/interval ratio) and the scale of the water cluster. The study finds that there is a critical threshold in the wettability changing process. When the HI ratio is held constant, the wettability decreases first and then increases as the cluster size increases; conversely, when the cluster size is held constant, the wettability decreases and then increases as the HI ratio decreases, but when the water cluster size is close to the threshold, the HI ratio has little effect on the wettability.

  20. Position-dependent effects of locked nucleic acid (LNA) on DNA sequencing and PCR primers

    PubMed Central

    Levin, Joshua D.; Fiala, Dean; Samala, Meinrado F.; Kahn, Jason D.; Peterson, Raymond J.

    2006-01-01

    Genomes are becoming heavily annotated with important features. Analysis of these features often employs oligonucleotides that hybridize at defined locations. When the defined location lies in a poor sequence context, traditional design strategies may fail. Locked Nucleic Acid (LNA) can enhance oligonucleotide affinity and specificity. Though LNA has been used in many applications, formal design rules are still being defined. To further this effort we have investigated the effect of LNA on the performance of sequencing and PCR primers in AT-rich regions, where short primers yield poor sequencing reads or PCR yields. LNA was used in three positional patterns: near the 5′ end (LNA-5′), near the 3′ end (LNA-3′) and distributed throughout (LNA-Even). Quantitative measures of sequencing read length (Phred Q30 count) and real-time PCR signal (cycle threshold, CT) were characterized using two-way ANOVA. LNA-5′ increased the average Phred Q30 score by 60% and it was never observed to decrease performance. LNA-5′ generated cycle thresholds in quantitative PCR that were comparable to high-yielding conventional primers. In contrast, LNA-3′ and LNA-Even did not improve read lengths or CT. ANOVA demonstrated the statistical significance of these results and identified significant interaction between the positional design rule and primer sequence. PMID:17071964

  1. Invasion percolation with memory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kharabaf, H.; Yortsos, Y.C.

Motivated by the problem of finding the minimum threshold path (MTP) in a lattice of elements with random thresholds τ_i, we propose a new class of invasion processes, in which the front advances by minimizing or maximizing the measure S_n = Σ_i τ_i^n for real n. This rule assigns long-time memory to the invasion process. If the rule minimizes S_n (case of minimum penalty), the fronts are stable and connected to invasion percolation in a gradient [J. P. Hulin, E. Clement, C. Baudet, J. F. Gouyet, and M. Rosso, Phys. Rev. Lett. 61, 333 (1988)] but in a correlated lattice, with invasion percolation [D. Wilkinson and J. F. Willemsen, J. Phys. A 16, 3365 (1983)] recovered in the limit |n| = ∞. For small n, the MTP is shown to be related to the optimal path of the directed polymer in random media (DPRM) problem [T. Halpin-Healy and Y.-C. Zhang, Phys. Rep. 254, 215 (1995)]. In the large-n limit, however, it reduces to the backbone of a mixed site-bond percolation cluster. The algorithm allows for various properties of the MTP and the DPRM to be studied. In the unstable case (case of maximum gain), the front is a self-avoiding random walk. © 1997 The American Physical Society
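
    A sketch of the minimum-penalty rule on a 2-D site lattice: the front repeatedly invades the perimeter site whose added penalty tau**n is smallest, which minimizes the growth of S_n over the invaded cluster and, for n = 1, reduces to ordinary invasion percolation. The lattice size, seeding from the left edge, and stopping criterion are illustrative choices, not the authors' setup:

      import heapq
      import numpy as np

      def invade(tau, n=1.0, target_size=400):
          """Greedy minimum-penalty invasion from the left edge of `tau`."""
          rows, cols = tau.shape
          invaded = np.zeros_like(tau, dtype=bool)
          heap = []
          for r in range(rows):                          # seed the left column
              invaded[r, 0] = True
              heapq.heappush(heap, (tau[r, 1] ** n, r, 1))
          while heap and invaded.sum() < target_size:
              _, r, c = heapq.heappop(heap)
              if invaded[r, c]:
                  continue                               # stale heap entry
              invaded[r, c] = True
              for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  rr, cc = r + dr, c + dc
                  if 0 <= rr < rows and 0 <= cc < cols and not invaded[rr, cc]:
                      heapq.heappush(heap, (tau[rr, cc] ** n, rr, cc))
          return invaded

      cluster = invade(np.random.default_rng(1).random((50, 50)), n=1.0)
      print(cluster.sum(), "sites invaded")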

  2. HERB: A production system for programming with hierarchical expert rule bases: User's manual, HERB Version 1. 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummel, K.E.

    1987-12-01

Expert systems are artificial intelligence programs that solve problems requiring large amounts of heuristic knowledge, based on years of experience and tradition. Production systems are domain-independent tools that support the development of rule-based expert systems. This document describes a general purpose production system known as HERB. This system was developed to support the programming of expert systems using hierarchically structured rule bases. HERB encourages the partitioning of rules into multiple rule bases and supports the use of multiple conflict resolution strategies. Multiple rule bases can also be placed on a system stack and simultaneously searched during each interpreter cycle. Both backward and forward chaining rules are supported by HERB. The condition portion of each rule can contain both patterns, which are matched with facts in a data base, and LISP expressions, which are explicitly evaluated in the LISP environment. Properties of objects can also be stored in the HERB data base and referenced within the scope of each rule. This document serves both as an introduction to the principles of LISP-based production systems and as a user's manual for the HERB system. 6 refs., 17 figs.
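
    A toy sketch of the forward-chaining half of such a production system, with rules as (conditions -> conclusion) pairs matched against a working-memory set of facts; HERB's LISP syntax, pattern variables, rule-base stack, and conflict resolution strategies are all omitted:

      def forward_chain(rules, facts):
          """Naive forward chaining: fire every rule whose conditions all hold,
          add its conclusion to working memory, and repeat to a fixed point."""
          facts = set(facts)
          changed = True
          while changed:
              changed = False
              for conditions, conclusion in rules:
                  if conclusion not in facts and set(conditions) <= facts:
                      facts.add(conclusion)
                      changed = True
          return facts

      rules = [
          (("valve-stuck", "pressure-high"), "open-relief"),
          (("open-relief",), "sound-alarm"),
      ]
      print(forward_chain(rules, {"valve-stuck", "pressure-high"}))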

  3. An investigation of care-based vs. rule-based morality in frontotemporal dementia, Alzheimer's disease, and healthy controls.

    PubMed

    Carr, Andrew R; Paholpak, Pongsatorn; Daianu, Madelaine; Fong, Sylvia S; Mather, Michelle; Jimenez, Elvira E; Thompson, Paul; Mendez, Mario F

    2015-11-01

    Behavioral changes in dementia, especially behavioral variant frontotemporal dementia (bvFTD), may result in alterations in moral reasoning. Investigators have not clarified whether these alterations reflect differential impairment of care-based vs. rule-based moral behavior. This study investigated 18 bvFTD patients, 22 early onset Alzheimer's disease (eAD) patients, and 20 healthy age-matched controls on care-based and rule-based items from the Moral Behavioral Inventory and the Social Norms Questionnaire, neuropsychological measures, and magnetic resonance imaging (MRI) regions of interest. There were significant group differences with the bvFTD patients rating care-based morality transgressions less severely than the eAD group and rule-based moral behavioral transgressions more severely than controls. Across groups, higher care-based morality ratings correlated with phonemic fluency on neuropsychological tests, whereas higher rule-based morality ratings correlated with increased difficulty set-shifting and learning new rules to tasks. On neuroimaging, severe care-based reasoning correlated with cortical volume in right anterior temporal lobe, and rule-based reasoning correlated with decreased cortical volume in the right orbitofrontal cortex. Together, these findings suggest that frontotemporal disease decreases care-based morality and facilitates rule-based morality possibly from disturbed contextual abstraction and set-shifting. Future research can examine whether frontal lobe disorders and bvFTD result in a shift from empathic morality to the strong adherence to conventional rules. Published by Elsevier Ltd.

  4. An Investigation of Care-Based vs. Rule-Based Morality in Frontotemporal Dementia, Alzheimer’s Disease, and Healthy Controls

    PubMed Central

    Carr, Andrew R.; Paholpak, Pongsatorn; Daianu, Madelaine; Fong, Sylvia S.; Mather, Michelle; Jimenez, Elvira E.; Thompson, Paul; Mendez, Mario F.

    2015-01-01

    Behavioral changes in dementia, especially behavioral variant frontotemporal dementia (bvFTD), may result in alterations in moral reasoning. Investigators have not clarified whether these alterations reflect differential impairment of care-based vs. rule-based moral behavior. This study investigated 18 bvFTD patients, 22 early onset Alzheimer’s disease (eAD) patients, and 20 healthy age-matched controls on care-based and rule-based items from the Moral Behavioral Inventory and the Social Norms Questionnaire, neuropsychological measures, and magnetic resonance imaging (MRI) regions of interest. There were significant group differences with the bvFTD patients rating care-based morality transgressions less severely than the eAD group and rule-based moral behavioral transgressions more severely than controls. Across groups, higher care-based morality ratings correlated with phonemic fluency on neuropsychological tests, whereas higher rule-based morality ratings correlated with increased difficulty set-shifting and learning new rules to tasks. On neuroimaging, severe care-based reasoning correlated with cortical volume in right anterior temporal lobe, and rule-based reasoning correlated with decreased cortical volume in the right orbitofrontal cortex. Together, these findings suggest that frontotemporal disease decreases care-based morality and facilitates rule-based morality possibly from disturbed contextual abstraction and set-shifting. Future research can examine whether frontal lobe disorders and bvFTD result in a shift from empathic morality to the strong adherence to conventional rules. PMID:26432341

  5. A Swarm Optimization approach for clinical knowledge mining.

    PubMed

    Christopher, J Jabez; Nehemiah, H Khanna; Kannan, A

    2015-10-01

Rule-based classification is a typical data mining task that is being used in several medical diagnosis and decision support systems. The rules stored in the rule base have an impact on classification efficiency. Rule sets that are extracted with data mining tools and techniques are optimized using heuristic or meta-heuristic approaches in order to improve the quality of the rule base. In this work, a meta-heuristic approach called Wind-driven Swarm Optimization (WSO) is used. The uniqueness of this work lies in the biological inspiration that underlies the algorithm. WSO uses Jval, a new metric, to evaluate the efficiency of a rule-based classifier. Rules are extracted from decision trees. WSO is used to obtain different permutations and combinations of rules, whereby the optimal ruleset that satisfies the requirement of the developer is used for predicting the test data. The performance of various extensions of decision trees, namely RIPPER, PART, FURIA and Decision Tables, is analyzed. The efficiency of WSO is also compared with the traditional Particle Swarm Optimization. Experiments were carried out with six benchmark medical datasets. The traditional C4.5 algorithm yields 62.89% accuracy with 43 rules for the liver disorders dataset, whereas WSO yields 64.60% with 19 rules. For the heart disease dataset, C4.5 is 68.64% accurate with 98 rules, whereas WSO is 77.8% accurate with 34 rules. The normalized standard deviations for the accuracy of PSO and WSO are 0.5921 and 0.5846, respectively. WSO provides accurate and concise rulesets. PSO yields results similar to those of WSO, but the novelty of WSO lies in its biological motivation and its customization for rule-base optimization. The trade-off between the prediction accuracy and the size of the rule base is optimized during the design and development of a rule-based clinical decision support system. The efficiency of a decision support system relies on the content of the rule base and classification accuracy. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  6. Automated visualization of rule-based models

    PubMed Central

    Tapia, Jose-Juan; Faeder, James R.

    2017-01-01

    Frameworks such as BioNetGen, Kappa and Simmune use “reaction rules” to specify biochemical interactions compactly, where each rule specifies a mechanism such as binding or phosphorylation and its structural requirements. Current rule-based models of signaling pathways have tens to hundreds of rules, and these numbers are expected to increase as more molecule types and pathways are added. Visual representations are critical for conveying rule-based models, but current approaches to show rules and interactions between rules scale poorly with model size. Also, inferring design motifs that emerge from biochemical interactions is an open problem, so current approaches to visualize model architecture rely on manual interpretation of the model. Here, we present three new visualization tools that constitute an automated visualization framework for rule-based models: (i) a compact rule visualization that efficiently displays each rule, (ii) the atom-rule graph that conveys regulatory interactions in the model as a bipartite network, and (iii) a tunable compression pipeline that incorporates expert knowledge and produces compact diagrams of model architecture when applied to the atom-rule graph. The compressed graphs convey network motifs and architectural features useful for understanding both small and large rule-based models, as we show by application to specific examples. Our tools also produce more readable diagrams than current approaches, as we show by comparing visualizations of 27 published models using standard graph metrics. We provide an implementation in the open source and freely available BioNetGen framework, but the underlying methods are general and can be applied to rule-based models from the Kappa and Simmune frameworks also. We expect that these tools will promote communication and analysis of rule-based models and their eventual integration into comprehensive whole-cell models. PMID:29131816

  7. Compartmental and Spatial Rule-Based Modeling with Virtual Cell.

    PubMed

    Blinov, Michael L; Schaff, James C; Vasilescu, Dan; Moraru, Ion I; Bloom, Judy E; Loew, Leslie M

    2017-10-03

    In rule-based modeling, molecular interactions are systematically specified in the form of reaction rules that serve as generators of reactions. This provides a way to account for all the potential molecular complexes and interactions among multivalent or multistate molecules. Recently, we introduced rule-based modeling into the Virtual Cell (VCell) modeling framework, permitting graphical specification of rules and merger of networks generated automatically (using the BioNetGen modeling engine) with hand-specified reaction networks. VCell provides a number of ordinary differential equation and stochastic numerical solvers for single-compartment simulations of the kinetic systems derived from these networks, and agent-based network-free simulation of the rules. In this work, compartmental and spatial modeling of rule-based models has been implemented within VCell. To enable rule-based deterministic and stochastic spatial simulations and network-free agent-based compartmental simulations, the BioNetGen and NFSim engines were each modified to support compartments. In the new rule-based formalism, every reactant and product pattern and every reaction rule are assigned locations. We also introduce the rule-based concept of molecular anchors. This assures that any species that has a molecule anchored to a predefined compartment will remain in this compartment. Importantly, in addition to formulation of compartmental models, this now permits VCell users to seamlessly connect reaction networks derived from rules to explicit geometries to automatically generate a system of reaction-diffusion equations. These may then be simulated using either the VCell partial differential equations deterministic solvers or the Smoldyn stochastic simulator. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  8. Dynamics of epidemic diseases on a growing adaptive network

    PubMed Central

    Demirel, Güven; Barter, Edmund; Gross, Thilo

    2017-01-01

    The study of epidemics on static networks has revealed important effects on disease prevalence of network topological features such as the variance of the degree distribution, i.e. the distribution of the number of neighbors of nodes, and the maximum degree. Here, we analyze an adaptive network where the degree distribution is not independent of epidemics but is shaped through disease-induced dynamics and mortality in a complex interplay. We study the dynamics of a network that grows according to a preferential attachment rule, while nodes are simultaneously removed from the network due to disease-induced mortality. We investigate the prevalence of the disease using individual-based simulations and a heterogeneous node approximation. Our results suggest that in this system in the thermodynamic limit no epidemic thresholds exist, while the interplay between network growth and epidemic spreading leads to exponential networks for any finite rate of infectiousness when the disease persists. PMID:28186146

  9. Dynamics of epidemic diseases on a growing adaptive network.

    PubMed

    Demirel, Güven; Barter, Edmund; Gross, Thilo

    2017-02-10

    The study of epidemics on static networks has revealed important effects on disease prevalence of network topological features such as the variance of the degree distribution, i.e. the distribution of the number of neighbors of nodes, and the maximum degree. Here, we analyze an adaptive network where the degree distribution is not independent of epidemics but is shaped through disease-induced dynamics and mortality in a complex interplay. We study the dynamics of a network that grows according to a preferential attachment rule, while nodes are simultaneously removed from the network due to disease-induced mortality. We investigate the prevalence of the disease using individual-based simulations and a heterogeneous node approximation. Our results suggest that in this system in the thermodynamic limit no epidemic thresholds exist, while the interplay between network growth and epidemic spreading leads to exponential networks for any finite rate of infectiousness when the disease persists.

  10. Dynamics of epidemic diseases on a growing adaptive network

    NASA Astrophysics Data System (ADS)

    Demirel, Güven; Barter, Edmund; Gross, Thilo

    2017-02-01

    The study of epidemics on static networks has revealed important effects on disease prevalence of network topological features such as the variance of the degree distribution, i.e. the distribution of the number of neighbors of nodes, and the maximum degree. Here, we analyze an adaptive network where the degree distribution is not independent of epidemics but is shaped through disease-induced dynamics and mortality in a complex interplay. We study the dynamics of a network that grows according to a preferential attachment rule, while nodes are simultaneously removed from the network due to disease-induced mortality. We investigate the prevalence of the disease using individual-based simulations and a heterogeneous node approximation. Our results suggest that in this system in the thermodynamic limit no epidemic thresholds exist, while the interplay between network growth and epidemic spreading leads to exponential networks for any finite rate of infectiousness when the disease persists.

  11. A fast and accurate dihedral interpolation loop subdivision scheme

    NASA Astrophysics Data System (ADS)

    Shi, Zhuo; An, Yalei; Wang, Zhongshuai; Yu, Ke; Zhong, Si; Lan, Rushi; Luo, Xiaonan

    2018-04-01

    In this paper, we propose a fast and accurate dihedral interpolation Loop subdivision scheme for subdivision surfaces based on triangular meshes. In order to solve the problem of surface shrinkage, we keep the limit condition unchanged, which is important. Extraordinary vertices are handled using modified Butterfly rules. Subdivision schemes are computationally costly as the number of faces grows exponentially at higher levels of subdivision. To address this problem, our approach is to use local surface information to adaptively refine the model. This is achieved simply by changing the threshold value of the dihedral angle parameter, i.e., the angle between the normals of a triangular face and its adjacent faces. We then demonstrate the effectiveness of the proposed method for various 3D graphic triangular meshes, and extensive experimental results show that it can match or exceed the expected results at lower computational cost.
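
    A sketch of the adaptive-refinement test only, assuming a triangular face is refined when the dihedral angle between its normal and any neighboring face normal exceeds the threshold; the subdivision stencils and modified Butterfly rules themselves are omitted, and the threshold value is an illustrative choice:

      import numpy as np

      def face_normal(v0, v1, v2):
          """Unit normal of the triangle (v0, v1, v2)."""
          n = np.cross(v1 - v0, v2 - v0)
          return n / np.linalg.norm(n)

      def needs_refinement(normal, neighbor_normals, threshold_deg=20.0):
          """Refine when any adjacent dihedral angle exceeds the threshold."""
          cos_t = np.cos(np.radians(threshold_deg))
          return any(np.dot(normal, m) < cos_t for m in neighbor_normals)

      flat = face_normal(np.zeros(3), np.array([1.0, 0, 0]), np.array([0, 1.0, 0]))
      bent = face_normal(np.zeros(3), np.array([0, 1.0, 0]), np.array([0, 0.3, 1.0]))
      print(needs_refinement(flat, [bent]))   # True: sharp crease, so refine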

  12. A Deep Learning Approach to Examine Ischemic ST Changes in Ambulatory ECG Recordings.

    PubMed

    Xiao, Ran; Xu, Yuan; Pelter, Michele M; Mortara, David W; Hu, Xiao

    2018-01-01

    Patients with suspected acute coronary syndrome (ACS) are at risk of transient myocardial ischemia (TMI), which could lead to serious morbidity or even mortality. Early detection of myocardial ischemia can reduce damage to heart tissues and improve patient condition. Significant ST change in the electrocardiogram (ECG) is an important marker for detecting myocardial ischemia during the rule-out phase of potential ACS. However, current ECG monitoring software is vastly underused due to excessive false alarms. The present study aims to tackle this problem by combining a novel image-based approach with deep learning techniques to improve the detection accuracy of significant ST depression change. The obtained convolutional neural network (CNN) model yields an average area under the curve (AUC) at 89.6% from an independent testing set. At selected optimal cutoff thresholds, the proposed model yields a mean sensitivity at 84.4% while maintaining specificity at 84.9%.

  13. Resilient filtering for time-varying stochastic coupling networks under the event-triggering scheduling

    NASA Astrophysics Data System (ADS)

    Wang, Fan; Liang, Jinling; Dobaie, Abdullah M.

    2018-07-01

    The resilient filtering problem is considered for a class of time-varying networks with stochastic coupling strengths. An event-triggered strategy is adopted to save the network resources by scheduling the signal transmission from the sensors to the filters based on certain prescribed rules. Moreover, the filter parameters to be designed are subject to gain perturbations. The primary aim of the addressed problem is to determine a resilient filter that ensures an acceptable filtering performance for the considered network with event-triggering scheduling. To handle such an issue, an upper bound on the estimation error variance is established for each node according to the stochastic analysis. Subsequently, the resilient filter is designed by locally minimizing the derived upper bound at each iteration. Moreover, rigorous analysis shows the monotonicity of the minimal upper bound regarding the triggering threshold. Finally, a simulation example is presented to show effectiveness of the established filter scheme.
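
    The scheduling rule can be as simple as a send-on-delta condition: a sensor transmits only when its measurement deviates from the last transmitted value by more than a threshold. A sketch under that assumption; the paper's exact triggering condition and threshold are not reproduced:

      import numpy as np

      def event_triggered(measurements, delta):
          """Indices at which a send-on-delta trigger transmits."""
          sent = [0]                      # always transmit the first sample
          last = measurements[0]
          for k in range(1, len(measurements)):
              if abs(measurements[k] - last) > delta:
                  sent.append(k)
                  last = measurements[k]
          return sent

      y = np.cumsum(np.random.default_rng(0).normal(0.0, 0.1, 200))
      idx = event_triggered(y, delta=0.3)
      print(f"{len(idx)} of {y.size} samples transmitted")
      # Raising delta saves network resources but loosens the error bound,
      # consistent with the monotonicity result mentioned above.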

  14. Local and global epidemic outbreaks in populations moving in inhomogeneous environments

    NASA Astrophysics Data System (ADS)

    Buscarino, Arturo; Fortuna, Luigi; Frasca, Mattia; Rizzo, Alessandro

    2014-10-01

    We study disease spreading in a system of agents moving in a space where the force of infection is not homogeneous. Agents are random walkers that additionally execute long-distance jumps, and the plane in which they move is divided into two regions where the force of infection takes different values. We show the onset of a local epidemic threshold and a global one and explain them in terms of mean-field approximations. We also elucidate the critical role of the agent velocity, jump probability, and density parameters in achieving the conditions for local and global outbreaks. Finally, we show that the results are independent of the specific microscopic rules adopted for agent motion, since a similar behavior is also observed for the distribution of agent velocity based on a truncated power law, which is a model often used to fit real data on motion patterns of animals and humans.

  15. Decision making in recurrent neuronal circuits.

    PubMed

    Wang, Xiao-Jing

    2008-10-23

    Decision making has recently emerged as a central theme in neurophysiological studies of cognition, and experimental and computational work has led to the proposal of a cortical circuit mechanism of elemental decision computations. This mechanism depends on slow recurrent synaptic excitation balanced by fast feedback inhibition, which not only instantiates attractor states for forming categorical choices but also long transients for gradually accumulating evidence in favor of or against alternative options. Such a circuit endowed with reward-dependent synaptic plasticity is able to produce adaptive choice behavior. While decision threshold is a core concept for reaction time tasks, it can be dissociated from a general decision rule. Moreover, perceptual decisions and value-based economic choices are described within a unified framework in which probabilistic choices result from irregular neuronal activity as well as iterative interactions of a decision maker with an uncertain environment or other unpredictable decision makers in a social group.

  16. Revised Interim Final Consolidated Enforcement Response and Penalty Policy for the Pre-Renovation Education Rule; Renovation, Repair and Painting Rule; and Lead-Based Paint Activities Rule

    EPA Pesticide Factsheets

    This is the revised version of the Interim Final Consolidated Enforcement Response and Penalty Policy for the Pre-Renovation Education Rule; Renovation, Repair and Painting Rule; and Lead-Based Paint Activities Rule.

  17. Comparison of GOES Cloud Classification Algorithms Employing Explicit and Implicit Physics

    NASA Technical Reports Server (NTRS)

    Bankert, Richard L.; Mitrescu, Cristian; Miller, Steven D.; Wade, Robert H.

    2009-01-01

    Cloud-type classification based on multispectral satellite imagery data has been widely researched and demonstrated to be useful for distinguishing a variety of classes using a wide range of methods. The research described here is a comparison of the classifier output from two very different algorithms applied to Geostationary Operational Environmental Satellite (GOES) data over the course of one year. The first algorithm employs spectral channel thresholding and additional physically based tests. The second algorithm was developed through a supervised learning method with characteristic features of expertly labeled image samples used as training data for a 1-nearest-neighbor classification. The latter's ability to identify classes is also based in physics, but those relationships are embedded implicitly within the algorithm. A pixel-to-pixel comparison analysis was done for hourly daytime scenes within a region in the northeastern Pacific Ocean. Considerable agreement was found in this analysis, with many of the mismatches or disagreements providing insight to the strengths and limitations of each classifier. Depending upon user needs, a rule-based or other postprocessing system that combines the output from the two algorithms could provide the most reliable cloud-type classification.
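
    The supervised scheme in the second algorithm reduces, at its core, to nearest-neighbor matching in a feature space. A minimal 1-nearest-neighbor sketch, with placeholder features and class labels standing in for the expertly labeled GOES training samples:

      import numpy as np

      def one_nn(train_X, train_y, x):
          """Label `x` with the class of its nearest training sample."""
          distances = np.linalg.norm(train_X - x, axis=1)
          return train_y[np.argmin(distances)]

      # Hypothetical per-sample spectral/textural features.
      train_X = np.array([[0.20, 0.90], [0.80, 0.30], [0.25, 0.85], [0.75, 0.35]])
      train_y = np.array(["cirrus", "stratus", "cirrus", "stratus"])
      print(one_nn(train_X, train_y, np.array([0.30, 0.80])))   # -> cirrus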

  18. An online sleep apnea detection method based on recurrence quantification analysis.

    PubMed

    Nguyen, Hoa Dinh; Wilkins, Brek A; Cheng, Qi; Benjamin, Bruce Allen

    2014-07-01

This paper introduces an online sleep apnea detection method based on heart rate complexity as measured by recurrence quantification analysis (RQA) statistics of heart rate variability (HRV) data. RQA statistics can capture the nonlinear dynamics of a complex cardiorespiratory system during obstructive sleep apnea. In order to obtain a more robust measurement of the nonstationarity of the cardiorespiratory system, we use several fixed-amount-of-neighbors thresholds for the recurrence plot calculation. We integrate a feature selection algorithm based on conditional mutual information to select the most informative RQA features for classification and, hence, to speed up the real-time classification process without degrading the performance of the system. Two types of binary classifiers, i.e., support vector machine and neural network, are used to differentiate apnea from normal sleep. A soft decision fusion rule is developed to combine the results of these classifiers in order to improve the classification performance of the whole system. Experimental results show that our proposed method achieves better classification results compared with the previous recurrence analysis-based approach. We also show that our method is flexible and a strong candidate for a practical, efficient sleep apnea detection system.
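
    One common soft fusion rule is a weighted average of the classifiers' posterior probabilities, thresholded to make the final call. A sketch under that assumption; the paper's actual fusion weights and rule may differ:

      def soft_fusion(p_svm, p_nn, w_svm=0.5, w_nn=0.5, cutoff=0.5):
          """Fuse two apnea probabilities by weighted averaging."""
          score = w_svm * p_svm + w_nn * p_nn
          return ("apnea" if score >= cutoff else "normal"), score

      print(soft_fusion(0.7, 0.4))   # ('apnea', 0.55)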

  19. Effect of thermal insulation on the electrical characteristics of NbOx threshold switches

    NASA Astrophysics Data System (ADS)

    Wang, Ziwen; Kumar, Suhas; Wong, H.-S. Philip; Nishi, Yoshio

    2018-02-01

Threshold switches based on niobium oxide (NbOx) are promising candidates as bidirectional selector devices in crossbar memory arrays and building blocks for neuromorphic computing. Here, it is experimentally demonstrated that the electrical characteristics of NbOx threshold switches can be tuned by engineering the thermal insulation. Increasing the thermal insulation by ~10× is shown to produce ~7× reduction in threshold current and ~45% reduction in threshold voltage. The reduced threshold voltage leads to ~5× reduction in half-selection leakage, which highlights the effectiveness of reducing the half-selection leakage of NbOx selectors by engineering the thermal insulation. A thermal feedback model based on Poole-Frenkel conduction in NbOx explains the experimental results very well, which also provides strong evidence supporting the validity of the Poole-Frenkel-based mechanism in NbOx threshold switches.

  20. Rules based process window OPC

    NASA Astrophysics Data System (ADS)

    O'Brien, Sean; Soper, Robert; Best, Shane; Mason, Mark

    2008-03-01

As a preliminary step towards model-based process window OPC, we have analyzed the impact of correcting post-OPC layouts using rules-based methods. Image processing on the Brion Tachyon was used to identify sites where the OPC model/recipe failed to generate an acceptable solution. A set of rules for the 65nm active and poly layers was generated by classifying these failure sites. The rules were based upon segment runlengths, figure spaces, and adjacent figure widths. 2.1 million sites for active were corrected in a small chip (comparing the pre- and post-rules-based operations), and 59 million were found at poly. Tachyon analysis of the final reticle layout found weak-margin sites distinct from those sites repaired by rules-based corrections. For the active layer, more than 75% of the sites corrected by rules would have printed without a defect, indicating that most rules-based cleanups degrade the lithographic pattern. Some sites were missed by the rules-based cleanups due to either bugs in the DRC software or gaps in the rules table. In the end, dramatic changes to the reticle prevented catastrophic lithography errors, but this method is far too blunt. A more subtle model-based procedure is needed, changing only those sites which have unsatisfactory lithographic margin.

  1. Bayesian population decoding of spiking neurons.

    PubMed

    Gerwinn, Sebastian; Macke, Jakob; Bethge, Matthias

    2009-01-01

    The timing of action potentials in spiking neurons depends on the temporal dynamics of their inputs and contains information about temporal fluctuations in the stimulus. Leaky integrate-and-fire neurons constitute a popular class of encoding models, in which spike times depend directly on the temporal structure of the inputs. However, optimal decoding rules for these models have only been studied explicitly in the noiseless case. Here, we study decoding rules for probabilistic inference of a continuous stimulus from the spike times of a population of leaky integrate-and-fire neurons with threshold noise. We derive three algorithms for approximating the posterior distribution over stimuli as a function of the observed spike trains. In addition to a reconstruction of the stimulus we thus obtain an estimate of the uncertainty as well. Furthermore, we derive a 'spike-by-spike' online decoding scheme that recursively updates the posterior with the arrival of each new spike. We use these decoding rules to reconstruct time-varying stimuli represented by a Gaussian process from spike trains of single neurons as well as neural populations.
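
    The 'spike-by-spike' recursion is easiest to see in the simplest conjugate setting: a scalar Gaussian prior over the stimulus updated by a Gaussian likelihood per spike. This sketch substitutes that toy likelihood for the paper's actual leaky integrate-and-fire model with threshold noise; the observation values and noise variance are hypothetical:

      def gaussian_update(mu, var, obs, obs_var):
          """One conjugate update of a Gaussian posterior N(mu, var)."""
          gain = var / (var + obs_var)          # Kalman-style gain
          return mu + gain * (obs - mu), (1.0 - gain) * var

      mu, var = 0.0, 1.0                        # prior over the stimulus
      for obs in [0.8, 1.1, 0.9, 1.0]:          # evidence carried by each spike
          mu, var = gaussian_update(mu, var, obs, obs_var=0.5)
          print(f"posterior mean {mu:.3f}, variance {var:.3f}")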

  2. Lowering the Percolation Threshold of Conductive Composites Using Particulate Polymer Microstructure

    NASA Astrophysics Data System (ADS)

    Grunlan, Jaime; Gerberich, William; Francis, Lorraine

    2000-03-01

In an effort to lower the percolation threshold of carbon black-filled polymer composites, various polymer microstructures were examined. Composites were prepared using polyvinyl acetate (PVAc) latex, PVAc water-dispersible powder and polyvinylpyrrolidone (PVP) solution as the matrix starting material. Composites prepared using the particulate microstructures showed a significantly lowered percolation threshold relative to an equivalently prepared composite using the PVP solution. The PVAc latex-based composites had a percolation threshold of 3 vol%, whereas the PVP solution-based composite yielded a percolation threshold near 15 vol%. Because the carbon black is excluded from the volume occupied by polymer particles, the particulate matrix-based composites create a segregated CB network at low filler concentration.

  3. Some aspects of doping and medication control in equine sports.

    PubMed

    Houghton, Ed; Maynard, Steve

    2010-01-01

    This chapter reviews drug and medication control in equestrian sports and addresses the rules of racing, the technological advances that have been made in drug detection and the importance of metabolism studies in the development of effective drug surveillance programmes. Typical approaches to screening and confirmatory analysis are discussed, as are the quality processes that underpin these procedures. The chapter also addresses four specific topics relevant to equestrian sports: substances controlled by threshold values, the approach adopted recently by European racing authorities to control some therapeutic substances, anabolic steroids in the horse and LC-MS analysis in drug testing in animal sports and metabolism studies. The purpose of discussing these specific topics is to emphasise the importance of research and development and collaboration to further global harmonisation and the development and support of international rules.

  4. Classification Based on Pruning and Double Covered Rule Sets for the Internet of Things Applications

    PubMed Central

    Zhou, Zhongmei; Wang, Weiping

    2014-01-01

The Internet of Things (IOT) has been a hot issue in recent years. It accumulates large amounts of data from IOT users, and mining useful knowledge from IOT is a great challenge. Classification is an effective strategy which can predict the needs of users in IOT. However, many traditional rule-based classifiers cannot guarantee that all instances are covered by at least two classification rules, so these algorithms cannot achieve high accuracy on some datasets. In this paper, we propose a new rule-based classification, CDCR-P (Classification based on the Pruning and Double Covered Rule sets). CDCR-P induces two different rule sets A and B. Every instance in the training set is covered by at least one rule not only in rule set A, but also in rule set B. In order to improve the quality of rule set B, we take measures to prune the length of the rules in rule set B. Our experimental results indicate that CDCR-P is not only feasible, but can also achieve high accuracy. PMID:24511304

  5. Classification based on pruning and double covered rule sets for the internet of things applications.

    PubMed

    Li, Shasha; Zhou, Zhongmei; Wang, Weiping

    2014-01-01

The Internet of Things (IOT) has been a hot issue in recent years. It accumulates large amounts of data from IOT users, and mining useful knowledge from IOT is a great challenge. Classification is an effective strategy which can predict the needs of users in IOT. However, many traditional rule-based classifiers cannot guarantee that all instances are covered by at least two classification rules, so these algorithms cannot achieve high accuracy on some datasets. In this paper, we propose a new rule-based classification, CDCR-P (Classification based on the Pruning and Double Covered Rule sets). CDCR-P induces two different rule sets A and B. Every instance in the training set is covered by at least one rule not only in rule set A, but also in rule set B. In order to improve the quality of rule set B, we take measures to prune the length of the rules in rule set B. Our experimental results indicate that CDCR-P is not only feasible, but can also achieve high accuracy.
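
    The double-coverage property is easy to state in code: every training instance must match at least one rule in each of the two rule sets. A sketch with rules as condition dictionaries plus a class label; the induction and pruning steps, and the example rules and data, are illustrative placeholders:

      def matches(rule, instance):
          """True if every condition of the rule holds in the instance."""
          return all(instance.get(a) == v for a, v in rule["if"].items())

      def double_covered(instances, set_a, set_b):
          """Check that each instance is covered by a rule in A and in B."""
          return all(any(matches(r, x) for r in set_a) and
                     any(matches(r, x) for r in set_b) for x in instances)

      set_a = [{"if": {"network": "wifi"}, "then": "stream-video"}]
      set_b = [{"if": {"battery": "high"}, "then": "stream-video"}]
      data = [{"network": "wifi", "battery": "high"}]
      print(double_covered(data, set_a, set_b))   # True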

  6. A new edge detection algorithm based on Canny idea

    NASA Astrophysics Data System (ADS)

    Feng, Yingke; Zhang, Jinmin; Wang, Siming

    2017-10-01

    The traditional Canny algorithm has a poorly self-adaptive threshold and is sensitive to noise. In order to overcome these drawbacks, this paper proposes a new edge detection method based on the Canny algorithm. Firstly, median filtering and a Euclidean-distance-based filter are used to preprocess the image; secondly, the Frei-Chen operator is used to calculate the gradient amplitude; finally, the Otsu algorithm is applied to local regions of the gradient-amplitude image to obtain a set of threshold values, the high threshold is set to half the average of these values, and the low threshold to half the high threshold. Experimental results show that this new method can effectively suppress noise, preserve edge information, and improve edge detection accuracy.
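
    The threshold-selection step described above can be sketched in a few lines. This is a minimal illustration, not the paper's code: the block size and the use of scikit-image's Otsu implementation are our assumptions, and the input is a precomputed gradient-magnitude image.

        import numpy as np
        from skimage.filters import threshold_otsu

        def hysteresis_thresholds(grad_mag, block=64):
            """Derive Canny high/low thresholds from block-wise Otsu values."""
            h, w = grad_mag.shape
            otsu_values = []
            for i in range(0, h, block):
                for j in range(0, w, block):
                    patch = grad_mag[i:i + block, j:j + block]
                    if patch.size and patch.min() < patch.max():  # Otsu needs >1 grey level
                        otsu_values.append(threshold_otsu(patch))
            high = 0.5 * float(np.mean(otsu_values))  # half the average of local thresholds
            low = 0.5 * high                          # half of the high threshold
            return high, low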

  7. Comparison of three microarray probe annotation pipelines: differences in strategies and their effect on downstream analysis

    PubMed Central

    Neerincx, Pieter BT; Casel, Pierrot; Prickett, Dennis; Nie, Haisheng; Watson, Michael; Leunissen, Jack AM; Groenen, Martien AM; Klopp, Christophe

    2009-01-01

    Background Reliable annotation linking oligonucleotide probes to target genes is essential for functional biological analysis of microarray experiments. We used the IMAD, OligoRAP and sigReannot pipelines to update the annotation for the ARK-Genomics Chicken 20 K array as part of a joint EADGENE/SABRE workshop. In this manuscript we compare their annotation strategies and results. Furthermore, we analyse the effect of differences in updated annotation on functional analysis for an experiment involving Eimeria-infected chickens, and finally we propose guidelines for optimal annotation strategies. Results IMAD, OligoRAP and sigReannot update both annotation and estimated target specificity. The 3 pipelines can assign oligos to target specificity categories, although with varying degrees of resolution. Target specificity is judged based on the amount and type of oligo versus target-gene alignments (hits), which are determined by filter thresholds that users can adjust based on their experimental conditions. Linking oligos to annotation, on the other hand, is based on rigid rules, which differ between pipelines. For 52.7% of the oligos from a subset selected for in-depth comparison, all pipelines linked to one or more Ensembl genes, with consensus on 44.0%. In 31.0% of the cases none of the pipelines could assign an Ensembl gene to an oligo, and for the remaining 16.3% the coverage differed between pipelines. Differences in updated annotation were mainly due to different thresholds for hybridisation-potential filtering of oligo versus target-gene alignments and different policies for expanding annotation using indirect links. The differences in updated annotation packages had a significant effect on GO term enrichment analysis, with consensus on only 67.2% of the enriched terms. Conclusion In addition to flexible thresholds to determine target specificity, annotation tools should provide metadata describing the relationships between oligos and the annotation assigned to them. These relationships can then be used to judge the varying degrees of reliability, allowing users to fine-tune the balance between reliability and coverage. This is important as it can have a significant effect on functional microarray analysis, as exemplified by the lack of consensus on almost one third of the terms found with GO term enrichment analysis based on updated IMAD, OligoRAP or sigReannot annotation. PMID:19615109

  8. Comparison of three microarray probe annotation pipelines: differences in strategies and their effect on downstream analysis.

    PubMed

    Neerincx, Pieter Bt; Casel, Pierrot; Prickett, Dennis; Nie, Haisheng; Watson, Michael; Leunissen, Jack Am; Groenen, Martien Am; Klopp, Christophe

    2009-07-16

    Reliable annotation linking oligonucleotide probes to target genes is essential for functional biological analysis of microarray experiments. We used the IMAD, OligoRAP and sigReannot pipelines to update the annotation for the ARK-Genomics Chicken 20 K array as part of a joint EADGENE/SABRE workshop. In this manuscript we compare their annotation strategies and results. Furthermore, we analyse the effect of differences in updated annotation on functional analysis for an experiment involving Eimeria-infected chickens, and finally we propose guidelines for optimal annotation strategies. IMAD, OligoRAP and sigReannot update both annotation and estimated target specificity. The 3 pipelines can assign oligos to target specificity categories, although with varying degrees of resolution. Target specificity is judged based on the amount and type of oligo versus target-gene alignments (hits), which are determined by filter thresholds that users can adjust based on their experimental conditions. Linking oligos to annotation, on the other hand, is based on rigid rules, which differ between pipelines. For 52.7% of the oligos from a subset selected for in-depth comparison, all pipelines linked to one or more Ensembl genes, with consensus on 44.0%. In 31.0% of the cases none of the pipelines could assign an Ensembl gene to an oligo, and for the remaining 16.3% the coverage differed between pipelines. Differences in updated annotation were mainly due to different thresholds for hybridisation-potential filtering of oligo versus target-gene alignments and different policies for expanding annotation using indirect links. The differences in updated annotation packages had a significant effect on GO term enrichment analysis, with consensus on only 67.2% of the enriched terms. In addition to flexible thresholds to determine target specificity, annotation tools should provide metadata describing the relationships between oligos and the annotation assigned to them. These relationships can then be used to judge the varying degrees of reliability, allowing users to fine-tune the balance between reliability and coverage. This is important as it can have a significant effect on functional microarray analysis, as exemplified by the lack of consensus on almost one third of the terms found with GO term enrichment analysis based on updated IMAD, OligoRAP or sigReannot annotation.

  9. Normal Threshold Size of Stimuli in Children Using a Game-Based Visual Field Test.

    PubMed

    Wang, Yanfang; Ali, Zaria; Subramani, Siddharth; Biswas, Susmito; Fenerty, Cecilia; Henson, David B; Aslam, Tariq

    2017-06-01

    The aim of this study was to demonstrate and explore the ability of a novel game-based perimetry to establish normal visual field thresholds in children. One hundred and eighteen children (aged 8.0 ± 2.8 years) with no history of visual field loss or significant medical history were recruited. Each child had one eye tested using the game-based visual field test 'Caspar's Castle' at four retinal locations 12.7° from fixation (N = 118). Thresholds were established repeatedly using up/down staircase algorithms with stimuli of varying diameter (luminance 20 cd/m², duration 200 ms, background luminance 10 cd/m²). Relationships between threshold and age were determined along with measures of intra- and intersubject variability. The game-based visual field test was able to establish threshold estimates in the full range of children tested. Threshold size decreased with increasing age, and both intrasubject and intersubject variability were inversely related to age. Normal visual field thresholds were established for specific locations in children using a novel game-based visual field test. These could be used as a foundation for developing a game-based perimetry screening test for children.
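
    The up/down staircase procedure used to estimate thresholds can be illustrated with a short simulation. This is a minimal sketch under assumed parameters (a simple noisy observer, a fixed step size, and a stopping rule of eight reversals); it is not the Caspar's Castle implementation.

        import random

        def staircase(true_threshold, start=10.0, step=1.0, reversals_needed=8):
            """1-up/1-down staircase: shrink the stimulus after a detection,
            grow it after a miss; estimate = mean size at the reversal points."""
            size, direction, reversals, rev_sizes = start, 0, 0, []
            while reversals < reversals_needed:
                # hypothetical observer: detects iff size exceeds a noisy threshold
                detected = size >= true_threshold + random.gauss(0, 0.5)
                new_direction = -1 if detected else +1
                if direction != 0 and new_direction != direction:
                    reversals += 1
                    rev_sizes.append(size)
                direction = new_direction
                size = max(step, size + direction * step)
            return sum(rev_sizes) / len(rev_sizes)

        print(staircase(true_threshold=4.0))  # converges near 4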

  10. Rule groupings: A software engineering approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Currently, most expert system shells do not address software engineering issues for developing or maintaining expert systems. As a result, large expert systems tend to be incomprehensible, difficult to debug or modify and almost impossible to verify or validate. Partitioning rule based systems into rule groups which reflect the underlying subdomains of the problem should enhance the comprehensibility, maintainability, and reliability of expert system software. Attempts were made to semiautomatically structure a CLIPS rule base into groups of related rules that carry the same type of information. Different distance metrics that capture relevant information from the rules for grouping are discussed. Two clustering algorithms that partition the rule base into groups of related rules are given. Two independent evaluation criteria are developed to measure the effectiveness of the grouping strategies. Results of the experiment with three sample rule bases are presented.

  11. Novel high/low solubility classification methods for new molecular entities.

    PubMed

    Dave, Rutwij A; Morris, Marilyn E

    2016-09-10

    This research describes a rapid solubility classification approach that could be used in the discovery and development of new molecular entities. Compounds (N = 635) were divided into two groups based on information available in the literature: high solubility (BDDCS/BCS 1/3) and low solubility (BDDCS/BCS 2/4). We established decision rules for determining solubility classes using measured log solubility in molar units (MLogSM) or measured solubility (MSol) in mg/mL units. ROC curve analysis was applied to determine statistically significant threshold values of MSol and MLogSM. Results indicated that NMEs with MLogSM > -3.05 or MSol > 0.30 mg/mL will have ≥85% probability of being highly soluble, and new molecular entities with MLogSM ≤ -3.05 or MSol ≤ 0.30 mg/mL will have ≥85% probability of being poorly soluble. When comparing solubility classification using the threshold values of MLogSM or MSol with BDDCS, we were able to correctly classify 85% of compounds. We also evaluated solubility classification of an independent set of 108 orally administered drugs using MSol (0.3 mg/mL), and our method correctly classified 81% and 95% of compounds into high and low solubility classes, respectively. The high/low solubility classification using MLogSM or MSol is novel and independent of traditionally used dose number criteria.
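
    The reported decision rule is simple enough to state as code. A minimal sketch using the thresholds above; the function and argument names are ours:

        def solubility_class(mlogsm=None, msol=None):
            """Return 'high' or 'low' solubility class from measured solubility:
            MLogSM > -3.05 (molar, log10) or MSol > 0.30 mg/mL => high (>=85% probability)."""
            if mlogsm is not None:
                return "high" if mlogsm > -3.05 else "low"
            if msol is not None:
                return "high" if msol > 0.30 else "low"
            raise ValueError("need MLogSM or MSol")

        print(solubility_class(msol=0.5))  # -> high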

  12. EDRN Breast and Ovary Cancer CVC, Study 2: Phase 3 retrospective validation of ovarian cancer early detection markers in serial preclinical samples from the PLCO trial — EDRN Public Portal

    Cancer.gov

    Over 70% of women with ovarian/fallopian tube cancer (OC) are diagnosed with advanced stage disease which has a 5-year relative survival rate of 30%. Five-year survival is 90% when disease is confined to the ovaries, but overall survival is poor because only 25% of cases are found early. Screening for ovarian cancer using tools with high sensitivity is potentially cost-effective, but because OC is so rare, very high specificity is needed to achieve an acceptable PPV. We have conducted preliminary work both in clinical and in preclinical (CARET) samples. We have identified candidate markers, developed assays for novel markers including HE4 and MSLN, and evaluated their diagnostic performance. We evaluated the markers’ contribution to a diagnostic panel in a standard set in order to identify the best of the candidates and developed methods for combining markers to define a decision rule for a marker panel. We found that our PEB rule yields comparable performance to the Single Threshold (ST) rule 2 years earlier, using the same two markers. The PEB makes an even larger contribution with the 4-marker panel. The 4-marker panel with the PEB rule represents a substantial improvement over any of the other decision rules as a first-line screen to select women for imaging. Our goal in the proposed work is to estimate the improvement in performance possible in the PLCO serial samples.
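
    For contrast with the PEB rule evaluated above, a single-threshold (ST) screen can be sketched in a few lines: flag a sample for imaging when any marker in the panel exceeds its population cut-off. The marker names and cut-offs below are hypothetical, and the PEB rule's use of each woman's serial marker history is precisely what this baseline omits.

        def st_rule(sample, thresholds):
            """sample, thresholds: dicts keyed by marker name."""
            return any(sample[m] > thresholds[m] for m in thresholds)

        # hypothetical two-marker panel with hypothetical cut-offs
        print(st_rule({"CA125": 40, "HE4": 90}, {"CA125": 35, "HE4": 140}))  # -> True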

  13. Cognitive changes in conjunctive rule-based category learning: An ERP approach.

    PubMed

    Rabi, Rahel; Joanisse, Marc F; Zhu, Tianshu; Minda, John Paul

    2018-06-25

    When learning rule-based categories, sufficient cognitive resources are needed to test hypotheses, maintain the currently active rule in working memory, update rules after feedback, and select a new rule if necessary. Prior research has demonstrated that conjunctive rules are more complex than unidimensional rules and place greater demands on executive functions like working memory. In our study, event-related potentials (ERPs) were recorded while participants performed a conjunctive rule-based category learning task with trial-by-trial feedback. In line with prior research, correct categorization responses resulted in a larger stimulus-locked late positive complex compared to incorrect responses, possibly indexing the updating of rule information in memory. Incorrect trials elicited a pronounced feedback-locked P300, suggesting a disconnect between perception and the rule-based strategy. We also examined the differential processing of stimuli that could be correctly classified by the suboptimal single-dimensional rule ("easy" stimuli) versus those that could only be correctly classified by the optimal, conjunctive rule ("difficult" stimuli). Among strong learners, a larger late positive slow wave emerged for difficult compared with easy stimuli, suggesting differential processing of category items even though strong learners performed well on the conjunctive category set. Overall, the findings suggest that ERPs combined with computational modelling can be used to better understand the cognitive processes involved in rule-based category learning.

  14. An evaluation and implementation of rule-based Home Energy Management System using the Rete algorithm.

    PubMed

    Kawakami, Tomoya; Fujita, Naotaka; Yoshihisa, Tomoki; Tsukamoto, Masahiko

    2014-01-01

    In recent years, sensors have become popular, and Home Energy Management Systems (HEMS) play an important role in saving energy without a decrease in QoL (Quality of Life). Many rule-based HEMSs have been proposed, and almost all of them assume "IF-THEN" rules; the Rete algorithm is a typical pattern-matching algorithm for such rules. We have proposed a rule-based HEMS using the Rete algorithm in which rules for managing energy are processed by smart taps in the network, so the loads of processing rules and collecting data are distributed across the smart taps. In addition, the number of processing steps and the amount of collected data are reduced by matching rules with the Rete algorithm. In this paper, we evaluate the proposed system by simulation; in the simulation environment, each rule is processed by the smart tap related to its action part. We also implemented the proposed system as a HEMS using smart taps.
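
    The flavor of such IF-THEN energy rules can be shown with a naive forward-chaining sketch. All device names, facts, and thresholds below are hypothetical, and a real Rete network would additionally cache partial matches so that only the facts changed by new sensor readings are re-tested, rather than re-evaluating every rule as this sketch does.

        facts = {"room_temp": 28.5, "occupied": False, "ac_power": "on"}

        rules = [
            # (condition over the fact base, action to fire)
            (lambda f: f["ac_power"] == "on" and not f["occupied"], "turn_ac_off"),
            (lambda f: f["room_temp"] > 30 and f["occupied"], "turn_ac_on"),
        ]

        def fire(facts, rules):
            """Naively test every rule against the current facts."""
            return [action for cond, action in rules if cond(facts)]

        print(fire(facts, rules))  # -> ['turn_ac_off']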

  15. Statistical characteristics of the sequential detection of signals in correlated noise

    NASA Astrophysics Data System (ADS)

    Averochkin, V. A.; Baranov, P. E.

    1985-10-01

    A solution is given to the problem of determining the distribution of the duration of the sequential two-threshold Wald rule for the time-discrete detection of determinate and Gaussian correlated signals on a background of Gaussian correlated noise. Expressions are obtained for the joint probability densities of the likelihood ratio logarithms, and an analysis is made of the effect of correlation and SNR on the duration distribution and the detection efficiency. Comparison is made with Neumann-Pearson detection.

  16. Dyonisius Exiguus - the Father of the Christian Era Was Born in Romanian Territory

    NASA Astrophysics Data System (ADS)

    Stavinschi, Magda

    In "Liber de Paschal", written in the 6th century, was introduced the new calendar, having as origin the year of Jesus' birth. This idea belongs to Dionysius Exiguus, born in Tomis, in Romanian territory. He established the rules for a new calendar, which was accepted by the whole civilized world. Now, at the threshold of a new millennium, is the moment to know the personality of this erudite monk better and the manner he established the calendar, which became universal.

  17. N Reasons Why Production-Rules are Insufficient Models for Expert System Knowledge Representation Schemes

    DTIC Science & Technology

    1991-02-01

    Excerpt (from the report's contents and body): the report discusses the limitations of rule-based knowledge representation, covering propositional logic as the simplest form of production rules, performance drawbacks of rule-based schemes, and hybrid rule/fact schemas (also known as predicate-calculus relationships). [Report ERL-0520-RR]

  18. Flood Extent Delineation by Thresholding Sentinel-1 SAR Imagery Based on Ancillary Land Cover Information

    NASA Astrophysics Data System (ADS)

    Liang, J.; Liu, D.

    2017-12-01

    Emergency responses to floods require timely information on water extents, which can be produced by satellite-based remote sensing. As SAR images can be acquired under adverse illumination and weather conditions, they are particularly suitable for delineating water extent during a flood event. Thresholding SAR imagery is one of the most widely used approaches to delineate water extent. However, most studies apply only one threshold to separate water and dry land, without considering the complexity and variability of the different dry land surface types in an image. This paper proposes a new thresholding method for SAR imagery that delineates water against each of the other land cover types. A probability distribution of SAR backscatter intensity is fitted for each land cover type, including water before the flood event, and the intersection between two distributions is taken as the threshold to classify the pair. To extract water, a set of thresholds is applied to several pairs of land cover types (e.g., water and urban, or water and forest), and the resulting water subsets are merged to form the water extent for the SAR image acquired during or after the flooding. Experiments show that this land-cover-based thresholding approach outperformed traditional single thresholding by about 5% to 15%. Given the broad acceptance of thresholding-based methods and the availability of land cover data, the method has great application potential, especially for heterogeneous regions.
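
    The per-pair threshold can be sketched directly: fit a density to the water backscatter and to one dry-land class, then locate the point where the two densities cross. The Gaussian (in dB) model below is an illustrative assumption; SAR backscatter is often better modelled with gamma or log-normal distributions, and the root-finding bracket assumes the densities cross between the two class means.

        import numpy as np
        from scipy.stats import norm
        from scipy.optimize import brentq

        def pair_threshold(water_db, land_db):
            """Backscatter (dB) threshold where two fitted class densities intersect."""
            mw, sw = np.mean(water_db), np.std(water_db)
            ml, sl = np.mean(land_db), np.std(land_db)
            f = lambda x: norm.pdf(x, mw, sw) - norm.pdf(x, ml, sl)
            return brentq(f, min(mw, ml), max(mw, ml))  # root between the two means

        water = np.random.normal(-18, 1.5, 1000)   # stand-in dB samples
        forest = np.random.normal(-10, 2.0, 1000)
        print(pair_threshold(water, forest))       # roughly -14 dB here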

  19. Spinal primitives and intra-spinal micro-stimulation (ISMS) based prostheses: a neurobiological perspective on the “known unknowns” in ISMS and future prospects

    PubMed Central

    Giszter, Simon F.

    2015-01-01

    The current literature on Intra-Spinal Micro-Stimulation (ISMS) for motor prostheses is reviewed in light of neurobiological data on spinal organization, and a neurobiological perspective on output motor modularity, ISMS maps, stimulation combination effects, and stability. By comparing published data in these areas, the review identifies several gaps in current knowledge that are crucial to the development of effective intraspinal neuroprostheses. Gaps can be categorized into a lack of systematic and reproducible details of: (a) Topography and threshold for ISMS across the segmental motor system, the topography of autonomic recruitment by ISMS, and the coupling relations between these two types of outputs in practice. (b) Compositional rules for ISMS motor responses tested across the full range of the target spinal topographies. (c) Rules for ISMS effects' dependence on spinal cord state and neural dynamics during naturally elicited or ISMS triggered behaviors. (d) Plasticity of the compositional rules for ISMS motor responses, and understanding plasticity of ISMS topography in different spinal cord lesion states, disease states, and following rehabilitation. All these knowledge gaps to a greater or lesser extent require novel electrode technology in order to allow high density chronic recording and stimulation. The current lack of this technology may explain why these prominent gaps in the ISMS literature currently exist. It is also argued that given the “known unknowns” in the current ISMS literature, it may be prudent to adopt and develop control schemes that can manage the current results with simple superposition and winner-take-all interactions, but can also incorporate the possible plastic and stochastic dynamic interactions that may emerge in fuller analyses over longer terms, and which have already been noted in some simpler model systems. PMID:25852454

  20. A common fluence threshold for first positive and second positive phototropism in Arabidopsis thaliana

    NASA Technical Reports Server (NTRS)

    Janoudi, A.; Poff, K. L.

    1990-01-01

    The relationship between the amount of light and the amount of response for any photobiological process can be based on the number of incident quanta per unit time (fluence rate-response) or on the number of incident quanta during a given period of irradiation (fluence-response). Fluence-response and fluence rate-response relationships have been measured for second positive phototropism by seedlings of Arabidopsis thaliana. The fluence-response relationships exhibit a single limiting threshold at about 0.01 micromole per square meter when measured at fluence rates from 2.4 x 10(-5) to 6.5 x 10(-3) micromoles per square meter per second. The threshold values in the fluence rate-response curves decrease with increasing time of irradiation, but show a common fluence threshold at about 0.01 micromole per square meter. These thresholds are the same as the threshold of about 0.01 micromole per square meter measured for first positive phototropism. Based on these data, it is suggested that second positive curvature has a threshold in time of about 10 minutes. Moreover, if the times of irradiation exceed the time threshold, there is a single limiting fluence threshold at about 0.01 micromole per square meter. Thus, the limiting fluence threshold for second positive phototropism is the same as the fluence threshold for first positive phototropism. Based on these data, we suggest that this common fluence threshold for first positive and second positive phototropism is set by a single photoreceptor pigment system.
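
    A quick reciprocity check of the numbers above, using fluence = fluence rate × exposure time:

        fluence_threshold = 0.01          # micromole per square meter
        for rate in (2.4e-5, 6.5e-3):     # extremes of the fluence rates tested
            print(rate, fluence_threshold / rate, "s to reach threshold")
        # at the lowest rate, ~417 s (~7 min) -- on the order of the ~10-minute
        # time threshold suggested for second positive curvature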

  1. A critique of the use of indicator-species scores for identifying thresholds in species responses

    USGS Publications Warehouse

    Cuffney, Thomas F.; Qian, Song S.

    2013-01-01

    Identification of ecological thresholds is important both for theoretical and applied ecology. Recently, Baker and King (2010; King and Baker 2010) proposed a method, threshold indicator analysis (TITAN), to calculate species and community thresholds based on indicator species scores adapted from Dufrêne and Legendre (1997). We tested the ability of TITAN to detect thresholds using models with (broken-stick, disjointed broken-stick, dose-response, step-function, Gaussian) and without (linear) definitive thresholds. TITAN accurately and consistently detected thresholds in step-function models, but not in models characterized by abrupt changes in response slopes or response direction. Threshold detection in TITAN was very sensitive to the distribution of 0 values, which caused TITAN to identify thresholds associated with relatively small differences in the distribution of 0 values while ignoring thresholds associated with large changes in abundance. Threshold identification and tests of statistical significance were based on the same data permutations, resulting in inflated estimates of statistical significance. Application of bootstrapping to the split-point problem that underlies TITAN led to underestimates of the confidence intervals of thresholds. Bias in the derivation of the z-scores used to identify TITAN thresholds, together with skewness in the distribution of data along the gradient, produced TITAN thresholds that were much more similar than the actual thresholds; this tendency may account for the synchronicity of thresholds reported in TITAN analyses. The thresholds identified by TITAN represented disparate characteristics of species responses that, when coupled with TITAN's inability to identify thresholds accurately and consistently, do not support the aggregation of individual species thresholds into a community threshold.

  2. Estimation of pulse rate from ambulatory PPG using ensemble empirical mode decomposition and adaptive thresholding.

    PubMed

    Pittara, Melpo; Theocharides, Theocharis; Orphanidou, Christina

    2017-07-01

    A new method for deriving pulse rate from PPG obtained from ambulatory patients is presented. The method employs Ensemble Empirical Mode Decomposition to identify the pulsatile component from noise-corrupted PPG, and then uses a set of physiologically-relevant rules followed by adaptive thresholding, in order to estimate the pulse rate in the presence of noise. The method was optimized and validated using 63 hours of data obtained from ambulatory hospital patients. The F1 score obtained with respect to expertly annotated data was 0.857 and the mean absolute errors of estimated pulse rates with respect to heart rates obtained from ECG collected in parallel were 1.72 bpm for "good" quality PPG and 4.49 bpm for "bad" quality PPG. Both errors are within the clinically acceptable margin-of-error for pulse rate/heart rate measurements, showing the promise of the proposed approach for inclusion in next generation wearable sensors.
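
    The adaptive-thresholding stage (after the pulsatile component has been isolated) can be sketched with a simple decaying-threshold peak picker. This skips the EEMD step entirely and assumes a band-limited PPG signal x sampled at fs Hz; the decay rate and refractory period are illustrative, not the paper's tuned values.

        import numpy as np

        def pulse_rate(x, fs, decay=0.997, refractory=0.3):
            """Estimate pulse rate (bpm) with a threshold that decays between beats."""
            thr, last, beats = float(np.percentile(x, 90)), -np.inf, []
            for i in range(1, len(x) - 1):
                thr *= decay  # relax the threshold so quiet stretches are not missed
                if x[i] > thr and x[i] >= x[i - 1] and x[i] >= x[i + 1]:
                    if i - last > refractory * fs:   # physiologically plausible spacing
                        beats.append(i)
                        last = i
                        thr = x[i]                   # re-arm at the new peak height
            if len(beats) < 2:
                return None
            return 60.0 * fs / float(np.mean(np.diff(beats)))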

  3. Measurement of the generalized form factors near threshold via γ*p→nπ⁺ at high Q²

    NASA Astrophysics Data System (ADS)

    Park, K.; Gothe, R. W.; Adhikari, K. P.; Adikaram, D.; Anghinolfi, M.; Baghdasaryan, H.; Ball, J.; Battaglieri, M.; Batourine, V.; Bedlinskiy, I.; Bennett, R. P.; Biselli, A. S.; Bookwalter, C.; Boiarinov, S.; Branford, D.; Briscoe, W. J.; Brooks, W. K.; Burkert, V. D.; Carman, D. S.; Celentano, A.; Chandavar, S.; Charles, G.; Cole, P. L.; Contalbrigo, M.; Crede, V.; D'Angelo, A.; Daniel, A.; Dashyan, N.; De Vita, R.; De Sanctis, E.; Deur, A.; Djalali, C.; Doughty, D.; Dupre, R.; El Alaoui, A.; El Fassi, L.; Eugenio, P.; Fedotov, G.; Fradi, A.; Gabrielyan, M. Y.; Gevorgyan, N.; Gilfoyle, G. P.; Giovanetti, K. L.; Girod, F. X.; Goetz, J. T.; Gohn, W.; Golovatch, E.; Graham, L.; Griffioen, K. A.; Guidal, M.; Guo, L.; Hafidi, K.; Hakobyan, H.; Hanretty, C.; Heddle, D.; Hicks, K.; Holtrop, M.; Hyde, C. E.; Ilieva, Y.; Ireland, D. G.; Ishkhanov, B. S.; Isupov, E. L.; Jenkins, D.; Jo, H. S.; Joo, K.; Kalantarians, N.; Khandaker, M.; Khetarpal, P.; Kim, A.; Kim, W.; Klein, A.; Klein, F. J.; Kubarovsky, A.; Kubarovsky, V.; Kuhn, S. E.; Kuleshov, S. V.; Kvaltine, N. D.; Livingston, K.; Lu, H. Y.; MacGregor, I. J. D.; Markov, N.; Mayer, M.; McKinnon, B.; Mestayer, M. D.; Meyer, C. A.; Mineeva, T.; Mirazita, M.; Mokeev, V.; Moutarde, H.; Munevar, E.; Nadel-Turonski, P.; Nasseripour, R.; Niccolai, S.; Niculescu, G.; Niculescu, I.; Osipenko, M.; Ostrovidov, A. I.; Paolone, M.; Pappalardo, L.; Paremuzyan, R.; Park, S.; Pereira, S. Anefalos; Phelps, E.; Pisano, S.; Pogorelko, O.; Pozdniakov, S.; Price, J. W.; Procureur, S.; Prok, Y.; Ricco, G.; Rimal, D.; Ripani, M.; Ritchie, B. G.; Rosner, G.; Rossi, P.; Sabatié, F.; Saini, M. S.; Salgado, C.; Schott, D.; Schumacher, R. A.; Seraydaryan, H.; Sharabian, Y. G.; Smith, E. S.; Smith, G. D.; Sober, D. I.; Sokhan, D.; Stepanyan, S. S.; Stepanyan, S.; Stoler, P.; Strakovsky, I. I.; Strauch, S.; Taiuti, M.; Tang, W.; Taylor, C. E.; Tian, Y.; Tkachenko, S.; Trivedi, A.; Ungaro, M.; Vernarsky, B.; Vlassov, A. V.; Voutier, E.; Watts, D. P.; Weygand, D. P.; Wood, M. H.; Zachariou, N.; Zhao, B.; Zhao, Z. W.

    2012-03-01

    We report the first extraction of the pion-nucleon multipoles near the production threshold for the nπ⁺ channel at relatively high momentum transfer (Q² up to 4.2 GeV²). The dominance of the s-wave transverse multipole (E0+), expected in this region, allowed us to access the generalized form factor G1 within the light-cone sum-rule (LCSR) framework as well as the axial form factor GA. The data analyzed in this work were collected by the nearly 4π CEBAF Large Acceptance Spectrometer (CLAS) using a 5.754-GeV electron beam on a proton target. The differential cross section and the π-N multipole E0+/GD were measured using two different methods, the LCSR and a direct multipole fit. The results from the two methods are found to be consistent and almost Q²-independent.

  4. Attentional limits on the perception and memory of visual information.

    PubMed

    Palmer, J

    1990-05-01

    Attentional limits on perception and memory were measured by the decline in performance with increasing numbers of objects in a display. Multiple objects were presented to Ss who discriminated visual attributes. In a representative condition, 4 lines were briefly presented followed by a single line in 1 of the same locations. Ss were required to judge if the single line in the 2nd display was longer or shorter than the line in the corresponding location of the 1st display. The length difference threshold was calculated as a function of the number of objects. The difference thresholds doubled when the number of objects was increased from 1 to 4. This effect was generalized in several ways, and nonattentional explanations were ruled out. Further analyses showed that the attentional processes must share information from at least 4 objects and can be described by a simple model.

  5. Spin Dependence of η Meson Production in Proton-Proton Collisions Close to Threshold.

    PubMed

    Adlarson, P; Augustyniak, W; Bardan, W; Bashkanov, M; Bass, S D; Bergmann, F S; Berłowski, M; Bondar, A; Büscher, M; Calén, H; Ciepał, I; Clement, H; Czerwiński, E; Demmich, K; Engels, R; Erven, A; Erven, W; Eyrich, W; Fedorets, P; Föhl, K; Fransson, K; Goldenbaum, F; Goswami, A; Grigoryev, K; Gullström, C-O; Heijkenskjöld, L; Hejny, V; Hüsken, N; Jarczyk, L; Johansson, T; Kamys, B; Kemmerling, G; Khatri, G; Khoukaz, A; Khreptak, O; Kirillov, D A; Kistryn, S; Kleines, H; Kłos, B; Krzemień, W; Kulessa, P; Kupść, A; Kuzmin, A; Lalwani, K; Lersch, D; Lorentz, B; Magiera, A; Maier, R; Marciniewski, P; Mariański, B; Morsch, H-P; Moskal, P; Ohm, H; Parol, W; Perez Del Rio, E; Piskunov, N M; Prasuhn, D; Pszczel, D; Pysz, K; Pyszniak, A; Ritman, J; Roy, A; Rudy, Z; Rundel, O; Sawant, S; Schadmand, S; Schätti-Ozerianska, I; Sefzick, T; Serdyuk, V; Shwartz, B; Sitterberg, K; Skorodko, T; Skurzok, M; Smyrski, J; Sopov, V; Stassen, R; Stepaniak, J; Stephan, E; Sterzenbach, G; Stockhorst, H; Ströher, H; Szczurek, A; Trzciński, A; Wolke, M; Wrońska, A; Wüstner, P; Yamamoto, A; Zabierowski, J; Zieliński, M J; Złomańczuk, J; Żuprański, P; Żurek, M

    2018-01-12

    Taking advantage of the high acceptance and axial symmetry of the WASA-at-COSY detector, and the high polarization degree of the proton beam of COSY, the reaction p⃗p→ppη has been measured close to threshold to explore the analyzing power A_y. The angular distribution of A_y is determined with a precision improved by more than 1 order of magnitude with respect to previous results, allowing a first accurate comparison with theoretical predictions. The determined analyzing power is consistent with zero for an excess energy of Q = 15 MeV, signaling s-wave production with no evidence for higher partial waves. At Q = 72 MeV the data reveal strong interference of Ps and Pp partial waves and cancellation of (Pp)² and Ss*Sd contributions. These results rule out the presently available theoretical predictions for the production mechanism of the η meson.

  6. 75 FR 34277 - Federal Acquisition Regulation; FAR Case 2008-007, Additional Requirements for Market Research

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-16

    The Civilian Agency Acquisition Council and the Defense Acquisition Regulations Council (Councils) have agreed on an interim rule amending the Federal Acquisition Regulation (FAR) to implement Section 826 of the National Defense Authorization Act for Fiscal Year 2008 (FY08 NDAA). Section 826 established additional requirements in subsection (c) of 10 U.S.C. 2377. As a matter of policy, these requirements are extended to all executive agencies. Specifically, the head of the agency must conduct market research before issuing an indefinite-delivery indefinite-quantity (ID/IQ) task or delivery order for a noncommercial item in excess of the simplified acquisition threshold. In addition, a prime contractor with a contract in excess of $5 million for the procurement of items other than commercial items is required to conduct market research before making purchases that exceed the simplified acquisition threshold for or on behalf of the Government.

  7. Challenges for Rule Systems on the Web

    NASA Astrophysics Data System (ADS)

    Hu, Yuh-Jong; Yeh, Ching-Long; Laun, Wolfgang

    The RuleML Challenge started in 2007 with the objective of inspiring work on the implementation, management, integration, interoperation and interchange of rules in an open distributed environment, such as the Web. Rules are usually classified into three types: deductive rules, normative rules, and reactive rules; the reactive rules are further classified into ECA rules and production rules. The study of combining rules and ontologies traces back to earlier active rule systems for relational and object-oriented (OO) databases, and has recently become one of the most important research problems in the Semantic Web. Once we regard a computer-executable policy as a declarative set of rules and ontologies that guides the behavior of entities within a system, we have a flexible way to implement real-world policies without rewriting computer code, as we had to before. Fortunately, we have de facto rule markup languages, such as RuleML or RIF, to achieve the portability and interchange of rules between different rule systems; otherwise, executing real-life rule-based applications on the Web would be almost impossible. Several commercial and open-source rule engines are available for rule-based applications. However, we still need a standard rule language and benchmark, not only to compare rule systems but also to measure progress in the field. Finally, a number of real-life rule-based use cases will be investigated to demonstrate the applicability of current rule systems on the Web.

  8. Typology of patients with fibromyalgia: cluster analysis of duloxetine study patients.

    PubMed

    Lipkovich, Ilya A; Choy, Ernest H; Van Wambeke, Peter; Deberdt, Walter; Sagman, Doron

    2014-12-23

    To identify distinct groups of patients with fibromyalgia (FM) with respect to multiple outcome measures. Data from 631 duloxetine-treated women in 4 randomized, placebo-controlled trials were included in a cluster analysis based on outcomes after up to 12 weeks of treatment. Corresponding classification rules were constructed using a classification tree method. Probabilities for transitioning from baseline to Week 12 category were estimated for placebo and duloxetine patients (Ntotal = 1188) using logistic regression. Five clusters were identified, from "worst" (high pain levels and severe mental/physical impairment) to "best" (low pain levels and nearly normal mental/physical function). For patients with moderate overall severity, mental and physical symptoms were less correlated, resulting in 2 distinct clusters based on these 2 symptom domains. Three key variables with threshold values were identified for classification of patients: Brief Pain Inventory (BPI) pain interference overall scores of <3.29 and <7.14, respectively, a Fibromyalgia Impact Questionnaire (FIQ) interference with work score of <2, and an FIQ depression score of ≥5. Patient characteristics and frequencies per baseline category were similar between treatments; >80% of patients were in the 3 worst categories. Duloxetine patients were significantly more likely to improve after 12 weeks than placebo patients. A sustained effect was seen with continued duloxetine treatment. FM patients are heterogeneous and can be classified into distinct subgroups by simple descriptive rules derived from only 3 variables, which may guide individual patient management. Duloxetine showed higher improvement rates than placebo and had a sustained effect beyond 12 weeks.
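
    As a reading of the three reported variables and their cut-offs, a rule-based classifier is a one-screen function. The cut-offs below are those quoted above (BPI interference 3.29 and 7.14, FIQ work interference < 2, FIQ depression ≥ 5); how the rule combinations map onto the five clusters is our illustrative guess, not the published tree.

        def fm_severity(bpi_interference, fiq_work, fiq_depression):
            """Toy severity classifier from the three key variables (mapping assumed)."""
            if bpi_interference < 3.29:
                return "best" if fiq_work < 2 else "mild"
            if bpi_interference < 7.14:
                # moderate severity: mental vs physical domains split the clusters
                return "moderate-mental" if fiq_depression >= 5 else "moderate-physical"
            return "worst"

        print(fm_severity(5.0, 3.0, 6.0))  # -> moderate-mental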

  9. An Approach for the Accurate Measurement of Social Morality Levels

    PubMed Central

    Liu, Haiyan; Chen, Xia; Zhang, Bo

    2013-01-01

    In the social sciences, computer-based modeling has become an increasingly important tool receiving widespread attention. However, the derivation of the quantitative relationships linking individual moral behavior and social morality levels, so as to provide a useful basis for social policy-making, remains a challenge in the scholarly literature today. A quantitative measurement of morality from the perspective of complexity science constitutes an innovative attempt. Based on the NetLogo platform, this article examines the effect of various factors on social morality levels, using agents modeling moral behavior, immoral behavior, and a range of environmental social resources. Threshold values for the various parameters are obtained through sensitivity analysis; and practical solutions are proposed for reversing declines in social morality levels. The results show that: (1) Population size may accelerate or impede the speed with which immoral behavior comes to determine the overall level of social morality, but it has no effect on the level of social morality itself; (2) The impact of rewards and punishment on social morality levels follows the “5∶1 rewards-to-punishment rule,” which is to say that 5 units of rewards have the same effect as 1 unit of punishment; (3) The abundance of public resources is inversely related to the level of social morality; (4) When the cost of population mobility reaches 10% of the total energy level, immoral behavior begins to be suppressed (i.e. the 1/10 moral cost rule). The research approach and methods presented in this paper successfully address the difficulties involved in measuring social morality levels, and promise extensive application potentials. PMID:24312189

  10. An approach for the accurate measurement of social morality levels.

    PubMed

    Liu, Haiyan; Chen, Xia; Zhang, Bo

    2013-01-01

    In the social sciences, computer-based modeling has become an increasingly important tool receiving widespread attention. However, the derivation of the quantitative relationships linking individual moral behavior and social morality levels, so as to provide a useful basis for social policy-making, remains a challenge in the scholarly literature today. A quantitative measurement of morality from the perspective of complexity science constitutes an innovative attempt. Based on the NetLogo platform, this article examines the effect of various factors on social morality levels, using agents modeling moral behavior, immoral behavior, and a range of environmental social resources. Threshold values for the various parameters are obtained through sensitivity analysis; and practical solutions are proposed for reversing declines in social morality levels. The results show that: (1) Population size may accelerate or impede the speed with which immoral behavior comes to determine the overall level of social morality, but it has no effect on the level of social morality itself; (2) The impact of rewards and punishment on social morality levels follows the "5∶1 rewards-to-punishment rule," which is to say that 5 units of rewards have the same effect as 1 unit of punishment; (3) The abundance of public resources is inversely related to the level of social morality; (4) When the cost of population mobility reaches 10% of the total energy level, immoral behavior begins to be suppressed (i.e. the 1/10 moral cost rule). The research approach and methods presented in this paper successfully address the difficulties involved in measuring social morality levels, and promise extensive application potentials.

  11. Dual-threshold segmentation using Arimoto entropy based on chaotic bee colony optimization

    NASA Astrophysics Data System (ADS)

    Li, Li

    2018-03-01

    In order to extract targets from complex backgrounds more quickly and accurately, and to further improve the detection of defects, a dual-threshold segmentation method using Arimoto entropy based on chaotic bee colony optimization was proposed. Firstly, single-threshold selection based on Arimoto entropy was extended to dual-threshold selection in order to separate the target from the background more accurately. Then the intermediate variables in the formulae for Arimoto-entropy dual-threshold selection were calculated by recursion to eliminate redundant computation and reduce the amount of calculation. Finally, the local search phase of the artificial bee colony algorithm was improved with a chaotic sequence based on the tent map, and the fast search for the two optimal thresholds was carried out with the improved optimizer, noticeably accelerating the search. A large number of experimental results show that, compared with existing segmentation methods such as multi-threshold segmentation using maximum Shannon entropy, two-dimensional Shannon entropy segmentation, two-dimensional Tsallis gray entropy segmentation, and multi-threshold segmentation using reciprocal gray entropy, the proposed method segments targets more quickly and accurately, with superior segmentation quality. It proves to be a fast and effective method for image segmentation.
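
    The structure of the dual-threshold search that the recursion and the bee-colony optimizer accelerate can be shown with a brute-force baseline. The sketch below substitutes Shannon (Kapur-style) class entropy for the paper's Arimoto entropy, since the point here is only the shape of the (t1, t2) search over the grey-level histogram.

        import numpy as np

        def class_entropy(p):
            """Shannon entropy of one class, normalized within the class."""
            p = p[p > 0]
            if p.size == 0:
                return 0.0
            p = p / p.sum()
            return float(-(p * np.log(p)).sum())

        def dual_threshold(image, levels=256):
            """Exhaustive search for the entropy-maximizing pair (t1, t2)."""
            hist, _ = np.histogram(image, bins=levels, range=(0, levels))
            p = hist / hist.sum()
            best, best_t = -np.inf, (0, 0)
            for t1 in range(1, levels - 1):
                for t2 in range(t1 + 1, levels):
                    h = (class_entropy(p[:t1]) + class_entropy(p[t1:t2])
                         + class_entropy(p[t2:]))
                    if h > best:
                        best, best_t = h, (t1, t2)
            return best_t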

  12. Quantum threshold reflection is not a consequence of a region of the long-range attractive potential with rapidly varying de Broglie wavelength

    NASA Astrophysics Data System (ADS)

    Petersen, Jakob; Pollak, Eli; Miret-Artes, Salvador

    2018-04-01

    Quantum threshold reflection is a well-known quantum phenomenon which prescribes that at threshold, except for special circumstances, a quantum particle scattering from any potential, even if attractive at long range, will be reflected with unit probability. In the past, this property had been associated with the so-called badlands region of the potential, where the semiclassical description of the scattering fails due to a rapid spatial variation of the de Broglie wavelength. This badlands region occurs far from the strong interaction region of the potential and has therefore been used to "explain" the quantum reflection phenomenon. In this paper we show that the badlands region of the interaction potential is immaterial. The extremely long wavelength of the scattered particle at threshold is much longer than the spatial extension of the badlands region, which therefore does not affect the scattering. For this purpose, we review and generalize the proof for the existence of quantum threshold reflection to stress that it is only a consequence of continuity and boundary conditions. The nonlocal character of the scattering implies that the whole interaction potential is involved in the phenomenon. We then provide a detailed numerical study of the threshold scattering of a particle by a Morse potential and an Eckart potential, especially in the time domain. We compare exact quantum computations with incoherent results obtained from a classical Wigner approximation. This study shows that close to threshold the time-dependent amplitude of the scattered particle is negligible in the badlands region and is the same whether the potential has a reflecting wall as in the Morse potential or a steplike structure as in the Eckart smooth step potential. The mean flight time of the particle is not shortened due to a local reflection from the badlands region or due to the lower density of the wave function at short distances. This study should serve to definitely rule out the badlands region as a qualitative guide to the properties of quantum threshold reflection.

  13. Accuracy of cancellous bone volume fraction measured by micro-CT scanning.

    PubMed

    Ding, M; Odgaard, A; Hvid, I

    1999-03-01

    Volume fraction, the single most important parameter in describing trabecular microstructure, can easily be calculated from three-dimensional reconstructions of micro-CT images. This study sought to quantify the accuracy of this measurement. One hundred and sixty human cancellous bone specimens which covered a large range of volume fraction (9.8-39.8%) were produced. The specimens were micro-CT scanned, and the volume fraction based on Archimedes' principle was determined as a reference. After scanning, all micro-CT data were segmented using individual thresholds determined by the scanner supplied algorithm (method I). A significant deviation of volume fraction from method I was found: both the y-intercept and the slope of the regression line were significantly different from those of the Archimedes-based volume fraction (p < 0.001). New individual thresholds were determined based on a calibration of volume fraction to the Archimedes-based volume fractions (method II). The mean thresholds of the two methods were applied to segment 20 randomly selected specimens. The results showed that volume fraction using the mean threshold of method I was underestimated by 4% (p = 0.001), whereas the mean threshold of method II yielded accurate values. The precision of the measurement was excellent. Our data show that care must be taken when applying thresholds in generating 3-D data, and that a fixed threshold may be used to obtain reliable volume fraction data. This fixed threshold may be determined from the Archimedes-based volume fraction of a subgroup of specimens. The threshold may vary between different materials, and so it should be determined whenever a study series is performed.
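
    The calibration idea behind method II reduces to a one-liner: because the foreground fraction is monotonically decreasing in the threshold, the grey level at which the segmented volume fraction matches the Archimedes reference is simply a quantile. Variable names below are ours.

        import numpy as np

        def calibrated_threshold(ct_volume, reference_vf):
            """Threshold whose foreground fraction equals the reference volume fraction."""
            return float(np.quantile(ct_volume, 1.0 - reference_vf))

        vol = np.random.normal(100, 20, size=(64, 64, 64))  # stand-in CT grey levels
        t = calibrated_threshold(vol, 0.25)
        print((vol >= t).mean())  # ~0.25 by construction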

  14. Simulating Flaring Events via an Intelligent Cellular Automata Mechanism

    NASA Astrophysics Data System (ADS)

    Dimitropoulou, M.; Vlahos, L.; Isliker, H.; Georgoulis, M.

    2010-07-01

    We simulate flaring events through a Cellular Automaton (CA) model, in which, for the first time, we use observed vector magnetograms as initial conditions. After non-linear force-free extrapolation of the magnetic field from the vector magnetograms, we identify magnetic discontinuities, using two alternative criteria: (1) the average magnetic field gradient, or (2) the normalized magnetic field curl (i.e. the current). Magnetic discontinuities are identified at the grid sites where the magnetic field gradient or curl exceeds a specified threshold. We then relax the magnetic discontinuities according to the rules of Lu and Hamilton (1991) or Lu et al. (1993), i.e. we redistribute the magnetic field locally so that the discontinuities disappear. In order to simulate the flaring events, we consider several alternative scenarios with regard to: (1) the threshold above which magnetic discontinuities are identified (applying low, high, and height-dependent threshold values); (2) the driving process that occasionally causes new discontinuities (at randomly chosen grid sites, magnetic field increments are added that are perpendicular, or maybe also parallel, to the existing magnetic field). We address the question whether the coronal active region magnetic fields can indeed be considered to be in a state of self-organized criticality (SOC).
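
    The relaxation step of Lu and Hamilton (1991) can be sketched on a scalar 2-D grid: where the local field exceeds the mean of its neighbours by more than a critical value, four fifths of the excess is removed from the site and one fifth is passed to each neighbour. The grid size, scalar field, and random driving below are deliberate simplifications of the extrapolated vector-field setup described above.

        import numpy as np

        def ca_step(B, Bc=4.0):
            """Drive one random site, then relax all discontinuities; return released 'energy'."""
            n = B.shape[0]
            B[np.random.randint(n), np.random.randint(n)] += np.random.uniform(0.0, 0.2)
            released = 0.0
            while True:
                excess = B - 0.25 * (np.roll(B, 1, 0) + np.roll(B, -1, 0)
                                     + np.roll(B, 1, 1) + np.roll(B, -1, 1))
                dB = np.where(np.abs(excess) > Bc, excess, 0.0)
                if not dB.any():
                    return released
                released += float(np.abs(dB).sum())
                B -= 0.8 * dB                            # remove 4/5 of the excess
                for ax, sh in ((0, 1), (0, -1), (1, 1), (1, -1)):
                    B += 0.2 * np.roll(dB, sh, axis=ax)  # pass 1/5 to each neighbour

        B = np.zeros((32, 32))
        avalanches = [ca_step(B) for _ in range(5000)]   # flare-size statistics proxy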

  15. Perceived area and the luminosity threshold.

    PubMed

    Bonato, F; Gilchrist, A L

    1999-07-01

    Observers made forced-choice opaque/luminous responses to targets of varying luminance and varying size presented (1) on the wall of a laboratory, (2) as a disk within an annulus, and (3) embedded within a Mondrian array presented within a vision tunnel. Lightness matches were also made for nearby opaque surfaces. The results show that the threshold luminance value at which a target begins to appear self-luminous increases with its size, defined as perceived size, not retinal size. More generally, the larger the target, the more an increase in its luminance induces grayness/blackness into the surround and the less it induces luminosity into the target, and vice versa. Corresponding to this luminosity/grayness tradeoff, there appears to be an invariant: across a wide variety of conditions, a target begins to appear luminous when its luminance is about 1.7 times that of a surface that would appear white in the same illumination. These results show that the luminosity threshold behaves like a surface lightness value (the maximum lightness value, in fact) and is subject to the same laws of anchoring (such as the area rule proposed by Li & Gilchrist, 1999) as surface lightness.

  16. Topical hindpaw application of L-menthol decreases responsiveness to heat with biphasic effects on cold sensitivity of rat lumbar dorsal horn neurons

    PubMed Central

    Klein, Amanda H.; Sawyer, Carolyn M.; Takechi, Kenichi; Davoodi, Auva; Ivanov, Margaret A.; Carstens, Mirela Iodi; Carstens, E

    2012-01-01

    Menthol is used in pharmaceutical applications because of its desired cooling and analgesic properties. The neural mechanism by which topical application of menthol decreases heat pain is not fully understood. We investigated the effects of topical menthol application on lumbar dorsal horn wide dynamic range and nociceptive-specific neuronal responses to noxious heat and cooling of glabrous hindpaw cutaneous receptive fields. Menthol increased thresholds for responses to noxious heat in a concentration-dependent manner. Menthol had a biphasic effect on cold-evoked responses, reducing the threshold (to warmer temperatures) at a low (1%) concentration and increasing threshold and reducing response magnitude at high (10, 40%) concentrations. Menthol had little effect on responses to innocuous or noxious mechanical stimuli, ruling out a local anesthetic action. Application of 40% menthol to the contralateral hindpaw tended to reduce responses to cooling and noxious heat, suggesting a weak heterosegmental inhibitory effect. These results indicate that menthol has an analgesic effect on heat sensitivity of nociceptive dorsal horn neurons, as well as biphasic effects on cold sensitivity, consistent with previous behavioral observations. PMID:22687951

  17. ConGEMs: Condensed Gene Co-Expression Module Discovery Through Rule-Based Clustering and Its Application to Carcinogenesis.

    PubMed

    Mallik, Saurav; Zhao, Zhongming

    2017-12-28

    For transcriptomic analysis, there are numerous microarray-based genomic data, especially those generated for cancer research. The typical analysis measures the difference between a cancer sample group and a matched control group for each transcript or gene. Association rule mining is used to discover interesting item sets through rule-based methodology; thus, it has advantages for finding causal-effect relationships between transcripts. In this work, we introduce two new rule-based similarity measures, the weighted rank-based Jaccard and Cosine measures, and then propose a novel computational framework to detect condensed gene co-expression modules (ConGEMs) through an association rule-based learning system and the weighted similarity scores. In practice, the list of evolved condensed markers, which consists of both singular and complex markers, depends on the corresponding condensed gene sets in either the antecedent or the consequent of the rules of the resultant modules. In our evaluation, these markers could be supported by literature evidence, KEGG (Kyoto Encyclopedia of Genes and Genomes) pathways and Gene Ontology annotations. Specifically, we preliminarily identified differentially expressed genes using an empirical Bayes test. A recently developed algorithm, RANWAR, was then utilized to determine the association rules from these genes. Based on that, we computed the integrated similarity scores of these rule-based similarity measures between each rule pair, and the resultant scores were used for clustering to identify the co-expressed rule modules. We applied our method to a gene expression dataset for lung squamous cell carcinoma and a genome methylation dataset for uterine cervical carcinogenesis. Our proposed module discovery method produced better results than traditional gene-module discovery measures. In summary, our proposed rule-based method is useful for exploring biomarker modules from transcriptomic data.
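
    One plausible reading of a weighted rank-based Jaccard score between two rules treats each rule as the set of genes it mentions, weighted by rank (e.g. 1/rank); the exact weighting used in ConGEMs may differ, so this is a hedged sketch with hypothetical gene names.

        def weighted_jaccard(rule_a, rule_b):
            """rule_*: dict mapping gene -> weight (e.g. 1/rank within the rule)."""
            genes = set(rule_a) | set(rule_b)
            num = sum(min(rule_a.get(g, 0.0), rule_b.get(g, 0.0)) for g in genes)
            den = sum(max(rule_a.get(g, 0.0), rule_b.get(g, 0.0)) for g in genes)
            return num / den if den else 0.0

        print(weighted_jaccard({"TP53": 1.0, "EGFR": 0.5},
                               {"TP53": 1.0, "KRAS": 0.5}))  # -> 0.5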

  18. Should we expect population thresholds for wildlife disease?

    USGS Publications Warehouse

    Lloyd-Smith, James O.; Cross, P.C.; Briggs, C.J.; Daugherty, M.; Getz, W.M.; Latto, J.; Sanchez, M.; Smith, A.; Swei, A.

    2005-01-01

    Host population thresholds for invasion or persistence of infectious disease are core concepts of disease ecology, and underlie on-going and controversial disease control policies based on culling and vaccination. Empirical evidence for these thresholds in wildlife populations has been sparse, however, though recent studies have narrowed this gap. Here we review the theoretical bases for population thresholds for disease, revealing why they are difficult to measure and sometimes are not even expected, and identifying important facets of wildlife ecology left out of current theories. We discuss strengths and weaknesses of selected empirical studies that have reported disease thresholds for wildlife, identify recurring obstacles, and discuss implications of our imperfect understanding of wildlife thresholds for disease control policy.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Qishi; Zhu, Mengxia; Rao, Nageswara S

    We propose an intelligent decision support system based on sensor and computer networks that incorporates various component techniques for sensor deployment, data routing, distributed computing, and information fusion. The integrated system is deployed in a distributed environment composed of both wireless sensor networks for data collection and wired computer networks for data processing in support of homeland security defense. We present the system framework, formulate the analytical problems, and develop approximate or exact solutions for the subtasks: (i) a sensor deployment strategy based on a two-dimensional genetic algorithm to achieve maximum coverage under cost constraints; (ii) a data routing scheme to achieve maximum signal strength with minimum path loss, high energy efficiency, and effective fault tolerance; (iii) a network mapping method to assign computing modules to network nodes for high-performance distributed data processing; and (iv) a binary decision fusion rule that derives threshold bounds to improve the system hit rate and false alarm rate. These component solutions are implemented and evaluated through either experiments or simulations in various application scenarios. The extensive results demonstrate that these component solutions imbue the integrated system with the desirable and useful quality of intelligence in decision making.
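
    The decision-fusion subtask (iv) has a standard k-out-of-n form that is easy to sketch: each sensor reports a binary decision, and the fusion center declares a detection when at least k of n sensors fire. The per-sensor hit and false-alarm probabilities below are illustrative, not the paper's derived bounds.

        from math import comb

        def fused_rates(n, k, pd, pf):
            """System hit / false-alarm rates for a k-out-of-n fusion rule,
            assuming independent, identical sensors."""
            tail = lambda p: sum(comb(n, i) * p**i * (1 - p)**(n - i)
                                 for i in range(k, n + 1))
            return tail(pd), tail(pf)

        for k in range(1, 6):  # sweep the count threshold for 5 sensors
            print(k, fused_rates(5, k, pd=0.9, pf=0.05))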

  20. Multiadaptive Bionic Wavelet Transform: Application to ECG Denoising and Baseline Wandering Reduction

    NASA Astrophysics Data System (ADS)

    Sayadi, Omid; Shamsollahi, Mohammad B.

    2007-12-01

    We present a new modified wavelet transform, called the multiadaptive bionic wavelet transform (MABWT), that can be applied to ECG signals in order to remove noise from them under a wide range of noise variations. By using the definition of the bionic wavelet transform and adaptively determining both the center frequency of each scale and the corresponding adaptation function, the problem of desired signal decomposition is solved. Applying a newly proposed thresholding rule works successfully in denoising the ECG. Moreover, by using the multiadaptation scheme, lowpass noisy interference effects on the baseline of the ECG are removed as a direct task. The method was extensively tested with real and simulated clinical ECG signals, showing high noise-reduction performance, comparable to that of the wavelet transform (WT). Quantitative evaluation of the proposed algorithm shows that the average SNR improvement of MABWT is 1.82 dB more than the WT-based results in the best case. The procedure has also proved largely advantageous over wavelet-based methods for baseline wandering cancellation, including both DC components and baseline drifts.
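
    For the WT baseline that MABWT is compared against, a standard soft-thresholding denoiser is a few lines with PyWavelets. The universal threshold and the db4/level-4 choices below are common defaults, assumed here rather than taken from the paper.

        import numpy as np
        import pywt

        def wt_denoise(x, wavelet="db4", level=4):
            """Soft-threshold the detail coefficients with the universal threshold."""
            coeffs = pywt.wavedec(x, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise scale, finest level
            thr = sigma * np.sqrt(2.0 * np.log(len(x)))        # universal threshold
            coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)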

  1. Methodology for balancing design and process tradeoffs for deep-subwavelength technologies

    NASA Astrophysics Data System (ADS)

    Graur, Ioana; Wagner, Tina; Ryan, Deborah; Chidambarrao, Dureseti; Kumaraswamy, Anand; Bickford, Jeanne; Styduhar, Mark; Wang, Lee

    2011-04-01

    For process development of deep-subwavelength technologies, it has become accepted practice to use model-based simulation to predict systematic and parametric failures. Increasingly, these techniques are being used by designers to ensure layout manufacturability, as an alternative or complement to restrictive design rules. The benefit of model-based simulation tools in the design environment is that manufacturability problems are addressed in a design-aware way by making appropriate trade-offs, e.g., between overall chip density and manufacturing cost and yield. The paper shows how library elements and the full ASIC design flow benefit from eliminating hot spots and improving design robustness early in the design cycle. It demonstrates a path to yield optimization and first-time-right designs implemented in leading-edge technologies. The approach described herein identifies those areas in the design that benefit most from being fixed early, leading to design updates and, through careful selection of design sensitivities, avoiding later design churn. This paper shows how to achieve this goal by using simulation tools that incorporate various models, from sparse to rigorously physical, together with pattern detection and pattern matching, and by checking and validating failure thresholds.

  2. Route to thermalization in the α-Fermi–Pasta–Ulam system

    PubMed Central

    Onorato, Miguel; Vozella, Lara; Lvov, Yuri V.

    2015-01-01

    We study the original α-Fermi–Pasta–Ulam (FPU) system with N = 16, 32, and 64 masses connected by nonlinear quadratic springs. Our approach is based on resonant wave–wave interaction theory; i.e., we assume that, in the weakly nonlinear regime (the one in which Fermi was originally interested), the large-time dynamics is ruled by exact resonances. After a detailed analysis of the α-FPU equation of motion, we find that the first nontrivial resonances correspond to six-wave interactions. Those are precisely the interactions responsible for the thermalization of the energy in the spectrum. We predict that, for small-amplitude random waves, the timescale of such interactions is extremely large, of the order of 1/ϵ⁸, where ϵ is the small parameter in the system. The wave–wave interaction theory is not based on any threshold: equipartition is predicted for arbitrarily small nonlinearity. Our results are supported by extensive numerical simulations. A key role in our findings is played by the Umklapp (flip-over) resonant interactions, typical of discrete systems. The thermodynamic limit is also briefly discussed. PMID:25805822
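
    For readers who want to reproduce the setting numerically, a minimal α-FPU chain with fixed ends and a symplectic (velocity Verlet) integrator is sketched below; N, α, the time step, and the lowest-mode initial condition are illustrative, and the paper's resonance analysis is analytical rather than code:

    ```python
    import numpy as np

    def alpha_fpu_accel(q, alpha):
        """q''_j = (q_{j+1} - 2q_j + q_{j-1}) + alpha*((q_{j+1}-q_j)^2 - (q_j-q_{j-1})^2)."""
        qp = np.concatenate(([0.0], q, [0.0]))   # fixed-end boundary masses
        dl = qp[1:-1] - qp[:-2]                  # left bond extension
        dr = qp[2:] - qp[1:-1]                   # right bond extension
        return (dr - dl) + alpha * (dr**2 - dl**2)

    def leapfrog(q, p, alpha, dt, steps):
        """Velocity Verlet (symplectic) integration of the chain."""
        a = alpha_fpu_accel(q, alpha)
        for _ in range(steps):
            p += 0.5 * dt * a
            q += dt * p
            a = alpha_fpu_accel(q, alpha)
            p += 0.5 * dt * a
        return q, p

    N, alpha = 32, 0.25
    j = np.arange(1, N + 1)
    q = np.sin(np.pi * j / (N + 1))              # energy in the lowest mode, as in FPU
    p = np.zeros(N)
    q, p = leapfrog(q, p, alpha, dt=0.05, steps=20000)

    # Total energy as a sanity check on the integrator (V(x) = x^2/2 + alpha*x^3/3).
    dx = np.diff(np.concatenate(([0.0], q, [0.0])))
    print("energy:", 0.5 * np.sum(p**2) + np.sum(dx**2 / 2 + alpha * dx**3 / 3))
    ```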

  3. Understanding the rural population migration pattern of Uttarakhand using Geophysical, Geological and Socio-Economical BigData

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, Kausik; Chattopadhyay, Pallavi

    2017-04-01

    Uttarakhand, a Himalayan state of India, has been facing severe rural out-migration from its hill regions to the plains for the past few decades. While urbanization is believed to be one of the major drivers of migration, this research studies how geoscientific parameters can push the population to redraw the demography of the hills. An attempt is made, using density-based clustering and Apriori association rule mining on 45 derived variables over a 30-year time series, to understand the rural population migration pattern. Both zone identification and origin-destination pair extraction are formulated as spatio-temporal point clustering problems, and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) is applied to solve them. Specifically, the population migration is formulated as a 4D point clustering problem, and the relative distance between two origin-destination pairs, together with a preference factor, is used to fine-tune the cluster length. In Apriori, the threshold values for confidence and J-measure are kept the same as for rule extraction. Rules with the maximum confidence level and J-measure are obtained for an antecedent window of 18 months, a consequent window of 4 months, and a time lag of 2 months. The extracted rules demonstrate that almost all the geoscience indices occur as antecedents of migration episodes. The results show that the three districts registering the highest migration rates are also the districts that have witnessed the greatest depletion in water sources. Even though some districts have a higher number of landslide incidents, their out-migration is lower than that of other hill districts; districts experiencing more earthquakes, however, show higher out-migration. The upper hill region, with higher precipitation, experiences more migration than its lower hill counterpart, although this does not hold in comparison with the plains. Temperature fluctuation results in seasonal out-migration but has no long-term impact. Resource and logistical constraints limit the frequency and extent of observations, necessitating a systematic computational framework that objectively represents environmental variability at the desired spatial scale; such a comprehensive big-data model can be instrumental in arresting the rural migration that has been posing a major threat to the livelihood of this Himalayan state.
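
    As a concrete illustration of the clustering step, the sketch below runs DBSCAN on synthetic 4-D origin-destination points (two tight migration corridors plus background noise); the coordinates, eps, and min_samples are invented for the example, not taken from the study:

    ```python
    import numpy as np
    from sklearn.cluster import DBSCAN
    from sklearn.preprocessing import StandardScaler

    # Hypothetical 4-D migration points: origin (lat, lon) and destination (lat, lon).
    rng = np.random.default_rng(0)
    corridor_a = rng.normal([30.3, 79.5, 29.9, 78.0], 0.05, size=(40, 4))
    corridor_b = rng.normal([29.6, 80.2, 28.9, 79.8], 0.05, size=(40, 4))
    noise = rng.uniform([28.5, 77.5, 28.5, 77.5], [31.0, 81.0, 31.0, 81.0], size=(10, 4))
    X = np.vstack([corridor_a, corridor_b, noise])

    # Standardize so a single eps is meaningful across all four dimensions.
    labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(StandardScaler().fit_transform(X))
    print("clusters found:", len(set(labels) - {-1}), "| noise points:", (labels == -1).sum())
    ```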

  4. Collective Movement in the Tibetan Macaques (Macaca thibetana): Early Joiners Write the Rule of the Game.

    PubMed

    Wang, Xi; Sun, Lixing; Li, Jinhua; Xia, Dongpo; Sun, Binghua; Zhang, Dao

    2015-01-01

    Collective behavior has recently attracted a great deal of interest in both the natural and social sciences. While the role of leadership has been closely scrutinized, the rules used by joiners in collective decision making have received far less attention. Two main hypotheses have been proposed concerning these rules: mimetism and quorum. Mimetism predicts that individuals are increasingly likely to join a collective behavior as the number of participants increases. It can be further divided into selective mimetism, where relationships among the participants affect the process, and anonymous mimetism, where no such effect exists. Quorum predicts that a collective behavior occurs once the number of participants reaches a threshold. To probe which rule is used in collective decision making, we studied the joining process in a group of free-ranging Tibetan macaques (Macaca thibetana) in Huangshan, China, using a combination of all-occurrence and focal animal sampling methods. Our results show that the earlier individuals joined movements, the more central the role they occupied in the joining network. We also found that when fewer than three adults participated in the first five minutes of the joining process, no entire-group movement occurred subsequently. When the number of these early joiners ranged from three to six, selective mimetism was used, meaning that the higher rank or closer social affiliation of early joiners could be among the factors group members weigh in deciding whether to participate in movements. When the number of early joiners reached or exceeded seven, the simple majority of the group studied, entire-group movement always occurred, meaning that the quorum rule was used. Taken together, Macaca thibetana used a combination of selective mimetism and quorum, and early joiners played a key role in deciding which rule should be used.
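
    The two joining rules can be made concrete with a toy model. The sketch below is a loose illustration, not the authors' model: below the quorum, joining probability grows with the number of participants (mimetism, optionally weighted by affiliation); at or above the quorum (seven here, matching the studied group's simple majority), joining is certain:

    ```python
    import random

    def joins(n_participants, group_size=18, quorum=7, affiliation=None):
        """Illustrative joining rule combining the two hypotheses."""
        if n_participants >= quorum:
            return True                              # quorum rule: certain joining
        p = n_participants / group_size              # anonymous mimetism
        if affiliation is not None:
            p *= affiliation                         # selective mimetism weighting
        return random.random() < p

    random.seed(0)
    print(sum(joins(5, affiliation=1.5) for _ in range(1000)) / 1000)  # ~0.42
    print(joins(8))                                                    # True: quorum met
    ```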

  5. Threshold-driven optimization for reference-based auto-planning

    NASA Astrophysics Data System (ADS)

    Long, Troy; Chen, Mingli; Jiang, Steve; Lu, Weiguo

    2018-02-01

    We study a threshold-driven optimization methodology for automatically generating a treatment plan motivated by a reference DVH for IMRT treatment planning, and present a framework for threshold-driven optimization for reference-based auto-planning (TORA). Commonly used voxel-based quadratic penalties have two components for penalizing under- and over-dosing of voxels: a reference dose threshold and an associated penalty weight. Conventional manual and auto-planning with such a function involves iteratively updating the penalty weights while keeping the thresholds constant, an unintuitive and often inconsistent method for planning toward some reference DVH. However, driving a dose distribution by threshold values instead of penalty weights can achieve similar plans with less computational effort. The proposed methodology spatially assigns reference DVH information to threshold values, and iteratively improves the quality of that assignment. The methodology effectively handles both sub-optimal and infeasible DVHs. TORA was applied to a prostate case and a liver case as a proof of concept. Reference DVHs were generated using a conventional voxel-based objective, then altered to be either infeasible or easy to achieve. TORA was able to closely recreate reference DVHs in 5-15 iterations of solving a simple convex sub-problem. TORA has the potential to be effective for auto-planning based on reference DVHs. As dose prediction and knowledge-based planning become more prevalent in the clinical setting, incorporating such data into the treatment planning model in a clear, efficient way will be crucial for automated planning. Threshold-focused objective tuning should be explored as an alternative to conventional weight-updating methods for DVH-guided treatment planning.
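
    The voxel-based quadratic penalty the abstract starts from is easy to state in code. A minimal sketch with illustrative thresholds and weights; TORA's contribution is how the thresholds are assigned and iteratively updated, which is not reproduced here:

    ```python
    import numpy as np

    def dose_objective(dose, t_under, t_over, w_under=1.0, w_over=1.0):
        """Voxel-wise quadratic penalty with reference-dose thresholds:
        penalize dose below t_under and above t_over. In threshold-driven
        planning, t_under/t_over (not the weights) are the tuned quantities."""
        under = np.clip(t_under - dose, 0, None)
        over = np.clip(dose - t_over, 0, None)
        return np.sum(w_under * under**2 + w_over * over**2)

    dose = np.array([58.0, 60.5, 61.2, 63.0])   # illustrative voxel doses (Gy)
    print(dose_objective(dose, t_under=60.0, t_over=62.0))   # -> 5.0
    ```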

  6. Implementing a Commercial Rule Base as a Medication Order Safety Net

    PubMed Central

    Reichley, Richard M.; Seaton, Terry L.; Resetar, Ervina; Micek, Scott T.; Scott, Karen L.; Fraser, Victoria J.; Dunagan, W. Claiborne; Bailey, Thomas C.

    2005-01-01

    A commercial rule base (Cerner Multum) was used to identify medication orders exceeding recommended dosage limits at five hospitals within BJC HealthCare, an integrated health care system. During initial testing, clinical pharmacists determined that there was an excessive number of nuisance and clinically insignificant alerts, with an overall alert rate of 9.2%. A method for customizing the commercial rule base was implemented to increase rule specificity for problematic rules. The system was subsequently deployed at two facilities and achieved alert rates of less than 1%. Pharmacists screened these alerts and contacted ordering physicians in 21% of cases. Physicians made therapeutic changes in response to 38% of alerts presented to them. By applying simple techniques to customize rules, commercial rule bases can be used to rapidly deploy a safety net to screen drug orders for excessive dosages, while preserving the rule architecture for later implementations of more finely tuned clinical decision support. PMID:15802481

  7. Rule groupings in expert systems using nearest neighbour decision rules, and convex hulls

    NASA Technical Reports Server (NTRS)

    Anastasiadis, Stergios

    1991-01-01

    Expert system shells are lacking in many areas of software engineering. Large rule-based systems are not semantically comprehensible, are difficult to debug, and are impossible to modify or validate. Partitioning a set of rules found in CLIPS (C Language Integrated Production System) into groups of rules that reflect the underlying semantic subdomains of the problem adequately addresses these concerns. Techniques are introduced to structure a CLIPS rule base into groups of rules that inherently share common semantic information. The concepts involved are imported from the fields of AI, pattern recognition, and statistical inference. The techniques focus on feature selection, classification, and a criterion, based on Bayesian decision theory, of how 'good' the classification is. A variety of distance metrics for measuring the 'closeness' of CLIPS rules are discussed, and various nearest neighbor classification algorithms based on these metrics are described.

  8. Unraveling the drivers of community dissimilarity and species extinction in fragmented landscapes.

    PubMed

    Banks-Leite, Cristina; Ewers, Robert M; Metzger, Jean Paul

    2012-12-01

    Communities in fragmented landscapes are often assumed to be structured by species extinction due to habitat loss, which has led to extensive use of the species-area relationship (SAR) in fragmentation studies. However, use of the SAR presupposes that habitat loss drives species to extinction and does not allow for extinction to be offset by colonization of disturbed-habitat specialists. Moreover, use of the SAR assumes that species richness is a good proxy of community changes in fragmented landscapes. Here, we assessed how communities dwelling in fragmented landscapes are influenced by habitat loss at multiple scales; we then estimated the ability of models governed by the SAR and by species turnover to predict changes in community composition, and asked whether species richness is indeed an informative community metric. To address these issues, we used a data set consisting of 140 bird species sampled in 65 patches, from six landscapes with different proportions of forest cover in the Atlantic Forest of Brazil. We compared empirical patterns against simulations of over 8 million communities structured by different magnitudes of the power-law SAR and with species-specific rules to assign species to sites. Empirical results showed that, while bird community composition was strongly influenced by habitat loss at the patch and landscape scale, species richness remained largely unaffected. Modeling results revealed that the compositional changes observed in the Atlantic Forest bird metacommunity were matched only by models with unrealistic magnitudes of the SAR or by models governed by species turnover, akin to what would be observed along natural gradients. We show that, in the presence of such compositional turnover, species richness is poorly correlated with species extinction, and z values of the SAR strongly underestimate the effects of habitat loss. We suggest that the observed compositional changes are driven by each species reaching its individual extinction threshold: either a threshold of forest cover for species that disappear with habitat loss, or of matrix cover for species that benefit from habitat loss.
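
    The power-law SAR at the center of the argument is S = cA^z. A short sketch of why z alone controls the predicted species loss (the constant c cancels in the before/after ratio), with illustrative parameter values:

    ```python
    def sar_richness(area, c=10.0, z=0.25):
        """Power-law species-area relationship S = c * A**z."""
        return c * area ** z

    # Predicted fraction of species retained after habitat loss depends only on z:
    # S_after / S_before = (A_after / A_before)**z.
    for remaining in (0.5, 0.2, 0.1):
        print(f"{remaining:.0%} habitat left -> "
              f"{remaining ** 0.25:.0%} of species retained (z = 0.25)")
    ```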

  9. Automated revision of CLIPS rule-bases

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick M.; Pazzani, Michael J.

    1994-01-01

    This paper describes CLIPS-R, a theory revision system for the revision of CLIPS rule-bases. CLIPS-R may be used for a variety of knowledge-base revision tasks, such as refining a prototype system, adapting an existing system to slightly different operating conditions, or improving an operational system that makes occasional errors. We present a description of how CLIPS-R revises rule-bases, and an evaluation of the system on three rule-bases.

  10. A Common Fluence Threshold for First Positive and Second Positive Phototropism in Arabidopsis thaliana

    PubMed Central

    Janoudi, Abdul; Poff, Kenneth L.

    1990-01-01

    The relationship between the amount of light and the amount of response for any photobiological process can be based on the number of incident quanta per unit time (fluence rate-response) or on the number of incident quanta during a given period of irradiation (fluence-response). Fluence-response and fluence rate-response relationships have been measured for second positive phototropism by seedlings of Arabidopsis thaliana. The fluence-response relationships exhibit a single limiting threshold at about 0.01 micromole per square meter when measured at fluence rates from 2.4 × 10−5 to 6.5 × 10−3 micromoles per square meter per second. The threshold values in the fluence rate-response curves decrease with increasing time of irradiation, but show a common fluence threshold at about 0.01 micromole per square meter. These thresholds are the same as the threshold of about 0.01 micromole per square meter measured for first positive phototropism. Based on these data, it is suggested that second positive curvature has a threshold in time of about 10 minutes. Moreover, if the times of irradiation exceed the time threshold, there is a single limiting fluence threshold at about 0.01 micromole per square meter. Thus, the limiting fluence threshold for second positive phototropism is the same as the fluence threshold for first positive phototropism. Based on these data, we suggest that this common fluence threshold for first positive and second positive phototropism is set by a single photoreceptor pigment system. PMID:11537470

  11. Concurrence of rule- and similarity-based mechanisms in artificial grammar learning.

    PubMed

    Opitz, Bertram; Hofmann, Juliane

    2015-03-01

    A current theoretical debate regards whether rule-based or similarity-based learning prevails during artificial grammar learning (AGL). Although the majority of findings are consistent with a similarity-based account of AGL it has been argued that these results were obtained only after limited exposure to study exemplars, and performance on subsequent grammaticality judgment tests has often been barely above chance level. In three experiments the conditions were investigated under which rule- and similarity-based learning could be applied. Participants were exposed to exemplars of an artificial grammar under different (implicit and explicit) learning instructions. The analysis of receiver operating characteristics (ROC) during a final grammaticality judgment test revealed that explicit but not implicit learning led to rule knowledge. It also demonstrated that this knowledge base is built up gradually while similarity knowledge governed the initial state of learning. Together these results indicate that rule- and similarity-based mechanisms concur during AGL. Moreover, it could be speculated that two different rule processes might operate in parallel; bottom-up learning via gradual rule extraction and top-down learning via rule testing. Crucially, the latter is facilitated by performance feedback that encourages explicit hypothesis testing. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. A structured approach to Exposure Based Waiving of human health endpoints under REACH developed in the OSIRIS project.

    PubMed

    Marquart, Hans; Meijster, Tim; Van de Bovenkamp, Marja; Ter Burg, Wouter; Spaan, Suzanne; Van Engelen, Jacqueline

    2012-03-01

    Exposure Based Waiving (EBW) is one of the options in REACH when there is insufficient hazard data on a specific endpoint. Rules for adaptation of test requirements are specified and a general option for EBW is given via Appendix XI of REACH, allowing waiving of repeated dose toxicity studies, reproductive toxicity studies and carcinogenicity studies under a number of conditions if exposure is very low. A decision tree is described that was developed in the European project OSIRIS (Optimised Strategies for Risk Assessment of Industrial Chemicals through Integration of Non-Test and Test Information) to help decide in what cases EBW can be justified. The decision tree uses specific criteria as well as more general questions. For the latter, guidance on interpretation and resulting conclusions is provided. Criteria and guidance are partly based on an expert elicitation process. Among the specific criteria a number of proposed Thresholds of Toxicological Concern are used. The decision tree, expanded with specific parts on absorption, distribution, metabolism and excretion that are not described in this paper, is implemented in the OSIRIS webtool on integrated testing strategies. Copyright © 2011 Elsevier Inc. All rights reserved.

  13. Policy tree optimization for adaptive management of water resources systems

    NASA Astrophysics Data System (ADS)

    Herman, Jonathan; Giuliani, Matteo

    2017-04-01

    Water resources systems must cope with irreducible uncertainty in supply and demand, requiring policy alternatives capable of adapting to a range of possible future scenarios. Recent studies have developed adaptive policies based on "signposts" or "tipping points" that signal the need to update the policy. However, a general method is still needed to optimize the choice of signposts and their threshold values. This work contributes a general framework and computational algorithm to design adaptive policies as a tree structure (i.e., a hierarchical set of logical rules) using a simulation-optimization approach based on genetic programming. Given a set of feature variables (e.g., reservoir level, inflow observations, inflow forecasts), the resulting policy defines both the optimal reservoir operations and the conditions under which such operations should be triggered. We demonstrate the approach using Folsom Reservoir (California) as a case study, in which operating policies must balance the risks of both floods and droughts. Numerical results show that the tree-based policies outperform those designed via dynamic programming. In addition, they display good adaptive capacity to a changing climate, successfully adapting the reservoir operations across a large set of uncertain climate scenarios.
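
    A policy tree of the kind described is just a hierarchical set of logical rules over feature variables. The sketch below is a hand-written toy, not an optimized Folsom policy; the feature names and thresholds are invented:

    ```python
    def release_policy(level, inflow_forecast):
        """Illustrative policy tree for reservoir operation: each branch is a
        signpost condition, each leaf an operating action."""
        if inflow_forecast > 800:            # flood-risk branch
            return "pre-release" if level > 0.7 else "maintain"
        if level < 0.3:                      # drought branch
            return "restrict deliveries"
        return "normal operations"

    print(release_policy(level=0.8, inflow_forecast=900))   # -> pre-release
    ```

    Genetic programming, as used in the paper, searches over tree shapes, signpost variables, and threshold values rather than hand-writing them.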

  14. Modelling Waterfall Retreat in Heterogenous Bedrock

    NASA Astrophysics Data System (ADS)

    Attal, M.; Hodge, R. A.; Williams, R.; Baynes, E.

    2016-12-01

    Bedrock rivers are the mediators of environmental change through mountainous landscapes. In response to an increase in uplift rate, for example, a "knickpoint" (often materialised as a waterfall) will propagate upstream, separating a downstream domain where the river and its adjacent hillslopes have steepened in response to the change from a "relict" domain upstream which is adjusted to the conditions before the change (Crosby and Whipple 2006). Many studies assume that knickpoint propagation rate scales with drainage area, based on the stream power theory. However, recent studies in a range of locations have found no obvious relationship between knickpoint retreat rate and drainage area, potentially because the stream power law neglects (i) the influence of sediment on the processes associated with waterfall migration and (ii) thresholds for bedrock detachment (Cook et al. 2013; Mackey et al. 2014; DiBiase et al. 2015; Baynes et al. 2015; Brocard et al. 2016). In this study, we develop a 1D model of waterfall retreat in horizontally bedded bedrock with varying joint spacing. In the model, knickpoint migration is based on two rules: a waterfall starts migrating once the threshold flow depth (a function of knickpoint height and joint spacing) has been exceeded (Lamb and Dietrich 2009), and the migration rate is then a function of the water-depth-to-waterfall-height ratio, based on experimental results by Baynes (2015). Using a hydrograph based on a Poisson rectangular pulse rainfall simulator (Tucker and Bras 2001), we demonstrate the importance of structure in controlling not only the speed at which waterfalls migrate but also their number and the length over which they are distributed (Fig. 1). The model is applied to the Jökulsá á Fjöllum, NE Iceland, where rapid migration of waterfalls as a result of discrete events has been identified (Baynes et al. 2015), using new constraints on joint spacing derived from a high-resolution lidar survey of the gorge walls.
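
    The two migration rules translate into a very small simulation loop. The sketch below is illustrative only: the threshold function and rate constant are invented placeholders for the Lamb and Dietrich (2009) and Baynes (2015) relations cited in the abstract:

    ```python
    import numpy as np

    def retreat_step(x, height, depth, depth_threshold, k=0.5, dt=1.0):
        """One event of a 1-D waterfall-retreat rule set: (1) migrate only if
        flow depth exceeds a height-dependent threshold; (2) rate scales with
        the depth/height ratio. k and dt are illustrative."""
        if depth > depth_threshold(height):
            x -= k * (depth / height) * dt   # retreat upstream (negative x)
        return x

    depth_threshold = lambda h: 0.3 * h      # hypothetical threshold rule

    rng = np.random.default_rng(1)
    x, height = 0.0, 2.0
    for _ in range(100):
        depth = rng.exponential(0.5)         # storm-driven flows, Poisson-pulse spirit
        x = retreat_step(x, height, depth, depth_threshold)
    print(f"waterfall position after 100 events: {x:.2f} m")
    ```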

  15. Ab Initio Calculation of Photoionization and Inelastic Photon Scattering Spectra of He below the N=2 Threshold in a dc Electric Field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mihelic, Andrej; Zitnik, Matjaz

    2007-06-15

    We study the Stark effect on doubly excited states of the helium atom below N=2. We present the ab initio photoionization and total inelastic photon scattering cross sections calculated with the method of complex scaling for field strengths F ≤ 100 kV/cm. The calculations are compared to the measurements of the ion [Phys. Rev. Lett. 90, 133002 (2003)] and vacuum ultraviolet fluorescence yields [Phys. Rev. Lett. 96, 093001 (2006)]. For the case of photoionization and for incident photons with polarization vector P parallel to the electric field F, we confirm the propensity rule proposed by Tong and Lin [Phys. Rev. Lett. 92, 223003 (2004)]. Furthermore, the rule is also shown to apply for F perpendicular to P and for the case of inelastic scattering in both experimental geometries.

  16. Detection of pseudosinusoidal epileptic seizure segments in the neonatal EEG by cascading a rule-based algorithm with a neural network.

    PubMed

    Karayiannis, Nicolaos B; Mukherjee, Amit; Glover, John R; Ktonas, Periklis Y; Frost, James D; Hrachovy, Richard A; Mizrahi, Eli M

    2006-04-01

    This paper presents an approach to detecting epileptic seizure segments in the neonatal electroencephalogram (EEG) by characterizing the spectral features of the EEG waveform using a rule-based algorithm cascaded with a neural network. The rule-based algorithm flags short segments of pseudosinusoidal EEG patterns as candidate epileptic activity based on features in the power spectrum. The output of the rule-based algorithm is used to train and compare the performance of conventional feedforward neural networks and quantum neural networks. The results indicate that the trained neural networks, cascaded with the rule-based algorithm, improved on the performance of the rule-based algorithm acting by itself. The evaluation of the proposed cascaded scheme for the detection of pseudosinusoidal seizure segments reveals its potential as a building block of the automated seizure detection system under development.

  17. Smeared spectrum jamming suppression based on generalized S transform and threshold segmentation

    NASA Astrophysics Data System (ADS)

    Li, Xin; Wang, Chunyang; Tan, Ming; Fu, Xiaolong

    2018-04-01

    Smeared spectrum (SMSP) jamming is an effective technique for countering linear frequency modulation (LFM) radar. Exploiting the difference between the time-frequency distributions of the jamming and the echo, a jamming suppression method based on the generalized S transform (GST) and threshold segmentation is proposed. The sub-pulse period is first estimated from the autocorrelation function. Second, the time-frequency image and the related gray-scale image are obtained via the GST. Finally, the Tsallis cross entropy is used to compute the optimized segmentation threshold, and the jamming suppression filter is constructed from that threshold. Simulation results show that the proposed method performs well in suppressing the false targets produced by SMSP jamming.
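
    Entropy-based threshold selection of this general kind can be sketched compactly. The code below maximizes the Tsallis entropy of the two classes over candidate gray levels (a common Tsallis thresholding formulation, used here as a stand-in for the paper's Tsallis cross-entropy criterion); q and the test image are illustrative:

    ```python
    import numpy as np

    def tsallis_threshold(image, q=0.8):
        """Pick a gray-level threshold by maximizing the sum of the two
        class Tsallis entropies plus their nonextensive cross term."""
        hist, _ = np.histogram(image, bins=256, range=(0, 256))
        p = hist / hist.sum()
        best_t, best_score = 0, -np.inf
        for t in range(1, 255):
            pa, pb = p[:t].sum(), p[t:].sum()
            if pa == 0 or pb == 0:
                continue
            sa = (1 - np.sum((p[:t] / pa) ** q)) / (q - 1)
            sb = (1 - np.sum((p[t:] / pb) ** q)) / (q - 1)
            score = sa + sb + (1 - q) * sa * sb
            if score > best_score:
                best_t, best_score = t, score
        return best_t

    rng = np.random.default_rng(6)
    img = np.concatenate([rng.normal(60, 10, 5000),
                          rng.normal(180, 15, 5000)]).clip(0, 255)
    t = tsallis_threshold(img)
    print("threshold:", t)       # mask = img > t separates the two modes
    ```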

  18. 77 FR 73498 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-10

    ... Effectiveness of a Proposed Rule Change To Amend Its Rule Related to Multi-Class Broad- Based Index Option... Rule Change The Exchange proposes to amend its rule related to multi-class broad-based index option... is to (i) clarify that the term ``Multi-Class Broad-Based Index Option Spread Order (Multi-Class...

  19. Troponin-only Manchester Acute Coronary Syndromes (T-MACS) decision aid: single biomarker re-derivation and external validation in three cohorts

    PubMed Central

    Body, Richard; Sperrin, Matthew; Lewis, Philip S; Burrows, Gillian; Carley, Simon; McDowell, Garry; Buchan, Iain; Greaves, Kim; Mackway-Jones, Kevin

    2017-01-01

    Background The original Manchester Acute Coronary Syndromes model (MACS) ‘rules in’ and ‘rules out’ acute coronary syndromes (ACS) using high sensitivity cardiac troponin T (hs-cTnT) and heart-type fatty acid binding protein (H-FABP) measured at admission. The latter is not always available. We aimed to refine and validate MACS as Troponin-only Manchester Acute Coronary Syndromes (T-MACS), cutting down the biomarkers to just hs-cTnT. Methods We present secondary analyses from four prospective diagnostic cohort studies including patients presenting to the ED with suspected ACS. Data were collected and hs-cTnT measured on arrival. The primary outcome was ACS, defined as prevalent acute myocardial infarction (AMI) or incident death, AMI or coronary revascularisation within 30 days. T-MACS was built in one cohort (derivation set) and validated in three external cohorts (validation set). Results At the ‘rule out’ threshold, in the derivation set (n=703), T-MACS had 99.3% (95% CI 97.3% to 99.9%) negative predictive value (NPV) and 98.7% (95.3%–99.8%) sensitivity for ACS, ‘ruling out’ 37.7% patients (specificity 47.6%, positive predictive value (PPV) 34.0%). In the validation set (n=1459), T-MACS had 99.3% (98.3%–99.8%) NPV and 98.1% (95.2%–99.5%) sensitivity, ‘ruling out’ 40.4% (n=590) patients (specificity 47.0%, PPV 23.9%). T-MACS would ‘rule in’ 10.1% and 4.7% patients in the respective sets, of which 100.0% and 91.3% had ACS. C-statistics for the original and refined rules were similar (T-MACS 0.91 vs MACS 0.90 on validation). Conclusions T-MACS could ‘rule out’ ACS in 40% of patients, while ‘ruling in’ 5% at highest risk using a single hs-cTnT measurement on arrival. As a clinical decision aid, T-MACS could therefore help to conserve healthcare resources. PMID:27565197

  20. Parallel inferencing method and apparatus for rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Schwuttke, Ursula M. (Inventor); Moldovan, Dan (Inventor); Kuo, Steve (Inventor)

    1993-01-01

    The invention analyzes a set of conditions with an expert knowledge base of rules using plural separate nodes which fire respective rules of said knowledge base, each of said rules, upon being fired, altering certain of said conditions predicated upon the existence of other said conditions. The invention operates by constructing a P representation of all pairs of said rules which are input dependent or output dependent; constructing a C representation of all pairs of said rules which are communication dependent or input dependent; determining which of the rules are ready to fire by matching the predicate conditions of each rule with the conditions of said set; enabling said node means to simultaneously fire those of the rules ready to fire which are defined by said P representation as being free of input and output dependencies; and communicating from each node enabled by said enabling step the alteration of conditions by the corresponding rule to other nodes whose rules are defined by said C representation as being input or communication dependent upon the rule of said enabled node.
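
    The P and C representations reduce to pairwise dependency checks over each rule's condition and action sets. A minimal sketch with three invented rules:

    ```python
    # Rules as (condition_set, add_set): predicate conditions that, when
    # matched, assert new conditions. Rule contents are illustrative.
    rules = {
        "r1": ({"a"}, {"b"}),
        "r2": ({"a"}, {"c"}),
        "r3": ({"b"}, {"d"}),
    }

    def deps(r, s):
        (cr, ar), (cs, as_) = rules[r], rules[s]
        input_dep = bool(cr & cs)              # share a predicate condition
        output_dep = bool(ar & as_)            # assert a common condition
        comm_dep = bool(ar & cs or as_ & cr)   # one's output feeds the other's input
        return input_dep, output_dep, comm_dep

    names = sorted(rules)
    # P: input- or output-dependent pairs (must not fire in parallel).
    P = {(r, s) for r in names for s in names
         if r < s and (deps(r, s)[0] or deps(r, s)[1])}
    # C: communication- or input-dependent pairs (must exchange updates).
    C = {(r, s) for r in names for s in names
         if r < s and (deps(r, s)[2] or deps(r, s)[0])}
    print("P:", P)   # {('r1','r2')}: both match condition 'a'
    print("C:", C)   # adds ('r1','r3'): r1 asserts 'b', which r3 consumes
    ```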

  1. Wavelet-based adaptive thresholding method for image segmentation

    NASA Astrophysics Data System (ADS)

    Chen, Zikuan; Tao, Yang; Chen, Xin; Griffis, Carl

    2001-05-01

    A nonuniform background distribution may cause a global thresholding method to fail to segment objects. One solution is using a local thresholding method that adapts to local surroundings. In this paper, we propose a novel local thresholding method for image segmentation, using multiscale threshold functions obtained by wavelet synthesis with weighted detail coefficients. In particular, the coarse-to-fine synthesis with attenuated detail coefficients produces a threshold function corresponding to a high-frequency-reduced signal. This wavelet-based local thresholding method adapts to both local size and local surroundings, and its implementation can take advantage of the fast wavelet algorithm. We applied this technique to physical contaminant detection for poultry meat inspection using x-ray imaging. Experiments showed that inclusion objects in deboned poultry could be extracted at multiple resolutions despite their irregular sizes and uneven backgrounds.
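
    A rough sketch of the idea using pywt: synthesize a high-frequency-reduced version of the image by attenuating the detail coefficients, then use it (plus an offset) as a spatially varying threshold. Wavelet, level, attenuation, and the synthetic image are illustrative, not the paper's parameters:

    ```python
    import numpy as np
    import pywt

    def wavelet_local_threshold(image, wavelet="haar", level=3, offset=0.0, atten=0.1):
        """Build a smooth threshold surface by attenuating detail coefficients
        and reconstructing; segment pixels that rise above it."""
        coeffs = pywt.wavedec2(image, wavelet, level=level)
        coeffs = [coeffs[0]] + [tuple(atten * d for d in lvl) for lvl in coeffs[1:]]
        background = pywt.waverec2(coeffs, wavelet)[: image.shape[0], : image.shape[1]]
        return image > background + offset

    rng = np.random.default_rng(2)
    img = rng.normal(100, 5, (64, 64))
    img[:32] += 40                  # nonuniform background: top half brighter
    img[10:14, 10:14] += 30         # small object on the bright half
    img[50:54, 50:54] += 30         # small object on the dark half
    mask = wavelet_local_threshold(img, offset=15)
    print("object pixels found:", mask.sum())
    ```

    A single global threshold would either miss the object on the dark half or flood the bright half; the wavelet-synthesized threshold surface follows the background instead.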

  2. Using Rule-Based Computer Programming to Unify Communication Rules Research.

    ERIC Educational Resources Information Center

    Sanford, David L.; Roach, J. W.

    This paper proposes the use of a rule-based computer programming language as a standard for the expression of rules, arguing that the adoption of a standard would enable researchers to communicate about rules in a consistent and significant way. Focusing on the formal equivalence of artificial intelligence (AI) programming to different types of…

  3. Efficiency in Rule- vs. Plan-Based Movements Is Modulated by Action-Mode.

    PubMed

    Scheib, Jean P P; Stoll, Sarah; Thürmer, J Lukas; Randerath, Jennifer

    2018-01-01

    The rule/plan motor cognition (RPMC) paradigm elicits visually indistinguishable motor outputs, resulting from either plan- or rule-based action-selection, using a combination of essentially interchangeable stimuli. Previous implementations of the RPMC paradigm have used pantomimed movements to compare plan- vs. rule-based action-selection. In the present work we attempt to determine the generalizability of previous RPMC findings to real object interaction by use of a grasp-to-rotate task. In the plan task, participants had to use prospective planning to achieve a comfortable post-handle rotation hand posture. The rule task used implementation intentions (if-then rules) leading to the same comfortable end-state. In Experiment A, we compare RPMC performance of 16 healthy participants in pantomime and real object conditions of the experiment, within-subjects. Higher processing efficiency of rule- vs. plan-based action-selection was supported by diffusion model analysis. Results show a significant response-time increase in the pantomime condition compared to the real object condition, and a greater response-time advantage of rule-based vs. plan-based actions in the pantomime compared to the real object condition. In Experiment B, 24 healthy participants performed the real object RPMC task in a task-switching vs. a blocked condition. Results indicate that plan-based action-selection leads to longer response-times and less efficient information processing than rule-based action-selection, in line with previous RPMC findings derived from the pantomime action-mode. Particularly in the task-switching mode, responses were faster in the rule compared to the plan task, suggesting a modulating influence of cognitive load. Overall, results suggest an advantage of rule-based over plan-based action-selection, although differential mechanisms appear to be involved depending on the action-mode. We propose that cognitive load is a factor that modulates the advantageous effect of implementation intentions in motor cognition on different levels, as illustrated by the varying speed advantages and the variation in diffusion parameters per action-mode or condition, respectively.

  4. Reducing the Conflict Factors Strategies in Question Answering System

    NASA Astrophysics Data System (ADS)

    Suwarningsih, W.; Purwarianti, A.; Supriana, I.

    2017-03-01

    A rule-based system is prone to conflict because new knowledge continually emerges and must eventually be entered into the knowledge base used by the system. A conflict between rules in the knowledge base can lead to reasoning errors or circular reasoning. Newly added rules may therefore conflict with existing rules, and only rules that are genuinely compatible can be added to the knowledge base. Given these conditions, this paper proposes a conflict resolution strategy for a medical debriefing system, analyzing runtime scenarios to improve the efficiency and reliability of the system.

  5. Models of Quantitative Estimations: Rule-Based and Exemplar-Based Processes Compared

    ERIC Educational Resources Information Center

    von Helversen, Bettina; Rieskamp, Jorg

    2009-01-01

    The cognitive processes underlying quantitative estimations vary. Past research has identified task-contingent changes between rule-based and exemplar-based processes (P. Juslin, L. Karlsson, & H. Olsson, 2008). B. von Helversen and J. Rieskamp (2008), however, proposed a simple rule-based model--the mapping model--that outperformed the…

  6. Medicare and Medicaid programs; modifications to the Medicare and Medicaid Electronic Health Record (EHR) Incentive Program for 2014 and other changes to EHR Incentive Program; and health information technology: revision to the certified EHR technology definition and EHR certification changes related to standards. Final rule.

    PubMed

    2014-09-04

    This final rule changes the meaningful use stage timeline and the definition of certified electronic health record technology (CEHRT) to allow options in the use of CEHRT for the EHR reporting period in 2014. It also sets the requirements for reporting on meaningful use objectives and measures as well as clinical quality measure (CQM) reporting in 2014 for providers who use one of the CEHRT options finalized in this rule for their EHR reporting period in 2014. In addition, it finalizes revisions to the Medicare and Medicaid EHR Incentive Programs to adopt an alternate measure for the Stage 2 meaningful use objective for hospitals to provide structured electronic laboratory results to ambulatory providers; to correct the regulation text for the measures associated with the objective for hospitals to provide patients the ability to view online, download, and transmit information about a hospital admission; and to set a case number threshold exemption for CQM reporting applicable for eligible hospitals and critical access hospitals (CAHs) beginning with FY 2013. Finally, this rule finalizes the provisionally adopted replacement of the Data Element Catalog (DEC) and the Quality Reporting Document Architecture (QRDA) Category III standards with updated versions of these standards.

  7. Are false-positive rates leading to an overestimation of noise-induced hearing loss?

    PubMed

    Schlauch, Robert S; Carney, Edward

    2011-04-01

    To estimate false-positive rates for rules proposed to identify early noise-induced hearing loss (NIHL) from the presence of notches in audiograms. Audiograms collected from school-age children in a national survey of health and nutrition (the Third National Health and Nutrition Examination Survey [NHANES III]; National Center for Health Statistics, 1994) were examined using published rules for identifying noise notches at various pass-fail criteria. These results were compared with computer-simulated "flat" audiograms; the proportion of flat audiograms identified as having a noise notch is an estimate of the false-positive rate for a particular rule. Audiograms from the NHANES III for children 6-11 years of age yielded notched audiograms at rates consistent with the simulations, suggesting that this group does not have significant NIHL. Further, pass-fail criteria for rules suggested by expert clinicians, applied to NHANES III audiometric data, yielded unacceptably high false-positive rates. Computer simulations provide an effective method for estimating false-positive rates for protocols used to identify notched audiograms. Audiometric precision could be improved by (a) eliminating systematic calibration errors, including a possible problem with reference levels for TDH-style earphones; (b) repeating and averaging threshold measurements; and (c) using earphones that yield lower variability at 6.0 and 8.0 kHz, two frequencies critical for identifying noise notches.
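
    The simulation logic is straightforward to reproduce in outline. The sketch below generates "flat" audiograms with 5-dB-step test-retest noise and applies one hypothetical notch rule; the rule form and all parameters are invented for illustration, not the published criteria evaluated in the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    # Columns of the simulated arrays: 0.5, 1, 2, 3, 4, 6, 8 kHz.

    def has_notch(thr, depth=15, recovery=10):
        """Hypothetical notch rule: 3, 4, or 6 kHz at least `depth` dB worse
        than 1 kHz, with at least `recovery` dB improvement at 8 kHz."""
        for i in (3, 4, 5):                     # indices of 3, 4, 6 kHz
            if thr[i] - thr[1] >= depth and thr[i] - thr[6] >= recovery:
                return True
        return False

    # Flat audiograms: true threshold 0 dB HL everywhere, 5 dB test-retest
    # noise rounded to the 5-dB steps used clinically.
    sims = 5 * np.round(rng.normal(0, 5, size=(20_000, 7)) / 5)
    fp_rate = np.mean([has_notch(a) for a in sims])
    print(f"false-positive rate on flat audiograms: {fp_rate:.1%}")
    ```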

  8. SIRE: A Simple Interactive Rule Editor for NICBES

    NASA Technical Reports Server (NTRS)

    Bykat, Alex

    1988-01-01

    To support the evolution of domain expertise, and its representation in an expert system knowledge base, a user-friendly rule base editor is mandatory. The Nickel Cadmium Battery Expert System (NICBES), a prototype expert system for the Hubble Space Telescope power storage management system, does not provide such an editor. In the following, a Simple Interactive Rule Base Editor (SIRE) for NICBES is described. SIRE provides a consistent internal representation of the NICBES knowledge base. It supports knowledge presentation and provides a user-friendly, code-language-independent medium for rule addition and modification. SIRE is integrated with NICBES via an interface module. This module provides translation of the internal representation to Prolog-type rules (Horn clauses), subsequent rule assertion, and a simple mechanism for rule selection for its Prolog inference engine.

  9. Cost-effectiveness of different strategies for selecting and treating individuals at increased risk of osteoporosis or osteopenia: a systematic review.

    PubMed

    Müller, Dirk; Pulm, Jannis; Gandjour, Afschin

    2012-01-01

    To compare cost-effectiveness modeling analyses of strategies to prevent osteoporotic and osteopenic fractures either based on fixed thresholds using bone mineral density or based on variable thresholds including bone mineral density and clinical risk factors. A systematic review was performed by using the MEDLINE database and reference lists from previous reviews. On the basis of predefined inclusion/exclusion criteria, we identified relevant studies published since January 2006. Articles included for the review were assessed for their methodological quality and results. The literature search resulted in 24 analyses, 14 of them using a fixed-threshold approach and 10 using a variable-threshold approach. On average, 70% of the criteria for methodological quality were fulfilled, but almost half of the analyses did not include medication adherence in the base case. The results of variable-threshold strategies were more homogeneous and showed more favorable incremental cost-effectiveness ratios compared with those based on a fixed threshold with bone mineral density. For analyses with fixed thresholds, incremental cost-effectiveness ratios varied from €80,000 per quality-adjusted life-year in women aged 55 years to cost saving in women aged 80 years. For analyses with variable thresholds, the range was €47,000 to cost savings. Risk assessment using variable thresholds appears to be more cost-effective than selecting high-risk individuals by fixed thresholds. Although the overall quality of the studies was fairly good, future economic analyses should further improve their methods, particularly in terms of including more fracture types, incorporating medication adherence, and including or discussing unrelated costs during added life-years. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  10. Setting nutrient thresholds to support an ecological assessment based on nutrient enrichment, potential primary production and undesirable disturbance.

    PubMed

    Devlin, Michelle; Painting, Suzanne; Best, Mike

    2007-01-01

    The EU Water Framework Directive recognises that ecological status is supported by the prevailing physico-chemical conditions in each water body. This paper describes an approach to providing guidance on setting thresholds for nutrients taking account of the biological response to nutrient enrichment evident in different types of water. Indices of pressure, state and impact are used to achieve a robust nutrient (nitrogen) threshold by considering each individual index relative to a defined standard, scale or threshold. These indices include winter nitrogen concentrations relative to a predetermined reference value; the potential of the waterbody to support phytoplankton growth (estimated as primary production); and detection of an undesirable disturbance (measured as dissolved oxygen). Proposed reference values are based on a combination of historical records, offshore (limited human influence) nutrient concentrations, literature values and modelled data. Statistical confidence is based on a number of attributes, including distance of confidence limits away from a reference threshold and how well the model is populated with real data. This evidence based approach ensures that nutrient thresholds are based on knowledge of real and measurable biological responses in transitional and coastal waters.

  11. Significance testing of rules in rule-based models of human problem solving

    NASA Technical Reports Server (NTRS)

    Lewis, C. M.; Hammer, J. M.

    1986-01-01

    Rule-based models of human problem solving have typically not been tested for statistical significance. Three methods of testing rules - analysis of variance, randomization, and contingency tables - are presented. Advantages and disadvantages of the methods are also described.
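
    For the contingency-table method, a rule's predictions versus observed behavior form a 2x2 table that can be tested directly; a minimal sketch with invented counts:

    ```python
    from scipy.stats import chi2_contingency

    # Hypothetical 2x2 table for one rule in a problem-solving model:
    # rows = rule predicted to fire / not fire,
    # columns = subject's action matched / did not match the prediction.
    table = [[42, 8],
             [15, 35]]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.4f}")   # small p: rule predicts behavior
    ```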

  12. Brief communication: Using averaged soil moisture estimates to improve the performances of a regional-scale landslide early warning system

    NASA Astrophysics Data System (ADS)

    Segoni, Samuele; Rosi, Ascanio; Lagomarsino, Daniela; Fanti, Riccardo; Casagli, Nicola

    2018-03-01

    We communicate the results of a preliminary investigation aimed at improving a state-of-the-art RSLEWS (regional-scale landslide early warning system) based on rainfall thresholds by integrating mean soil moisture values averaged over the territorial units of the system. We tested two approaches. The simpler one can easily be applied to improve other RSLEWS: it is based on a soil moisture threshold value below which rainfall thresholds are not used, because landslides are not expected to occur. The other approach deeply modifies the original RSLEWS: thresholds based on antecedent rainfall accumulated over long periods are replaced with soil moisture thresholds. A back analysis demonstrated that both approaches consistently reduced false alarms, while the second approach reduced missed alarms as well.
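
    The first approach amounts to a simple gate in front of the rainfall threshold check. A minimal sketch, with an invented soil moisture cutoff:

    ```python
    def alarm(rain_intensity, rain_threshold, soil_moisture, sm_min=0.25):
        """Suppress rainfall-threshold alarms when averaged soil moisture is
        below a critical value, since landslides are not expected on dry
        soils. sm_min is illustrative, not the paper's calibrated value."""
        if soil_moisture < sm_min:
            return False                    # dry catchment: skip rainfall check
        return rain_intensity > rain_threshold

    print(alarm(32.0, 30.0, soil_moisture=0.18))  # False: gated out as a likely false alarm
    print(alarm(32.0, 30.0, soil_moisture=0.35))  # True: wet soil and threshold exceeded
    ```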

  13. The Interactive Effects of the Availability of Objectives and/or Rules on Computer-Based Learning: A Replication.

    ERIC Educational Resources Information Center

    Merrill, Paul F.; And Others

    To replicate and extend the results of a previous study, this project investigated the effects of behavioral objectives and/or rules on computer-based learning task performance. The 133 subjects were randomly assigned to an example-only, objective-example, rule-example, or objective-rule-example group. The availability of rules and/or objectives…

  14. Statistical Analysis of SSMIS Sea Ice Concentration Threshold at the Arctic Sea Ice Edge during Summer Based on MODIS and Ship-Based Observational Data.

    PubMed

    Ji, Qing; Li, Fei; Pang, Xiaoping; Luo, Cong

    2018-04-05

    The threshold of sea ice concentration (SIC) is the basis for accurately calculating sea ice extent from passive microwave (PM) remote sensing data. However, the PM SIC threshold at the sea ice edge used in previous studies and released sea ice products has not always been consistent. To explore the representative value of the PM SIC threshold corresponding on average to the position of the Arctic sea ice edge during summer in recent years, we extracted sea ice edge boundaries from the Moderate-resolution Imaging Spectroradiometer (MODIS) sea ice product (MOD29, with a spatial resolution of 1 km), MODIS images (250 m), and sea ice ship-based observation points (1 km) during the fifth (CHINARE-2012) and sixth (CHINARE-2014) Chinese National Arctic Research Expeditions, and performed an overlay and comparison analysis with PM SIC derived from the Special Sensor Microwave Imager Sounder (SSMIS, with a spatial resolution of 25 km) in the summers of 2012 and 2014. Results showed that the average SSMIS SIC threshold at the Arctic sea ice edge based on ice-water boundary lines extracted from MOD29 was 33%, higher than the commonly used 15% discriminant threshold. The average SIC threshold at the sea ice edge based on ice-water boundary lines extracted by visual interpretation from four scenes of MODIS imagery was 35%, compared to an average value of 36% from the MOD29-extracted ice edge pixels for the same days. The average SIC of 31% at the sea ice edge points extracted from ship-based observations also confirms that choosing around 30% as the SIC threshold during summer is recommended for sea ice extent calculations based on SSMIS PM data. These results can provide a reference for further study of the variation of sea ice in the rapidly changing Arctic.
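
    The sensitivity of extent to the threshold choice is easy to demonstrate on a synthetic SIC grid with 25 km cells (all values illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    sic = np.clip(rng.normal(40, 30, (100, 100)), 0, 100)   # synthetic SIC grid (%)
    cell_area = 25 * 25                                      # km^2 per SSMIS-like cell

    # Extent = total area of cells at or above the SIC threshold.
    for threshold in (15, 30):
        extent = int((sic >= threshold).sum()) * cell_area
        print(f"threshold {threshold}%: extent = {extent:,} km^2")
    ```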

  15. WellnessRules: A Web 3.0 Case Study in RuleML-Based Prolog-N3 Profile Interoperation

    NASA Astrophysics Data System (ADS)

    Boley, Harold; Osmun, Taylor Michael; Craig, Benjamin Larry

    An interoperation study, WellnessRules, is described, where rules about wellness opportunities are created by participants in rule languages such as Prolog and N3, and translated within a wellness community using RuleML/XML. The wellness rules are centered around participants, as profiles, encoding knowledge about their activities conditional on the season, the time-of-day, the weather, etc. This distributed knowledge base extends FOAF profiles with a vocabulary and rules about wellness group networking. The communication between participants is organized through Rule Responder, permitting wellness-profile translation and distributed querying across engines. WellnessRules interoperates between rules and queries in the relational (Datalog) paradigm of the pure-Prolog subset of POSL and in the frame (F-logic) paradigm of N3. An evaluation of Rule Responder instantiated for WellnessRules revealed acceptable Web response times.

  16. 78 FR 62417 - Regulatory Capital Rules: Regulatory Capital, Implementation of Basel III, Capital Adequacy...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-22

    ..., Standardized Approach for Risk-Weighted Assets, Market Discipline and Disclosure Requirements, Advanced Approaches Risk-Based Capital Rule, and Market Risk Capital Rule AGENCY: Federal Deposit Insurance... Assets, Market Discipline and Disclosure Requirements, Advanced Approaches Risk-Based Capital Rule, and...

  17. Architecture For The Optimization Of A Machining Process In Real Time Through Rule-Based Expert System

    NASA Astrophysics Data System (ADS)

    Serrano, Rafael; González, Luis Carlos; Martín, Francisco Jesús

    2009-11-01

    Under the SENSOR-IA project, which received financial funding from the Order of Incentives to the Regional Technology Centers of the Council of Innovation, Science and Enterprise of Andalusia, an architecture for the real-time optimization of a machining process through a rule-based expert system has been developed. The architecture consists of a sensor data acquisition and processing engine (SATD) and a rule-based expert system (SE) which communicates with the SATD. The SE has been designed as an inference engine with an algorithm for effective action, using a modus ponens model of goal-oriented rules. The pilot test demonstrated that it is possible to govern the machining process in real time based on rules contained in an SE. The tests were done with approximate rules. Future work includes an exhaustive collection of data with different tool materials and geometries in a database to extract more precise rules.
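
    A modus ponens rule engine of the kind described reduces to forward chaining to a fixpoint. A minimal sketch; the machining rules are invented placeholders, not SENSOR-IA's rule base:

    ```python
    # Rules as (antecedent_set, consequent): fire when all antecedents hold.
    rules = [
        ({"vibration_high", "temp_rising"}, "reduce_feed_rate"),
        ({"tool_wear_high"}, "schedule_tool_change"),
        ({"reduce_feed_rate", "surface_finish_poor"}, "reduce_spindle_speed"),
    ]

    def infer(facts):
        """Fire every rule whose antecedents hold (modus ponens); repeat
        until no new facts are asserted."""
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for antecedents, consequent in rules:
                if antecedents <= facts and consequent not in facts:
                    facts.add(consequent)
                    changed = True
        return facts

    print(infer({"vibration_high", "temp_rising", "surface_finish_poor"}))
    # -> adds reduce_feed_rate, then reduce_spindle_speed via chaining
    ```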

  18. The influence of current direction on phosphene thresholds evoked by transcranial magnetic stimulation.

    PubMed

    Kammer, T; Beck, S; Erb, M; Grodd, W

    2001-11-01

    To quantify phosphene thresholds evoked by transcranial magnetic stimulation (TMS) in the occipital cortex as a function of induced current direction. Phosphene thresholds were determined in 6 subjects. We compared two stimulator types (Medtronic-Dantec and Magstim) with monophasic pulses using the standard figure-of-eight coils, and systematically varied hemisphere (left and right) and induced current direction (latero-medial and medio-lateral). Each measurement was made 3 times, with a new stimulation site chosen for each repetition. Only those stimulation sites were investigated where phosphenes were restricted to one visual hemifield. Coil positions were stereotactically registered. Functional magnetic resonance imaging (fMRI) of retinotopic areas was performed in 5 subjects to individually characterize the borders of visual areas; TMS stimulation sites were coregistered with respect to the visual areas. Despite large interindividual variance, we found a consistent pattern of phosphene thresholds. They were significantly lower if the direction of the induced current was oriented from lateral to medial in the occipital lobe rather than vice versa. No difference with respect to hemisphere was found. Threshold values normalized to the square root of the stored energy in the stimulators were lower with the Medtronic-Dantec device than with the Magstim device. fMRI revealed that the stimulation sites generating unilateral phosphenes were situated at V2 and V3. Variability of phosphene thresholds was low within a cortical patch of 2 × 2 cm². Stimulation over V1 yields phosphenes in both visual fields. The excitability of visual cortical areas depends on the direction of the induced current, with a preference for latero-medial currents. Although the coil positions used in this study were centered over visual areas V2 and V3, we cannot rule out the possibility that subcortical structures or V1 could actually be the main generator of phosphenes.

  19. Analysis and Simulation of the Simplified Aircraft-Based Paired Approach Concept With the ALAS Alerting Algorithm in Conjunction With Echelon and Offset Strategies

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo; Madden, Michael M.; Butler, Rickey W.; Perry, Raleigh B.

    2014-01-01

    This report presents analytical and simulation results of an investigation into proposed operational concepts for closely spaced parallel runways, including the Simplified Aircraft-based Paired Approach (SAPA) with alerting and an escape maneuver, MITRE's echelon spacing with no escape maneuver, and a hybrid concept aimed at lowering the visibility minima. We found that the SAPA procedure can be used at 950 ft separations or higher with next-generation avionics and that 1150 ft separations or higher are feasible with current-rule compliant ADS-B OUT. An additional 50 ft reduction in runway separation for the SAPA procedure is possible if different glideslopes are used. For the echelon concept we determined that current-generation aircraft cannot conduct paired approaches on parallel paths using echelon spacing on runways less than 1400 ft apart, and next-generation aircraft will not be able to conduct paired approaches on runways less than 1050 ft apart. The hybrid concept added alerting and an escape maneuver starting 1 NM from the threshold when flying the echelon concept. This combination was found to be effective, but the probability of a collision can be seriously impacted if the turn component of the escape maneuver has to be disengaged near the ground (e.g. 300 ft or below) due to airport buildings and surrounding terrain. We also found that stabilizing the approach path in the straight-in segment was only possible if the merge point was at least 1.5 to 2 NM from the threshold, unless the total system error can be sufficiently constrained on the offset path and final turn.

  20. A computer-aided detection (CAD) system with a 3D algorithm for small acute intracranial hemorrhage

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Fernandez, James; Deshpande, Ruchi; Lee, Joon K.; Chan, Tao; Liu, Brent

    2012-02-01

    Acute intracranial hemorrhage (AIH) requires urgent diagnosis in the emergency setting to mitigate eventual sequelae. However, experienced radiologists may not always be available to make a timely diagnosis. This is especially true for small AIH, defined as lesions smaller than 10 mm in size. A computer-aided detection (CAD) system for the detection of small AIH would facilitate timely diagnosis. A previously developed 2D algorithm showed high false-positive rates in an evaluation based on LAC/USC cases, owing to the difficulty of setting up a correct coordinate system for the knowledge-based classification system. To achieve higher sensitivity and specificity, a new 3D algorithm was developed. The algorithm utilizes a top-hat transformation and a dynamic threshold map to detect small AIH lesions. Several key brain structures are detected and used to set up a 3D anatomical coordinate system, and a rule-based classification of the detected lesions is applied within that coordinate system. For convenient evaluation in a clinical environment, the CAD module is integrated with a stand-alone system. The CAD is evaluated with small AIH cases and matched normal cases collected at LAC/USC, and the results of the 3D CAD are compared with those of the previous 2D CAD.
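
    The detection core (top-hat transformation plus dynamic threshold map) can be sketched with scipy.ndimage. The synthetic volume, structuring-element size, window size, and k factor are illustrative; the paper's anatomical coordinate system and rule-based classification stage are not reproduced here:

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(4)
    vol = rng.normal(40, 3, (32, 64, 64))        # synthetic brain-like CT volume (HU)
    vol[16:19, 30:34, 30:34] += 30               # small bright lesion (< 10 mm analogue)

    # White top-hat: subtract a morphological opening so only structures
    # smaller than the structuring element survive (small bright lesions).
    tophat = ndimage.white_tophat(vol, size=(5, 9, 9))

    # Dynamic threshold map: local mean + k * local std, so the cutoff
    # adapts to regional background instead of being one global value.
    local_mean = ndimage.uniform_filter(tophat, size=9)
    local_sq = ndimage.uniform_filter(tophat**2, size=9)
    local_std = np.sqrt(np.clip(local_sq - local_mean**2, 0, None))
    mask = tophat > local_mean + 3 * local_std
    print("candidate lesion voxels:", mask.sum())
    # A rule-based classification stage would then filter these candidates.
    ```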
