Sample records for basic probability assignment

  1. Advanced Issues in Propensity Scores: Longitudinal and Missing Data

    ERIC Educational Resources Information Center

    Kupzyk, Kevin A.; Beal, Sarah J.

    2017-01-01

    In order to investigate causality in situations where random assignment is not possible, propensity scores can be used in regression adjustment, stratification, inverse-probability treatment weighting, or matching. The basic concepts behind propensity scores have been extensively described. When data are longitudinal or missing, the estimation and…

  2. Routing in Networks with Random Topologies

    NASA Technical Reports Server (NTRS)

    Bambos, Nicholas

    1997-01-01

    We examine the problems of routing and server assignment in networks with random connectivities. In such a network the basic topology is fixed, but during each time slot and for each of its input queues, each server (node) is either connected to or disconnected from each of its queues with some probability.

  3. A cosmic book [of physics of early universe]

    NASA Technical Reports Server (NTRS)

    Peebles, P. J. E.; Silk, Joseph

    1988-01-01

    A system of assigning odds to the basic elements of cosmological theories is proposed in order to evaluate the strengths and weaknesses of the theories. A figure of merit for the theories is obtained by counting and weighing the plausibility of each of the basic elements that is not substantially supported by observation or mature fundamental theory. The magnetized string model is found to be the most probable. In order of decreasing probability, the ranking for the rest of the models is: (1) the magnetized string model with no exotic matter and the baryon adiabatic model; (2) the hot dark matter model and the model of cosmic string loops; (3) the canonical cold dark matter model, the cosmic string loops model with hot dark matter, and the baryonic isocurvature model; and (4) the cosmic string loops model with no exotic matter.

  4. Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powell, Danny H; Elwood Jr, Robert H

    2011-01-01

    Analysis of the material protection, control, and accountability (MPC&A) system is necessary to understand the limits and vulnerabilities of the system to internal threats. A self-appraisal helps the facility be prepared to respond to internal threats and reduce the risk of theft or diversion of nuclear material. The material control and accountability (MC&A) system effectiveness tool (MSET) fault tree was developed to depict the failure of the MPC&A system as a result of poor practices and random failures in the MC&A system. It can also be employed as a basis for assessing deliberate threats against a facility. MSET uses fault tree analysis, which is a top-down approach to examining system failure. The analysis starts with identifying a potential undesirable event called a 'top event' and then determining the ways it can occur (e.g., 'Fail To Maintain Nuclear Materials Under The Purview Of The MC&A System'). The analysis proceeds by determining how the top event can be caused by individual or combined lower level faults or failures. These faults, which are the causes of the top event, are 'connected' through logic gates. The MSET model uses AND-gates and OR-gates and propagates the effect of event failure using Boolean algebra. To enable the fault tree analysis calculations, the basic events in the fault tree are populated with probability risk values derived by conversion of questionnaire data to numeric values. The basic events are treated as independent variables. This assumption affects the Boolean algebraic calculations used to calculate results. All the necessary calculations are built into the fault tree codes, but it is often useful to estimate the probabilities manually as a check on code functioning. The probability of failure of a given basic event is the probability that the basic event primary question fails to meet the performance metric for that question. The failure probability is related to how well the facility performs the task identified in that basic event over time (not just one performance or exercise). Fault tree calculations provide a failure probability for the top event in the fault tree. The basic fault tree calculations establish a baseline relative risk value for the system. This probability depicts relative risk, not absolute risk. Subsequent calculations are made to evaluate the change in relative risk that would occur if system performance is improved or degraded. During the development effort of MSET, the fault tree analysis program used was SAPHIRE. SAPHIRE is an acronym for 'Systems Analysis Programs for Hands-on Integrated Reliability Evaluations.' Version 1 of the SAPHIRE code was sponsored by the Nuclear Regulatory Commission in 1987 as an innovative way to draw, edit, and analyze graphical fault trees primarily for safe operation of nuclear power reactors. When the fault tree calculations are performed, the fault tree analysis program will produce several reports that can be used to analyze the MPC&A system. SAPHIRE produces reports showing risk importance factors for all basic events in the operational MC&A system. The risk importance information is used to examine the potential impacts when performance of certain basic events increases or decreases. The initial results produced by the SAPHIRE program are considered relative risk values.
None of the results can be interpreted as absolute risk values since the basic event probability values represent estimates of risk associated with the performance of MPC&A tasks throughout the material balance area (MBA). The risk reduction ratio (RRR) for a basic event represents the decrease in total system risk that would result from improvement of that one event to a perfect performance level. Improvement of the basic event with the greatest RRR value produces a greater decrease in total system risk than improvement of any other basic event. Basic events with the greatest potential for system risk reduction are assigned performance improvement values, and new fault tree calculations show the improvement in total system risk. The operational impact or cost-effectiveness from implementing the performance improvements can then be evaluated. The improvements being evaluated can be system performance improvements, or they can be potential, or actual, upgrades to the system. The risk increase ratio (RIR) for a basic event represents the increase in total system risk that would result from failure of that one event. Failure of the basic event with the greatest RIR value produces a greater increase in total system risk than failure of any other basic event. Basic events with the greatest potential for system risk increase are assigned failure performance values, and new fault tree calculations show the increase in total system risk. This evaluation shows the importance of preventing performance degradation of the basic events. SAPHIRE identifies combinations of basic events where concurrent failure of the events results in failure of the top event.
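
    The gate arithmetic described in this record (independent basic events combined through AND-gates and OR-gates via Boolean algebra, plus risk reduction/increase checks) can be reproduced by hand. Below is a minimal sketch assuming a hypothetical two-gate tree and made-up basic-event probabilities; it is not the actual MSET fault tree or SAPHIRE output.

```python
# Minimal sketch of fault tree probability propagation with independent
# basic events, in the spirit of the MSET/SAPHIRE description above.
# The tree structure and probabilities are hypothetical, not the MSET model.

def p_and(probs):
    """AND gate: the output fails only if every input fails (independent events)."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def p_or(probs):
    """OR gate: the output fails if at least one input fails (independent events)."""
    survive = 1.0
    for p in probs:
        survive *= (1.0 - p)
    return 1.0 - survive

# Hypothetical basic-event failure probabilities (e.g., from questionnaire scores).
basic_events = {"A": 0.02, "B": 0.10, "C": 0.05}

def top_event(events):
    """Top event fails if A fails AND (B OR C) fails -- a made-up two-gate tree."""
    return p_and([events["A"], p_or([events["B"], events["C"]])])

baseline = top_event(basic_events)
print(f"baseline top-event probability: {baseline:.5f}")

# Importance checks in the spirit of RRR/RIR: set one event to perfect
# performance (p = 0) or to certain failure (p = 1) and recompute.
for name in basic_events:
    perfect = top_event({**basic_events, name: 0.0})
    failed = top_event({**basic_events, name: 1.0})
    print(f"{name}: top event if perfect = {perfect:.5f}, if failed = {failed:.5f}")
```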

  5. Kappa and Rater Accuracy: Paradigms and Parameters.

    PubMed

    Conger, Anthony J

    2017-12-01

    Drawing parallels to classical test theory, this article clarifies the difference between rater accuracy and reliability and demonstrates how category marginal frequencies affect rater agreement and Cohen's kappa (κ). Category assignment paradigms are developed: comparing raters to a standard (index) versus comparing two raters to one another (concordance), using both nonstochastic and stochastic category membership. Using a probability model to express category assignments in terms of rater accuracy and random error, it is shown that observed agreement (Po) depends only on rater accuracy and number of categories; however, expected agreement (Pe) and κ depend additionally on category frequencies. Moreover, category frequencies affect Pe and κ solely through the variance of the category proportions, regardless of the specific frequencies underlying the variance. Paradoxically, some judgment paradigms involving stochastic categories are shown to yield higher κ values than their nonstochastic counterparts. Using the stated probability model, assignments to categories were generated for 552 combinations of paradigms, rater and category parameters, category frequencies, and number of stimuli. Observed means and standard errors for Po, Pe, and κ were fully consistent with theory expectations. Guidelines for interpretation of rater accuracy and reliability are offered, along with a discussion of alternatives to the basic model.
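
    For readers who want the agreement quantities in concrete form, here is a minimal sketch of Po, Pe, and Cohen's kappa computed from a hypothetical two-rater contingency table; it illustrates only the standard kappa arithmetic, not the article's accuracy-based probability model or its 552 simulated conditions.

```python
import numpy as np

# Hypothetical 3-category contingency table of counts: rows = rater 1, cols = rater 2.
table = np.array([[20,  5,  2],
                  [ 4, 30,  6],
                  [ 1,  7, 25]], dtype=float)

n = table.sum()
po = np.trace(table) / n              # observed agreement Po
marg1 = table.sum(axis=1) / n         # rater 1 category proportions
marg2 = table.sum(axis=0) / n         # rater 2 category proportions
pe = float(marg1 @ marg2)             # expected (chance) agreement Pe
kappa = (po - pe) / (1.0 - pe)        # Cohen's kappa

print(f"Po = {po:.3f}, Pe = {pe:.3f}, kappa = {kappa:.3f}")
```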

  6. Bayesian Probability Theory

    NASA Astrophysics Data System (ADS)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  7. Quantum probability assignment limited by relativistic causality.

    PubMed

    Han, Yeong Deok; Choi, Taeseung

    2016-03-14

    Quantum theory exhibits nonlocal correlations, which bothered Einstein but were found to satisfy relativistic causality. In the standard quantum framework, the correlation for a shared quantum state manifests itself in joint probability distributions that are obtained by applying state reduction and the probability assignment rule known as the Born rule. Quantum correlations, which show nonlocality when the shared state is entangled, can be changed if a different probability assignment rule is applied. As a result, the amount of nonlocality in the quantum correlation will change. The issue is whether changing the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule for quantum measurement is derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through the quantum probability assignment.

  8. Stress management training for military trainees returned to duty after a mental health evaluation: effect on graduation rates.

    PubMed

    Cigrang, J A; Todd, S L; Carbone, E G

    2000-01-01

    A significant proportion of people entering the military are discharged within the first 6 months of enlistment. Mental health related problems are often cited as the cause of discharge. This study evaluated the utility of stress inoculation training in helping reduce the attrition of a sample of Air Force trainees at risk for discharge from basic military training. Participants were 178 trainees referred for a psychological evaluation from basic training. Participants were randomly assigned to a 2-session stress management group or a usual-care control condition. Compared with past studies that used less rigorous methodology, this study did not find that exposure to stress management information increased the probability of graduating basic military training. Results are discussed in terms of possible reasons for the lack of treatment effects and directions for future research.

  9. Operation Condition Monitoring using Temporal Weighted Dempster-Shafer Theory

    DTIC Science & Technology

    2014-12-23

    are mutually exclusive; a mapping m: 2^Θ → [0, 1], which defines the basic probability assignment (BPA) of each subset A of hypotheses and...satisfying m(∅) = 0; Σ_{A⊆Θ} m(A) = 1. The BPA represents a certain piece of evidence. A rule of D-S evidence combination, which could be used to...yield a new BPA from two independent evidences and their BPAs. There are a number of possible combination rules in application (Sentz, 2002). One
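
    As a concrete illustration of the BPA definition and combination rule quoted in this snippet, the following minimal sketch implements Dempster's rule for two BPAs over a small, made-up frame of discernment; the frame and mass values are hypothetical.

```python
# Frame of discernment (hypothetical): possible machine conditions.
FRAME = frozenset({"normal", "degraded", "failed"})

# Two basic probability assignments (BPAs): each maps subsets of the frame to
# masses that sum to 1, with m(empty set) = 0.
m1 = {frozenset({"normal"}): 0.6, frozenset({"degraded", "failed"}): 0.3, FRAME: 0.1}
m2 = {frozenset({"degraded"}): 0.5, frozenset({"normal", "degraded"}): 0.4, FRAME: 0.1}

def dempster_combine(m1, m2):
    """Combine two independent BPAs with Dempster's rule of combination."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb          # mass that falls on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict; Dempster's rule is undefined")
    # Normalize by 1 - K to redistribute the conflicting mass.
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

m12 = dempster_combine(m1, m2)
for subset, mass in sorted(m12.items(), key=lambda kv: -kv[1]):
    print(sorted(subset), round(mass, 4))
```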

  10. Bayesian data analysis tools for atomic physics

    NASA Astrophysics Data System (ADS)

    Trassinelli, Martino

    2017-10-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or absence of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to uniquely determine the spectrum modeling. For these two studies, we implement the program Nested_fit to calculate the different probability distributions and other related quantities. Nested_fit is a Fortran90/Python code developed during the last years for the analysis of atomic spectra. As indicated by the name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.
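
    To make the use of Bayesian evidence for comparing hypotheses concrete, here is a minimal sketch that computes the evidence for two hypothetical models of the same data by brute-force integration over a single parameter; this is plain grid integration, not the nested sampling algorithm used by Nested_fit.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=0.4, scale=1.0, size=50)   # hypothetical measurements

def log_likelihood(mu, x, sigma=1.0):
    """Gaussian log-likelihood of the data for a given mean mu."""
    return float(np.sum(-0.5 * ((x - mu) / sigma) ** 2
                        - np.log(sigma * np.sqrt(2.0 * np.pi))))

# Hypothesis 0: mu fixed at 0; its evidence is just the likelihood at mu = 0.
log_z0 = log_likelihood(0.0, data)

# Hypothesis 1: mu free with a uniform prior on [-2, 2]; the evidence is the
# integral of likelihood * prior over mu, done here on a simple grid.
mu_grid = np.linspace(-2.0, 2.0, 2001)
prior = 1.0 / 4.0
log_like = np.array([log_likelihood(m, data) for m in mu_grid])
shift = log_like.max()                                  # avoid numerical underflow
z1 = np.sum(np.exp(log_like - shift) * prior) * (mu_grid[1] - mu_grid[0])
log_z1 = np.log(z1) + shift

print(f"log evidence, fixed-mu hypothesis: {log_z0:.2f}")
print(f"log evidence, free-mu hypothesis : {log_z1:.2f}")
print(f"Bayes factor (free / fixed)      : {np.exp(log_z1 - log_z0):.2f}")
```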

  11. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.

  12. Use of genetic data to infer population-specific ecological and phenotypic traits from mixed aggregations

    USGS Publications Warehouse

    Moran, Paul; Bromaghin, Jeffrey F.; Masuda, Michele

    2014-01-01

    Many applications in ecological genetics involve sampling individuals from a mixture of multiple biological populations and subsequently associating those individuals with the populations from which they arose. Analytical methods that assign individuals to their putative population of origin have utility in both basic and applied research, providing information about population-specific life history and habitat use, ecotoxins, pathogen and parasite loads, and many other non-genetic ecological, or phenotypic traits. Although the question is initially directed at the origin of individuals, in most cases the ultimate desire is to investigate the distribution of some trait among populations. Current practice is to assign individuals to a population of origin and study properties of the trait among individuals within population strata as if they constituted independent samples. It seemed that approach might bias population-specific trait inference. In this study we made trait inferences directly through modeling, bypassing individual assignment. We extended a Bayesian model for population mixture analysis to incorporate parameters for the phenotypic trait and compared its performance to that of individual assignment with a minimum probability threshold for assignment. The Bayesian mixture model outperformed individual assignment under some trait inference conditions. However, by discarding individuals whose origins are most uncertain, the individual assignment method provided a less complex analytical technique whose performance may be adequate for some common trait inference problems. Our results provide specific guidance for method selection under various genetic relationships among populations with different trait distributions.

  13. Use of Genetic Data to Infer Population-Specific Ecological and Phenotypic Traits from Mixed Aggregations

    PubMed Central

    Moran, Paul; Bromaghin, Jeffrey F.; Masuda, Michele

    2014-01-01

    Many applications in ecological genetics involve sampling individuals from a mixture of multiple biological populations and subsequently associating those individuals with the populations from which they arose. Analytical methods that assign individuals to their putative population of origin have utility in both basic and applied research, providing information about population-specific life history and habitat use, ecotoxins, pathogen and parasite loads, and many other non-genetic ecological, or phenotypic traits. Although the question is initially directed at the origin of individuals, in most cases the ultimate desire is to investigate the distribution of some trait among populations. Current practice is to assign individuals to a population of origin and study properties of the trait among individuals within population strata as if they constituted independent samples. It seemed that approach might bias population-specific trait inference. In this study we made trait inferences directly through modeling, bypassing individual assignment. We extended a Bayesian model for population mixture analysis to incorporate parameters for the phenotypic trait and compared its performance to that of individual assignment with a minimum probability threshold for assignment. The Bayesian mixture model outperformed individual assignment under some trait inference conditions. However, by discarding individuals whose origins are most uncertain, the individual assignment method provided a less complex analytical technique whose performance may be adequate for some common trait inference problems. Our results provide specific guidance for method selection under various genetic relationships among populations with different trait distributions. PMID:24905464

  14. 41 CFR 102-79.10 - What basic assignment and utilization of space policy governs an Executive agency?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and utilization of space policy governs an Executive agency? 102-79.10 Section 102-79.10 Public... MANAGEMENT REGULATION REAL PROPERTY 79-ASSIGNMENT AND UTILIZATION OF SPACE General Provisions § 102-79.10 What basic assignment and utilization of space policy governs an Executive agency? Executive agencies...

  15. 41 CFR 102-79.10 - What basic assignment and utilization of space policy governs an Executive agency?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and utilization of space policy governs an Executive agency? 102-79.10 Section 102-79.10 Public... MANAGEMENT REGULATION REAL PROPERTY 79-ASSIGNMENT AND UTILIZATION OF SPACE General Provisions § 102-79.10 What basic assignment and utilization of space policy governs an Executive agency? Executive agencies...

  16. 41 CFR 102-79.10 - What basic assignment and utilization of space policy governs an Executive agency?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... and utilization of space policy governs an Executive agency? 102-79.10 Section 102-79.10 Public... MANAGEMENT REGULATION REAL PROPERTY 79-ASSIGNMENT AND UTILIZATION OF SPACE General Provisions § 102-79.10 What basic assignment and utilization of space policy governs an Executive agency? Executive agencies...

  17. 41 CFR 102-79.10 - What basic assignment and utilization of space policy governs an Executive agency?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... and utilization of space policy governs an Executive agency? 102-79.10 Section 102-79.10 Public... MANAGEMENT REGULATION REAL PROPERTY 79-ASSIGNMENT AND UTILIZATION OF SPACE General Provisions § 102-79.10 What basic assignment and utilization of space policy governs an Executive agency? Executive agencies...

  18. An evidential reasoning extension to quantitative model-based failure diagnosis

    NASA Technical Reports Server (NTRS)

    Gertler, Janos J.; Anderson, Kenneth C.

    1992-01-01

    The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.

  19. [Biometric bases: basic concepts of probability calculation].

    PubMed

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of the event algebra, the definition of probability, the classical probability model, and the random variable are presented.

  20. GREMEX- GODDARD RESEARCH AND ENGINEERING MANAGEMENT EXERCISE SIMULATION SYSTEM

    NASA Technical Reports Server (NTRS)

    Vaccaro, M. J.

    1994-01-01

    GREMEX is a man-machine management simulation game of a research and development project. It can be used to depict a project from just after the development of the project plan through the final construction phase. The GREMEX computer programs are basically a program evaluation and review technique (PERT) reporting system. In the usual PERT program, the operator inputs each month the amount of work performed on each activity and the computer does the bookkeeping to determine the expected completion date of the project. GREMEX automatically assumes that all activities due to be worked in the current month will be worked. GREMEX predicts new durations (and costs) each month based on management actions taken by the players and the contractor's abilities. Each activity is assigned the usual cost and duration estimates but must also be assigned three parameters that relate to the probability that the time estimate is correct, the probability that the cost estimate is correct, and the probability of technical success. Management actions usually can be expected to change these probabilities. For example, use of overtime or double shifts in research and development work will decrease duration and increase cost by known proportions and will also decrease the probability of technical success due to an increase in the likelihood of accidents or mistakes. This re-estimation of future events and assignment of probability factors gives life to the model. GREMEX is not a production job for project management. GREMEX is a game that can be used to train management personnel in the administration of research and development type projects. GREMEX poses no 'best way' to manage a project. The emphasis of GREMEX is to expose participants to many of the factors involved in decision making when managing a project in a government research and development environment. A management team can win the game by surpassing cost, schedule, and technical performance goals established when the simulation began. The serious management experimenter can use GREMEX to explore the results of management methods they could not risk in real life. GREMEX can operate with any research and development type project with up to 15 subcontractors and produces reports simulating monthly or quarterly updates of the project PERT network. Included with the program is a data deck for simulation of a fictitious spacecraft project. Instructions for substituting other projects are also included. GREMEX is written in FORTRAN IV for execution in the batch mode and has been implemented on an IBM 360 with a central memory requirement of approximately 350K (decimal) of 8 bit bytes. The GREMEX system was developed in 1973.

  1. Saccade selection when reward probability is dynamically manipulated using Markov chains

    PubMed Central

    Lovejoy, Lee P.; Krauzlis, Richard J.

    2012-01-01

    Markov chains (stochastic processes where probabilities are assigned based on the previous outcome) are commonly used to examine the transitions between behavioral states, such as those that occur during foraging or social interactions. However, relatively little is known about how well primates can incorporate knowledge about Markov chains into their behavior. Saccadic eye movements are an example of a simple behavior influenced by information about probability, and thus are good candidates for testing whether subjects can learn Markov chains. In addition, when investigating the influence of probability on saccade target selection, the use of Markov chains could provide an alternative method that avoids confounds present in other task designs. To investigate these possibilities, we evaluated human behavior on a task in which stimulus reward probabilities were assigned using a Markov chain. On each trial, the subject selected one of four identical stimuli by saccade; after selection, feedback indicated the rewarded stimulus. Each session consisted of 200–600 trials, and on some sessions, the reward magnitude varied. On sessions with a uniform reward, subjects (n = 6) learned to select stimuli at a frequency close to reward probability, which is similar to human behavior on matching or probability classification tasks. When informed that a Markov chain assigned reward probabilities, subjects (n = 3) learned to select the greatest reward probability more often, bringing them close to behavior that maximizes reward. On sessions where reward magnitude varied across stimuli, subjects (n = 6) demonstrated preferences for both greater reward probability and greater reward magnitude, resulting in a preference for greater expected value (the product of reward probability and magnitude). These results demonstrate that Markov chains can be used to dynamically assign probabilities that are rapidly exploited by human subjects during saccade target selection. PMID:18330552

  2. Saccade selection when reward probability is dynamically manipulated using Markov chains.

    PubMed

    Nummela, Samuel U; Lovejoy, Lee P; Krauzlis, Richard J

    2008-05-01

    Markov chains (stochastic processes where probabilities are assigned based on the previous outcome) are commonly used to examine the transitions between behavioral states, such as those that occur during foraging or social interactions. However, relatively little is known about how well primates can incorporate knowledge about Markov chains into their behavior. Saccadic eye movements are an example of a simple behavior influenced by information about probability, and thus are good candidates for testing whether subjects can learn Markov chains. In addition, when investigating the influence of probability on saccade target selection, the use of Markov chains could provide an alternative method that avoids confounds present in other task designs. To investigate these possibilities, we evaluated human behavior on a task in which stimulus reward probabilities were assigned using a Markov chain. On each trial, the subject selected one of four identical stimuli by saccade; after selection, feedback indicated the rewarded stimulus. Each session consisted of 200-600 trials, and on some sessions, the reward magnitude varied. On sessions with a uniform reward, subjects (n = 6) learned to select stimuli at a frequency close to reward probability, which is similar to human behavior on matching or probability classification tasks. When informed that a Markov chain assigned reward probabilities, subjects (n = 3) learned to select the greatest reward probability more often, bringing them close to behavior that maximizes reward. On sessions where reward magnitude varied across stimuli, subjects (n = 6) demonstrated preferences for both greater reward probability and greater reward magnitude, resulting in a preference for greater expected value (the product of reward probability and magnitude). These results demonstrate that Markov chains can be used to dynamically assign probabilities that are rapidly exploited by human subjects during saccade target selection.
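
    A minimal sketch of the kind of task structure described in these two records: the rewarded stimulus moves between four alternatives according to a hypothetical persistent Markov chain, and two simulated strategies are compared, one that matches marginal reward frequencies and one that exploits the chain by repeating the previously rewarded choice. The transition matrix and strategies are illustrative assumptions, not the published task parameters or subject behaviour.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Markov chain over which of 4 stimuli is rewarded on the next
# trial, given which one is rewarded now (0.7 chance it stays the same).
transition = np.array([[0.70, 0.10, 0.10, 0.10],
                       [0.10, 0.70, 0.10, 0.10],
                       [0.10, 0.10, 0.70, 0.10],
                       [0.10, 0.10, 0.10, 0.70]])

n_trials = 5000
state = 0                      # rewarded stimulus on the current trial
prev_rewarded = 0              # stimulus the subject saw rewarded last trial
counts = np.zeros(4)           # running count of rewarded stimulus identities
hits_matching = 0              # strategy 1: match marginal reward frequencies
hits_markov = 0                # strategy 2: repeat the previously rewarded choice

for t in range(n_trials):
    probs = (counts + 1.0) / (counts.sum() + 4.0)
    choice_matching = rng.choice(4, p=probs)
    choice_markov = prev_rewarded
    hits_matching += int(choice_matching == state)
    hits_markov += int(choice_markov == state)
    counts[state] += 1.0                        # feedback reveals the rewarded stimulus
    prev_rewarded = state
    state = rng.choice(4, p=transition[state])  # chain moves to the next trial

print(f"reward rate, marginal matching   : {hits_matching / n_trials:.3f}")
print(f"reward rate, exploiting the chain: {hits_markov / n_trials:.3f}")
```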

  3. Teaching Basic Probability in Undergraduate Statistics or Management Science Courses

    ERIC Educational Resources Information Center

    Naidu, Jaideep T.; Sanford, John F.

    2017-01-01

    Standard textbooks in core Statistics and Management Science classes present various examples to introduce basic probability concepts to undergraduate business students. These include tossing of a coin, throwing a die, and examples of that nature. While these are good examples to introduce basic probability, we use improvised versions of Russian…

  4. A probability space for quantum models

    NASA Astrophysics Data System (ADS)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints, and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
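
    A minimal sketch of the maximum entropy assignment mentioned above: given a discrete set of outcomes and an average-value constraint, the probabilities that maximize entropy are found numerically. The outcome values and constraint are hypothetical, and the generic numerical solver is a substitute for the analytic treatment in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical outcomes (e.g., energy levels) and a constraint on their mean.
values = np.array([1.0, 2.0, 3.0, 4.0])
target_mean = 1.8

def neg_entropy(p):
    """Negative Shannon entropy, to be minimized."""
    p = np.clip(p, 1e-12, 1.0)
    return float(np.sum(p * np.log(p)))

constraints = (
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},                   # normalization
    {"type": "eq", "fun": lambda p: np.dot(p, values) - target_mean},   # mean constraint
)
p0 = np.full(len(values), 0.25)
result = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * len(values),
                  constraints=constraints, method="SLSQP")

print("maximum-entropy probabilities:", np.round(result.x, 4))
print("mean under the assignment    :", round(float(np.dot(result.x, values)), 4))
```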

  5. Threshold-selecting strategy for best possible ground state detection with genetic algorithms

    NASA Astrophysics Data System (ADS)

    Lässig, Jörg; Hoffmann, Karl Heinz

    2009-04-01

    Genetic algorithms are a standard heuristic to find states of low energy in complex state spaces as given by physical systems such as spin glasses but also in combinatorial optimization. The paper considers the problem of selecting individuals in the current population in genetic algorithms for crossover. Many schemes have been considered in the literature as possible crossover selection strategies. We show for a large class of quality measures that the best possible probability distribution for selecting individuals in each generation of the algorithm execution is a rectangular distribution over the individuals sorted by their energy values. This means that uniform probabilities are assigned to a group of the lowest-energy individuals in the population, while probabilities of zero are assigned to individuals whose energy values lie above a fixed cutoff, corresponding to a certain rank in the vector of states in the current population sorted by energy. The considered strategy is dubbed threshold selecting. The proof applies basic arguments of Markov chains and linear optimization and makes only a few assumptions on the underlying principles and hence applies to a large class of algorithms.
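
    A minimal sketch of the threshold-selecting rule described above: uniform selection probability over the lowest-energy individuals up to a fixed cutoff rank, zero probability above it. The population, energy function, and cutoff rank are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

population = rng.standard_normal((20, 8))    # 20 hypothetical individuals, 8 genes each
energies = np.sum(population ** 2, axis=1)   # hypothetical energy function
cutoff_rank = 5                              # only the 5 lowest-energy individuals are eligible

def threshold_select(energies, cutoff_rank, rng):
    """Return the index of one parent under threshold selecting:
    uniform probability over the cutoff_rank lowest-energy individuals,
    zero probability for everyone above the cutoff."""
    eligible = np.argsort(energies)[:cutoff_rank]
    return int(rng.choice(eligible))

parents = [threshold_select(energies, cutoff_rank, rng) for _ in range(10)]
print("selected parent indices:", parents)
print("their energies         :", np.round(energies[parents], 3))
```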

  6. ELECTRICAL AND ELECTRONIC INDUSTRIAL CONTROL. BASIC ELECTRICITY, UNIT 3, ASSIGNMENTS.

    ERIC Educational Resources Information Center

    SUTTON, MACK C.

    THIS GUIDE IS FOR INDIVIDUAL STUDENT USE IN STUDYING BASIC ELECTRICAL FUNDAMENTALS IN ELECTRICAL-ELECTRONIC PROGRAMS. IT WAS DEVELOPED BY AN INSTRUCTIONAL MATERIALS SPECIALIST AND ADVISERS. THE COURSE OBJECTIVE IS TO DEVELOP AN UNDERSTANDING OF DIRECT CURRENT FUNDAMENTALS. EACH OF THE 10 ASSIGNMENT SHEETS PROVIDES THE LESSON SUBJECT, PURPOSE,…

  7. ELECTRICAL AND ELECTRONIC INDUSTRIAL CONTROL. BASIC ELECTRICITY, UNIT 2, ASSIGNMENTS.

    ERIC Educational Resources Information Center

    SUTTON, MACK C.

    THIS GUIDE IS FOR INDIVIDUAL STUDENT USE IN STUDYING BASIC ELECTRICAL FUNDAMENTALS IN ELECTRICAL-ELECTRONIC PROGRAMS. IT WAS DEVELOPED BY AN INSTRUCTIONAL MATERIALS SPECIALIST AND ADVISERS. THE COURSE OBJECTIVE IS TO DEVELOP AN UNDERSTANDING OF DIRECT CURRENT FUNDAMENTALS. EACH OF THE 15 ASSIGNMENT SHEETS PROVIDES THE LESSON SUBJECT, PURPOSE,…

  8. Introduction to Probability, Part 1 - Basic Concepts. Student Text. Revised Edition.

    ERIC Educational Resources Information Center

    Blakeslee, David W.; And Others

    This book is designed to introduce the reader to some fundamental ideas about probability. The mathematical theory of probability plays an increasingly important role in science, government, industry, business, and economics. An understanding of the basic concepts of probability is essential for the study of statistical methods that are widely…

  9. Unders and Overs: Using a Dice Game to Illustrate Basic Probability Concepts

    ERIC Educational Resources Information Center

    McPherson, Sandra Hanson

    2015-01-01

    In this paper, the dice game "Unders and Overs" is described and presented as an active learning exercise to introduce basic probability concepts. The implementation of the exercise is outlined and the resulting presentation of various probability concepts is described.

  10. Loss Aversion in the Classroom: A Nudge towards a Better Grade?

    ERIC Educational Resources Information Center

    Grijalva, Therese; Koford, Brandon C.; Parkhurst, Gregory

    2018-01-01

    Using data from 499 students over 12 sections, 2 courses, and 3 instructors, we estimate the effect of loss aversion on the probability of turning in extra credit assignments and the effect on the overall grade. Regression results indicate no effect of loss aversion on the probability of turning in extra credit assignments and no effect on a…

  11. Beyond the Bridge Metaphor: Rethinking the Place of the Literacy Narrative in the Basic Writing Curriculum

    ERIC Educational Resources Information Center

    Hall, Anne-Marie; Minnix, Christopher

    2012-01-01

    Critical analysis of the literacy narrative assignment within the context of the other genres in a basic writing course complicates understandings of the political import of the assignment. While several advocates of the literacy narrative have argued that it has the power of what Jean-François Lyotard has called petits récits, the authors argue…

  12. A Markov chain model for reliability growth and decay

    NASA Technical Reports Server (NTRS)

    Siegrist, K.

    1982-01-01

    A mathematical model is developed to describe a complex system undergoing a sequence of trials in which there is interaction between the internal states of the system and the outcomes of the trials. For example, the model might describe a system undergoing testing that is redesigned after each failure. The basic assumptions for the model are that the state of the system after a trial depends probabilistically only on the state before the trial and on the outcome of the trial and that the outcome of a trial depends probabilistically only on the state of the system before the trial. It is shown that under these basic assumptions, the successive states form a Markov chain and the successive states and outcomes jointly form a Markov chain. General results are obtained for the transition probabilities, steady-state distributions, etc. A special case studied in detail describes a system that has two possible states ('repaired' and 'unrepaired') undergoing trials that have three possible outcomes ('inherent failure', 'assignable-cause failure', and 'success'). For this model, the reliability function is computed explicitly and an optimal repair policy is obtained.
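
    A minimal numerical sketch of the two-state, three-outcome special case described in this abstract, using hypothetical outcome probabilities (not the paper's parameters): the state-to-state transition matrix is obtained by summing over outcomes, and the steady-state distribution follows from its leading left eigenvector.

```python
import numpy as np

states = ["unrepaired", "repaired"]
outcomes = ["inherent failure", "assignable-cause failure", "success"]

# Hypothetical P(outcome | state); rows correspond to states, columns to outcomes.
p_outcome = np.array([[0.05, 0.25, 0.70],    # unrepaired
                      [0.05, 0.05, 0.90]])   # repaired

# Hypothetical state update: an assignable-cause failure triggers a redesign,
# so the system moves to "repaired"; otherwise it keeps its current state.
def next_state(state_idx, outcome_idx):
    return 1 if outcome_idx == 1 else state_idx

# Marginal state-to-state transition matrix of the embedded Markov chain.
P = np.zeros((2, 2))
for s in range(2):
    for o in range(3):
        P[s, next_state(s, o)] += p_outcome[s, o]

# Steady-state distribution: left eigenvector of P for eigenvalue 1.
# (With these illustrative numbers "repaired" is absorbing, so it gets all the mass.)
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

print("transition matrix:\n", P)
print("steady-state probabilities:", dict(zip(states, np.round(pi, 3))))
```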

  13. Automatic seed selection for segmentation of liver cirrhosis in laparoscopic sequences

    NASA Astrophysics Data System (ADS)

    Sinha, Rahul; Marcinczak, Jan Marek; Grigat, Rolf-Rainer

    2014-03-01

    For computer aided diagnosis based on laparoscopic sequences, image segmentation is one of the basic steps which define the success of all further processing. However, many image segmentation algorithms require prior knowledge which is given by interaction with the clinician. We propose an automatic seed selection algorithm for segmentation of liver cirrhosis in laparoscopic sequences which assigns each pixel a probability of being cirrhotic liver tissue or background tissue. Our approach is based on a trained classifier using SIFT and RGB features with PCA. Due to the unique illumination conditions in laparoscopic sequences of the liver, a very low dimensional feature space can be used for classification via logistic regression. The methodology is evaluated on 718 cirrhotic liver and background patches that are taken from laparoscopic sequences of 7 patients. Using a linear classifier we achieve a precision of 91% in a leave-one-patient-out cross-validation. Furthermore, we demonstrate that with logistic probability estimates, seeds with high certainty of being cirrhotic liver tissue can be obtained. For example, our precision of liver seeds increases to 98.5% if only seeds with more than 95% probability of being liver are used. Finally, these automatically selected seeds can be used as priors in Graph Cuts which is demonstrated in this paper.
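
    A minimal sketch of the probability-thresholding step described here, keeping only seeds whose estimated class probability exceeds 0.95; it uses synthetic features and scikit-learn's logistic regression as stand-ins for the SIFT/RGB-PCA features and the classifier of the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

# Synthetic stand-in for low-dimensional patch features (liver = 1, background = 0).
X_liver = rng.normal(loc=1.0, scale=0.8, size=(300, 3))
X_background = rng.normal(loc=-1.0, scale=0.8, size=(300, 3))
X = np.vstack([X_liver, X_background])
y = np.concatenate([np.ones(300), np.zeros(300)])

clf = LogisticRegression().fit(X, y)

# New candidate pixels: keep as seeds only those with > 95% liver probability.
candidates = rng.normal(loc=0.5, scale=1.2, size=(1000, 3))
p_liver = clf.predict_proba(candidates)[:, 1]
seeds = candidates[p_liver > 0.95]

print(f"{len(seeds)} of {len(candidates)} candidates kept as high-confidence liver seeds")
```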

  14. Wavelength assignment algorithm considering the state of neighborhood links for OBS networks

    NASA Astrophysics Data System (ADS)

    Tanaka, Yu; Hirota, Yusuke; Tode, Hideki; Murakami, Koso

    2005-10-01

    Recently, optical WDM technology has been introduced into backbone networks. On the other hand, as a future optical switching scheme, Optical Burst Switching (OBS) systems have become a realistic solution. OBS systems do not consider buffering in intermediate nodes. Thus, avoiding overlapping wavelength reservations between partially interfering paths is an important issue. To address this problem, a wavelength assignment scheme with priority management tables has previously been proposed. This method achieves a reduction of the burst blocking probability. However, the priority management tables require huge memory space. In this paper, we propose a wavelength assignment algorithm that reduces both the number of priority management tables and the burst blocking probability. To reduce the priority management tables, we allocate and manage them for each link. To reduce the burst blocking probability, our method announces information about changes of their priorities to intermediate nodes. We evaluate its performance in terms of the burst blocking probability and the reduction rate of priority management tables.

  15. Anticipating Job Aiding and Training Requirements

    DTIC Science & Technology

    2009-01-01

    own ability to learn with statements like "I know how to do this," getting right to work on assignments, helping peers, elaborating beyond basic...know how to do this", getting right to work on assignments, helping peers, elaborating beyond basic understandings) _____ Selecting...older. It was estimated that over 50 percent of the workforce in 2007 was eligible to retire [Ref. 5], thus raising concerns about how to: a) replace

  16. An agglomerative hierarchical clustering approach to visualisation in Bayesian clustering problems

    PubMed Central

    Dawson, Kevin J.; Belkhir, Khalid

    2009-01-01

    Clustering problems (including the clustering of individuals into outcrossing populations, hybrid generations, full-sib families and selfing lines) have recently received much attention in population genetics. In these clustering problems, the parameter of interest is a partition of the set of sampled individuals, the sample partition. In a fully Bayesian approach to clustering problems of this type, our knowledge about the sample partition is represented by a probability distribution on the space of possible sample partitions. Since the number of possible partitions grows very rapidly with the sample size, we cannot visualise this probability distribution in its entirety, unless the sample is very small. As a solution to this visualisation problem, we recommend using an agglomerative hierarchical clustering algorithm, which we call the exact linkage algorithm. This algorithm is a special case of the maximin clustering algorithm that we introduced previously. The exact linkage algorithm is now implemented in our software package Partition View. The exact linkage algorithm takes the posterior co-assignment probabilities as input, and yields as output a rooted binary tree, or, more generally, a forest of such trees. Each node of this forest defines a set of individuals, and the node height is the posterior co-assignment probability of this set. This provides a useful visual representation of the uncertainty associated with the assignment of individuals to categories. It is also a useful starting point for a more detailed exploration of the posterior distribution in terms of the co-assignment probabilities. PMID:19337306
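
    As an illustration of turning posterior co-assignment probabilities into a tree, the following minimal sketch converts a hypothetical co-assignment matrix into distances and applies SciPy's complete-linkage clustering; this generic linkage is a stand-in for, not an implementation of, the authors' exact linkage algorithm in Partition View.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Hypothetical posterior co-assignment probabilities for 6 individuals:
# entry (i, j) is the posterior probability that i and j are in the same cluster.
coassign = np.array([
    [1.00, 0.95, 0.90, 0.10, 0.05, 0.08],
    [0.95, 1.00, 0.92, 0.12, 0.06, 0.07],
    [0.90, 0.92, 1.00, 0.15, 0.09, 0.10],
    [0.10, 0.12, 0.15, 1.00, 0.88, 0.85],
    [0.05, 0.06, 0.09, 0.88, 1.00, 0.91],
    [0.08, 0.07, 0.10, 0.85, 0.91, 1.00],
])

# Convert to a distance matrix and run complete-linkage agglomerative clustering.
dist = 1.0 - coassign
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="complete")

# Cut the forest where co-assignment probability drops below 0.5 (distance 0.5).
labels = fcluster(Z, t=0.5, criterion="distance")
print("cluster labels:", labels)
```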

  17. A Survey of Automated Assessment Approaches for Programming Assignments

    ERIC Educational Resources Information Center

    Ala-Mutka, Kirsti M.

    2005-01-01

    Practical programming is one of the basic skills pursued in computer science education. On programming courses, the coursework consists of programming assignments that need to be assessed from different points of view. Since the submitted assignments are executable programs with a formal structure, some features can be assessed automatically. The…

  18. Plagiarism-Proofing Assignments

    ERIC Educational Resources Information Center

    Johnson, Doug

    2004-01-01

    Mr. Johnson has discovered that the higher the level of student engagement and creativity, the lower the probability of plagiarism. For teachers who would like to see such desirable results, he describes the characteristics of assignments that are most likely to produce them. Two scenarios of types of assignments that avoid plagiarism are…

  19. Statistical power for the comparative regression discontinuity design with a nonequivalent comparison group.

    PubMed

    Tang, Yang; Cook, Thomas D; Kisbu-Sakarya, Yasemin

    2018-03-01

    In the "sharp" regression discontinuity design (RD), all units scoring on one side of a designated score on an assignment variable receive treatment, whereas those scoring on the other side become controls. Thus the continuous assignment variable and binary treatment indicator are measured on the same scale. Because each must be in the impact model, the resulting multi-collinearity reduces the efficiency of the RD design. However, untreated comparison data can be added along the assignment variable, and a comparative regression discontinuity design (CRD) is then created. When the untreated data come from a non-equivalent comparison group, we call this CRD-CG. Assuming linear functional forms, we show that power in CRD-CG is (a) greater than in basic RD; (b) less sensitive to the location of the cutoff and the distribution of the assignment variable; and that (c) fewer treated units are needed in the basic RD component within the CRD-CG so that savings can result from having fewer treated cases. The theory we develop is used to make numerical predictions about the efficiency of basic RD and CRD-CG relative to each other and to a randomized control trial. Data from the National Head Start Impact study are used to test these predictions. The obtained estimates are closer to the predicted parameters for CRD-CG than for basic RD and are generally quite close to the parameter predictions, supporting the emerging argument that CRD should be the design of choice in many applications for which basic RD is now used. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  20. Network anomaly detection system with optimized DS evidence theory.

    PubMed

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    Network anomaly detection has attracted increasing attention with the fast development of computer networks. Some researchers have utilized fusion methods and DS evidence theory for network anomaly detection but with low performance, and they did not consider the complicated and varied features of networks. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add weights for each sensor to optimize DS evidence theory according to its previous prediction accuracy. RBPA employs the sensor's regression ability to address complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate, and that RBPA as well as ODS optimization methods can improve system performance significantly.

  1. Molecular characterization of novel pyridoxal-5'-phosphate-dependent enzymes from the human microbiome.

    PubMed

    Fleischman, Nicholas M; Das, Debanu; Kumar, Abhinav; Xu, Qingping; Chiu, Hsiu-Ju; Jaroszewski, Lukasz; Knuth, Mark W; Klock, Heath E; Miller, Mitchell D; Elsliger, Marc-André; Godzik, Adam; Lesley, Scott A; Deacon, Ashley M; Wilson, Ian A; Toney, Michael D

    2014-08-01

    Pyridoxal-5'-phosphate or PLP, the active form of vitamin B6, is a highly versatile cofactor that participates in a large number of mechanistically diverse enzymatic reactions in basic metabolism. PLP-dependent enzymes account for ∼1.5% of most prokaryotic genomes and are estimated to be involved in ∼4% of all catalytic reactions, making this an important class of enzymes. Here, we structurally and functionally characterize three novel PLP-dependent enzymes from bacteria in the human microbiome: two are from Eubacterium rectale, a dominant, nonpathogenic, fecal, Gram-positive bacterium, and the third is from Porphyromonas gingivalis, which plays a major role in human periodontal disease. All adopt the Type I PLP-dependent enzyme fold, and structure-guided biochemical analysis enabled functional assignments as tryptophan, aromatic, and probable phosphoserine aminotransferases. © 2014 The Protein Society.

  2. Network Anomaly Detection System with Optimized DS Evidence Theory

    PubMed Central

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    Network anomaly detection has attracted increasing attention with the fast development of computer networks. Some researchers have utilized fusion methods and DS evidence theory for network anomaly detection but with low performance, and they did not consider the complicated and varied features of networks. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add weights for each sensor to optimize DS evidence theory according to its previous prediction accuracy. RBPA employs the sensor's regression ability to address complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate, and that RBPA as well as ODS optimization methods can improve system performance significantly. PMID:25254258

  3. Direct Assistance: USAID Has Taken Positive Action to Assess Afghan Ministries’ Ability to Manage Donor Funds, but Concerns Remain

    DTIC Science & Technology

    2014-01-01

    with the adverse event’s potential impact, ranging from negligible to catastrophic. Appendix V includes a matrix of how USAID/Afghanistan assigns risk...International Development (USAID) assigns risk ratings based on potential impact and probability of occurrence of an identified risk. The impact measures...frequent. Combining impact and probability factors categorizes risks into critical, high, medium, and low clusters. Although subjective, it is

  4. Correlation of probability scores of placenta accreta on magnetic resonance imaging with hemorrhagic morbidity.

    PubMed

    Lim, Grace; Horowitz, Jeanne M; Berggruen, Senta; Ernst, Linda M; Linn, Rebecca L; Hewlett, Bradley; Kim, Jennifer; Chalifoux, Laurie A; McCarthy, Robert J

    2016-11-01

    To evaluate the hypothesis that assigning grades to magnetic resonance imaging (MRI) findings of suspected placenta accreta will correlate with hemorrhagic outcomes. We chose a single-center, retrospective, observational design. Nulliparous or multiparous women who had antenatal placental MRI performed at a tertiary level academic hospital were included. Cases with antenatal placental MRI were included and compared with cases without MRI performed. Two radiologists assigned a probability score for accreta to each study. Estimated blood loss and transfusion requirements were compared among groups by the Kruskal-Wallis H test. Thirty-five cases had placental MRI performed. MRI performance was associated with higher blood loss compared with the non-MRI group (2600 [1400-4500]mL vs 900[600-1500]mL, P<.001). There was no difference in estimated blood loss (P=.31) or transfusion (P=.57) among the MRI probability groups. In cases of suspected placenta accreta, probability scores for antenatal placental MRI may not be associated with increasing degrees of hemorrhage. Continued research is warranted to determine the effectiveness of assigning probability scores for antenatal accreta imaging studies, combined with clinical indices of suspicion, in assisting with antenatal multidisciplinary team planning for operative management of this morbid condition. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Interpretation of the results of statistical measurements. [search for basic probability model

    NASA Technical Reports Server (NTRS)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional, which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters for a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  6. Developing Basic Math Skills for Marketing. Student Manual and Laboratory Guide.

    ERIC Educational Resources Information Center

    Klewer, Edwin D.

    Field tested with students in grades 10-12, this manual is designed to teach students in marketing courses basic mathematical concepts. The instructional booklet contains seven student assignments covering the following topics: why basic mathematics is so important, whole numbers, fractions, decimals, percentages, weights and measures, and dollars…

  7. Probability sampling in legal cases: Kansas cellphone users

    NASA Astrophysics Data System (ADS)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.
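
    As a quick illustration of the basic idea (not the specifics of the Kansas cellphone case), here is a minimal sketch of a simple random sample and the usual estimate of a population proportion with its sampling error; the population and attribute are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical population: 1 = has the attribute of interest, 0 = does not.
population = rng.binomial(1, 0.37, size=100_000)

n = 400
sample = rng.choice(population, size=n, replace=False)   # simple random sample

p_hat = sample.mean()
se = np.sqrt(p_hat * (1 - p_hat) / n)                    # ignoring the small FPC
print(f"estimated proportion: {p_hat:.3f} +/- {1.96 * se:.3f} (95% CI half-width)")
print(f"true proportion     : {population.mean():.3f}")
```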

  8. Weighted Fuzzy Risk Priority Number Evaluation of Turbine and Compressor Blades Considering Failure Mode Correlations

    NASA Astrophysics Data System (ADS)

    Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-06-01

    Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools to evaluate the reliability of systems. Although the single failure mode case can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis. In fact, due to the lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain the fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority number (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates the error of reliability analysis. Furthermore, for the purpose of quantitative analysis, probability importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definitions of probability weight and FRPN and results in a more accurate estimation than that of the traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural systems under failure correlations.

  9. Augmenting superpopulation capture-recapture models with population assignment data

    USGS Publications Warehouse

    Wen, Zhi; Pollock, Kenneth; Nichols, James; Waser, Peter

    2011-01-01

    Ecologists applying capture-recapture models to animal populations sometimes have access to additional information about individuals' populations of origin (e.g., information about genetics, stable isotopes, etc.). Tests that assign an individual's genotype to its most likely source population are increasingly used. Here we show how to augment a superpopulation capture-recapture model with such information. We consider a single superpopulation model without age structure, and split each entry probability into separate components due to births in situ and immigration. We show that it is possible to estimate these two probabilities separately. We first consider the case of perfect information about population of origin, where we can distinguish individuals born in situ from immigrants with certainty. Then we consider the more realistic case of imperfect information, where we use genetic or other information to assign probabilities to each individual's origin as in situ or outside the population. We use a resampling approach to impute the true population of origin from imperfect assignment information. The integration of data on population of origin with capture-recapture data allows us to determine the contributions of immigration and in situ reproduction to the growth of the population, an issue of importance to ecologists. We illustrate our new models with capture-recapture and genetic assignment data from a population of banner-tailed kangaroo rats Dipodomys spectabilis in Arizona.

  10. Multiple murder and criminal careers: a latent class analysis of multiple homicide offenders.

    PubMed

    Vaughn, Michael G; DeLisi, Matt; Beaver, Kevin M; Howard, Matthew O

    2009-01-10

    To construct an empirically rigorous typology of multiple homicide offenders (MHOs). The current study conducted latent class analysis of the official records of 160 MHOs sampled from eight states to evaluate their criminal careers. A 3-class solution best fit the data (-2LL=-1123.61, Bayesian Information Criterion (BIC)=2648.15, df=81, L(2)=1179.77). Class 1 (n=64, class assignment probability=.999) was the low-offending group marked by little criminal record and delayed arrest onset. Class 2 (n=51, class assignment probability=.957) was the severe group that represents the most violent and habitual criminals. Class 3 (n=45, class assignment probability=.959) was the moderate group whose offending careers were similar to Class 2. A sustained criminal career with involvement in versatile forms of crime was observed for two of three classes of MHOs. Linkages to extant typologies and recommendations for additional research that incorporates clinical constructs are proffered.

  11. A test of geographic assignment using isotope tracers in feathers of known origin

    USGS Publications Warehouse

    Wunder, Michael B.; Kester, C.L.; Knopf, F.L.; Rye, R.O.

    2005-01-01

    We used feathers of known origin collected from across the breeding range of a migratory shorebird to test the use of isotope tracers for assigning breeding origins. We analyzed δD, δ13C, and δ15N in feathers from 75 mountain plover (Charadrius montanus) chicks sampled in 2001 and from 119 chicks sampled in 2002. We estimated parameters for continuous-response inverse regression models and for discrete-response Bayesian probability models from data for each year independently. We evaluated model predictions with both the training data and by using the alternate year as an independent test dataset. Our results provide weak support for modeling latitude and isotope values as monotonic functions of one another, especially when data are pooled over known sources of variation such as sample year or location. We were unable to make even qualitative statements, such as north versus south, about the likely origin of birds using both δD and δ13C in inverse regression models; results were no better than random assignment. Probability models provided better results and a more natural framework for the problem. Correct assignment rates were highest when considering all three isotopes in the probability framework, but the use of even a single isotope was better than random assignment. The method appears relatively robust to temporal effects and is most sensitive to the isotope discrimination gradients over which samples are taken. We offer that the problem of using isotope tracers to infer geographic origin is best framed as one of assignment, rather than prediction.

  12. Process service quality evaluation based on Dempster-Shafer theory and support vector machine.

    PubMed

    Pei, Feng-Que; Li, Dong-Bo; Tong, Yi-Fei; He, Fei

    2017-01-01

    Human involvement influences traditional service quality evaluations, leading to low accuracy, poor reliability, and weak predictability. This paper proposes a method that employs a support vector machine (SVM) and Dempster-Shafer evidence theory to evaluate the service quality of a production process while handling a large number of input features with a small sample set; the method is called SVMs-DS. Features that can affect production quality are extracted by a large number of sensors, and preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, basic probability assignments (BPAs) are constructed, which support the evaluation in both a qualitative and a quantitative way. The process service quality evaluation results are validated by the Dempster rules; the decision threshold for resolving conflicting results is generated from the three SVM models. A case study is presented to demonstrate the effectiveness of the SVMs-DS method.
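
    The fusion step in a scheme like SVMs-DS rests on Dempster's rule of combination applied to the BPAs produced by the individual classifiers. The sketch below combines three BPAs over a two-element frame {good, poor} with an explicit ignorance mass; the mapping from SVM outputs to BPAs and the numeric values are assumptions for illustration, not the paper's construction.

```python
# Dempster's rule of combination for BPAs over the frame {good, poor},
# with the union treated as an explicit ignorance mass ("either").

def combine(m1, m2):
    """Combine two BPAs given as dicts with keys 'good', 'poor', 'either'."""
    conflict = m1["good"] * m2["poor"] + m1["poor"] * m2["good"]
    k = 1.0 - conflict                      # normalization constant
    good = (m1["good"] * m2["good"] + m1["good"] * m2["either"]
            + m1["either"] * m2["good"]) / k
    poor = (m1["poor"] * m2["poor"] + m1["poor"] * m2["either"]
            + m1["either"] * m2["poor"]) / k
    either = m1["either"] * m2["either"] / k
    return {"good": good, "poor": poor, "either": either}

# Hypothetical BPAs built from three SVM models' decision scores.
m_a = {"good": 0.7, "poor": 0.1, "either": 0.2}
m_b = {"good": 0.6, "poor": 0.2, "either": 0.2}
m_c = {"good": 0.5, "poor": 0.3, "either": 0.2}

fused = combine(combine(m_a, m_b), m_c)
print(fused)
```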

  13. Multi-Path Transportation Futures Study. Results from Phase 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patterson, Phil; Singh, Margaret; Plotkin, Steve

    2007-03-09

    Presentation reporting Phase 1 results, 3/9/2007. Projecting the future role of advanced drivetrains and fuels in the light vehicle market is inherently difficult, given the uncertainty (and likely volatility) of future oil prices, inadequate understanding of likely consumer response to new technologies, the relative infancy of several important new technologies with inevitable future changes in their performance and costs, and the importance — and uncertainty — of future government marketplace interventions (e.g., new regulatory standards or vehicle purchase incentives). The Multi-Path Transportation Futures (MP) Study has attempted to improve our understanding of this future role by examining several scenarios of vehicle costs, fuel prices, government subsidies, and other key factors. These are projections, not forecasts, in that they try to answer a series of “what if” questions without assigning probabilities to most of the basic assumptions.

  14. Promiscuity and the evolution of sexual transmitted diseases

    NASA Astrophysics Data System (ADS)

    Gonçalves, Sebastián; Kuperman, Marcelo; Ferreira da Costa Gomes, Marcelo

    2003-09-01

    We study the relation between different social behaviors and the onset of epidemics in a model for the dynamics of sexually transmitted diseases. The model considers society as a system of individual sexuated agents that can be organized in couples and interact with each other. The different social behaviors are incorporated by assigning what we call a promiscuity value to each individual agent. The individual promiscuity is drawn from a distribution and represents the daily probability of going out to look for a sexual partner, abandoning its current mate, if any. In terms of this parameter we find a threshold for the epidemic which is much lower than the classical SIR model prediction, i.e., R0 (basic reproductive number)=1. Different forms for the distribution of the population promiscuity are considered, showing that the threshold is weakly sensitive to them. We study the homosexual and the heterosexual cases as well.

  15. The impossibility of probabilities

    NASA Astrophysics Data System (ADS)

    Zimmerman, Peter D.

    2017-11-01

    This paper discusses the problem of assigning probabilities to the likelihood of nuclear terrorism events, in particular examining the limitations of using Bayesian priors for this purpose. It suggests an alternate approach to analyzing the threat of nuclear terrorism.

  16. Introduction to Agricultural Sales and Service. Teacher Edition.

    ERIC Educational Resources Information Center

    Kauer, Les

    This Oklahoma curriculum guide contains 12 units. Each instructional unit includes some or all of these components: performance objectives, suggested activities, basic academic skills taxonomy, handouts, information sheets, supplements, transparency masters, activity sheets, assignment sheets, assignment sheet answers, job sheets, practical tests,…

  17. Lessons Learned from Dependency Usage in HERA: Implications for THERP-Related HRA Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    April M. Whaley; Ronald L. Boring; Harold S. Blackman

    Dependency occurs when the probability of success or failure on one action changes the probability of success or failure on a subsequent action. Dependency may serve as a modifier on the human error probabilities (HEPs) for successive actions in human reliability analysis (HRA) models. Discretion should be employed when determining whether or not a dependency calculation is warranted: dependency should not be assigned without strongly grounded reasons. Human reliability analysts may sometimes assign dependency in cases where it is unwarranted. This inappropriate assignment is attributed to a lack of clear guidance to encompass the range of scenarios human reliability analysts are addressing. Inappropriate assignment of dependency produces inappropriately elevated HEP values. Lessons learned about dependency usage in the Human Event Repository and Analysis (HERA) system may provide clarification and guidance for analysts using first-generation HRA methods. This paper presents the HERA approach to dependency assessment and discusses considerations for dependency usage in HRA, including the cognitive basis for dependency, direction for determining when dependency should be assessed, considerations for determining the dependency level, temporal issues to consider when assessing dependency (e.g., considering task sequence versus overall event sequence, and dependency over long periods of time), and diagnosis and action influences on dependency.
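
    For readers unfamiliar with how an assigned dependency level changes an HEP, the conventional THERP-style dependence equations are sketched below. The formulas are quoted from memory as an illustration of the magnitude of the effect; NUREG/CR-1278 remains the authoritative source, and the HERA guidance discussed above concerns when such an adjustment is warranted at all.

```python
# Conditional HEP adjustment for dependency between successive actions,
# following the conventional THERP dependence equations (reproduced here
# from memory as an illustration; consult NUREG/CR-1278 for the
# authoritative formulation).

def conditional_hep(hep, level):
    formulas = {
        "zero":     lambda p: p,
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,
    }
    return formulas[level](hep)

# An unwarranted "high" assignment inflates a nominal 1e-3 HEP dramatically.
for level in ("zero", "low", "moderate", "high", "complete"):
    print(level, round(conditional_hep(1e-3, level), 4))
```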

  18. Local Structure Theory for Cellular Automata.

    NASA Astrophysics Data System (ADS)

    Gutowitz, Howard Andrew

    The local structure theory (LST) is a generalization of the mean field theory for cellular automata (CA). The mean field theory makes the assumption that iterative application of the rule does not introduce correlations between the states of cells in different positions. This assumption allows the derivation of a simple formula for the limit density of each possible state of a cell. The most striking feature of CA is that they may well generate correlations between the states of cells as they evolve. The LST takes the generation of correlation explicitly into account. It thus has the potential to describe statistical characteristics in detail. The basic assumption of the LST is that though correlation may be generated by CA evolution, this correlation decays with distance. This assumption allows the derivation of formulas for the estimation of the probability of large blocks of states in terms of smaller blocks of states. Given the probabilities of blocks of size n, probabilities may be assigned to blocks of arbitrary size such that these probability assignments satisfy the Kolmogorov consistency conditions and hence may be used to define a measure on the set of all possible (infinite) configurations. Measures defined in this way are called finite (or n-) block measures. A function called the scramble operator of order n maps a measure to an approximating n-block measure. The action of a CA on configurations induces an action on measures on the set of all configurations. The scramble operator is combined with the CA map on measure to form the local structure operator (LSO). The LSO of order n maps the set of n-block measures into itself. It is hypothesised that the LSO applied to n-block measures approximates the rule itself on general measures, and does so increasingly well as n increases. The fundamental advantage of the LSO is that its action is explicitly computable from a finite system of rational recursion equations. Empirical study of a number of CA rules demonstrates the potential of the LST to describe the statistical features of CA. The behavior of some simple rules is derived analytically. Other rules have more complex, chaotic behavior. Even for these rules, the LST yields an accurate portrait of both small and large time statistics.
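
    The estimation of larger-block probabilities from smaller blocks can be illustrated with the standard Bayesian-extension form. The snippet below is a minimal sketch with hypothetical block probabilities; it is not a full implementation of the local structure operator.

```python
# Bayesian-extension estimate of an (n+1)-block probability from n-block
# probabilities, the kind of construction used to build n-block measures.
# The probabilities below are hypothetical values for blocks of binary states.

def extend_block(p_left, p_right, p_overlap):
    """P(b1..b_{n+1}) ~= P(b1..b_n) * P(b2..b_{n+1}) / P(b2..b_n)."""
    return 0.0 if p_overlap == 0 else p_left * p_right / p_overlap

# Estimate P(011) from the 2-block probabilities P(01), P(11) and the 1-block P(1).
print(extend_block(p_left=0.3, p_right=0.2, p_overlap=0.5))
```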

  19. 5 CFR 870.908 - Annuitants and compensationers.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... REGULATIONS (CONTINUED) FEDERAL EMPLOYEES' GROUP LIFE INSURANCE PROGRAM Assignments of Life Insurance § 870.908 Annuitants and compensationers. (a) If an employee assigns Basic insurance and later becomes...) At the time he/she retires or becomes eligible as a compensationer, the insured individual may elect...

  20. 5 CFR 870.908 - Annuitants and compensationers.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REGULATIONS (CONTINUED) FEDERAL EMPLOYEES' GROUP LIFE INSURANCE PROGRAM Assignments of Life Insurance § 870.908 Annuitants and compensationers. (a) If an employee assigns Basic insurance and later becomes...) At the time he/she retires or becomes eligible as a compensationer, the insured individual may elect...

  1. Growth of left ventricular mass with military basic training in army recruits.

    PubMed

    Batterham, Alan M; George, Keith P; Birch, Karen M; Pennell, Dudley J; Myerson, Saul G

    2011-07-01

    Exercise-induced left ventricular hypertrophy is well documented, but whether this occurs merely in line with concomitant increases in lean body mass is unclear. Our aim was to model the extent of left ventricular hypertrophy associated with increased lean body mass attributable to an exercise training program. Cardiac and whole-body magnetic resonance imaging was performed before and after a 10-wk intensive British Army basic training program in a sample of 116 healthy Caucasian males (aged 17-28 yr). The within-subjects repeated-measures allometric relationship between lean body mass and left ventricular mass was modeled to allow the proper normalization of changes in left ventricular mass for attendant changes in lean body mass. To linearize the general allometric model (Y = aX^b), data were log-transformed before analysis; the resulting effects were therefore expressed as percent changes. We quantified the probability that the true population increase in normalized left ventricular mass was greater than a predefined minimum important difference of 0.2 SD, assigning a probabilistic descriptive anchor for magnitude-based inference. The absolute increase in left ventricular mass was 4.8% (90% confidence interval=3.5%-6%), whereas lean body mass increased by 2.6% (2.1%-3.0%). The change in left ventricular mass adjusted for the change in lean body mass was 3.5% (1.9%-5.1%), equivalent to an increase of 0.25 SD (0.14-0.37). The probability that this effect size was greater than or equal to our predefined minimum important change of 0.2 SD was 0.78, likely to be important. After correction for allometric growth rates, left ventricular hypertrophy and lean body mass changes do not occur at the same magnitude in response to chronic exercise.
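
    A simplified version of the allometric adjustment can be sketched as follows: with a log-log model relating left ventricular mass (LVM) to lean body mass (LBM), the LVM change expected from lean-mass growth alone is subtracted from the observed change on the log scale. The exponent used below is purely illustrative, not the within-subject estimate from the study.

```python
import math

# Simplified allometric adjustment: with log(LVM) = a + b*log(LBM), the percent
# change in LV mass expected from a lean-mass change alone is b times the
# lean-mass change on the log scale; removing it gives the adjusted change.
# The exponent b = 0.8 is purely illustrative.

def adjusted_lvm_change(pct_lvm, pct_lbm, b=0.8):
    observed = math.log(1 + pct_lvm / 100)          # observed change, log scale
    expected = b * math.log(1 + pct_lbm / 100)      # change attributable to LBM
    return (math.exp(observed - expected) - 1) * 100

# Using the article's observed changes: 4.8% LVM and 2.6% LBM.
print(round(adjusted_lvm_change(4.8, 2.6), 2), "% adjusted LVM increase")
```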

  2. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  3. Emergency Physician Risk Estimates and Admission Decisions for Chest Pain: A Web-Based Scenario Study.

    PubMed

    Schriger, David L; Menchine, Michael; Wiechmann, Warren; Carmelli, Guy

    2018-04-20

    We conducted this study to better understand how emergency physicians estimate risk and make admission decisions for patients with low-risk chest pain. We created a Web-based survey consisting of 5 chest pain scenarios that included history, physical examination, ECG findings, and basic laboratory studies, including a negative initial troponin-level result. We administered the scenarios in random order to emergency medicine residents and faculty at 11 US emergency medicine residency programs. We randomized respondents to receive questions about 1 of 2 endpoints, acute coronary syndrome or serious complication (death, dysrhythmia, or congestive heart failure within 30 days). For each scenario, the respondent provided a quantitative estimate of the probability of the endpoint, a qualitative estimate of the risk of the endpoint (very low, low, moderate, high, or very high), and an admission decision. Respondents also provided demographic information and completed a 3-item Fear of Malpractice scale. Two hundred eight (65%) of 320 eligible physicians completed the survey, 73% of whom were residents. Ninety-five percent of respondents were wholly consistent (no admitted patient was assigned a lower probability than a discharged patient). For individual scenarios, probability estimates covered at least 4 orders of magnitude; admission rates for scenarios varied from 16% to 99%. The majority of respondents (>72%) had admission thresholds at or below a 1% probability of acute coronary syndrome. Respondents did not fully differentiate the probability of acute coronary syndrome and serious outcome; for each scenario, estimates for the two were quite similar despite a serious outcome being far less likely. Raters used the terms "very low risk" and "low risk" only when their probability estimates were less than 1%. The majority of respondents considered any probability greater than 1% for acute coronary syndrome or serious outcome to be at least moderate risk and warranting admission. Physicians used qualitative terms in ways fundamentally different from how they are used in ordinary conversation, which may lead to miscommunication during shared decisionmaking processes. These data suggest that probability or utility models are inadequate to describe physician decisionmaking for patients with chest pain. Copyright © 2018 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  4. A Weighted Configuration Model and Inhomogeneous Epidemics

    NASA Astrophysics Data System (ADS)

    Britton, Tom; Deijfen, Maria; Liljeros, Fredrik

    2011-12-01

    A random graph model with prescribed degree distribution and degree dependent edge weights is introduced. Each vertex is independently equipped with a random number of half-edges and each half-edge is assigned an integer valued weight according to a distribution that is allowed to depend on the degree of its vertex. Half-edges with the same weight are then paired randomly to create edges. An expression for the threshold for the appearance of a giant component in the resulting graph is derived using results on multi-type branching processes. The same technique also gives an expression for the basic reproduction number for an epidemic on the graph where the probability that a certain edge is used for transmission is a function of the edge weight (reflecting how closely 'connected' the corresponding vertices are). It is demonstrated that, if vertices with large degree tend to have large (small) weights on their edges and if the transmission probability increases with the edge weight, then it is easier (harder) for the epidemic to take off compared to a randomized epidemic with the same degree and weight distribution. A recipe for calculating the probability of a large outbreak in the epidemic and the size of such an outbreak is also given. Finally, the model is fitted to three empirical weighted networks of importance for the spread of contagious diseases and it is shown that R0 can be substantially over- or underestimated if the correlation between degree and weight is not taken into account.
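
    The graph construction described above (degree-dependent edge weights, random pairing of half-edges carrying the same weight) can be sketched in a few lines. The degree distribution and the weight rule below are illustrative assumptions, and isolated leftover stubs are simply dropped.

```python
import random

# Sketch of the weighted configuration model: each vertex draws a degree,
# every half-edge gets an integer weight that may depend on its vertex's
# degree, and half-edges carrying the same weight are paired uniformly at
# random. The degree distribution and the weight rule are illustrative.

def weighted_config_graph(n, seed=0):
    rng = random.Random(seed)
    degrees = [rng.choice([1, 2, 3, 4]) for _ in range(n)]
    half_edges = []                                  # (vertex, weight) stubs
    for v, d in enumerate(degrees):
        weight = 1 if d <= 2 else 2                  # degree-dependent weight
        half_edges.extend((v, weight) for _ in range(d))
    edges = []
    for w in (1, 2):
        stubs = [v for v, wt in half_edges if wt == w]
        rng.shuffle(stubs)
        # pair consecutive stubs; a single leftover stub is simply dropped
        edges += [(stubs[i], stubs[i + 1], w) for i in range(0, len(stubs) - 1, 2)]
    return edges

print(weighted_config_graph(8)[:5])
```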

  5. Advertising, a Distributive Education Manual and Answer Book.

    ERIC Educational Resources Information Center

    Martin, Charles H.; Cyrus, Cinda L.

    This revised manual for individualized instruction of distributive education trainees at the high school or junior college level in basic advertising and sales promotion activities includes 15 self-study assignments, teaching suggestions, and a bibliography. Together with a separate answer key, each assignment provides student questions and…

  6. A Sequence of Assignments for Basic Writing: Teaching To Problems "Beyond the Sentence."

    ERIC Educational Resources Information Center

    Wall, Susan V.

    Students in college basic writing courses need to consider their own written language and to compare it with other students' work before they can develop a sense of the symbolic relationship between language and experience. Because of a lack of previous writing experience, basic writers have no sense that the "facts" about which they write are…

  7. A Bayesian method for using simulator data to enhance human error probabilities assigned by existing HRA methods

    DOE PAGES

    Groth, Katrina M.; Smith, Curtis L.; Swiler, Laura P.

    2014-04-05

    In the past several years, several international agencies have begun to collect data on human performance in nuclear power plant simulators [1]. This data provides a valuable opportunity to improve human reliability analysis (HRA), but these improvements will not be realized without implementation of Bayesian methods. Bayesian methods are widely used to incorporate sparse data into models in many parts of probabilistic risk assessment (PRA), but Bayesian methods have not been adopted by the HRA community. In this article, we provide a Bayesian methodology to formally use simulator data to refine the human error probabilities (HEPs) assigned by existing HRA methods. We demonstrate the methodology with a case study, wherein we use simulator data from the Halden Reactor Project to update the probability assignments from the SPAR-H method. The case study demonstrates the ability to use performance data, even sparse data, to improve existing HRA methods. Furthermore, this paper also serves as a demonstration of the value of Bayesian methods to improve the technical basis of HRA.
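
    The general idea of refining an HEP with simulator evidence can be illustrated with a conjugate Beta-Binomial update: the existing method's HEP defines the prior mean, and simulator failures and trials update it. This is a simplified stand-in for illustration; the paper's methodology and its treatment of the SPAR-H structure are more involved.

```python
# Conjugate Beta-Binomial update of a human error probability (HEP).
# The prior is centred on the HEP assigned by an existing HRA method
# (e.g. SPAR-H); simulator runs contribute observed failures and trials.
# Illustrative stand-in only, not the paper's exact methodology.

def update_hep(prior_hep, prior_strength, failures, trials):
    alpha0 = prior_hep * prior_strength          # prior "pseudo-failures"
    beta0 = (1 - prior_hep) * prior_strength     # prior "pseudo-successes"
    alpha = alpha0 + failures
    beta = beta0 + (trials - failures)
    return alpha / (alpha + beta)                # posterior mean HEP

# Nominal HEP of 0.01 weighted as 50 pseudo-trials, then 1 failure in 20 runs.
print(round(update_hep(0.01, 50, failures=1, trials=20), 4))
```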

  8. Basic Facialist. Teacher Edition. Cosmetology Series.

    ERIC Educational Resources Information Center

    Rogers, Jeanette A.

    This Oklahoma curriculum guide contains six units. Each instructional unit includes some or all of these basic components: performance objectives; suggested activities for the teacher; pretest; handouts; information sheets; transparency masters; assignment sheets; job sheets; practical tests; written tests; and answers to pretest, assignment…

  9. A scoring algorithm for predicting the presence of adult asthma: a prospective derivation study.

    PubMed

    Tomita, Katsuyuki; Sano, Hiroyuki; Chiba, Yasutaka; Sato, Ryuji; Sano, Akiko; Nishiyama, Osamu; Iwanaga, Takashi; Higashimoto, Yuji; Haraguchi, Ryuta; Tohda, Yuji

    2013-03-01

    To predict the presence of asthma in adult patients with respiratory symptoms, we developed a scoring algorithm using clinical parameters. We prospectively analysed 566 adult outpatients who visited Kinki University Hospital for the first time with complaints of nonspecific respiratory symptoms. Asthma was comprehensively diagnosed by specialists using symptoms, signs, and objective tools including bronchodilator reversibility and/or the assessment of bronchial hyperresponsiveness (BHR). Multiple logistic regression analysis was performed to categorise patients and determine the accuracy of diagnosing asthma. A scoring algorithm using the symptom-sign score was developed, based on diurnal variation of symptoms (1 point), recurrent episodes (2 points), medical history of allergic diseases (1 point), and wheeze sound (2 points). A score of >3 had 35% sensitivity and 97% specificity for discriminating between patients with and without asthma and assigned a high probability of having asthma (accuracy 90%). A score of 1 or 2 points assigned intermediate probability (accuracy 68%). After providing additional data of forced expiratory volume in 1 second/forced vital capacity (FEV(1)/FVC) ratio <0.7, the post-test probability of having asthma was increased to 93%. A score of 0 points assigned low probability (accuracy 31%). After providing additional data of positive reversibility, the post-test probability of having asthma was increased to 88%. This pragmatic diagnostic algorithm is useful for predicting the presence of adult asthma and for determining the appropriate time for consultation with a pulmonologist.
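
    The published point scheme translates directly into a small scoring function. The sketch below follows the abstract's weights and probability bands; the abstract's ">3" cut-point is read here as "a score of 3 or more", since the 0 and 1-2 bands would otherwise leave a score of 3 unassigned, and that reading is an assumption.

```python
# Symptom-sign score for adult asthma as described above: diurnal variation (1),
# recurrent episodes (2), history of allergic diseases (1), wheeze sound (2).

def asthma_score(diurnal_variation, recurrent_episodes, allergy_history, wheeze):
    return (1 * diurnal_variation + 2 * recurrent_episodes
            + 1 * allergy_history + 2 * wheeze)

def risk_band(score):
    # Bands follow the abstract; ">3" is read as ">=3" (an assumption).
    if score >= 3:
        return "high probability of asthma"
    if score >= 1:
        return "intermediate probability (add FEV1/FVC < 0.7 to refine)"
    return "low probability (add bronchodilator reversibility to refine)"

s = asthma_score(diurnal_variation=True, recurrent_episodes=True,
                 allergy_history=False, wheeze=False)
print(s, risk_band(s))
```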

  10. Mechanical Drafting with CAD. Teacher Edition.

    ERIC Educational Resources Information Center

    McClain, Gerald R.

    This instructor's manual contains 13 units of instruction for a course on mechanical drafting with options for using computer-aided drafting (CAD). Each unit includes some or all of the following basic components of a unit of instruction: objective sheet, suggested activities for the teacher, assignment sheets and answers to assignment sheets,…

  11. Merchandise Display, a Distributive Education Manual and Answer Book.

    ERIC Educational Resources Information Center

    Hatchett, Melvin S.

    This revised manual in basic merchandise display for trainees in distributive education, together with a separate answer key, contains 23 self-study assignments, each with training objectives, questions to answer, and student projects. Developed by a distributive education coordinator, these assignments cover a wide range of topics, from the…

  12. A fast combination method in DSmT and its application to recommender system

    PubMed Central

    Liu, Yihai

    2018-01-01

    In many applications involving epistemic uncertainties usually modeled by belief functions, it is often necessary to approximate general (non-Bayesian) basic belief assignments (BBAs) by subjective probabilities (called Bayesian BBAs). This necessity occurs if one needs to embed the fusion result in a system based on the probabilistic framework and Bayesian inference (e.g. tracking systems), or if one needs to make a decision in decision-making problems. In this paper, we present a new fast combination method, called modified rigid coarsening (MRC), to obtain the final Bayesian BBAs based on hierarchical decomposition (coarsening) of the frame of discernment. In this method, focal elements with probabilities are coarsened efficiently to reduce computational complexity in the process of combination by using a disagreement vector and a simple dichotomous approach. To demonstrate the practicality of our approach, the new method is applied to combine users’ soft preferences in recommender systems (RSs). Additionally, in order to make a comprehensive performance comparison, the proportional conflict redistribution rule #6 (PCR6) is regarded as a baseline in a range of experiments. According to the results of the experiments, MRC is more accurate in its recommendations than the original rigid coarsening (RC) method and comparable in computational time. PMID:29351297
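
    The generic problem the MRC method addresses, redistributing mass from compound focal elements to singletons, can be illustrated with the simple pignistic-style transformation below. This is not the MRC algorithm itself (which coarsens the frame hierarchically using a disagreement vector); it only shows the kind of Bayesian BBA being produced.

```python
# Turning a general BBA (mass on compound focal elements) into a Bayesian BBA
# (mass on singletons only) by splitting each focal element's mass equally
# over its members. Illustration only; not the MRC algorithm.

def to_bayesian_bba(bba):
    """bba: dict mapping frozenset focal elements to mass."""
    prob = {}
    for focal, mass in bba.items():
        share = mass / len(focal)           # split mass equally over members
        for element in focal:
            prob[element] = prob.get(element, 0.0) + share
    return prob

# Hypothetical BBA over the frame {a, b, c}.
m = {frozenset({"a"}): 0.5,
     frozenset({"a", "b"}): 0.3,
     frozenset({"a", "b", "c"}): 0.2}
print(to_bayesian_bba(m))   # approximately a: 0.72, b: 0.22, c: 0.07
```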

  13. Sensor data monitoring and decision level fusion scheme for early fire detection

    NASA Astrophysics Data System (ADS)

    Rizogiannis, Constantinos; Thanos, Konstantinos Georgios; Astyakopoulos, Alkiviadis; Kyriazanos, Dimitris M.; Thomopoulos, Stelios C. A.

    2017-05-01

    The aim of this paper is to present the sensor monitoring and decision-level fusion scheme for early fire detection that has been developed in the context of the AF3 (Advanced Forest Fire Fighting) European FP7 research project, adopted specifically in the OCULUS-Fire control and command system and tested during a firefighting field test in Greece with a prescribed real fire, generating early-warning detection alerts and notifications. For this purpose, and in order to improve the reliability of the fire detection system, a two-level fusion scheme is developed that exploits a variety of observation sources from the air (e.g., UAV infrared cameras), the ground (e.g., meteorological and atmospheric sensors), and ancillary sources (e.g., public information channels, citizens' smartphone applications, and social media). In the first level, a change-point detection technique is applied to detect changes in the mean value of each parameter measured by the ground sensors, such as temperature, humidity, and CO2, and the rate of rise of each changed parameter is then calculated. In the second level, the fire-event basic probability assignment (BPA) function is determined for each ground sensor using fuzzy-logic theory, and the corresponding mass values are combined in a decision-level fusion process using evidential reasoning theory to estimate the final fire-event probability.
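
    The first-level monitoring step can be sketched as a simple running-mean shift detector followed by a rate-of-rise calculation on the flagged sensor stream. Window length, threshold, and sample values below are illustrative assumptions; the actual AF3/OCULUS-Fire logic and the subsequent fuzzy BPA construction are not reproduced here.

```python
# First-level monitoring sketch: flag a shift in the mean of a ground-sensor
# stream, then compute the rate of rise after the change. The window length,
# threshold, and temperature samples are illustrative only.

def detect_mean_shift(samples, window=5, k=3.0):
    """Return the first index whose value departs from the running mean by k sigmas."""
    for i in range(window, len(samples)):
        ref = samples[i - window:i]
        mean = sum(ref) / window
        var = sum((x - mean) ** 2 for x in ref) / window
        std = max(var ** 0.5, 1e-9)
        if abs(samples[i] - mean) > k * std:
            return i
    return None

def rate_of_rise(samples, start, dt=1.0):
    return (samples[-1] - samples[start]) / ((len(samples) - 1 - start) * dt)

temps = [21.0, 21.2, 20.9, 21.1, 21.0, 21.1, 26.0, 31.5, 37.0, 44.0]
idx = detect_mean_shift(temps)
print(idx, rate_of_rise(temps, idx) if idx is not None else None)
```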

  14. Sheetmetal. Performance Objectives. Basic Course.

    ERIC Educational Resources Information Center

    Murwin, Roland

    Several intermediate performance objectives and corresponding criterion measures are listed for each of six terminal objectives for a basic high school sheetmetal work course. The titles of the terminal objectives are Orientation, Shop Machinery and Material, Soldering, Measurements and Layouts, Assigned Shop Projects, and Radial and Triangulation…

  15. Factorization of Observables

    NASA Astrophysics Data System (ADS)

    Eliaš, Peter; Frič, Roman

    2017-12-01

    Categorical approach to probability leads to better understanding of basic notions and constructions in generalized (fuzzy, operational, quantum) probability, where observables—dual notions to generalized random variables (statistical maps)—play a major role. First, to avoid inconsistencies, we introduce three categories L, S, and P, the objects and morphisms of which correspond to basic notions of fuzzy probability theory and operational probability theory, and describe their relationships. To illustrate the advantages of categorical approach, we show that two categorical constructions involving observables (related to the representation of generalized random variables via products, or smearing of sharp observables, respectively) can be described as factorizing a morphism into composition of two morphisms having desired properties. We close with a remark concerning products.

  16. Disability Evaluation System and Temporary Limited Duty Assignment Process: A Qualitative Review.

    DTIC Science & Technology

    1998-03-01

    Statement addressing the requirement for monitoring, frequency of treatments/therapy, and the associated operational assignment limitation; Informed...ACC does not exist in the EAIS, ARIS, or the EMF data bases. The system is able to track changes in duty station, but not ACC's. If a member is on...specific geographic assignment. 4. Requires extensive or prolonged medical therapy. 5. Who through continued military service would probably result in

  17. Risk Importance Measures in the Design and Operation of Nuclear Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vrbanic, I.; Samanta, P.; Basic, I.

    This monograph presents and discusses risk importance measures as quantified by the probabilistic risk assessment (PRA) models of nuclear power plants (NPPs) developed according to the current standards and practices. Usually, PRA tools calculate risk importance measures related to a single 'basic event' representing a particular failure mode. This is, then, reflected in many current PRA applications. The monograph focuses on the concept of 'component-level' importance measures that take into account different failure modes of the component, including common-cause failures (CCFs). In the opening sections the role of risk assessment in safety analysis of an NPP is introduced and a discussion is given of 'traditional', mainly deterministic, design principles which have been established to assign a level of importance to a particular system, structure, or component. This is followed by an overview of the main risk importance measures for risk increase and risk decrease from current PRAs. Basic relations which exist among the measures are shown. Some of the current practical applications of risk importance measures from the field of NPP design, operation, and regulation are discussed. The core of the monograph provides a discussion of the theoretical background and practical aspects of the main risk importance measures at the level of 'component' as modeled in a PRA, starting from the simplest case, a single basic event, and going toward more complex cases with multiple basic events and involvement in CCF groups. The intent is to express the component-level importance measures via the importance measures and probabilities of the underlying single basic events, which are the inputs readily available from a PRA model and its results. Formulas are derived and discussed for some typical cases. The formulas and their results are demonstrated through practical examples, done by means of a simplified PRA model developed in and run by the RiskSpectrum tool, which are presented in the appendices. The monograph concludes with a discussion of limitations of the use of risk importance measures and a summary of the component-level importance cases evaluated.
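
    A minimal sketch of two importance measures of the kind discussed above, Fussell-Vesely and Risk Achievement Worth, computed for a single basic event from minimal cut sets under the rare-event approximation, is given below. The cut sets and probabilities are hypothetical, and the monograph's component-level formulas (covering multiple failure modes and CCF groups) go well beyond this single-event case.

```python
# Fussell-Vesely and Risk Achievement Worth for one basic event, computed
# from minimal cut sets with the rare-event approximation. Cut sets and
# probabilities are hypothetical.

def top_risk(cut_sets, probs):
    """Rare-event approximation: sum of cut-set probability products."""
    total = 0.0
    for cs in cut_sets:
        p = 1.0
        for event in cs:
            p *= probs[event]
        total += p
    return total

def importance(cut_sets, probs, event):
    base = top_risk(cut_sets, probs)
    p0 = {**probs, event: 0.0}           # event made perfectly reliable
    p1 = {**probs, event: 1.0}           # event assumed failed
    fv = (base - top_risk(cut_sets, p0)) / base      # Fussell-Vesely
    raw = top_risk(cut_sets, p1) / base              # Risk Achievement Worth
    return fv, raw

cut_sets = [("A", "B"), ("A", "C"), ("D",)]
probs = {"A": 1e-2, "B": 1e-3, "C": 5e-3, "D": 1e-4}
print(importance(cut_sets, probs, "A"))
```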

  18. Persuasive Writing, A Curriculum Design: K-12.

    ERIC Educational Resources Information Center

    Bennett, Susan G., Ed.

    In the spirit of the Texas Hill Country Writing Project and in response to the requirements of the Texas Assessment of Basic Skills, this guide presents writing assignments reflecting a commitment to a unified writing program for kindergarten through grade twelve. The framework for the assignments is adopted from the discourse theory of James…

  19. Basic Visual Merchandising. Second Edition. [Student's Manual and] Answer Book/Teacher's Guide.

    ERIC Educational Resources Information Center

    Luter, Robert R.

    This student's manual that features content needed to do tasks related to visual merchandising is intended for students in co-op training stations and entry-level, master employee, and supervisory-level employees. It contains 13 assignments. Each assignment has questions covering specific information and also features activities in which students…

  20. Developing the Inferential Reasoning of Basic Writers.

    ERIC Educational Resources Information Center

    Zeller, Robert

    1987-01-01

    Describes an assignment sequence using photographs to introduce developmental students to conventions of academic inquiry, and to give them practice analyzing and synthesizing. Reports that students link details observed in the photos to inferences drawn about them. Concentrates on the assignment linking a photo of E. B. White with an essay by him…

  1. Captain's Log...The Speech Communication Oral Journal.

    ERIC Educational Resources Information Center

    Strong, William F.

    1983-01-01

    The logic and the benefits of requiring college students in basic speech communication classes to tape-record oral journals are set forth along with a detailed description of the assignment. Instructions to the students explain the mechanics of the assignment as follows: (1) obtain and properly label a quality cassette tape; (2) make seven…

  2. What's Love Got to Do with It? Rethinking Common Sense Assumptions

    ERIC Educational Resources Information Center

    Trachman, Matthew; Bluestone, Cheryl

    2005-01-01

    One of the most basic tasks in introductory social science classes is to get students to reexamine their common sense assumptions concerning human behavior. This article introduces a shared assignment developed for a learning community that paired an introductory sociology and psychology class. The assignment challenges students to rethink the…

  3. 75 FR 69126 - Proposed Information Collection Request (ICR) for the Workforce Investment Act Random Assignment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-10

    ... Random Assignment Impact Evaluation of the Adult and Dislocated Worker Program; Comment Request AGENCY... fragmented system of employment and training programs under JTPA and providing universal access to basic (core) services. To determine whether the adult and dislocated worker services funded by Title I of the...

  4. Introducing chemical biology applications to introductory organic chemistry students using series of weekly assignments.

    PubMed

    Kanin, Maralee R; Pontrello, Jason K

    2016-01-01

    Calls to bring interdisciplinary content and examples into introductory science courses have increased, yet strategies that involve course restructuring often suffer from the need for a significant faculty commitment to motivate change. Minimizing the need for dramatic course reorganization, the structure, reactivity, and chemical biology applications of classes of biological monomers and polymers have been integrated into introductory organic chemistry courses through three series of semester-long weekly assignments that explored (a) Carbohydrates and Oligosaccharides, (b) Amino Acids, Peptides, and Proteins, and (c) Nucleosides, Nucleotides, and Nucleic Acids. Comparisons of unannounced pre- and post tests revealed improved understanding of a reaction introduced in the assignments, and course examinations evaluated cumulative assignment topics. Course surveys revealed that demonstrating biologically relevant applications consistently throughout the semesters enhanced student interest in the connection between basic organic chemistry content and its application to new and unfamiliar bio-related examples. Covering basic material related to these classes of molecules outside of the classroom opened lecture time to allow the instructor to further build on information developed through the weekly assignments, teaching advanced topics and applications typically not covered in an introductory organic chemistry lecture course. Assignments were implemented as homework, either with or without accompanying discussion, in both laboratory and lecture organic courses within the context of the existing course structures. © 2015 The International Union of Biochemistry and Molecular Biology.

  5. Junior High Student Responsibilities for Basic Skills.

    ERIC Educational Resources Information Center

    Parker, Charles C.

    This paper advances the thesis that students should be trained to recognize acceptable and unacceptable performances in basic skill areas and should assume responsibility for attaining proficiency in these areas. Among the topics discussed are the value of having junior high school students check their own assignments, discover their errors, and…

  6. Rhetorical Analysis as Introductory Speech: Jumpstarting Student Engagement

    ERIC Educational Resources Information Center

    Malone, Marc P.

    2012-01-01

    When students enter the basic public speaking classroom, they are asked to develop an introductory speech. This assignment typically focuses on a speech of self-introduction for which there are several pedagogical underpinnings: it provides an immediate and relatively stress-free speaking…

  7. 42 CFR 424.80 - Prohibition of reassignment of claims by suppliers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... HUMAN SERVICES (CONTINUED) MEDICARE PROGRAM CONDITIONS FOR MEDICARE PAYMENT Limitations on Assignment and Reassignment of Claims § 424.80 Prohibition of reassignment of claims by suppliers. (a) Basic... the basic rule—(1) Payment to employer. Medicare may pay the supplier's employer if the supplier is...

  8. Basic Writers as Critical Thinkers.

    ERIC Educational Resources Information Center

    Anstendig, Linda; Kimmel, Isabel

    Teachers of a basic writing course broadened their theme approach from growth and change in adolescence to the theme of language and identity, developed sequenced writing assignments, and worked toward the culminating unit--a mini-research project through which all the language and thinking skills developed throughout the semester could be…

  9. Directed Design of Experiments for Validating Probability of Detection Capability of a Testing System

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2012-01-01

    A method of validating a probability of detection (POD) testing system using directed design of experiments (DOE) includes recording an input data set of observed hit and miss or analog data for sample components as a function of size of a flaw in the components. The method also includes processing the input data set to generate an output data set having an optimal class width, assigning a case number to the output data set, and generating validation instructions based on the assigned case number. An apparatus includes a host machine for receiving the input data set from the testing system and an algorithm for executing DOE to validate the test system. The algorithm applies DOE to the input data set to determine a data set having an optimal class width, assigns a case number to that data set, and generates validation instructions based on the case number.

  10. Grading Practice as Valid Measures of Academic Achievement of Secondary Schools Students for National Development

    ERIC Educational Resources Information Center

    Chiekem, Enwefa

    2015-01-01

    Assigning grades is probably the most important measurement decision that classroom teachers make. Even when teachers are provided with some measurement instruction, they still use subjective value judgments when assigning grades to students. This paper therefore examines grading practice as a valid measure of academic achievement in secondary…

  11. The development and validation of the AMPREDICT model for predicting mobility outcome after dysvascular lower extremity amputation.

    PubMed

    Czerniecki, Joseph M; Turner, Aaron P; Williams, Rhonda M; Thompson, Mary Lou; Landry, Greg; Hakimi, Kevin; Speckman, Rebecca; Norvell, Daniel C

    2017-01-01

    The objective of this study was the development of AMPREDICT-Mobility, a tool to predict the probability of independence in either basic or advanced (iBASIC or iADVANCED) mobility 1 year after dysvascular major lower extremity amputation. Two prospective cohort studies during consecutive 4-year periods (2005-2009 and 2010-2014) were conducted at seven medical centers. Multiple demographic and biopsychosocial predictors were collected in the periamputation period among individuals undergoing their first major amputation because of complications of peripheral arterial disease or diabetes. The primary outcomes were iBASIC and iADVANCED mobility, as measured by the Locomotor Capabilities Index. Combined data from both studies were used for model development and internal validation. Backwards stepwise logistic regression was used to develop the final prediction models. The discrimination and calibration of each model were assessed. Internal validity of each model was assessed with bootstrap sampling. Twelve-month follow-up was reached by 157 of 200 (79%) participants. Among these, 54 (34%) did not achieve iBASIC mobility, 103 (66%) achieved at least iBASIC mobility, and 51 (32%) also achieved iADVANCED mobility. Predictive factors associated with reduced odds of achieving iBASIC mobility were increasing age, chronic obstructive pulmonary disease, dialysis, diabetes, prior history of treatment for depression or anxiety, and very poor to fair self-rated health. Those who were white, were married, and had at least a high-school degree had a higher probability of achieving iBASIC mobility. The odds of achieving iBASIC mobility increased with increasing body mass index up to 30 kg/m² and decreased with increasing body mass index thereafter. The prediction model of iADVANCED mobility included the same predictors with the exception of diabetes, chronic obstructive pulmonary disease, and education level. Both models showed strong discrimination with C statistics of 0.85 and 0.82, respectively. The mean difference in predicted probabilities for those who did and did not achieve iBASIC and iADVANCED mobility was 33% and 29%, respectively. Tests for calibration and observed vs predicted plots suggested good fit for both models; however, the precision of the estimates of the predicted probabilities was modest. Internal validation through bootstrapping demonstrated some overoptimism of the original model development, with the optimism-adjusted C statistic for iBASIC and iADVANCED mobility being 0.74 and 0.71, respectively, and the discrimination slope 19% and 16%, respectively. AMPREDICT-Mobility is a user-friendly prediction tool that can inform the patient undergoing a dysvascular amputation and the patient's provider about the probability of independence in either basic or advanced mobility at each major lower extremity amputation level. Copyright © 2016 Society for Vascular Surgery. All rights reserved.
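
    The prediction step of a tool like AMPREDICT-Mobility reduces to evaluating a logistic model: a linear predictor built from the patient's covariates is mapped through the logistic function to a probability. The coefficients and covariates below are hypothetical placeholders, not the published model.

```python
import math

# Generic logistic-regression prediction of the kind a tool like
# AMPREDICT-Mobility evaluates: a linear predictor from patient covariates
# mapped through the logistic function. Coefficients are hypothetical.

def predict_probability(intercept, coefs, covariates):
    eta = intercept + sum(coefs[name] * value for name, value in covariates.items())
    return 1.0 / (1.0 + math.exp(-eta))

coefs = {"age_per_10yr": -0.4, "dialysis": -1.2, "copd": -0.8, "married": 0.5}
patient = {"age_per_10yr": 6.5, "dialysis": 1, "copd": 0, "married": 1}
print(round(predict_probability(3.0, coefs, patient), 3))
```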

  12. An evaluation of agreement between pectoral spines and otoliths for estimating ages of catfishes

    USGS Publications Warehouse

    Olive, J.A.; Schramm, Harold; Gerard, Patrick D.; Irwin, E.

    2011-01-01

    Otoliths have been shown to provide more accurate ages than pectoral spine sections for several catfish populations; but sampling otoliths requires euthanizing the specimen, whereas spines can be sampled non-lethally. To evaluate whether, and under what conditions, spines provide the same or similar age estimates as otoliths, we examined data sets of individual fish aged from pectoral spines and otoliths for six blue catfish Ictalurus furcatus populations (n=420), 14 channel catfish Ictalurus punctatus populations (n=997), and 10 flathead catfish Pylodictus olivaris populations (n=947) from lotic and lentic waters throughout the central and eastern U.S. Logistic regression determined that agreement between ages estimated from otoliths and spines was consistently related to age, but inconsistently related to growth rate. When modeled at mean growth rate, we found at least 80% probability of no difference in spine- and otolith-assigned ages up to ages 4 and 5 for blue and channel catfish, respectively. For flathead catfish, an 80% probability of agreement between spine- and otolith-assigned ages did not occur at any age due to high incidence of differences in assigned ages even for age-1 fish. Logistic regression models predicted at least 80% probability that spine and otolith ages differed by ≤1 year up to ages 13, 16, and 9 for blue, channel, and flathead catfish, respectively. Age-bias assessment found mean spine-assigned age differed by less than 1 year from otolith-assigned age up to ages 19, 9, and 17 for blue catfish, channel catfish, and flathead catfish, respectively. These results can be used to help guide decisions about which structure is most appropriate for estimating catfish ages for particular populations and management objectives.

  13. A New Fuzzy-Evidential Controller for Stabilization of the Planar Inverted Pendulum System

    PubMed Central

    Tang, Yongchuan; Zhou, Deyun

    2016-01-01

    In order to realize the stability control of the planar inverted pendulum system, which is a typical multi-variable and strong coupling system, a new fuzzy-evidential controller based on fuzzy inference and evidential reasoning is proposed. Firstly, for each axis, a fuzzy nine-point controller for the rod and a fuzzy nine-point controller for the cart are designed. Then, in order to coordinate these two controllers of each axis, a fuzzy-evidential coordinator is proposed. In this new fuzzy-evidential controller, the empirical knowledge for stabilization of the planar inverted pendulum system is expressed by fuzzy rules, while the coordinator of different control variables in each axis is built incorporated with the dynamic basic probability assignment (BPA) in the frame of fuzzy inference. The fuzzy-evidential coordinator makes the output of the control variable smoother, and the control effect of the new controller is better compared with some other work. The experiment in MATLAB shows the effectiveness and merit of the proposed method. PMID:27482707

  14. A New Fuzzy-Evidential Controller for Stabilization of the Planar Inverted Pendulum System.

    PubMed

    Tang, Yongchuan; Zhou, Deyun; Jiang, Wen

    2016-01-01

    In order to realize the stability control of the planar inverted pendulum system, which is a typical multi-variable and strong coupling system, a new fuzzy-evidential controller based on fuzzy inference and evidential reasoning is proposed. Firstly, for each axis, a fuzzy nine-point controller for the rod and a fuzzy nine-point controller for the cart are designed. Then, in order to coordinate these two controllers of each axis, a fuzzy-evidential coordinator is proposed. In this new fuzzy-evidential controller, the empirical knowledge for stabilization of the planar inverted pendulum system is expressed by fuzzy rules, while the coordinator of different control variables in each axis is built incorporated with the dynamic basic probability assignment (BPA) in the frame of fuzzy inference. The fuzzy-evidential coordinator makes the output of the control variable smoother, and the control effect of the new controller is better compared with some other work. The experiment in MATLAB shows the effectiveness and merit of the proposed method.

  15. Change detection of bitemporal multispectral images based on FCM and D-S theory

    NASA Astrophysics Data System (ADS)

    Shi, Aiye; Gao, Guirong; Shen, Shaohong

    2016-12-01

    In this paper, we propose a change detection method for bitemporal multispectral images based on D-S theory and the fuzzy c-means (FCM) algorithm. Firstly, the uncertainty and certainty regions are determined by a thresholding method applied to the magnitude of the difference image (MDI) and the spectral angle information (SAI) of the bitemporal images. Secondly, the FCM algorithm is applied to the MDI and SAI in the uncertainty region, respectively. Then, the basic probability assignment (BPA) functions of the changed and unchanged classes are obtained from the fuzzy membership values produced by the FCM algorithm. In addition, the optimal value of the fuzzy exponent of FCM is adaptively determined by the degree of conflict between the MDI and SAI in the uncertainty region. Finally, D-S theory is applied to obtain a new fuzzy partition matrix for the uncertainty region, from which the change map is obtained. Experiments on bitemporal Landsat TM images and bitemporal SPOT images validate that the proposed method is effective.

  16. A new method based on Dempster-Shafer theory and fuzzy c-means for brain MRI segmentation

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Lu, Xi; Li, Yunpeng; Chen, Xiaowu; Deng, Yong

    2015-10-01

    In this paper, a new method is proposed to decrease sensitivity to motion noise and uncertainty in magnetic resonance imaging (MRI) segmentation, especially when only one brain image is available. The method incorporates spatial neighborhood information by fusing the information of each pixel with that of its neighbors using Dempster-Shafer (DS) theory. The basic probability assignment (BPA) of each single hypothesis is obtained from the membership function produced by applying fuzzy c-means (FCM) clustering to the gray levels of the MRI. Multiple hypotheses are then generated from the single hypotheses, and the objective pixel's BPA is updated by fusing it with the BPAs of its neighbors to obtain the final result. Some examples of MRI segmentation are presented at the end of the paper, in which our method is compared with several previous methods. The results show that the proposed method is more effective than the other methods in motion-blurred MRI segmentation.

  17. Rogue waves and entropy consumption

    NASA Astrophysics Data System (ADS)

    Hadjihoseini, Ali; Lind, Pedro G.; Mori, Nobuhito; Hoffmann, Norbert P.; Peinke, Joachim

    2017-11-01

    Based on data from the Sea of Japan and the North Sea, the occurrence of rogue waves is analyzed by a scale-dependent stochastic approach, which interlinks fluctuations of waves at different spacings. With this approach we are able to determine a stochastic cascade process, which provides information on the general multipoint statistics. Furthermore, the evolution of single trajectories in scale, which characterize wave height fluctuations in the surroundings of a chosen location, can be determined. Explicit knowledge of the stochastic process makes it possible to assign entropy values to all wave events. We show that for these entropies the integral fluctuation theorem, a basic law of non-equilibrium thermodynamics, is valid. This implies that positive and negative entropy events must occur. Extreme events like rogue waves are characterized as negative entropy events. The statistics of these entropy fluctuations change with the wave state; for the Sea of Japan the entropy statistics have a more pronounced tail for negative entropy values, indicating a higher probability of rogue waves.

  18. Statistical primer: propensity score matching and its alternatives.

    PubMed

    Benedetto, Umberto; Head, Stuart J; Angelini, Gianni D; Blackstone, Eugene H

    2018-06-01

    Propensity score (PS) methods offer certain advantages over more traditional regression methods to control for confounding by indication in observational studies. Although multivariable regression models adjust for confounders by modelling the relationship between covariates and outcome, the PS methods estimate the treatment effect by modelling the relationship between confounders and treatment assignment. Therefore, methods based on the PS are not limited by the number of events, and their use may be warranted when the number of confounders is large, or the number of outcomes is small. The PS is the probability for a subject to receive a treatment conditional on a set of baseline characteristics (confounders). The PS is commonly estimated using logistic regression, and it is used to match patients with similar distribution of confounders so that difference in outcomes gives unbiased estimate of treatment effect. This review summarizes basic concepts of the PS matching and provides guidance in implementing matching and other methods based on the PS, such as stratification, weighting and covariate adjustment.
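
    The matching step that follows PS estimation can be sketched as greedy 1:1 nearest-neighbor matching within a caliper. The propensity scores below are assumed to be already estimated (e.g., from a logistic regression of treatment on baseline covariates), and the values are hypothetical.

```python
# Greedy 1:1 nearest-neighbor matching on the propensity score with a caliper.
# Propensity scores are assumed already estimated; the values are hypothetical.

def match_on_ps(treated, controls, caliper=0.05):
    """treated/controls: dicts of id -> propensity score. Returns matched pairs."""
    available = dict(controls)
    pairs = []
    for t_id, t_ps in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_ps))
        if abs(available[c_id] - t_ps) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]             # match without replacement
    return pairs

treated = {"T1": 0.62, "T2": 0.35, "T3": 0.80}
controls = {"C1": 0.60, "C2": 0.33, "C3": 0.50, "C4": 0.95}
print(match_on_ps(treated, controls))
```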

  19. Looking for a Possible Framework to Teach Contemporary Art in Primary School

    ERIC Educational Resources Information Center

    Vahter, Edna

    2016-01-01

    Traditionally, the learning of arts in the Estonian primary school has meant completion of practical assignments given by the teacher. The new national curriculum for basic school adopted in 2010 sets out new requirements for art education where the emphasis, in addition to practical assignments, is on discussion and understanding of art. The…

  20. 29 CFR 548.306 - Average earnings for year or quarter year preceding the current quarter.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... per week. Normally his established basic rate would be computed by dividing 2,318 hours into $4,244... hours of work, work assignments and duties, and the basis of remuneration for employment, were not... period. Significant differences in weekly hours of work, work assignments and duties, the basis of...

  1. A New "Moodle" Module Supporting Automatic Verification of VHDL-Based Assignments

    ERIC Educational Resources Information Center

    Gutierrez, Eladio; Trenas, Maria A.; Ramos, Julian; Corbera, Francisco; Romero, Sergio

    2010-01-01

    This work describes a new "Moodle" module developed to give support to the practical content of a basic computer organization course. This module goes beyond the mere hosting of resources and assignments. It makes use of an automatic checking and verification engine that works on the VHDL designs submitted by the students. The module automatically…

  2. Teaching Assessment of Classroom Learning: Using Scenarios To Teach Basic Tests and Measurement Concepts.

    ERIC Educational Resources Information Center

    Cochran, H. Keith

    This paper contains two scenario-type assignments for students in a university tests and measurements class as well as a collection of materials developed by actual students in response to these assignments. An opening explanation argues that education students, often nearing the end of their program when they take the tests and measurement…

  3. Applied Problems and Use of Technology in an Aligned Way in Basic Courses in Probability and Statistics for Engineering Students--A Way to Enhance Understanding and Increase Motivation

    ERIC Educational Resources Information Center

    Zetterqvist, Lena

    2017-01-01

    Researchers and teachers often recommend motivating exercises and use of mathematics or statistics software for the teaching of basic courses in probability and statistics. Our courses are given to large groups of engineering students at Lund Institute of Technology. We found that the mere existence of real-life data and technology in a course…

  4. Oregon & Federal Basic Income Tax Return Preparation. Student's Manual 1981.

    ERIC Educational Resources Information Center

    Young, Donna, Ed.

    This student manual contains materials for a 20-session course in basic income tax preparation. Each session may include some or all of these components: a reading assignment, a vocabulary list, interview questions pertinent to that session's subject matter, informative/reference materials, problems to work out in class or at home, exercises, and…

  5. Designing a Digital Story Assignment for Basic Writers Using the TPCK Framework

    ERIC Educational Resources Information Center

    Bandi-Rao, Shoba; Sepp, Mary

    2014-01-01

    The process of digital storytelling allows basic writers to take a personal narrative and translate it into a multimodal and multidimensional experience, motivating a diverse group of writers with different learning styles to engage more creatively and meaningfully in the writing process. Digital storytelling has the capacity to contextualize…

  6. Wage and Salary Administration for Smaller Institutions of Higher Education. A Basic Guide to Management Practice.

    ERIC Educational Resources Information Center

    National Association of College and University Business Officers, Washington, DC.

    This manual provides a basic guide to wage and salary administration at smaller institutions of higher education--institutions with 400 or fewer full-time nonacademic employees and a relatively uncomplicated administrative organization. Emphasis is placed on definitions and benefits of the process, assigning responsibility and authority, deciding…

  7. Library Skills for Teachers: A Self-Paced Workbook.

    ERIC Educational Resources Information Center

    Mech, Terrence

    Designed to introduce education students to the basic library resources in the field, this self-paced workbook assumes a basic knowledge of the library and its resources. Each section in the eight-chapter workbook discusses a particular type of reference material and sample entries are provided when appropriate. Eleven assignments (two multiple…

  8. SIGPI. Fault Tree Cut Set System Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patenaude, C.J.

    1992-01-13

    SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.
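
    The record does not reproduce SIGPI's algorithms; the sketch below only illustrates, under an independence assumption, how cut set data and basic-event probabilities are commonly combined into a top-event estimate (the standard min-cut upper bound), so treat it as a generic illustration rather than SIGPI's actual method.

    ```python
    # Generic cut-set evaluation sketch (not the SIGPI algorithm itself).
    # Each cut set is a list of basic-event names; basic events are assumed independent.
    def cutset_probability(cutset, p_basic):
        prob = 1.0
        for event in cutset:
            prob *= p_basic[event]
        return prob

    def top_event_probability(cutsets, p_basic):
        # Min-cut upper bound: P(top) <= 1 - prod_i (1 - P(cut set i))
        q = 1.0
        for cs in cutsets:
            q *= 1.0 - cutset_probability(cs, p_basic)
        return 1.0 - q

    p_basic = {"A": 1e-3, "B": 2e-3, "C": 5e-4}   # hypothetical basic-event probabilities
    cutsets = [["A", "B"], ["C"]]                 # hypothetical minimal cut sets
    print(top_event_probability(cutsets, p_basic))
    ```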

  9. SIGPI. Fault Tree Cut Set System Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patenaude, C.J.

    1992-01-14

    SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.

  10. Schedule Risk Assessment

    NASA Technical Reports Server (NTRS)

    Smith, Greg

    2003-01-01

    Schedule risk assessments determine the likelihood of finishing on time. Each task in a schedule has a varying degree of probability of being finished on time. A schedule risk assessment quantifies these probabilities by assigning values to each task. This viewgraph presentation contains a flow chart for conducting a schedule risk assessment, and profiles several applicable methods of data analysis.
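
    A hedged illustration of the kind of quantification mentioned above: a Monte Carlo pass over three-point (optimistic, most likely, pessimistic) task-duration estimates for a hypothetical serial chain of tasks. The presentation itself may profile different methods.

    ```python
    # Monte Carlo schedule risk sketch: probability of finishing by a target date.
    import random

    # Hypothetical tasks: (optimistic, most likely, pessimistic) durations in days.
    tasks = [(4, 5, 9), (8, 10, 16), (2, 3, 7)]
    target = 21.0
    trials = 100_000

    hits = 0
    for _ in range(trials):
        total = sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        if total <= target:
            hits += 1

    print(f"P(finish within {target} days) ~ {hits / trials:.3f}")
    ```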

  11. Earthquake Rate Model 2.2 of the 2007 Working Group for California Earthquake Probabilities, Appendix D: Magnitude-Area Relationships

    USGS Publications Warehouse

    Stein, Ross S.

    2007-01-01

    Summary: To estimate the down-dip coseismic fault dimension, W, the Executive Committee has chosen the Nazareth and Hauksson (2004) method, which uses the 99% depth of background seismicity to assign W. For the predicted earthquake magnitude-fault area scaling used to estimate the maximum magnitude of an earthquake rupture from a fault's length, L, and W, the Committee has assigned equal weight to the Ellsworth B (Working Group on California Earthquake Probabilities, 2003) and Hanks and Bakun (2002) (as updated in 2007) equations. The former uses a single relation; the latter uses a bilinear relation which changes slope at M=6.65 (A=537 km2).

  12. Undergraduate Electronics Projects Based on the Design of an Optical Wireless Audio Transmission System

    ERIC Educational Resources Information Center

    Oliveira, Luis Bica; Paulino, Nuno; Oliveira, João P.; Santos-Tavares, Rui; Pereira, Nuno; Goes, João

    2017-01-01

    The two projects presented in this paper can be used either as two separate assignments in two different semesters or as a final assignment for undergraduate students of electrical engineering. They have two main objectives: first, to teach basic electronic circuit design concepts and, second, to motivate the students to learn more about analog…

  13. Just-in-Time Teaching in Sociology or How I Convinced My Students to Actually Read the Assignment

    ERIC Educational Resources Information Center

    Howard, Jay R.

    2004-01-01

    In the process of collecting assessment data in the author's introductory sociology course, he made a startling and disappointing discovery. For the most part, students simply were not bothering to read the basics version of the introductory survey textbook that he assigned. This discovery presented him with two related challenges. First, he had…

  14. 23 CFR 1340.3 - Basic design requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE § 1340.3 Basic design requirements. Surveys conducted in... requirement. The sample identified for the survey shall have a probability-based design such that estimates... 23 Highways 1 2010-04-01 2010-04-01 false Basic design requirements. 1340.3 Section 1340.3...

  15. 23 CFR 1340.3 - Basic design requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE § 1340.3 Basic design requirements. Surveys conducted in... requirement. The sample identified for the survey shall have a probability-based design such that estimates... 23 Highways 1 2011-04-01 2011-04-01 false Basic design requirements. 1340.3 Section 1340.3...

  16. Introducing Disjoint and Independent Events in Probability.

    ERIC Educational Resources Information Center

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  17. Active Learning? Not with My Syllabus!

    ERIC Educational Resources Information Center

    Ernst, Michael D.

    2012-01-01

    We describe an approach to teaching probability that minimizes the amount of class time spent on the topic while also providing a meaningful (dice-rolling) activity to get students engaged. The activity, which has a surprising outcome, illustrates the basic ideas of informal probability and how probability is used in statistical inference.…

  18. Criteria for Using Technology To Teach the Basic Course in Communication.

    ERIC Educational Resources Information Center

    Eadie, William F.; Andersen, Peter A.; Armas-Matsumoto, Catherine M.; Block, Evan; Martin, Patricia Geist; Goehring, Charles; Good, Jeffrey; Hellweg, Susan A.; Knight, Laura L.; Lubic, Bryan; Spitzberg, Brian H.

    This paper describes the beginnings of a project to remake the oral communication general education course--part of the vision for the course is to use technology to help students learn course content. According to the paper, currently the basic course is taught mostly in traditional format (relatively small sections with set assignments), with…

  19. Public Address, Cultural Diversity, and Tolerance: Teaching Cultural Diversity in Speech Classes.

    ERIC Educational Resources Information Center

    Byrd, Marquita L.

    While speech instructors work to design appropriate diversity goals in the public speaking class, few have the training for such a task. A review of course objectives and assignments for the basic course may be helpful. Suggestions for instructors working to incorporate diversity in the basic course include: (1) recognize the dominance of the…

  20. Fostering First Graders' Fluency with Basic Subtraction and Larger Addition Combinations via Computer-Assisted Instruction

    ERIC Educational Resources Information Center

    Baroody, Arthur J.; Purpura, David J.; Eiland, Michael D.; Reid, Erin E.

    2014-01-01

    Achieving fluency with basic subtraction and add-with-8 or -9 combinations is difficult for primary grade children. A 9-month training experiment entailed evaluating the efficacy of software designed to promote such fluency via guided learning of reasoning strategies. Seventy-five eligible first graders were randomly assigned to one of three…

  1. Relevance in Basic Composition: Writing Assignments for Technical Students.

    ERIC Educational Resources Information Center

    Tichenor, Stuart

    Generally, students in vocational and technical colleges are in writing classes because they must be, not because they want to be. As a rule, students in basic composition classes have been more or less continually exposed to writing classes since middle school, where they have been asked to keep journals, read articles and short stories, and write…

  2. Uncertainty and Risk in the Predictions of Global Climate Models. (Invited)

    NASA Astrophysics Data System (ADS)

    Winsberg, E.

    2009-12-01

    There has been a great deal of emphasis, in recent years, on developing methods for assigning probabilities, in the form of quantitative margins of uncertainty (QMUs) to the predictions of global climate models. In this paper, I will argue that a large part of the motivation for this activity has been misplaced. Rather than explicit QMUs, climate scientists ought to focus on risk mitigation: offering policy advice about what courses of action need to be taken in order to reduce the risk of negative outcomes to acceptable levels. The advantages of QMUs are clear. QMUs can be an extremely effective tool for dividing our intellectual labor into the epistemic and the normative. If scientists can manage to objectively assign probabilities to various outcomes given certain choices of action, then they can effectively leave decisions about the relative social value of these outcomes out of the work they do as experts. In this way, it is commonly thought, scientists can keep ethical questions—like questions about the relative value of environmental stability vs. the availability of fossil fuels for economic development—separate from the purely scientific questions about the workings of the climate system. It is this line of thinking, or so I argue, that has motivated the large quantity of intellectual labor that has recently been devoted, by both climate scientists and statisticians, to attaching QMUs to the predictions of global climate models. Such an approach, and the attendant division of labor that it affords between those who discover the facts and those who decide what we should value, has obvious advantages. Scientists, after all, are not elected leaders, and they lack the political legitimacy to make decisions on behalf of the public about what is socially valuable. Elected leaders, on the other hand, rarely have the expertise they would need to accurately forecast, for themselves, what the likely outcomes of their policy choices would be. Since it would be disingenuous of climate experts to pretend that they can make forecasts with certainty, the objective assignment of probabilities to the forecasts of climate experts is just what is needed to resolve this tension. All of this, however, is predicated on the assumption that a conceptually coherent methodology is available for calculating QMUs based on the forecasts of complex deterministic models like the global models of climate used by climate scientists. I argue in this paper that, at the present time, no such conceptually coherent method exists, and it is not clear where one will come from. In fact, I argue, the present practice of assigning QMUs provides an artificial precision to the predictions of climate models where no such precision is possible. But it is this very kind of precision which would be required for the method of QMUs to divide our intellectual labor into the epistemic and the normative. And if QMUs cannot do their intended job of dividing our intellectual labor into the epistemic and the normative, then perhaps they ought to be abandoned in favor of an approach in which certain basic assumptions about values are built into the science. Risk mitigation might be such an approach.

  3. UT Biomedical Informatics Lab (BMIL) probability wheel

    NASA Astrophysics Data System (ADS)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  4. Heterogeneous Defensive Naval Weapon Assignment To Swarming Threats In Real Time

    DTIC Science & Technology

    2016-03-01

    target_threat_t: damage potential of target t if it hits the ship [integer from 0 to 3]; target_phit_t: probability that target t hits the ship [probability] ... secondary weapon systems on target t [integer]; sec_phit_t: probability that secondary weapon systems launched from target t hit the ship [probability] ... pairing. These parameters are calculated as follows: priority_t = 10^3 × target_threat_t × target_phit_t (3.1); sec_priority_t = 10^3 × sec...

  5. Schedule Risk Assessment

    NASA Technical Reports Server (NTRS)

    Smith, Greg

    2003-01-01

    Schedule risk assessment needs to determine the probability of finishing on or before a given point in time. Tasks in a schedule should reflect the "most likely" duration for each task. In reality, each task is different and has a varying degree of probability of finishing within or after the duration specified. Schedule risk assessments attempt to quantify these probabilities by assigning values to each task. This bridges the gap between CPM scheduling and the project's need to know the likelihood of "when".

  6. Probabilistic Surface Characterization for Safe Landing Hazard Detection and Avoidance (HDA)

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E. (Inventor); Ivanov, Tonislav I. (Inventor); Huertas, Andres (Inventor)

    2015-01-01

    Apparatuses, systems, computer programs and methods for performing hazard detection and avoidance for landing vehicles are provided. Hazard assessment takes into consideration the geometry of the lander. Safety probabilities are computed for a plurality of pixels in a digital elevation map. The safety probabilities are combined for pixels associated with one or more aim points and orientations. A worst case probability value is assigned to each of the one or more aim points and orientations.

  7. An improved approximate network blocking probability model for all-optical WDM Networks with heterogeneous link capacities

    NASA Astrophysics Data System (ADS)

    Khan, Akhtar Nawaz

    2017-11-01

    Currently, analytical models are used to compute approximate blocking probabilities in opaque and all-optical WDM networks with homogeneous link capacities. Existing analytical models can also be extended to opaque WDM networking with heterogeneous link capacities because of the wavelength conversion at each switch node. However, existing analytical models cannot be utilized for all-optical WDM networking with a heterogeneous structure of link capacities because of the wavelength continuity constraint and unequal numbers of wavelength channels on different links. In this work, a mathematical model is extended for computing approximate network blocking probabilities in heterogeneous all-optical WDM networks in which the path blocking is dominated by the link along the path with the fewest wavelength channels. A wavelength assignment scheme is also proposed for dynamic traffic, termed last-fit-first wavelength assignment, in which the wavelength channel with the maximum index is assigned first to a lightpath request. Due to the heterogeneous structure of link capacities and the wavelength continuity constraint, the wavelength channels with maximum indexes are utilized for minimum-hop routes. Similarly, the wavelength channels with minimum indexes are utilized for multi-hop routes between source and destination pairs. The proposed scheme has lower blocking probability values compared to the existing heuristic for wavelength assignment. Finally, numerical results computed in different network scenarios are approximately equal to the values obtained from simulations.
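
    The last-fit-first rule described above can be sketched as follows: given a candidate route and the set of free wavelength indices on every link, choose the highest index that is free on all links. The data structures below are hypothetical and are not the author's code.

    ```python
    # Last-fit-first wavelength assignment sketch for an all-optical path
    # under the wavelength-continuity constraint.
    def last_fit_first(path_links, free_wavelengths):
        """path_links: list of link ids; free_wavelengths: dict link -> set of free indices."""
        common = set.intersection(*(free_wavelengths[link] for link in path_links))
        if not common:
            return None          # lightpath request is blocked
        return max(common)       # highest-index free wavelength is assigned first

    free = {"a-b": {0, 1, 2, 5}, "b-c": {1, 2, 5}}   # heterogeneous link capacities (hypothetical)
    print(last_fit_first(["a-b", "b-c"], free))       # -> 5
    ```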

  8. Zoonoses action plan Salmonella monitoring programme: an investigation of the sampling protocol.

    PubMed

    Snary, E L; Munday, D K; Arnold, M E; Cook, A J C

    2010-03-01

    The Zoonoses Action Plan (ZAP) Salmonella Programme was established by the British Pig Executive to monitor Salmonella prevalence in quality-assured British pigs at slaughter by testing a sample of pigs with a meat juice enzyme-linked immunosorbent assay for antibodies against group B and C(1) Salmonella. Farms were assigned a ZAP level (1 to 3) depending on the monitored prevalence, and ZAP 2 or 3 farms were required to act to reduce the prevalence. The ultimate goal was to reduce the risk of human salmonellosis attributable to British pork. A mathematical model has been developed to describe the ZAP sampling protocol. Results show that the probability of assigning a farm the correct ZAP level was high, except for farms that had a seroprevalence close to the cutoff points between different ZAP levels. Sensitivity analyses identified that the probability of assigning a farm to the correct ZAP level was dependent on the sensitivity and specificity of the test, the number of batches taken to slaughter each quarter, and the number of samples taken per batch. The variability of the predicted seroprevalence was reduced as the number of batches or samples increased and, away from the cutoff points, the probability of being assigned the correct ZAP level increased as the number of batches or samples increased. In summary, the model described here provided invaluable insight into the ZAP sampling protocol. Further work is required to understand the impact of the program for Salmonella infection in British pig farms and therefore on human health.

  9. Does part-time sick leave help individuals with mental disorders recover lost work capacity?

    PubMed

    Andrén, Daniela

    2014-06-01

    This paper aims to answer the question whether combining sick leave with some hours of work can help employees diagnosed with a mental disorder (MD) increase their probability of returning to work. Given the available data, this paper analyzes the impact of part-time sick leave (PTSL) on the probability of fully recovering lost work capacity for employees diagnosed with an MD. The effects of PTSL on the probability of fully recovering lost work capacity are estimated by a discrete choice one-factor model using data on a nationally representative sample extracted from the register of the National Agency of Social Insurance in Sweden and supplemented with information from questionnaires. All individuals in the sample were 20-64 years old and started a sickness spell of at least 15 days between 1 and 16 February 2001. We selected all employed individuals diagnosed with an MD, with a final sample of 629 individuals. The results show that PTSL is associated with a low likelihood of full recovery, yet the timing of the assignment is important. PTSL's effect is relatively low (0.015) when it is assigned in the beginning of the spell but relatively high (0.387), and statistically significant, when assigned after 60 days of full-time sick leave (FTSL). This suggests efficiency improvements from assigning employees with an MD diagnosis, when possible, to PTSL. The employment gains will be enhanced if employees with an MD diagnosis are encouraged to return to work part-time after 60 days or more of FTSL.

  10. FY17 ISCR Scholar End-of-Assignment Report - Robbie Sadre

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadre, R.

    2017-10-20

    Throughout this internship assignment, I did various tasks that contributed towards the starting of the SASEDS (Safe Active Scanning for Energy Delivery Systems) and CES-21 (California Energy Systems for the 21st Century) projects in the SKYFALL laboratory. The goal of the SKYFALL laboratory is to perform modeling and simulation verification of transmission power system devices, while integrating with high-performance computing. The first thing I needed to do was acquire official online LabVIEW training from National Instruments. Through these online tutorial modules, I learned the basics of LabVIEW, gaining experience in connecting to NI devices through the DAQmx API as well as LabVIEW basic programming techniques (structures, loops, state machines, front panel GUI design, etc.).

  11. Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.

    PubMed

    Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian

    2011-01-01

    Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
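
    As a very small illustration of the fuzzy part of such a framework (not the authors' dependency-coefficient method), the sketch below represents each basic-event likelihood as a triangular fuzzy number and propagates it through AND/OR gates component-wise, under the very independence assumption the article goes on to relax.

    ```python
    # Triangular-fuzzy-number propagation through fault tree gates (illustrative).
    # Each fuzzy probability is (low, mode, high); independence is assumed here.
    def prod(values):
        p = 1.0
        for v in values:
            p *= v
        return p

    def fuzzy_and(*events):
        return tuple(prod(e[i] for e in events) for i in range(3))

    def fuzzy_or(*events):
        return tuple(1.0 - prod(1.0 - e[i] for e in events) for i in range(3))

    be1 = (0.01, 0.02, 0.05)    # hypothetical basic events
    be2 = (0.001, 0.005, 0.01)
    be3 = (0.02, 0.03, 0.04)
    top = fuzzy_or(fuzzy_and(be1, be2), be3)
    print(top)
    ```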

  12. Towards a Next-Generation Catalogue Cross-Match Service

    NASA Astrophysics Data System (ADS)

    Pineau, F.; Boch, T.; Derriere, S.; Arches Consortium

    2015-09-01

    In the past we have developed several catalogue cross-match tools. On one hand, the CDS XMatch service (Pineau et al. 2011) is able to perform basic but very efficient cross-matches, scalable to the largest catalogues on a single regular server. On the other hand, as part of the European project ARCHES, we have been developing a generic and flexible tool which performs potentially complex multi-catalogue cross-matches and which computes probabilities of association based on a novel statistical framework. Although the two approaches have so far been managed as different tracks, the need for next-generation cross-match services dealing with both efficiency and complexity is becoming pressing with forthcoming projects which will produce huge, high-quality catalogues. We are addressing this challenge, which is both theoretical and technical. In ARCHES we generalize to N catalogues the candidate selection criteria - based on the chi-square distribution - described in Pineau et al. (2011). We formulate and test a number of Bayesian hypotheses, a number which necessarily increases dramatically with the number of catalogues. To assign a probability to each hypothesis, we rely on estimated priors which account for local densities of sources. We validated our developments by comparing the theoretical curves we derived with the results of Monte-Carlo simulations. The current prototype is able to take into account heterogeneous positional errors, object extension and proper motion. The technical complexity is managed by OO programming design patterns and SQL-like functionalities. Large tasks are split into smaller independent pieces for scalability. Performance is achieved by resorting to multi-threading, sequential reads and several tree data structures. In addition to kd-trees, we account for heterogeneous positional errors and object extension using M-trees. Proper motions are supported using a modified M-tree we developed, inspired by Time Parametrized R-trees (TPR-trees). Quantitative tests in comparison with the basic cross-match will be presented.

  13. Conversion of Questionnaire Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powell, Danny H; Elwood Jr, Robert H

    During the survey, respondents are asked to provide qualitative answers (well, adequate, needs improvement) on how well material control and accountability (MC&A) functions are being performed. These responses can be used to develop failure probabilities for basic events performed during routine operation of the MC&A systems. The failure frequencies for individual events may be used to estimate total system effectiveness using a fault tree in a probabilistic risk analysis (PRA). Numeric risk values are required for the PRA fault tree calculations that are performed to evaluate system effectiveness. So, the performance ratings in the questionnaire must be converted to relative risk values for all of the basic MC&A tasks performed in the facility. If a specific material protection, control, and accountability (MPC&A) task is being performed at the 'perfect' level, the task is considered to have a near zero risk of failure. If the task is performed at a less than perfect level, the deficiency in performance represents some risk of failure for the event. As the degree of deficiency in performance increases, the risk of failure increases. If a task that should be performed is not being performed, that task is in a state of failure. The failure probabilities of all basic events contribute to the total system risk. Conversion of questionnaire MPC&A system performance data to numeric values is a separate function from the process of completing the questionnaire. When specific questions in the questionnaire are answered, the focus is on correctly assessing and reporting, in an adjectival manner, the actual performance of the related MC&A function. Prior to conversion, consideration should not be given to the numeric value that will be assigned during the conversion process. In the conversion process, adjectival responses to questions on system performance are quantified based on a log normal scale typically used in human error analysis (see A.D. Swain and H.E. Guttmann, 'Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications,' NUREG/CR-1278). This conversion produces the basic event risk of failure values required for the fault tree calculations. The fault tree is a deductive logic structure that corresponds to the operational nuclear MC&A system at a nuclear facility. The conventional Delphi process is a time-honored approach commonly used in the risk assessment field to extract numerical values for the failure rates of actions or activities when statistically significant data is absent.
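
    The record does not give the actual numeric scale, so the mapping below is purely hypothetical; it only illustrates the general idea of converting adjectival performance ratings into basic-event failure probabilities spaced on a log scale.

    ```python
    # Hypothetical adjectival-to-probability conversion (illustrative values only;
    # the real MSET scale derived from NUREG/CR-1278 is not reproduced here).
    RATING_TO_FAILURE_PROB = {
        "perfect":            1e-5,
        "well":               1e-3,
        "adequate":           1e-2,
        "needs improvement":  1e-1,
        "not performed":      1.0,
    }

    def basic_event_failure_prob(rating):
        return RATING_TO_FAILURE_PROB[rating.lower()]

    responses = {"inventory reconciliation": "adequate", "tamper seals": "well"}
    print({task: basic_event_failure_prob(r) for task, r in responses.items()})
    ```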

  14. The Probability Approach to English If-Conditional Sentences

    ERIC Educational Resources Information Center

    Wu, Mei

    2012-01-01

    Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…

  15. An application of probability to combinatorics: a proof of Vandermonde identity

    NASA Astrophysics Data System (ADS)

    Paolillo, Bonaventura; Rizzo, Piermichele; Vincenzi, Giovanni

    2017-08-01

    In this paper, we give possible suggestions for a classroom lesson about an application of probability using basic mathematical notions. We will approach to some combinatoric results without using 'induction', 'polynomial identities' nor 'generating functions', and will give a proof of the 'Vandermonde Identity' using elementary notions of probability.
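
    For reference, the identity in question, together with the probabilistic reading commonly used in such proofs (the hypergeometric probabilities of drawing exactly k red balls out of p from an urn containing m red and n blue balls must sum to one), can be written as:

    ```latex
    % Vandermonde identity and its hypergeometric (probabilistic) reading.
    \[
    \binom{m+n}{p} \;=\; \sum_{k=0}^{p} \binom{m}{k}\binom{n}{p-k},
    \qquad
    \sum_{k=0}^{p} \underbrace{\frac{\binom{m}{k}\binom{n}{p-k}}{\binom{m+n}{p}}}_{P(\text{exactly } k \text{ red balls drawn})} \;=\; 1 .
    \]
    ```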

  16. Strategies for Searching. A Self-Paced Workbook for Basic Library Skills. Second Edition.

    ERIC Educational Resources Information Center

    Hales, Celia; And Others

    This self-paced workbook is designed to help students at the University of North Carolina-Charlotte acquire basic skills in using a university library. The workbook, which is used in conjuction with course assignments, is divided into six sections: (l) Introduction; (2) How to Locate Background Information; (3) How to Locate Books; (4) How to…

  17. Comparison of Teachers' and School Psychologists' Accuracy in Assigning Basic Academic Tasks to Underlying CHC-Model Cognitive Abilities

    ERIC Educational Resources Information Center

    Petruccelli, Meredith Lohr; Fiorello, Catherine A.; Thurman, S. Kenneth

    2010-01-01

    Teacher perceptions of their students' cognitive abilities affect the referrals they make and intervention strategies they implement. In this study, teachers and school psychologists were asked to sort basic academic tasks into categories on the basis of the Cattell-Horn-Carroll (CHC) broad cognitive abilities, such as fluid reasoning and…

  18. Effect of Self Regulated Learning Approach on Junior Secondary School Students' Achievement in Basic Science

    ERIC Educational Resources Information Center

    Nwafor, Chika E.; Obodo, Abigail Chikaodinaka; Okafor, Gabriel

    2015-01-01

    This study explored the effect of self-regulated learning approach on junior secondary school students' achievement in basic science. Quasi-experimental design was used for the study.Two co-educational schools were drawn for the study through simple random sampling technique. One school was assigned to the treatment group while the other was…

  19. Effects Of The Contingency For Homework Submission On Homework Submission And Quiz Performance In A College Course

    PubMed Central

    2005-01-01

    Effects of the contingency for submission of homework assignments on the probability of assignment submission and on quiz grades were assessed in an undergraduate psychology course. Under an alternating treatments design, each student was assigned to a points condition for 5 of 10 quiz-related homework assignments corresponding to textbook chapters. Points were available for homework submission under this condition; points were not available under the no-points condition. The group-mean percentage of homework assignments submitted and quiz grades were higher for all chapters under the points condition than in the no-points condition. These findings, which were replicated in Experiment 2, demonstrate that homework submission was not maintained when the only consequences were instructor-provided feedback and expectation of improved quiz performance. PMID:15898476

  20. Relevance feedback for CBIR: a new approach based on probabilistic feature weighting with positive and negative examples.

    PubMed

    Kherfi, Mohammed Lamine; Ziou, Djemel

    2006-04-01

    In content-based image retrieval, understanding the user's needs is a challenging task that requires integrating him in the process of retrieval. Relevance feedback (RF) has proven to be an effective tool for taking the user's judgement into account. In this paper, we present a new RF framework based on a feature selection algorithm that nicely combines the advantages of a probabilistic formulation with those of using both the positive example (PE) and the negative example (NE). Through interaction with the user, our algorithm learns the importance he assigns to image features, and then applies the results obtained to define similarity measures that correspond better to his judgement. The use of the NE allows images undesired by the user to be discarded, thereby improving retrieval accuracy. As for the probabilistic formulation of the problem, it presents a multitude of advantages and opens the door to more modeling possibilities that achieve a good feature selection. It makes it possible to cluster the query data into classes, choose the probability law that best models each class, model missing data, and support queries with multiple PE and/or NE classes. The basic principle of our algorithm is to assign more importance to features with a high likelihood and those which distinguish well between PE classes and NE classes. The proposed algorithm was validated separately and in image retrieval context, and the experiments show that it performs a good feature selection and contributes to improving retrieval effectiveness.

  1. Probabilistic reasoning in data analysis.

    PubMed

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.

  2. optBINS: Optimal Binning for histograms

    NASA Astrophysics Data System (ADS)

    Knuth, Kevin H.

    2018-03-01

    optBINS (optimal binning) determines the optimal number of bins in a uniform bin-width histogram by deriving the posterior probability for the number of bins in a piecewise-constant density model after assigning a multinomial likelihood and a non-informative prior. The maximum of the posterior probability occurs at a point where the prior probability and the joint likelihood are balanced. The interplay between these opposing factors effectively implements Occam's razor by selecting the simplest model that best describes the data.
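
    A simplified reading of the description above (not the packaged optBINS code): evaluate the log posterior for each candidate number of bins and keep the maximum. The formula is the one commonly quoted from Knuth (2006), so treat the coefficients as an assumption; NumPy and SciPy are assumed available.

    ```python
    # Sketch of optimal-bin selection via a Knuth-style log posterior (illustrative).
    import numpy as np
    from scipy.special import gammaln

    def log_posterior_bins(data, m):
        n = len(data)
        counts, _ = np.histogram(data, bins=m)
        # Relative log posterior for a piecewise-constant density with m bins.
        return (n * np.log(m)
                + gammaln(m / 2.0)
                - m * gammaln(0.5)
                - gammaln(n + m / 2.0)
                + np.sum(gammaln(counts + 0.5)))

    def optimal_bins(data, max_bins=200):
        return max(range(1, max_bins + 1), key=lambda m: log_posterior_bins(data, m))

    rng = np.random.default_rng(0)
    sample = rng.normal(size=1000)     # hypothetical data
    print(optimal_bins(sample))
    ```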

  3. [Development of Markov models for economics evaluation of strategies on hepatitis B vaccination and population-based antiviral treatment in China].

    PubMed

    Yang, P C; Zhang, S X; Sun, P P; Cai, Y L; Lin, Y; Zou, Y H

    2017-07-10

    Objective: To construct Markov models that reflect the reality of prevention and treatment interventions against hepatitis B virus (HBV) infection, simulate the natural history of HBV infection in different age groups and provide evidence for the economic evaluations of hepatitis B vaccination and population-based antiviral treatment in China. Methods: According to the theory and techniques of Markov chains, the Markov models of the Chinese HBV epidemic were developed based on national data and related literature from home and abroad, including the settings of Markov model states, allowable transitions and initial and transition probabilities. The model construction, operation and verification were conducted using the software TreeAge Pro 2015. Results: Several types of Markov models were constructed to describe the disease progression of HBV infection in the neonatal period, perinatal period or adulthood, the progression of chronic hepatitis B after antiviral therapy, hepatitis B prevention and control in adults, chronic hepatitis B antiviral treatment and the natural progression of chronic hepatitis B in the general population. The model for the newborn period was fundamental and included ten states, i.e., susceptibility to HBV, HBsAg clearance, immune tolerance, immune clearance, low replication, HBeAg-negative CHB, compensated cirrhosis, decompensated cirrhosis, hepatocellular carcinoma (HCC) and death. The susceptible state to HBV was excluded in the perinatal period model, and the immune tolerance state was excluded in the adulthood model. The model for the general population included only two states, survival and death. Among the 5 types of models, there were 9 initial states assigned with initial probabilities, and 27 states for transition probabilities. The results of model verification showed that the probability curves were basically consistent with the HBV epidemic situation in China. Conclusion: The Markov models developed can be used in economic evaluations of hepatitis B vaccination and treatment for the elimination of HBV infection in China, although the model structures and parameters carry uncertainty and are dynamic in nature.
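
    The mechanics of such a cohort model reduce to repeated multiplication of a state-occupancy vector by a cycle-specific transition matrix; the sketch below uses a deliberately tiny, hypothetical three-state chain, not the states or parameters of the Chinese models described above.

    ```python
    # Toy Markov cohort trace (hypothetical states and transition probabilities,
    # far smaller than the ten-state hepatitis B models described above).
    import numpy as np

    states = ["chronic HBV", "cirrhosis", "death"]
    P = np.array([
        [0.95, 0.04, 0.01],   # annual transition probabilities from "chronic HBV"
        [0.00, 0.90, 0.10],   # from "cirrhosis"
        [0.00, 0.00, 1.00],   # "death" is absorbing
    ])

    occupancy = np.array([1.0, 0.0, 0.0])   # whole cohort starts in "chronic HBV"
    for year in range(1, 11):
        occupancy = occupancy @ P
        print(year, dict(zip(states, occupancy.round(3))))
    ```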

  4. Bayesics

    NASA Astrophysics Data System (ADS)

    Skilling, John

    2005-11-01

    This tutorial gives a basic overview of Bayesian methodology, from its axiomatic foundation through the conventional development of data analysis and model selection to its rôle in quantum mechanics, and ending with some comments on inference in general human affairs. The central theme is that probability calculus is the unique language within which we can develop models of our surroundings that have predictive capability. These models are patterns of belief; there is no need to claim external reality. 1. Logic and probability 2. Probability and inference 3. Probability and model selection 4. Prior probabilities 5. Probability and frequency 6. Probability and quantum mechanics 7. Probability and fundamentalism 8. Probability and deception 9. Prediction and truth

  5. Uncertainty analysis in fault tree models with dependent basic events.

    PubMed

    Pedroni, Nicola; Zio, Enrico

    2013-06-01

    In general, two types of dependence need to be considered when estimating the probability of the top event (TE) of a fault tree (FT): "objective" dependence between the (random) occurrences of different basic events (BEs) in the FT and "state-of-knowledge" (epistemic) dependence between estimates of the epistemically uncertain probabilities of some BEs of the FT model. In this article, we study the effects on the TE probability of objective and epistemic dependences. The well-known Fréchet bounds and the distribution envelope determination (DEnv) method are used to model all kinds of (possibly unknown) objective and epistemic dependences, respectively. For exemplification, the analyses are carried out on an FT with six BEs. Results show that both types of dependence significantly affect the TE probability; however, the effects of epistemic dependence are likely to be overwhelmed by those of objective dependence (if present). © 2012 Society for Risk Analysis.
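
    The Fréchet bounds referred to above give best- and worst-case gate probabilities when the dependence between basic events is unknown; a minimal two-event illustration (not the article's six-event case study) follows.

    ```python
    # Frechet bounds for two basic events with unknown objective dependence.
    def frechet_and(p_a, p_b):
        # Bounds on P(A and B) under arbitrary dependence.
        return max(0.0, p_a + p_b - 1.0), min(p_a, p_b)

    def frechet_or(p_a, p_b):
        # Bounds on P(A or B) under arbitrary dependence.
        return max(p_a, p_b), min(1.0, p_a + p_b)

    print(frechet_and(0.3, 0.2))   # (0.0, 0.2)
    print(frechet_or(0.3, 0.2))    # (0.3, 0.5)
    ```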

  6. Robotics and STEM Learning: Students' Achievements in Assignments According to the P3 Task Taxonomy--Practice, Problem Solving, and Projects

    ERIC Educational Resources Information Center

    Barak, Moshe; Assal, Muhammad

    2018-01-01

    This study presents the case of development and evaluation of a STEM-oriented 30-h robotics course for junior high school students (n = 32). Class activities were designed according to the P3 Task Taxonomy, which included: (1) practice-basic closed-ended tasks and exercises; (2) problem solving--small-scale open-ended assignments in which the…

  7. Suggestions for Teaching Mathematics Using Laboratory Approaches. 6. Probability. Experimental Edition.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of Elementary Curriculum Development.

    This guide is the sixth in a series of publications to assist teachers in using a laboratory approach to mathematics. Twenty activities on probability and statistics for the elementary grades are described in terms of purpose, materials needed, and procedures to be used. Objectives of these activities include basic probability concepts; gathering,…

  8. 5 CFR 575.507 - What is the maximum extended assignment incentive that may be paid for a period of service?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... greater of— (1) An amount equal to 25 percent of the annual rate of basic pay of the employee at the... periods equals 546 days, and 546 days divided by 365 days equals 1.50 years. ... rate employees who do not have a scheduled annual rate of basic pay, the annual rate in paragraph (a...

  9. A Universal Model for Evaluating Basic Electronic Courses in Terms of Field Utilization of Training.

    ERIC Educational Resources Information Center

    Air Force Occupational Measurement Center, Lackland AFB, TX.

    The main purpose of the Air Force project was to develop a universal model to evaluate usage of basic electronic principles training. The criterion used by the model to evaluate electronic theory training is a determination of the usefulness of the training vis-a-vis the performance of assigned tasks in the various electronic career fields. Data…

  10. A Comparison of Two Teaching Methodologies for a Course in Basic Reference. Final Report.

    ERIC Educational Resources Information Center

    Gothberg, Helen M.

    The purpose of the investigation was to develop and test an audio-tutorial program for a course in Basic Reference. The design of the investigation was a posttest-only-control group design with 63 students randomly assigned to either an audio-tutorial or a lecture group. Data were collected and analyzed using a t-test for two groups and four…

  11. Gender assignment in patients with disorder of sex development.

    PubMed

    Mendonca, Berenice B

    2014-12-01

    To examine the sex assignment in patients with atypical external genitalia, a particularly challenging situation, especially when the genital appearance is not compatible with the sex chromosome. The most important factors that influence sex assignment include the definite diagnosis, genital appearance, surgical options, potential for fertility, risks of gonadal malignancy and, finally, the perception of the patients and their parents. Full disclosure and complete involvement of the parents in making decisions concerning gender assignment and/or genital surgery must be part of the basic medical care for children with disorder of sex development. Patients with disorder of sex development should receive long-term care provided by multidisciplinary teams in centers of excellence with ample experience in the management of this disorder.

  12. A novel method of fuzzy fault tree analysis combined with VB program to identify and assess the risk of coal dust explosions

    PubMed Central

    Li, Jia; Wang, Deming; Huang, Zonghou

    2017-01-01

    Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents. PMID:28793348

  13. A novel method of fuzzy fault tree analysis combined with VB program to identify and assess the risk of coal dust explosions.

    PubMed

    Wang, Hetang; Li, Jia; Wang, Deming; Huang, Zonghou

    2017-01-01

    Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents.

  14. A Multidisciplinary Approach for Teaching Statistics and Probability

    ERIC Educational Resources Information Center

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  15. Speed in Information Processing with a Computer Driven Visual Display in a Real-time Digital Simulation. M.S. Thesis - Virginia Polytechnic Inst.

    NASA Technical Reports Server (NTRS)

    Kyle, R. G.

    1972-01-01

    Information transfer between the operator and computer-generated display systems is an area where the human factors engineer discovers little useful design data relating human performance to system effectiveness. This study utilized a computer-driven, cathode-ray-tube graphic display to quantify human response speed in a sequential information processing task. The performance criterion was response time to sixteen cell elements of a square matrix display. A stimulus signal instruction specified selected cell locations by both row and column identification. An equally probable number code, from one to four, was assigned at random to the sixteen cells of the matrix and correspondingly required one of four matched keyed-response alternatives. The display format corresponded to a sequence of diagnostic system maintenance events that enabled the operator to verify prime system status, engage backup redundancy for failed subsystem components, and exercise alternate decision-making judgements. The experimental task bypassed the skilled decision-making element and computer processing time, in order to determine a lower bound on the basic response speed for the given stimulus/response hardware arrangement.

  16. Sex determination of Pohnpei Micronesian kingfishers using morphological and molecular genetic techniques

    USGS Publications Warehouse

    Kesler, Dylan C.; Lopes, I.F.; Haig, Susan M.

    2006-01-01

    Conservation-oriented studies of Micronesian Kingfishers (Todiramphus cinnamominus) have been hindered by a lack of basic natural history information, despite the status of the Guam subspecies (T. c. cinnamominus) as one of the most endangered species in the world. We used tissue samples and morphometric measures from museum specimens and wild-captured Pohnpei Micronesian Kingfishers (T. c. reichenbachii) to develop methods for sex determination. We present a modified molecular protocol and a discriminant function that yields the probability that a particular individual is male or female. Our results revealed that females were significantly larger than males, and the discriminant function correctly predicted sex in 73% (30/41) of the individuals. The sex of 86% (18/21) of individuals was correctly assigned when a moderate reliability threshold was set. Sex determination using molecular genetic techniques was more reliable than methods based on morphology. Our results will facilitate recovery efforts for the critically endangered Guam Micronesian Kingfisher and provide a basis for sex determination in the 11 other endangered congeners in the Pacific Basin.

  17. Quantum computational studies, spectroscopic (FT-IR, FT-Raman and UV-Vis) profiling, natural hybrid orbital and molecular docking analysis on 2,4 Dibromoaniline

    NASA Astrophysics Data System (ADS)

    Abraham, Christina Susan; Prasana, Johanan Christian; Muthu, S.; Rizwana B, Fathima; Raja, M.

    2018-05-01

    The research exploration comprises investigating the molecular structure, vibrational assignments, bonding and anti-bonding nature, and the nonlinear optical, electronic and thermodynamic nature of the molecule. The research is conducted at two levels: the first level employs the spectroscopic characterization techniques FT-IR, FT-Raman and UV-Vis; at the second level the experimentally obtained data are analyzed through theoretical methods using Density Functional Theory, which involves the basic principle of solving the Schrodinger equation for many-body systems. A comparison is drawn between the two levels and discussed. The probability of the title molecule being bio-active, indicated theoretically by the electrophilicity index, leads to further property analyses of the molecule. The target molecule is found to fit well with a centromere-associated protein inhibitor using molecular docking techniques. The higher basis set 6-311++G(d,p) is used to attain results closer to the experimental data. The results for the organic amine 2,4 Dibromoaniline are analyzed and discussed.

  18. Effect of platykurtic and leptokurtic distributions in the random-field Ising model: mean-field approach.

    PubMed

    Duarte Queirós, Sílvio M; Crokidakis, Nuno; Soares-Pinto, Diogo O

    2009-07-01

    The influence of the tail features of the local magnetic field probability density function (PDF) on the ferromagnetic Ising model is studied in the limit of infinite range interactions. Specifically, we assign to each site a quenched random field whose value is in accordance with a generic distribution that covers platykurtic and leptokurtic cases depending on a single parameter tau<3. For tau<5/3, such distributions, which are basically the Student-t and r-distributions extended to all plausible real degrees of freedom, have a finite standard deviation; otherwise the distribution has the same asymptotic power-law behavior as an alpha-stable Lévy distribution with alpha=(3-tau)/(tau-1). For every value of tau, at a specific temperature and width of the distribution, the system undergoes a continuous phase transition. Strikingly, we report the emergence of an inflexion point in the temperature-PDF width phase diagrams for distributions broader than the Cauchy-Lorentz (tau=2), which is accompanied by a divergent free energy per spin (at zero temperature).

  19. A Self-Adaptive Dynamic Recognition Model for Fatigue Driving Based on Multi-Source Information and Two Levels of Fusion

    PubMed Central

    Sun, Wei; Zhang, Xiaorui; Peeta, Srinivas; He, Xiaozheng; Li, Yongfu; Zhu, Senlai

    2015-01-01

    To improve the effectiveness and robustness of fatigue driving recognition, a self-adaptive dynamic recognition model is proposed that incorporates information from multiple sources and involves two sequential levels of fusion, constructed at the feature level and the decision level. Compared with existing models, the proposed model introduces a dynamic basic probability assignment (BPA) to the decision-level fusion such that the weight of each feature source can change dynamically with the real-time fatigue feature measurements. Further, the proposed model can combine the fatigue state at the previous time step in the decision-level fusion to improve the robustness of the fatigue driving recognition. An improved correction strategy of the BPA is also proposed to accommodate the decision conflict caused by external disturbances. Results from field experiments demonstrate that the effectiveness and robustness of the proposed model are better than those of models based on a single fatigue feature and/or single-source information fusion, especially when the most effective fatigue features are used in the proposed model. PMID:26393615
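
    The basic probability assignments (BPAs) fused at the decision level in such models are usually combined with Dempster's rule; the sketch below shows that rule for two sources over a small frame of discernment. The masses are hypothetical and this is not the paper's dynamic weighting or correction strategy.

    ```python
    # Dempster's rule of combination for two basic probability assignments (BPAs).
    # Focal elements are frozensets over the frame {"fatigued", "alert"}.
    def dempster_combine(m1, m2):
        combined, conflict = {}, 0.0
        for a, w1 in m1.items():
            for b, w2 in m2.items():
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + w1 * w2
                else:
                    conflict += w1 * w2
        if conflict >= 1.0:
            raise ValueError("total conflict: BPAs cannot be combined")
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    F, A = frozenset({"fatigued"}), frozenset({"alert"})
    theta = F | A                              # full frame (ignorance)
    m_eyelid = {F: 0.6, A: 0.2, theta: 0.2}    # hypothetical per-feature BPAs
    m_steering = {F: 0.5, A: 0.3, theta: 0.2}
    print(dempster_combine(m_eyelid, m_steering))
    ```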

  20. Value assignment and uncertainty evaluation for single-element reference solutions

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Bodnar, Olha; Butler, Therese A.; Molloy, John L.; Winchester, Michael R.

    2018-06-01

    A Bayesian statistical procedure is proposed for value assignment and uncertainty evaluation for the mass fraction of the elemental analytes in single-element solutions distributed as NIST standard reference materials. The principal novelty that we describe is the use of information about relative differences observed historically between the measured values obtained via gravimetry and via high-performance inductively coupled plasma optical emission spectrometry, to quantify the uncertainty component attributable to between-method differences. This information is encapsulated in a prior probability distribution for the between-method uncertainty component, and it is then used, together with the information provided by current measurement data, to produce a probability distribution for the value of the measurand from which an estimate and evaluation of uncertainty are extracted using established statistical procedures.

  1. Lognormal Approximations of Fault Tree Uncertainty Distributions.

    PubMed

    El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P

    2018-01-26

    Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions: therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
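
    As a concrete point of comparison for the closed-form approach, the Monte Carlo baseline described above can be sketched in a few lines. The small fault tree, the median probabilities and the error factors below are invented for illustration; the paper's models and its lognormal approximation formula are not reproduced here.

      # Illustrative Monte Carlo propagation of lognormal basic-event uncertainties
      # through a small fault tree: TOP = (A AND B) OR C.
      import numpy as np

      rng = np.random.default_rng(1)
      N = 100_000

      def lognormal_from_median_ef(median, error_factor, size):
          """Sample a lognormal given its median and 95th-percentile error factor."""
          sigma = np.log(error_factor) / 1.645
          return rng.lognormal(mean=np.log(median), sigma=sigma, size=size)

      pA = lognormal_from_median_ef(1e-3, 3.0, N)    # assumed basic-event parameters
      pB = lognormal_from_median_ef(5e-3, 5.0, N)
      pC = lognormal_from_median_ef(1e-4, 10.0, N)

      # Rare-event approximation of the top-event probability for each sample.
      p_top = pA * pB + pC

      print("mean  :", p_top.mean())
      print("median:", np.median(p_top))
      print("95th  :", np.percentile(p_top, 95))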

  2. Architectures and protocols for an integrated satellite-terrestrial mobile system

    NASA Technical Reports Server (NTRS)

    Delre, E.; Dellipriscoli, F.; Iannucci, P.; Menolascino, R.; Settimo, F.

    1993-01-01

    This paper aims to depict some basic concepts related to the definition of an integrated system for mobile communications, consisting of a satellite network and a terrestrial cellular network. In particular three aspects are discussed: (1) architecture definition for the satellite network; (2) assignment strategy of the satellite channels; and (3) definition of 'internetworking procedures' between cellular and satellite network, according to the selected architecture and the satellite channel assignment strategy.

  3. Patterns and determinants of use of pharmacological therapies for intermittent claudication in PAD outpatients: results of the IDOMENEO study.

    PubMed

    Cimminiello, Claudio; Polo Friz, Hernan; Marano, Giuseppe; Arpaia, Guido; Boracchi, Patrizia; Spezzigu, Gabriella; Visonà, Adriana

    2017-06-01

    Peripheral arterial disease (PAD) usually presents with intermittent claudication (IC). The aim of the present study was to assess, in clinical practice, the pattern of use of pharmacological therapies for IC in stable PAD outpatients. A propensity analysis was performed using data from the IDOMENEO study, an observational prospective multicenter cohort study. The association between any pharmacological symptomatic IC therapy and different variables was investigated using generalized linear mixed models with pharmacological therapy as the response variable and binomial error. Study population: 213 patients, male sex 147 (69.0%), mean age 70.0±8.6 years. Only 36.6% were under pharmacological treatment for IC, with cilostazol being the most used medication (21.6%). Univariate analysis showed that the probability of a patient being assigned to any pharmacological symptomatic IC therapy was 67.0% when the Ankle-Brachial Index (ABI) was <0.6 and 29.8% when ABI was >0.6 (P=0.0048), and a propensity to avoid pharmacological treatment for patients with a high number of drugs to treat cardiovascular risk factors (probability of 55.2% for <4 drugs and 19.6% for >4 drugs, P=0.0317). Multivariate analysis confirmed a higher probability of assigning treatment for ABI<0.6 (P=0.0274), and a trend toward a lower probability in patients under polypharmacy (>4 drugs: OR=0.13, P=0.0546). In clinical practice, only one third of stable outpatients with IC used symptomatic pharmacological therapy for IC. We found a propensity of clinicians to assign any symptomatic pharmacological IC therapy to patients with lower values of ABI and a propensity to avoid this kind of treatment in patients under polypharmacy.

  4. Multiple-solution problems in a statistics classroom: an example

    NASA Astrophysics Data System (ADS)

    Chu, Chi Wing; Chan, Kevin L. T.; Chan, Wai-Sum; Kwong, Koon-Shing

    2017-11-01

    The mathematics education literature shows that encouraging students to develop multiple solutions for given problems has a positive effect on students' understanding and creativity. In this paper, we present an example of multiple-solution problems in statistics involving a set of non-traditional dice. In particular, we consider the exact probability mass function for the sum of face values. Four different ways of solving the problem are discussed. The solutions span various basic concepts in different mathematical disciplines (sample space in probability theory, the probability generating function in statistics, integer partition in basic combinatorics, and the individual risk model in actuarial science) and thus promote upper undergraduate students' awareness of knowledge connections between their courses. All solutions of the example are implemented using the R statistical software package.
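
    As an illustration of the generating-function/convolution route mentioned above (the paper implements its solutions in R; a Python sketch is shown here for consistency with the other examples in this compilation), the exact PMF of the sum is obtained by convolving the per-die face distributions. The dice below are made-up non-traditional dice, not the paper's set.

      # Exact PMF of the sum of two non-traditional dice via convolution of the
      # per-die distributions (equivalent to multiplying probability generating
      # functions). Face values are placeholders for illustration.
      import numpy as np

      dice = [
          [1, 2, 2, 3, 3, 4],   # hypothetical non-traditional die
          [1, 3, 4, 5, 6, 8],   # hypothetical non-traditional die
      ]

      pmf = np.array([1.0])          # PMF of an empty sum, supported on {0}
      for faces in dice:
          die_pmf = np.zeros(max(faces) + 1)
          for f in faces:
              die_pmf[f] += 1 / len(faces)
          pmf = np.convolve(pmf, die_pmf)

      for total, p in enumerate(pmf):
          if p > 0:
              print(f"P(sum = {total}) = {p:.4f}")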

  5. VizieR Online Data Catalog: Proper motions of PM2000 open clusters (Krone-Martins+, 2010)

    NASA Astrophysics Data System (ADS)

    Krone-Martins, A.; Soubiran, C.; Ducourant, C.; Teixeira, R.; Le Campion, J. F.

    2010-04-01

    We present lists of proper-motions and kinematic membership probabilities in the region of 49 open clusters or possible open clusters. The stellar proper motions were taken from the Bordeaux PM2000 catalogue. The segregation between cluster and field stars and the assignment of membership probabilities was accomplished by applying a fully automated method based on parametrisations for the probability distribution functions and genetic algorithm optimisation heuristics associated with a derivative-based hill climbing algorithm for the likelihood optimization. (3 data files).

  6. A Gleason-Type Theorem for Any Dimension Based on a Gambling Formulation of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Benavoli, Alessio; Facchini, Alessandro; Zaffalon, Marco

    2017-07-01

    Based on a gambling formulation of quantum mechanics, we derive a Gleason-type theorem that holds for any dimension n of a quantum system, and in particular for n=2. The theorem states that the only logically consistent probability assignments are exactly the ones that are definable as the trace of the product of a projector and a density matrix operator. In addition, we detail the reason why dispersion-free probabilities are actually not valid, or rational, probabilities for quantum mechanics, and hence should be excluded from consideration.

  7. Polarimetric phenomenology in the reflective regime: a case study using polarized hyperspectral data

    NASA Astrophysics Data System (ADS)

    Gibney, Mark

    2016-05-01

    Understanding the phenomenology of polarimetric data is necessary if we want to obtain the maximum benefit when we exploit that data. To first order, polarimetric phenomenology is driven by two things: the target material type (specular or diffuse) and the illuminating source (point (sun) or extended (body emission)). Polarimetric phenomenology can then be broken into three basic categories ([specular material/sun source], [diffuse/sun], [specular/body]), where we have assigned body emission to the IR passband, where materials are generally specular. The task of interest determines the category of interest, since the task determines the dominant target material and the illuminating source (e.g., detecting diffuse targets under trees in VNIR = [diffuse/sun] category). In this paper, a specific case study for the important [diffuse/sun] category will be presented. For the reflective regime (0.3-3.0 um), the largest polarimetric signal is obtained when the sun illuminates a significant portion of the material BRDF lobe. This naturally points us to problems whose primary target materials are diffuse, since the BRDF lobe for specular materials is tiny (low probability of acquiring on the BRDF lobe) and glinty (high probability of saturating the sensor when on lobe). In this case study, we investigated signatures of solar-illuminated diffuse paints acquired by a polarimetric hyperspectral sensor. We will discuss the acquisition, reduction and exploitation of that data, and use it to illustrate the primary characteristics of reflective polarimetric phenomenology.

  8. Pigeons, Facebook and the Birthday Problem

    ERIC Educational Resources Information Center

    Russell, Matthew

    2013-01-01

    The unexpectedness of the birthday problem has long been used by teachers of statistics in discussing basic probability calculation. An activity is described that engages students in understanding probability and sampling using the popular Facebook social networking site. (Contains 2 figures and 1 table.)
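
    The calculation behind the activity is the classical one; a short Python version (an illustration, not part of the article) computes the probability that at least two of n people share a birthday, assuming 365 equally likely days.

      # Classical birthday-problem calculation: probability of at least one shared
      # birthday among n people, ignoring leap years.
      def p_shared_birthday(n, days=365):
          p_all_distinct = 1.0
          for k in range(n):
              p_all_distinct *= (days - k) / days
          return 1.0 - p_all_distinct

      for n in (10, 23, 50):
          print(n, round(p_shared_birthday(n), 3))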

  9. 20 CFR 638.502 - Job Corps basic education program.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... as a second language (ESL) programs for selected center operators (regional offices shall arrange for the assignment of selected applicants needing ESL programs to the centers where such programs are...

  10. Stereospecific assignment of the asparagine and glutamine sidechain amide protons in proteins from chemical shift analysis.

    PubMed

    Harsch, Tobias; Schneider, Philipp; Kieninger, Bärbel; Donaubauer, Harald; Kalbitzer, Hans Robert

    2017-02-01

    Side chain amide protons of asparagine and glutamine residues in random-coil peptides are characterized by large chemical shift differences and can be stereospecifically assigned on the basis of their chemical shift values only. The bimodal chemical shift distributions stored in the Biological Magnetic Resonance Data Bank (BMRB) do not allow such an assignment. However, an analysis of the BMRB shows that a substantial part of all stored stereospecific assignments is not correct. We show here that in most cases stereospecific assignment can also be done for folded proteins using an unbiased artificial chemical shift database (UACSB). For a separation of the chemical shifts of the two amide resonance lines with differences ≥0.40 ppm for asparagine and differences ≥0.42 ppm for glutamine, the downfield-shifted resonance lines can be assigned to Hδ21 and Hε21, respectively, at a confidence level >95%. A classifier derived from UACSB can also be used to correct the BMRB data. The program tool AssignmentChecker implemented in AUREMOL calculates the Bayesian probability for a given stereospecific assignment and automatically corrects the assignments for a given list of chemical shifts.
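
    A hedged Python sketch of the separation rule quoted above: if the two side-chain amide shifts differ by at least the residue-specific cutoff, the downfield line is assigned to Hδ21 (Asn) or Hε21 (Gln). The function name and the example shift values are illustrative, and the sketch omits the Bayesian probability computed by AssignmentChecker.

      # Stereospecific assignment from the chemical shift separation alone.
      SEPARATION_CUTOFF = {"ASN": 0.40, "GLN": 0.42}   # ppm, from the abstract

      def stereo_assign(residue, shift_a, shift_b):
          """Assign the two side-chain amide proton shifts (ppm) stereospecifically."""
          residue = residue.upper()
          if abs(shift_a - shift_b) < SEPARATION_CUTOFF[residue]:
              return None                      # separation too small: leave unassigned
          downfield, upfield = max(shift_a, shift_b), min(shift_a, shift_b)
          if residue == "ASN":
              return {"Hd21": downfield, "Hd22": upfield}
          return {"He21": downfield, "He22": upfield}

      print(stereo_assign("ASN", 7.61, 6.93))   # assigned: separation 0.68 ppm
      print(stereo_assign("GLN", 7.10, 6.82))   # None: separation 0.28 ppm < 0.42 ppm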

  11. A brief introduction to probability.

    PubMed

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", which is a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.

  12. Principles of project management

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.

  13. A Fuzzy Goal Programming for a Multi-Depot Distribution Problem

    NASA Astrophysics Data System (ADS)

    Nunkaew, Wuttinan; Phruksaphanrat, Busaba

    2010-10-01

    A fuzzy goal programming model for solving a Multi-Depot Distribution Problem (MDDP) is proposed in this research. The proposed model is applied in the first step of the Assignment First-Routing Second (AFRS) approach. In practice, a basic transportation model is usually chosen to solve this kind of problem in the assignment step, after which the Vehicle Routing Problem (VRP) model is used to compute the delivery cost in the routing step. However, the basic transportation model considers only the depot-to-customer relationship. The customer-to-customer relationship should also be considered, since it exists in the routing step. Both relationships are handled using Preemptive Fuzzy Goal Programming (P-FGP). The first fuzzy goal is set by the total transportation cost and the second fuzzy goal is set by a satisfactory level of the overall independence value. A case study is used to illustrate the effectiveness of the proposed model. Results from the proposed model are compared with those of the basic transportation model that had previously been used in the case-study company. The proposed model reduces the actual delivery cost in the routing step owing to the better result in the assignment step. Defining fuzzy goals by membership functions is more realistic than using crisp goals. Furthermore, the flexibility to adjust goals and the satisfaction level acceptable to the decision maker can be increased, and an optimal solution can still be obtained.

  14. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  15. Case complexity scores in congenital heart surgery: a comparative study of the Aristotle Basic Complexity score and the Risk Adjustment in Congenital Heart Surgery (RACHS-1) system.

    PubMed

    Al-Radi, Osman O; Harrell, Frank E; Caldarone, Christopher A; McCrindle, Brian W; Jacobs, Jeffrey P; Williams, M Gail; Van Arsdell, Glen S; Williams, William G

    2007-04-01

    The Aristotle Basic Complexity score and the Risk Adjustment in Congenital Heart Surgery system were developed by consensus to compare outcomes of congenital cardiac surgery. We compared the predictive value of the 2 systems. Of all index congenital cardiac operations at our institution from 1982 to 2004 (n = 13,675), we were able to assign an Aristotle Basic Complexity score, a Risk Adjustment in Congenital Heart Surgery score, and both scores to 13,138 (96%), 11,533 (84%), and 11,438 (84%) operations, respectively. Models of in-hospital mortality and length of stay were generated for Aristotle Basic Complexity and Risk Adjustment in Congenital Heart Surgery using an identical data set in which both Aristotle Basic Complexity and Risk Adjustment in Congenital Heart Surgery scores were assigned. The likelihood ratio test for nested models and paired concordance statistics were used. After adjustment for year of operation, the odds ratios for Aristotle Basic Complexity score 3 versus 6, 9 versus 6, 12 versus 6, and 15 versus 6 were 0.29, 2.22, 7.62, and 26.54 (P < .0001). Similarly, odds ratios for Risk Adjustment in Congenital Heart Surgery categories 1 versus 2, 3 versus 2, 4 versus 2, and 5/6 versus 2 were 0.23, 1.98, 5.80, and 20.71 (P < .0001). Risk Adjustment in Congenital Heart Surgery added significant predictive value over Aristotle Basic Complexity (likelihood ratio chi2 = 162, P < .0001), whereas Aristotle Basic Complexity contributed much less predictive value over Risk Adjustment in Congenital Heart Surgery (likelihood ratio chi2 = 13.4, P = .009). Neither system fully adjusted for the child's age. The Risk Adjustment in Congenital Heart Surgery scores were more concordant with length of stay compared with Aristotle Basic Complexity scores (P < .0001). The predictive value of Risk Adjustment in Congenital Heart Surgery is higher than that of Aristotle Basic Complexity. The use of Aristotle Basic Complexity or Risk Adjustment in Congenital Heart Surgery as risk stratification and trending tools to monitor outcomes over time and to guide risk-adjusted comparisons may be valuable.

  16. Probability of Accurate Heart Failure Diagnosis and the Implications for Hospital Readmissions.

    PubMed

    Carey, Sandra A; Bass, Kyle; Saracino, Giovanna; East, Cara A; Felius, Joost; Grayburn, Paul A; Vallabhan, Ravi C; Hall, Shelley A

    2017-04-01

    Heart failure (HF) is a complex syndrome with inherent diagnostic challenges. We studied the scope of possibly inaccurately documented HF in a large health care system among patients assigned a primary diagnosis of HF at discharge. Through a retrospective record review and a classification schema developed from published guidelines, we assessed the probability of the documented HF diagnosis being accurate and determined factors associated with HF-related and non-HF-related hospital readmissions. An arbitration committee of 3 experts reviewed a subset of records to corroborate the results. We assigned a low probability of accurate diagnosis to 133 (19%) of the 712 patients. A subset of patients were also reviewed by an expert panel, which concluded that 13% to 35% of patients probably did not have HF (inter-rater agreement, kappa = 0.35). Low-probability HF was predictive of being readmitted more frequently for non-HF causes (p = 0.018), as well as documented arrhythmias (p = 0.023), and age >60 years (p = 0.006). Documented sleep apnea (p = 0.035), percutaneous coronary intervention (p = 0.006), non-white race (p = 0.047), and B-type natriuretic peptide >400 pg/ml (p = 0.007) were determined to be predictive of HF readmissions in this cohort. In conclusion, approximately 1 in 5 patients documented to have HF were found to have a low probability of actually having it. Moreover, the determination of low-probability HF was twice as likely to result in readmission for non-HF causes and, thus, should be considered a determinant for all-cause readmissions in this population. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Dental age estimation: the role of probability estimates at the 10 year threshold.

    PubMed

    Lucas, Victoria S; McDonald, Fraser; Neil, Monica; Roberts, Graham

    2014-08-01

    The use of probability at the 18 year threshold has simplified the reporting of dental age estimates for emerging adults. The availability of simple-to-use, widely available software has enabled the development of the probability threshold for individual teeth in growing children. Tooth development stage data from a previous study at the 10 year threshold were reused to estimate the probability of developing teeth being above or below the 10 year threshold using the NORMDIST function in Microsoft Excel. The probabilities within an individual subject are averaged to give a single probability that a subject is above or below 10 years old. To test the validity of this approach, dental panoramic radiographs of 50 female and 50 male children within 2 years of the chronological age were assessed with the chronological age masked. Once the whole validation set of 100 radiographs had been assessed, the masking was removed and the chronological age and dental age compared. The dental age was compared with chronological age to determine whether the dental age correctly or incorrectly identified a validation subject as above or below the 10 year threshold. The probability estimates correctly identified children as above or below on 94% of occasions. Only 2% of the validation group with a chronological age of less than 10 years were assigned to the over 10 year group. This study indicates the very high accuracy of assignment at the 10 year threshold. Further work at other legally important age thresholds is needed to explore the value of this approach to the technique of age estimation. Copyright © 2014. Published by Elsevier Ltd.
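
    The thresholding step described above can be sketched in Python: for each scored tooth, the probability of the subject being over 10 years is taken from a normal model of age-at-stage (the paper uses Excel's NORMDIST, mirrored here by the normal CDF), and the per-tooth probabilities are averaged within the subject. The stage means and standard deviations below are invented for illustration; the paper reuses reference data from a previous study.

      # Average per-tooth probabilities of being above the 10-year threshold.
      from statistics import NormalDist

      THRESHOLD = 10.0  # years

      # Hypothetical (mean age, SD) for the observed development stage of each tooth.
      observed_stages = [(9.4, 0.8), (10.3, 0.9), (9.9, 0.7), (10.8, 1.1)]

      p_over_10 = [1 - NormalDist(mu, sd).cdf(THRESHOLD) for mu, sd in observed_stages]
      subject_probability = sum(p_over_10) / len(p_over_10)

      print(f"P(subject older than 10 years) = {subject_probability:.2f}")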

  18. Brick tunnel randomization and the momentum of the probability mass.

    PubMed

    Kuznetsova, Olga M

    2015-12-30

    The allocation space of an unequal-allocation permuted block randomization can be quite wide. The development of unequal-allocation procedures with a narrower allocation space, however, is complicated by the need to preserve the unconditional allocation ratio at every step (the allocation ratio preserving (ARP) property). When the allocation paths are depicted on the K-dimensional unitary grid, where allocation to the l-th treatment is represented by a step along the l-th axis, l = 1 to K, the ARP property can be expressed in terms of the center of the probability mass after i allocations. Specifically, for an ARP allocation procedure that randomizes subjects to K treatment groups in w1:⋯:wK ratio, w1+⋯+wK=1, the coordinates of the center of the mass are (w1·i, …, wK·i). In this paper, the momentum with respect to the center of the probability mass (expected imbalance in treatment assignments) is used to compare ARP procedures in how closely they approximate the target allocation ratio. It is shown that the two-arm and three-arm brick tunnel randomizations (BTR) are the ARP allocation procedures with the tightest allocation space among all allocation procedures with the same allocation ratio; the two-arm BTR is the minimum-momentum two-arm ARP allocation procedure. Resident probabilities of two-arm and three-arm BTR are analytically derived from the coordinates of the center of the probability mass; the existence of the respective transition probabilities is proven. Probability of deterministic assignments with BTR is found generally acceptable. Copyright © 2015 John Wiley & Sons, Ltd.

  19. Interpretation of diagnostic data: 6. How to do it with more complex maths.

    PubMed

    1983-11-15

    We have now shown you how to use decision analysis in making those rare, tough diagnostic decisions that are not soluble through other, easier routes. In summary, to "use more complex maths" the following steps will be useful: Create a decision tree or map of all the pertinent courses of action and their consequences. Assign probabilities to the branches of each chance node. Assign utilities to each of the potential outcomes shown on the decision tree. Combine the probabilities and utilities for each node on the decision tree. Pick the decision that leads to the highest expected utility. Test your decision for its sensitivity to clinically sensible changes in probabilities and utilities. That concludes this series of clinical epidemiology rounds. You've come a long way from "doing it with pictures" and are now able to extract most of the diagnostic information that can be provided from signs, symptoms and laboratory investigations. We would appreciate learning whether you have found this series useful and how we can do a better job of presenting these and other elements of "the science of the art of medicine".
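
    The steps listed above map directly onto a small calculation: combine probabilities and utilities at each chance node and pick the option with the highest expected utility. The decision names, probabilities and utilities in this Python sketch are invented for illustration, not taken from the article.

      # Fold back a tiny decision tree: expected utility at each chance node,
      # then choose the action with the highest expected utility.
      decision_tree = {
          "treat":      [(0.80, 0.90), (0.20, 0.40)],   # (probability, utility) pairs
          "test_first": [(0.70, 0.95), (0.30, 0.55)],
          "observe":    [(0.50, 1.00), (0.50, 0.30)],
      }

      def expected_utility(outcomes):
          return sum(p * u for p, u in outcomes)

      best = max(decision_tree, key=lambda a: expected_utility(decision_tree[a]))
      for action, outcomes in decision_tree.items():
          print(f"{action:>10}: EU = {expected_utility(outcomes):.3f}")
      print("choose:", best)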

  20. Morality Principles for Risk Modelling: Needs and Links with the Origins of Plausible Inference

    NASA Astrophysics Data System (ADS)

    Solana-Ortega, Alberto; Solana, Vicente

    2009-12-01

    In comparison with the foundations of probability calculus, the inescapable and controversial issue of how to assign probabilities has only recently become a matter of formal study. The introduction of information as a technical concept was a milestone, but the most promising entropic assignment methods still face unsolved difficulties, manifesting the incompleteness of plausible inference theory. In this paper we examine the situation faced by risk analysts in the critical field of extreme events modelling, where the former difficulties are especially visible, due to scarcity of observational data, the large impact of these phenomena and the obligation to assume professional responsibilities. To respond to the claim for a sound framework to deal with extremes, we propose a metafoundational approach to inference, based on a canon of extramathematical requirements. We highlight their strong moral content, and show how this emphasis in morality, far from being new, is connected with the historic origins of plausible inference. Special attention is paid to the contributions of Caramuel, a contemporary of Pascal, unfortunately ignored in the usual mathematical accounts of probability.

  1. A new routing enhancement scheme based on node blocking state advertisement in wavelength-routed WDM networks

    NASA Astrophysics Data System (ADS)

    Hu, Peigang; Jin, Yaohui; Zhang, Chunlei; He, Hao; Hu, WeiSheng

    2005-02-01

    The increasing switching capacity brings considerable complexity to the optical node. Due to limitations in cost and technology, an optical node is often designed with partial switching capability and partial resource sharing. This means that the node is blocking to some extent; examples include the multi-granularity switching node, which in fact is a structure that uses pass-through wavelengths to reduce the dimension of the OXC, and the OXC with partially shared wavelength converters (WCs). It is conceivable that these blocking nodes will have great effects on the problem of routing and wavelength assignment. Some previous works studied the blocking case of the partial-WC OXC using complicated wavelength assignment algorithms, but the complexity of these schemes makes them impractical in real networks. In this paper, we propose a new scheme based on node blocking state advertisement to reduce the retry or rerouting probability and improve the efficiency of routing in networks with blocking nodes. In the scheme, node blocking states are advertised to the other nodes in the network, which are then used in subsequent route calculations to find a path with the lowest blocking probability. The performance of the scheme is evaluated using a discrete event model on the 14-node NSFNET, all nodes of which employ a kind of partially shared WC OXC structure. In the simulation, a simple First-Fit wavelength assignment algorithm is used. The simulation results demonstrate that the new scheme considerably reduces the retry or rerouting probability in the routing process.

  2. General characteristics of preliminary data processing in the Copernicus experiment

    NASA Technical Reports Server (NTRS)

    Ziolkovski, K.; Kossatski, K.

    1975-01-01

    Data from the 'Copernicus' experiment is processed in four stages: setting up of basic arrays, data calibration, graphical display of results, and assignment of results to navigation parameters. Each stage is briefly discussed.

  3. On-Orbit Collision Hazard Analysis in Low Earth Orbit Using the Poisson Probability Distribution (Version 1.0)

    DOT National Transportation Integrated Search

    1992-08-26

    This document provides the basic information needed to estimate a general probability of collision in Low Earth Orbit (LEO). Although the method described in this primer is a first order approximation, its results are reasonable. Furthermore, t...
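
    The first-order estimate such primers describe is typically the Poisson one: if lam is the expected number of collisions over the mission (flux x cross-sectional area x duration), then P(at least one collision) = 1 - exp(-lam). The numbers in this Python sketch are placeholders, not values from the document.

      # First-order Poisson estimate of on-orbit collision hazard.
      import math

      flux = 1e-5        # collisions per m^2 per year (assumed debris flux)
      area = 20.0        # spacecraft cross-sectional area, m^2 (assumed)
      years = 7.0        # mission duration (assumed)

      lam = flux * area * years
      p_collision = 1.0 - math.exp(-lam)
      print(f"expected collisions = {lam:.2e}, P(>=1 collision) = {p_collision:.2e}")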

  4. An Application of Probability to Combinatorics: A Proof of Vandermonde Identity

    ERIC Educational Resources Information Center

    Paolillo, Bonaventura; Rizzo, Piermichele; Vincenzi, Giovanni

    2017-01-01

    In this paper, we give possible suggestions for a classroom lesson about an application of probability using basic mathematical notions. We will approach to some combinatoric results without using "induction", "polynomial identities" nor "generating functions", and will give a proof of the "Vandermonde…

  5. The Notions of Chance and Probabilities in Preschoolers

    ERIC Educational Resources Information Center

    Nikiforidou, Zoi; Pange, Jenny

    2010-01-01

    Chance, randomness and probability constitute statistical notions that are interrelated and characterize the logicomathematical thinking of children. Traditional theories support that probabilistic thinking evolves after the age of 7. However, recent research has underlined that children, as young as 4, may possess and develop basic notions,…

  6. A Comparative Study of the Reliability and Validity of the "Degrees of Reading Power" and the "Iowa Tests of Basic Skills."

    ERIC Educational Resources Information Center

    Hildebrand, Myrene; Hoover, H. D.

    This study compared the reliability and validity of two different measures of reading ability, the Degrees of Reading Power (DRP) and the Iowa Tests of Basic Skills (ITBS) Reading test and the ITBS Vocabulary test. The data consisted of scores of 377 grade 5 and grade 6 students on these tests, along with their assigned reading levels in the…

  7. Assignment of protonation states in proteins and ligands: combining pKa prediction with hydrogen bonding network optimization.

    PubMed

    Krieger, Elmar; Dunbrack, Roland L; Hooft, Rob W W; Krieger, Barbara

    2012-01-01

    Among the many applications of molecular modeling, drug design is probably the one with the highest demands on the accuracy of the underlying structures. During lead optimization, the position of every atom in the binding site should ideally be known with high precision to identify those chemical modifications that are most likely to increase drug affinity. Unfortunately, X-ray crystallography at common resolution yields an electron density map that is too coarse, since the chemical elements and their protonation states cannot be fully resolved. This chapter describes the steps required to fill in the missing knowledge, by devising an algorithm that can detect and resolve the ambiguities. First, the pKa values of acidic and basic groups are predicted. Second, their potential protonation states are determined, including all permutations (considering for example protons that can jump between the oxygens of a phosphate group). Third, those groups of atoms are identified that can adopt alternative but indistinguishable conformations with essentially the same electron density. Fourth, potential hydrogen bond donors and acceptors are located. Finally, all these data are combined in a single "configuration energy function," whose global minimum is found with the SCWRL algorithm, which employs dead-end elimination and graph theory. As a result, one obtains a complete model of the protein and its bound ligand, with ambiguous groups rotated to the best orientation and with protonation states assigned considering the current pH and the H-bonding network. An implementation of the algorithm has been available since 2008 as part of the YASARA modeling & simulation program.

  8. Frequency Assignments for HFDF Receivers in a Search and Rescue Network

    DTIC Science & Technology

    1990-03-01

    SAR problem where whether or not a signal is detected by RS or HFDF at the various stations is described by probabilities. Daskin assumes the... allows the problem to be formulated with a linear objective function (6:52-53). Daskin also developed a heuristic solution algorithm to solve this...

  9. Part-time sick leave as a treatment method for individuals with musculoskeletal disorders.

    PubMed

    Andrén, Daniela; Svensson, Mikael

    2012-09-01

    There is increasing evidence that staying active is an important part of a recovery process for individuals on sick leave due to musculoskeletal disorders (MSDs). It has been suggested that using part-time sick-leave rather than full-time sick leave will enhance the possibility of full recovery to the workforce, and several countries actively favor this policy. The aim of this paper is to examine if it is beneficial for individuals on sick leave due to MSDs to be on part-time sick leave compared to full-time sick leave. A sample of 1,170 employees from the RFV-LS (register) database of the Social Insurance Agency of Sweden is used. The effect of being on part-time sick leave compared to full-time sick leave is estimated for the probability of returning to work with full recovery of lost work capacity. A two-stage recursive bivariate probit model is used to deal with the endogeneity problem. The results indicate that employees assigned to part-time sick leave do recover to full work capacity with a higher probability than those assigned to full-time sick leave. The average treatment effect of part-time sick leave is 25 percentage points. Considering that part-time sick leave may also be less expensive than assigning individuals to full-time sick leave, this would imply efficiency improvements from assigning individuals, when possible, to part-time sick leave.

  10. Development of STS/Centaur failure probabilities liftoff to Centaur separation

    NASA Technical Reports Server (NTRS)

    Hudson, J. M.

    1982-01-01

    The results of an analysis to determine STS/Centaur catastrophic vehicle response probabilities for the phases of vehicle flight from STS liftoff to Centaur separation from the Orbiter are presented. The analysis considers only category one component failure modes as contributors to the vehicle response mode probabilities. The relevant component failure modes are grouped into one of fourteen categories of potential vehicle behavior. By assigning failure rates to each component, for each of its failure modes, the STS/Centaur vehicle response probabilities in each phase of flight can be calculated. The results of this study will be used in a DOE analysis to ascertain the hazard from carrying a nuclear payload on the STS.

  11. Severely Aggressive Children Receiving Stimulant Medication Versus Stimulant and Risperidone: 12-Month Follow-Up of the TOSCA Trial

    PubMed Central

    Gadow, Kenneth D.; Brown, Nicole V.; Arnold, L. Eugene; Buchan-Page, Kristin A.; Bukstein, Oscar G.; Butter, Eric; Farmer, Cristan A.; Findling, Robert L.; Kolko, David J.; Molina, Brooke S.G.; Rice, Robert R.; Schneider, Jayne; Aman, Michael G.

    2016-01-01

    Objective To evaluate 52-week clinical outcomes of children with co-occurring attention-deficit/hyperactivity disorder (ADHD), disruptive behavior disorder, and serious physical aggression who participated in a prospective, longitudinal study that began with a controlled, 9-week clinical trial comparing the relative efficacy of parent training + stimulant medication + placebo (Basic; n=84) versus parent training + stimulant + risperidone (Augmented; n=84). Method Almost two-thirds (n=108; 64%) of families in the 9-week study participated in Week 52 follow-ups (Basic, n=55; Augmented, n=53), and they were representative of the initial study sample. The assessment battery included caregiver and clinician ratings and laboratory tests. Results Only 43% of Augmented and 36% of Basic still adhered to their assigned regimen (not significant [ns]); 23% of Augmented and 11% of Basic were taking no medication (ns). Both randomized groups improved baseline to follow-up, but the three primary parent-reported behavioral outcomes showed no significant between-group differences. Exploratory analyses indicated Augmented (65%) was more likely (p=.02) to have a Clinical Global Impressions (CGI) severity score of 1-3 (normal to mildly ill) at follow-up than Basic (42%). Parents rated 45% of children as impaired often or very often from ADHD, noncompliant, or aggressive behavior. Augmented had elevated prolactin levels, and Basic decreased in weight over time. Findings were generally similar whether groups were defined by randomized assignment or follow-up treatment status. Conclusion Both treatment strategies were associated with clinical improvement at follow-up, and primary behavioral outcomes did not differ significantly. Many children evidenced lingering mental health concerns, suggesting the need for additional research into more effective interventions. PMID:27238065

  12. 41 CFR 102-79.10 - What basic assignment and utilization of space policy governs an Executive agency?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... must provide a quality workplace environment that supports program operations, preserves the value of... fitness facilities in the workplace when adequately justified. An Executive agency must promote maximum...

  13. Learning across the curriculum: connecting the pharmaceutical sciences to practice in the first professional year.

    PubMed

    Brown, Bethanne; Skau, Kenneth; Wall, Andrea

    2009-04-07

    To facilitate the student's ability to make the connection of the core foundational basic science courses to the practice of pharmacy. In 2000, 10 faculty members from basic science and practice courses created and implemented an integrated Patient Care Project for which students chose a volunteer patient and completed 15 different assignments. Evidence of student learning, such as grades and reflective comments, along with collected evaluative data, indicated an enhancement in students' perceived understanding of the connection between basic science and patient care. The Patient Care Project provided students an opportunity to apply knowledge gained in their first-year foundational courses to the care of a patient, solidifying their understanding of the connection between basic science and patient care.

  14. Some approaches to optimal cluster labeling of aerospace imagery

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B.

    1980-01-01

    Some approaches are presented to the problem of labeling clusters using information from a given set of labeled and unlabeled aerospace imagery patterns. The assignment of class labels to the clusters is formulated as the determination of the best assignment, over all possible ones, with respect to some criterion. Cluster labeling is also viewed as maximizing the probability of correct labeling through a likelihood function. Results of the application of these techniques in the processing of remotely sensed multispectral scanner imagery data are presented.

  15. Introduction to Python for CMF Authority Users

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pritchett-Sheats, Lori A.

    This talk is a very broad overview of Python that highlights key features in the language used in the Common Model Framework (CMF). I assume that the audience has some programming experience in a shell scripting language (C shell, Bash, PERL) or other high-level language (C/C++/Fortran). The talk will cover Python data types, classes (objects) and basic programming constructs. The talk concludes with slides describing how I developed the basic classes for a TITANS homework assignment.
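
    The talk's slides are not reproduced here, but the kind of material it covers (data types, a small class, and basic control flow) looks roughly like the following illustrative snippet; the class and attribute names are made up and are not the CMF classes.

      # Toy class illustrating the Python constructs the talk covers.
      class Reactor:
          """Small example of a class with string, numeric and list attributes."""

          def __init__(self, name, power_mw):
              self.name = name            # string data type
              self.power_mw = power_mw    # numeric data type
              self.history = []           # list data type

          def step(self, delta_mw):
              """Adjust power and record the change."""
              self.power_mw += delta_mw
              self.history.append(delta_mw)
              return self.power_mw

      r = Reactor("demo", 100.0)
      for change in (5.0, -2.5, 1.0):     # basic loop construct
          r.step(change)
      print(r.name, r.power_mw, r.history)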

  16. Handbook of Basic Atomic Spectroscopic Data

    National Institute of Standards and Technology Data Gateway

    SRD 108 Handbook of Basic Atomic Spectroscopic Data (Web, free access)   This handbook provides a selection of the most important and frequently used atomic spectroscopic data. The compilation includes data for the neutral and singly-ionized atoms of all elements hydrogen through einsteinium (Z = 1-99). The wavelengths, intensities, and spectrum assignments are given for each element, and the data for the approximately 12,000 lines of all elements are also collected into a single table.

  17. Entanglement-enhanced Neyman-Pearson target detection using quantum illumination

    NASA Astrophysics Data System (ADS)

    Zhuang, Quntao; Zhang, Zheshen; Shapiro, Jeffrey H.

    2017-08-01

    Quantum illumination (QI) provides entanglement-based target detection, in an entanglement-breaking environment, whose performance is significantly better than that of optimum classical-illumination target detection. QI's performance advantage was established in a Bayesian setting with the target presumed equally likely to be absent or present and error probability employed as the performance metric. Radar theory, however, eschews that Bayesian approach, preferring the Neyman-Pearson performance criterion to avoid the difficulties of accurately assigning prior probabilities to target absence and presence and appropriate costs to false-alarm and miss errors. We have recently reported an architecture, based on sum-frequency generation (SFG) and feedforward (FF) processing, for minimum error-probability QI target detection with arbitrary prior probabilities for target absence and presence. In this paper, we use our results for FF-SFG reception to determine the receiver operating characteristic (detection probability versus false-alarm probability) for optimum QI target detection under the Neyman-Pearson criterion.

  18. Integrating writing into an introductory environmental science curriculum: Perspectives from biology and physics

    NASA Astrophysics Data System (ADS)

    Selkin, P. A.; Cline, E. T.; Beaufort, A.

    2008-12-01

    In the University of Washington, Tacoma's Environmental Science program, we are implementing a curriculum-wide, scaffolded strategy to teach scientific writing. Writing in an introductory science course is a powerful means to make students feel part of the scientific community, an important goal in our environmental science curriculum. Writing is already an important component of the UW Tacoma environmental science program at the upper levels: our approach is designed to prepare students for the writing-intensive junior- and senior-level seminars. The approach is currently being tested in introductory biology and physics before it is incorporated in the rest of the introductory environmental science curriculum. The centerpiece of our approach is a set of research and writing assignments woven throughout the biology and physics course sequences. The assignments progress in their degree of complexity and freedom through the sequence of introductory science courses. Each assignment is supported by a number of worksheets and short written exercises designed to teach writing and critical thought skills. The worksheets are focused on skills identified both by research in science writing and the instructors' experience with student writing. Students see the assignments as a way to personalize their understanding of basic science concepts, and to think critically about ideas that interest them. We find that these assignments provide a good way to assess student comprehension of some of the more difficult ideas in the basic sciences, as well as a means to engage students with the challenging concepts of introductory science courses. Our experience designing these courses can inform efforts to integrate writing throughout a geoscience or environmental science curriculum, as opposed to on a course-by-course basis.

  19. Interleaved Training and Training-Based Transmission Design for Hybrid Massive Antenna Downlink

    NASA Astrophysics Data System (ADS)

    Zhang, Cheng; Jing, Yindi; Huang, Yongming; Yang, Luxi

    2018-06-01

    In this paper, we study the beam-based training design jointly with the transmission design for hybrid massive antenna single-user (SU) and multiple-user (MU) systems where outage probability is adopted as the performance measure. For SU systems, we propose an interleaved training design to concatenate the feedback and training procedures, thus making the training length adaptive to the channel realization. Exact analytical expressions are derived for the average training length and the outage probability of the proposed interleaved training. For MU systems, we propose a joint design for the beam-based interleaved training, beam assignment, and MU data transmissions. Two solutions for the beam assignment are provided with different complexity-performance tradeoff. Analytical results and simulations show that for both SU and MU systems, the proposed joint training and transmission designs achieve the same outage performance as the traditional full-training scheme but with significant saving in the training overhead.

  20. A human reliability based usability evaluation method for safety-critical software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, R. L.; Tran, T. Q.; Gertman, D. I.

    2006-07-01

    Boring and Gertman (2005) introduced a novel method that augments heuristic usability evaluation methods with the human reliability analysis method of SPAR-H. By assigning probabilistic modifiers to individual heuristics, it is possible to arrive at the usability error probability (UEP). Although this UEP is not a literal probability of error, it nonetheless provides a quantitative basis for heuristic evaluation. This method allows one to seamlessly prioritize and identify usability issues (i.e., a higher UEP requires more immediate fixes). However, the original version of this method required the usability evaluator to assign priority weights to the final UEP, thus allowing the priority of a usability issue to differ among usability evaluators. The purpose of this paper is to explore an alternative approach to standardize the priority weighting of the UEP in an effort to improve the method's reliability. (authors)
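
    The abstract does not give the numerical scheme, so the following Python sketch only illustrates the general idea of turning heuristic violations into a usability error probability by multiplying a nominal value by per-heuristic modifiers; the heuristic names, the nominal value and the multipliers are assumptions, not the SPAR-H or Boring and Gertman values.

      # Illustrative usability error probability (UEP) from heuristic modifiers.
      NOMINAL_HEP = 0.001   # assumed nominal error probability

      issue_multipliers = {
          "visibility of system status": 10,   # severely violated (assumed weight)
          "error prevention": 5,               # moderately violated (assumed weight)
          "consistency and standards": 2,      # mildly violated (assumed weight)
      }

      def usability_error_probability(multipliers, nominal=NOMINAL_HEP):
          uep = nominal
          for m in multipliers.values():
              uep *= m
          return min(uep, 1.0)   # cap: the UEP is a ranking score, not a literal probability

      print(f"UEP = {usability_error_probability(issue_multipliers):.3f}")
      # Higher-UEP issues would be prioritized for more immediate fixes.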

  1. Effect on injuries of assigning shoes based on foot shape in air force basic training.

    PubMed

    Knapik, Joseph J; Brosch, Lorie C; Venuto, Margaret; Swedler, David I; Bullock, Steven H; Gaines, Lorraine S; Murphy, Ryan J; Tchandja, Juste; Jones, Bruce H

    2010-01-01

    This study examined whether assigning running shoes based on the shape of the bottom of the foot (plantar surface) influenced injury risk in Air Force Basic Military Training (BMT) and examined risk factors for injury in BMT. Data were collected from BMT recruits during 2007; analysis took place during 2008. After foot examinations, recruits were randomly assigned to either an experimental group (E, n=1042 men, 375 women) or a control group (C, n=913 men, 346 women). Experimental group recruits were assigned motion control, stability, or cushioned shoes for plantar shapes indicative of low, medium, or high arches, respectively. Control group recruits received a stability shoe regardless of plantar shape. Injuries during BMT were determined from outpatient visits provided from the Defense Medical Surveillance System. Other injury risk factors (fitness, smoking, physical activity, prior injury, menstrual history, and demographics) were obtained from a questionnaire, existing databases, or BMT units. Multivariate Cox regression controlling for other risk factors showed little difference in injury risk between the groups among men (hazard ratio [E/C]=1.11, 95% CI=0.89-1.38) or women (hazard ratio [E/C]=1.20, 95% CI=0.90-1.60). Independent injury risk factors among both men and women included low aerobic fitness and cigarette smoking. This prospective study demonstrated that assigning running shoes based on the shape of the plantar surface had little influence on injury risk in BMT even after controlling for other injury risk factors. Published by Elsevier Inc.

  2. Determination of the location of positive charges in gas-phase polypeptide polycations by tandem mass spectrometry

    NASA Astrophysics Data System (ADS)

    Kjeldsen, Frank; Savitski, Mikhail M.; Adams, Christopher M.; Zubarev, Roman A.

    2006-06-01

    Location of protonated sites in electrospray-ionized gas-phase peptides and proteins was performed with tandem mass spectrometry using ion activation by both electron capture dissociation (ECD) and collisional activation dissociation (CAD). Charge-carrying sites were assigned based on the increment in the charge state of fragment ions compared to that of the previous fragment in the same series. The property of ECD to neutralize preferentially the least basic site was confirmed by the analysis of three thousand ECD mass spectra of doubly charged tryptic peptides. Multiply charged cations of bradykinin, neurotensin and melittin were studied in detail. For n+ precursors, ECD revealed the positions of the (n - 1) most basic sites, while CAD could in principle locate all n charges. However, ECD introduced minimal proton mobilization and produced more conclusive data than CAD, for which N- and C-terminal data often disagreed. Consistent with the dominance of one charge conformer and its preservation in ECD, the average charge states of complementary fragments of n+ ions almost always added up to (n - 1)+, while the similar figure in CAD often deviated from n+, indicating extensive charge isomerization under collisional excitation. For bradykinin and neurotensin, the charge assignments were largely in agreement with the intrinsic gas-phase basicity of the respective amino acid residues. For melittin ions in higher charge states, ECD revealed the charging at both intrinsically basic as well as at less basic residues, which was attributed to charge sharing with other groups due to the presence of secondary and higher order structures in this larger polypeptide.

  3. Just-in-Time Support

    ERIC Educational Resources Information Center

    Rollins, Suzy Pepper

    2016-01-01

    Most students have gaps in their background knowledge and basic skills-gaps that can stand in the way of learning new concepts. For example, a student may be excited about studying probability--until he realizes that today's lesson on probability will require him to use fractions. As his brain searches frantically for his dim recollection of the…

  4. Approximating Integrals Using Probability

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.; Caudle, Kyle A.

    2005-01-01

    As part of a discussion on Monte Carlo methods, this article outlines how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
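
    The technique described above amounts to estimating E[f(U)] for U uniform on [a, b] and scaling by (b - a). The article uses Visual Basic; the sketch below shows the same idea in Python for consistency with the other examples in this compilation.

      # Monte Carlo approximation of a definite integral via a probability expectation.
      import math
      import random

      def mc_integral(f, a, b, n=100_000):
          """Approximate the integral of f over [a, b] as (b - a) * mean of f(U)."""
          total = sum(f(random.uniform(a, b)) for _ in range(n))
          return (b - a) * total / n

      approx = mc_integral(math.sin, 0.0, math.pi)
      print(f"Monte Carlo: {approx:.4f}   exact: 2.0000")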

  5. A mechanism producing power law etc. distributions

    NASA Astrophysics Data System (ADS)

    Li, Heling; Shen, Hongjun; Yang, Bin

    2017-07-01

    Power law distributions are playing an increasingly important role in the study of complex systems. Starting from the intractability of complex systems, the idea of incomplete statistics is utilized and expanded: three different exponential factors are introduced into the equations for the normalization condition, the statistical average and the Shannon entropy, and probability distribution functions of exponential form, power-law form, and the product of a power function and an exponential function are derived from the Shannon entropy and the maximum entropy principle. It is thus shown that the maximum entropy principle can fully replace the equal probability hypothesis. Because the power-law distribution and the distribution in the product form between a power function and an exponential function, which cannot be derived via the equal probability hypothesis, can be derived with the aid of the maximum entropy principle, it can also be concluded that the maximum entropy principle is a basic principle that embodies concepts more extensively and reveals the basic laws governing the motion of objects more fundamentally. At the same time, this principle also reveals the intrinsic link between Nature and different objects in human society and the principles they all obey.

  6. Chance-Constrained Missile-Procurement and Deployment Models for Naval Surface Warfare

    DTIC Science & Technology

    2005-03-01

    Missions in period II are assigned so that period-II scenarios are satisfied with a user-specified probability IIsp, which may depend on... feasible solution of RFFAM will successfully cover the period-II demands with probability IIsp if the MAP is followed (see Corollary 1 in Chapter... given set of scenarios D, and in particular any IIsp-feasible subset. Therefore, assume that the remainder vectors ŝjr are listed in non-increasing

  7. Generative adversarial networks for brain lesion detection

    NASA Astrophysics Data System (ADS)

    Alex, Varghese; Safwan, K. P. Mohammed; Chennamsetty, Sai Saketh; Krishnamurthi, Ganapathy

    2017-02-01

    Manual segmentation of brain lesions from Magnetic Resonance Images (MRI) is cumbersome and introduces errors due to inter-rater variability. This paper introduces a semi-supervised technique for detection of brain lesions from MRI using Generative Adversarial Networks (GANs). A GAN comprises a Generator network and a Discriminator network which are trained simultaneously with the objective of one bettering the other. The networks were trained using non-lesion patches (n=13,000) from 4 different MR sequences. The network was trained on the BraTS dataset and patches were extracted from regions excluding the tumor region. The Generator network generates data by modeling the underlying probability distribution of the training data, P(Data). The Discriminator learns the posterior probability P(Label | Data) by classifying training data and generated data as "Real" or "Fake", respectively. The Generator, upon learning the joint distribution, produces images/patches such that the performance of the Discriminator on them is random, i.e. P(Label | Data = GeneratedData) = 0.5. During testing, the Discriminator assigns posterior probability values close to 0.5 for patches from non-lesion regions, while patches centered on lesions arise from a different distribution (P_Lesion) and hence are assigned lower posterior probability values by the Discriminator. On the test set (n=14), the proposed technique achieves a whole tumor dice score of 0.69, sensitivity of 91% and specificity of 59%. Additionally, the generator network was capable of generating non-lesion patches from various MR sequences.
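
    The architecture and threshold used in the paper are not specified in this abstract; the PyTorch sketch below only illustrates the test-time step it describes, i.e. scoring patches with a (here untrained) discriminator and flagging those whose posterior falls well below the roughly 0.5 level expected for non-lesion patches. The network layout, patch size and cutoff are assumptions.

      # Test-time use of a patch discriminator as a lesion detector (illustrative).
      import torch
      import torch.nn as nn

      class PatchDiscriminator(nn.Module):
          def __init__(self, patch_size=32, channels=4):   # 4 MR sequences
              super().__init__()
              self.net = nn.Sequential(
                  nn.Flatten(),
                  nn.Linear(channels * patch_size * patch_size, 256),
                  nn.LeakyReLU(0.2),
                  nn.Linear(256, 1),
                  nn.Sigmoid(),                             # posterior P(real | patch)
              )

          def forward(self, x):
              return self.net(x)

      disc = PatchDiscriminator()            # in practice: load trained weights
      patches = torch.randn(8, 4, 32, 32)    # placeholder test patches
      with torch.no_grad():
          posterior = disc(patches).squeeze(1)

      LESION_THRESHOLD = 0.35                # assumed cutoff below the ~0.5 level
      lesion_candidates = posterior < LESION_THRESHOLD
      print(posterior, lesion_candidates)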

  8. Know the risk, take the win: how executive functions and probability processing influence advantageous decision making under risk conditions.

    PubMed

    Brand, Matthias; Schiebener, Johannes; Pertl, Marie-Theres; Delazer, Margarete

    2014-01-01

    Recent models on decision making under risk conditions have suggested that numerical abilities are important ingredients of advantageous decision-making performance, but empirical evidence is still limited. The results of our first study show that logical reasoning and basic mental calculation capacities predict ratio processing and that ratio processing predicts decision making under risk. In the second study, logical reasoning together with executive functions predicted probability processing (numeracy and probability knowledge), and probability processing predicted decision making under risk. These findings suggest that increasing an individual's understanding of ratios and probabilities should lead to more advantageous decisions under risk conditions.

  9. A modeling of dynamic storage assignment for order picking in beverage warehousing with Drive-in Rack system

    NASA Astrophysics Data System (ADS)

    Hadi, M. Z.; Djatna, T.; Sugiarto

    2018-04-01

    This paper develops a dynamic storage assignment model to solve the storage assignment problem (SAP) for beverage order picking in a drive-in rack warehousing system, determining the appropriate storage location and space for each beverage product dynamically so that the performance of the system can be improved. This study constructs a graph model to represent drive-in rack storage positions, then combines association rule mining, class-based storage policies and an arrangement rule algorithm to determine an appropriate storage location and arrangement of the products according to dynamic orders from customers. The performance of the proposed model is measured as rule adjacency accuracy, travel distance (for the picking process) and the probability that a product expires, using a Last Come First Serve (LCFS) queue approach. Finally, the proposed model is implemented through computer simulation and its performance is compared with that of different storage assignment methods. The result indicates that the proposed model outperforms the other storage assignment methods.

  10. Hidden Markov models for evolution and comparative genomics analysis.

    PubMed

    Bykova, Nadezda A; Favorov, Alexander V; Mironov, Andrey A

    2013-01-01

    The problem of reconstruction of ancestral states given a phylogeny and data from extant species arises in a wide range of biological studies. The continuous-time Markov model for the evolution of discrete states is generally used for the reconstruction of ancestral states. We modify this model to account for the case when the states of the extant species are uncertain. This situation appears, for example, if the states for extant species are predicted by some program and thus are known only with some level of reliability; it is common in bioinformatics. The main idea is to formulate the problem as a hidden Markov model on a tree (tree HMM, tHMM), where the basic continuous-time Markov model is expanded with the introduction of emission probabilities of observed data (e.g. prediction scores) for each underlying discrete state. Our tHMM decoding algorithm allows us to predict states at the ancestral nodes as well as to refine states at the leaves on the basis of quantitative comparative genomics. Tests on simulated data show that the tHMM approach applied to a continuous variable reflecting the probabilities of the states (i.e. a prediction score) is more accurate than reconstruction from a discrete-state assignment defined by the best score threshold. We provide examples of applying our model to the evolutionary analysis of N-terminal signal peptides and transcription factor binding sites in bacteria. The program is freely available at http://bioinf.fbb.msu.ru/~nadya/tHMM and via web-service at http://bioinf.fbb.msu.ru/treehmmweb.

  11. Using the Bayesian Improved Surname Geocoding Method (BISG) to create a working classification of race and ethnicity in a diverse managed care population: a validation study.

    PubMed

    Adjaye-Gbewonyo, Dzifa; Bednarczyk, Robert A; Davis, Robert L; Omer, Saad B

    2014-02-01

    To validate classification of race/ethnicity based on the Bayesian Improved Surname Geocoding method (BISG) and assess variations in validity by gender and age. Secondary data on members of Kaiser Permanente Georgia, an integrated managed care organization, through 2010. For 191,494 members with self-reported race/ethnicity, probabilities for belonging to each of six race/ethnicity categories predicted from the BISG algorithm were used to assign individuals to a race/ethnicity category over a range of cutoffs greater than a probability of 0.50. Overall as well as gender- and age-stratified sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were calculated. Receiver operating characteristic (ROC) curves were generated and used to identify optimal cutoffs for race/ethnicity assignment. The overall cutoffs for assignment that optimized sensitivity and specificity ranged from 0.50 to 0.57 for the four main racial/ethnic categories (White, Black, Asian/Pacific Islander, Hispanic). Corresponding sensitivity, specificity, PPV, and NPV ranged from 64.4 to 81.4 percent, 80.8 to 99.7 percent, 75.0 to 91.6 percent, and 79.4 to 98.0 percent, respectively. Accuracy of assignment was better among males and individuals of 65 years or older. BISG may be useful for classifying race/ethnicity of health plan members when needed for health care studies. © Health Research and Educational Trust.
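
    As a minimal sketch of the cutoff-based assignment step this record describes (not the study's code; the category names and the 0.55 cutoff below are illustrative assumptions):

```python
def assign_race_ethnicity(probabilities, cutoff=0.55):
    """Assign the category whose BISG-style predicted probability clears the cutoff.

    probabilities: dict mapping category name -> predicted probability (sums to ~1).
    Returns the winning category, or None when no category clears the cutoff.
    """
    category, p = max(probabilities.items(), key=lambda kv: kv[1])
    return category if p >= cutoff else None

# Hypothetical member, with probabilities as a BISG-style algorithm might output them:
member = {"White": 0.12, "Black": 0.71, "Hispanic": 0.09,
          "Asian/Pacific Islander": 0.05, "AI/AN": 0.02, "Multiracial": 0.01}
print(assign_race_ethnicity(member))  # -> "Black"
```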

  12. Quantum probability and Hilbert's sixth problem

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi

    2018-04-01

    With the birth of quantum mechanics, the two disciplines that Hilbert proposed to axiomatize, probability and mechanics, became entangled and a new probabilistic model arose in addition to the classical one. Thus, to meet Hilbert's challenge, an axiomatization should account deductively for the basic features of all three disciplines. This goal was achieved within the framework of quantum probability. The present paper surveys the quantum probabilistic axiomatization. This article is part of the themed issue `Hilbert's sixth problem'.

  13. Task Assignment Heuristics for Parallel and Distributed CFD Applications

    NASA Technical Reports Server (NTRS)

    Lopez-Benitez, Noe; Djomehri, M. Jahed; Biswas, Rupak

    2003-01-01

    This paper proposes a task graph (TG) model to represent a single discrete step of multi-block overset grid computational fluid dynamics (CFD) applications. The TG model is then used not only to balance the computational workload across the overset grids but also to reduce inter-grid communication costs. We have developed a set of task assignment heuristics based on the constraints inherent in this class of CFD problems. Two basic assignments, smallest task first (STF) and largest task first (LTF), are presented first; they are then systematically extended to account for inter-grid communication costs. To predict the performance of the proposed task assignment heuristics, extensive performance evaluations are conducted on a synthetic TG with tasks defined in terms of the number of grid points in predetermined overlapping grids. A TG derived from a realistic problem with eight million grid points is also used as a test case.

  14. Person-Organization Fit and Its Effect on Retention of Army Officers with Less Than Eight Years of Active Duty Service

    DTIC Science & Technology

    2015-06-12

    insight into the physical and mental well-being of employees, and identify and develop desired leadership traits in employees. Person-organization fit...advocate general) will be considered as self-select for person-vocation fit and therefore will be excluded. 17 Table 2. Basic Branches Grouped...considered as self-select for person-vocation fit and may skew the results for personnel assigned to the basic branches. Studies examining related concepts

  15. The Systems Approach to Functional Job Analysis. Task Analysis of the Physician's Assistant: Volume II--Curriculum and Phase I Basic Core Courses and Volume III--Phases II and III--Clinical Clerkships and Assignments.

    ERIC Educational Resources Information Center

    Wake Forest Univ., Winston Salem, NC. Bowman Gray School of Medicine.

    This publication contains a curriculum developed through functional job analyses for a 24-month physician's assistant training program. Phase 1 of the 3-phase program is a 6-month basic course program in clinical and bioscience principles and is required of all students regardless of their specialty interest. Phase 2 is a 6 to 10 month period of…

  16. 76 FR 14442 - 60-Day Notice of Proposed Information Collection: DS 6561 Pre-Assignment for Overseas Duty for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-16

    ... automated collection techniques or other forms of technology. Abstract of proposed collection: The DS 6561 form provides a concise summary of basic medical history, lab tests and physical examination. Since...

  17. Zero-Based Budgeting Redux.

    ERIC Educational Resources Information Center

    Geiger, Philip E.

    1993-01-01

    Zero-based, programmatic budgeting involves four basic steps: (1) define what needs to be done; (2) specify the resources required; (3) determine the assessment procedures and standards to use in evaluating the effectiveness of various programs; and (4) assign dollar figures to this information. (MLF)

  18. Basic concepts in three-part quantitative assessments of undiscovered mineral resources

    USGS Publications Warehouse

    Singer, D.A.

    1993-01-01

    Since 1975, mineral resource assessments have been made for over 27 areas covering 5×10^6 km2 at various scales using what is now called the three-part form of quantitative assessment. In these assessments, (1) areas are delineated according to the types of deposits permitted by the geology, (2) the amount of metal and some ore characteristics are estimated using grade and tonnage models, and (3) the number of undiscovered deposits of each type is estimated. Permissive boundaries are drawn for one or more deposit types such that the probability of a deposit lying outside the boundary is negligible, that is, less than 1 in 100,000 to 1,000,000. Grade and tonnage models combined with estimates of the number of deposits are the fundamental means of translating geologists' resource assessments into a language that economists can use. Estimates of the number of deposits explicitly represent the probability (or degree of belief) that some fixed but unknown number of undiscovered deposits exist in the delineated tracts. Estimates are by deposit type and must be consistent with the grade and tonnage model. Other guidelines for these estimates include (1) frequency of deposits from well-explored areas, (2) local deposit extrapolations, (3) counting and assigning probabilities to anomalies and occurrences, (4) process constraints, (5) relative frequencies of related deposit types, and (6) area spatial limits. In most cases, estimates are made subjectively, as they are in meteorology, gambling, and geologic interpretations. In three-part assessments, the estimates are internally consistent because delineated tracts are consistent with descriptive models, grade and tonnage models are consistent with descriptive models, as well as with known deposits in the area, and estimates of number of deposits are consistent with grade and tonnage models. All available information is used in the assessment, and uncertainty is explicitly represented. © 1993 Oxford University Press.

  19. A Novel Mittag-Leffler Kernel Based Hybrid Fault Diagnosis Method for Wheeled Robot Driving System.

    PubMed

    Yuan, Xianfeng; Song, Mumin; Zhou, Fengyu; Chen, Zhumin; Li, Yan

    2015-01-01

    Wheeled robots have been successfully applied in many areas, such as industrial handling vehicles and wheeled service robots. To improve the safety and reliability of wheeled robots, this paper presents a novel hybrid fault diagnosis framework based on the Mittag-Leffler kernel (ML-kernel) support vector machine (SVM) and Dempster-Shafer (D-S) fusion. Using sensor data sampled under different running conditions, the proposed approach initially establishes multiple principal component analysis (PCA) models for fault feature extraction. The fault feature vectors are then applied to train the probabilistic SVM (PSVM) classifiers that arrive at a preliminary fault diagnosis. To improve the accuracy of the preliminary results, a novel ML-kernel based PSVM classifier is proposed in this paper, and the positive definiteness of the ML-kernel is proved as well. The basic probability assignments (BPAs) are defined based on the preliminary fault diagnosis results and their confidence values. Eventually, the final fault diagnosis result is achieved by the fusion of the BPAs. Experimental results show that the proposed framework not only is capable of detecting and identifying faults in the robot driving system, but also has better performance in stability and diagnosis accuracy compared with traditional methods.

  20. A Novel Mittag-Leffler Kernel Based Hybrid Fault Diagnosis Method for Wheeled Robot Driving System

    PubMed Central

    Yuan, Xianfeng; Song, Mumin; Chen, Zhumin; Li, Yan

    2015-01-01

    Wheeled robots have been successfully applied in many areas, such as industrial handling vehicles and wheeled service robots. To improve the safety and reliability of wheeled robots, this paper presents a novel hybrid fault diagnosis framework based on the Mittag-Leffler kernel (ML-kernel) support vector machine (SVM) and Dempster-Shafer (D-S) fusion. Using sensor data sampled under different running conditions, the proposed approach initially establishes multiple principal component analysis (PCA) models for fault feature extraction. The fault feature vectors are then applied to train the probabilistic SVM (PSVM) classifiers that arrive at a preliminary fault diagnosis. To improve the accuracy of the preliminary results, a novel ML-kernel based PSVM classifier is proposed in this paper, and the positive definiteness of the ML-kernel is proved as well. The basic probability assignments (BPAs) are defined based on the preliminary fault diagnosis results and their confidence values. Eventually, the final fault diagnosis result is achieved by the fusion of the BPAs. Experimental results show that the proposed framework not only is capable of detecting and identifying faults in the robot driving system, but also has better performance in stability and diagnosis accuracy compared with traditional methods. PMID:26229526
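
    The fusion step named in these two records (combining BPAs from several classifiers with Dempster-Shafer theory) can be sketched generically as below. This is a standard implementation of Dempster's rule of combination for two BPAs over a frame of discernment, not the paper's code; the fault labels and mass values are illustrative.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.

    m1, m2: dicts mapping focal elements (frozensets of hypotheses) to mass.
    Returns the combined BPA; raises if the evidence is totally conflicting.
    """
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb           # mass that would go to the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two hypothetical classifiers reporting BPAs over fault hypotheses F1 and F2:
m_a = {frozenset({"F1"}): 0.6, frozenset({"F1", "F2"}): 0.4}
m_b = {frozenset({"F1"}): 0.5, frozenset({"F2"}): 0.3, frozenset({"F1", "F2"}): 0.2}
print(dempster_combine(m_a, m_b))
```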

  1. NASA Astrophysics Data System (ADS)

    2018-05-01

    Eigenvalues and eigenvectors together constitute the eigenstructure of a system. The design of vibrating systems aimed at satisfying specifications on eigenvalues and eigenvectors, commonly known as eigenstructure assignment, has drawn increasing interest in recent years. The most natural mathematical framework for such problems is that of inverse eigenproblems, which consist in determining the system model that features a desired set of eigenvalues and eigenvectors. Although such a problem is intrinsically challenging, several solutions have been proposed in the literature. The approaches to eigenstructure assignment can be broadly divided into passive control and active control.

  2. F-100A on lakebed

    NASA Technical Reports Server (NTRS)

    1955-01-01

    North American F-100A (52-5778) Super Sabre is parked on the Rogers Dry Lakebed at Edwards Air Force Base, California, 1955. This photo shows the large tail on the F-100A. When the basic research was completed on this F-100A another program was assigned. On March 5, 1957 two aeronautical engineers and a test pilot from NACA High-Speed Flight Station took the airplane to participate in a Gunnery Operations program at Nellis Air Force Base, Nevada. When the program was completed the aircraft returned for other assignments to NACA, at Edwards, California.

  3. A Mediation Model to Explain the Role of Mathematics Skills and Probabilistic Reasoning on Statistics Achievement

    ERIC Educational Resources Information Center

    Primi, Caterina; Donati, Maria Anna; Chiesi, Francesca

    2016-01-01

    Among the wide range of factors related to the acquisition of statistical knowledge, competence in basic mathematics, including basic probability, has received much attention. In this study, a mediation model was estimated to derive the total, direct, and indirect effects of mathematical competence on statistics achievement taking into account…

  4. Pedigrees, Prizes, and Prisoners: The Misuse of Conditional Probability

    ERIC Educational Resources Information Center

    Carlton, Matthew A.

    2005-01-01

    We present and discuss three examples of misapplication of the notion of conditional probability. In each example, we present the problem along with a published and/or well-known incorrect--but seemingly plausible--solution. We then give a careful treatment of the correct solution, in large part to show how careful application of basic probability…

  5. Blind Students' Learning of Probability through the Use of a Tactile Model

    ERIC Educational Resources Information Center

    Vita, Aida Carvalho; Kataoka, Verônica Yumi

    2014-01-01

    The objective of this paper is to discuss how blind students learn basic concepts of probability using the tactile model proposed by Vita (2012). Among the activities that were part of the teaching sequence "Jefferson's Random Walk", students built a tree diagram (using plastic trays, foam cards, and toys), and pictograms in 3D…

  6. Predicting traffic load impact of alternative recreation developments

    Treesearch

    Gary H. Elsner; Ronald A. Oliveira

    1973-01-01

    Traffic load changes as a result of expansion of recreation facilities may be predicted through computations based on estimates of (a) drawing power of the recreation attractions, overnight accommodations, and in- or out-terminals; (b) probable types of travel; (c) probable routes of travel; and (d) total number of cars in the recreation system. Once the basic model...

  7. Reliability computation using fault tree analysis

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.

    1971-01-01

    A method is presented for calculating event probabilities from an arbitrary fault tree. The method includes an analytical derivation of the system equation and is not a simulation program. The method can handle systems that incorporate standby redundancy and it uses conditional probabilities for computing fault trees where the same basic failure appears in more than one fault path.
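
    The kind of calculation this record describes (propagating basic-event probabilities through AND and OR gates to a top event) can be sketched as below. This is a generic illustration under an assumption of independent basic events, not the 1971 program itself; the record notes the original method also handles basic events shared across fault paths via conditional probabilities, which this sketch omits, and the small example tree is hypothetical.

```python
def gate_and(probabilities):
    """Probability that all independent input events occur."""
    p = 1.0
    for q in probabilities:
        p *= q
    return p

def gate_or(probabilities):
    """Probability that at least one independent input event occurs."""
    p = 1.0
    for q in probabilities:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical tree: top = OR(pump_fails, AND(valve_fails, backup_valve_fails))
pump_fails, valve_fails, backup_valve_fails = 1e-3, 5e-2, 2e-2
top = gate_or([pump_fails, gate_and([valve_fails, backup_valve_fails])])
print(f"Top event probability: {top:.6f}")
```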

  8. Knowledge.

    PubMed

    Jost, Jürgen

    2017-06-01

    We investigate the basic principles of structural knowledge. Structural knowledge underlies cognition, and it organizes, selects and assigns meaning to information. It is the result of evolutionary, cultural and developmental processes. Because of its own constraints, it needs to discover and exploit regularities and thereby achieve a complexity reduction.

  9. Clinical decision rules for termination of resuscitation in out-of-hospital cardiac arrest.

    PubMed

    Sherbino, Jonathan; Keim, Samuel M; Davis, Daniel P

    2010-01-01

    Out-of-hospital cardiac arrest (OHCA) has a low probability of survival to hospital discharge. Four clinical decision rules (CDRs) have been validated to identify patients with no probability of survival. Three of these rules focus on exclusive prehospital basic life support care for OHCA, and two of these rules focus on prehospital advanced life support care for OHCA. Can a CDR for the termination of resuscitation identify a patient with no probability of survival in the setting of OHCA? Six validation studies were selected from a PubMed search. A structured review of each of the studies is presented. In OHCA receiving basic life support care, the BLS-TOR (basic life support termination of resuscitation) rule has a positive predictive value for death of 99.5% (95% confidence interval 98.9-99.8%), and decreases the transportation of all patients by 62.6%. This rule has been appropriately validated for widespread use. In OHCA receiving advanced life support care, no current rule has been appropriately validated for widespread use. The BLS-TOR rule is a simple rule that identifies patients who will not survive OHCA. Further research is required to identify similarly robust CDRs for patients receiving advanced life support care in the setting of OHCA. Copyright 2010 Elsevier Inc. All rights reserved.

  10. Data Analysis Techniques for Physical Scientists

    NASA Astrophysics Data System (ADS)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  11. Nonadditive entropies yield probability distributions with biases not warranted by the data.

    PubMed

    Pressé, Steve; Ghosh, Kingshuk; Lee, Julian; Dill, Ken A

    2013-11-01

    Different quantities that go by the name of entropy are used in variational principles to infer probability distributions from limited data. Shore and Johnson showed that maximizing the Boltzmann-Gibbs form of the entropy ensures that probability distributions inferred satisfy the multiplication rule of probability for independent events in the absence of data coupling such events. Other types of entropies that violate the Shore and Johnson axioms, including nonadditive entropies such as the Tsallis entropy, violate this basic consistency requirement. Here we use the axiomatic framework of Shore and Johnson to show how such nonadditive entropy functions generate biases in probability distributions that are not warranted by the underlying data.

  12. Interdisciplinary Subject "Yakugaku Nyumon" for First-year Students Constructed with Lectures and Problem-based Learning.

    PubMed

    Yamaki, Kouya; Ueda, Masafumi; Ueda, Kumiko; Emoto, Noriaki; Mizutani, Nobuaki; Ikeda, Koji; Yagi, Keiko; Tanaka, Masafumi; Habu, Yasushi; Nakayama, Yoshiaki; Takeda, Norihiko; Moriwaki, Kensuke; Kitagawa, Shuji

    2016-01-01

    In 2013, Kobe Pharmaceutical University established "Yakugaku Nyumon", an interdisciplinary course, which consists of omnibus lectures and problem-based learning (PBL) on topics ranging from basic to clinical subjects. The themes of the PBL were original ones; "Study from package inserts of aspirin", which aimed to reinforce the contents of the interdisciplinary lectures, and "Let's think about aspirin derivatives (super-aspirin)", which aimed to engender an interest in studying pharmacy. The PBL featured questions from teachers to help with study and was therefore referred to as "question-led PBL" (Q-PBL). The Q-PBL regarding aspirin derivatives began with preparing answers to the questions for a small group discussion (SGD) as an assignment, followed by a SGD, a presentation, and peer-feedback. From an analysis of the questionnaire survey, it was found that students considered the Q-PBL satisfying and that they had achieved the 4 aims: (1) to increase the motivation to study, (2) to enhance an understanding of the relations and significance of basic and clinical sciences, (3) to comprehend the learning content, and (4) to recognize the importance of communication. The Q-PBL with assignments has two favorable points. One is that the first-year students can challenge difficult and high-level questions when they are given these as assignments. The other is that students, who are unfamiliar with SGD can engage in discussions with other students using the knowledge gained from the assignment. The introduction of omnibus lectures and Q-PBL, along with these improvements in theme, application, and review process, promises increased learning efficacy at the university.

  13. 14 CFR 91.1081 - Crewmember training requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... particular assignment of the crewmember: (1) Basic indoctrination ground training for newly hired crewmembers... 14 Aeronautics and Space 2 2010-01-01 2010-01-01 false Crewmember training requirements. 91.1081... Operations Program Management § 91.1081 Crewmember training requirements. (a) Each program manager must...

  14. An artificial system for selecting the optimal surgical team.

    PubMed

    Saberi, Nahid; Mahvash, Mohsen; Zenati, Marco

    2015-01-01

    We introduce an intelligent system to optimize a team composition based on the team's historical outcomes and apply this system to compose a surgical team. The system relies on a record of the procedures performed in the past. The optimal team composition is the one with the lowest probability of an unfavorable outcome. We use the theory of probability and the inclusion-exclusion principle to model the probability of the team outcome for a given composition. A probability value is assigned to each person in the database, and the probability of a team composition is calculated from these values. The model makes it possible to determine the probability of all possible team compositions even if there is no recorded procedure for some of them. From an analytical perspective, assembling an optimal team is equivalent to minimizing the overlap of team members who have a recurring tendency to be involved with procedures with unfavorable results. A conceptual example shows the accuracy of the proposed system in obtaining the optimal team.
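
    A minimal sketch of the selection idea follows. The paper builds its outcome model from historical records using the inclusion-exclusion principle; the sketch below replaces that with a simple independence assumption (so the probability of the union of unfavorable events collapses to 1 - Π(1 - p_i)) and uses hypothetical per-person probabilities.

```python
from itertools import combinations

def team_unfavorable_probability(team, p_unfavorable):
    """P(at least one member is involved in an unfavorable outcome),
    assuming independence across members (a simplification)."""
    prob_all_ok = 1.0
    for member in team:
        prob_all_ok *= (1.0 - p_unfavorable[member])
    return 1.0 - prob_all_ok

def best_team(candidates, p_unfavorable, team_size):
    """Enumerate all compositions and pick the one with the lowest risk."""
    return min(combinations(candidates, team_size),
               key=lambda t: team_unfavorable_probability(t, p_unfavorable))

# Hypothetical per-person unfavorable-outcome probabilities:
p = {"surgeon_A": 0.04, "surgeon_B": 0.07, "nurse_C": 0.02,
     "nurse_D": 0.05, "perfusionist_E": 0.03}
print(best_team(p.keys(), p, team_size=3))
```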

  15. Sensing Attribute Weights: A Novel Basic Belief Assignment Method

    PubMed Central

    Jiang, Wen; Zhuang, Miaoyan; Xie, Chunhe; Wu, Jun

    2017-01-01

    Dempster–Shafer evidence theory is widely used in many soft sensor data fusion systems on account of its good performance in handling the uncertain information from soft sensors. However, how to determine the basic belief assignment (BBA) is still an open issue. The existing methods to determine BBA do not consider the reliability of each attribute; at the same time, they cannot effectively determine BBA in the open world. In this paper, based on attribute weights, a novel method to determine BBA is proposed not only in the closed world, but also in the open world. First, a Gaussian model of each attribute is built using the training samples. Second, the similarity between the test sample and each attribute model is measured using Gaussian membership functions. Then, the attribute weights are generated from the overlap degree among the classes. Finally, BBA is determined according to the sensed attribute weights. Several examples with small datasets show the validity of the proposed method. PMID:28358325

  16. Sensing Attribute Weights: A Novel Basic Belief Assignment Method.

    PubMed

    Jiang, Wen; Zhuang, Miaoyan; Xie, Chunhe; Wu, Jun

    2017-03-30

    Dempster-Shafer evidence theory is widely used in many soft sensor data fusion systems on account of its good performance in handling the uncertain information from soft sensors. However, how to determine the basic belief assignment (BBA) is still an open issue. The existing methods to determine BBA do not consider the reliability of each attribute; at the same time, they cannot effectively determine BBA in the open world. In this paper, based on attribute weights, a novel method to determine BBA is proposed not only in the closed world, but also in the open world. First, a Gaussian model of each attribute is built using the training samples. Second, the similarity between the test sample and each attribute model is measured using Gaussian membership functions. Then, the attribute weights are generated from the overlap degree among the classes. Finally, BBA is determined according to the sensed attribute weights. Several examples with small datasets show the validity of the proposed method.
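
    A rough sketch of the pipeline these two records describe (Gaussian attribute models, membership-based similarity, then normalized masses) might look like the following. It is an interpretation for illustration only, not the authors' code; the class statistics and the choice to push residual mass to the full frame (ignorance) are assumptions.

```python
import math

def gaussian_membership(x, mean, std):
    """Membership of value x in a class modeled by N(mean, std)."""
    return math.exp(-((x - mean) ** 2) / (2.0 * std ** 2))

def bba_from_attribute(x, class_models):
    """Turn one attribute reading into a BBA over the classes.

    class_models: dict class_name -> (mean, std), built from training samples.
    Mass not supported by any class is assigned to the full frame (ignorance)
    rather than to a single class.
    """
    memberships = {c: gaussian_membership(x, mu, sd)
                   for c, (mu, sd) in class_models.items()}
    total = sum(memberships.values())
    frame = frozenset(class_models)
    if total == 0.0:
        return {frame: 1.0}
    # Scale so the singleton masses never exceed 1; the remainder is ignorance.
    scale = max(total, 1.0)
    bba = {frozenset({c}): m / scale for c, m in memberships.items()}
    bba[frame] = max(0.0, 1.0 - sum(bba.values()))
    return bba

# Hypothetical attribute models for an iris-like three-class problem:
models = {"setosa": (1.5, 0.2), "versicolor": (4.3, 0.5), "virginica": (5.6, 0.6)}
print(bba_from_attribute(4.1, models))
```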

  17. Practices and Problems of Adult Basic Education in Rural Areas.

    ERIC Educational Resources Information Center

    Richardson, E. Gordon

    The percentages of adults needing adult basic education (ABE) programs in rural areas may not differ from those found in metropolitan areas, but the delivery of the system may be different. For example, the rural ABE teaching staff probably will be recruited from the ranks of the regular elementary or high school teachers to teach at night also,…

  18. Network-level reproduction number and extinction threshold for vector-borne diseases.

    PubMed

    Xue, Ling; Scoglio, Caterina

    2015-06-01

    The basic reproduction number of deterministic models is an essential quantity to predict whether an epidemic will spread or not. Thresholds for disease extinction contribute crucial knowledge for the control, elimination, and mitigation of infectious diseases. Relationships between the basic reproduction numbers of two deterministic network-based ordinary differential equation vector-host models and the extinction thresholds of corresponding stochastic continuous-time Markov chain models are derived under some assumptions. Numerical simulation results for malaria and Rift Valley fever transmission on heterogeneous networks are in agreement with analytical results without any assumptions, reinforcing that the relationships may always exist and proposing a mathematical problem for proving existence of the relationships in general. Moreover, numerical simulations show that the basic reproduction number does not monotonically increase or decrease with the extinction threshold. Consistent trends of extinction probability observed through numerical simulations provide novel insights into mitigation strategies to increase the disease extinction probability. Research findings may improve understanding of thresholds for disease persistence and help control vector-borne diseases.

  19. A Probabilistic Framework for Peptide and Protein Quantification from Data-Dependent and Data-Independent LC-MS Proteomics Experiments

    PubMed Central

    Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.

    2013-01-01

    A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168

  20. Earthquake Rate Model 2 of the 2007 Working Group for California Earthquake Probabilities, Magnitude-Area Relationships

    USGS Publications Warehouse

    Stein, Ross S.

    2008-01-01

    The Working Group for California Earthquake Probabilities must transform fault lengths and their slip rates into earthquake moment-magnitudes. First, the down-dip coseismic fault dimension, W, must be inferred. We have chosen the Nazareth and Hauksson (2004) method, which uses the depth above which 99% of the background seismicity occurs to assign W. The product of the observed or inferred fault length, L, with the down-dip dimension, W, gives the fault area, A. We must then use a scaling relation to relate A to moment-magnitude, Mw. We assigned equal weight to the Ellsworth B (Working Group on California Earthquake Probabilities, 2003) and Hanks and Bakun (2007) equations. The former uses a single logarithmic relation fitted to the M ≥ 6.5 portion of the data of Wells and Coppersmith (1994); the latter uses a bilinear relation with a slope change at M=6.65 (A=537 km2) and also was tested against a greatly expanded dataset for large continental transform earthquakes. We also present an alternative power law relation, which fits the newly expanded Hanks and Bakun (2007) data best, and captures the change in slope that Hanks and Bakun attribute to a transition from area- to length-scaling of earthquake slip. We have not opted to use the alternative relation for the current model. The selections and weights were developed by unanimous consensus of the Executive Committee of the Working Group, following an open meeting of scientists, a solicitation of outside opinions from additional scientists, and presentation of our approach to the Scientific Review Panel. The magnitude-area relations and their assigned weights are unchanged from those used in Working Group (2003).

  1. An Examination of the Levels of Cognitive Demand Required by Probability Tasks in Middle Grades Mathematics Textbooks

    ERIC Educational Resources Information Center

    Jones, Dustin L.; Tarr, James E.

    2007-01-01

    We analyze probability content within middle grades (6, 7, and 8) mathematics textbooks from a historical perspective. Two series, one popular and the other alternative, from four recent eras of mathematics education (New Math, Back to Basics, Problem Solving, and Standards) were analyzed using the Mathematical Tasks Framework (Stein, Smith,…

  2. A bottom-up robust optimization framework for identifying river basin development pathways under deep climate uncertainty

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Ray, P.; Brown, C.

    2016-12-01

    Hydroclimatic nonstationarity due to climate change poses challenges for long-term water infrastructure planning in river basin systems. While strategies that are flexible or adaptive hold intuitive appeal, development of well-performing strategies requires rigorous quantitative analysis that addresses uncertainties directly while making the best use of scientific information on the expected evolution of future climate. Multi-stage robust optimization (RO) offers a potentially effective and efficient technique for addressing the problem of staged basin-level planning under climate change; however, the necessity of assigning probabilities to future climate states or scenarios is an obstacle to implementation, given that methods to reliably assign such probabilities are not well developed. We present a method that overcomes this challenge by creating a bottom-up RO-based framework that decreases the dependency on probability distributions of future climate and instead employs them after optimization to aid selection among competing alternatives. The iterative process yields a vector of 'optimal' decision pathways, each under an associated set of probabilistic assumptions. In the final phase, the vector of optimal decision pathways is evaluated to identify the solutions that are least sensitive to the scenario probabilities and are most likely conditional on the climate information. The framework is illustrated for the planning of new dam and hydro-agricultural expansion projects in the Niger River Basin over a 45-year planning period from 2015 to 2060.

  3. Stimulus discriminability may bias value-based probabilistic learning.

    PubMed

    Schutte, Iris; Slagter, Heleen A; Collins, Anne G E; Frank, Michael J; Kenemans, J Leon

    2017-01-01

    Reinforcement learning tasks are often used to assess participants' tendency to learn more from the positive or more from the negative consequences of one's action. However, this assessment often requires comparison in learning performance across different task conditions, which may differ in the relative salience or discriminability of the stimuli associated with more and less rewarding outcomes, respectively. To address this issue, in a first set of studies, participants were subjected to two versions of a common probabilistic learning task. The two versions differed with respect to the stimulus (Hiragana) characters associated with reward probability. The assignment of character to reward probability was fixed within version but reversed between versions. We found that performance was highly influenced by task version, which could be explained by the relative perceptual discriminability of characters assigned to high or low reward probabilities, as assessed by a separate discrimination experiment. Participants were more reliable in selecting rewarding characters that were more discriminable, leading to differences in learning curves and their sensitivity to reward probability. This difference in experienced reinforcement history was accompanied by performance biases in a test phase assessing ability to learn from positive vs. negative outcomes. In a subsequent large-scale web-based experiment, this impact of task version on learning and test measures was replicated and extended. Collectively, these findings imply a key role for perceptual factors in guiding reward learning and underscore the need to control stimulus discriminability when making inferences about individual differences in reinforcement learning.

  4. Interference in the classical probabilistic model and its representation in complex Hilbert space

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.

    2005-10-01

    The notion of a context (a complex of physical conditions, that is to say, a specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are already present in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: “first context and only then probability”. We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as Hilbert space projections of a realistic dynamics in a “prespace”. The basic condition for representing the prespace dynamics is the law of statistical conservation of energy (conservation of probabilities). In general the Hilbert space projection of the “prespace” dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.

  5. The effect of reading assignments in guided inquiry learning on students’ critical thinking skills

    NASA Astrophysics Data System (ADS)

    Syarkowi, A.

    2018-05-01

    The purpose of this study was to determine the effect of a reading assignment in guided inquiry learning on senior high school students' critical thinking skills. The method used was a quasi-experimental design with the reading assignment as the treatment. The topic of the inquiry process was Kirchhoff's laws. The instrument used in this research was a set of 25 multiple-choice interpretive exercises with justification. The test items covered three categories of critical thinking: basic clarification, the bases for a decision, and inference skills. The significance test showed that the improvement in critical thinking skills of the experimental class was significantly higher than that of the control class, so it can be concluded that reading assignments can improve students' critical thinking skills.

  6. Hepatitis disease detection using Bayesian theory

    NASA Astrophysics Data System (ADS)

    Maseleno, Andino; Hidayati, Rohmah Zahroh

    2017-02-01

    This paper presents hepatitis disease diagnosis using Bayesian theory, for better understanding of the theory. In this research, we used Bayesian theory to detect hepatitis disease and display the result of the diagnosis process. Bayesian theory, rediscovered and refined by Laplace, rests on a basic idea: use the known prior probabilities and conditional probability densities, apply Bayes' theorem to calculate the corresponding posterior probability, and then use the obtained posterior probability to draw inferences and make decisions. Bayesian methods combine existing knowledge, the prior probabilities, with additional knowledge derived from new data, the likelihood function. The initial symptoms of hepatitis include malaise, fever, and headache; the system computes the probability of hepatitis given the presence of malaise, fever, and headache. The results revealed that the Bayesian approach successfully identified the existence of hepatitis disease.
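
    A minimal sketch of the posterior calculation described above, using Bayes' theorem with a naive conditional-independence assumption over the three symptoms; the prior and likelihood values below are hypothetical placeholders, not figures from the paper.

```python
def posterior_hepatitis(prior, likelihoods, symptoms):
    """P(hepatitis | symptoms) via Bayes' theorem, assuming the symptoms are
    conditionally independent given the disease state (naive Bayes).

    likelihoods: dict symptom -> (P(symptom | hepatitis), P(symptom | no hepatitis)).
    """
    p_h, p_not_h = prior, 1.0 - prior
    for s in symptoms:
        p_s_given_h, p_s_given_not_h = likelihoods[s]
        p_h *= p_s_given_h
        p_not_h *= p_s_given_not_h
    return p_h / (p_h + p_not_h)

# Hypothetical prior and symptom likelihoods, for illustration only:
likelihoods = {"malaise": (0.80, 0.20), "fever": (0.70, 0.15), "headache": (0.60, 0.30)}
print(posterior_hepatitis(prior=0.05, likelihoods=likelihoods,
                          symptoms=["malaise", "fever", "headache"]))
```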

  7. Why anthropic reasoning cannot predict Lambda.

    PubMed

    Starkman, Glenn D; Trotta, Roberto

    2006-11-17

    We revisit anthropic arguments purporting to explain the measured value of the cosmological constant. We argue that different ways of assigning probabilities to candidate universes lead to totally different anthropic predictions. As an explicit example, we show that weighting different universes by the total number of possible observations leads to an extremely small probability for observing a value of Lambda equal to or greater than what we now measure. We conclude that anthropic reasoning within the framework of probability as frequency is ill-defined and that in the absence of a fundamental motivation for selecting one weighting scheme over another the anthropic principle cannot be used to explain the value of Lambda, nor, likely, any other physical parameters.

  8. Vibrational spectroscopic study of the antimonate mineral bindheimite Pb2Sb2O6(O,OH)

    NASA Astrophysics Data System (ADS)

    Bahfenne, Silmarilly; Frost, Ray L.

    2009-09-01

    Raman spectroscopy complemented with infrared spectroscopy has been used to characterise the antimonate mineral bindheimite Pb2Sb2O6(O,OH). The mineral is characterised by an intense Raman band at 656 cm^-1 assigned to SbO stretching vibrations. Other lower intensity bands at 664, 749 and 814 cm^-1 are also assigned to stretching vibrations. This observation suggests the non-equivalence of SbO units in the structure. Low intensity Raman bands at 293, 312 and 328 cm^-1 are assigned to the OSbO bending vibrations. Infrared bands at 979, 1008, 1037 and 1058 cm^-1 may be assigned to δOH deformation modes of SbOH units. Infrared bands at 1603 and 1640 cm^-1 are assigned to water bending vibrations, suggesting that water is involved in the bindheimite structure. Broad infrared bands centred upon 3250 cm^-1 support this concept. Thus the true formula of bindheimite is questioned and probably should be written as Pb2Sb2O6(O,OH,H2O).

  9. An integrated voice and data multiple-access scheme for a land-mobile satellite system

    NASA Technical Reports Server (NTRS)

    Li, V. O. K.; Yan, T.-Y.

    1984-01-01

    An analytical study is performed of the satellite requirements for a land mobile satellite system (LMSS). The spacecraft (MSAT-X) would be in GEO and would be compatible with multiple access by mobile radios and antennas and fixed stations. The FCC has received a petition from NASA to reserve the 821-825 and 866-870 MHz frequencies for the LMSS, while communications with fixed earth stations would be in the Ku band. MSAT-X transponders would alter the frequencies of signal and do no processing in the original configuration considered. Channel use would be governed by an integrated demand-assigned, multiple access protocol, which would divide channels into reservation and information channels, governed by a network management center. Further analyses will cover tradeoffs between data and voice users, probability of blocking, and the performance impacts of on-board switching and variable bandwidth assignment. Initial calculations indicate that a large traffic volume can be handled with acceptable delays and voice blocking probabilities.

  10. An integrated voice and data multiple-access scheme for a land-mobile satellite system

    NASA Astrophysics Data System (ADS)

    Li, V. O. K.; Yan, T.-Y.

    1984-11-01

    An analytical study is performed of the satellite requirements for a land mobile satellite system (LMSS). The spacecraft (MSAT-X) would be in GEO and would be compatible with multiple access by mobile radios and antennas and fixed stations. The FCC has received a petition from NASA to reserve the 821-825 and 866-870 MHz frequencies for the LMSS, while communications with fixed earth stations would be in the Ku band. MSAT-X transponders would alter the frequencies of signal and do no processing in the original configuration considered. Channel use would be governed by an integrated demand-assigned, multiple access protocol, which would divide channels into reservation and information channels, governed by a network management center. Further analyses will cover tradeoffs between data and voice users, probability of blocking, and the performance impacts of on-board switching and variable bandwidth assignment. Initial calculations indicate that a large traffic volume can be handled with acceptable delays and voice blocking probabilities.
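
    Blocking probability for the kind of demand-assigned voice channel pool mentioned in these two records is commonly estimated with the Erlang B formula; the sketch below shows that generic calculation. The actual MSAT-X analysis may have used a different model, and the traffic figures here are made up.

```python
def erlang_b(offered_load_erlangs, channels):
    """Blocking probability for an M/M/c/c loss system (Erlang B),
    computed with the standard numerically stable recursion."""
    b = 1.0
    for c in range(1, channels + 1):
        b = (offered_load_erlangs * b) / (c + offered_load_erlangs * b)
    return b

# Hypothetical sizing question: 500 mobiles offering 0.02 Erlang each, 20 voice channels.
load = 500 * 0.02
print(f"Blocking probability: {erlang_b(load, 20):.4f}")
```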

  11. Muscular Dystrophy Surveillance Tracking and Research Network (MD STARnet): case definition in surveillance for childhood-onset Duchenne/Becker muscular dystrophy.

    PubMed

    Mathews, Katherine D; Cunniff, Chris; Kantamneni, Jiji R; Ciafaloni, Emma; Miller, Timothy; Matthews, Dennis; Cwik, Valerie; Druschel, Charlotte; Miller, Lisa; Meaney, F John; Sladky, John; Romitti, Paul A

    2010-09-01

    The Muscular Dystrophy Surveillance Tracking and Research Network (MD STARnet) is a multisite collaboration to determine the prevalence of childhood-onset Duchenne/Becker muscular dystrophy and to characterize health care and health outcomes in this population. MD STARnet uses medical record abstraction to identify patients with Duchenne/Becker muscular dystrophy born January 1, 1982 or later who resided in 1 of the participating sites. Critical diagnostic elements of each abstracted record are reviewed independently by >4 clinicians and assigned to 1 of 6 case definition categories (definite, probable, possible, asymptomatic, female, not Duchenne/Becker muscular dystrophy) by consensus. As of November 2009, 815 potential cases were reviewed. Of the cases included in analysis, 674 (82%) were either "definite" or "probable" Duchenne/Becker muscular dystrophy. These data reflect a change in diagnostic testing, as case assignment based on genetic testing increased from 67% in the oldest cohort (born 1982-1987) to 94% in the cohort born 2004 to 2009.

  12. Affective and cognitive factors influencing sensitivity to probabilistic information.

    PubMed

    Tyszka, Tadeusz; Sawicki, Przemyslaw

    2011-11-01

    In study 1 different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (within-subject factor). After the presentation of the probability level, participants were requested to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus, presumably, weakening the tendency to overrate the probability of rare events. Study 2 showed that for the emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing: the rational deliberative versus the affective experiential and the principle of stimulus-response compatibility. © 2011 Society for Risk Analysis.

  13. A Random Variable Transformation Process.

    ERIC Educational Resources Information Center

    Scheuermann, Larry

    1989-01-01

    Provides a short BASIC program, RANVAR, which generates random variates for various theoretical probability distributions. The seven variates include: uniform, exponential, normal, binomial, Poisson, Pascal, and triangular. (MVL)
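
    The original RANVAR listing is in BASIC and is not reproduced in this record; a minimal Python sketch of the same idea, generating a few of the listed variates from uniform random numbers, might look like this:

```python
import math
import random

def exponential(rate):
    """Inverse-transform sampling: X = -ln(1 - U) / rate."""
    return -math.log(1.0 - random.random()) / rate

def triangular(low, mode, high):
    """Inverse-transform sampling for the triangular distribution."""
    u = random.random()
    cut = (mode - low) / (high - low)
    if u < cut:
        return low + math.sqrt(u * (high - low) * (mode - low))
    return high - math.sqrt((1 - u) * (high - low) * (high - mode))

def poisson(lam):
    """Knuth's multiplication method, suitable for small lambda."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

print(exponential(2.0), triangular(0, 3, 10), poisson(4.0))
```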

  14. Comprehensive Small Engine Repair.

    ERIC Educational Resources Information Center

    Hires, Bill; And Others

    This curriculum guide contains the basic information needed to repair all two- and four-stroke cycle engines. The curriculum covers four areas, each consisting of one or more units of instruction that include performance objectives, suggested activities for teacher and students, information sheets, assignment sheets, job sheets, visual aids,…

  15. Graded SPSS Exercises.

    ERIC Educational Resources Information Center

    Allen, Mary J.

    The attached materials have been developed for use on the CSU CYBER Computer's Statistical Package for the Social Sciences (SPSSONL). The assignments are graded in difficulty and gradually introduce new commands and require the practice of previously learned commands. The handouts begin with basic instructions for logging on; then XEDIT is taught…

  16. Design Document. EKG Interpretation Program.

    ERIC Educational Resources Information Center

    Webb, Sandra M.

    This teaching plan is designed to assist nursing instructors assigned to advanced medical surgical nursing courses in acquainting students with the basic skills needed to perform electrocardiographic (ECG or EKG) interpretations. The first part of the teaching plan contains a statement of purpose; audience recommendations; a flow chart detailing…

  17. Tandem mass spectrometry of human tryptic blood peptides calculated by a statistical algorithm and captured by a relational database with exploration by a general statistical analysis system.

    PubMed

    Bowden, Peter; Beavis, Ron; Marshall, John

    2009-11-02

    A goodness of fit test may be used to assign tandem mass spectra of peptides to amino acid sequences and to directly calculate the expected probability of mis-identification. The product of the peptide expectation values directly yields the probability that the parent protein has been mis-identified. A relational database could capture the mass spectral data, the best fit results, and permit subsequent calculations by a general statistical analysis system. The many files of the Hupo blood protein data correlated by X!TANDEM against the proteins of ENSEMBL were collected into a relational database. A redundant set of 247,077 proteins and peptides were correlated by X!TANDEM, and that was collapsed to a set of 34,956 peptides from 13,379 distinct proteins. About 6875 distinct proteins were only represented by a single distinct peptide, 2866 proteins showed 2 distinct peptides, and 3454 proteins showed at least three distinct peptides by X!TANDEM. More than 99% of the peptides were associated with proteins that had cumulative expectation values, i.e. probability of false positive identification, of one in one hundred or less. The distribution of peptides per protein from X!TANDEM was significantly different than those expected from random assignment of peptides.
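
    The protein-level calculation stated at the start of this record (the product of peptide expectation values gives the probability that the parent protein was mis-identified) is straightforward; a small sketch, with made-up peptide expectation values, accumulated in log space to avoid underflow:

```python
import math

def protein_misidentification_probability(peptide_expectation_values):
    """Product of per-peptide expectation values, accumulated in log10 space."""
    log10_product = sum(math.log10(e) for e in peptide_expectation_values)
    return 10.0 ** log10_product

# Hypothetical peptide expectation values from a search engine such as X!TANDEM:
evalues = [1e-3, 4e-2, 7e-4]
print(protein_misidentification_probability(evalues))  # ~2.8e-08
```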

  18. Infrared spectroscopy of isoprene in noble gas matrices

    NASA Astrophysics Data System (ADS)

    Ito, Fumiyuki

    2018-06-01

    In this study, the infrared absorption spectra of 2-methyl-1,3-butadiene (isoprene) in noble gas matrices (Ar, Kr, and Xe) have been reported. The vibrational structure observed at cryogenic temperature, in combination with anharmonic vibrational calculations using density functional theory, helped in unambiguously assigning the fundamental modes of isoprene unresolved in the previous gas phase measurements, which would be of basic importance in the remote sensing of this molecule. A careful comparison with the most recent gas phase study [Brauer et al., Atmos. Meas. Tech. 7 (2014) 3839-3847.] led us to alternative assignments of the weak bands.

  19. Lightning Characteristics and Lightning Strike Peak Current Probabilities as Related to Aerospace Vehicle Operations

    NASA Technical Reports Server (NTRS)

    Johnson, Dale L.; Vaughan, William W.

    1998-01-01

    A summary is presented of basic lightning characteristics/criteria for current and future NASA aerospace vehicles. The paper estimates the probability of occurrence of a 200 kA peak lightning return current, should lightning strike an aerospace vehicle in various operational phases, i.e., roll-out, on-pad, launch, reenter/land, and return-to-launch site. A literature search was conducted for previous work concerning occurrence and measurement of peak lighting currents, modeling, and estimating probabilities of launch vehicles/objects being struck by lightning. This paper presents these results.

  20. The Paradoxical Value of Privacy

    DTIC Science & Technology

    2003-03-14

    on occurrence of identity theft correlated with consumer behavior so that probabilities of at least such clear privacy problems could be assigned to...now. And, the market typically needs to learn from experience, so consumer behavior is likely to lag behind any current reality. So one answer is that

  1. Hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis method for mid-frequency analysis of built-up systems with epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yin, Shengwen; Yu, Dejie; Yin, Hui; Lü, Hui; Xia, Baizhan

    2017-09-01

    Considering the epistemic uncertainties within the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model when it is used for the response analysis of built-up systems in the mid-frequency range, the hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis (ETFE/SEA) model is established by introducing evidence theory. Based on the hybrid ETFE/SEA model and the sub-interval perturbation technique, the hybrid Sub-interval Perturbation and Evidence Theory-based Finite Element/Statistical Energy Analysis (SIP-ETFE/SEA) approach is proposed. In the hybrid ETFE/SEA model, the uncertainty in the SEA subsystem is modeled by a non-parametric ensemble, while the uncertainty in the FE subsystem is described by the focal element and basic probability assignment (BPA) and handled with evidence theory. Within the hybrid SIP-ETFE/SEA approach, the mid-frequency response of interest, such as the ensemble average of the energy response and the cross-spectrum response, is calculated analytically by using the conventional hybrid FE/SEA method. Inspired by probability theory, the intervals of the mean value, variance and cumulative distribution are used to describe the distribution characteristics of mid-frequency responses of built-up systems with epistemic uncertainties. In order to alleviate the computational burden of the extreme value analysis, the sub-interval perturbation technique based on the first-order Taylor series expansion is used in the ETFE/SEA model to acquire the lower and upper bounds of the mid-frequency responses over each focal element. Three numerical examples are given to illustrate the feasibility and effectiveness of the proposed method.

  2. Geriatric Service Worker.

    ERIC Educational Resources Information Center

    Seton Hill Coll., Greensburg, PA.

    This curriculum for training geriatric service workers is designed to incorporate additional communication and group skills along with the basic knowledge and skills necessary to work with older adults. The curriculum is organized in four modules. Each module is assigned a time frame and a credit unit base. The modules are divided into four major…

  3. Anatomy of a Book Controversy.

    ERIC Educational Resources Information Center

    Homstad, Wayne

    A major controversy arose in 1987 in a midwestern school district, after a middle school teacher assigned the novel "Go Ask Alice" to her seventh-grade class. This book describes the district's attempt to answer two basic questions: What should students read? and Who should decide what students read? The book controversy is first…

  4. Animal Guts as Ideal Reactors: An Open-Ended Project for a Course in Kinetics and Reactor Design.

    ERIC Educational Resources Information Center

    Carlson, Eric D.; Gast, Alice P.

    1998-01-01

    Presents an open-ended project tailored for a senior kinetics and reactor design course in which basic reactor design equations are used to model the digestive systems of several animals. Describes the assignment as well as the results. (DDR)

  5. Examining Beliefs about Interpersonal Communication and Relationships across Generations: An Assignment of Social Constructionism

    ERIC Educational Resources Information Center

    Aleman, Melissa Wood; Aleman, Carlos Galvan

    2007-01-01

    A basic premise of social approaches to studying communication is that theories of interpersonal communication and personal relationships are reflexively defined, socially constructed, and historically situated. In contrast to the tradition of psychological models of relational processes and message transmission, social approaches encourage…

  6. The Basic Principle of Calculus?

    ERIC Educational Resources Information Center

    Hardy, Michael

    2011-01-01

    A simple partial version of the Fundamental Theorem of Calculus can be presented on the first day of the first-year calculus course, and then relied upon repeatedly in assigned problems throughout the course. With that experience behind them, students can use the partial version to understand the full-fledged Fundamental Theorem, with further…

  7. Designing and Building a Collaborative Library Intranet for All

    ERIC Educational Resources Information Center

    Battles, Jason J.

    2010-01-01

    Intranets should provide quick and easy access to organizational information. The University of Alabama Libraries' intranet was only partially satisfying this basic expectation. Librarians could use it to find forms, policies, committee assignments, and meeting minutes, but navigating the libraries' intranet was neither quick nor easy, and it was…

  8. Profile of Specialist Teachers in Elementary Schools: Fine Arts.

    ERIC Educational Resources Information Center

    Bachelor, Barry G.

    The characteristics of elementary teachers of music, art, and drama in 14 southern California counties are profiled. The source data for the profile were the Professional Assignment Information Forms of the California Basic Educational Data System, completed annually by every certificated employee in California's public schools. The…

  9. A Consumer Education Strategy for the Primary Grades.

    ERIC Educational Resources Information Center

    Schofer, Gill

    1979-01-01

    Provides five exercises designed to include real world experiences in consumer education for primary students. The identification of basic food items, development of a shopping list, assignment of shopping behavior (careful, careless), and a field trip to the supermarket precede filling out a chart with comparative prices, brand names, and total…

  10. What Do They Know? An Assessment of Undergraduate Library Skills.

    ERIC Educational Resources Information Center

    Kunkel, Lilith R.; And Others

    1996-01-01

    Discusses a study conducted at Kent State University's (Ohio) regional campuses that measured the basic library competencies of incoming college freshmen. Results show that the frequency of student assignments was the best predictor of scores on the test measuring library skills. Implications for bibliographic instruction are discussed. (LRW)

  11. Solar Energy Education Packet for Elementary & Secondary Students.

    ERIC Educational Resources Information Center

    Center for Renewable Resources, Washington, DC.

    The arrangement of this packet is essentially evolutionary, with a conscious effort to alternate reading assignments, activities and experiments. It begins with solar energy facts and terminology as background to introduce the reader to basic concepts. It progresses into a discussion of passive solar systems. This is followed by several projects…

  12. Using a Population-Ecology Simulation in College Courses.

    ERIC Educational Resources Information Center

    Hinze, Kenneth E.

    1984-01-01

    Describes instructional use of a microcomputer version of the WORLD2 global population-ecology simulation. Reactions of students and instructors are discussed and a WORLD2 simulation assignment is appended. The BASIC version used by the author runs on Apple II, DOS 3.3, with 80 column board. (MBR)

  13. Tag Team Public Speaking

    ERIC Educational Resources Information Center

    Brigance, Linda Czuba

    2004-01-01

    Designing and presenting a speech is a solitary task. By definition, public speaking involves one person speaking to a group, which sets it apart from other types of communication situations, such as interpersonal and small group communication. Due to the inherently individualistic nature of assignments in the basic course, students rarely profit…

  14. Basic Vision Events.

    DTIC Science & Technology

    1982-08-01

    [Front-matter residue from the report: Figure 4, Resonance Raman Spectrum of Astaxanthin; Figure 5, Excitation Profile of CCl4 Solution of Astaxanthin; Table I, Suggested Vibrational Assignments and Frequencies (cm⁻¹).] …past year. These measurements on the carotenoid astaxanthin showed an alteration in the C=C stretching vibration from 1517 cm⁻¹ under the action…

  15. 14 CFR Sec. 1-4 - System of accounts coding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... General Accounting Provisions Sec. 1-4 System of accounts coding. (a) A four digit control number is assigned for each balance sheet and profit and loss account. Each balance sheet account is numbered sequentially, within blocks, designating basic balance sheet classifications. The first two digits of the four...

  16. HATS: A Design Procedure for Routine Business Documents.

    ERIC Educational Resources Information Center

    Baker, William H.

    2001-01-01

    Describes an approach to teaching students a basic design process for routine business documents like memos, letters, and reports. Outlines the design principles of HATS (Headings, Access, Typography, and Spacing), how they apply in before-and-after fashion to various documents, and discusses an assignment in which students redesign an existing…

  17. Using a Commercial Simulator to Teach Sorption Separations

    ERIC Educational Resources Information Center

    Wankat, Phillip C.

    2006-01-01

    The commercial simulator Aspen Chromatography was used in the computer laboratory of a dual-level course. The lab assignments used a cookbook approach to teach basic simulator operation and open-ended exploration to understand adsorption. The students learned theory better than in previous years despite having less lecture time. Students agreed…

  18. A Field-Based Technique for Teaching about Habitat Fragmentation and Edge Effects

    ERIC Educational Resources Information Center

    Resler, Lynn M.; Kolivras, Korine N.

    2009-01-01

    This article presents a field technique that exposes students to the indirect effects of habitat fragmentation on plant distributions through studying edge effects. This assignment, suited for students in an introductory biogeography or resource geography class, increases students' knowledge of basic biogeographic concepts such as environmental…

  19. Reclaiming Deviance as a Unique Course from Criminology Re-Revisited: Entering Delinquency into the Equation.

    ERIC Educational Resources Information Center

    Pino, Nathan W.

    2003-01-01

    Offers ideas for developing distinct deviance, delinquency, and criminology curricula. Discusses how to reduce theoretical and content overlap, paper assignments, course readings, and departmental issues. Finds overlap and review of basic theories were helpful to students. Recommends deviance, criminology, and delinquency courses be theoretically…

  20. Psychiatric Psychopathology: A Practicum Approach.

    ERIC Educational Resources Information Center

    Keller, John W.; Piotrowski, Chris

    This paper describes the University of West Florida graduate level, didactic/experiential course in psychopathology which has been offered since 1975 to introduce clinical psychology students to the applied and practical aspects of psychiatry. Elements of the basic practicum course are described: (1) each student is assigned to a psychiatrist on a…

  1. Apparel Marketing. [Student Manual] and Answer Book/Teacher's Guide.

    ERIC Educational Resources Information Center

    Gaskill, Melissa Lynn

    This document on apparel marketing contains both a student's manual and an answer book/teacher's guide. The student's manual contains the following 16 assignments: (1) introduction to fashion and fashion merchandising; (2) current fashion; (3) careers in fashion; (4) buying; (5) retailing; (6) merchandise basics; (7) merchandise--promotion and…

  2. Imaginative Approaches to Teaching the Basic Public Speaking Course: Roundtable Discussion.

    ERIC Educational Resources Information Center

    Johnson, Orin G.

    This paper presents 10 extemporaneous presentation assignments which are designed to review, test, and give students practice on every aspect necessary to become an effective public speaker in many situations. The paper begins with a description of the introductory communication course requirements and guidelines for grading of practice…

  3. 25 CFR 38.6 - Basic compensation for educators and education positions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... pay or benefits because of elections made under this section. (4) Stipends for extracurricular activities. An employee, if assigned to sponsor an approved extracurricular activity, may elect annually at... employee shall be paid the stipend in equal payments over the period of the extracurricular activity. [53...

  4. 25 CFR 38.6 - Basic compensation for educators and education positions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... pay or benefits because of elections made under this section. (4) Stipends for extracurricular activities. An employee, if assigned to sponsor an approved extracurricular activity, may elect annually at... employee shall be paid the stipend in equal payments over the period of the extracurricular activity. [53...

  5. 25 CFR 38.6 - Basic compensation for educators and education positions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... pay or benefits because of elections made under this section. (4) Stipends for extracurricular activities. An employee, if assigned to sponsor an approved extracurricular activity, may elect annually at... employee shall be paid the stipend in equal payments over the period of the extracurricular activity. [53...

  6. Injury Reduction Effectiveness of Assigning Running Shoes Based on Plantar Shape in Marine Corps Basic Training

    DTIC Science & Technology

    2010-06-24

    female recruits. Men and women were trained by drill instructors of their own sex. Because of logistical and geographical reasons alone, the training…CA. Negative first-term outcomes associated with lower extremity injury during recruit training among female Marine Corps graduates. Mil Med

  7. Small Business Course for Older Americans. Student Handbook.

    ERIC Educational Resources Information Center

    American Association of Community and Junior Colleges, Washington, DC.

    This student handbook was designed for a course which offers people aged 55 and older guidance in starting and operating a small business. Following introductory remarks concerning small businesses, information and assignment sheets related to each of the course's basic units are presented. Course units include the following: (1) Small Business…

  8. Surgical Techniques. Second Edition. Teacher Edition.

    ERIC Educational Resources Information Center

    Bushey, Vicki; And Others

    This instructor's manual contains 18 units of instruction for a course on surgical technology designed to include the entry-level competencies students need as a surgical technologist. Each unit includes some or all of the following basic components of a unit of instruction: objective sheet, suggested activities for the teacher, assignment sheets…

  9. 48 CFR 204.7003 - Basic PII number.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... procurement instrument is issued or awarded. (3) Position 9. Indicate the type of instrument by entering one...) Invitations for bids—B (iii) Contracts of all types except indefinite delivery contracts, sales contracts, and...—N (xv) Do not use—O (xvi) Purchase order—automated (assign V when numbering capacity of P is...

  10. Deducing chemical structure from crystallographically determined atomic coordinates

    PubMed Central

    Bruno, Ian J.; Shields, Gregory P.; Taylor, Robin

    2011-01-01

    An improved algorithm has been developed for assigning chemical structures to incoming entries to the Cambridge Structural Database, using only the information available in the deposited CIF. Steps in the algorithm include detection of bonds, selection of polymer unit, resolution of disorder, and assignment of bond types and formal charges. The chief difficulty is posed by the large number of metallo-organic crystal structures that must be processed, given our aspiration that assigned chemical structures should accurately reflect properties such as the oxidation states of metals and redox-active ligands, metal coordination numbers and hapticities, and the aromaticity or otherwise of metal ligands. Other complications arise from disorder, especially when it is symmetry imposed or modelled with the SQUEEZE algorithm. Each assigned structure is accompanied by an estimate of reliability and, where necessary, diagnostic information indicating probable points of error. Although the algorithm was written to aid building of the Cambridge Structural Database, it has the potential to develop into a general-purpose tool for adding chemical information to newly determined crystal structures. PMID:21775812
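
    The abstract does not detail the individual steps, so purely as an illustration of the first one (bond detection), a common heuristic is to call two atoms bonded when their separation is below the sum of their covalent radii plus a tolerance; the radii, tolerance and example molecule below are assumptions, not values from the CSD algorithm.

```python
# A minimal sketch of a distance-based bond-detection heuristic (not the CSD code):
# two atoms are bonded if their separation is less than the sum of covalent radii
# plus a tolerance.
import itertools
import math

COVALENT_RADII = {"H": 0.31, "C": 0.76, "N": 0.71, "O": 0.66}  # angstroms (approximate)
TOLERANCE = 0.4  # angstroms; a typical, adjustable margin

def detect_bonds(atoms):
    """atoms: list of (element, (x, y, z)) tuples with coordinates in angstroms."""
    bonds = []
    for (i, (el_i, p_i)), (j, (el_j, p_j)) in itertools.combinations(enumerate(atoms), 2):
        d = math.dist(p_i, p_j)
        if d < COVALENT_RADII[el_i] + COVALENT_RADII[el_j] + TOLERANCE:
            bonds.append((i, j, round(d, 3)))
    return bonds

water = [("O", (0.000, 0.000, 0.000)),
         ("H", (0.957, 0.000, 0.000)),
         ("H", (-0.240, 0.927, 0.000))]
print(detect_bonds(water))  # -> [(0, 1, 0.957), (0, 2, 0.958)]
```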

  11. Web-Based Problem-Solving Assignment and Grading System

    NASA Astrophysics Data System (ADS)

    Brereton, Giles; Rosenberg, Ronald

    2014-11-01

    In engineering courses with very specific learning objectives, such as fluid mechanics and thermodynamics, it is conventional to reinforce concepts and principles with problem-solving assignments and to measure success in problem solving as an indicator of student achievement. While the modern-day ease of copying and searching for online solutions can undermine the value of traditional assignments, web-based technologies also provide opportunities to generate individualized well-posed problems with an infinite number of different combinations of initial/final/boundary conditions, so that the probability of any two students being assigned identical problems in a course is vanishingly small. Such problems can be designed and programmed to be: single or multiple-step, self-grading, allow students single or multiple attempts; provide feedback when incorrect; selectable according to difficulty; incorporated within gaming packages; etc. In this talk, we discuss the use of a homework/exam generating program of this kind in a single-semester course, within a web-based client-server system that ensures secure operation.
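
    A minimal sketch of the individualized-problem idea (the function names, parameter ranges and grading tolerance are hypothetical, not the authors' system): seeding a pseudo-random generator with the student identifier makes each student's problem unique yet reproducible, and grading compares the submitted answer with the generated one within a tolerance.

```python
# Sketch: per-student problem generation and self-grading with a tolerance.
import random

def make_problem(student_id: str):
    """Generate an individualized steady-flow energy-balance style problem."""
    rng = random.Random(student_id)               # same ID -> same problem every time
    mass_flow = round(rng.uniform(1.0, 5.0), 2)   # kg/s
    delta_h = round(rng.uniform(50.0, 300.0), 1)  # kJ/kg
    answer = mass_flow * delta_h                  # kW
    prompt = (f"A fluid flows at {mass_flow} kg/s with an enthalpy rise of "
              f"{delta_h} kJ/kg. Compute the heat-transfer rate in kW.")
    return prompt, answer

def grade(submitted: float, answer: float, rel_tol: float = 0.01) -> bool:
    """Accept answers within a 1% relative tolerance."""
    return abs(submitted - answer) <= rel_tol * abs(answer)

prompt, answer = make_problem("student-42")
print(prompt)
print("Correct!" if grade(answer * 1.005, answer) else "Try again.")
```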

  12. Particles, Waves, and the Interpretation of Quantum Mechanics

    ERIC Educational Resources Information Center

    Christoudouleas, N. D.

    1975-01-01

    Presents an explanation, without mathematical equations, of the basic principles of quantum mechanics. Includes wave-particle duality, the probability character of the wavefunction, and the uncertainty relations. (MLH)

  13. Complex and unpredictable Cardano

    NASA Astrophysics Data System (ADS)

    Ekert, Artur

    2008-08-01

    This purely recreational paper is about one of the most colorful characters of the Italian Renaissance, Girolamo Cardano, and the discovery of two basic ingredients of quantum theory, probability and complex numbers.

  14. “C.R.E.A.T.E.”-ing Unique Primary-Source Research Paper Assignments for a Pleasure and Pain Course Teaching Neuroscientific Principles in a Large General Education Undergraduate Course

    PubMed Central

    Bodnar, Richard J.; Rotella, Francis M.; Loiacono, Ilyssa; Coke, Tricia; Olsson, Kerstin; Barrientos, Alicia; Blachorsky, Lauren; Warshaw, Deena; Buras, Agata; Sanchez, Ciara M.; Azad, Raihana; Stellar, James R.

    2016-01-01

    A large (250 registrants) General Education lecture course, Pleasure and Pain, presented basic neuroscience principles as they related to animal and human models of pleasure and pain by weaving basic findings related to food and drug addiction and analgesic states with human studies examining empathy, social neuroscience and neuroeconomics. In its first four years, the course grade was based on weighted scores from two multiple-choice exams and a five-page review of three unique peer-reviewed research articles. Although the course was well-registered and well-received, 18% of the students received Incomplete grades, primarily for failing to submit the paper; these Incompletes went largely unresolved and eventually resulted in failing grades. To rectify this issue, a modified version of the C.R.E.A.T.E. (Consider, Read, Elucidate hypotheses, Analyze and interpret data, Think of the next Experiment) method replaced the paper with eight structured assignments focusing on an initial general-topic article, the introduction-methods and results-discussion sections of each of three related peer-reviewed neuroscience articles, and a final summary. Compliance in completing these assignments was very high, resulting in only 11 INC grades out of 228 students. Thus, use of the C.R.E.A.T.E. method reduced the percentage of problematic INC grades from 18% to 4.8%, a 73% decline, without changing the overall grade distribution. Other analyses suggested the students achieved a deeper understanding of the scientific process using the C.R.E.A.T.E. method relative to the original term paper assignment. PMID:27385918

  15. "C.R.E.A.T.E."-ing Unique Primary-Source Research Paper Assignments for a Pleasure and Pain Course Teaching Neuroscientific Principles in a Large General Education Undergraduate Course.

    PubMed

    Bodnar, Richard J; Rotella, Francis M; Loiacono, Ilyssa; Coke, Tricia; Olsson, Kerstin; Barrientos, Alicia; Blachorsky, Lauren; Warshaw, Deena; Buras, Agata; Sanchez, Ciara M; Azad, Raihana; Stellar, James R

    2016-01-01

    A large (250 registrants) General Education lecture course, Pleasure and Pain, presented basic neuroscience principles as they related to animal and human models of pleasure and pain by weaving basic findings related to food and drug addiction and analgesic states with human studies examining empathy, social neuroscience and neuroeconomics. In its first four years, the course grade was based on weighted scores from two multiple-choice exams and a five-page review of three unique peer-reviewed research articles. Although the course was well-registered and well-received, 18% of the students received Incomplete grades, primarily for failing to submit the paper; these Incompletes went largely unresolved and eventually resulted in failing grades. To rectify this issue, a modified version of the C.R.E.A.T.E. (Consider, Read, Elucidate hypotheses, Analyze and interpret data, Think of the next Experiment) method replaced the paper with eight structured assignments focusing on an initial general-topic article, the introduction-methods and results-discussion sections of each of three related peer-reviewed neuroscience articles, and a final summary. Compliance in completing these assignments was very high, resulting in only 11 INC grades out of 228 students. Thus, use of the C.R.E.A.T.E. method reduced the percentage of problematic INC grades from 18% to 4.8%, a 73% decline, without changing the overall grade distribution. Other analyses suggested the students achieved a deeper understanding of the scientific process using the C.R.E.A.T.E. method relative to the original term paper assignment.

  16. Developing a rich definition of the person/residence to support person-oriented models of consumer product usage

    EPA Science Inventory

    Person Oriented Models (POMs) provide a basis for simulating aggregate chemical exposures in a population over time (Price and Chaisson, 2005). POMs assign characteristics to simulated individuals that are used to determine the individual’s probability of interacting with e...
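
    A hedged sketch of the person-oriented idea (the attributes, probabilities and product below are invented for illustration, not taken from the EPA model): characteristics assigned to each simulated person determine that person's daily probability of using a consumer product, which is then sampled over a year.

```python
# Sketch: assign characteristics to simulated individuals and use them to set a
# hypothetical daily probability of product use.
import random

random.seed(0)

def simulate_person(person_id: int, days: int = 365):
    age = random.randint(1, 80)
    has_young_child = age >= 20 and random.random() < 0.3
    # Hypothetical usage probability, higher for households with a young child.
    p_use = 0.05 + (0.10 if has_young_child else 0.0)
    use_days = sum(random.random() < p_use for _ in range(days))
    return {"id": person_id, "age": age, "young_child": has_young_child, "use_days": use_days}

population = [simulate_person(i) for i in range(5)]
for person in population:
    print(person)
```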

  17. Tracking the genetic stability of a honeybee breeding program with genetic markers

    USDA-ARS?s Scientific Manuscript database

    A genetic stock identification (GSI) assay was developed in 2008 to distinguish Russian honey bees from other honey bee stocks that are commercially produced in the United States. Probability of assignment (POA) values have been collected and maintained since the stock release in 2008 to the Russian...

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curry, J J; Gallagher, D W; Modarres, M

    Appendices are presented concerning isolation condenser makeup; vapor suppression system; station air system; reactor building closed cooling water system; turbine building secondary closed water system; service water system; emergency service water system; fire protection system; emergency ac power; dc power system; event probability estimation; methodology of accident sequence quantification; and assignment of dominant sequences to release categories.

  19. Quantum-Bayesian coherence

    NASA Astrophysics Data System (ADS)

    Fuchs, Christopher A.; Schack, Rüdiger

    2013-10-01

    In the quantum-Bayesian interpretation of quantum theory (or QBism), the Born rule cannot be interpreted as a rule for setting measurement-outcome probabilities from an objective quantum state. But if not, what is the role of the rule? In this paper, the argument is given that it should be seen as an empirical addition to Bayesian reasoning itself. Particularly, it is shown how to view the Born rule as a normative rule in addition to usual Dutch-book coherence. It is a rule that takes into account how one should assign probabilities to the consequences of various intended measurements on a physical system, but explicitly in terms of prior probabilities for and conditional probabilities consequent upon the imagined outcomes of a special counterfactual reference measurement. This interpretation is exemplified by representing quantum states in terms of probabilities for the outcomes of a fixed, fiducial symmetric informationally complete measurement. The extent to which the general form of the new normative rule implies the full state-space structure of quantum mechanics is explored.
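
    Written for a d-dimensional system with a SIC reference measurement of d² outcomes, the normative rule described above takes the following form in the QBism literature (quoted as a sketch, not derived here): p(i) are the probabilities assigned to the outcomes of the counterfactual reference measurement, r(j|i) the conditionals for the intended measurement, and q(j) the probabilities the rule prescribes. Replacing the bracketed factor by p(i) recovers the classical law of total probability, which is the sense in which the Born rule is an addition to Dutch-book coherence.

```latex
\[
q(j) \;=\; \sum_{i=1}^{d^{2}} \left[ (d+1)\,p(i) - \frac{1}{d} \right] r(j \mid i),
\qquad\text{compare } q(j) = \sum_{i} p(i)\, r(j \mid i).
\]
```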

  20. Non-Kolmogorovian Approach to the Context-Dependent Systems Breaking the Classical Probability Law

    NASA Astrophysics Data System (ADS)

    Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Yamato, Ichiro

    2013-07-01

    There exist several phenomena breaking the classical probability laws. The systems related to such phenomena are context-dependent, so that they are adaptive to other systems. In this paper, we present a new mathematical formalism to compute the joint probability distribution for two event-systems by using concepts of the adaptive dynamics and quantum information theory, e.g., quantum channels and liftings. In physics the basic example of the context-dependent phenomena is the famous double-slit experiment. Recently similar examples have been found in biological and psychological sciences. Our approach is an extension of traditional quantum probability theory, and it is general enough to describe aforementioned contextual phenomena outside of quantum physics.

  1. Benchmarking PARTISN with Analog Monte Carlo: Moments of the Neutron Number and the Cumulative Fission Number Probability Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Rourke, Patrick Francis

    The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.
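
    A toy analog Monte Carlo sketch in the same spirit (not the report's code; the fission probability, multiplicity and generation count are arbitrary): each neutron either disappears or produces ν new neutrons, and the first two moments of the neutron number after a fixed number of generations are estimated from many independent histories.

```python
# Toy branching-process Monte Carlo for moments of the neutron number.
import random
from statistics import mean, pvariance

P_FISSION = 0.45      # hypothetical per-neutron fission probability
NU = 2                # neutrons released per fission (fixed, for simplicity)
GENERATIONS = 10
HISTORIES = 20_000

def one_history(rng: random.Random) -> int:
    n = 1
    for _ in range(GENERATIONS):
        # Each neutron either causes fission (producing NU neutrons) or is absorbed.
        n = sum(NU for _ in range(n) if rng.random() < P_FISSION)
        if n == 0:
            break
    return n

rng = random.Random(2024)
counts = [one_history(rng) for _ in range(HISTORIES)]
print(f"mean neutron number = {mean(counts):.3f}")
print(f"variance            = {pvariance(counts):.3f}")
```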

  2. Selfish routing equilibrium in stochastic traffic network: A probability-dominant description.

    PubMed

    Zhang, Wenyi; He, Zhengbing; Guan, Wei; Ma, Rui

    2017-01-01

    This paper suggests a probability-dominant user equilibrium (PdUE) model to describe the selfish routing equilibrium in a stochastic traffic network. At PdUE, travel demands are only assigned to the most dominant routes in the same origin-destination pair. A probability-dominant rerouting dynamic model is proposed to explain the behavioral mechanism of PdUE. To facilitate applications, the logit formula of PdUE is developed, for which a well-designed route set is not indispensable and whose equivalent variational inequality formulation is simple. Two routing strategies, i.e., the probability-dominant strategy (PDS) and the dominant probability strategy (DPS), are discussed through a hypothetical experiment. It is found that, whether one seeks insurance or strives for perfection, PDS is a better choice than DPS. For more general cases, the conducted numerical tests lead to the same conclusion. These imply that PdUE (rather than the conventional stochastic user equilibrium) is a desirable selfish routing equilibrium for a stochastic network, given that the probability distributions of travel time are available to travelers.

  3. Selfish routing equilibrium in stochastic traffic network: A probability-dominant description

    PubMed Central

    Zhang, Wenyi; Guan, Wei; Ma, Rui

    2017-01-01

    This paper suggests a probability-dominant user equilibrium (PdUE) model to describe the selfish routing equilibrium in a stochastic traffic network. At PdUE, travel demands are only assigned to the most dominant routes in the same origin-destination pair. A probability-dominant rerouting dynamic model is proposed to explain the behavioral mechanism of PdUE. To facilitate applications, the logit formula of PdUE is developed, for which a well-designed route set is not indispensable and whose equivalent variational inequality formulation is simple. Two routing strategies, i.e., the probability-dominant strategy (PDS) and the dominant probability strategy (DPS), are discussed through a hypothetical experiment. It is found that, whether one seeks insurance or strives for perfection, PDS is a better choice than DPS. For more general cases, the conducted numerical tests lead to the same conclusion. These imply that PdUE (rather than the conventional stochastic user equilibrium) is a desirable selfish routing equilibrium for a stochastic network, given that the probability distributions of travel time are available to travelers. PMID:28829834

  4. Culture and Probability Judgment Accuracy: The Influence of Holistic Reasoning.

    PubMed

    Lechuga, Julia; Wiebe, John S

    2011-08-01

    A well-established phenomenon in the judgment and decision-making tradition is the overconfidence one places in the amount of knowledge that one possesses. Overconfidence or probability judgment accuracy varies not only individually but also across cultures. However, research efforts to explain cross-cultural variations in the overconfidence phenomenon have seldom been made. In Study 1, the authors compared the probability judgment accuracy of U.S. Americans (N = 108) and Mexican participants (N = 100). In Study 2, they experimentally primed culture by randomly assigning English/Spanish bilingual Mexican Americans (N = 195) to response language. Results of both studies replicated the cross-cultural variation of probability judgment accuracy previously observed in other cultural groups. U.S. Americans displayed less overconfidence when compared to Mexicans. These results were then replicated in bilingual participants, when culture was experimentally manipulated with language priming. Holistic reasoning did not account for the cross-cultural variation of overconfidence. Suggestions for future studies are discussed.

  5. Outcomes of Fort Jackson's Physical Training and Rehabilitation Program in army basic combat training: return to training, graduation, and 2-year retention.

    PubMed

    Hauret, Keith G; Knapik, Joseph J; Lange, Jeffrey L; Heckel, Heidi A; Coval, Dana L; Duplessis, David H

    2004-07-01

    Basic trainees at Fort Jackson, South Carolina, who were unable to continue basic combat training (BCT) because of a serious injury were assigned to the Physical Training and Rehabilitation Program (PTRP). Between January 3, 1998 and July 24, 2001, 4258 trainees were assigned to the PTRP. Using a retrospective cohort study design, return to training and BCT graduation rates were evaluated. PTRP graduates were compared with matched non-PTRP graduates for 2-year retention in the Army. More PTRP women than men were discharged from the PTRP (60% and 48%, respectively, p < 0.01). Of PTRP trainees returning to BCT, 10% and 12% of men and women, respectively, were discharged from the Army compared with overall Fort Jackson discharge rates of 9% and 15% for men and women, respectively. Comparing PTRP graduates to matched non-PTRP graduates, there were no differences in 2-year retention for men (14.9% and 14.7%, respectively; p = 0.93) or women (26.6% and 30.1%, respectively; p = 0.19). Despite the high discharge rate in the PTRP, the BCT discharge rate for trainees who successfully rehabilitated was similar to the overall discharge rate at Fort Jackson. The 2-year retention in service for PTRP trainees who graduated from BCT was similar to that of non-PTRP trainees.

  6. A Foreign Object Damage Event Detector Data Fusion System for Turbofan Engines

    NASA Technical Reports Server (NTRS)

    Turso, James A.; Litt, Jonathan S.

    2004-01-01

    A Data Fusion System designed to provide a reliable assessment of the occurrence of Foreign Object Damage (FOD) in a turbofan engine is presented. The FOD-event feature level fusion scheme combines knowledge of shifts in engine gas path performance obtained using a Kalman filter, with bearing accelerometer signal features extracted via wavelet analysis, to positively identify a FOD event. A fuzzy inference system provides basic probability assignments (bpa) based on features extracted from the gas path analysis and bearing accelerometers to a fusion algorithm based on the Dempster-Shafer-Yager Theory of Evidence. Details are provided on the wavelet transforms used to extract the foreign object strike features from the noisy data and on the Kalman filter-based gas path analysis. The system is demonstrated using a turbofan engine combined-effects model (CEM), providing both gas path and rotor dynamic structural response, and is suitable for rapid-prototyping of control and diagnostic systems. The fusion of the disparate data can provide significantly more reliable detection of a FOD event than the use of either method alone. The use of fuzzy inference techniques combined with Dempster-Shafer-Yager Theory of Evidence provides a theoretical justification for drawing conclusions based on imprecise or incomplete data.
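
    An illustrative sketch of the evidence-combination step only (the masses below are invented; in the paper the BPAs come from the fuzzy inference system): two BPAs over the frame {FOD, NO_FOD} are combined with Dempster's rule and, for comparison, with Yager's modification, which assigns the conflicting mass to the whole frame instead of renormalizing.

```python
# Sketch: combine two basic probability assignments over a two-element frame.
from itertools import product

FRAME = frozenset({"FOD", "NO_FOD"})

# Hypothetical BPAs; m1 might summarize gas-path features, m2 accelerometer features.
m1 = {frozenset({"FOD"}): 0.6, frozenset({"NO_FOD"}): 0.1, FRAME: 0.3}
m2 = {frozenset({"FOD"}): 0.5, frozenset({"NO_FOD"}): 0.2, FRAME: 0.3}

def combine(m1, m2, yager=False):
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if yager:
        # Yager's modification: give the conflicting mass to the whole frame.
        combined[FRAME] = combined.get(FRAME, 0.0) + conflict
    else:
        # Dempster's rule: renormalize by the non-conflicting mass.
        combined = {s: w / (1.0 - conflict) for s, w in combined.items()}
    return combined

print("Dempster:", {tuple(sorted(s)): round(w, 3) for s, w in combine(m1, m2).items()})
print("Yager:   ", {tuple(sorted(s)): round(w, 3) for s, w in combine(m1, m2, yager=True).items()})
```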

  7. A survey of computational methods and error rate estimation procedures for peptide and protein identification in shotgun proteomics

    PubMed Central

    Nesvizhskii, Alexey I.

    2010-01-01

    This manuscript provides a comprehensive review of the peptide and protein identification process using tandem mass spectrometry (MS/MS) data generated in shotgun proteomic experiments. The commonly used methods for assigning peptide sequences to MS/MS spectra are critically discussed and compared, from basic strategies to advanced multi-stage approaches. Particular attention is paid to the problem of false-positive identifications. Existing statistical approaches for assessing the significance of peptide to spectrum matches are surveyed, ranging from single-spectrum approaches such as expectation values to global error rate estimation procedures such as false discovery rates and posterior probabilities. The importance of using auxiliary discriminant information (mass accuracy, peptide separation coordinates, digestion properties, etc.) is discussed, and advanced computational approaches for joint modeling of multiple sources of information are presented. This review also includes a detailed analysis of the issues affecting the interpretation of data at the protein level, including the amplification of error rates when going from peptide to protein level, and the ambiguities in inferring the identities of sample proteins in the presence of shared peptides. Commonly used methods for computing protein-level confidence scores are discussed in detail. The review concludes with a discussion of several outstanding computational issues. PMID:20816881
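
    As a minimal illustration of one of the global error-rate ideas surveyed (the scores and labels below are synthetic, not real search results): in target-decoy FDR estimation, the false discovery rate among peptide-spectrum matches above a score threshold is estimated from the ratio of decoy to target matches above that threshold.

```python
# Sketch: target-decoy false discovery rate (FDR) estimation at a score threshold.
def fdr_at_threshold(psms, threshold):
    """psms: list of (score, is_decoy). Estimate FDR among matches above threshold."""
    targets = sum(1 for s, d in psms if s >= threshold and not d)
    decoys = sum(1 for s, d in psms if s >= threshold and d)
    return (decoys / targets) if targets else 0.0

# Synthetic peptide-spectrum matches against target (False) and decoy (True) databases.
psms = [(9.1, False), (8.7, False), (8.2, True), (7.9, False),
        (7.5, False), (7.1, True), (6.8, False), (6.0, True)]

for t in (8.0, 7.0, 6.0):
    print(f"score >= {t}: estimated FDR = {fdr_at_threshold(psms, t):.2f}")
```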

  8. Ménage-à-trois: the amoeba Nuclearia sp. from Lake Zurich with its ecto- and endosymbiotic bacteria.

    PubMed

    Dirren, Sebastian; Salcher, Michaela M; Blom, Judith F; Schweikert, Michael; Posch, Thomas

    2014-09-01

    We present a fascinating triad relationship between a eukaryotic amoeba and its two bacterial symbionts. The morphological characteristics of the amoeba allowed for a confident assignment to the genus Nuclearia (Opisthokonta, Nucleariidae), but species identification gave an ambiguous result. Sequence analysis indicated an affiliation to the species N. thermophila; however, several morphological features contradict the original description. Amoebal isolates were cultured for several years with their preferred food source, the microcystin-producing harmful cyanobacterium Planktothrix rubescens. Symbioses of the amoeba with ecto- and endosymbiotic bacteria were maintained over this period. Several thousand cells of the ectosymbiont are regularly arranged inside a layer of extracellular polymeric substances produced by the amoeba. The ectosymbiont was identified as Paucibacter toxinivorans (Betaproteobacteria), which was originally isolated by enrichment with microcystins. We found indications that our isolated ectosymbiont indeed contributed to toxin degradation. The endosymbiont (Gammaproteobacteria, 15-20 bacteria per amoeba) is enclosed in symbiosomes inside the host cytoplasm and probably represents an obligate symbiont. We propose the name "Candidatus Endonucleariobacter rarus" for this bacterium, which has been found neither free-living nor in any other symbiotic association. Nucleariidae are uniquely suited model organisms to study the basic principles of symbioses between opisthokonts and prokaryotes. Copyright © 2014 Elsevier GmbH. All rights reserved.

  9. Variations and Determinants of Mortality and Length of Stay of Very Low Birth Weight and Very Low for Gestational Age Infants in Seven European Countries.

    PubMed

    Fatttore, Giovanni; Numerato, Dino; Peltola, Mikko; Banks, Helen; Graziani, Rebecca; Heijink, Richard; Over, Eelco; Klitkou, Søren Toksvig; Fletcher, Eilidh; Mihalicza, Péter; Sveréus, Sofia

    2015-12-01

    The EuroHOPE very low birth weight and very low for gestational age infants study aimed to measure and explain variation in mortality and length of stay (LoS) in the populations of seven European nations (Finland, Hungary, Italy (only the province of Rome), the Netherlands, Norway, Scotland and Sweden). Data were linked from birth, hospital discharge and mortality registries. For each infant basic clinical and demographic information, infant mortality and LoS at 1 year were retrieved. In addition, socio-economic variables at the regional level were used. Results based on 16,087 infants confirm that gestational age and Apgar score at 5 min are important determinants of both mortality and LoS. In most countries, infants admitted or transferred to third-level hospitals showed lower probability of death and longer LoS. In the meta-analyses, the combined estimates show that being male, multiple births, presence of malformations, per capita income and low population density are significant risk factors for death. It is essential that national policies improve the quality of administrative datasets and address systemic problems in assigning identification numbers at birth. European policy should aim at improving the comparability of data across jurisdictions. Copyright © 2015 John Wiley & Sons, Ltd.

  10. Two-dimensional fuzzy fault tree analysis for chlorine release from a chlor-alkali industry using expert elicitation.

    PubMed

    Renjith, V R; Madhu, G; Nayagam, V Lakshmana Gomathi; Bhasi, A B

    2010-11-15

    The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of these hazards related to chemical industries. Fault tree analysis (FTA) is an established technique in hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from storage and filling facility of chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation. Copyright © 2010 Elsevier B.V. All rights reserved.
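
    A crisp (non-fuzzy) sketch of the gate arithmetic underlying such an analysis, with invented event names and probabilities rather than the study's data: basic-event probabilities are combined through OR and AND gates under an independence assumption; the fuzzy and TDFFTA treatments described above replace these point probabilities with fuzzy numbers elicited from experts.

```python
# Sketch: top-event probability from basic events via OR/AND gates (independence assumed).
def gate_or(probs):
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def gate_and(probs):
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical basic events, not values from the chlor-alkali study.
valve_leak, gasket_failure = 1e-3, 5e-4            # either one opens a release path (OR)
release_path = gate_or([valve_leak, gasket_failure])
scrubber_down, operator_misses_alarm = 2e-2, 1e-1  # safeguards must also fail (AND)
top_event = gate_and([release_path, scrubber_down, operator_misses_alarm])
print(f"P(chlorine release, illustrative) = {top_event:.3e}")
```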

  11. Assignment of absolute stereostructures through quantum mechanics electronic and vibrational circular dichroism calculations.

    PubMed

    Dai, Peng; Jiang, Nan; Tan, Ren-Xiang

    2016-01-01

    Elucidation of absolute configuration of chiral molecules including structurally complex natural products remains a challenging problem in organic chemistry. A reliable method for assigning the absolute stereostructure is to combine the experimental circular dichroism (CD) techniques such as electronic and vibrational CD (ECD and VCD), with quantum mechanics (QM) ECD and VCD calculations. Continuing developments of the traditional QM methods have made them both more widely applicable and more accurate. Taking some chiral natural products with diverse conformations as examples, this review describes the basic concepts and new developments of QM approaches for ECD and VCD calculations in solution and solid states.

  12. Homeostatic Systems--Mechanisms for Survival. Science IV.

    ERIC Educational Resources Information Center

    Pfeiffer, Carl H.

    The two student notebooks in this set provide the basic outline and assignments for the fourth and last year of a senior high school unified science program which builds on the technical third year course, Science IIIA (see SE 012 149). An introductory section considers the problems of survival inherent in living systems, matter-energy…

  13. Configuration-Control Scheme Copes With Singularities

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun; Colbaugh, Richard D.

    1993-01-01

    Improved configuration-control scheme for robotic manipulator having redundant degrees of freedom suppresses large joint velocities near singularities, at expense of small trajectory errors. Provides means to enforce order of priority of tasks assigned to robot. Basic concept of configuration control of redundant robot described in "Increasing The Dexterity Of Redundant Robots" (NPO-17801).

  14. 20 CFR 655.1308 - Offered wage rate.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    .... Recruitment for this purpose begins when the job order is accepted by the SWA for posting. (d) Wage offer. The... job offers for beginning level employees who have a basic understanding of the occupation. These... monitored and reviewed for accuracy. (2) Level II wage rates are assigned to job offers for employees who...

  15. 77 FR 39741 - Solicitation for a Cooperative Agreement-Curricula Review and Revision: NIC Trainer Development...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-05

    ... virtual instructor-led trainings on, for example, distance learning or the effective use of social media in a learning environment. This medium is also ideal for orientation, expectations, and other basics; (C) Reading assignments on current research; (D) Discussion forums, blogs, and/or social media...

  16. 7 CFR 1940.592 - Community facilities grants.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...). (b) Basic formula criteria, data source, and weight. See § 1940.552(b). (1) The criteria used in the... percentage of National rural population with income below the poverty level—50 percent. (2) Data source for each of these criterion is based on the latest census data available. Each criterion is assigned a...

  17. Using Calibrated Peer Review to Teach Basic Research Skills

    ERIC Educational Resources Information Center

    Bracke, Marianne S.; Graveel, John G.

    2014-01-01

    Calibrated Peer Review (CPR) is an online tool being used in the class Introduction to Agriculture at Purdue University (AGR 10100) to integrate a writing and research component (http://cpr.molsci.ucla.edu/Home.aspx). Calibrated Peer Review combines the ability to create writing-intensive assignments with an introduction to the peer-review…

  18. Solar Energy Education Packet for Elementary & Secondary Students. Revised Edition.

    ERIC Educational Resources Information Center

    Center for Renewable Resources, Washington, DC.

    The arrangement of this packet is essentially evolutionary, with a conscious effort to alternate reading assignments, activities and experiments. It begins with solar energy facts and terminology as background to introduce the reader to basic concepts. It progresses into a discussion of passive solar systems. This is followed by several projects…

  19. 45 CFR 263.2 - What kinds of State expenditures count toward meeting a State's basic MOE expenditure requirement?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN...) Cash assistance, including the State's share of the assigned child support collection that is... payment; (2) Child care assistance (see § 263.3); (3) Education activities designed to increase self...

  20. 45 CFR 263.2 - What kinds of State expenditures count toward meeting a State's basic MOE expenditure requirement?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN...) Cash assistance, including the State's share of the assigned child support collection that is... payment; (2) Child care assistance (see § 263.3); (3) Education activities designed to increase self...

  1. 45 CFR 263.2 - What kinds of State expenditures count toward meeting a State's basic MOE expenditure requirement?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN...) Cash assistance, including the State's share of the assigned child support collection that is... payment; (2) Child care assistance (see § 263.3); (3) Education activities designed to increase self...

  2. 45 CFR 263.2 - What kinds of State expenditures count toward meeting a State's basic MOE expenditure requirement?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN...) Cash assistance, including the State's share of the assigned child support collection that is... payment; (2) Child care assistance (see § 263.3); (3) Education activities designed to increase self...

  3. 45 CFR 263.2 - What kinds of State expenditures count toward meeting a State's basic MOE expenditure requirement?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN...) Cash assistance, including the State's share of the assigned child support collection that is... payment; (2) Child care assistance (see § 263.3); (3) Education activities designed to increase self...

  4. Use of an Interactive General-Purpose Computer Terminal to Simulate Training Equipment Operation.

    ERIC Educational Resources Information Center

    Lahey, George F.; And Others

    Trainees from Navy Basic Electricity/Electronics School were assigned to receive either computer-assisted instruction (CAI) or conventional individualized instruction in a segment of a course requiring use of a multimeter to measure resistance and current flow. The CAI group used PLATO IV plasma-screen terminals; individualized instruction…

  5. Searching for Buried Treasure: Uncovering Discovery in Discovery-Based Learning

    ERIC Educational Resources Information Center

    Chase, Kiera; Abrahamson, Dor

    2018-01-01

    Forty 4th and 9th grade students participated individually in tutorial interviews centered on a problem-solving activity designed for learning basic algebra mechanics through diagrammatic modeling of an engaging narrative about a buccaneering giant burying and unearthing her treasure on a desert island. Participants were randomly assigned to…

  6. Fostering First Graders' Reasoning Strategies with Basic Sums: The Value of Guided Instruction

    ERIC Educational Resources Information Center

    Purpura, David J.; Baroody, Arthur J.; Eiland, Michael D.; Reid, Erin E.

    2016-01-01

    An intervention experiment served to evaluate the efficacy of highly guided discovery learning of relations underlying add-1 and doubles combination families and to compare the impact of such instruction with minimally guided instruction. After a pretest, 78 first graders were randomly assigned to one of three intervention conditions: highly…

  7. The Rolling Can Investigation: Towards an Explanation

    ERIC Educational Resources Information Center

    Ireson, Gren; Twidle, John

    2005-01-01

    This paper presents a context-led approach to rotational dynamics. Using nothing more than two cans of cola, the basic notions of linear velocity, angular velocity, moments of inertia and conservation of energy can be explored. The approach works equally well as a demonstration or as an investigative assignment. The same starting point…

  8. Training for International Development: A Summary of Faculty and Foreign Student Interviews.

    ERIC Educational Resources Information Center

    Wallace, George; And Others

    To determine a basic design for training Colorado State University (CSU) faculty for assignment to international development programs, a written questionnaire and oral interview were administered to faculty with experience in international programs in Africa, the Middle East, and Latin America. A subset of 10 selected from each geographical…

  9. Demonstration Assessment: Measuring Conceptual Understanding and Critical Thinking with Rubrics.

    ERIC Educational Resources Information Center

    Radford, David L.; And Others

    1995-01-01

    Presents the science demonstration assessment as an authentic- assessment technique to assess whether students understand basic science concepts and can use them to solve problems. Uses rubrics to prepare students for the assessment and to assign final grades. Provides examples of science demonstration assessments and the scoring of rubrics in the…

  10. Why Does My Cruorine Change Color? Using Classic Research Articles To Teach Biochemistry Topics.

    ERIC Educational Resources Information Center

    White, Harold B., III

    2001-01-01

    Uses the spectroscopic study by G.G. Stokes of the reversible "oxidation and reduction" of hemoglobin to illustrate how a series of open-ended group assignments and associated classroom demonstrations can be built around a single article in a way that integrates and illuminates basic concepts. (Author/MM)

  11. Goal Setting: A Strategy for Reducing Health Disparities

    ERIC Educational Resources Information Center

    Young, Tara D.; Barrett, Gloria J.; Martin, Anna C.; Metz, Diane L.; Kaiser, Lucia L.; Steinberg, Francene M.

    2011-01-01

    The Healthy Rewards study tested the effectiveness of goal setting to encourage behavior change in Latino and African American adults in three northern California counties. Four groups of adults were alternately assigned to receive either 1) basic health promotion and nutrition education without goal setting (control) or 2) the same education with…

  12. What's in a Grade?

    ERIC Educational Resources Information Center

    Hughey, Jim D.; Harper, Bena

    A study explored the processes and attitudes that occur when assigning students a final course grade. The final grades for 1,578 students in a basic communication course were used in discriminant analyses. The level (the mean of all grades given) and the spread (standard deviation of all grades given) were estimated for each of 17 instructors. The…

  13. 41 CFR 301-73.106 - What are the basic services that should be covered by a TMS?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... confirmation and seat assignment, compliance with the Fly America Act, Governmentwide travel policies, contract...Rooms properties, per diem rate availability, etc.). (3) Car rental and rail information (e.g... reservations by type of service (common carrier, lodging, and car rental); (2) Extent to which reservations are...

  14. 41 CFR 301-73.106 - What are the basic services that should be covered by a TMS?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... confirmation and seat assignment, compliance with the Fly America Act, Governmentwide travel policies, contract...Rooms properties, per diem rate availability, etc.). (3) Car rental and rail information (e.g... reservations by type of service (common carrier, lodging, and car rental); (2) Extent to which reservations are...

  15. 41 CFR 301-73.106 - What are the basic services that should be covered by a TMS?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... confirmation and seat assignment, compliance with the Fly America Act, Governmentwide travel policies, contract...Rooms properties, per diem rate availability, etc.). (3) Car rental and rail information (e.g... reservations by type of service (common carrier, lodging, and car rental); (2) Extent to which reservations are...

  16. 41 CFR 301-73.106 - What are the basic services that should be covered by a TMS?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... confirmation and seat assignment, compliance with the Fly America Act, Governmentwide travel policies, contract...Rooms properties, per diem rate availability, etc.). (3) Car rental and rail information (e.g... reservations by type of service (common carrier, lodging, and car rental); (2) Extent to which reservations are...

  17. Major Appliance Repair. Teacher Edition and Student Edition. Second Edition.

    ERIC Educational Resources Information Center

    Smreker, Gene; Calvert, King

    This second edition contains teacher and student guides for 14 units of instruction in major appliance repair. Each unit in the teacher edition includes some or all of the following basic components: objective sheet, suggested activities, answers to assignment sheets, answers to the written test, written test, a unit evaluation form, teacher…

  18. An Online Course of Business Statistics: The Proportion of Successful Students

    ERIC Educational Resources Information Center

    Pena-Sanchez, Rolando

    2009-01-01

    This article describes the students' academic progress in an online course of business statistics through interactive software assignments and diverse educational homework, which helps these students to build their own e-learning through basic competences; i.e. interpreting results and solving problems. Cross-tables were built for the categorical…

  19. Lightning Strike Peak Current Probabilities as Related to Space Shuttle Operations

    NASA Technical Reports Server (NTRS)

    Johnson, Dale L.; Vaughan, William W.

    2000-01-01

    A summary is presented of basic lightning characteristics/criteria applicable to current and future aerospace vehicles. The paper provides estimates of the probability of occurrence of a 200 kA peak lightning return current, should lightning strike an aerospace vehicle in various operational phases, i.e., roll-out, on-pad, launch, reenter/land, and return-to-launch site. A literature search was conducted for previous work concerning occurrence and measurement of peak lightning currents, modeling, and estimating the probabilities of launch vehicles/objects being struck by lightning. This paper presents a summary of these results.

  20. Integrated Design of Basic Training, Practicum and End-of-Course Assignment Modules in the Teacher Training Degree: Perception of University Teachers, Students, and School Teachers

    NASA Astrophysics Data System (ADS)

    Torremorell, Maria Carme Boqué; de Nicolás, Montserrat Alguacil; Valls, Mercè Pañellas

    Teacher training at the Blanquerna Faculty of Psychology and Educational and Sports Sciences (FPCEE), in Barcelona, has a long pedagogical tradition based on teaching innovation. Its educational style is characterised by methods focused on the students' involvement and on close collaboration with teaching practice centres. Within a core subject of the Teacher Training diploma course, students were asked to assess different methodological proposals aimed at promoting the development of their personal, social, and professional competences. In the assessment surveys, from a sample of 145 students, the proportion of "very satisfactory" or "satisfactory" ratings ranged from 95.8% to 83.4% across the entire set of methodological actions under analysis. Data obtained in this first research phase were very useful for designing the basic training modules of the new Teacher Training Degree. In the second phase (in progress), practising teachers are asked for their perceptions of the orientation of the practicum, its connection with the end-of-course assignment, and the student teacher's influence on innovation processes at school.

  1. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    PubMed

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  2. Measuring and managing risk improves strategic financial planning.

    PubMed

    Kleinmuntz, D N; Kleinmuntz, C E; Stephen, R G; Nordlund, D S

    1999-06-01

    Strategic financial risk assessment is a practical technique that can enable healthcare strategic decision makers to perform quantitative analyses of the financial risks associated with a given strategic initiative. The technique comprises six steps: (1) list risk factors that might significantly influence the outcomes, (2) establish best-guess estimates for assumptions regarding how each risk factor will affect its financial outcomes, (3) identify risk factors that are likely to have the greatest impact, (4) assign probabilities to assumptions, (5) determine potential scenarios associated with combined assumptions, and (6) determine the probability-weighted average of the potential scenarios.
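
    A small sketch of steps 4 through 6 (the risk factors, probabilities and NPV figures below are hypothetical): probabilities assigned to each assumption are combined into scenarios, and the probability-weighted average of the scenario outcomes summarizes the initiative's expected financial impact.

```python
# Sketch: enumerate combined scenarios and compute their probability-weighted average.
from itertools import product

# Each risk factor: (assumption name, probability, NPV contribution in $ millions).
volume = [("high demand", 0.4, 12.0), ("low demand", 0.6, 4.0)]
reimbursement = [("rates hold", 0.7, 3.0), ("rates cut", 0.3, -2.0)]

scenarios = []
for (v_name, v_p, v_npv), (r_name, r_p, r_npv) in product(volume, reimbursement):
    scenarios.append((f"{v_name} / {r_name}", v_p * r_p, v_npv + r_npv))

expected_npv = sum(p * npv for _, p, npv in scenarios)
for name, p, npv in scenarios:
    print(f"{name:25s}  p={p:.2f}  NPV={npv:+.1f}")
print(f"Probability-weighted NPV = {expected_npv:+.2f}")
```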

  3. Contact networks and the study of contagion.

    PubMed

    Hartigan, P M

    1980-09-01

    The contact network among individuals in a patient group and in a control group is examined. The probability of knowing another person is modelled with parameters assigned to various factors, such as age, sex or disease, which may influence this probability. Standard likelihood techniques are used to estimate the parameters and to test the significance of the hypotheses, in particular the hypothesis of contagion, generated in the modelling process. The method is illustrated in a study of the Yale student body, in which infectious mononucleosis patients of the opposite sex are shown to know each other significantly more frequently than expected.
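
    A hedged sketch of the modelling idea: the probability that two individuals know each other is written as a logistic function of pair-level factors, and the parameters are estimated by maximum likelihood. The covariates and data below are simulated placeholders, not the study's.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical pair-level covariates: same sex, same age group, both have the disease
n_pairs = 2000
X = rng.integers(0, 2, size=(n_pairs, 3)).astype(float)
X = np.column_stack([np.ones(n_pairs), X])          # add intercept
true_beta = np.array([-2.0, 0.5, 0.8, 1.2])         # last factor mimics a "contagion" effect
p = 1 / (1 + np.exp(-X @ true_beta))
y = rng.binomial(1, p)                               # 1 = the two individuals know each other

def neg_log_lik(beta):
    eta = X @ beta
    return -np.sum(y * eta - np.log1p(np.exp(eta)))  # logistic log-likelihood

fit = minimize(neg_log_lik, x0=np.zeros(4), method="BFGS")
print("MLE of factor effects:", np.round(fit.x, 2))
```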

  4. Super-channel oriented routing, spectrum and core assignment under crosstalk limit in spatial division multiplexing elastic optical networks

    NASA Astrophysics Data System (ADS)

    Zhao, Yongli; Zhu, Ye; Wang, Chunhui; Yu, Xiaosong; Liu, Chuan; Liu, Binglin; Zhang, Jie

    2017-07-01

    With the capacity increase in optical networks enabled by spatial division multiplexing (SDM) technology, SDM elastic optical networks (SDM-EONs) have attracted much attention from both academia and industry. Super-channels are an important type of service provisioning in SDM-EONs. This paper focuses on the issue of super-channel construction in SDM-EONs. A mixed super-channel oriented routing, spectrum and core assignment (MS-RSCA) algorithm is proposed for SDM-EONs that takes inter-core crosstalk into account. Simulation results show that MS-RSCA can improve spectrum resource utilization and significantly reduce blocking probability compared with baseline RSCA algorithms.

  5. Supplementary health insurance as a tool for risk-selection in mandatory basic health insurance markets.

    PubMed

    Paolucci, Francesco; Schut, Erik; Beck, Konstantin; Gress, Stefan; Van de Voorde, Carine; Zmora, Irit

    2007-04-01

    As the share of supplementary health insurance (SI) in health care finance is likely to grow, SI may become an increasingly attractive tool for risk-selection in basic health insurance (BI). In this paper, we develop a conceptual framework to assess the probability that insurers will use SI for favourable risk-selection in BI. We apply our framework to five countries in which risk-selection via SI is feasible: Belgium, Germany, Israel, the Netherlands, and Switzerland. For each country, we review the available evidence of SI being used as selection device. We find that the probability that SI is and will be used for risk-selection substantially varies across countries. Finally, we discuss several strategies for policy makers to reduce the chance that SI will be used for risk-selection in BI markets.

  6. Predictability of currency market exchange

    NASA Astrophysics Data System (ADS)

    Ohira, Toru; Sazuka, Naoya; Marumo, Kouhei; Shimizu, Tokiko; Takayasu, Misako; Takayasu, Hideki

    2002-05-01

    We analyze tick data of the yen-dollar exchange rate with a focus on its up and down movements. We show that there exists a rather particular conditional probability structure in such high-frequency data. This result provides evidence to question one of the basic assumptions of traditional market theory, in which such bias in high-frequency price movements is regarded as absent. We also systematically construct a random walk model reflecting this probability structure.
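
    A minimal sketch of the kind of model described: a random walk whose up/down probability is conditioned on the previous tick. The conditional probabilities below are hypothetical, chosen only to illustrate a non-i.i.d. structure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical conditional probabilities of an "up" tick given the previous tick direction
p_up_given_up = 0.45    # continuation assumed less likely than reversal, for illustration
p_up_given_down = 0.55

n_ticks = 100_000
moves = np.empty(n_ticks, dtype=int)
moves[0] = 1
for t in range(1, n_ticks):
    p_up = p_up_given_up if moves[t - 1] == 1 else p_up_given_down
    moves[t] = 1 if rng.random() < p_up else -1

# Empirical check of the conditional structure
up_after_up = np.mean(moves[1:][moves[:-1] == 1] == 1)
up_after_down = np.mean(moves[1:][moves[:-1] == -1] == 1)
print(f"P(up | up) ~ {up_after_up:.3f},  P(up | down) ~ {up_after_down:.3f}")
```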

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mumpower, J.L.

    There are strong structural similarities between risks from technological hazards and big-purse state lottery games. Risks from technological hazards are often described as low-probability, high-consequence negative events. State lotteries could be equally well characterized as low-probability, high-consequence positive events. Typical communications about state lotteries provide a virtual strategic textbook for opponents of risky technologies. The same techniques can be used to sell lottery tickets or sell opposition to risky technologies. Eight basic principles are enumerated.

  8. Redundant Sensors for Mobile Robot Navigation

    DTIC Science & Technology

    1985-09-01

    represent a probability that the area is empty, while positive numbers mean it's probably occupied. Zero represents the unknown. The basic idea is that...room to give it absolute positioning information. This works by using two infrared emitters and detectors on the robot. Measurements of angles are made...meters (T in Kelvin) 273 sec Distances returned when assuming 80 degrees Fahrenheit, but where the actual temperature is 60 degrees, will be seven inches

  9. The Advantages and Disadvantages of Civilian Employers Hiring National Guardsmen and Reservists

    DTIC Science & Technology

    2014-12-12

    Without your support, I probably wouldn’t have been able to sleep at all during the year. To my thesis committee chair, Mr. Robert Scott Martin, and...preliminary observations. 23 2. Identify themes–review transcripts in detail and make note of themes. 3. Developing a coding scheme –assign a

  10. Mortality Risks for Forest Trees Threatened with Gypsy Moth Infestation

    Treesearch

    Owen W. Herrick; David A. Gansner; David A. Gansner

    1987-01-01

    Presents guidelines for estimating potential tree mortality associated with gypsy moth defoliation. A tree's crown condition, crown position, and species group can be used to assign probabilities of death. Forest-land managers need such information to develop marking guides and implement silvicultural treatments for forest trees threatened with gypsy moth...

  11. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  12. Personal and Situational Factors in Drug Use as Perceived by Kibbutz Youth.

    ERIC Educational Resources Information Center

    Wolf, Yuval; And Others

    1995-01-01

    Sixteen 17-year-old kibbutz members, including 7 hashish smokers and 9 nonsmokers, assessed the probability that a young person of similar background would use drugs. It was found that hashish smokers assigned meaningful importance to a combined influence of personal predisposition and group pressure, while the nonsmokers considered only group…

  13. Assigning and Combining Probabilities in Single-Case Studies

    ERIC Educational Resources Information Center

    Manolov, Rumen; Solanas, Antonio

    2012-01-01

    There is currently a considerable diversity of quantitative measures available for summarizing the results in single-case studies. Given that the interpretation of some of them is difficult due to the lack of established benchmarks, the current article proposes an approach for obtaining further numerical evidence on the importance of the results,…

  14. Statistical Power for the Comparative Regression Discontinuity Design With a Pretest No-Treatment Control Function: Theory and Evidence From the National Head Start Impact Study.

    PubMed

    Tang, Yang; Cook, Thomas D

    2018-01-01

    The basic regression discontinuity design (RDD) has less statistical power than a randomized control trial (RCT) with the same sample size. Adding a no-treatment comparison function to the basic RDD creates a comparative RDD (CRD); and when this function comes from the pretest value of the study outcome, a CRD-Pre design results. We use a within-study comparison (WSC) to examine the power of CRD-Pre relative to both basic RDD and RCT. We first build the theoretical foundation for power in CRD-Pre, then derive the relevant variance formulae, and finally compare them to the theoretical RCT variance. We conclude from the theoretical part of the article that (1) CRD-Pre's power gain depends on the partial correlation between the pretest and posttest measures after conditioning on the assignment variable, (2) CRD-Pre is less responsive than basic RDD to how the assignment variable is distributed and where the cutoff is located, and (3) under a variety of conditions, the efficiency of CRD-Pre is very close to that of the RCT. Data from the National Head Start Impact Study are then used to construct RCT, RDD, and CRD-Pre designs and to compare their power. The empirical results indicate (1) a high level of correspondence between the predicted and obtained power results for RDD and CRD-Pre relative to the RCT, and (2) power levels in CRD-Pre and RCT that are very close. The study is unique among WSCs for its focus on the correspondence between RCT and observational study standard errors rather than means.

  15. Uncertain deduction and conditional reasoning.

    PubMed

    Evans, Jonathan St B T; Thompson, Valerie A; Over, David E

    2015-01-01

    There has been a paradigm shift in the psychology of deductive reasoning. Many researchers no longer think it is appropriate to ask people to assume premises and decide what necessarily follows, with the results evaluated by binary extensional logic. Most everyday and scientific inference is made from more or less confidently held beliefs and not assumptions, and the relevant normative standard is Bayesian probability theory. We argue that the study of "uncertain deduction" should directly ask people to assign probabilities to both premises and conclusions, and report an experiment using this method. We assess this reasoning by two Bayesian metrics: probabilistic validity and coherence according to probability theory. On both measures, participants perform above chance in conditional reasoning, but they do much better when statements are grouped as inferences, rather than evaluated in separate tasks.

  16. Performance of cellular frequency-hopped spread-spectrum radio networks

    NASA Astrophysics Data System (ADS)

    Gluck, Jeffrey W.; Geraniotis, Evaggelos

    1989-10-01

    Multiple access interference is characterized for cellular mobile networks, in which users are assumed to be Poisson-distributed in the plane and employ frequency-hopped spread-spectrum signaling with transmitter-oriented assignment of frequency-hopping patterns. Exact expressions for the bit error probabilities are derived for binary coherently demodulated systems without coding. Approximations for the packet error probability are derived for coherent and noncoherent systems and these approximations are applied when forward-error-control coding is employed. In all cases, the effects of varying interference power are accurately taken into account according to some propagation law. Numerical results are given in terms of bit error probability for the exact case and throughput for the approximate analyses. Comparisons are made with previously derived bounds and it is shown that these tend to be very pessimistic.

  17. Applying a basic development needs approach for sustainable and integrated community development in less-developed areas: report of ongoing Iranian experience.

    PubMed

    Asadi-Lari, M; Farshad, A A; Assaei, S E; Vaez Mahdavi, M R; Akbari, M E; Ameri, A; Salimi, Z; Gray, D

    2005-06-01

    Despite considerable achievements in the provision of basic developmental facilities in terms of drinking water, access to primary healthcare services, high-quality and nutritious food, social services, and proper housing facilities, there are many rural and slum communities in Iran where these essential needs remain unfulfilled. Lack of equity is prominent, as large differences exist in underprivileged provinces. New policies developed in the past two decades have resulted in substantial achievements in meeting population needs and reducing the socio-economic gap; nevertheless, poverty levels, unemployment due to a large increase in the birth rate in the early 1980s, and lack of community participation are matters yet to be addressed. To overcome these deficiencies, a basic development needs approach was adopted to promote the concept of community self-help and self-reliance through intersectoral collaboration, creating an environment where people could take an active part in the development process, with the Iranian government providing the necessary support to achieve the desired level of development. Following firm commitment from the Iranian government and technical support from the World Health Organization Regional Office, basic development needs was assigned a high priority in health and health-related sectors, reflected in the third National Masterplan (2001-2005). A comprehensive intersectoral plan was designed, and pilot projects were commenced in three villages. Each village elected a representative, and committee clusters were formed to run and monitor projects identified by a process of local needs assessment and priority assignment. In each region, a variety of needs were elicited from these assessments, which were actively supported by local authorities. A basic development needs approach was found to be a reliable discipline to improve community participation, needs-led resource allocation and intersectoral co-operation in community development, particularly in underprivileged areas. Iran's initial experience of basic development needs has gained widespread public support but will require periodical evaluation as it is introduced into other rural and urban regions across the country.

  18. Severely Aggressive Children Receiving Stimulant Medication Versus Stimulant and Risperidone: 12-Month Follow-Up of the TOSCA Trial.

    PubMed

    Gadow, Kenneth D; Brown, Nicole V; Arnold, L Eugene; Buchan-Page, Kristin A; Bukstein, Oscar G; Butter, Eric; Farmer, Cristan A; Findling, Robert L; Kolko, David J; Molina, Brooke S G; Rice, Robert R; Schneider, Jayne; Aman, Michael G

    2016-06-01

    The objective of this study was to evaluate 52-week clinical outcomes of children with co-occurring attention-deficit/hyperactivity disorder (ADHD), disruptive behavior disorder, and serious physical aggression who participated in a prospective, longitudinal study that began with a controlled, 9-week clinical trial comparing the relative efficacy of parent training + stimulant medication + placebo (Basic; n = 84) versus parent training + stimulant + risperidone (Augmented; n = 84). Almost two-thirds (n = 108; 64%) of families in the 9-week study participated in week 52 follow-ups (Basic, n = 55; Augmented, n = 53) and were representative of the initial study sample. The assessment battery included caregiver and clinician ratings and laboratory tests. Only 43% of participants in the Augmented group and 36% in the Basic group still adhered to their assigned regimen (not significant [NS]); 23% of those in the Augmented group and 11% in the Basic group were taking no medication (NS). Both randomized groups improved baseline to follow-up, but the 3 primary parent-reported behavioral outcomes showed no significant between-group differences. Exploratory analyses indicated that participants in the Augmented group (65%) were more likely (p = .02) to have a Clinical Global Impressions (CGI) severity score of 1 to 3 (i.e., normal to mildly ill) at follow-up than those in the Basic group (42%). Parents rated 45% of children as impaired often or very often from ADHD, noncompliant, or aggressive behavior. The Augmented group had elevated prolactin levels, and the Basic group had decreased weight over time. Findings were generally similar whether groups were defined by randomized assignment or follow-up treatment status. Both treatment strategies were associated with clinical improvement at follow-up, and primary behavioral outcomes did not differ significantly. Many children evidenced lingering mental health concerns, suggesting the need for additional research into more effective interventions. Clinical trial registration information-Treatment of Severe Childhood Aggression (the TOSCA Study); http://clinicaltrials.gov/; NCT00796302. Published by Elsevier Inc.

  19. The External Validity of Prediction Models for the Diagnosis of Obstructive Coronary Artery Disease in Patients With Stable Chest Pain: Insights From the PROMISE Trial.

    PubMed

    Genders, Tessa S S; Coles, Adrian; Hoffmann, Udo; Patel, Manesh R; Mark, Daniel B; Lee, Kerry L; Steyerberg, Ewout W; Hunink, M G Myriam; Douglas, Pamela S

    2018-03-01

    This study sought to externally validate prediction models for the presence of obstructive coronary artery disease (CAD). A better assessment of the probability of CAD may improve the identification of patients who benefit from noninvasive testing. Stable chest pain patients from the PROMISE (Prospective Multicenter Imaging Study for Evaluation of Chest Pain) trial with computed tomography angiography (CTA) or invasive coronary angiography (ICA) were included. The authors assumed that patients with CTA showing 0% stenosis and a coronary artery calcium (CAC) score of 0 were free of obstructive CAD (≥50% stenosis) on ICA, and they multiply imputed missing ICA results based on clinical variables and CTA results. Predicted CAD probabilities were calculated using published coefficients for 3 models: basic model (age, sex, chest pain type), clinical model (basic model + diabetes, hypertension, dyslipidemia, and smoking), and clinical + CAC score model. The authors assessed discrimination and calibration, and compared published effects with observed predictor effects. In 3,468 patients (1,805 women; mean 60 years of age; 779 [23%] with obstructive CAD on CTA), the models demonstrated moderate-good discrimination, with C-statistics of 0.69 (95% confidence interval [CI]: 0.67 to 0.72), 0.72 (95% CI: 0.69 to 0.74), and 0.86 (95% CI: 0.85 to 0.88) for the basic, clinical, and clinical + CAC score models, respectively. Calibration was satisfactory although typical chest pain and diabetes were less predictive and CAC score was more predictive than was suggested by the models. Among the 31% of patients for whom the clinical model predicted a low (≤10%) probability of CAD, actual prevalence was 7%; among the 48% for whom the clinical + CAC score model predicted a low probability the observed prevalence was 2%. In 2 sensitivity analyses excluding imputed data, similar results were obtained using CTA as the outcome, whereas in those who underwent ICA the models significantly underestimated CAD probability. Existing clinical prediction models can identify patients with a low probability of obstructive CAD. Obstructive CAD on ICA was imputed for 61% of patients; hence, further validation is necessary. Copyright © 2018 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
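
    For illustration only, a predicted probability from a basic-model-style logistic score might be computed as below; the coefficients are hypothetical placeholders, not the published models' values.

```python
import numpy as np

def predicted_cad_probability(age, male, typical_chest_pain, coef):
    """Logistic prediction of obstructive CAD from a basic-model-style score.

    `coef` holds (intercept, b_age, b_male, b_typical); the values used below are
    hypothetical placeholders, not the published model coefficients.
    """
    lp = coef[0] + coef[1] * age + coef[2] * male + coef[3] * typical_chest_pain
    return 1.0 / (1.0 + np.exp(-lp))

hypothetical_coef = (-7.0, 0.08, 1.3, 1.9)
print(predicted_cad_probability(age=60, male=1, typical_chest_pain=0, coef=hypothetical_coef))
```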

  20. Comparison of spectra using a Bayesian approach. An argument using oil spills as an example.

    PubMed

    Li, Jianfeng; Hibbert, D Brynn; Fuller, Steven; Cattle, Julie; Pang Way, Christopher

    2005-01-15

    The problem of assigning a probability of matching a number of spectra is addressed. The context is environmental spills, when an EPA needs to show that the material from a polluting spill (e.g., oil) is likely to have originated at a particular site (factory, refinery) or from a vehicle (road tanker or ship). Samples are taken from the spill and from candidate sources, and are analyzed by spectroscopy (IR, fluorescence) or chromatography (GC or GC/MS). A matching algorithm is applied to pairs of spectra, giving a single statistic (R). This can be a point-to-point match giving a correlation coefficient or a Euclidean distance, or a derivative of these parameters. The distributions of R for same and different samples are established from existing data. For matching statistics with values in the range [0,1], corresponding to no match (0) through to a perfect match (1), a beta distribution can be fitted to most data. The values of R from the match of the spectrum of a spilled oil and of each of a number of suspects are calculated, and Bayes' theorem is applied to give a probability of a match between the spill sample and each candidate and the probability of no match at all. The method is most effective when simple inspection of the matching parameters does not lead to an obvious conclusion; i.e., there is overlap of the distributions, giving rise to dubiety of an assignment. The ratio of the probability of finding a matching statistic if there were a match to the probability of finding it if there were no match (the likelihood ratio) is a sensitive and useful parameter to guide the analyst. It is proposed that this approach may be acceptable to a court of law and avoid challenges to the apparently subjective opinion of an analyst. Examples of matching the fluorescence and infrared spectra of diesel oils are given.
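
    A minimal sketch of the approach as described: fit beta distributions to the matching statistic R for pairs known to be the same or different, then use Bayes' theorem to turn an observed R into a posterior match probability and a likelihood ratio. The data below are simulated placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical historical matching statistics R in [0, 1]
r_same = rng.beta(8, 2, size=200)        # pairs known to be the same oil
r_diff = rng.beta(2, 6, size=200)        # pairs known to be different oils

# Fit beta distributions to each population (support fixed to [0, 1])
a_s, b_s, _, _ = stats.beta.fit(r_same, floc=0, fscale=1)
a_d, b_d, _, _ = stats.beta.fit(r_diff, floc=0, fscale=1)

def match_posterior(r, prior_match=0.5):
    """Posterior probability of a match for an observed statistic r, plus the likelihood ratio."""
    like_match = stats.beta.pdf(r, a_s, b_s)
    like_no_match = stats.beta.pdf(r, a_d, b_d)
    lr = like_match / like_no_match
    post = (lr * prior_match) / (lr * prior_match + (1 - prior_match))
    return post, lr

print(match_posterior(0.85))
```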

  1. Gender identification of Caspian Terns using external morphology and discriminant function analysis

    USGS Publications Warehouse

    Ackerman, Joshua T.; Takekawa, John Y.; Bluso, J.D.; Yee, J.L.; Eagles-Smith, Collin A.

    2008-01-01

    Caspian Tern (Sterna caspia) plumage characteristics are sexually monochromatic and gender cannot easily be distinguished in the field without extensive behavioral observations. We assessed sexual size dimorphism and developed a discriminant function to assign gender in Caspian Terns based on external morphology. We collected and measured Caspian Terns in San Francisco Bay, California, and confirmed their gender based on necropsy and genetic analysis. Of the eight morphological measurements we examined, only bill depth at the gonys and head plus bill length differed between males and females, with males being larger than females. A discriminant function using both bill depth at the gonys and head plus bill length accurately assigned gender of 83% of terns for which gender was known. We improved the accuracy of our discriminant function to 90% by excluding individuals that had less than a 75% posterior probability of correctly being assigned to gender. Caspian Terns showed little sexual size dimorphism in many morphometrics, but our results indicate they can be reliably assigned to gender in the field using two morphological measurements.
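
    A hedged sketch of the kind of two-variable discriminant analysis described, with simulated (not the study's) measurements, including the refinement of withholding assignments whose posterior probability falls below 0.75.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)

# Hypothetical training data: bill depth at the gonys and head-plus-bill length (mm),
# simulated so that males (1) are slightly larger than females (0).
n = 120
sex = rng.integers(0, 2, size=n)
bill_depth = rng.normal(10.0 + 0.8 * sex, 0.4)
head_bill = rng.normal(125.0 + 5.0 * sex, 2.5)
X = np.column_stack([bill_depth, head_bill])

lda = LinearDiscriminantAnalysis().fit(X, sex)

# Assign gender only when the posterior probability exceeds 0.75
posterior = lda.predict_proba(X).max(axis=1)
confident = posterior >= 0.75
accuracy_confident = (lda.predict(X)[confident] == sex[confident]).mean()
print(f"Assigned {confident.mean():.0%} of birds; accuracy among those: {accuracy_confident:.0%}")
```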

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fogh, R.H.; Mabbutt, B.C.; Kem, W.R.

    Sequence-specific assignments are reported for the 500-MHz ¹H nuclear magnetic resonance (NMR) spectrum of the 48-residue polypeptide neurotoxin I from the sea anemone Stichodactyla helianthus (Sh I). Spin systems were first identified by using two-dimensional relayed or multiple quantum filtered correlation spectroscopy, double quantum spectroscopy, and spin lock experiments. Specific resonance assignments were then obtained from nuclear Overhauser enhancement (NOE) connectivities between protons from residues adjacent in the amino acid sequence. Of a total of 265 potentially observable resonances, 248 (i.e., 94%) were assigned, arising from 39 completely and 9 partially assigned amino acid spin systems. The secondary structure of Sh I was defined on the basis of the pattern of sequential NOE connectivities, NOEs between protons on separate strands of the polypeptide backbone, and backbone amide exchange rates. Sh I contains a four-stranded antiparallel β-sheet encompassing residues 1-5, 16-24, 30-33, and 40-46, with a β-bulge at residues 17 and 18 and a reverse turn, probably a type II β-turn, involving residues 27-30. No evidence of α-helical structure was found.

  3. Effects of MyTeachingPartner-Math/Science on Teacher-Child Interactions in Prekindergarten Classrooms

    ERIC Educational Resources Information Center

    Whittaker, Jessica Vick; Kinzie, Mable B.; Williford, Amanda; DeCoster, Jamie

    2016-01-01

    Research Findings: This study examined the impact of MyTeachingPartner-Math/Science, a system of math and science curricula and professional development, on the quality of teachers' interactions with children in their classrooms. Schools were randomly assigned to 1 of 2 intervention conditions (Basic: curricula providing within-activity, embedded…

  4. Can the Faculty Development Door Swing Both Ways? Science and Clinical Teaching in the 1990s.

    ERIC Educational Resources Information Center

    Tedesco, Lisa A.

    1988-01-01

    The relationship between clinical teaching and research in the basic sciences is discussed. The same energy expended to enhance clinical research will also efficiently build new curricula; ease the strains associated with assigning a priority to teaching or research; and serve to further science, teaching, and technology transfer. (MLW)

  5. Motivating Students to Read with Collaborative Reading Quizzes

    ERIC Educational Resources Information Center

    Quinn, Timothy; Eckerson, Todd

    2010-01-01

    One of the most important challenges a teacher faces is motivating his or her students to complete reading assignments and to complete them carefully. After all, if students bring to class a basic understanding of the text up for discussion, much deeper learning can occur than if the teacher is forced to spend time explaining the reading to…

  6. The Many Methods to Measure Testability: A Horror Story.

    DTIC Science & Technology

    1988-04-01

    it seems overly simplistic to assign only one "magic number" as a viable design goal. Different design technologies such as digital, analog, mechanical ...FAILURE RATE 1 1 BASIC TEST PROGRAM 1 1 ATLAS TEST PROGRAM 1 1 EDIF FILE 1 1 TEST STRATEGY FLOWCHART 1 1 RTOK FREQUENCY 1 1 DIAGNOSIS AVERAGE COST 1 1

  7. Learner Control of Instructional Sequence in Computer-Based Instruction: A Comparison to Programmed Control.

    ERIC Educational Resources Information Center

    Lahey, George F.; Coady, James D.

    This study was conducted to determine whether learner control of lesson strategy is superior to programmed control in computer-based instruction (CBI), and, if so, whether learner control is more effective when guidance is provided. Subjects were 164 trainees assigned to the Basic Electricity/Electronics School in San Diego. They were randomly…

  8. Class Manual for Information Resources in the Humanities (LIS 382L.2).

    ERIC Educational Resources Information Center

    Roy, Loriene

    Basic course information and worksheets are presented in this textbook/workbook for "Information Resources in the Humanities," a course offered by the Graduate School of Library and Information Science at the University of Texas at Austin. The guide is divided into eight sections. The first presents the syllabus, lists assignments (e.g.,…

  9. Academic Music: Music Instruction to Engage Third-Grade Students in Learning Basic Fraction Concepts

    ERIC Educational Resources Information Center

    Courey, Susan Joan; Balogh, Endre; Siker, Jody Rebecca; Paik, Jae

    2012-01-01

    This study examined the effects of an academic music intervention on conceptual understanding of music notation, fraction symbols, fraction size, and equivalency of third graders from a multicultural, mixed socio-economic public school setting. Students (N = 67) were assigned by class to their general education mathematics program or to receive…

  10. Examining a Web-Based Peer Feedback System in an Introductory Computer Literacy Course

    ERIC Educational Resources Information Center

    Adiguzel, Tufan; Varank, Ilhan; Erkoç, Mehmet Fatih; Buyukimdat, Meryem Koskeroglu

    2017-01-01

    This study focused on formative use of peer feedback in an online system that was used in basic computer literacy for word processing assignment-related purposes. Specifically, the effect of quantity, modality and satisfaction of peer feedback provided through the online system on students' performance, self-efficacy, and technology acceptance was…

  11. An Introduction to Propensity Scores: What, When, and How

    ERIC Educational Resources Information Center

    Beal, Sarah J.; Kupzyk, Kevin A.

    2014-01-01

    The use of propensity scores as a method to promote causality in studies that cannot use random assignment has increased dramatically since its original publication in 1983. While the utility of these approaches is important, the concepts underlying their use are complex. The purpose of this article is to provide a basic tutorial for conducting…

  12. Making a Case for Exact Language as an Aspect of Rigour in Initial Teacher Education Mathematics Programmes

    ERIC Educational Resources Information Center

    van Jaarsveld, Pieter

    2016-01-01

    Pre-service secondary mathematics teachers have a poor command of the exact language of mathematics as evidenced in assignments, micro-lessons and practicums. The unrelenting notorious annual South African National Senior Certificate outcomes in mathematics and the recognition by the Department of Basic Education (DBE) that the correct use of…

  13. NLRB: The First 50 Years. The Story of the National Labor Relations Board 1935-1985.

    ERIC Educational Resources Information Center

    National Labor Relations Board, Washington, DC.

    The National Labor Relations Board (NLRB) is an independent federal agency created in 1935 by Congress to administer the National Labor Relations Act, the basic law governing relations between labor unions and business enterprises engaged in operations affecting interstate commerce. In its statutory assignment, the NLRB has two principal…

  14. Using Primary Literature in an Undergraduate Assignment: Demonstrating Connections among Cellular Processes

    ERIC Educational Resources Information Center

    Yeong, Foong May

    2015-01-01

    Learning basic cell biology in an essential module can be daunting to second-year undergraduates, given the depth of information that is provided in major molecular and cell biology textbooks. Moreover, lectures on cellular pathways are organised into sections, such that at the end of lectures, students might not see how various processes are…

  15. School Finance Case Study: Dealing with a School District Budget Deficit

    ERIC Educational Resources Information Center

    Kersten, Thomas

    2007-01-01

    This case study-based class assignment (see Appendix A) is designed as a culminating course activity through which students demonstrate not only their understanding of school finance basics but also show how to apply their knowledge to solving a problem impacting many public school administrators today. Because the case study is general in design,…

  16. Military Curriculum Materials for Vocational and Technical Education. Fundamentals of Electricity, 3-7.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This self-paced correspondence course for independent study in electricity was adapted from military curriculum materials for use in vocational education. This basic course is designed to provide the student with some fundamentals of electricity--not with specific job skills. The seven lessons of the course each have a lesson assignment sheet with…

  17. Implementation of Online Reading Assessments to Encourage Reading Interests

    ERIC Educational Resources Information Center

    Rahayu, Endang Yuliani; Februariyanti, Herni

    2015-01-01

    The current study reports a two-year research project funded by the Government of the Republic of Indonesia through a competitive research scheme. The aim is basically to respond to the fact that most university students have very low interest in reading activities, such as finding out important information for their term papers as assigned by the…

  18. Prioritizing Project Risks Using AHP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thibadeau, Barbara M

    2007-01-01

    This essay introduces the Analytic Hierarchy Process (AHP) as a method by which to rank project risks, in terms of importance as well as likelihood. AHP is a way to handle quantifiable and/or intangible criteria in the decision-making process. It is a multi-objective, multi-criteria decision-making approach that is based on the idea of pair-wise comparisons of alternatives with respect to a given criterion (e.g., which alternative, A or B, is preferred, and by how much more is it preferred) or with respect to an objective (e.g., which is more important, A or B, and by how much more is it important). This approach was pioneered by Thomas Saaty in the late 1970s. It has been suggested that a successful project is one that successfully manages risk and that project management is the management of uncertainty. Risk management relies on the quantification of uncertainty, which, in turn, is predicated upon the accuracy of probabilistic approaches (in terms of likelihood as well as magnitude). In many cases, the appropriate probability distribution (or probability value) is unknown. Researchers have shown that probability values are not assigned very accurately, that the use of verbal expressions is not a suitable alternative, that there is great variability in the use and interpretation of these values, and that there is great reluctance to assign them in the first place. Data from an ongoing project are used to show that AHP can be used to obtain these values, thus overcoming some of the problems associated with the direct assignment of discrete probability values. A novel method by which to calculate the consistency of the data is introduced. The AHP approach is easily implemented and typically offers results that are consistent with the decision maker's intuition.
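
    A minimal illustration of the pair-wise comparison calculation AHP is built on, using a hypothetical 4x4 judgment matrix: priorities come from the principal eigenvector, and consistency is checked with Saaty's consistency ratio.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for four project risks (Saaty 1-9 scale);
# entry A[i, j] says how much more important/likely risk i is than risk j.
A = np.array([
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
], dtype=float)

# Priorities are the normalised principal eigenvector
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio: CI = (lambda_max - n) / (n - 1); RI = 0.90 is Saaty's random index for n = 4
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.90
print("Risk priorities:", np.round(weights, 3), " CR =", round(cr, 3))
```

    A consistency ratio below about 0.1 is conventionally taken to mean the judgments are acceptably consistent.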

  19. Pulmonary hypertension in interstitial lung disease: Limitations of echocardiography compared to cardiac catheterization.

    PubMed

    Keir, Gregory J; Wort, S John; Kokosi, Maria; George, Peter M; Walsh, Simon L F; Jacob, Joseph; Price, Laura; Bax, Simon; Renzoni, Elisabetta A; Maher, Toby M; MacDonald, Peter; Hansell, David M; Wells, Athol U

    2018-01-12

    In interstitial lung disease (ILD), pulmonary hypertension (PH) is a major adverse prognostic determinant. Transthoracic echocardiography (TTE) is the most widely used tool when screening for PH, although discordance between TTE and right heart catheter (RHC) measured pulmonary haemodynamics is increasingly recognized. We evaluated the predictive utility of the updated European Society of Cardiology/European Respiratory Society (ESC/ERS) TTE screening recommendations against RHC testing in a large, well-characterized ILD cohort. Two hundred and sixty-five consecutive patients with ILD and suspected PH underwent comprehensive assessment, including RHC, between 2006 and 2012. ESC/ERS recommended tricuspid regurgitation (TR) velocity thresholds for assigning high (>3.4 m/s), intermediate (2.9-3.4 m/s) and low (<2.8 m/s) probabilities of PH were evaluated against RHC testing. RHC testing confirmed PH in 86% of subjects with a peak TR velocity >3.4 m/s, and excluded PH in 60% of ILD subjects with a TR velocity <2.8 m/s. Thus, the ESC/ERS guidelines misclassified 40% of subjects as 'low probability' of PH, when PH was confirmed on subsequent RHC. Evaluating alternative TR velocity thresholds for assigning a low probability of PH did not significantly improve the ability of TR velocity to exclude a diagnosis of PH. In patients with ILD and suspected PH, currently recommended ESC/ERS TR velocity screening thresholds were associated with a high positive predictive value (86%) for confirming PH, but were of limited value in excluding PH, with 40% of patients misclassified as low probability when PH was confirmed at subsequent RHC. © 2018 Asian Pacific Society of Respirology.

  20. Comparison of empirical estimate of clinical pretest probability with the Wells score for diagnosis of deep vein thrombosis.

    PubMed

    Wang, Bo; Lin, Yin; Pan, Fu-shun; Yao, Chen; Zheng, Zi-Yu; Cai, Dan; Xu, Xiang-dong

    2013-01-01

    Wells score has been validated for estimation of pretest probability in patients with suspected deep vein thrombosis (DVT). In clinical practice, many clinicians prefer to use empirical estimation rather than Wells score. However, which method is better to increase the accuracy of clinical evaluation is not well understood. Our present study compared empirical estimation of pretest probability with the Wells score to investigate the efficiency of empirical estimation in the diagnostic process of DVT. Five hundred and fifty-five patients were enrolled in this study. One hundred and fifty patients were assigned to examine the interobserver agreement for Wells score between emergency and vascular clinicians. The other 405 patients were assigned to evaluate the pretest probability of DVT on the basis of the empirical estimation and Wells score, respectively, and plasma D-dimer levels were then determined in the low-risk patients. All patients underwent venous duplex scans and had a 45-day follow up. Weighted Cohen's κ value for interobserver agreement between emergency and vascular clinicians of the Wells score was 0.836. Compared with Wells score evaluation, empirical assessment increased the sensitivity, specificity, Youden's index, positive likelihood ratio, and positive and negative predictive values, but decreased negative likelihood ratio. In addition, the appropriate D-dimer cutoff value based on Wells score was 175 μg/l and 108 patients were excluded. Empirical assessment increased the appropriate D-dimer cutoff point to 225 μg/l and 162 patients were ruled out. Our findings indicated that empirical estimation not only improves D-dimer assay efficiency for exclusion of DVT but also increases clinical judgement accuracy in the diagnosis of DVT.

  1. Novel density-based and hierarchical density-based clustering algorithms for uncertain data.

    PubMed

    Zhang, Xianchao; Liu, Han; Zhang, Xiaotong

    2017-09-01

    Uncertain data has posed a great challenge to traditional clustering algorithms. Recently, several algorithms have been proposed for clustering uncertain data, and among them density-based techniques seem promising for handling data uncertainty. However, some issues like losing uncertain information, high time complexity and nonadaptive threshold have not been addressed well in the previous density-based algorithm FDBSCAN and hierarchical density-based algorithm FOPTICS. In this paper, we firstly propose a novel density-based algorithm PDBSCAN, which improves the previous FDBSCAN from the following aspects: (1) it employs a more accurate method to compute the probability that the distance between two uncertain objects is less than or equal to a boundary value, instead of the sampling-based method in FDBSCAN; (2) it introduces new definitions of probability neighborhood, support degree, core object probability, direct reachability probability, thus reducing the complexity and solving the issue of nonadaptive threshold (for core object judgement) in FDBSCAN. Then, we modify the algorithm PDBSCAN to an improved version (PDBSCANi), by using a better cluster assignment strategy to ensure that every object will be assigned to the most appropriate cluster, thus solving the issue of nonadaptive threshold (for direct density reachability judgement) in FDBSCAN. Furthermore, as PDBSCAN and PDBSCANi have difficulties for clustering uncertain data with non-uniform cluster density, we propose a novel hierarchical density-based algorithm POPTICS by extending the definitions of PDBSCAN, adding new definitions of fuzzy core distance and fuzzy reachability distance, and employing a new clustering framework. POPTICS can reveal the cluster structures of the datasets with different local densities in different regions better than PDBSCAN and PDBSCANi, and it addresses the issues in FOPTICS. Experimental results demonstrate the superiority of our proposed algorithms over the existing algorithms in accuracy and efficiency. Copyright © 2017 Elsevier Ltd. All rights reserved.
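
    For orientation, the core quantity in the density-based step is the probability that the distance between two uncertain objects is at most a boundary value. Below is a simple Monte Carlo illustration of that quantity under a Gaussian uncertainty model; the paper's contribution is a more accurate, non-sampling computation, so this is only a stand-in.

```python
import numpy as np

rng = np.random.default_rng(4)

def prob_distance_within(mu_a, cov_a, mu_b, cov_b, eps, n_samples=50_000):
    """Monte Carlo estimate of P(||A - B|| <= eps) for two uncertain objects
    modelled as independent Gaussians (a simple stand-in for the uncertainty model)."""
    a = rng.multivariate_normal(mu_a, cov_a, size=n_samples)
    b = rng.multivariate_normal(mu_b, cov_b, size=n_samples)
    return np.mean(np.linalg.norm(a - b, axis=1) <= eps)

p = prob_distance_within([0, 0], np.eye(2) * 0.2,
                         [1, 1], np.eye(2) * 0.2, eps=1.5)
print(f"P(distance <= eps) ~ {p:.3f}")
```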

  2. Interactive RadioEpidemiological Program (IREP): a web-based tool for estimating probability of causation/assigned share of radiogenic cancers.

    PubMed

    Kocher, David C; Apostoaei, A Iulian; Henshaw, Russell W; Hoffman, F Owen; Schubauer-Berigan, Mary K; Stancescu, Daniel O; Thomas, Brian A; Trabalka, John R; Gilbert, Ethel S; Land, Charles E

    2008-07-01

    The Interactive RadioEpidemiological Program (IREP) is a Web-based, interactive computer code that is used to estimate the probability that a given cancer in an individual was induced by given exposures to ionizing radiation. IREP was developed by a Working Group of the National Cancer Institute and Centers for Disease Control and Prevention, and was adopted and modified by the National Institute for Occupational Safety and Health (NIOSH) for use in adjudicating claims for compensation for cancer under the Energy Employees Occupational Illness Compensation Program Act of 2000. In this paper, the quantity calculated in IREP is referred to as "probability of causation/assigned share" (PC/AS). PC/AS for a given cancer in an individual is calculated on the basis of an estimate of the excess relative risk (ERR) associated with given radiation exposures and the relationship PC/AS = ERR/(ERR + 1). IREP accounts for uncertainties in calculating probability distributions of ERR and PC/AS. An accounting of uncertainty is necessary when decisions about granting claims for compensation for cancer are made on the basis of an estimate of the upper 99% credibility limit of PC/AS to give claimants the "benefit of the doubt." This paper discusses models and methods incorporated in IREP to estimate ERR and PC/AS. Approaches to accounting for uncertainty are emphasized, and limitations of IREP are discussed. Although IREP is intended to provide unbiased estimates of ERR and PC/AS and their uncertainties to represent the current state of knowledge, there are situations described in this paper in which NIOSH, as a matter of policy, makes assumptions that give a higher estimate of the upper 99% credibility limit of PC/AS than other plausible alternatives and, thus, are more favorable to claimants.
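
    A minimal sketch of the PC/AS calculation and its uncertainty: sample ERR from an uncertainty distribution, apply PC/AS = ERR/(ERR + 1) sample-by-sample, and report the upper 99% credibility limit. The lognormal ERR distribution and its parameters are assumptions for illustration; IREP's actual uncertainty propagation is more elaborate.

```python
import numpy as np

rng = np.random.default_rng(5)

# Assume (for illustration only) that uncertainty in the excess relative risk (ERR)
# is described by a lognormal distribution.
err_samples = rng.lognormal(mean=np.log(0.5), sigma=0.8, size=100_000)

# PC/AS = ERR / (ERR + 1), applied sample-by-sample to propagate the uncertainty
pc_as = err_samples / (err_samples + 1.0)

print(f"Median PC/AS: {np.median(pc_as):.3f}")
print(f"Upper 99% credibility limit: {np.percentile(pc_as, 99):.3f}")
```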

  3. Inside the black box: starting to uncover the underlying decision rules used in one-by-one expert assessment of occupational exposure in case-control studies

    PubMed Central

    Wheeler, David C.; Burstyn, Igor; Vermeulen, Roel; Yu, Kai; Shortreed, Susan M.; Pronk, Anjoeka; Stewart, Patricia A.; Colt, Joanne S.; Baris, Dalsu; Karagas, Margaret R.; Schwenn, Molly; Johnson, Alison; Silverman, Debra T.; Friesen, Melissa C.

    2014-01-01

    Objectives: Evaluating occupational exposures in population-based case-control studies often requires exposure assessors to review each study participant's reported occupational information job-by-job to derive exposure estimates. Although such assessments likely have underlying decision rules, they usually lack transparency, are time-consuming and have uncertain reliability and validity. We aimed to identify the underlying rules to enable documentation, review, and future use of these expert-based exposure decisions. Methods: Classification and regression trees (CART, predictions from a single tree) and random forests (predictions from many trees) were used to identify the underlying rules from the questionnaire responses and an expert's exposure assignments for occupational diesel exhaust exposure for several metrics: binary exposure probability and ordinal exposure probability, intensity, and frequency. Data were split into training (n=10,488 jobs), testing (n=2,247), and validation (n=2,248) data sets. Results: The CART and random forest models' predictions agreed with 92–94% of the expert's binary probability assignments. For ordinal probability, intensity, and frequency metrics, the two models extracted decision rules more successfully for unexposed and highly exposed jobs (86–90% and 57–85%, respectively) than for low or medium exposed jobs (7–71%). Conclusions: CART and random forest models extracted decision rules and accurately predicted an expert's exposure decisions for the majority of jobs and identified questionnaire response patterns that would require further expert review if the rules were applied to other jobs in the same or different study. This approach makes the exposure assessment process in case-control studies more transparent and creates a mechanism to efficiently replicate exposure decisions in future studies. PMID:23155187
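
    A hedged sketch of the general approach (not the study's actual data or variables): train a single tree and a random forest on questionnaire-style responses against an expert's binary assignments and report how often the models agree with the expert on held-out jobs.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(6)

# Hypothetical questionnaire responses (e.g., job group, reported diesel equipment use,
# work location) and the expert's binary exposure assignment for each job.
n_jobs = 5000
X = rng.integers(0, 3, size=(n_jobs, 5))
expert = ((X[:, 0] == 2) | ((X[:, 1] == 1) & (X[:, 3] > 0))).astype(int)  # stand-in "decision rule"

X_train, X_test, y_train, y_test = train_test_split(X, expert, test_size=0.3, random_state=0)

tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

print("CART agreement with expert:  ", round(tree.score(X_test, y_test), 3))
print("Forest agreement with expert:", round(forest.score(X_test, y_test), 3))
```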

  4. Modulation/demodulation techniques for satellite communications. Part 1: Background

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1981-01-01

    Basic characteristics of digital data transmission systems described include the physical communication links, the notion of bandwidth, FCC regulations, and performance measurements such as bit rates, bit error probabilities, throughputs, and delays. The error probability performance and spectral characteristics of various modulation/demodulation techniques commonly used or proposed for use in radio and satellite communication links are summarized. Forward error correction with block or convolutional codes is also discussed along with the important coding parameter, channel cutoff rate.

  5. A diffusion climatology for Cape Canaveral, Florida

    NASA Technical Reports Server (NTRS)

    Siler, R. K.

    1980-01-01

    The problem of the effects on local plant and animal life of toxic effluent released by a space shuttle launch is discussed. Based on several successive years of data, nine basic weather patterns were identified, and the probabilities of pattern occurrence, of onshore/alongshore cloud transport, of precipitation accompanying the latter, and of ground-level concentrations of hydrogen chloride were determined. Diurnal variations for the patterns were also investigated. Sketches showing probable movement of launch cloud exhaust and isobaric maps are presented.

  6. Sedimentation in Hot Creek in vicinity of Hot Creek Fish Hatchery, Mono County, California

    USGS Publications Warehouse

    Burkham, D.E.

    1978-01-01

    An accumulation of fine-grained sediment in Hot Creek downstream from Hot Creek Fish Hatchery, Mono County, Calif., created concern that the site may be deteriorating as a habitat for trout. The accumulation is a phenomenon that probably occurs naturally in the problem reach. Fluctuation in the weather probably is the basic cause of the deposition of fine-grained sediment that has occurred since about 1970. Man's activities and the Hot Creek Fish Hatchery may have contributed to the problem; the significance of these factors, however, probably was magnified because of drought conditions in 1975-77. (Woodard-USGS)

  7. Economic Intervention and Parenting: A Randomized Experiment of Statewide Child Development Accounts

    ERIC Educational Resources Information Center

    Nam, Yunju; Wikoff, Nora; Sherraden, Michael

    2016-01-01

    Objective: We examine the effects of Child Development Accounts (CDAs) on parenting stress and practices. Methods: We use data from the SEED for Oklahoma Kids (SEED OK) experiment. SEED OK selected caregivers of infants from Oklahoma birth certificates using a probability sampling method, randomly assigned caregivers to the treatment (n = 1,132)…

  8. An Introduction to Propensity Score Methods for Reducing the Effects of Confounding in Observational Studies

    ERIC Educational Resources Information Center

    Austin, Peter C.

    2011-01-01

    The propensity score is the probability of treatment assignment conditional on observed baseline characteristics. The propensity score allows one to design and analyze an observational (nonrandomized) study so that it mimics some of the particular characteristics of a randomized controlled trial. In particular, the propensity score is a balancing…

  9. Updating the Economic Impacts of the High/Scope Perry Preschool Program

    ERIC Educational Resources Information Center

    Nores, Milagros; Belfield, Clive R.; Barnett, W. Steven; Schweinhart, Lawrence

    2005-01-01

    This article derives an updated cost-benefit ratio for the High/Scope Perry Preschool Program, an intensive preschool intervention delivered during the 1960s to at-risk children in Ypsilanti, Michigan. Because children were randomly assigned to the program or a control group, differences in outcomes are probably attributable to program status.…

  10. Revisiting the Personal Essay with Ben Hamper's "Rivethead"

    ERIC Educational Resources Information Center

    Kramer, Jacob

    2011-01-01

    The personal essay--a paper in which a student brings in his or her own experience or concerns--is probably familiar to most historians. Teaching at the City University of New York, the author has found grading personal essays somewhat perplexing. They are sometimes written in response to an assignment that does not call for personal reflection.…

  11. Study on Failure of Third-Party Damage for Urban Gas Pipeline Based on Fuzzy Comprehensive Evaluation.

    PubMed

    Li, Jun; Zhang, Hong; Han, Yinshan; Wang, Baodong

    2016-01-01

    Focusing on the diversity, complexity, and uncertainty of third-party damage accidents, the failure probability of third-party damage to urban gas pipelines was evaluated using the theory of the analytic hierarchy process and fuzzy mathematics. A fault tree of third-party damage containing 56 basic events was built through hazard identification of third-party damage. Fuzzy evaluation of the basic event probabilities was conducted by the expert judgment method, using membership functions of fuzzy sets. The weight of each expert was determined and the evaluation opinions were modified using the improved analytic hierarchy process, and the failure probability of third-party damage to the urban gas pipeline was calculated. Taking the gas pipelines of a large provincial capital city as an example, the risk assessment results of the method were shown to conform to the actual situation, which provides a basis for safety risk prevention.
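
    A minimal sketch of one common way to aggregate weighted expert opinions given as triangular fuzzy numbers and defuzzify them to a crisp basic-event probability; the weights, opinions, and centroid defuzzification below are illustrative choices, not necessarily the paper's exact formulas.

```python
import numpy as np

# Each expert gives a triangular fuzzy number (low, mode, high) for one basic event probability;
# the expert weights would come from an AHP-style credibility assessment (hypothetical here).
expert_opinions = np.array([
    [0.02, 0.05, 0.10],
    [0.01, 0.04, 0.08],
    [0.03, 0.06, 0.12],
])
expert_weights = np.array([0.5, 0.3, 0.2])   # hypothetical weights summing to 1

# Weighted aggregation of the fuzzy numbers, then centroid defuzzification
aggregated = expert_weights @ expert_opinions          # (low, mode, high)
crisp_probability = aggregated.mean()                  # centroid of a triangular fuzzy number
print("Aggregated fuzzy number:", np.round(aggregated, 4))
print("Defuzzified basic event probability:", round(crisp_probability, 4))
```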

  12. Application of fuzzy fault tree analysis based on modified fuzzy AHP and fuzzy TOPSIS for fire and explosion in the process industry.

    PubMed

    Yazdi, Mohammad; Korhan, Orhan; Daneshvar, Sahand

    2018-05-09

    This study aimed at establishing fault tree analysis (FTA) using expert opinion to compute the probability of an event. To find the probability of the top event (TE), the probabilities of all basic events (BEs) must be available once the fault tree is drawn. In this situation, expert judgment can be used as an alternative to failure data when such data are scarce. The fuzzy analytical hierarchy process, as a standard technique, is used to give a specific weight to each expert, and fuzzy set theory is employed to aggregate the expert opinions. In this way, the probabilities of the BEs are computed and, consequently, the probability of the TE is obtained using Boolean algebra. Additionally, to reduce the probability of the TE in terms of three parameters (safety consequences, cost, and benefit), the importance measurement technique and a modified TOPSIS were employed. The effectiveness of the proposed approach is demonstrated with a real-life case study.
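
    Once basic-event probabilities are available, propagating them to the top event with Boolean algebra reduces to simple gate formulas for independent events; a minimal sketch with hypothetical values:

```python
from functools import reduce

def and_gate(probs):
    """P(all occur) for independent basic events."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(probs):
    """P(at least one occurs) for independent basic events."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical basic event probabilities (e.g., derived from aggregated expert judgment)
ignition_source = 0.05
gas_release = or_gate([0.01, 0.02, 0.005])     # any of three release causes
fire_or_explosion = and_gate([gas_release, ignition_source])
print(f"Top event probability: {fire_or_explosion:.5f}")
```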

  13. Detection of the nipple in automated 3D breast ultrasound using coronal slab-average-projection and cumulative probability map

    NASA Astrophysics Data System (ADS)

    Kim, Hannah; Hong, Helen

    2014-03-01

    We propose an automatic method for nipple detection in 3D automated breast ultrasound (3D ABUS) images using coronal slab-average projection and a cumulative probability map. First, to identify coronal images that show a clear distinction between the nipple-areola region and the skin, the skewness of each coronal image is measured and the negatively skewed images are selected. Then, a coronal slab-average-projection image is reformatted from the selected images. Second, to localize the nipple-areola region, an elliptical ROI covering it is detected using the Hough ellipse transform in the coronal slab-average-projection image. Finally, to separate the nipple from the areola region, 3D Otsu's thresholding is applied to the elliptical ROI and a cumulative probability map in the elliptical ROI is generated by assigning high probability to low-intensity regions. Falsely detected small components are eliminated using morphological opening, and the center point of the detected nipple region is calculated. Experimental results show that our method provides a 94.4% nipple detection rate.

  14. U.S. Maternally Linked Birth Records May Be Biased for Hispanics and Other Population Groups

    PubMed Central

    LEISS, JACK K.; GILES, DENISE; SULLIVAN, KRISTIN M.; MATHEWS, RAHEL; SENTELLE, GLENDA; TOMASHEK, KAY M.

    2010-01-01

    Purpose To advance understanding of linkage error in U.S. maternally linked datasets, and how the error may affect results of studies based on the linked data. Methods North Carolina birth and fetal death records for 1988-1997 were maternally linked (n=1,030,029). The maternal set probability, defined as the probability that all records assigned to the same maternal set do in fact represent events to the same woman, was used to assess differential maternal linkage error across race/ethnic groups. Results Maternal set probabilities were lower for records specifying Asian or Hispanic race/ethnicity, suggesting greater maternal linkage error. The lower probabilities for Hispanics were concentrated in women of Mexican origin who were not born in the United States. Conclusions Differential maternal linkage error may be a source of bias in studies using U.S. maternally linked datasets to make comparisons between Hispanics and other groups or among Hispanic subgroups. Methods to quantify and adjust for this potential bias are needed. PMID:20006273

  15. SUGGEL: A Program Suggesting the Orbital Angular Momentum of a Neutron Resonance from the Magnitude of its Neutron Width

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oh, S.Y.

    2001-02-02

    The SUGGEL computer code has been developed to suggest a value for the orbital angular momentum of a neutron resonance that is consistent with the magnitude of its neutron width. The suggestion is based on the probability that a resonance having a certain value of gΓn is an l-wave resonance. The probability is calculated by using Bayes' theorem on the conditional probability. The probability density functions (pdf's) of gΓn for up to d-wave (l = 2) have been derived from the χ² distribution of Porter and Thomas. The pdf's take two possible channel spins into account. This code is a tool which evaluators will use to construct resonance parameters and help to assign resonance spin. The use of this tool is expected to reduce time and effort in the evaluation procedure, since the number of repeated runs of the fitting code (e.g., SAMMY) may be reduced.
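
    A hedged sketch of the Bayesian step: the posterior probability of each orbital angular momentum l is proportional to its prior times a Porter-Thomas likelihood of the observed gΓn. The average widths, priors, and the single-channel simplification below are placeholders, not SUGGEL's actual inputs.

```python
import numpy as np
from scipy import stats

# Porter-Thomas: for a single channel, g*Gamma_n follows a chi-square distribution with one
# degree of freedom scaled by the average width for that partial wave. The average widths and
# prior probabilities below are hypothetical placeholders; SUGGEL also accounts for two
# possible channel spins, which this sketch ignores.
avg_width = {0: 50e-3, 1: 5e-3, 2: 0.5e-3}     # <g*Gamma_n> in eV for s-, p-, d-waves
prior = {0: 0.2, 1: 0.4, 2: 0.4}               # prior probability of each orbital angular momentum

def l_posterior(g_gamma_n):
    like = {l: stats.chi2.pdf(g_gamma_n / avg_width[l], df=1) / avg_width[l] for l in avg_width}
    norm = sum(prior[l] * like[l] for l in like)
    return {l: prior[l] * like[l] / norm for l in like}

print(l_posterior(20e-3))   # posterior P(l | g*Gamma_n) for a 20-meV observed width
```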

  16. Culture and Probability Judgment Accuracy: The Influence of Holistic Reasoning

    PubMed Central

    Lechuga, Julia; Wiebe, John S.

    2012-01-01

    A well-established phenomenon in the judgment and decision-making tradition is the overconfidence one places in the amount of knowledge that one possesses. Overconfidence or probability judgment accuracy varies not only individually but also across cultures. However, research efforts to explain cross-cultural variations in the overconfidence phenomenon have seldom been made. In Study 1, the authors compared the probability judgment accuracy of U.S. Americans (N = 108) and Mexican participants (N = 100). In Study 2, they experimentally primed culture by randomly assigning English/Spanish bilingual Mexican Americans (N = 195) to response language. Results of both studies replicated the cross-cultural variation of probability judgment accuracy previously observed in other cultural groups. U.S. Americans displayed less overconfidence when compared to Mexicans. These results were then replicated in bilingual participants, when culture was experimentally manipulated with language priming. Holistic reasoning did not account for the cross-cultural variation of overconfidence. Suggestions for future studies are discussed. PMID:22879682

  17. Uncertain deduction and conditional reasoning

    PubMed Central

    Evans, Jonathan St. B. T.; Thompson, Valerie A.; Over, David E.

    2015-01-01

    There has been a paradigm shift in the psychology of deductive reasoning. Many researchers no longer think it is appropriate to ask people to assume premises and decide what necessarily follows, with the results evaluated by binary extensional logic. Most everyday and scientific inference is made from more or less confidently held beliefs and not assumptions, and the relevant normative standard is Bayesian probability theory. We argue that the study of “uncertain deduction” should directly ask people to assign probabilities to both premises and conclusions, and report an experiment using this method. We assess this reasoning by two Bayesian metrics: probabilistic validity and coherence according to probability theory. On both measures, participants perform above chance in conditional reasoning, but they do much better when statements are grouped as inferences, rather than evaluated in separate tasks. PMID:25904888

  18. Knot probabilities in random diagrams

    NASA Astrophysics Data System (ADS)

    Cantarella, Jason; Chapman, Harrison; Mastin, Matt

    2016-10-01

    We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.
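
    The Zipf-like relation mentioned above can be checked with a simple log-log fit; the frequencies below are made-up placeholders rather than the paper's tabulated values.

        import numpy as np

        # Hypothetical knot-type probabilities sorted by frequency rank (placeholders)
        freq = np.array([0.78, 0.11, 0.04, 0.02, 0.01, 0.005])
        rank = np.arange(1, len(freq) + 1)

        # A roughly linear relation between log(probability) and log(rank) is the Zipf-like behaviour
        slope, intercept = np.polyfit(np.log(rank), np.log(freq), 1)
        print(f"fitted exponent: {slope:.2f}")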

  19. Towards saturation of the electron-capture delayed fission probability: The new isotopes 240Es and 236Bk

    NASA Astrophysics Data System (ADS)

    Konki, J.; Khuyagbaatar, J.; Uusitalo, J.; Greenlees, P. T.; Auranen, K.; Badran, H.; Block, M.; Briselet, R.; Cox, D. M.; Dasgupta, M.; Di Nitto, A.; Düllmann, Ch. E.; Grahn, T.; Hauschild, K.; Herzán, A.; Herzberg, R.-D.; Heßberger, F. P.; Hinde, D. J.; Julin, R.; Juutinen, S.; Jäger, E.; Kindler, B.; Krier, J.; Leino, M.; Lommel, B.; Lopez-Martens, A.; Luong, D. H.; Mallaburn, M.; Nishio, K.; Pakarinen, J.; Papadakis, P.; Partanen, J.; Peura, P.; Rahkila, P.; Rezynkina, K.; Ruotsalainen, P.; Sandzelius, M.; Sarén, J.; Scholey, C.; Sorri, J.; Stolze, S.; Sulignano, B.; Theisen, Ch.; Ward, A.; Yakushev, A.; Yakusheva, V.

    2017-01-01

    The new neutron-deficient nuclei 240Es and 236Bk were synthesised at the gas-filled recoil separator RITU. They were identified by their radioactive decay chains starting from 240Es produced in the fusion-evaporation reaction 209Bi(34S,3n)240Es. Half-lives of 6(2) s and 22(+13/−6) s were obtained for 240Es and 236Bk, respectively. Two groups of α particles with energies Eα = 8.19(3) MeV and 8.09(3) MeV were unambiguously assigned to 240Es. Electron-capture delayed fission branches with probabilities of 0.16(6) and 0.04(2) were measured for 240Es and 236Bk, respectively. These new data show a continuation of the exponential increase of ECDF probabilities in more neutron-deficient isotopes.

  20. Common-Cause Failure Treatment in Event Assessment: Basis for a Proposed New Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dana Kelly; Song-Hua Shen; Gary DeMoss

    2010-06-01

    Event assessment is an application of probabilistic risk assessment in which observed equipment failures and outages are mapped into the risk model to obtain a numerical estimate of the event’s risk significance. In this paper, we focus on retrospective assessments to estimate the risk significance of degraded conditions such as equipment failure accompanied by a deficiency in a process such as maintenance practices. In modeling such events, the basic events in the risk model that are associated with observed failures and other off-normal situations are typically configured to be failed, while those associated with observed successes and unchallenged components are assumed capable of failing, typically with their baseline probabilities. This is referred to as the failure memory approach to event assessment. The conditioning of common-cause failure probabilities for the common cause component group associated with the observed component failure is particularly important, as it is insufficient to simply leave these probabilities at their baseline values, and doing so may result in a significant underestimate of risk significance for the event. Past work in this area has focused on the mathematics of the adjustment. In this paper, we review the Basic Parameter Model for common-cause failure, which underlies most current risk modelling, discuss the limitations of this model with respect to event assessment, and introduce a proposed new framework for common-cause failure, which uses a Bayesian network to model underlying causes of failure, and which has the potential to overcome the limitations of the Basic Parameter Model with respect to event assessment.
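
    To see why leaving common-cause probabilities at baseline can understate risk, the sketch below uses a simple beta-factor approximation with invented numbers; it is only an illustration, not the Basic Parameter Model or the proposed Bayesian-network framework.

        # Illustrative numbers (assumptions, not taken from the paper)
        q_total = 1e-3  # total failure probability of a single component
        beta = 0.05     # assumed fraction of failures that are common-cause

        # Before any observation, the redundant partner fails with its baseline probability
        p_partner_baseline = q_total
        # Conditioned on one observed failure, the partner fails with probability roughly beta
        p_partner_given_failure = beta

        print(p_partner_baseline, p_partner_given_failure)  # 1e-3 versus 0.05: a large increase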

  1. Patient-Centered Research

    PubMed Central

    Wicki, J; Perneger, TV; Junod, AF; Bounameaux, H; Perrier, A

    2000-01-01

    PURPOSE We aimed to develop a simple standardized clinical score to stratify emergency ward patients with clinically suspected PE into groups with a high, intermediate, or low probability of PE, in order to improve and simplify the diagnostic approach. METHODS Analysis of a database of 1090 consecutive patients admitted to the emergency ward for suspected PE, in whom diagnosis of PE was ruled in or out by a standard diagnostic algorithm. Logistic regression was used to predict clinical parameters associated with PE. RESULTS 296 out of 1090 patients (27%) were found to have PE. The optimal estimate of clinical probability was based on eight variables: recent surgery, previous thromboembolic event, older age, hypocapnia, hypoxemia, tachycardia, band atelectasis or elevation of a hemidiaphragm on chest X-ray. A probability score was calculated by adding points assigned to these variables. A cut-off score of 4 best identified patients with low probability of PE. 486 patients (49%) had a low clinical probability of PE (score < 4), of which 50 (10.3%) had a proven PE. The prevalence of PE was 38% in the 437 patients with an intermediate probability (score 5–8, n = 437) and 81% in the 63 patients with a high probability (score>9). CONCLUSION This clinical score, based on easily available and objective variables, provides a standardized assessment of the clinical probability of PE. Applying this score to emergency ward patients suspected of PE could allow a more efficient diagnostic process.
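
    A hedged sketch of how such an additive score could be applied in code; the point values and cut-offs below are placeholders, not the published weights.

        # Hypothetical point values for the eight variables (placeholders, not the published weights)
        POINTS = {"recent_surgery": 3, "previous_vte": 2, "older_age": 1, "hypocapnia": 2,
                  "hypoxemia": 1, "tachycardia": 1, "band_atelectasis": 1, "elevated_hemidiaphragm": 1}

        def clinical_probability(positive_findings):
            """Sum the points for positive findings and map the total to a probability class."""
            score = sum(POINTS[f] for f in positive_findings)
            if score <= 4:
                return score, "low"
            if score <= 8:
                return score, "intermediate"
            return score, "high"

        print(clinical_probability({"recent_surgery", "tachycardia"}))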

  2. Space shuttle solid rocket booster recovery system definition, volume 1

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The performance requirements, preliminary designs, and development program plans for an airborne recovery system for the space shuttle solid rocket booster are discussed. The analyses performed during the study phase of the program are presented. The basic considerations which established the system configuration are defined. A Monte Carlo statistical technique using random sampling of the probability distribution for the critical water impact parameters was used to determine the failure probability of each solid rocket booster component as functions of impact velocity and component strength capability.
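
    A minimal sketch of the kind of Monte Carlo failure-probability estimate described above, assuming normal distributions with invented parameters for water-impact velocity and component strength.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        # Assumed distributions (placeholders): water-impact velocity and component strength capability
        impact_velocity = rng.normal(25.0, 3.0, n)   # m/s
        strength_limit = rng.normal(30.0, 2.0, n)    # highest impact velocity the component survives

        # The component fails whenever the sampled impact velocity exceeds its sampled capability
        failure_probability = np.mean(impact_velocity > strength_limit)
        print(failure_probability)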

  3. A trial-based economic evaluation of 2 nurse-led disease management programs in heart failure.

    PubMed

    Postmus, Douwe; Pari, Anees A Abdul; Jaarsma, Tiny; Luttik, Marie Louise; van Veldhuisen, Dirk J; Hillege, Hans L; Buskens, Erik

    2011-12-01

    Although previously conducted meta-analyses suggest that nurse-led disease management programs in heart failure (HF) can improve patient outcomes, uncertainty regarding the cost-effectiveness of such programs remains. To compare the relative merits of 2 variants of a nurse-led disease management program (basic or intensive support by a nurse specialized in the management of patients with HF) against care as usual (routine follow-up by a cardiologist), a trial-based economic evaluation was conducted alongside the COACH study. In terms of costs per life-year, basic support was found to dominate care as usual, whereas the incremental cost-effectiveness ratio between intensive support and basic support was found to be equal to €532,762 per life-year; in terms of costs per quality-adjusted life-year (QALY), basic support was found to dominate both care as usual and intensive support. An assessment of the uncertainty surrounding these findings showed that, at a threshold value of €20,000 per life-year/€20,000 per QALY, basic support was found to have a probability of 69/62% of being optimal against 17/30% and 14/8% for care as usual and intensive support, respectively. The results of our subgroup analysis suggest that a stratified approach based on offering basic support to patients with mild to moderate HF and intensive support to patients with severe HF would be optimal if the willingness-to-pay threshold exceeds €45,345 per life-year/€59,289 per QALY. Although the differences in costs and effects among the 3 study groups were not statistically significant, from a decision-making perspective, basic support still had a relatively large probability of generating the highest health outcomes at the lowest costs. Our results also substantiated that a stratified approach based on offering basic support to patients with mild to moderate HF and intensive support to patients with severe HF could further improve health outcomes at slightly higher costs. Copyright © 2011 Mosby, Inc. All rights reserved.

  4. Using Environmental Science as a Motivational Tool to Teach Physics to Non-Science Majors

    ERIC Educational Resources Information Center

    Busch, Hauke C.

    2010-01-01

    A traditional physical science course was transformed into an environmental physical science course to teach physics to non-science majors. The objective of the new course was to improve the learning of basic physics principles by applying them to current issues of interest. A new curriculum was developed with new labs, homework assignments,…

  5. Matter-Energy Interactions in Natural Systems. Science III and IIIA.

    ERIC Educational Resources Information Center

    Pfeiffer, Carl H.

    The two student notebooks in this set provide the basic outline and assignments for the third year of a four year senior high school unified science program. This course is the more technical of the two third-year courses offered in the program. The first unit, Extensions of the Particle Theories, deals with slide rule review, molecular theory and…

  6. Over the Counter Drugs (and Dietary Supplement) Exercise: A Team-based Introduction to Biochemistry for Health Professional Students

    ERIC Educational Resources Information Center

    Phadtare, Sangita; Abali, Emine; Brodsky, Barbara

    2013-01-01

    For successful delivery of basic science topics for health-professional students, it is critical to reduce apprehension and illustrate relevance to clinical settings and everyday life. At the beginning of the Biochemistry course for Physician Assistants, a team-based assignment was designed to develop an understanding of the mechanism of action,…

  7. Using Software Testing Techniques for Efficient Handling of Programming Exercises in an e-Learning Platform

    ERIC Educational Resources Information Center

    Schwieren, Joachim; Vossen, Gottfried; Westerkamp, Peter

    2006-01-01

    e-Learning has become a major field of interest in recent years, and multiple approaches and solutions have been developed. A typical form of e-learning application comprises exercise submission and assessment systems that allow students to work on assignments whenever and where they want (i.e., dislocated, asynchronous work). In basic computer…

  8. An Online Guided E-Journal Exercise in Pre-Clerkship Years: Oxidative Phosphorylation in Brown Adipose Tissue

    ERIC Educational Resources Information Center

    Abali, Emine Ercikan; Phadtare, Sangita; Galt, Jim; Brodsky, Barbara

    2014-01-01

    The rationale for this mandatory, guided online e-journal exercise is to foster the ability of students to independently read medical and scientific literature in a critical manner and to integrate journal reading with their basic science knowledge. After a lecture on oxidative phosphorylation, students were assigned to read an article on brown…

  9. The Impact of Multi-Age Instruction on Academic Performance in Mathematics and Reading

    ERIC Educational Resources Information Center

    Baukol, David

    2010-01-01

    Teachers and administrators are faced with a basic question when planning for a school year: how should the students be grouped when coming to school? Should students of similar age be together or should students be assigned to multi-age classrooms at the elementary school level? If the multi-age method is chosen, how will academic progress be…

  10. Introduction to the Apollo collections. Part 1: Lunar igneous rocks

    NASA Technical Reports Server (NTRS)

    Mcgee, P. E.; Warner, J. L.; Simonds, C. H.

    1977-01-01

    The basic petrographic, chemical, and age data is presented for a representative suite of igneous rocks gathered during the six Apollo missions. Tables are given for 69 samples: 32 igneous rocks and 37 impactites (breccias). A description is given of 26 basalts, four plutonic rocks, and two pyroclastic samples. The textural-mineralogic name assigned each sample is included.

  11. Effects of Dog-Assisted Therapy on Communication and Basic Social Skills of Adults with Intellectual Disabilities: A Pilot Study

    ERIC Educational Resources Information Center

    Scorzato, Ivano; Zaninotto, Leonardo; Romano, Michela; Menardi, Chiara; Cavedon, Lino; Pegoraro, Alessandra; Socche, Laura; Zanetti, Piera; Coppiello, Deborah

    2017-01-01

    Thirty-nine adults with severe to profound intellectual disability (ID) were randomly assigned to either an experimental group (n = 21) or a control group (n = 18). Assessment was blinded and included selected items from the International Classification of Functioning, Disability and Health (ICF), the Behavioral Assessment Battery (BAB), and the…

  12. A Pilot Study of a Kindergarten Summer School Reading Program in High-Poverty Urban Schools

    ERIC Educational Resources Information Center

    Denton, Carolyn A.; Solari, Emily J.; Ciancio, Dennis J.; Hecht, Steven A.; Swank, Paul R.

    2010-01-01

    This pilot study examined an implementation of a kindergarten summer school reading program in 4 high-poverty urban schools. The program targeted both basic reading skills and oral language development. Students were randomly assigned to a treatment group (n = 25) or a typical practice comparison group (n = 28) within each school; however,…

  13. A Comparison of Grade Achievement of Students Using a Programmed Mathematics Text Versus Students Using a Traditional Mathematics Text.

    ERIC Educational Resources Information Center

    Raines, Roy H.

    The effectiveness of a basic college mathematics course consisting of lecture-discussion classroom procedures and homework assignments from a traditional text was compared to the effectiveness of a course designed to combat low grade achievement and a high dropout rate by allowing for individual differences. The revised course consisted of…

  14. 42 CFR 23.8 - What operational requirements apply to an entity to which National Health Service Corps personnel...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Assignment of National Health Service Corps Personnel § 23.8 What operational requirements apply to an entity...; and (g) Establish basic data, cost accounting, and management information and reporting systems as...

  15. A look into the relationship between personality traits and basic values: A longitudinal investigation.

    PubMed

    Vecchione, Michele; Alessandri, Guido; Roccas, Sonia; Caprara, Gian Vittorio

    2018-05-27

    The present study examines the longitudinal association between basic personal values and the Big Five personality traits. A sample of 546 young adults (57% females) with a mean age of 21.68 years (SD = 1.60) completed the Portrait Values Questionnaire and the Big Five Questionnaire at three time points, each separated by an interval of four years. Cross-lagged models were used to investigate the possible reciprocal relations between traits and values, after the stability of the variables was taken into account. We found that values did not affect trait development. Traits, by contrast, have some effects on how values change. Specifically, high levels of agreeableness predict an increase over time in the importance assigned to benevolence values. Similarly, high levels of openness predict a later increase in the importance assigned to self-direction values. The same effect was not found for the other traits. Additionally, except in the case of emotional stability, traits showed synchronous (i.e., within-wave) correlations with values, suggesting that part of this relationship is due to common antecedents. Mechanisms underlying the associations between traits and values are discussed. This article is protected by copyright. All rights reserved. © 2018 Wiley Periodicals, Inc.

  16. HNP renumbering support in PMIPv6

    NASA Astrophysics Data System (ADS)

    Yan, Zhiwei; Geng, Guanggang; Lee, Xiaodong

    2015-12-01

    In the basic PMIPv6 (Proxy Mobile IPv6), the MN (Mobile Node) is assigned a 64-bit HNP (Home Network Prefix) during the initial attachment for the HoA (Home Address) configuration. As the MN moves, this prefix is assumed to remain unchanged, so upper-layer applications do not have to switch to a reconfigured HoA and the handover is transparent at the IP layer and above. However, the current protocol does not specify the operations needed for the MN to receive the new HNP in a timely manner and configure a new HoA when its HNP is renumbered. In this paper, this problem is discussed and a possible solution is proposed based on simple extensions of the basic PMIPv6. Our analysis demonstrates that the proposed scheme can effectively discover HNP renumbering while keeping signaling cost low, compared with the basic PMIPv6.

  17. Conservative Belief and Rationality

    DTIC Science & Technology

    2012-10-03

    Brandenburger and Dekel have shown that common belief of rationality (CBR)... themselves in a position to which they initially assigned probability 0. Tan and Werlang [1988] and Brandenburger and Dekel [1987] show that common...

  18. Analysis of Naval Ammunition Stock Positioning

    DTIC Science & Technology

    2015-12-01

    ...model takes once the Monte Carlo simulation determines the assigned probabilities for site-to-site locations. Column two shows how the simulation... stockpiles and positioning them at coastal Navy facilities. A Monte Carlo simulation model was developed to simulate expected cost and delivery... Keywords: supply chain management, Monte Carlo simulation, risk, delivery performance, stock positioning.

  19. POOLMS: A computer program for fitting and model selection for two level factorial replication-free experiments

    NASA Technical Reports Server (NTRS)

    Amling, G. E.; Holms, A. G.

    1973-01-01

    A computer program is described that performs a statistical multiple-decision procedure called chain pooling. It uses a number of mean squares assigned to error variance that is conditioned on the relative magnitudes of the mean squares. The model selection is done according to user-specified levels of type 1 or type 2 error probabilities.

  20. Does School Policy Affect Housing Choices? Evidence from the End of Desegregation in Charlotte-Mecklenburg

    ERIC Educational Resources Information Center

    Liebowitz, David D.; Page, Lindsay C.

    2014-01-01

    We examine whether the legal decision to grant unitary status to the Charlotte-Mecklenburg school district, which led to the end of race-conscious student assignment policies, increased the probability that families with children enrolled in the district would move to neighborhoods with a greater proportion of student residents of the same race as…

  1. Effects of an Employee Wellness Program on Physiological Risk Factors, Job Satisfaction, and Monetary Savings in a South Texas University

    ERIC Educational Resources Information Center

    Hamilton, Jacqueline

    2009-01-01

    An experimental study was conducted to investigate the effects of an Employee Wellness Program on physiological risk factors, job satisfaction, and monetary savings in a South Texas University. The non-probability sample consisted of 31 employees from lower income level positions. The employees were randomly assigned to the treatment group which…

  2. The Identification of Word Meaning from Sentence Contexts: An Effect of Presentation Order.

    ERIC Educational Resources Information Center

    Ammon, Paul R.; Graves, Jack A.

    Sixty fourth- and fifth-grade children listened to six series of six sentences each, with each sentence in a series containing the same artificial word. The task was to assign to the artificial word a meaning which would fit all sentence contexts in the series. Preliminary data provided an estimate of the probability that a particular sentence,…

  3. A random walk rule for phase I clinical trials.

    PubMed

    Durham, S D; Flournoy, N; Rosenberger, W F

    1997-06-01

    We describe a family of random walk rules for the sequential allocation of dose levels to patients in a dose-response study, or phase I clinical trial. Patients are sequentially assigned the next higher, same, or next lower dose level according to some probability distribution, which may be determined by ethical considerations as well as the patient's response. It is shown that one can choose these probabilities in order to center dose level assignments unimodally around any target quantile of interest. Estimation of the quantile is discussed; the maximum likelihood estimator and its variance are derived under a two-parameter logistic distribution, and the maximum likelihood estimator is compared with other nonparametric estimators. Random walk rules have clear advantages: they are simple to implement, and finite and asymptotic distribution theory is completely worked out. For a specific random walk rule, we compute finite and asymptotic properties and give examples of its use in planning studies. Having the finite distribution theory available and tractable obviates the need for elaborate simulation studies to analyze the properties of the design. The small sample properties of our rule, as determined by exact theory, compare favorably to those of the continual reassessment method, determined by simulation.
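
    A simplified sketch of a biased-coin random walk of the kind described above: step down after an observed toxicity, otherwise step up with some probability. The step probability here is an arbitrary placeholder, not the rule that targets a particular quantile.

        import random

        def next_dose(current_level, toxicity_observed, p_up=0.5, n_levels=6):
            """Assign the next patient's dose level by a simple biased random walk."""
            if toxicity_observed:
                return max(0, current_level - 1)             # de-escalate after a toxicity
            if random.random() < p_up:
                return min(n_levels - 1, current_level + 1)  # escalate with probability p_up
            return current_level                             # otherwise repeat the same dose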

  4. Method for Evaluation of Outage Probability on Random Access Channel in Mobile Communication Systems

    NASA Astrophysics Data System (ADS)

    Kollár, Martin

    2012-05-01

    In order to access the cell in all mobile communication technologies, a so-called random-access procedure is used. For example, in GSM this is represented by sending the CHANNEL REQUEST message from the Mobile Station (MS) to the Base Transceiver Station (BTS), which is consequently forwarded as a CHANNEL REQUIRED message to the Base Station Controller (BSC). If the BTS decodes some noise on the Random Access Channel (RACH) as a random access by mistake (a so-called 'phantom RACH'), then it is a question of pure coincidence which 'establishment cause' the BTS thinks it has recognized. A typical invalid channel access request or phantom RACH is characterized by an IMMEDIATE ASSIGNMENT procedure (assignment of an SDCCH or TCH) which is not followed by an ESTABLISH INDICATION from the MS to the BTS. In this paper, a mathematical model for evaluating the Power RACH Busy Threshold (RACHBT) so as to guarantee a predetermined outage probability on the RACH is described and discussed. It focuses on the Global System for Mobile Communications (GSM); however, the obtained results can be generalized to the remaining mobile technologies (i.e., WCDMA and LTE).

  5. The calculation of aircraft collision probabilities

    DOT National Transportation Integrated Search

    1971-10-01

    The basic limitation of air traffic compression, from the safety point of view, is the increased risk of collision due to reduced separations. In order to evolve new procedures, and eventually a fully automatic system, it is desirable to have a mea...

  6. 75 FR 9592 - FPL Energy Maine Hydro, LLC; Notice of Intent To Prepare an Environmental Document and Soliciting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-03

    ... a probable maximum flood, and modification of the existing earthen embankments for improved slope stability and safety. The proposed remedial measures would not alter the basic footprint of the existing dam...

  7. Optimizing a realistic large-scale frequency assignment problem using a new parallel evolutionary approach

    NASA Astrophysics Data System (ADS)

    Chaves-González, José M.; Vega-Rodríguez, Miguel A.; Gómez-Pulido, Juan A.; Sánchez-Pérez, Juan M.

    2011-08-01

    This article analyses the use of a novel parallel evolutionary strategy to solve complex optimization problems. The work developed here has been focused on a relevant real-world problem from the telecommunication domain to verify the effectiveness of the approach. The problem, known as frequency assignment problem (FAP), basically consists of assigning a very small number of frequencies to a very large set of transceivers used in a cellular phone network. Real data FAP instances are very difficult to solve due to the NP-hard nature of the problem, therefore using an efficient parallel approach which makes the most of different evolutionary strategies can be considered as a good way to obtain high-quality solutions in short periods of time. Specifically, a parallel hyper-heuristic based on several meta-heuristics has been developed. After a complete experimental evaluation, results prove that the proposed approach obtains very high-quality solutions for the FAP and beats any other result published.

  8. Cytochemical evaluation of the Guard procedure a regressive staining method for demonstrating chromosomal basic proteins. I. Effects of fixation, blocking reactions, selective extractions, and polyacid "differentiation".

    PubMed

    Cowden, R R; Rasch, E M; Curtis, S K

    1976-08-12

    Appropriately fixed preparations stained by a modification of the Guard (1959) reaction for "sex chromatin" display selective staining of interphase chromatin and mitotic or meiotic chromosomes. This is a regressive staining method which seems to depend on the selective displacement of an acidic dye from less basic structures, and retention of the dye at more basic sites. The results obtained with the reaction can be controlled by the length of time that the preparations are "differentiated" in solutions containing phosphomolybdic and phosphotungstic acids (polyacids). After three- or four-hour exposures to polyacid solutions, all chromatin is stained. However, with longer differentiation, "condensed" chromatin can be stained preferentially. Of a number of fixatives investigated, only 10% formalin, ethanol-acetic acid (3:1), and Bouin's solution proved useful. Others resulted in diminished specificity or a total loss of selectivity. The most intense staining was obtained after formalin fixation. Less intense dye binding was observed after fixation in 3:1 (probably due to extraction of some histone fractions), and the least amount of dye was bound in Bouin's-fixed chromatin (probably due to blockage of arginine residues by picric acid). The reaction was not affected by enzymatic removal of nucleic acids or the extraction of lipids. It was diminished by treatment with trypsin or weak acetylation, and it was completely prevented by strong acetylation, deamination, or extraction of basic proteins with HCl. The results presented suggest that the modified Guard (1959) procedure selectively demonstrates basic nucleoproteins. Further, by the use of regressive differentiation in polyacid solutions, the retention of dye in more condensed chromatin can be favored.

  9. Intuitionistic fuzzy evidential power aggregation operator and its application in multiple criteria decision-making

    NASA Astrophysics Data System (ADS)

    Jiang, Wen; Wei, Boya

    2018-02-01

    The theory of intuitionistic fuzzy sets (IFS) is widely used for dealing with vagueness and the Dempster-Shafer (D-S) evidence theory has a widespread use in multiple criteria decision-making problems under uncertain situation. However, there are many methods to aggregate intuitionistic fuzzy numbers (IFNs), but the aggregation operator to fuse basic probability assignment (BPA) is rare. Power average (P-A) operator, as a powerful operator, is useful and important in information fusion. Motivated by the idea of P-A power, in this paper, a new operator based on the IFS and D-S evidence theory is proposed, which is named as intuitionistic fuzzy evidential power average (IFEPA) aggregation operator. First, an IFN is converted into a BPA, and the uncertainty is measured in D-S evidence theory. Second, the difference between BPAs is measured by Jousselme distance and a satisfying support function is proposed to get the support degree between each other effectively. Then the IFEPA operator is used for aggregating the original IFN and make a more reasonable decision. The proposed method is objective and reasonable because it is completely driven by data once some parameters are required. At the same time, it is novel and interesting. Finally, an application of developed models to the 'One Belt, One road' investment decision-making problems is presented to illustrate the effectiveness and feasibility of the proposed operator.

  10. A probabilistic approach to interpreting verbal autopsies: methodology and preliminary validation in Vietnam.

    PubMed

    Byass, Peter; Huong, Dao Lan; Minh, Hoang Van

    2003-01-01

    Verbal autopsy (VA) has become an important tool in the past 20 years for determining cause of death in communities where there is no routine registration. In many cases, expert physicians have been used to interpret the VA findings and so assign individual causes of death. However, this is time consuming and not always repeatable. Other approaches such as algorithms and neural networks have been developed in some settings. This paper aims to develop a method that is simple, reliable and consistent, which could represent an advance in VA interpretation. This paper describes the development of a Bayesian probability model for VA interpretation as an attempt to find a better approach. This methodology and a preliminary implementation are described, with an evaluation based on VA material from rural Vietnam. The new model was tested against a series of 189 VA interviews from a rural community in Vietnam. Using this very basic model, over 70% of individual causes of death corresponded with those determined by two physicians increasing to over 80% if those cases ascribed to old age or as being indeterminate by the physicians were excluded. Although there is a clear need to improve the preliminary model and to test more extensively with larger and more varied datasets, these preliminary results suggest that there may be good potential in this probabilistic approach.
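
    In the spirit of the probabilistic model described above, the sketch below performs a naive Bayes update over candidate causes given reported signs; the priors and likelihoods are invented placeholders, not values from the Vietnam study.

        # Invented priors and sign likelihoods for three candidate causes (illustration only)
        prior = {"malaria": 0.3, "tuberculosis": 0.2, "other": 0.5}
        p_sign = {  # P(sign reported | cause)
            "fever": {"malaria": 0.9, "tuberculosis": 0.5, "other": 0.2},
            "chronic_cough": {"malaria": 0.2, "tuberculosis": 0.8, "other": 0.3},
        }

        def posterior(reported_signs):
            """Update the cause probabilities for one verbal autopsy, assuming independent signs."""
            score = dict(prior)
            for sign in reported_signs:
                for cause in score:
                    score[cause] *= p_sign[sign][cause]
            total = sum(score.values())
            return {cause: value / total for cause, value in score.items()}

        print(posterior(["fever", "chronic_cough"]))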

  11. Resonance assignment of the NMR spectra of disordered proteins using a multi-objective non-dominated sorting genetic algorithm.

    PubMed

    Yang, Yu; Fritzsching, Keith J; Hong, Mei

    2013-11-01

    A multi-objective genetic algorithm is introduced to predict the assignment of protein solid-state NMR (SSNMR) spectra with partial resonance overlap and missing peaks due to broad linewidths, molecular motion, and low sensitivity. This non-dominated sorting genetic algorithm II (NSGA-II) aims to identify all possible assignments that are consistent with the spectra and to compare the relative merit of these assignments. Our approach is modeled after the recently introduced Monte-Carlo simulated-annealing (MC/SA) protocol, with the key difference that NSGA-II simultaneously optimizes multiple assignment objectives instead of searching for possible assignments based on a single composite score. The multiple objectives include maximizing the number of consistently assigned peaks between multiple spectra ("good connections"), maximizing the number of used peaks, minimizing the number of inconsistently assigned peaks between spectra ("bad connections"), and minimizing the number of assigned peaks that have no matching peaks in the other spectra ("edges"). Using six SSNMR protein chemical shift datasets with varying levels of imperfection that was introduced by peak deletion, random chemical shift changes, and manual peak picking of spectra with moderately broad linewidths, we show that the NSGA-II algorithm produces a large number of valid and good assignments rapidly. For high-quality chemical shift peak lists, NSGA-II and MC/SA perform similarly well. However, when the peak lists contain many missing peaks that are uncorrelated between different spectra and have chemical shift deviations between spectra, the modified NSGA-II produces a larger number of valid solutions than MC/SA, and is more effective at distinguishing good from mediocre assignments by avoiding the hazard of suboptimal weighting factors for the various objectives. These two advantages, namely diversity and better evaluation, lead to a higher probability of predicting the correct assignment for a larger number of residues. On the other hand, when there are multiple equally good assignments that are significantly different from each other, the modified NSGA-II is less efficient than MC/SA in finding all the solutions. This problem is solved by a combined NSGA-II/MC algorithm, which appears to have the advantages of both NSGA-II and MC/SA. This combination algorithm is robust for the three most difficult chemical shift datasets examined here and is expected to give the highest-quality de novo assignment of challenging protein NMR spectra.

  12. [Posthumous nomination for Medicine Nobel Prizes II. The positivism era (1849-1899)].

    PubMed

    Cruz-Coke, R

    1997-06-01

    The author proposes the nomination of great physicians of the second half of the XIX century for a posthumous Medicine Nobel Prize. The valorization given by medical historians Garrison, Lavastine, Castiglioni, Lain Entralgo and Guerra, is used to select the better candidates. One to three names are assigned by year from 1849 to 1899. Four categories of Nobel prizes are assigned: a) Basic biological disciplines, b) Clinical and surgical medicine, pathology and specialties, c) Discoverers of transcendental diseases that are eponyms and d) New medical technologies. A total of 84 nominees for the Nobel Prize are presented. These lists are presented as preliminary and tentative to allow an extensive debate about the history of medicine during the nineteenth century.

  13. Instruction in information structuring improves Bayesian judgment in intelligence analysts.

    PubMed

    Mandel, David R

    2015-01-01

    An experiment was conducted to test the effectiveness of brief instruction in information structuring (i.e., representing and integrating information) for improving the coherence of probability judgments and binary choices among intelligence analysts. Forty-three analysts were presented with comparable sets of Bayesian judgment problems before and immediately after instruction. After instruction, analysts' probability judgments were more coherent (i.e., more additive and compliant with Bayes theorem). Instruction also improved the coherence of binary choices regarding category membership: after instruction, subjects were more likely to invariably choose the category to which they assigned the higher probability of a target's membership. The research provides a rare example of evidence-based validation of effectiveness in instruction to improve the statistical assessment skills of intelligence analysts. Such instruction could also be used to improve the assessment quality of other types of experts who are required to integrate statistical information or make probabilistic assessments.

  14. Application of random match probability calculations to mixed STR profiles.

    PubMed

    Bille, Todd; Bright, Jo-Anne; Buckleton, John

    2013-03-01

    Mixed DNA profiles are being encountered more frequently as laboratories analyze increasing amounts of touch evidence. If it is determined that an individual could be a possible contributor to the mixture, it is necessary to perform a statistical analysis to allow an assignment of weight to the evidence. Currently, the combined probability of inclusion (CPI) and the likelihood ratio (LR) are the most commonly used methods to perform the statistical analysis. A third method, random match probability (RMP), is available. This article compares the advantages and disadvantages of the CPI and LR methods to the RMP method. We demonstrate that although the LR method is still considered the most powerful of the binary methods, the RMP and LR methods make similar use of the observed data such as peak height, assumed number of contributors, and known contributors where the CPI calculation tends to waste information and be less informative. © 2013 American Academy of Forensic Sciences.

  15. Self-reported hand washing behaviors and foodborne illness: a propensity score matching approach.

    PubMed

    Ali, Mir M; Verrill, Linda; Zhang, Yuanting

    2014-03-01

    Hand washing is a simple and effective but easily overlooked way to reduce cross-contamination and the transmission of foodborne pathogens. In this study, we used the propensity score matching methodology to account for potential selection bias to explore our hypothesis that always washing hands before food preparation tasks is associated with a reduction in the probability of reported foodborne illness. Propensity score matching can simulate random assignment to a condition so that pretreatment observable differences between a treatment group and a control group are homogenous on all the covariates except the treatment variable. Using the U.S. Food and Drug Administration's 2010 Food Safety Survey, we estimated the effect of self-reported hand washing behavior on the probability of self-reported foodborne illness. Our results indicate that reported washing of hands with soap always before food preparation leads to a reduction in the probability of reported foodborne illness.
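
    A minimal sketch of the matching step, assuming a logistic-regression propensity model and 1-to-1 nearest-neighbour matching; the variable names and settings are illustrative, not those of the Food Safety Survey analysis.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def match_on_propensity(X, treated):
            """Estimate propensity scores and pair each treated unit with its nearest control."""
            score = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
            treated_idx = np.where(treated == 1)[0]
            control_idx = np.where(treated == 0)[0]
            matches = {i: control_idx[np.argmin(np.abs(score[control_idx] - score[i]))]
                       for i in treated_idx}
            return score, matches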

  16. Accuracy of pastoralists' memory-based kinship assignment of Ankole cattle: a microsatellite DNA analysis.

    PubMed

    Kugonza, D R; Kiwuwa, G H; Mpairwe, D; Jianlin, H; Nabasirye, M; Okeyo, A M; Hanotte, O

    2012-02-01

    This study aimed to estimate the level of relatedness within Ankole cattle herds using autosomal microsatellite markers and to assess the accuracy of relationship assignment based on farmers' memory. Eight cattle populations (four from each of two counties in Mbarara district in Uganda) were studied. Cattle in each population shared varying degrees of relatedness (first-, second- and third-degree relatives and unrelated individuals). Only memory-based kinship assignments which farmers knew with some confidence were tested in this experiment. DNA isolated from the blood of a subsample of 304 animals was analysed using 19 microsatellite markers. Average within population relatedness coefficients ranged from 0.010 ± 0.005 (Nshaara) to 0.067 ± 0.004 (Tayebwa). An exclusion probability of 99.9% was observed for both sire-offspring and dam-offspring relationships using the entire panel of 19 markers. Confidence from likelihood tests performed on 292 dyads showed that first-degree relatives were more easily correctly assigned by farmers than second-degree ones (p < 0.01), which were also easier to assign than third-degree relatives (p < 0.01). Accuracy of kinship assignment by the farmers was 91.9% ± 5.0 for dam-offspring dyads, 85.5% ± 3.4 for sire-offspring dyads, 75.6% ± 12.3 for half-sib and 60.0% ± 5.0 for grand dam-grand offspring dyads. Herd size, number of dyads assigned and length of time spent by the herder with their cattle population did not correlate with error in memorizing relationships. However, herd size strongly correlated with number of dyads assigned by the herder (r = 0.967, p < 0.001). Overall, we conclude that memorized records of pastoralists can be used to trace relationships and for pedigree reconstruction within Ankole cattle populations, but with the awareness that herd size constrains the number of kinship assignments remembered by the farmer. © 2011 Blackwell Verlag GmbH.

  17. Hydrologic drought prediction under climate change: Uncertainty modeling with Dempster-Shafer and Bayesian approaches

    NASA Astrophysics Data System (ADS)

    Raje, Deepashree; Mujumdar, P. P.

    2010-09-01

    Representation and quantification of uncertainty in climate change impact studies are a difficult task. Several sources of uncertainty arise in studies of hydrologic impacts of climate change, such as those due to choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated, which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster-Shafer (D-S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D-S framework over the traditional probabilistic approach is that it allows for the allocation of a probability mass to sets or intervals, and can hence handle both aleatory or stochastic uncertainty, and epistemic or subjective uncertainty. This paper shows how the D-S theory can be used to represent beliefs in some hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D-S approach has been used in this work for information synthesis using various evidence combination rules having different conflict modeling approaches. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, which are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster-Shafer structure, which represents uncertainty associated with each of the SSFI-4 classifications. These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by the D-S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in projected frequencies of SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D-S and Bayesian approaches are compared, and relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and decreasing probability of normal and wet conditions in Orissa as a result of climate change.
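
    A minimal sketch of Dempster's rule of combination for two basic probability assignments over drought classes, with invented masses; the actual study combines bpas derived from SSFI-4 classifications across GCMs and scenarios.

        def dempster_combine(m1, m2):
            """Combine two bpas whose focal elements are frozensets over the same frame."""
            combined, conflict = {}, 0.0
            for a, w1 in m1.items():
                for b, w2 in m2.items():
                    inter = a & b
                    if inter:
                        combined[inter] = combined.get(inter, 0.0) + w1 * w2
                    else:
                        conflict += w1 * w2
            return {focal: mass / (1.0 - conflict) for focal, mass in combined.items()}

        # Illustrative bpas from two sources of evidence (placeholders)
        frame = frozenset({"drought", "normal", "wet"})
        m_a = {frozenset({"drought"}): 0.6, frozenset({"normal"}): 0.2, frame: 0.2}
        m_b = {frozenset({"drought"}): 0.5, frozenset({"wet"}): 0.1, frame: 0.4}
        print(dempster_combine(m_a, m_b))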

  18. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  19. Multiport Multiplex Program

    DTIC Science & Technology

    1975-02-01

    Report ADB013811; approved for public release; distribution is... Contents entries: ...of Changing Sampling Frequency and Bits/Sample; Image Coding Methods; Basic Dual-Mode Coder Code Assignment; Oversampled Dual... Text fragment: ...results from the threshold at which a 1 bit will be transmitted. The threshold corresponds to a finite change on the gray scale or resolution of the...

  20. Global Fleet Station: Station Ship Concept

    DTIC Science & Technology

    2008-02-01

    The basic ISO TEU containers can be designed for any number of configurations and provide many different capabilities. For example there are... The ship was designed using an iterative weight and volume balancing method. This method assigns a weight and volume to each... from existing merchant ships. Different ship types are modeled in the algorithm through the selection of appropriate non-dimensional factors.

  1. An Analysis of the Tasks in School Social Work as a Basis for Improved Use of Staff. Final Report.

    ERIC Educational Resources Information Center

    Costin, Lela B.

    The two basic questions investigated in this study were: (1) the function of school social work and its relative importance as defined by social workers, and (2) whether this definition provides a basis for experimentation in assigning responsibilities to social work staff with different levels of training. A comprehensive list of the school…

  2. Using Expected Value to Introduce the Laplace Transform

    ERIC Educational Resources Information Center

    Lutzer, Carl V.

    2015-01-01

    We propose an introduction to the Laplace transform in which Riemann sums are used to approximate the expected net change in a function, assuming that it quantifies a process that can terminate at random. We assume only a basic understanding of probability.
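
    One way to make the connection concrete, under the assumption that the process terminates at an exponentially distributed random time T with rate s and that f is well behaved, is the limiting form of the Riemann-sum approximation:

        E[f(T) - f(0)] = \int_0^\infty f'(t)\, P(T > t)\, dt = \int_0^\infty f'(t)\, e^{-st}\, dt = s\,\mathcal{L}\{f\}(s) - f(0)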

  3. Persuasion, Surveillance, and Voting Behavior

    ERIC Educational Resources Information Center

    Gross, Alan E.; And Others

    1974-01-01

    The present study was designed to test the efficacy of two basic strategies which might be employed to increase the probability that a potential voter will act in accordance with his presumed belief that it is right, good, or desirable to exercise his franchise. (Author)

  4. Therapeutic Effects of Standardized Formulation of Stachys lavandulifolia Vahl on Primary Dysmenorrhea: A Randomized, Double-Blind, Crossover, Placebo-Controlled Pilot Study.

    PubMed

    Monji, Faezeh; Hashemian, Farshad; Salehi Surmaghi, Mohammad-Hossein; Mohammadyari, Fatemeh; Ghiyaei, Saeid; Soltanmohammadi, Alireza

    2018-05-09

    In Iranian folklore medicine, boiled extract of Stachys lavandulifolia Vahl is reputed to have therapeutic effects in painful disorders. This study evaluated the efficacy of the standardized formulation of S. lavandulifolia Vahl in reducing pain in primary dysmenorrhea, which is known to be a common disorder with significant impact on quality of life. A randomized, double blind, crossover, placebo-controlled pilot study. Bu-Ali Hospital affiliated with Tehran Medical Branch, Islamic Azad University. Twenty-nine patients with primary dysmenorrhea. Patients were enrolled according to medical history and gynecologic sonography. Standardized capsules of S. lavandulifolia were prepared. All the patients were allowed to take mefenamic acid up to 250 mg/q6h if they needed, in the first menstruation cycle to estimate the analgesic consumption at baseline. By the use of an add-on design in the next cycle, they were randomly assigned to receive either herbal or placebo capsules every 4-6 h. Then, they were crossed over to the other group during the course of the trial. At the end of the fourth day of each cycle, the intensity of pain was measured by visual analogue scale and McGill pain questionnaire. Statistical significance was evaluated using repeated-measures one-way analysis of variance. Pain intensity was significantly decreased during consumption of Stachys lavandulifolia capsules in comparison with basic and placebo cycles (p < 0.05). Interestingly, the consumption of mefenamic acid capsules was reduced dramatically in the S. lavandulifolia cycle in comparison with basic and placebo cycles (p < 0.001). It was demonstrated that S. lavandulifolia-prepared formulation can reduce menstrual pain, and can probably be recommended as an add-on therapy or even an alternative remedy to nonsteroidal anti-inflammatory drugs (NSAIDs) with fewer side effects in primary dysmenorrhea.

  5. Parallel discrete event simulation: A shared memory approach

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

    With traditional event list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to insure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.

  6. Mixture modelling for cluster analysis.

    PubMed

    McLachlan, G J; Chang, S U

    2004-10-01

    Cluster analysis via a finite mixture model approach is considered. With this approach to clustering, the data can be partitioned into a specified number of clusters g by first fitting a mixture model with g components. An outright clustering of the data is then obtained by assigning an observation to the component to which it has the highest estimated posterior probability of belonging; that is, the ith cluster consists of those observations assigned to the ith component (i = 1,..., g). The focus is on the use of mixtures of normal components for the cluster analysis of data that can be regarded as being continuous. But attention is also given to the case of mixed data, where the observations consist of both continuous and discrete variables.
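
    A minimal sketch of that clustering recipe using a Gaussian mixture: fit g components, then assign each observation to the component with the highest estimated posterior probability. The data here are synthetic.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        data = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (120, 2))])  # synthetic data

        gmm = GaussianMixture(n_components=2, random_state=0).fit(data)
        posteriors = gmm.predict_proba(data)   # estimated posterior membership probabilities
        clusters = posteriors.argmax(axis=1)   # each observation goes to its most probable component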

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wemmer, D.E.; Kumar, N.V.; Metrione, R.M.

    Toxin II from Radianthus paumotensis (Rp II) has been investigated by high-resolution NMR and chemical sequencing methods. Resonance assignments have been obtained for this protein by the sequential approach. NMR assignments could not be made consistent with the previously reported primary sequence for this protein, and chemical methods have been used to determine a sequence with which the NMR data are consistent. Analysis of the 2D NOE spectra shows that the protein secondary structure is comprised of two sequences of β-sheet, probably joined into a distorted continuous sheet, connected by turns and extended loops, without any regular α-helical segments. The residues previously implicated in activity in this class of proteins, D8 and R13, occur in a loop region.

  8. How did you guess? Or, what do multiple-choice questions measure?

    PubMed

    Cox, K R

    1976-06-05

    Multiple-choice questions classified as requiring problem-solving skills have been interpreted as measuring problem-solving skills within students, with the implicit hypothesis that questions needing an increasingly complex intellectual process should present increasing difficulty to the student. This hypothesis was tested in a 150-question paper taken by 721 students in seven Australian medical schools. No correlation was observed between difficulty and assigned process. Consequently, the question-answering process was explored with a group of final-year students. Anecdotal recall by students gave heavy weight to knowledge rather than problem solving in answering these questions. Assignment of the 150 questions to the classification by three teachers and six students showed their congruence to be a little above random probability.

  9. On the Structure of a Best Possible Crossover Selection Strategy in Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Lässig, Jörg; Hoffmann, Karl Heinz

    The paper considers the problem of selecting individuals in the current population in genetic algorithms for crossover to find a solution with high fitness for a given optimization problem. Many different schemes have been described in the literature as possible strategies for this task but so far comparisons have been predominantly empirical. It is shown that if one wishes to maximize any linear function of the final state probabilities, e.g. the fitness of the best individual in the final population of the algorithm, then a best probability distribution for selecting an individual in each generation is a rectangular distribution over the individuals sorted in descending sequence by their fitness values. This means uniform probabilities have to be assigned to a group of the best individuals of the population but probabilities equal to zero to individuals with lower fitness, assuming that the probability distribution to choose individuals from the current population can be chosen independently for each iteration and each individual. This result is then generalized also to typical practically applied performance measures, such as maximizing the expected fitness value of the best individual seen in any generation.
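
    A minimal sketch of the rectangular selection distribution described above: parents are drawn uniformly from the k fittest individuals, and everyone else gets selection probability zero. The value of k is a free parameter here.

        import random

        def rectangular_select(population, fitness, k):
            """Pick a parent uniformly at random from the k fittest individuals."""
            top_k = sorted(population, key=fitness, reverse=True)[:k]
            return random.choice(top_k)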

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marleau, Peter; Monterial, Mateusz; Clarke, Shaun

    A Bayesian approach is proposed for pulse shape discrimination of photons and neutrons in liquid organic scintillators. Instead of drawing a decision boundary, each pulse is assigned a photon or neutron confidence probability. In addition, this allows for photon and neutron classification on an event-by-event basis. The sum of those confidence probabilities is used to estimate the number of photon and neutron instances in the data. An iterative scheme, similar to an expectation-maximization algorithm for Gaussian mixtures, is used to infer the ratio of photons-to-neutrons in each measurement. Therefore, the probability space adapts to data with varying photon-to-neutron ratios. A time-correlated measurement of Am–Be and separate measurements of 137Cs, 60Co and 232Th photon sources were used to construct libraries of neutrons and photons. These libraries were then used to produce synthetic data sets with varying ratios of photons-to-neutrons. The probability-weighted method that we implemented was found to maintain a neutron acceptance rate of up to 90% up to a photon-to-neutron ratio of 2000, and performed 9% better than the decision boundary approach. Furthermore, the iterative approach appropriately changed the probability space with an increasing number of photons, which kept the neutron population estimate from unrealistically increasing.
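
    A minimal sketch of the expectation-maximization-style iteration described above, assuming each pulse already has likelihoods under a neutron template and a photon template; the templates themselves are outside the scope of the sketch.

        import numpy as np

        def estimate_neutron_fraction(lik_neutron, lik_photon, n_iter=50):
            """Iterate the neutron fraction: assign each pulse a neutron confidence probability,
            then re-estimate the fraction from the mean of those probabilities."""
            frac = 0.5
            for _ in range(n_iter):
                p_neutron = frac * lik_neutron / (frac * lik_neutron + (1 - frac) * lik_photon)
                frac = p_neutron.mean()
            return frac, p_neutron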

  11. Circuit analysis method for thin-film solar cell modules

    NASA Technical Reports Server (NTRS)

    Burger, D. R.

    1985-01-01

    The design of a thin-film solar cell module is dependent on the probability of occurrence of pinhole shunt defects. Using known or assumed defect density data, dichotomous population statistics can be used to calculate the number of defects expected in a module. Probability theory is then used to assign the defective cells to individual strings in a selected series-parallel circuit design. Iterative numerical calculation is used to calculate I-V curves using cell test values or assumed defective cell values as inputs. Good and shunted cell I-V curves are added to determine the module output power and I-V curve. Different levels of shunt resistance can be selected to model different defect levels.
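
    The dichotomous population statistics mentioned above reduce to a binomial model once a per-cell defect probability is assumed: the distribution of shunted cells per module, its mean, and the chance of at least one defect follow directly. The module size and defect probability below are illustrative, not values from the paper.

    ```python
    from math import comb

    def module_defect_distribution(n_cells, p_defective):
        """Binomial probability of k shunted cells in a module of n_cells,
        given the per-cell probability of carrying a pinhole shunt defect."""
        return [comb(n_cells, k) * p_defective**k * (1 - p_defective)**(n_cells - k)
                for k in range(n_cells + 1)]

    # Illustrative numbers: 36-cell module, 5% chance that any one cell is shunted.
    dist = module_defect_distribution(36, 0.05)
    expected_defects = sum(k * p for k, p in enumerate(dist))
    print(round(expected_defects, 2))   # mean number of shunted cells per module
    print(round(1 - dist[0], 3))        # probability of at least one shunted cell
    ```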

  12. Evidential reasoning research on intrusion detection

    NASA Astrophysics Data System (ADS)

    Wang, Xianpei; Xu, Hua; Zheng, Sheng; Cheng, Anyu

    2003-09-01

    This paper focuses on two fields: the Dempster-Shafer (D-S) theory of evidence and network intrusion detection. It discusses how to apply this form of probabilistic reasoning, as an AI technique, to an Intrusion Detection System (IDS). The paper establishes the application model, describes the new mechanism of reasoning and decision-making, and analyses how to implement the model based on detection of synscan activities on the network. The results suggest that if rational probability values are assigned at the beginning, the engine can, according to the rules of evidence combination and hierarchical reasoning, compute belief values and finally inform administrators of the nature of the traced activities -- intrusions, normal activities or abnormal activities.
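
    The evidence-combination step at the heart of such a D-S engine can be sketched with Dempster's rule over a small frame of discernment. The hypotheses and mass values below are invented for illustration; this shows only the combination rule, not the paper's full hierarchical reasoning model.

    ```python
    from itertools import product

    def dempster_combine(m1, m2):
        """Combine two basic probability assignments with Dempster's rule.

        m1, m2: dicts mapping frozenset hypotheses to masses that sum to 1.
        """
        combined, conflict = {}, 0.0
        for (a, ma), (b, mb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
        if conflict >= 1.0:
            raise ValueError("total conflict: evidence cannot be combined")
        return {h: m / (1.0 - conflict) for h, m in combined.items()}

    # Frame of discernment: intrusion (I), normal (N), abnormal-but-benign (A).
    I, N, A = frozenset("I"), frozenset("N"), frozenset("A")
    theta = I | N | A
    m_sensor1 = {I: 0.6, theta: 0.4}           # illustrative masses from one sensor
    m_sensor2 = {I: 0.5, N: 0.2, theta: 0.3}   # and from a second piece of evidence
    belief = dempster_combine(m_sensor1, m_sensor2)
    print({"".join(sorted(h)): round(v, 3) for h, v in belief.items()})
    ```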

  13. Multiple-Event Seismic Location Using the Markov-Chain Monte Carlo Technique

    NASA Astrophysics Data System (ADS)

    Myers, S. C.; Johannesson, G.; Hanley, W.

    2005-12-01

    We develop a new multiple-event location algorithm (MCMCloc) that utilizes the Markov-Chain Monte Carlo (MCMC) method. Unlike most inverse methods, the MCMC approach produces a suite of solutions, each of which is consistent with observations and prior estimates of data and model uncertainties. Model parameters in MCMCloc consist of event hypocenters and travel-time predictions. Data are arrival time measurements and phase assignments. Posterior estimates of event locations, path corrections, pick errors, and phase assignments are made through analysis of the posterior suite of acceptable solutions. Prior uncertainty estimates include correlations between travel-time predictions, correlations between measurement errors, the probability of misidentifying one phase for another, and the probability of spurious data. Inclusion of prior constraints on location accuracy allows direct utilization of ground-truth locations or well-constrained location parameters (e.g. from InSAR) that aid in the accuracy of the solution. Implementation of a correlation structure for travel-time predictions allows MCMCloc to operate over arbitrarily large geographic areas. Transition in behavior between a multiple-event locator for tightly clustered events and a single-event locator for solitary events is controlled by the spatial correlation of travel-time predictions. We test the MCMC locator on a regional data set of Nevada Test Site nuclear explosions. Event locations and origin times are known for these events, allowing us to test the features of MCMCloc using a high-quality ground truth data set. Preliminary tests suggest that MCMCloc provides excellent relative locations, often outperforming traditional multiple-event location algorithms, and excellent absolute locations are attained when constraints from one or more ground-truth events are included. When phase assignments are switched, we find that MCMCloc properly corrects the error when predicted arrival times are separated by several seconds. In cases where the predicted arrival times are within the combined uncertainty of prediction and measurement errors, MCMCloc determines the probability of one or the other phase assignment and propagates this uncertainty into all model parameters. We find that MCMCloc is a promising method for simultaneously locating large, geographically distributed data sets. Because we incorporate prior knowledge on many parameters, MCMCloc is ideal for combining trusted data with data of unknown reliability. This work was performed under the auspices of the U.S. Department of Energy by the University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48, Contribution UCRL-ABS-215048
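
    A toy Metropolis sampler for a single event in one dimension illustrates the general idea of producing a suite of locations consistent with arrival-time data and pick errors; MCMCloc itself handles multiple events, phase assignments, origin times, and correlated travel-time predictions, none of which are modelled here. The station geometry, velocity, and noise level are invented, and the origin time is assumed known.

    ```python
    import math
    import random

    # Illustrative 1-D setup: stations at known positions, constant velocity.
    stations = [0.0, 40.0, 100.0]   # km
    velocity = 6.0                  # km/s
    true_x, sigma = 30.0, 0.1       # true location (km) and pick error (s)
    arrivals = [abs(true_x - s) / velocity + random.gauss(0, sigma) for s in stations]

    def log_likelihood(x):
        """Gaussian misfit between observed and predicted arrival times."""
        return -0.5 * sum(((t - abs(x - s) / velocity) / sigma) ** 2
                          for t, s in zip(arrivals, stations))

    samples, x = [], 50.0           # start the chain away from the truth
    for _ in range(20000):
        proposal = x + random.gauss(0, 2.0)   # symmetric random-walk proposal
        if math.log(random.random()) < log_likelihood(proposal) - log_likelihood(x):
            x = proposal                      # Metropolis acceptance step
        samples.append(x)

    posterior = samples[5000:]                        # discard burn-in
    print(round(sum(posterior) / len(posterior), 1))  # posterior mean location
    ```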

  14. Xpert MTB/RIF assay for diagnosis of pulmonary tuberculosis in children: a prospective, multi-centre evaluation.

    PubMed

    Reither, Klaus; Manyama, Christina; Clowes, Petra; Rachow, Andrea; Mapamba, Daniel; Steiner, Andreas; Ross, Amanda; Mfinanga, Elirehema; Sasamalo, Mohamed; Nsubuga, Martin; Aloi, Francesco; Cirillo, Daniela; Jugheli, Levan; Lwilla, Fred

    2015-04-01

    Following endorsement by the World Health Organisation, the Xpert MTB/RIF assay has been widely incorporated into algorithms for the diagnosis of adult tuberculosis (TB). However, data on its performance in children remain scarce. This prospective, multi-centre study evaluated the performance of Xpert MTB/RIF to diagnose pulmonary tuberculosis in children. Children older than eight weeks and younger than 16 years with suspected pulmonary tuberculosis were enrolled at three TB endemic settings in Tanzania and Uganda, and assigned to five well-defined case definition categories: culture-confirmed TB, highly probable TB, probable TB, not TB, or indeterminate. The diagnostic accuracy of Xpert MTB/RIF was assessed using culture-confirmed TB cases as the reference standard. In total, 451 children were enrolled: 37 (8%) had culture-confirmed TB, 48 (11%) highly probable TB and 62 (13%) probable TB. The Xpert MTB/RIF assay had a sensitivity of 68% (95% CI, 50%-82%) and specificity of 100% (95% CI, 97%-100%), detecting 1.7 times more culture-confirmed cases than smear microscopy with a similar time to detection. Xpert MTB/RIF was positive in 2% (1/48) of highly probable and in 3% (2/62) of probable TB cases. Xpert MTB/RIF provided timely results with moderate sensitivity and excellent specificity compared to culture. Low yields in children with highly probable and probable TB remain problematic. Copyright © 2014 The British Infection Association. Published by Elsevier Ltd. All rights reserved.

  15. Probability of Elevated Volatile Organic Compound (VOC) Concentrations in Groundwater in the Eagle River Watershed Valley-Fill Aquifer, Eagle County, North-Central Colorado, 2006-2007

    USGS Publications Warehouse

    Rupert, Michael G.; Plummer, Niel

    2009-01-01

    This raster data set delineates the predicted probability of elevated volatile organic compound (VOC) concentrations in groundwater in the Eagle River watershed valley-fill aquifer, Eagle County, North-Central Colorado, 2006-2007. This data set was developed by a cooperative project between the U.S. Geological Survey, Eagle County, the Eagle River Water and Sanitation District, the Town of Eagle, the Town of Gypsum, and the Upper Eagle Regional Water Authority. This project was designed to evaluate potential land-development effects on groundwater and surface-water resources so that informed land-use and water management decisions can be made. This groundwater probability map and its associated probability maps were developed as follows: (1) A point data set of wells with groundwater quality and groundwater age data was overlaid with thematic layers of anthropogenic (related to human activities) and hydrogeologic data by using a geographic information system to assign each well values for depth to groundwater, distance to major streams and canals, distance to gypsum beds, precipitation, soils, and well depth. These data then were downloaded to a statistical software package for analysis by logistic regression. (2) Statistical models predicting the probability of elevated nitrate concentrations, the probability of unmixed young water (using chlorofluorocarbon-11 concentrations and tritium activities), and the probability of elevated volatile organic compound concentrations were developed using logistic regression techniques. (3) The statistical models were entered into a GIS and the probability map was constructed.
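
    The logistic-regression step (2) can be sketched with a toy fit: explanatory variables for each well (for example, depth to groundwater and distance to a stream) are regressed against a binary indicator of elevated concentration, and the fitted model returns a probability for any new location. The data, variables, and hand-rolled gradient ascent below are illustrative assumptions; the study used a statistical software package.

    ```python
    import math

    # Illustrative training data: [depth to groundwater (m), distance to stream (m)]
    # and whether an elevated concentration was detected (1) or not (0).
    X = [[3.0, 50.0], [10.0, 400.0], [2.0, 30.0], [8.0, 250.0], [1.5, 20.0], [12.0, 500.0]]
    y = [1, 0, 1, 0, 1, 0]

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    # Standardize inputs, then run plain stochastic gradient ascent on the log-likelihood.
    means = [sum(col) / len(col) for col in zip(*X)]
    scales = [max(col) - min(col) for col in zip(*X)]
    Xs = [[(v - m) / s for v, m, s in zip(row, means, scales)] for row in X]

    w, b = [0.0, 0.0], 0.0
    for _ in range(5000):
        for xi, yi in zip(Xs, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = yi - p
            w = [wj + 0.1 * err * xj for wj, xj in zip(w, xi)]
            b += 0.1 * err

    new_well = [(v - m) / s for v, m, s in zip([4.0, 80.0], means, scales)]
    prob = sigmoid(sum(wj * xj for wj, xj in zip(w, new_well)) + b)
    print(round(prob, 2))   # predicted probability of an elevated concentration
    ```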

  16. Probability of Elevated Nitrate Concentrations in Groundwater in the Eagle River Watershed Valley-Fill Aquifer, Eagle County, North-Central Colorado, 2006-2007

    USGS Publications Warehouse

    Rupert, Michael G.; Plummer, Niel

    2009-01-01

    This raster data set delineates the predicted probability of elevated nitrate concentrations in groundwater in the Eagle River watershed valley-fill aquifer, Eagle County, North-Central Colorado, 2006-2007. This data set was developed by a cooperative project between the U.S. Geological Survey, Eagle County, the Eagle River Water and Sanitation District, the Town of Eagle, the Town of Gypsum, and the Upper Eagle Regional Water Authority. This project was designed to evaluate potential land-development effects on groundwater and surface-water resources so that informed land-use and water management decisions can be made. This groundwater probability map and its associated probability maps were developed as follows: (1) A point data set of wells with groundwater quality and groundwater age data was overlaid with thematic layers of anthropogenic (related to human activities) and hydrogeologic data by using a geographic information system to assign each well values for depth to groundwater, distance to major streams and canals, distance to gypsum beds, precipitation, soils, and well depth. These data then were downloaded to a statistical software package for analysis by logistic regression. (2) Statistical models predicting the probability of elevated nitrate concentrations, the probability of unmixed young water (using chlorofluorocarbon-11 concentrations and tritium activities), and the probability of elevated volatile organic compound concentrations were developed using logistic regression techniques. (3) The statistical models were entered into a GIS and the probability map was constructed.

  17. Phase-Angle Dependence of Determinations of Diameter, Albedo, and Taxonomy: A Case Study of NEO 3691 Bede

    NASA Technical Reports Server (NTRS)

    Wooden, Diane H.; Lederer, Susan M.; Jehin, Emmanuel; Howell, Ellen S.; Fernandez, Yan; Harker, David E.; Ryan, Erin; Lovell, Amy; Woodward, Charles E.; Benner, Lance A.

    2015-01-01

    Parameters important for NEO risk assessment and mitigation include Near-Earth Object diameter and taxonomic classification, which translates to surface composition. Diameters of NEOs are derived from the thermal fluxes measured by WISE, NEOWISE, Spitzer Warm Mission and ground-based telescopes including the IRTF and UKIRT. Diameter and its coupled parameters Albedo and IR beaming parameter (a proxy for thermal inertia and/or surface roughness) are dependent upon the phase angle, which is the Sun-target-observer angle. Orbit geometries of NEOs, however, typically provide for observations at phase angles greater than 20 degrees. At higher phase angles, the observed thermal emission is sampling both the day and night sides of the NEO. We compare thermal models for NEOs that exclude (NEATM) and include (NESTM) night-side emission. We present a case study of NEO 3691 Bede, which is a higher albedo object, X (Ec) or Cgh taxonomy, to highlight the range of H magnitudes for this object (depending on the albedo and phase function slope parameter G), and to examine at different phase angles the taxonomy and thermal model fits for this NEO. Observations of 3691 Bede include our observations with IRTF+SpeX and with the 10 micrometer UKIRT+Michelle instrument, as well as WISE and Spitzer Warm mission data. By examining 3691 Bede as a case study, we highlight the interplay between the derivation of basic physical parameters and observing geometry, and we discuss the uncertainties in H magnitude, taxonomy assignment amongst the X-class (P, M, E), and diameter determinations. Systematic dependencies in the derivation of basic characterization parameters of H-magnitude, diameter, albedo and taxonomy with observing geometry are important to understand. These basic characterization parameters affect the statistical assessments of the NEO population, which in turn, affects the assignment of statistically-assessed basic parameters to discovered but yet-to-be-fully-characterized NEOs.

  18. Paired basic science and clinical problem-based learning faculty teaching side by side: do students evaluate them differently?

    PubMed

    Stevenson, Frazier T; Bowe, Connie M; Gandour-Edwards, Regina; Kumari, Vijaya G

    2005-02-01

    Many studies have evaluated the desirability of expert versus non-expert facilitators in problem-based learning (PBL), but performance differences between basic science and clinical facilitators have been less studied. In a PBL course at our university, pairs of faculty facilitators (1 clinician, 1 basic scientist) were assigned to student groups to maximise integration of basic science with clinical science. This study set out to establish whether students evaluate basic science and clinical faculty members differently when they teach side by side. Online questionnaires were used to survey 188 students about their faculty facilitators immediately after they completed each of 3 serial PBL cases. Overall satisfaction was measured using a scale of 1-7 and yes/no responses were gathered from closed questions describing faculty performance. Year 1 students rated basic science and clinical facilitators the same, but Year 2 students rated the clinicians higher overall. Year 1 students rated basic scientists higher in their ability to understand the limits of their own knowledge. Year 2 students rated the clinicians higher in several content expertise-linked areas: preparedness, promotion of in-depth understanding, and ability to focus the group, and down-rated the basic scientists for demonstrating overspecialised knowledge. Students' overall ratings of individual faculty best correlated with the qualities of stimulation, focus and preparedness, but not with overspecialisation, excessive interjection of the faculty member's own opinions, and encouragement of psychosocial issue discussion. When taught by paired basic science and clinical PBL facilitators, students in Year 1 rated basic science and clinical PBL faculty equally, while Year 2 students rated clinicians more highly overall. The Year 2 difference may be explained by perceived differences in content expertise.

  19. The Benefits of College Athletic Success: An Application of the Propensity Score Design with Instrumental Variables. NBER Working Paper No. 18196

    ERIC Educational Resources Information Center

    Anderson, Michael L.

    2012-01-01

    Spending on big-time college athletics is often justified on the grounds that athletic success attracts students and raises donations. Testing this claim has proven difficult because success is not randomly assigned. We exploit data on bookmaker spreads to estimate the probability of winning each game for college football teams. We then condition…

  20. Description of and link to the I-131 dose/risk calculator

    Cancer.gov

    This calculator estimates radiation dose received by the thyroid from radionuclides in fallout from nuclear tests conducted at the Nevada Test Site (NTS) and sites outside of the United States (global fallout); estimates risk of developing thyroid cancer from that exposure; and provides an estimate of probability of causation, sometimes called assigned share (PC/AS), for individuals who have been diagnosed with thyroid cancer.

  1. 76 FR 35577 - Magnuson-Stevens Fishery Conservation and Management Act Provisions; Fisheries of the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-17

    ... assigned to Level 4 would have ABC derived by the SSC using case-by-case approaches based on biomass, catch... threshold defined as the ratio of biomass (B)/B MSY to identify the probability of overfishing the stock... biomass for a given stock does not fall to a very low level from which recovery is more difficult. It...

  2. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.

  3. Descriptive data analysis.

    PubMed

    Thompson, Cheryl Bagley

    2009-01-01

    This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.

  4. Oleanna Math Program Materials.

    ERIC Educational Resources Information Center

    Coole, Walter A.

    This document is a collection of course outlines, syllabi, and test materials designed for several high school level and lower division mathematics courses taught in an auto-tutorial learning laboratory at Skagit Valley College (Washington). The courses included are: Pre-Algebra, Basic Algebra, Plane Geometry, Intermediate Algebra, Probability and…

  5. CAUSAL ANALYSIS AND PROBABILITY DATA: EXAMPLES FOR IMPAIRED AQUATIC CONDITION

    EPA Science Inventory

    Causal analysis is plausible reasoning applied to diagnosing observed effect(s), for example, diagnosing the cause of biological impairment in a stream. Sir Bradford Hill basically defined the application of causal analysis when he enumerated the elements of causality f...

  6. Study on Failure of Third-Party Damage for Urban Gas Pipeline Based on Fuzzy Comprehensive Evaluation

    PubMed Central

    Li, Jun; Zhang, Hong; Han, Yinshan; Wang, Baodong

    2016-01-01

    Focusing on the diversity, complexity and uncertainty of third-party damage accidents, the failure probability of third-party damage to urban gas pipelines was evaluated using the theory of the analytic hierarchy process and fuzzy mathematics. The fault tree of third-party damage, containing 56 basic events, was built by hazard identification of third-party damage. The fuzzy evaluation of basic event probabilities was conducted by the expert judgment method using membership functions of fuzzy sets. The determination of the weight of each expert and the modification of the evaluation opinions were accomplished using the improved analytic hierarchy process, and the failure probability of third-party damage to the urban gas pipeline was calculated. Taking the gas pipelines of a large provincial capital city as an example, the risk assessment results of the method were shown to conform to the actual situation, which provides a basis for safety risk prevention. PMID:27875545
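
    Once each basic event has been assigned a probability from the fuzzified expert judgments, the top-event probability follows from ordinary AND/OR gate algebra under an independence assumption. The tiny tree and the basic-event probabilities below are illustrative and are not taken from the paper's 56-event fault tree.

    ```python
    def and_gate(probs):
        """All inputs must occur (independent events)."""
        p = 1.0
        for q in probs:
            p *= q
        return p

    def or_gate(probs):
        """At least one input occurs (independent events)."""
        p = 1.0
        for q in probs:
            p *= (1.0 - q)
        return 1.0 - p

    # Illustrative basic-event probabilities for a third-party damage scenario.
    excavation_nearby = 0.05    # digging activity close to the pipeline
    patrol_misses_it = 0.30     # patrols fail to spot the activity
    marker_missing = 0.10       # pipeline markers absent or illegible
    cover_too_shallow = 0.08    # insufficient burial depth

    undetected_activity = and_gate([excavation_nearby, patrol_misses_it])
    pipeline_vulnerable = or_gate([marker_missing, cover_too_shallow])
    top_event = and_gate([undetected_activity, pipeline_vulnerable])
    print(round(top_event, 5))  # probability of third-party damage in this toy tree
    ```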

  7. Prefrontal Neurons Encode a Solution to the Credit-Assignment Problem

    PubMed Central

    Perge, János A.; Eskandar, Emad N.

    2017-01-01

    To adapt successfully to our environments, we must use the outcomes of our choices to guide future behavior. Critically, we must be able to correctly assign credit for any particular outcome to the causal features which preceded it. In some cases, the causal features may be immediately evident, whereas in others they may be separated in time or intermingled with irrelevant environmental stimuli, creating a potentially nontrivial credit-assignment problem. We examined the neuronal representation of information relevant for credit assignment in the dorsolateral prefrontal cortex (dlPFC) of two male rhesus macaques performing a task that elicited key aspects of this problem. We found that neurons conveyed the information necessary for credit assignment. Specifically, neuronal activity reflected both the relevant cues and outcomes at the time of feedback and did so in a manner that was stable over time, in contrast to prior reports of representational instability in the dlPFC. Furthermore, these representations were most stable early in learning, when credit assignment was most needed. When the same features were not needed for credit assignment, these neuronal representations were much weaker or absent. These results demonstrate that the activity of dlPFC neurons conforms to the basic requirements of a system that performs credit assignment, and that spiking activity can serve as a stable mechanism that links causes and effects. SIGNIFICANCE STATEMENT Credit assignment is the process by which we infer the causes of our successes and failures. We found that neuronal activity in the dorsolateral prefrontal cortex conveyed the necessary information for performing credit assignment. Importantly, while there are various potential mechanisms to retain a “trace” of the causal events over time, we observed that spiking activity was sufficiently stable to act as the link between causes and effects, in contrast to prior reports that suggested spiking representations were unstable over time. In addition, we observed that this stability varied as a function of learning, such that the neural code was more reliable over time during early learning, when it was most needed. PMID:28634307

  8. Risk management.

    PubMed

    Chambers, David W

    2010-01-01

    Every plan contains risk. To proceed without planning some means of managing that risk is to court failure. The basic logic of risk is explained. It consists in identifying a threshold where some corrective action is necessary, the probability of exceeding that threshold, and the attendant cost should the undesired outcome occur. This is the probable cost of failure. Various risk categories in dentistry are identified, including lack of liquidity; poor quality; equipment or procedure failures; employee slips; competitive environments; new regulations; unreliable suppliers, partners, and patients; and threats to one's reputation. It is prudent to make investments in risk management to the extent that the cost of managing the risk is less than the probable loss due to risk failure and when risk management strategies can be matched to type of risk. Four risk management strategies are discussed: insurance, reducing the probability of failure, reducing the costs of failure, and learning. A risk management accounting of the financial meltdown of October 2008 is provided.
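
    The "probable cost of failure" logic can be written out directly: a risk control is worth buying only when its cost is lower than the reduction it produces in probability-weighted loss. The probabilities and dollar figures below are illustrative only.

    ```python
    def expected_loss(probability, cost):
        """Probable cost of failure: chance of the undesired outcome times its cost."""
        return probability * cost

    # Illustrative: a practice weighing a maintenance plan against an equipment-failure risk.
    baseline = expected_loss(probability=0.10, cost=50_000)       # no risk management
    with_control = expected_loss(probability=0.02, cost=50_000)   # with the maintenance plan
    control_cost = 2_500

    worthwhile = control_cost < (baseline - with_control)
    print(baseline - with_control, worthwhile)   # expected saving and whether it justifies the cost
    ```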

  9. Extreme weather and experience influence reproduction in an endangered bird

    USGS Publications Warehouse

    Reichert, Brian E.; Cattau, Christopher E.; Fletcher, Robert J.; Kendall, William L.; Kitchens, Wiley M.

    2012-01-01

    Using a 14-year time series spanning large variation in climatic conditions and the entirety of a population's breeding range, we estimated the effects of extreme weather conditions (drought) on the state-specific probabilities of breeding and survival of an endangered bird, the Florida Snail Kite (Rostrhamus sociabilis plumbeus). Our analysis accounted for uncertainty in breeding status assignment, a common source of uncertainty that is often ignored when states are based on field observations. Breeding probabilities in adult kites (>1 year of age) decreased during droughts, whereas the probability of breeding in young kites (1 year of age) tended to increase. Individuals attempting to breed showed no evidence of reduced future survival. Although population viability analyses of this species and other species often implicitly assume that all adults will attempt to breed, we find that breeding probabilities were significantly <1 for all 13 estimable years considered. Our results suggest that experience is an important factor determining whether or not individuals attempt to breed during harsh environmental conditions and that reproductive effort may be constrained by an individual's quality and/or despotic behavior among individuals attempting to breed.

  10. Complex growing networks with intrinsic vertex fitness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bedogne, C.; Rodgers, G. J.

    2006-10-15

    One of the major questions in complex network research is to identify the range of mechanisms by which a complex network can self organize into a scale-free state. In this paper we investigate the interplay between a fitness linking mechanism and both random and preferential attachment. In our models, each vertex is assigned a fitness x, drawn from a probability distribution ρ(x). In Model A, at each time step a vertex is added and joined to an existing vertex, selected at random, with probability p and an edge is introduced between vertices with fitnesses x and y, with a rate f(x,y), with probability 1-p. Model B differs from Model A in that, with probability p, edges are added with preferential attachment rather than randomly. The analysis of Model A shows that, for every fixed fitness x, the network's degree distribution decays exponentially. In Model B we recover instead a power-law degree distribution whose exponent depends only on p, and we show how this result can be generalized. The properties of a number of particular networks are examined.
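
    A compact simulation in the spirit of Model A under assumed parameters: each new vertex receives a fitness drawn from ρ(x); with probability p the new vertex attaches to a uniformly chosen existing vertex, otherwise an edge is added between two existing vertices with the linking rate f(x, y) used as an acceptance probability. The choices of ρ, f, p, and network size are illustrative, not the paper's.

    ```python
    import random

    def grow_model_a(steps, p, rho=random.random, f=lambda x, y: x * y):
        """Model A sketch: random attachment with probability p, fitness linking otherwise."""
        fitness = [rho(), rho()]   # start from two connected vertices
        edges = [(0, 1)]
        for _ in range(steps):
            if random.random() < p:
                # add a vertex and join it to a uniformly chosen existing vertex
                fitness.append(rho())
                edges.append((len(fitness) - 1, random.randrange(len(fitness) - 1)))
            else:
                # add an edge between existing vertices i, j with rate f(x_i, x_j)
                i, j = random.sample(range(len(fitness)), 2)
                if random.random() < f(fitness[i], fitness[j]):
                    edges.append((i, j))
        return fitness, edges

    fitness, edges = grow_model_a(steps=5000, p=0.6)
    degree = [0] * len(fitness)
    for i, j in edges:
        degree[i] += 1
        degree[j] += 1
    print(len(fitness), max(degree))   # number of vertices and the largest degree
    ```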

  11. Associations among milk production and rectal temperature on pregnancy maintenance in lactating recipient dairy cows.

    PubMed

    Vasconcelos, J L M; Cooke, R F; Jardina, D T G; Aragon, F L; Veras, M B; Soriano, S; Sobreira, N; Scarpa, A B

    2011-09-01

    The objective of this study was to evaluate the associations among milk production, rectal temperature, and pregnancy maintenance in lactating recipient dairy cows. Data were collected during an 11-mo period from 463 Holstein cows (203 primiparous and 260 multiparous) assigned to a fixed-time embryo transfer (ET) protocol. Only cows detected with a visible corpus luteum immediately prior to ET were used. Rectal temperatures were collected from all cows on the same day of ET. Milk production at ET was calculated by averaging individual daily milk production during the 7d preceding ET. Pregnancy diagnosis was performed by transrectal ultrasonography 21d after ET. Cows were ranked and assigned to groups according to median milk production (median=35kg/d; HPROD=above median; LPROD=below median) and rectal temperature (≤39.0°C=LTEMP; >39.0°C=HTEMP). A milk production×temperature group interaction was detected (P=0.04) for pregnancy analysis because HTEMP cows ranked as LPROD were 3.1 times more likely to maintain pregnancy compared with HTEMP cows ranked as HPROD (P=0.03). Milk production did not affect (P=0.55) odds of pregnancy maintenance within LTEMP cows, however, and no differences in odds of pregnancy maintenance were detected between HTEMP and LTEMP within milk production groups (P>0.11). Within HTEMP cows, increased milk production decreased the probability of pregnancy maintenance linearly, whereas within LTEMP cows, increased milk production increased the probability of pregnancy maintenance linearly. Within HPROD, increased rectal temperature decreased the probability of pregnancy maintenance linearly, whereas within LPROD cows, no associations between rectal temperatures and probability of cows to maintain pregnancy were detected. In summary, high-producing dairy cows with rectal temperatures below 39.0°C did not experience reduced pregnancy maintenance to ET compared to cohorts with reduced milk production. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Framework Concept: The Information Factor within a Comprehensive Approach to Multinational Crisis Management

    DTIC Science & Technology

    2009-04-03

    Project, 2002. Bell D.: The Coming of Post-Industrial Society. Basic Books, New York 1976. Blumer H.: Symbolic Interactionism – Perspective and...communication and use by assigned meaning through known conventions used in symbolic representation. 05. The ability to receive, share and transmit information...communicated by symbols (information), i.e., through concepts within the mind that represent reality. 33. The meaning and value of information depends

  13. Training Basic Visual Attention Leads to Changes in Responsiveness to Social-Communicative Cues in 9-Month-Olds

    ERIC Educational Resources Information Center

    Forssman, Linda; Wass, Sam V.

    2018-01-01

    This study investigated transfer effects of gaze-interactive attention training to more complex social and cognitive skills in infancy. Seventy 9-month-olds were assigned to a training group (n = 35) or an active control group (n = 35). Before, after, and at 6-week follow-up both groups completed an assessment battery assessing transfer to…

  14. Neural Meta-Memes Framework for Combinatorial Optimization

    NASA Astrophysics Data System (ADS)

    Song, Li Qin; Lim, Meng Hiot; Ong, Yew Soon

    In this paper, we present a Neural Meta-Memes Framework (NMMF) for combinatorial optimization. NMMF is a framework which models basic optimization algorithms as memes and manages them dynamically when solving combinatorial problems. NMMF encompasses neural networks which serve as the overall planner/coordinator to balance the workload between memes. We show the efficacy of the proposed NMMF through empirical study on a class of combinatorial problem, the quadratic assignment problem (QAP).

  15. Cause and Effect: Testing a Mechanism and Method for the Cognitive Integration of Basic Science.

    PubMed

    Kulasegaram, Kulamakan; Manzone, Julian C; Ku, Cheryl; Skye, Aimee; Wadey, Veronica; Woods, Nicole N

    2015-11-01

    Methods of integrating basic science with clinical knowledge are still debated in medical training. One possibility is increasing the spatial and temporal proximity of clinical content to basic science. An alternative model argues that teaching must purposefully expose relationships between the domains. The authors compared different methods of integrating basic science: causal explanations linking basic science to clinical features, presenting both domains separately but in proximity, and simply presenting clinical features. First-year undergraduate health professions students were randomized to four conditions: (1) science-causal explanations (SC), (2) basic science before clinical concepts (BC), (3) clinical concepts before basic science (CB), and (4) clinical features list only (FL). Based on assigned conditions, participants were given explanations for four disorders in neurology or rheumatology followed by a memory quiz and a diagnostic test consisting of 12 cases, which were repeated after one week. Ninety-four participants completed the study. No difference was found on memory test performance, but on the diagnostic test, a condition by time interaction was found (F[3,88] = 3.05, P < .03, ηp = 0.10). Although all groups had similar immediate performance, the SC group had a minimal decrease in performance on delayed testing; the CB and FL groups had the greatest decreases. These results suggest that creating proximity between basic science and clinical concepts may not guarantee cognitive integration. Although cause-and-effect explanations may not be possible for all domains, making explicit and specific connections between domains will likely facilitate the benefits of integration for learners.

  16. Practice of Project-based Learning on Fused Multiple Department and Educational Effect by Assignment System

    NASA Astrophysics Data System (ADS)

    Okada, Masato; Muranaka, Takayuki; Kameyama, Kentaro; Kitagawa, Hirokazu; Suzuki, Hidekazu

    In this paper, a new subject based on PBL (Project Based Learning) and its educational effects are discussed. The distinctive feature of this subject is that problems are solved through a division of labor. Students break into four-member groups and cooperatively develop a line-trace robot. They divide responsibility for the mechanism, electric circuit, and programming, and learn the basic knowledge of their assigned area from teachers. After that, they develop the robot based on group discussions. This procedure resembles the way work is organized in companies, and the main objective of the subject is to acquire this skill. Each robot is evaluated in a competition held in a public space on campus. The questionnaire results showed a very active attitude, high attendance, and a high degree of satisfaction among the students.

  17. A genomewide survey of basic helix–loop–helix factors in Drosophila

    PubMed Central

    Moore, Adrian W.; Barbel, Sandra; Jan, Lily Yeh; Jan, Yuh Nung

    2000-01-01

    The basic helix–loop–helix (bHLH) transcription factors play important roles in the specification of tissue type during the development of animals. We have used the information contained in the recently published genomic sequence of Drosophila melanogaster to identify 12 additional bHLH proteins. By sequence analysis we have assigned these proteins to families defined by Atonal, Hairy-Enhancer of Split, Hand, p48, Mesp, MYC/USF, and the bHLH-Per, Arnt, Sim (PAS) domain. In addition, one single protein represents a unique family of bHLH proteins. mRNA in situ analysis demonstrates that the genes encoding these proteins are expressed in several tissue types but are particularly concentrated in the developing nervous system and mesoderm. PMID:10973473

  18. The Physics of a Gymnastics Flight Element

    NASA Astrophysics Data System (ADS)

    Contakos, Jonas; Carlton, Les G.; Thompson, Bruce; Suddaby, Rick

    2009-09-01

    From its inception, performance in the sport of gymnastics has relied on the laws of physics to create movement patterns and static postures that appear almost impossible. In general, gymnastics is physics in motion and can provide an ideal framework for studying basic human modeling techniques and physical principles. Using low-end technology and basic principles of physics, we analyzed a high-end gymnastics skill performed in competition by both men and women. The comprehensive goal of the examination is to understand scientifically how a skill of this magnitude is physically possible and what a gymnast must do to complete it successfully. The examination is divided into three sections, each of which is comprehensive enough to be a separate assignment or small group project.

  19. A Simple Statistical Thermodynamics Experiment

    ERIC Educational Resources Information Center

    LoPresto, Michael C.

    2010-01-01

    Comparing the predicted and actual rolls of combinations of both two and three dice can help to introduce many of the basic concepts of statistical thermodynamics, including multiplicity, probability, microstates, and macrostates, and demonstrate that entropy is indeed a measure of randomness, that disordered states (those of higher entropy) are…

  20. Cutting Cakes Carefully

    ERIC Educational Resources Information Center

    Hill, Theodore P.; Morrison, Kent E.

    2010-01-01

    This paper surveys the fascinating mathematics of fair division, and provides a suite of examples using basic ideas from algebra, calculus, and probability which can be used to examine and test new and sometimes complex mathematical theories and claims involving fair division. Conversely, the classical cut-and-choose and moving-knife algorithms…

  1. The Five-Year Outlook on Science and Technology: 1982.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC. Committee on Science and Public Policy.

    Presented are reports on trends and probable future developments in eight selected areas of basic science and engineering. These reports are: "The Genetic Program of Complex Organisms" (Maxine F. Singer); "The Molecular and Genetic Technology of Plants" (Joseph E. Varner); "Cell Receptors for Hormones and…

  2. Hierarchical auto-configuration addressing in mobile ad hoc networks (HAAM)

    NASA Astrophysics Data System (ADS)

    Ram Srikumar, P.; Sumathy, S.

    2017-11-01

    Addressing plays a vital role in networking: a device must be assigned a unique address in order to participate in data communication in any network. Different protocols defining different types of addressing have been proposed in the literature. Address auto-configuration is a key requirement for self-organizing networks. Existing auto-configuration based addressing protocols require broadcasting probes to all the nodes in the network before assigning a proper address to a new node, and further broadcasts are needed to reflect the status of the acquired address in the network. Such methods incur high communication overheads due to repetitive flooding. To address this overhead, a new partially stateful address allocation scheme, the Hierarchical Auto-configuration Addressing (HAAM) scheme, is extended and proposed. Hierarchical addressing reduces the latency and overhead incurred during address configuration. The partially stateful addressing algorithm assigns addresses without the need for flooding and global state awareness, which in turn reduces the communication overhead and space complexity, respectively. Nodes are assigned addresses hierarchically to maintain the graph of the network as a spanning tree, which helps in effectively avoiding the broadcast storm problem. The proposed HAAM algorithm handles network splits and merges efficiently in large-scale mobile ad hoc networks while incurring low communication overheads.

  3. Evolutionary squeaky wheel optimization: a new framework for analysis.

    PubMed

    Li, Jingpeng; Parkes, Andrew J; Burke, Edmund K

    2011-01-01

    Squeaky wheel optimization (SWO) is a relatively new metaheuristic that has been shown to be effective for many real-world problems. At each iteration SWO does a complete construction of a solution starting from the empty assignment. Although the construction uses information from previous iterations, the complete rebuilding does mean that SWO is generally effective at diversification but can suffer from relatively weak intensification. Evolutionary SWO (ESWO) is a recent extension to SWO that is designed to improve the intensification by keeping the good components of solutions and only using SWO to reconstruct other poorer components of the solution. In such algorithms a standard challenge is to understand how the various parameters affect the search process. In order to support the future study of such issues, we propose a formal framework for the analysis of ESWO. The framework is based on Markov chains, and the main novelty arises because ESWO moves through the space of partial assignments. This makes it significantly different from the analyses used in local search (such as simulated annealing) which only move through complete assignments. Generally, the exact details of ESWO will depend on various heuristics, so we focus our approach on a case of ESWO that we call ESWO-II and that has probabilistic as opposed to heuristic selection and construction operators. For ESWO-II, we study a simple problem instance and explicitly compute the stationary distribution probability over the states of the search space. We find interesting properties of the distribution. In particular, we find that the probabilities of states generally, but not always, increase with their fitness. This nonmonotonicity is quite different from the monotonicity expected in algorithms such as simulated annealing.
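
    The central object in this framework is the stationary distribution of a Markov chain over search states. For a small transition matrix it can be obtained by power iteration, as in the sketch below; the three-state chain is invented for illustration and is not the ESWO-II instance analysed in the paper.

    ```python
    def stationary_distribution(P, iterations=10000):
        """Power iteration: repeatedly apply the row-stochastic transition matrix P."""
        n = len(P)
        pi = [1.0 / n] * n   # start from the uniform distribution
        for _ in range(iterations):
            pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        return pi

    # Illustrative three-state chain over search states, ordered by increasing fitness.
    P = [[0.5, 0.3, 0.2],
         [0.2, 0.5, 0.3],
         [0.1, 0.3, 0.6]]
    print([round(p, 3) for p in stationary_distribution(P)])
    ```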

  4. Impact of Type 2 Diabetes and Postmenopausal Hormone Therapy on Incidence of Cognitive Impairment in Older Women

    PubMed Central

    Brinton, Roberta Diaz; Hugenschmidt, Christina; Manson, JoAnn E.; Craft, Suzanne; Yaffe, Kristine; Weitlauf, Julie; Vaughan, Leslie; Johnson, Karen C.; Padula, Claudia B.; Jackson, Rebecca D.; Resnick, Susan M.

    2015-01-01

    OBJECTIVE In older women, higher levels of estrogen may exacerbate the increased risk for cognitive impairment conveyed by diabetes. We examined whether the effect of postmenopausal hormone therapy (HT) on cognitive impairment incidence differs depending on type 2 diabetes. RESEARCH DESIGN AND METHODS The Women’s Health Initiative (WHI) randomized clinical trials assigned women to HT (0.625 mg/day conjugated equine estrogens with or without [i.e., unopposed] 2.5 mg/day medroxyprogesterone acetate) or matching placebo for an average of 4.7–5.9 years. A total of 7,233 women, aged 65–80 years, were classified according to type 2 diabetes status and followed for probable dementia and cognitive impairment (mild cognitive impairment or dementia). RESULTS Through a maximum of 18 years of follow-up, women with diabetes had increased risk of probable dementia (hazard ratio [HR] 1.54 [95% CI 1.16–2.06]) and cognitive impairment (HR 1.83 [1.50–2.23]). The combination of diabetes and random assignment to HT increased their risk of dementia (HR 2.12 [1.47–3.06]) and cognitive impairment (HR 2.20 [1.70–2.87]) compared with women without these conditions, interaction P = 0.09 and P = 0.08. These interactions appeared to be limited to women assigned to unopposed conjugated equine estrogens. CONCLUSIONS These analyses provide additional support to a prior report that higher levels of estrogen may exacerbate risks that type 2 diabetes poses for cognitive function in older women. The role estrogen plays in suppressing non–glucose-based energy sources in the brain may explain this interaction. PMID:26486190

  5. Becoming pure: identifying generational classes of admixed individuals within lesser and greater scaup populations.

    PubMed

    Lavretsky, Philip; Peters, Jeffrey L; Winker, Kevin; Bahn, Volker; Kulikova, Irina; Zhuravlev, Yuri N; Wilson, Robert E; Barger, Chris; Gurney, Kirsty; McCracken, Kevin G

    2016-02-01

    Estimating the frequency of hybridization is important to understand its evolutionary consequences and its effects on conservation efforts. In this study, we examined the extent of hybridization in two sister species of ducks that hybridize. We used mitochondrial control region sequences and 3589 double-digest restriction-associated DNA sequences (ddRADseq) to identify admixture between wild lesser scaup (Aythya affinis) and greater scaup (A. marila). Among 111 individuals, we found one introgressed mitochondrial DNA haplotype in lesser scaup and four in greater scaup. Likewise, based on the site-frequency spectrum from autosomal DNA, gene flow was asymmetrical, with higher rates from lesser into greater scaup. However, using ddRADseq nuclear DNA, all individuals were assigned to their respective species with >0.95 posterior assignment probability. To examine the power for detecting admixture, we simulated a breeding experiment in which empirical data were used to create F1 hybrids and nine generations (F2-F10) of backcrossing. F1 hybrids and F2, F3 and most F4 backcrosses were clearly distinguishable from pure individuals, but evidence of admixed histories was effectively lost after the fourth generation. Thus, we conclude that low interspecific assignment probabilities (0.011-0.043) for two lesser and nineteen greater scaup were consistent with admixed histories beyond the F3 generation. These results indicate that the propensity of these species to hybridize in the wild is low and largely asymmetric. When applied to species-specific cases, our approach offers powerful utility for examining concerns of hybridization in conservation efforts, especially for determining the generational time until admixed histories are effectively lost through backcrossing. © 2015 John Wiley & Sons Ltd.

  6. Impact of Type 2 Diabetes and Postmenopausal Hormone Therapy on Incidence of Cognitive Impairment in Older Women.

    PubMed

    Espeland, Mark A; Brinton, Roberta Diaz; Hugenschmidt, Christina; Manson, JoAnn E; Craft, Suzanne; Yaffe, Kristine; Weitlauf, Julie; Vaughan, Leslie; Johnson, Karen C; Padula, Claudia B; Jackson, Rebecca D; Resnick, Susan M

    2015-12-01

    In older women, higher levels of estrogen may exacerbate the increased risk for cognitive impairment conveyed by diabetes. We examined whether the effect of postmenopausal hormone therapy (HT) on cognitive impairment incidence differs depending on type 2 diabetes. The Women's Health Initiative (WHI) randomized clinical trials assigned women to HT (0.625 mg/day conjugated equine estrogens with or without [i.e., unopposed] 2.5 mg/day medroxyprogesterone acetate) or matching placebo for an average of 4.7-5.9 years. A total of 7,233 women, aged 65-80 years, were classified according to type 2 diabetes status and followed for probable dementia and cognitive impairment (mild cognitive impairment or dementia). Through a maximum of 18 years of follow-up, women with diabetes had increased risk of probable dementia (hazard ratio [HR] 1.54 [95% CI 1.16-2.06]) and cognitive impairment (HR 1.83 [1.50-2.23]). The combination of diabetes and random assignment to HT increased their risk of dementia (HR 2.12 [1.47-3.06]) and cognitive impairment (HR 2.20 [1.70-2.87]) compared with women without these conditions, interaction P = 0.09 and P = 0.08. These interactions appeared to be limited to women assigned to unopposed conjugated equine estrogens. These analyses provide additional support to a prior report that higher levels of estrogen may exacerbate risks that type 2 diabetes poses for cognitive function in older women. The role estrogen plays in suppressing non-glucose-based energy sources in the brain may explain this interaction. © 2015 by the American Diabetes Association. Readers may use this article as long as the work is properly cited, the use is educational and not for profit, and the work is not altered.

  7. History of the Armed Services Vocational Aptitude Battery (ASVAB) 1974-1980

    DTIC Science & Technology

    1980-03-01

    the high school equivalent form to ASVAB-3 being used operationally by the Air Force and Marine Corps). A counterbalanced design, in ... , 11. ... National Longitudinal Survey of Youth made use of three independent probability samples. Two of these samples were designed to cover the non ... literacy, but non-language tests were also introduced for service qualification. After service entry, the primary test instruments for assignment

  8. Data normalization in biosurveillance: an information-theoretic approach.

    PubMed

    Peter, William; Najmi, Amir H; Burkom, Howard

    2007-10-11

    An approach to identifying public health threats by characterizing syndromic surveillance data in terms of its surprisability is discussed. Surprisability in our model is measured by assigning a probability distribution to a time series, and then calculating its entropy, leading to a straightforward designation of an alert. Initial application of our method is to investigate the applicability of using suitably-normalized syndromic counts (i.e., proportions) to improve early event detection.
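
    A minimal version of such a surprisability measure: convert the day's syndromic counts to proportions, compute the Shannon entropy, and flag a large drop relative to a recent baseline (counts concentrating in one syndrome lower the entropy). The counts and the alert threshold below are illustrative assumptions, not the authors' operational settings.

    ```python
    import math

    def entropy(counts):
        """Shannon entropy (bits) of the normalized syndromic proportions."""
        total = sum(counts)
        props = [c / total for c in counts if c > 0]
        return -sum(p * math.log2(p) for p in props)

    baseline = [40, 35, 30, 32]   # illustrative counts for four syndrome groups
    today = [41, 36, 120, 33]     # one syndrome spikes

    drop = entropy(baseline) - entropy(today)
    print(round(drop, 3), drop > 0.2)   # entropy drop and a toy alert decision
    ```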

  9. Water Control Data System Software Manual.

    DTIC Science & Technology

    1983-02-01

    latter computes multiple durations of hydrologic data, finds annual maxima and minima, and assigns probabilities to events by computing annual frequency...

  10. Inference for the effect of treatment on survival probability in randomized trials with noncompliance and administrative censoring.

    PubMed

    Nie, Hui; Cheng, Jing; Small, Dylan S

    2011-12-01

    In many clinical studies with a survival outcome, administrative censoring occurs when follow-up ends at a prespecified date and many subjects are still alive. An additional complication in some trials is that there is noncompliance with the assigned treatment. For this setting, we study the estimation of the causal effect of treatment on survival probability up to a given time point among those subjects who would comply with the assignment to both treatment and control. We first discuss the standard instrumental variable (IV) method for survival outcomes and parametric maximum likelihood methods, and then develop an efficient plug-in nonparametric empirical maximum likelihood estimation (PNEMLE) approach. The PNEMLE method does not make any assumptions on outcome distributions, and makes use of the mixture structure in the data to gain efficiency over the standard IV method. Theoretical results of the PNEMLE are derived and the method is illustrated by an analysis of data from a breast cancer screening trial. From our limited mortality analysis with administrative censoring times 10 years into the follow-up, we find a significant benefit of screening is present after 4 years (at the 5% level) and this persists at 10 years follow-up. © 2011, The International Biometric Society.

  11. Will it Blend? Visualization and Accuracy Evaluation of High-Resolution Fuzzy Vegetation Maps

    NASA Astrophysics Data System (ADS)

    Zlinszky, A.; Kania, A.

    2016-06-01

    Instead of assigning every map pixel to a single class, fuzzy classification includes information on the class assigned to each pixel but also the certainty of this class and the alternative possible classes based on fuzzy set theory. The advantages of fuzzy classification for vegetation mapping are well recognized, but the accuracy and uncertainty of fuzzy maps cannot be directly quantified with indices developed for hard-boundary categorizations. The rich information in such a map is impossible to convey with a single map product or accuracy figure. Here we introduce a suite of evaluation indices and visualization products for fuzzy maps generated with ensemble classifiers. We also propose a way of evaluating classwise prediction certainty with "dominance profiles" visualizing the number of pixels in bins according to the probability of the dominant class, also showing the probability of all the other classes. Together, these data products allow a quantitative understanding of the rich information in a fuzzy raster map both for individual classes and in terms of variability in space, and also establish the connection between spatially explicit class certainty and traditional accuracy metrics. These map products are directly comparable to widely used hard boundary evaluation procedures, support active learning-based iterative classification and can be applied for operational use.

  12. Understanding evidence-based diagnosis.

    PubMed

    Kohn, Michael A

    2014-01-01

    The real meaning of the word "diagnosis" is naming the disease that is causing a patient's illness. The cognitive process of assigning this name is a mysterious combination of pattern recognition and the hypothetico-deductive approach that is only remotely related to the mathematical process of using test results to update the probability of a disease. What I refer to as "evidence-based diagnosis" is really evidence-based use of medical tests to guide treatment decisions. Understanding how to use test results to update the probability of disease can help us interpret test results more rationally. Also, evidence-based diagnosis reminds us to consider the costs and risks of testing and the dangers of over-diagnosis and over-treatment, in addition to the costs and risks of missing serious disease.
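
    Updating the probability of disease from a test result is usually done on the odds scale with likelihood ratios, as in the sketch below; the pre-test probability, sensitivity, and specificity are illustrative values, not recommendations.

    ```python
    def post_test_probability(pretest_prob, sensitivity, specificity, positive=True):
        """Convert a pre-test probability to a post-test probability via likelihood ratios."""
        lr = sensitivity / (1 - specificity) if positive else (1 - sensitivity) / specificity
        pre_odds = pretest_prob / (1 - pretest_prob)
        post_odds = pre_odds * lr
        return post_odds / (1 + post_odds)

    # Illustrative: 10% pre-test probability, test with 90% sensitivity and 95% specificity.
    print(round(post_test_probability(0.10, 0.90, 0.95, positive=True), 2))    # positive result
    print(round(post_test_probability(0.10, 0.90, 0.95, positive=False), 3))   # negative result
    ```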

  13. Author Credit for Transdisciplinary Collaboration

    PubMed Central

    Xu, Jian; Ding, Ying; Malic, Vincent

    2015-01-01

    Transdisciplinary collaboration is the key for innovation. An evaluation mechanism is necessary to ensure that academic credit for this costly process can be allocated fairly among coauthors. This paper proposes a set of quantitative measures (e.g., t_credit and t_index) to reflect authors’ transdisciplinary contributions to publications. These measures are based on paper-topic probability distributions and author-topic probability distributions. We conduct an empirical analysis of the information retrieval domain which demonstrates that these measures effectively improve the results of harmonic_credit and h_index measures by taking into account the transdisciplinary contributions of authors. The definitions of t_credit and t_index provide a fair and effective way for research organizations to assign credit to authors of transdisciplinary publications. PMID:26375678

  14. Who Needs Plants? Science (Experimental).

    ERIC Educational Resources Information Center

    Ropeik, Bernard H.; Kleinman, David Z.

    The basic elective course in introductory botany is designed for secondary students who probably will not continue study in plant science. The objectives of the course are to help the student 1) identify, compare and differentiate types of plants; 2) identify plant cell structures; 3) distinguish between helpful and harmful plants; 4) predict…

  15. Probability, Problem Solving, and "The Price is Right."

    ERIC Educational Resources Information Center

    Wood, Eric

    1992-01-01

    This article discusses the analysis of a decision-making process faced by contestants on the television game show "The Price is Right". The included analyses of the original and related problems concern pattern searching, inductive reasoning, quadratic functions, and graphing. Computer simulation programs in BASIC and tables of…

  16. Effects of Spatial and Selective Attention on Basic Multisensory Integration

    ERIC Educational Resources Information Center

    Gondan, Matthias; Blurton, Steven P.; Hughes, Flavia; Greenlee, Mark W.

    2011-01-01

    When participants respond to auditory and visual stimuli, responses to audiovisual stimuli are substantially faster than to unimodal stimuli (redundant signals effect, RSE). In such tasks, the RSE is usually higher than probability summation predicts, suggestive of specific integration mechanisms underlying the RSE. We investigated the role of…
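
    The probability-summation benchmark referred to here is often computed as the prediction of an independent race between the two unimodal channels: either channel alone may trigger the response, so the bimodal response probability should not exceed the value below if no integration occurs. The probabilities are invented for illustration.

    ```python
    def probability_summation(p_auditory, p_visual):
        """Independent race prediction: either channel alone may trigger the response."""
        return p_auditory + p_visual - p_auditory * p_visual

    # Illustrative cumulative response probabilities by some time t after stimulus onset.
    p_a, p_v, p_av_observed = 0.40, 0.35, 0.70
    predicted = probability_summation(p_a, p_v)
    print(round(predicted, 2), p_av_observed > predicted)   # 0.61, True -> RSE beyond summation
    ```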

  17. The Dutch Identity: A New Tool for the Study of Item Response Models.

    ERIC Educational Resources Information Center

    Holland, Paul W.

    1990-01-01

    The Dutch Identity is presented as a useful tool for expressing the basic equations of item response models that relate the manifest probabilities to the item response functions and the latent trait distribution. Ways in which the identity may be exploited are suggested and illustrated. (SLD)

  18. Students' Conceptual Difficulties in Quantum Mechanics: Potential Well Problems

    ERIC Educational Resources Information Center

    Ozcan, Ozgur; Didis, Nilufer; Tasar, Mehmet Fatih

    2009-01-01

    In this study, students' conceptual difficulties about some basic concepts in quantum mechanics like one-dimensional potential well problems and probability density of tunneling particles were identified. For this aim, a multiple choice instrument named Quantum Mechanics Conceptual Test has been developed by one of the researchers of this study…

  19. Secondary Schools Curriculum Guide, Mathematics, Grades 10-12. Revised.

    ERIC Educational Resources Information Center

    Cranston School Dept., RI.

    Behavioral objectives for grades 10 through 12 are specified for plane geometry, algebra, general mathematics, computer mathematics, slide rule mathematics, basic college mathematics, trigonometry, analytic geometry, calculus and probability. Most sections present material in terms of portions of a school year. At least one major objective is…

  20. Using High-Probability Instructional Sequences and Explicit Instruction to Teach Multiplication Facts

    ERIC Educational Resources Information Center

    Leach, Debra

    2016-01-01

    Students with learning disabilities often struggle with math fact fluency and require specialized interventions to recall basic facts. Deficits in math fact fluency can result in later difficulties when learning higher-level mathematical computation, concepts, and problem solving. The response-to-intervention (RTI) and…
