Sample records for model invalidation-based approach

  1. Compiler-directed cache management in multiprocessors

    NASA Technical Reports Server (NTRS)

    Cheong, Hoichi; Veidenbaum, Alexander V.

    1990-01-01

    The necessity of finding alternatives to hardware-based cache coherence strategies for large-scale multiprocessor systems is discussed. Three different software-based strategies sharing the same goals and general approach are presented. They consist of a simple invalidation approach, a fast selective invalidation scheme, and a version control scheme. The strategies are suitable for shared-memory multiprocessor systems with interconnection networks and a large number of processors. Results of trace-driven simulations conducted on numerical benchmark routines to compare the performance of the three schemes are presented.
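The "simple invalidation approach" mentioned above can be illustrated with a toy write-invalidate simulation. This is a generic sketch of invalidation-based coherence, not the paper's compiler-directed schemes (which decide invalidations at compile time); the classes and values are hypothetical.

```python
# Toy sketch of a write-invalidate coherence protocol (illustrative only;
# the paper's schemes are compiler-directed, not hardware snooping).

class Memory:
    def __init__(self):
        self.data = {}

class Cache:
    def __init__(self, cpu_id, memory, peers):
        self.cpu_id = cpu_id
        self.memory = memory
        self.peers = peers          # shared list that holds all caches
        self.lines = {}             # address -> cached value

    def read(self, addr):
        if addr not in self.lines:                  # miss: fetch from memory
            self.lines[addr] = self.memory.data.get(addr, 0)
        return self.lines[addr]

    def write(self, addr, value):
        for peer in self.peers:                     # invalidate other copies
            if peer is not self:
                peer.lines.pop(addr, None)
        self.lines[addr] = value                    # write-through to memory
        self.memory.data[addr] = value

mem = Memory()
caches = []
caches.extend(Cache(i, mem, caches) for i in range(3))

caches[1].read("x")          # P1 caches the initial value 0
caches[0].write("x", 42)     # P0's write invalidates P1's stale copy
print(caches[1].read("x"))   # P1 misses and re-reads the new value: 42
```

The selective and version-control schemes in the paper refine exactly this step: deciding which cached copies actually need to be discarded.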

  2. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative–quantitative modeling

    PubMed Central

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-01-01

    Summary: Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and frequently only available in the form of qualitative if–then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MATLAB™-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. Availability: ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/ Contact: stefan.streif@ovgu.de PMID:22451270

  3. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative-quantitative modeling.

    PubMed

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-05-01

    Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MATLAB™-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/

  4. Set-base dynamical parameter estimation and model invalidation for biochemical reaction networks.

    PubMed

    Rumschinski, Philipp; Borchers, Steffen; Bosio, Sandro; Weismantel, Robert; Findeisen, Rolf

    2010-05-25

    Mathematical modeling and analysis have become, for the study of biological and cellular processes, an important complement to experimental research. However, the structural and quantitative knowledge available for such processes is frequently limited, and measurements are often subject to inherent and possibly large uncertainties. This results in competing model hypotheses, whose kinetic parameters may not be experimentally determinable. Discriminating among these alternatives and estimating their kinetic parameters is crucial to improve the understanding of the considered process, and to benefit from the analytical tools at hand. In this work we present a set-based framework that allows one to discriminate between competing model hypotheses and to provide guaranteed outer estimates on the model parameters that are consistent with the (possibly sparse and uncertain) experimental measurements. This is obtained by means of exact proofs of model invalidity that exploit the polynomial/rational structure of biochemical reaction networks, and by making use of an efficient strategy to balance solution accuracy and computational effort. The practicability of our approach is illustrated with two case studies. The first study shows that our approach allows one to conclusively rule out wrong model hypotheses. The second study focuses on parameter estimation, and shows that the proposed method allows one to evaluate the global influence of measurement sparsity, uncertainty, and prior knowledge on the parameter estimates. This can help in designing further experiments leading to improved parameter estimates.

  5. Set-base dynamical parameter estimation and model invalidation for biochemical reaction networks

    PubMed Central

    2010-01-01

    Background Mathematical modeling and analysis have become, for the study of biological and cellular processes, an important complement to experimental research. However, the structural and quantitative knowledge available for such processes is frequently limited, and measurements are often subject to inherent and possibly large uncertainties. This results in competing model hypotheses, whose kinetic parameters may not be experimentally determinable. Discriminating among these alternatives and estimating their kinetic parameters is crucial to improve the understanding of the considered process, and to benefit from the analytical tools at hand. Results In this work we present a set-based framework that allows one to discriminate between competing model hypotheses and to provide guaranteed outer estimates on the model parameters that are consistent with the (possibly sparse and uncertain) experimental measurements. This is obtained by means of exact proofs of model invalidity that exploit the polynomial/rational structure of biochemical reaction networks, and by making use of an efficient strategy to balance solution accuracy and computational effort. Conclusions The practicability of our approach is illustrated with two case studies. The first study shows that our approach allows one to conclusively rule out wrong model hypotheses. The second study focuses on parameter estimation, and shows that the proposed method allows one to evaluate the global influence of measurement sparsity, uncertainty, and prior knowledge on the parameter estimates. This can help in designing further experiments leading to improved parameter estimates. PMID:20500862
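The core idea of set-based invalidation can be sketched in a few lines: outer-bound the model output over a box of parameters, and if the measurement interval cannot intersect those bounds, the whole box is provably inconsistent. The sketch below uses naive corner evaluation (valid here because the hypothetical rate law is monotone in each parameter) rather than the paper's guaranteed convex relaxations; the model and numbers are illustrative only.

```python
from itertools import product

def output_bounds(model, box, x):
    """Outer-bound model(x; params) over a parameter box by corner
    evaluation. Sufficient here because the model is monotone in each
    parameter; the paper instead uses guaranteed convex relaxations."""
    vals = [model(x, *corner) for corner in product(*box)]
    return min(vals), max(vals)

def invalidated(model, box, x, y_lo, y_hi):
    """True if no parameter in the box can produce an output inside the
    measurement interval [y_lo, y_hi] -> certificate of invalidity."""
    lo, hi = output_bounds(model, box, x)
    return hi < y_lo or lo > y_hi

# Hypothetical Michaelis-Menten-type rate law: v = vmax * s / (km + s)
rate = lambda s, vmax, km: vmax * s / (km + s)

box = [(1.0, 2.0), (0.5, 1.0)]                  # vmax in [1,2], km in [0.5,1]
print(invalidated(rate, box, 1.0, 0.9, 1.4))    # consistent -> False
print(invalidated(rate, box, 1.0, 1.5, 2.0))    # unreachable -> True
```

Bisecting boxes that are neither invalidated nor fully consistent yields the guaranteed outer parameter estimates the abstract describes.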

  6. Context-based virtual metrology

    NASA Astrophysics Data System (ADS)

    Ebersbach, Peter; Urbanowicz, Adam M.; Likhachev, Dmitriy; Hartig, Carsten; Shifrin, Michael

    2018-03-01

    Hybrid and data feed-forward methodologies are well established for advanced optical process control solutions in high-volume semiconductor manufacturing. Appropriate information from previous measurements, transferred into advanced optical model(s) at subsequent step(s), enhances the accuracy and precision of the measured topographic (thicknesses, critical dimensions, etc.) and material parameters. In some cases, hybrid or feed-forward data are missing or invalid for individual dies or for a whole wafer. We focus on virtual metrology approaches to re-create hybrid or feed-forward data inputs in high-volume manufacturing. We discuss reconstruction of missing data inputs based on various interpolation and extrapolation schemes using information about the wafer's process history. Moreover, we demonstrate a data reconstruction approach based on machine learning techniques utilizing an optical model and measured spectra. Finally, we investigate metrics that allow one to assess the error margin of virtual data inputs.
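The simplest of the reconstruction schemes described above is interpolation over process order. A minimal numpy sketch with hypothetical film-thickness values (the real inputs, units, and scheme selection are process-specific):

```python
import numpy as np

# Hypothetical feed-forward film-thickness values (nm) by wafer slot;
# NaN marks wafers where the upstream measurement is missing or invalid.
slots = np.arange(8)
thickness = np.array([101.2, 100.8, np.nan, 100.1, np.nan, 99.5, 99.2, 98.9])

missing = np.isnan(thickness)
# Reconstruct the gaps by linear interpolation over process history,
# one simple instance of the interpolation schemes described above.
thickness[missing] = np.interp(slots[missing], slots[~missing],
                               thickness[~missing])
print(thickness.round(2))
```

Extrapolation at lot boundaries and the machine-learning variant (predicting the input from measured spectra) would replace `np.interp` with a fitted model.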

  7. Preliminary design specifications of a calcium model

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A list of objectives, requirements, and guidelines is given for a calcium model. Existing models are reviewed and evaluated in relation to the stated objectives and requirements. The reviewed models were either too abstract or apparently invalidated. A technical approach to the design of a desirable model is identified.

  8. [Modelling of the costs of productivity losses due to smoking in Germany for the year 2005].

    PubMed

    Prenzler, A; Mittendorf, T; von der Schulenburg, J M

    2007-11-01

    The aim of this study was to estimate disease-related productivity costs attributable to smoking in the year 2005 in Germany. The calculation was based on the updated relative smoking-related disease risk found in the US Cancer Prevention Study II combined with data on smoking prevalence for Germany. With this, smoking-attributable cases resulting in premature mortality, invalidity, and temporary disability to work could be estimated. Neoplasms, diseases of the circulatory and the respiratory systems as well as health problems in children younger than one year were considered in the analysis. The human capital approach was applied to calculate years of potential work loss and productivity costs as a result of smoking. Various sensitivity analyses were conducted to test for robustness of the underlying model. Based on the assumptions within the model, 107,389 deaths, 14,112 invalidity cases, and 1.19 million cases of temporary disability to work were found to be due to smoking in 2005 in Germany. As a result, smoking caused productivity costs of 9.6 billion euros. The model showed that smoking has a substantial financial impact. Even so, further analyses are necessary to estimate the overall impact of smoking on German society.

  9. The challenge of mapping the human connectome based on diffusion tractography.

    PubMed

    Maier-Hein, Klaus H; Neher, Peter F; Houde, Jean-Christophe; Côté, Marc-Alexandre; Garyfallidis, Eleftherios; Zhong, Jidan; Chamberland, Maxime; Yeh, Fang-Cheng; Lin, Ying-Chia; Ji, Qing; Reddick, Wilburn E; Glass, John O; Chen, David Qixiang; Feng, Yuanjing; Gao, Chengfeng; Wu, Ye; Ma, Jieyan; Renjie, H; Li, Qiang; Westin, Carl-Fredrik; Deslauriers-Gauthier, Samuel; González, J Omar Ocegueda; Paquette, Michael; St-Jean, Samuel; Girard, Gabriel; Rheault, François; Sidhu, Jasmeen; Tax, Chantal M W; Guo, Fenghua; Mesri, Hamed Y; Dávid, Szabolcs; Froeling, Martijn; Heemskerk, Anneriet M; Leemans, Alexander; Boré, Arnaud; Pinsard, Basile; Bedetti, Christophe; Desrosiers, Matthieu; Brambati, Simona; Doyon, Julien; Sarica, Alessia; Vasta, Roberta; Cerasa, Antonio; Quattrone, Aldo; Yeatman, Jason; Khan, Ali R; Hodges, Wes; Alexander, Simon; Romascano, David; Barakovic, Muhamed; Auría, Anna; Esteban, Oscar; Lemkaddem, Alia; Thiran, Jean-Philippe; Cetingul, H Ertan; Odry, Benjamin L; Mailhe, Boris; Nadar, Mariappan S; Pizzagalli, Fabrizio; Prasad, Gautam; Villalon-Reina, Julio E; Galvis, Justin; Thompson, Paul M; Requejo, Francisco De Santiago; Laguna, Pedro Luque; Lacerda, Luis Miguel; Barrett, Rachel; Dell'Acqua, Flavio; Catani, Marco; Petit, Laurent; Caruyer, Emmanuel; Daducci, Alessandro; Dyrby, Tim B; Holland-Letz, Tim; Hilgetag, Claus C; Stieltjes, Bram; Descoteaux, Maxime

    2017-11-07

    Tractography based on non-invasive diffusion imaging is central to the study of human brain connectivity. To date, the approach has not been systematically validated in ground truth studies. Based on a simulated human brain data set with ground truth tracts, we organized an open international tractography challenge, which resulted in 96 distinct submissions from 20 research groups. Here, we report the encouraging finding that most state-of-the-art algorithms produce tractograms containing 90% of the ground truth bundles (to at least some extent). However, the same tractograms contain many more invalid than valid bundles, and half of these invalid bundles occur systematically across research groups. Taken together, our results demonstrate and confirm fundamental ambiguities inherent in tract reconstruction based on orientation information alone, which need to be considered when interpreting tractography and connectivity results. Our approach provides a novel framework for estimating reliability of tractography and encourages innovation to address its current limitations.

  10. [Assessment of invalidity as a result of infectious diseases].

    PubMed

    Čeledová, L; Čevela, R; Bosák, M

    2016-01-01

    The article features the new medical assessment paradigm for invalidity as a result of infectious disease, which has been applied since 1 January 2010. The invalidity assessment criteria are regulated specifically by Regulation No. 359/2009. Chapter I of the Annexe to the invalidity assessment regulation addresses the area of infectious diseases with respect to functional impairment and its impact on the quality of life. Since 2010, invalidity has also been newly categorized into three groups. The new assessment approach makes it possible to evaluate a person's functional capacity, type of disability, and eligibility for compensation for reduced capacity for work. In 2010, a total of 170 375 invalidity cases were assessed, and in 2014, 147 121 invalidity assessments were made. Invalidity as a result of infectious disease was assessed in 177 persons in 2010, and 128 such assessments were made in 2014. The most common causes of invalidity as a result of infectious disease are chronic viral hepatitis, other spirochetal infections, tuberculosis of the respiratory tract, tick-borne viral encephalitis, and HIV/AIDS. The number of assessments of invalidity as a result of infectious disease showed a declining trend between 2010 and 2014, similarly to the total of invalidity assessments. In spite of this fact, cases of invalidity as a result of infectious disease account for approximately half a percent of all invalidity assessments made in the above-mentioned period.

  11. Univariate time series modeling and an application to future claims amount in SOCSO's invalidity pension scheme

    NASA Astrophysics Data System (ADS)

    Chek, Mohd Zaki Awang; Ahmad, Abu Bakar; Ridzwan, Ahmad Nur Azam Ahmad; Jelas, Imran Md.; Jamal, Nur Faezah; Ismail, Isma Liana; Zulkifli, Faiz; Noor, Syamsul Ikram Mohd

    2012-09-01

    The main objective of this study is to forecast the future claims amount of the Invalidity Pension Scheme (IPS). All data were derived from SOCSO annual reports from 1972 to 2010. These claims comprise all claims amounts from the 7 benefits offered by SOCSO, namely Invalidity Pension, Invalidity Grant, Survivors Pension, Constant Attendance Allowance, Rehabilitation, Funeral and Education. Future claims under the Invalidity Pension Scheme are predicted using univariate forecasting models for the workforce in Malaysia.
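One common univariate forecasting model for a trending annual series is Holt's linear-trend exponential smoothing. A hedged sketch with hypothetical claims amounts (the study's actual model choice, smoothing constants, and SOCSO data are not reproduced here):

```python
def holt_forecast(series, alpha=0.5, beta=0.3, horizon=3):
    """Holt's linear-trend exponential smoothing: maintain a smoothed
    level and trend, then project the trend forward `horizon` steps."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        last_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(horizon)]

# Hypothetical annual claims amounts (in millions)
claims = [120.0, 131.0, 145.0, 160.0, 172.0]
print([round(f, 1) for f in holt_forecast(claims)])
```

For real claims data one would select among candidate models (ARIMA, smoothing variants) by out-of-sample error before trusting the projections.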

  12. In defence of model-based inference in phylogeography

    PubMed Central

    Beaumont, Mark A.; Nielsen, Rasmus; Robert, Christian; Hey, Jody; Gaggiotti, Oscar; Knowles, Lacey; Estoup, Arnaud; Panchal, Mahesh; Corander, Jukka; Hickerson, Mike; Sisson, Scott A.; Fagundes, Nelson; Chikhi, Lounès; Beerli, Peter; Vitalis, Renaud; Cornuet, Jean-Marie; Huelsenbeck, John; Foll, Matthieu; Yang, Ziheng; Rousset, Francois; Balding, David; Excoffier, Laurent

    2017-01-01

    Recent papers have promoted the view that model-based methods in general, and those based on Approximate Bayesian Computation (ABC) in particular, are flawed in a number of ways, and are therefore inappropriate for the analysis of phylogeographic data. These papers further argue that Nested Clade Phylogeographic Analysis (NCPA) offers the best approach in statistical phylogeography. In order to remove the confusion and misconceptions introduced by these papers, we justify and explain the reasoning behind model-based inference. We argue that ABC is a statistically valid approach, alongside other computational statistical techniques that have been successfully used to infer parameters and compare models in population genetics. We also examine the NCPA method and highlight numerous deficiencies, whether used with single or multiple loci. We further show that the ages of clades are carelessly used to infer ages of demographic events, and that these ages are estimated under a simple model of panmixia and population stationarity but are then used under different and unspecified models to test hypotheses, a usage that invalidates these testing procedures. We conclude by encouraging researchers to study and use model-based inference in population genetics. PMID:29284924
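The ABC principle the authors defend fits in a dozen lines: draw parameters from the prior, simulate data, and keep draws whose simulated summary matches the observed one. A toy coin-flip instance (not a phylogeographic model; all values hypothetical):

```python
import random

def abc_rejection(observed_heads, n_flips, n_sims=20000, tol=0, seed=1):
    """Minimal ABC rejection sampler: draw p from a uniform prior,
    simulate data under p, and accept p when the simulated summary
    matches the observed one within `tol`."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_sims):
        p = rng.random()                                  # prior draw
        heads = sum(rng.random() < p for _ in range(n_flips))
        if abs(heads - observed_heads) <= tol:            # rejection step
            accepted.append(p)
    return accepted

posterior = abc_rejection(observed_heads=7, n_flips=10)
print(round(sum(posterior) / len(posterior), 2))  # ≈ posterior mean of p
```

With `tol=0` and a sufficient summary this targets the exact posterior (here Beta(8, 4)); realistic applications trade tolerance and summary choice against accuracy, which is where the methodological debate lies.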

  13. Racial identity invalidation with multiracial individuals: An instrument development study.

    PubMed

    Franco, Marisa G; O'Brien, Karen M

    2018-01-01

    Racial identity invalidation, others' denial of an individual's racial identity, is a salient racial stressor with harmful effects on the mental health and well-being of Multiracial individuals. The purpose of this study was to create a psychometrically sound measure to assess racial identity invalidation for use with Multiracial individuals (N = 497). The present sample was mostly female (75%) with a mean age of 26.52 years (SD = 9.60). The most common racial backgrounds represented were Asian/White (33.4%) and Black/White (23.7%). Participants completed several online measures via Qualtrics. Exploratory factor analyses revealed 3 racial identity invalidation factors: behavior invalidation, phenotype invalidation, and identity incongruent discrimination. A confirmatory factor analysis provided support for the initial factor structure. Alternative model testing indicated that the bifactor model was superior to the 3-factor model. Thus, a total score and/or 3 subscale scores can be used when administering this instrument. Support was found for the reliability and validity of the total scale and subscales. In line with the minority stress theory, challenges with racial identity mediated relationships between racial identity invalidation and mental health and well-being outcomes. The findings highlight the different dimensions of racial identity invalidation and indicate their negative associations with connectedness and psychological well-being.

  14. Association among self-compassion, childhood invalidation, and borderline personality disorder symptomatology in a Singaporean sample.

    PubMed

    Keng, Shian-Ling; Wong, Yun Yi

    2017-01-01

    Linehan's biosocial theory posits that parental invalidation during childhood plays a role in the development of borderline personality disorder symptoms later in life. However, little research has examined components of the biosocial model in an Asian context, or variables that may influence the relationship between childhood invalidation and borderline symptoms. Self-compassion is increasingly regarded as an adaptive way to regulate one's emotions and to relate to oneself, and may serve to moderate the association between invalidation and borderline symptoms. The present study investigated the association among childhood invalidation, self-compassion, and borderline personality disorder symptoms in a sample of Singaporean undergraduate students. Two hundred and ninety undergraduate students from a large Singaporean university were recruited and completed measures assessing childhood invalidation, self-compassion, and borderline personality disorder symptoms. Analyses using multiple regression indicated that both childhood invalidation and self-compassion significantly predicted borderline personality disorder symptomatology. Results from moderation analyses indicated that the relationship between childhood invalidation and borderline personality disorder symptomatology did not vary as a function of self-compassion. This study provides evidence in support of aspects of the biosocial model in an Asian context, and demonstrates a strong association between self-compassion and borderline personality disorder symptoms, independent of one's history of parental invalidation during childhood.

  15. Using "Excel" for White's Test--An Important Technique for Evaluating the Equality of Variance Assumption and Model Specification in a Regression Analysis

    ERIC Educational Resources Information Center

    Berenson, Mark L.

    2013-01-01

    There is consensus in the statistical literature that severe departures from its assumptions invalidate the use of regression modeling for purposes of inference. The assumptions of regression modeling are usually evaluated subjectively through visual, graphic displays in a residual analysis but such an approach, taken alone, may be insufficient…
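White's test itself is straightforward to reproduce outside a spreadsheet. A numpy sketch of the LM statistic n·R² from the auxiliary regression of squared residuals on the regressors, their squares, and cross-products, on simulated data (this is not the article's worked Excel example):

```python
import numpy as np

def white_lm_stat(y, X):
    """White's test statistic n*R^2 from the auxiliary regression of
    squared OLS residuals on the regressors, squares, and cross-products.
    Under homoscedasticity it is asymptotically chi-squared with
    (aux columns - 1) degrees of freedom. Sketch only; verify against a
    statistics package before relying on it."""
    n = len(y)
    X1 = np.column_stack([np.ones(n), X])
    resid = y - X1 @ np.linalg.lstsq(X1, y, rcond=None)[0]
    e2 = resid ** 2
    # auxiliary design: constant, X, squares, pairwise products
    cols = [np.ones(n)] + [X[:, i] for i in range(X.shape[1])]
    for i in range(X.shape[1]):
        for j in range(i, X.shape[1]):
            cols.append(X[:, i] * X[:, j])
    Z = np.column_stack(cols)
    fitted = Z @ np.linalg.lstsq(Z, e2, rcond=None)[0]
    r2 = 1 - ((e2 - fitted) ** 2).sum() / ((e2 - e2.mean()) ** 2).sum()
    return n * r2, Z.shape[1] - 1

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y_hom = X @ [1.0, -2.0] + rng.normal(size=200)                      # equal variance
y_het = X @ [1.0, -2.0] + rng.normal(size=200) * (1 + 2 * X[:, 0] ** 2)
s_hom, _ = white_lm_stat(y_hom, X)
s_het, _ = white_lm_stat(y_het, X)
print(s_hom < s_het)   # the statistic is expected to be far larger under heteroscedasticity
```

Comparing the statistic to the chi-squared critical value (or computing its p-value) completes the test.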

  16. A mapping closure for turbulent scalar mixing using a time-evolving reference field

    NASA Technical Reports Server (NTRS)

    Girimaji, Sharath S.

    1992-01-01

    A general mapping-closure approach for modeling scalar mixing in homogeneous turbulence is developed. This approach is different from the previous methods in that the reference field also evolves according to the same equations as the physical scalar field. The use of a time-evolving Gaussian reference field results in a model that is similar to the mapping closure model of Pope (1991), which is based on the methodology of Chen et al. (1989). Both models yield identical relationships between the scalar variance and higher-order moments, which are in good agreement with heat conduction simulation data and can be consistent with any type of epsilon(phi) evolution. The present methodology can be extended to any reference field whose behavior is known. The possibility of a beta-pdf reference field is explored. The shortcomings of the mapping closure methods are discussed, and the limit at which the mapping becomes invalid is identified.

  17. Parental Invalidation and the Development of Narcissism.

    PubMed

    Huxley, Elizabeth; Bizumic, Boris

    2017-02-17

    Parenting behaviors and childhood experiences have played a central role in theoretical approaches to the etiology of narcissism. Research has suggested an association between parenting and narcissism; however, it has been limited in its examination of different narcissism subtypes and individual differences in parenting behaviors. This study investigates the influence of perceptions of parental invalidation, an important aspect of parenting behavior theoretically associated with narcissism. Correlational and hierarchical regression analyses were conducted using a sample of 442 Australian participants to examine the relationship between invalidating behavior from mothers and fathers, and grandiose and vulnerable narcissism. Results indicate that stronger recollections of invalidating behavior from either mothers or fathers are associated with higher levels of grandiose and vulnerable narcissism when controlling for age, gender, and the related parenting behaviors of rejection, coldness, and overprotection. The lowest levels of narcissism were found in individuals who reported low levels of invalidation in both parents. These findings support the idea that parental invalidation is associated with narcissism.

  18. A review and comparison of Bayesian and likelihood-based inferences in beta regression and zero-or-one-inflated beta regression.

    PubMed

    Liu, Fang; Eugenio, Evercita C

    2018-04-01

    Beta regression is an increasingly popular statistical technique in medical research for modeling of outcomes that assume values in (0, 1), such as proportions and patient reported outcomes. When outcomes take values in the intervals [0,1), (0,1], or [0,1], zero-or-one-inflated beta (zoib) regression can be used. We provide a thorough review on beta regression and zoib regression in the modeling, inferential, and computational aspects via the likelihood-based and Bayesian approaches. We demonstrate the statistical and practical importance of correctly modeling the inflation at zero/one rather than ad hoc replacing them with values close to zero/one via simulation studies; the latter approach can lead to biased estimates and invalid inferences. We show via simulation studies that the likelihood-based approach is computationally faster in general than MCMC algorithms used in the Bayesian inferences, but runs the risk of non-convergence, large biases, and sensitivity to starting values in the optimization algorithm especially with clustered/correlated data, data with sparse inflation at zero and one, and data that warrant regularization of the likelihood. The disadvantages of the regular likelihood-based approach make the Bayesian approach an attractive alternative in these cases. Software packages and tools for fitting beta and zoib regressions in both the likelihood-based and Bayesian frameworks are also reviewed.
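The zoib density the review describes is a three-part mixture: point masses at zero and one plus a rescaled beta density on (0, 1). A small pure-Python sketch with hypothetical parameter values, checking numerically that the mixture carries total probability one:

```python
from math import lgamma, exp, log

def beta_pdf(y, mu, phi):
    """Beta density in the mean/precision parameterization used in beta
    regression: shape1 = mu*phi, shape2 = (1-mu)*phi."""
    a, b = mu * phi, (1.0 - mu) * phi
    return exp(lgamma(a + b) - lgamma(a) - lgamma(b)
               + (a - 1.0) * log(y) + (b - 1.0) * log(1.0 - y))

def zoib_density(y, p0, p1, mu, phi):
    """Zero-or-one-inflated beta: point masses p0 at 0 and p1 at 1, with
    the remaining mass spread as Beta(mu*phi, (1-mu)*phi) on (0, 1)."""
    if y == 0.0:
        return p0
    if y == 1.0:
        return p1
    return (1.0 - p0 - p1) * beta_pdf(y, mu, phi)

# hypothetical parameters; total probability = p0 + p1 + continuous mass
p0, p1, mu, phi = 0.1, 0.05, 0.4, 8.0
n = 50000
cont = sum(zoib_density((i + 0.5) / n, p0, p1, mu, phi) for i in range(n)) / n
print(round(p0 + p1 + cont, 3))  # ≈ 1.0
```

The ad hoc alternative criticized above (replacing exact zeros/ones with values near the boundary) forces the beta part to absorb that probability mass, which is what biases the estimates.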

  19. Invalid-point removal based on epipolar constraint in the structured-light method

    NASA Astrophysics Data System (ADS)

    Qi, Zhaoshuai; Wang, Zhao; Huang, Junhui; Xing, Chao; Gao, Jianmin

    2018-06-01

    In structured-light measurement, there unavoidably exist many invalid points caused by shadows, image noise and ambient light. According to the property of the epipolar constraint, because the retrieved phase of the invalid point is inaccurate, the corresponding projector image coordinate (PIC) will not satisfy the epipolar constraint. Based on this fact, a new invalid-point removal method based on the epipolar constraint is proposed in this paper. First, the fundamental matrix of the measurement system is calculated, which will be used for calculating the epipolar line. Then, according to the retrieved phase map of the captured fringes, the PICs of each pixel are retrieved. Subsequently, the epipolar line in the projector image plane of each pixel is obtained using the fundamental matrix. The distance between the corresponding PIC and the epipolar line of a pixel is defined as the invalidation criterion, which quantifies the satisfaction degree of the epipolar constraint. Finally, all pixels with a distance larger than a certain threshold are removed as invalid points. Experiments verified that the method is easy to implement and demonstrates better performance than state-of-the-art measurement systems.
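The invalidation criterion above reduces to a point-to-line distance: the epipolar line of a camera pixel is l = F x, and a valid projector image coordinate must lie near it. A numpy sketch with a hypothetical rectified system (the fundamental matrix and thresholds of a real setup come from calibration):

```python
import numpy as np

def epipolar_distance(F, cam_pt, proj_pt):
    """Distance from the retrieved projector image coordinate (PIC) to
    the epipolar line l = F @ x of the camera pixel, in pixels."""
    x = np.append(cam_pt, 1.0)                  # homogeneous camera pixel
    l = F @ x                                   # epipolar line (a, b, c)
    return abs(l @ np.append(proj_pt, 1.0)) / np.hypot(l[0], l[1])

def remove_invalid(F, cam_pts, proj_pts, threshold=1.0):
    """Keep only correspondences whose PIC satisfies the epipolar
    constraint to within `threshold` pixels."""
    return [(c, p) for c, p in zip(cam_pts, proj_pts)
            if epipolar_distance(F, c, p) <= threshold]

# Hypothetical rectified system: F maps a camera pixel to the projector
# line of equal y (y_proj = y_cam).
F = np.array([[0.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])
cam  = [(10.0, 20.0), (30.0, 40.0)]
proj = [(105.0, 20.3), (260.0, 55.0)]    # the second point violates the constraint
print(len(remove_invalid(F, cam, proj)))  # 1
```

In the paper's pipeline the PICs come from the retrieved phase map, so a corrupted phase (shadow, noise, ambient light) shows up directly as a large epipolar distance.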

  20. Feature integration, attention, and fixations during visual search.

    PubMed

    Khani, Abbas; Ordikhani-Seyedlar, Mehdi

    2017-01-01

    We argue that mechanistic premises of "item-based" theories are not invalidated by the fixation-based approach. We use item-based theories to propose an account that does not advocate strict serial item processing and integrates fixations. The main focus of this account is feature integration within fixations. We also suggest that perceptual load determines the size of the fixations.

  1. Dynamic Computation of Change Operations in Version Management of Business Process Models

    NASA Astrophysics Data System (ADS)

    Küster, Jochen Malte; Gerth, Christian; Engels, Gregor

    Version management of business process models requires that changes can be resolved by applying change operations. In order to give a user maximal freedom concerning the application order of change operations, position parameters of change operations must be computed dynamically during change resolution. In such an approach, change operations with computed position parameters must be applicable to the model, and dependencies and conflicts of change operations must be taken into account, because otherwise invalid models can be constructed. In this paper, we study the concept of partially specified change operations whose parameters are computed dynamically. We provide a formalization for partially specified change operations using graph transformation, together with a concept for their applicability. Based on this, we study potential dependencies and conflicts of change operations and show how these can be taken into account within change resolution. Using our approach, a user can resolve changes of business process models without being unnecessarily restricted to a certain order.
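A toy model of the conflict check can make the idea concrete: two change operations conflict when they target the same model element and one of them removes it, so applying both (in some order) would construct an invalid model. This is far simpler than the paper's graph-transformation formalization; the operation kinds and element ids are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ChangeOp:
    kind: str       # "insert", "delete", or "move"
    element: str    # hypothetical process-model element id

def conflicts(a, b):
    """Toy conflict rule: same target element and at least one deletion.
    The paper derives such conflicts from graph-transformation rules."""
    return a.element == b.element and "delete" in (a.kind, b.kind)

def applicable_order(ops):
    """Greedy resolution sketch: apply each operation unless it conflicts
    with one already applied (which would yield an invalid model)."""
    applied = []
    for op in ops:
        if not any(conflicts(op, prev) for prev in applied):
            applied.append(op)
    return applied

ops = [ChangeOp("move", "TaskA"), ChangeOp("delete", "TaskA"),
       ChangeOp("insert", "TaskB")]
print(len(applicable_order(ops)))  # the delete conflicts with the move: 2
```

Dependency handling (an operation whose position parameter refers to an element another operation inserts) would add an ordering constraint on top of this conflict check.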

  2. Internal Invalidity in Pretest-Posttest Self-Report Evaluations and a Re-Evaluation of Retrospective Pretests.

    ERIC Educational Resources Information Center

    Howard, George S.; And Others

    1979-01-01

    True experimental designs are thought to provide internally valid results. In this investigation of the evaluations of five interventions, a source of internal invalidity is identified when self-report measures are used. An alternative approach is presented and implications of the findings for evaluation research are discussed. (JKS)

  3. An Updated Protocol to Detect Invalid Entries in an Online Survey of Men Who Have Sex with Men (MSM): How Do Valid and Invalid Submissions Compare?

    PubMed Central

    Konstan, Joseph; Iantaffi, Alex; Wilkerson, J. Michael; Galos, Dylan; Simon Rosser, B. R.

    2017-01-01

    Researchers use protocols to screen for suspicious survey submissions in online studies. We evaluated how well a de-duplication and cross-validation process detected invalid entries. Data were from the Sexually Explicit Media Study, an Internet-based HIV prevention survey of men who have sex with men. Using our protocol, 146 (11.6 %) of 1254 entries were identified as invalid. Most indicated changes to the screening questionnaire to gain entry (n = 109, 74.7 %), matched other submissions’ payment profiles (n = 56, 41.8 %), or featured an IP address that was recorded previously (n = 43, 29.5 %). We found few demographic or behavioral differences between valid and invalid samples, however. Invalid submissions had lower odds of reporting HIV testing in the past year (OR 0.63), and higher odds of requesting no payment compared to check payments (OR 2.75). Thus, rates of HIV testing would have been underestimated if invalid submissions had not been removed, and payment may not be the only incentive for invalid participation. PMID:25805443
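Two of the protocol's criteria (repeated IP address, matching payment profile) amount to set-membership checks over prior submissions. A hedged sketch with hypothetical field names and records, not the study's actual screening pipeline:

```python
def flag_invalid(submissions):
    """Flag likely-invalid survey submissions that repeat an earlier
    IP address or payment profile. Field names are hypothetical; a real
    protocol adds screening-answer changes and completion-time checks."""
    seen_ips, seen_pay, flags = set(), set(), []
    for s in submissions:
        reasons = []
        if s["ip"] in seen_ips:
            reasons.append("duplicate-ip")
        if s["payment"] in seen_pay:
            reasons.append("duplicate-payment")
        seen_ips.add(s["ip"])
        seen_pay.add(s["payment"])
        flags.append(reasons)
    return flags

subs = [
    {"ip": "198.51.100.7", "payment": "acct-1"},
    {"ip": "203.0.113.9",  "payment": "acct-2"},
    {"ip": "198.51.100.7", "payment": "acct-3"},   # same IP as the first
]
print(flag_invalid(subs))  # [[], [], ['duplicate-ip']]
```

As the abstract notes, flagged submissions should then be compared against valid ones before exclusion, since the two groups can differ on outcome measures such as HIV testing rates.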

  4. Income, Deprivation and Economic Stress in the Enlarged European Union

    ERIC Educational Resources Information Center

    Whelan, Christopher T.; Maitre, Bertrand

    2007-01-01

    At risk of poverty indicators based on relative income measures suggest that within the enlarged EU societies located at quite different points on a continuum of affluence have similar levels of poverty. Substantial differences in levels of income between societies do not in themselves invalidate this approach. However, the relative income…

  5. Evaluating the accuracy of the Wechsler Memory Scale-Fourth Edition (WMS-IV) logical memory embedded validity index for detecting invalid test performance.

    PubMed

    Soble, Jason R; Bain, Kathleen M; Bailey, K Chase; Kirton, Joshua W; Marceaux, Janice C; Critchfield, Edan A; McCoy, Karin J M; O'Rourke, Justin J F

    2018-01-08

    Embedded performance validity tests (PVTs) allow for continuous assessment of invalid performance throughout neuropsychological test batteries. This study evaluated the utility of the Wechsler Memory Scale-Fourth Edition (WMS-IV) Logical Memory (LM) Recognition score as an embedded PVT using the Advanced Clinical Solutions (ACS) for WAIS-IV/WMS-IV Effort System. This mixed clinical sample comprised 97 participants, 71 of whom were classified as valid and 26 as invalid based on three well-validated, freestanding criterion PVTs. Overall, the LM embedded PVT demonstrated poor concordance with the criterion PVTs and unacceptable psychometric properties using ACS validity base rates (42% sensitivity/79% specificity). Moreover, 15-39% of participants obtained an invalid ACS base rate despite having a normatively intact age-corrected LM Recognition total score. Receiver operating characteristic curve analysis revealed that a Recognition total score cutoff of < 61% correct improved specificity (92%) while sensitivity remained weak (31%). Thus, results indicated the LM Recognition embedded PVT is not appropriate for use from an evidence-based perspective, and that clinicians may be faced with reconciling how a normatively intact cognitive performance on the Recognition subtest could simultaneously reflect invalid performance validity.
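The sensitivity/specificity figures above come from cross-tabulating the embedded score against the criterion-PVT labels at a chosen cutoff. A minimal sketch with hypothetical scores and labels (not the study's data):

```python
def sens_spec(scores, invalid, cutoff):
    """Sensitivity/specificity of the rule 'score < cutoff -> invalid',
    judged against criterion-PVT labels (True = invalid performance)."""
    tp = sum(s < cutoff and inv for s, inv in zip(scores, invalid))
    fn = sum(s >= cutoff and inv for s, inv in zip(scores, invalid))
    tn = sum(s >= cutoff and not inv for s, inv in zip(scores, invalid))
    fp = sum(s < cutoff and not inv for s, inv in zip(scores, invalid))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical Recognition percent-correct scores with criterion labels
scores  = [0.95, 0.88, 0.55, 0.72, 0.40, 0.90, 0.58, 0.65]
invalid = [False, False, True, False, True, False, False, True]
sens, spec = sens_spec(scores, invalid, cutoff=0.61)
print(round(sens, 2), round(spec, 2))  # 0.67 0.8
```

Sweeping the cutoff over all observed scores and plotting the resulting pairs is exactly the ROC analysis the study used to find the < 61% cutoff.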

  6. Visual judgment of similarity across shape transformations: evidence for a compositional model of articulated objects.

    PubMed

    Barenholtz, Elan; Tarr, Michael J

    2008-06-01

    A single biological object, such as a hand, can assume multiple, very different shapes due to the articulation of its parts. Yet we are able to recognize all of these shapes as examples of the same object. How is this invariance to pose achieved? Here, we present evidence that the visual system maintains a model of object transformation that is based on rigid, convex parts articulating at extrema of negative curvature, i.e., part boundaries. We compared similarity judgments in a task in which subjects had to decide which of two transformed versions of a 'base' shape (one a 'biologically valid' articulation and one a geometrically similar but 'biologically invalid' articulation) was more similar to the base shape. Two types of comparisons were made: in the figure/ground reversal, the invalid articulation consisted of exactly the same contour transformation as the valid one but with reversed figural polarity. In the axis-of-rotation reversal, the valid articulation consisted of a part rotated around its concave part boundaries, while the invalid articulation consisted of the same part rotated around the endpoints on the opposite side of the part. In two separate 2AFC similarity experiments (one in which the base and transformed shapes were presented simultaneously and one in which they were presented sequentially), subjects were more likely to match the base shape to a transform when it corresponded to a legitimate articulation. These results suggest that the visual system maintains expectations about the way objects will transform, based on their static geometry.

  7. On the implications of the classical ergodic theorems: analysis of developmental processes has to focus on intra-individual variation.

    PubMed

    Molenaar, Peter C M

    2008-01-01

    It is argued that general mathematical-statistical theorems imply that standard statistical analysis techniques of inter-individual variation are invalid for investigating developmental processes. Developmental processes have to be analyzed at the level of individual subjects, using time series data characterizing the patterns of intra-individual variation. It is shown that standard statistical techniques based on the analysis of inter-individual variation are insensitive to the presence of arbitrarily large degrees of inter-individual heterogeneity in the population. An important class of nonlinear epigenetic models of neural growth is described which can explain the occurrence of such heterogeneity in brain structures and behavior. Links with models of developmental instability are discussed. A simulation study based on a chaotic growth model illustrates the invalidity of standard analysis of inter-individual variation, whereas time series analysis of intra-individual variation is able to recover the true state of affairs. (c) 2007 Wiley Periodicals, Inc.
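    The core of the ergodicity argument can be illustrated with a minimal simulation, assuming hypothetical subjects whose scores combine a stable person-specific mean with independent within-person fluctuation (a deliberately simple stand-in for the article's chaotic growth model): cross-sectional inter-individual variance mixes both sources, whereas per-subject time series isolate the within-person component.

```python
import random
import statistics

random.seed(1)

# Hypothetical population: each subject has a stable person-specific mean
# (between-person heterogeneity, sd = 2) plus within-person noise (sd = 1).
n_subjects, n_times = 200, 100
subjects = []
for _ in range(n_subjects):
    mu_i = random.gauss(0, 2)
    subjects.append([mu_i + random.gauss(0, 1) for _ in range(n_times)])

# Inter-individual analysis: variance across subjects at a single occasion
# mixes both sources (about 2**2 + 1**2 = 5).
var_between = statistics.pvariance([s[0] for s in subjects])

# Intra-individual analysis: each subject's own time series recovers only
# the within-person dynamics (about 1).
var_within = statistics.mean(statistics.pvariance(s) for s in subjects)

print(f"cross-sectional variance: {var_between:.2f}")
print(f"mean within-person variance: {var_within:.2f}")
```

    The two quantities disagree by construction: pooling across people says little about any individual's dynamics, which is the non-ergodicity the abstract invokes.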

  8. [Optimal rehabilitation of patients with coronary heart disease in outpatient setting].

    PubMed

    Korzhenkov, N P; Kuzichkina, S F; Shcherbakova, N A; Kukhaleishvili, N R; Iarlykov, I I

    2012-01-01

    The rehabilitation of disabled patients in Russia is an important state task and dictates the need for an effective state program of primary prevention of cardiovascular diseases. Common global practice follows a medico-social model based on comprehensive, detailed medico-social aid. Rehabilitation of post-myocardial-infarction patients consists of three phases (stages): hospital, post-hospital (readaptation), and post-reconvalescent (supportive). The program includes physical, psychological, and pharmacological rehabilitation. Departments of readaptation and medico-social rehabilitation provide effective delivery of all kinds of rehabilitation. The Moscow North-East Regional Administration has rich experience in organizing departments of readaptation and medico-social rehabilitation. These departments practice an individual approach to patients and work in close contact with bureaus of medico-social expert commissions. Management of patients by a cardiologist, a rehabilitation specialist, and outpatient clinic physicians provides uninterrupted staged rehabilitation, timely correction of pharmacotherapy, and early patient referral for invasive investigation and treatment of coronary heart disease. A course of rehabilitative measures lasts 2 months. Setting up departments of medico-social rehabilitation in outpatient clinics provides more effective use of the funds assigned by the state for the social support of disabled persons.

  9. Combined expectancies: electrophysiological evidence for the adjustment of expectancy effects

    PubMed Central

    Mattler, Uwe; van der Lugt, Arie; Münte, Thomas F

    2006-01-01

    Background When subjects use cues to prepare for a likely stimulus or a likely response, reaction times are facilitated by valid cues but prolonged by invalid cues. In studies on combined expectancy effects, two cues can independently give information regarding two dimensions of the forthcoming task. In certain situations, cueing effects on one dimension are reduced when the cue on the other dimension is invalid. According to the Adjusted Expectancy Model, cues affect different processing levels and a mechanism is presumed which is sensitive to the validity of early level cues and leads to online adjustment of expectancy effects at later levels. To examine the predictions of this model cueing of stimulus modality was combined with response cueing. Results Behavioral measures showed the interaction of cueing effects. Electrophysiological measures of the lateralized readiness potential (LRP) and the N200 amplitude confirmed the predictions of the model. The LRP showed larger effects of response cues on response activation when modality cues were valid rather than invalid. N200 amplitude was largest with valid modality cues and invalid response cues, medium with invalid modality cues, and smallest with two valid cues. Conclusion Findings support the view that the validity of early level expectancies modulates the effects of late level expectancies, which included response activation and response conflict in the present study. PMID:16674805

  10. Invalid performance and the ImPACT in national collegiate athletic association division I football players.

    PubMed

    Szabo, Ashley J; Alosco, Michael L; Fedor, Andrew; Gunstad, John

    2013-01-01

    Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) is a computerized cognitive test battery commonly used for concussion evaluation. An important aspect of these procedures is baseline testing, but researchers have suggested that many users do not use validity indices to ensure adequate effort during testing. No one has examined the prevalence of invalid performance for college football players. To examine the prevalence of invalid scores on ImPACT testing. Cross-sectional study. National Collegiate Athletic Association Division I university. A total of 159 athletes (age = 20.3 ± 1.41 years; range = 17.8-23.7 years) from a Division I collegiate football team participated. An informational intervention regarding the importance of concussion testing to promote safety was administered before testing for the most recent season. We examined preseason ImPACT testing data across a 3-year period (total assessments = 269). Based on invalid and sandbagging indices denoted by the ImPACT manual, protocols were examined to indicate how many invalid indices each athlete had. A total of 27.9% (n = 75) of assessments were suggestive of invalid scores, with 4.1% (n = 11) suggesting invalid responding only, 17.5% (n = 47) indicating "sandbagging" only, and 6.3% (n = 17) showing both invalid and sandbagging responding. The informational intervention did not reduce the prevalence of invalid responding. These findings highlight the need for further information about the ImPACT validity indices and whether they truly reflect poor effort. Future work is needed to identify practices to reliably target and reduce invalid responding.

  11. Base rate of performance invalidity among non-clinical undergraduate research participants.

    PubMed

    Silk-Eglit, Graham M; Stenclik, Jessica H; Gavett, Brandon E; Adams, Jason W; Lynch, Julie K; Mccaffrey, Robert J

    2014-08-01

    Neuropsychological research frequently uses non-clinical undergraduate participants to evaluate neuropsychological tests. However, a recent study by An and colleagues (2012, Archives of Clinical Neuropsychology, 27, 849-857) called into question the extent to which the interpretation of these participants' performance on neuropsychological tests is valid. That study found that in a sample of 36 participants, 55.6% exhibited performance invalidity at an initial session and 30.8% exhibited performance invalidity at a follow-up session. The current study attempted to replicate these findings in a larger, more representative sample using a more rigorous methodology. Archival data from 133 non-clinical undergraduate research participants were analyzed. Participants were classified as performance invalid if they failed any one performance validity test (PVT). In the current sample, only 2.26% of participants exhibited performance invalidity. Thus, concerns regarding insufficient effort and performance invalidity when using undergraduate research participants appear to be overstated. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. Application of dynamic slip wall modeling to a turbine nozzle guide vane

    NASA Astrophysics Data System (ADS)

    Bose, Sanjeeb; Talnikar, Chaitanya; Blonigan, Patrick; Wang, Qiqi

    2015-11-01

    Resolution of near-wall turbulent structures is computationally prohibitive, necessitating wall-modeled large-eddy simulation approaches. Standard wall models are often based on assumptions of equilibrium boundary layers, which do not necessarily account for the dissimilarity of the momentum and thermal boundary layers. We investigate the use of the dynamic slip wall boundary condition (Bose and Moin, 2014) for the prediction of surface heat transfer on a turbine nozzle guide vane (Arts and de Rouvroit, 1992). The heat transfer coefficient is well predicted by the slip wall model, including capturing the transition to turbulence. The sensitivity of the heat transfer coefficient to the incident turbulence intensity will additionally be discussed. Lastly, the behavior of the thermal and momentum slip lengths will be contrasted between regions where the strong Reynolds analogy is invalid (near transition on the suction side) and an isothermal, zero-pressure-gradient flat plate boundary layer (Wu and Moin, 2010).

  13. Vaccines against leptospirosis.

    PubMed

    Adler, Ben

    2015-01-01

    Vaccines against leptospirosis followed within a year of the first isolation of Leptospira, with the first use of a killed whole cell bacterin vaccine in guinea pigs published in 1916. Since then, bacterin vaccines have been used in humans, cattle, swine, and dogs and remain the only vaccines licensed at the present time. The immunity elicited is restricted to serovars with related lipopolysaccharide (LPS) antigen. Likewise, vaccines based on LPS antigens have clearly demonstrated protection in animal models, which is also at best serogroup specific. The advent of leptospiral genome sequences has allowed a reverse vaccinology approach for vaccine development. However, the use of inadequate challenge doses and inappropriate statistical analysis invalidates many of the claims of protection with recombinant proteins.

  14. [The loss of work fitness and the course of invalidism in patients with limb vessel lesions].

    PubMed

    Chernenko, V F; Goncharenko, A G; Shuvalov, A Iu; Chernenko, V V; Tarasov, I V

    2005-01-01

    The growing incidence of peripheral limb vessel disease, with its severe outcomes (trophic ulcers, amputation), appreciably lowers patients' quality of life. This manifests in prolonged loss of work fitness, change of habitual occupation, and establishment of disability. Objective analytical information on this problem helps delineate current tendencies and potential approaches to the prevention of social losses. The present work is based on an analysis of 2115 statements of medico-social expert evaluation (MSEE) of disabled patients suffering from diseases of limb vessels, performed over the last 8 years in the Altai region. The MSEE decisions were based on the results of clinical examination of patients using current diagnostic modalities (ultrasonography, duplex scanning, angiography, etc.). It was established that among persons who had undergone MSEE, over half (64.1%) were under 60 years of age, i.e., of working age. Notably, the overwhelming majority of the disabled were men (83%) and workers (84.2%). As for specific vascular pathologies, the majority of patients presented with obliterative arterial diseases (OAD) of the lower limbs, accounting for 76.3%, whereas patients with venous pathology ranked second (15.9%). The highest severity of disability (groups I and II) was also recorded in OAD (77.5%), especially in atherosclerosis obliterans, which accounted for 84%. Of note, these diseases showed no tendency toward reduced incidence. Periods of temporary disability (from 3 to 9 months) were also most frequently recorded in OAD of the limbs. In OAD, temporary or persistent loss of work fitness was caused by critical ischemia and amputations, whereas in venous pathology, namely varicosity and post-thrombophlebitic syndrome, the cause was progressing chronic venous insufficiency (CVI) complicated by trophic ulcers.
On the whole, the absence of any decrease in the number of disabled patients due to this pathology evidences the unsatisfactory results of these patients' rehabilitation and the high socioeconomic tension determined by considerable treatment expenses and the large number of the disabled. A way out of this situation should be sought in early mass screening for vascular lesions, early drug and surgical treatment, and refinement of the system of rehabilitation and prophylactic medical examination.

  15. The effect of motion and signalling on drivers' ability to predict intentions of other road users.

    PubMed

    Lee, Yee Mun; Sheppard, Elizabeth

    2016-10-01

    Failure to make a correct judgment about the intention of an approaching vehicle at a junction could lead to a collision. This paper investigated the impact of dynamic information on drivers' judgments about the intentions of approaching cars and motorcycles; whether a valid or invalid signal was provided was also manipulated. Participants were presented with video clips of vehicles approaching a junction, each terminating immediately before the vehicle made any manoeuvre, or with images of the final frame of each video. They were asked to judge whether or not the vehicle would turn. Drivers were better at judging the manoeuvre of approaching vehicles in dynamic than in static stimuli, for both vehicle types. Drivers were better at judging the manoeuvre of cars than motorcycles for videos, but not for photographs. Drivers were also better at judging the manoeuvre of approaching vehicles when a valid signal was provided than an invalid one, demonstrating the importance of providing a valid signal while driving. However, drivers were still somewhat successful in most of the conditions with an invalid signal, suggesting that they were able to draw on other cues to intention. Finally, given that dynamic stimuli more closely reflect the demands of real-life driving, there may be a need for drivers to adopt a more cautious approach when inferring a motorcyclist's intentions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. A model-based design and validation approach with OMEGA-UML and the IF toolset

    NASA Astrophysics Data System (ADS)

    Ben-hafaiedh, Imene; Constant, Olivier; Graf, Susanne; Robbana, Riadh

    2009-03-01

    Intelligent embedded systems, such as autonomous robots and other industrial systems, are becoming increasingly heterogeneous with respect to the platforms on which they are implemented, and their software architectures are thus more complex to design and analyse. In this context, it is important to have well-defined design methodologies, supported by (1) high-level design concepts for mastering design complexity, (2) concepts for expressing non-functional requirements, and (3) analysis tools for verifying, or invalidating, that the system under development will be able to conform to its requirements. We illustrate such an approach for the design of complex embedded systems using a small case study as a running example. We briefly present the important concepts of the OMEGA-RT UML profile, show how we use this profile in a modelling approach, and explain how these concepts are used in the IFx verification toolbox to integrate validation into the design flow and make scalable verification possible.

  17. Modeling of correlated data with informative cluster sizes: An evaluation of joint modeling and within-cluster resampling approaches.

    PubMed

    Zhang, Bo; Liu, Wei; Zhang, Zhiwei; Qu, Yanping; Chen, Zhen; Albert, Paul S

    2017-08-01

    Joint modeling and within-cluster resampling are two approaches used for analyzing correlated data with informative cluster sizes. Motivated by a developmental toxicity study, we examined the performance and validity of these two approaches in testing covariate effects in generalized linear mixed-effects models. We show that the joint modeling approach is robust to the misspecification of cluster size models in terms of Type I and Type II errors when the corresponding covariates are not included in the random effects structure; otherwise, statistical tests may be affected. We also evaluate the performance of the within-cluster resampling procedure and thoroughly investigate its validity in modeling correlated data with informative cluster sizes. We show that within-cluster resampling is a valid alternative to joint modeling for cluster-specific covariates, but it is invalid for time-dependent covariates. The two methods are applied to a developmental toxicity study that investigated the effect of exposure to diethylene glycol dimethyl ether.
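    As a rough illustration of the within-cluster resampling idea (not the authors' implementation), the sketch below repeatedly draws one observation per cluster, so that informative cluster sizes cannot weight the analysis, and averages the resulting estimates of a cluster-level covariate effect. The data and effect size are hypothetical.

```python
import random
import statistics

random.seed(0)

# Hypothetical clustered data: each cluster has a binary cluster-level
# covariate x; cluster size depends on x, making sizes informative.
clusters = []
for _ in range(300):
    x = random.choice([0, 1])
    size = random.randint(2, 4 + 6 * x)            # larger clusters when x = 1
    ys = [random.gauss(1.0 + 0.5 * x, 1.0) for _ in range(size)]
    clusters.append((x, ys))

def wcr_effect(clusters, n_resamples=500):
    """Within-cluster resampling: draw one observation per cluster so every
    cluster contributes equally, estimate the x effect, average over draws."""
    diffs = []
    for _ in range(n_resamples):
        g0, g1 = [], []
        for x, ys in clusters:
            (g1 if x else g0).append(random.choice(ys))
        diffs.append(statistics.mean(g1) - statistics.mean(g0))
    return statistics.mean(diffs)

print(f"WCR estimate of the covariate effect: {wcr_effect(clusters):.2f} (truth: 0.50)")
```

    Because every resampled dataset contains exactly one observation per cluster, cluster size drops out of the estimator, which is why the approach works for cluster-specific covariates but, as the abstract notes, not for time-dependent ones.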

  18. An improved cooperative adaptive cruise control (CACC) algorithm considering invalid communication

    NASA Astrophysics Data System (ADS)

    Wang, Pangwei; Wang, Yunpeng; Yu, Guizhen; Tang, Tieqiao

    2014-05-01

    For cooperative adaptive cruise control (CACC) algorithms, existing research mainly focuses on how inter-vehicle communication can be used to develop the CACC controller and on the influence of communication delays and actuator lags on string stability. However, whether string stability can be guaranteed when inter-vehicle communication is partially invalid has hardly been considered. This paper presents an improved CACC algorithm based on sliding mode control theory and analyses the range of CACC controller parameters that maintain string stability. A dynamic model of vehicle spacing deviation in a platoon is then established, and the string stability conditions under the improved CACC are analyzed. Unlike traditional CACC algorithms, the proposed algorithm can ensure the functionality of the CACC system even if inter-vehicle communication is partially invalid. Finally, this paper establishes a platoon of five vehicles to simulate the improved CACC algorithm in MATLAB/Simulink; the simulation results demonstrate that the improved algorithm can maintain the string stability of a CACC platoon by adjusting the controller parameters and enlarging the spacing to prevent accidents. With guaranteed string stability, the proposed CACC algorithm can thus prevent oscillation of vehicle spacing and reduce chain collision accidents under real-world circumstances.
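    The fallback idea, enlarging the spacing when the communication link is invalid, can be sketched with a toy platoon simulation (plain Python rather than MATLAB/Simulink, and not the paper's sliding-mode controller; all gains, headways, and the braking profile are illustrative assumptions).

```python
# Toy platoon: each follower uses spacing-error and relative-speed feedback,
# plus the predecessor's broadcast acceleration when the link is valid.
# When the link is invalid, it enlarges the time headway instead.
DT = 0.1  # s, integration step

def simulate(comm_ok, n_vehicles=5, steps=600):
    headway = 0.6 if comm_ok else 1.4            # s; enlarged without comms
    pos = [-i * (headway * 20.0 + 5.0) for i in range(n_vehicles)]
    vel = [20.0] * n_vehicles                    # m/s, equilibrium start
    acc = [0.0] * n_vehicles
    min_gap = float("inf")
    for t in range(steps):
        new_acc = [-2.0 if 100 <= t < 150 else 0.0]   # leader brakes for 5 s
        for i in range(1, n_vehicles):
            gap = pos[i - 1] - pos[i] - 5.0            # 5 m vehicle length
            u = 0.45 * (gap - headway * vel[i]) + 1.2 * (vel[i - 1] - vel[i])
            if comm_ok:
                u += acc[i - 1]                  # communicated feed-forward
            new_acc.append(max(-6.0, min(3.0, u)))
        acc = new_acc
        for i in range(n_vehicles):
            vel[i] = max(0.0, vel[i] + acc[i] * DT)
            pos[i] += vel[i] * DT
        min_gap = min(min_gap, *(pos[i - 1] - pos[i] - 5.0
                                 for i in range(1, n_vehicles)))
    return min_gap

for comm_ok in (True, False):
    print(f"comm_ok={comm_ok}: minimum gap = {simulate(comm_ok):.2f} m")
```

    In both modes the minimum inter-vehicle gap stays positive through the braking manoeuvre; the degraded mode trades throughput (larger spacing) for safety, which is the qualitative behaviour the abstract claims for the improved algorithm.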

  19. Pain patients' experiences of validation and invalidation from physicians before and after multimodal pain rehabilitation: Associations with pain, negative affectivity, and treatment outcome.

    PubMed

    Edlund, Sara M; Wurm, Matilda; Holländare, Fredrik; Linton, Steven J; Fruzzetti, Alan E; Tillfors, Maria

    2017-10-01

    Validating and invalidating responses play an important role in communication with pain patients, for example regarding emotion regulation and adherence to treatment. However, it is unclear how patients' perceptions of validation and invalidation relate to patient characteristics and treatment outcome. The aim of this study was to investigate the occurrence of subgroups based on pain patients' perceptions of validation and invalidation from their physicians. The stability of these perceptions and differences between subgroups regarding pain, pain interference, negative affectivity and treatment outcome were also explored. A total of 108 pain patients answered questionnaires regarding perceived validation and invalidation, pain severity, pain interference, and negative affectivity before and after pain rehabilitation treatment. Two cluster analyses using perceived validation and invalidation were performed, one on pre-scores and one on post-scores. The stability of patient perceptions from pre- to post-treatment was investigated, and clusters were compared on pain severity, pain interference, and negative affectivity. Finally, the connection between perceived validation and invalidation and treatment outcome was explored. Three clusters emerged both before and after treatment: (1) low validation and heightened invalidation, (2) moderate validation and invalidation, and (3) high validation and low invalidation. Perceptions of validation and invalidation were generally stable over time, although there were individuals whose perceptions changed. When compared to the other two clusters, the low validation/heightened invalidation cluster displayed significantly higher levels of pain interference and negative affectivity post-treatment but not pre-treatment. The whole sample significantly improved on pain interference and depression, but treatment outcome was independent of cluster. 
Unexpectedly, differences between clusters on pain interference and negative affectivity were only found post-treatment. This appeared to be due to the pre- and post-heightened invalidation clusters not containing the same individuals. Therefore, additional analyses were conducted to investigate the individuals who changed clusters. Results showed that patients scoring high on negative affectivity ended up in the heightened invalidation cluster post-treatment. Taken together, most patients felt understood when communicating with their rehabilitation physician. However, a smaller group of patients experienced the opposite: low levels of validation and heightened levels of invalidation. This group stood out as more problematic, reporting greater pain interference and negative affectivity when compared to the other groups after treatment. Patient perceptions were typically stable over time, but some individuals changed cluster, and these movements seemed to be related to negative affectivity and pain interference. These results do not support a connection between perceived validation and invalidation from physicians (meeting the patients pre- and post-treatment) and treatment outcome. Overall, our results suggest that there is a connection between negative affectivity and pain interference in the patients, and perceived validation and invalidation from the physicians. In clinical practice, it is important to pay attention to comorbid psychological problems and level of pain interference, since these factors may negatively influence effective communication. A focus on decreasing invalidating responses and/or increasing validating responses might be particularly important for patients with high levels of psychological problems and pain interference. Copyright © 2017. Published by Elsevier B.V.
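    The subgrouping step can be illustrated with a small k-means sketch on simulated (validation, invalidation) scores. The study's actual cluster-analysis method and data are not reproduced here; the three score profiles below are hypothetical stand-ins for the clusters it reports.

```python
import random

random.seed(42)

# Hypothetical (validation, invalidation) scores on a 0-10 scale, built to
# mimic the three reported profiles: low/high, moderate/moderate, high/low.
patients = ([(random.gauss(2, 0.7), random.gauss(7, 0.7)) for _ in range(20)] +
            [(random.gauss(5, 0.7), random.gauss(5, 0.7)) for _ in range(50)] +
            [(random.gauss(8, 0.7), random.gauss(2, 0.7)) for _ in range(38)])

def assign(points, centers):
    """Assign each point to its nearest center (squared Euclidean distance)."""
    groups = [[] for _ in centers]
    for p in points:
        nearest = min(range(len(centers)),
                      key=lambda j: (p[0] - centers[j][0]) ** 2
                                    + (p[1] - centers[j][1]) ** 2)
        groups[nearest].append(p)
    return groups

def kmeans(points, centers, iters=25):
    for _ in range(iters):
        groups = assign(points, centers)
        centers = [(sum(x for x, _ in g) / len(g),
                    sum(y for _, y in g) / len(g)) for g in groups]
    return centers, [len(g) for g in assign(points, centers)]

centers, sizes = kmeans(patients, centers=[(1, 9), (5, 5), (9, 1)])
for (v, i), n in zip(centers, sizes):
    print(f"cluster: validation~{v:.1f}, invalidation~{i:.1f}, n={n}")
```

    With well-separated profiles the three groups are recovered cleanly; real questionnaire data would typically be standardized first, and cluster stability checked across solutions, as the study does by clustering pre- and post-scores separately.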

  20. Experimental invalidation of phase-transition-induced elastic softening in CrN

    NASA Astrophysics Data System (ADS)

    Wang, Shanmin; Yu, Xiaohui; Zhang, Jianzhong; Chen, Miao; Zhu, Jinlong; Wang, Liping; He, Duanwei; Lin, Zhijun; Zhang, Ruifeng; Leinenweber, Kurt; Zhao, Yusheng

    2012-08-01

    We report experimental results of phase stability and incompressibility of CrN. The obtained bulk moduli for cubic and orthorhombic CrN are 257 and 262 GPa, respectively. These results invalidate the conclusion of phase-transition-induced elastic softening recently reported based on nonmagnetic simulations for cubic CrN [Nature Mater. 8, 947 (2009); doi:10.1038/nmat2549]. On the other hand, they provide the only experimental evidence to support the computational models involving the local magnetic moment of Cr atoms [Nature Mater. 9, 283 (2010); doi:10.1038/nmat2722], indicating that atomic spin has a profound influence on the material's elastic properties. We also demonstrate that nonstoichiometry in CrNx has strong effects on its structural stability.

  1. Childhood Emotional Invalidation and Adult Psychological Distress: The Mediating Role of Emotional Inhibition.

    ERIC Educational Resources Information Center

    Krause, Elizabeth D.; Mendelson, Tamar; Lynch, Thomas R.

    2003-01-01

    Adults (n=127) completed a series of self-report questionnaires and 88 completed an additional measure of current avoidant coping in response to a laboratory stressor. Findings strongly supported a model in which a history of childhood emotional invalidation was associated with chronic emotional inhibition in adulthood. (Contains references.)…

  2. Intergenerational Transmission of Emotion Dysregulation Through Parental Invalidation of Emotions: Implications for Adolescent Internalizing and Externalizing Behaviors

    PubMed Central

    Parra, Gilbert R.; Jobe-Shields, Lisa

    2014-01-01

    We examined parent emotion dysregulation as part of a model of family emotion-related processes and adolescent psychopathology. Participants were 80 parent–adolescent dyads (mean age = 13.6; 79 % African-American and 17 % Caucasian) with diverse family composition and socioeconomic status. Parent and adolescent dyads self-reported on their emotion regulation difficulties and adolescents reported on their perceptions of parent invalidation (i.e., punishment and neglect) of emotions and their own internalizing and externalizing behaviors. Results showed that parents who reported higher levels of emotion dysregulation tended to invalidate their adolescent’s emotional expressions more often, which in turn related to higher levels of adolescent emotion dysregulation. Additionally, adolescent-reported emotion dysregulation mediated the relation between parent invalidation of emotions and adolescent internalizing and externalizing behaviors. Potential applied implications are discussed. PMID:24855329

  3. On performance of parametric and distribution-free models for zero-inflated and over-dispersed count responses.

    PubMed

    Tang, Wan; Lu, Naiji; Chen, Tian; Wang, Wenjuan; Gunzler, Douglas David; Han, Yu; Tu, Xin M

    2015-10-30

    Zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models are widely used to model zero-inflated count responses. These models extend the Poisson and negative binomial (NB) to address excessive zeros in the count response. By adding a degenerate distribution centered at 0 and interpreting it as describing a non-risk group in the population, the ZIP (ZINB) models a two-component population mixture. As in applications of Poisson and NB, the key difference between ZIP and ZINB is the allowance for overdispersion by the ZINB in its NB component in modeling the count response for the at-risk group. Overdispersion arising in practice too often does not follow the NB, and applications of ZINB to such data yield invalid inference. If sources of overdispersion are known, other parametric models may be used to model the overdispersion directly, but such models are likewise subject to assumed distributions. Further, this approach may not be applicable if information about the sources of overdispersion is unavailable. In this paper, we propose a distribution-free alternative and compare its performance with these popular parametric models as well as a moment-based approach proposed by Yu et al. [Statistics in Medicine 2013; 32: 2390-2405]. Like generalized estimating equations, the proposed approach requires no elaborate distribution assumptions. Compared with the approach of Yu et al., it is more robust to overdispersed zero-inflated responses. We illustrate our approach with both simulated and real study data. Copyright © 2015 John Wiley & Sons, Ltd.
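    For context on the parametric baseline, a minimal EM fit of the ZIP mixture (a structural-zero component plus a Poisson at-risk component) can be sketched on simulated data. This illustrates the ZIP model discussed above, not the authors' distribution-free proposal; all parameter values are illustrative.

```python
import math
import random

random.seed(7)

def rpois(lam):
    """Poisson draw via Knuth's multiplication method."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Simulated ZIP counts: with prob pi the subject is in the non-risk group
# (structural zero); otherwise the count is Poisson(lam).
pi_true, lam_true = 0.3, 2.5
ys = [0 if random.random() < pi_true else rpois(lam_true) for _ in range(5000)]

def fit_zip(ys, iters=200):
    """EM for the ZIP mixture; the latent indicator marks structural zeros."""
    pi, lam = 0.5, 1.0
    for _ in range(iters):
        # E-step: posterior probability that each observed zero is structural.
        w = [pi / (pi + (1 - pi) * math.exp(-lam)) if y == 0 else 0.0
             for y in ys]
        # M-step: update the mixing weight and the at-risk Poisson mean.
        pi = sum(w) / len(ys)
        denom = sum(1 - wi for wi in w)
        lam = sum((1 - wi) * y for wi, y in zip(w, ys)) / denom
    return pi, lam

pi_hat, lam_hat = fit_zip(ys)
print(f"pi ~ {pi_hat:.2f} (true 0.30), lambda ~ {lam_hat:.2f} (true 2.50)")
```

    When the at-risk counts are genuinely Poisson the EM recovers both parameters; the abstract's point is that when the at-risk component is overdispersed in a way the NB does not capture, inference from such parametric fits becomes invalid.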

  4. Unscaled Bayes factors for multiple hypothesis testing in microarray experiments.

    PubMed

    Bertolino, Francesco; Cabras, Stefano; Castellanos, Maria Eugenia; Racugno, Walter

    2015-12-01

    Multiple hypothesis testing collects a series of techniques usually based on p-values as a summary of the available evidence from many statistical tests. In hypothesis testing under a Bayesian perspective, the evidence for a specified hypothesis against an alternative, conditionally on data, is given by the Bayes factor. In this study, we approach multiple hypothesis testing based on both Bayes factors and p-values, regarding multiple hypothesis testing as a multiple model selection problem. To obtain the Bayes factors we assume default priors that are typically improper. In this case, the Bayes factor is usually undetermined due to the ratio of prior pseudo-constants. We show that ignoring prior pseudo-constants leads to unscaled Bayes factors, which do not invalidate the inferential procedure in multiple hypothesis testing because they are used within a comparative scheme. In fact, using partial information from the p-values, we are able to approximate the sampling null distribution of the unscaled Bayes factor and use it within Efron's multiple testing procedure. The simulation study suggests that under a normal sampling model, even with small sample sizes, our approach provides false positive and false negative proportions that are lower than those of other common multiple hypothesis testing approaches based only on p-values. The proposed procedure is illustrated in two simulation studies, and the advantages of its use are shown in the analysis of two microarray experiments. © The Author(s) 2011.

  5. Use of Latent Class Analysis to define groups based on validity, cognition, and emotional functioning.

    PubMed

    Morin, Ruth T; Axelrod, Bradley N

    Latent Class Analysis (LCA) was used to classify a heterogeneous sample of neuropsychology data. In particular, we used measures of performance validity, symptom validity, cognition, and emotional functioning to assess and describe latent groups of functioning in these areas. A dataset of 680 neuropsychological evaluation protocols was analyzed using LCA. Data were collected from evaluations performed for clinical purposes at an urban medical center. A four-class model emerged as the best-fitting model of latent classes. The resulting classes were distinct based on measures of performance validity and symptom validity. Class A performed poorly on both performance and symptom validity measures. Class B had intact performance validity and heightened symptom reporting. The remaining two classes performed adequately on both performance and symptom validity measures, differing only in cognitive and emotional functioning. In general, performance invalidity was associated with worse cognitive performance, while symptom invalidity was associated with elevated emotional distress. LCA appears useful in identifying groups within a heterogeneous sample with distinct performance patterns. Further, the orthogonal nature of performance and symptom validity is supported.

  6. Android platform based smartphones for a logistical remote association repair framework.

    PubMed

    Lien, Shao-Fan; Wang, Chun-Chieh; Su, Juhng-Perng; Chen, Hong-Ming; Wu, Chein-Hsing

    2014-06-25

    The maintenance of large-scale systems is an important issue for logistics support planning. In this paper, we developed a Logistical Remote Association Repair Framework (LRARF) to aid repairmen in keeping the system available. LRARF includes four subsystems: smart mobile phones, a Database Management System (DBMS), a Maintenance Support Center (MSC) and wireless networks. The repairman uses smart mobile phones to capture QR-codes and the images of faulty circuit boards. The captured QR-codes and images are transmitted to the DBMS so the invalid modules can be recognized via the proposed algorithm. In this paper, the Linear Projective Transform (LPT) is employed for fast QR-code calibration. Moreover, the ANFIS-based data mining system is used for module identification and searching automatically for the maintenance manual corresponding to the invalid modules. The inputs of the ANFIS-based data mining system are the QR-codes and image features; the output is the module ID. DBMS also transmits the maintenance manual back to the maintenance staff. If modules are not recognizable, the repairmen and center engineers can obtain the relevant information about the invalid modules through live video. The experimental results validate the applicability of the Android-based platform in the recognition of invalid modules. In addition, the live video can also be recorded synchronously on the MSC for later use.
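    The abstract does not spell out the LPT step; as a sketch of the underlying idea, a plane projective transform (homography) can be estimated from the four detected QR-code corners and used to rectify the code to a canonical square. The function names, corner coordinates, and canonical size below are assumptions for illustration.

```python
import numpy as np

def fit_homography(src, dst):
    """Solve the 8-parameter plane projective transform mapping src -> dst
    from four point correspondences (direct linear transform, h33 fixed to 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply_homography(H, pt):
    q = H @ np.array([pt[0], pt[1], 1.0])
    return q[:2] / q[2]                 # perspective division

# e.g. rectify a tilted QR code to a 100x100 canonical square
corners = [(12, 8), (95, 20), (90, 104), (5, 92)]       # detected corners
canonical = [(0, 0), (100, 0), (100, 100), (0, 100)]
H = fit_homography(corners, canonical)
```

    Once `H` is known, every pixel of the captured code can be mapped into the canonical frame before decoding, which is the essence of fast QR-code calibration.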

  7. [How valid are student self-reports of bullying in schools?].

    PubMed

    Morbitzer, Petra; Spröber, Nina; Hautzinger, Martin

    2009-01-01

    In this study we examine the reliability and validity of students' self-reports about bullying and victimization in schools. 208 5th class students of four "middle schools" in Southern Germany filled in the Bully-Victim-Questionnaire (Olweus, 1989, adapted by Lösel, Bliesener, Averbeck, 1997) and the School Climate Survey (Brockenborough, 2001) to assess the prevalence of bullying/victimization, and to evaluate attitudes towards aggression and support for victims. By using reliability and validity criteria, one third (31%) of the questionnaires was classified as "unreliable/invalid". Mean comparisons of the "unreliable/invalid" group and the "valid" group of the subscales concerning bullying/victimization found significant differences. The "unreliable/invalid" group stated higher values of bullying and victimization. Based on the "unreliable/invalid" questionnaires more students could be identified as bullies/victims or bully-victims. The prevalence of bullying/victimization in the whole sample was reduced if "unreliable/invalid" questionnaires were excluded. The results are discussed in the framework of theories about the presentation of the self ("impression management', "social desirability") and systematic response patterns ("extreme response bias").

  8. Calibrating Parameters of Power System Stability Models using Advanced Ensemble Kalman Filter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Diao, Ruisheng; Li, Yuanyuan

    With the ever-increasing penetration of renewable energy, smart loads, energy storage, and new market behavior, today's power grid becomes more dynamic and stochastic, which may invalidate traditional study assumptions and pose great operational challenges. Thus, it is of critical importance to maintain good-quality models for secure and economic planning and real-time operation. Following the 1996 Western Systems Coordinating Council (WSCC) system blackout, the North American Electric Reliability Corporation (NERC) and Western Electricity Coordinating Council (WECC) enforced a number of policies and standards to guide the power industry to periodically validate power grid models and calibrate poor parameters, with the goal of building sufficient confidence in model quality. The PMU-based approach, using online measurements without interfering with the operation of generators, provides a low-cost alternative to meet NERC standards. This paper presents an innovative procedure and tool suite to validate and calibrate models based on a trajectory sensitivity analysis method and an advanced ensemble Kalman filter algorithm. The developed prototype demonstrates excellent performance in identifying and calibrating bad parameters of a realistic hydro power plant against multiple system events.
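    The ensemble Kalman filter idea behind such parameter calibration can be sketched on a toy linear model: an ensemble of candidate parameter values is repeatedly nudged toward the measurements via a sample-covariance Kalman gain. This is a generic perturbed-observation EnKF update, not the paper's tool; the model, noise levels, and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# toy "plant": observations y = theta_true * t + noise, theta_true unknown
theta_true, obs_sd = 2.0, 0.1
times = np.array([1.0, 2.0, 3.0, 4.0])
obs = theta_true * times + rng.normal(0, obs_sd, times.size)

# ensemble of candidate parameter values (the quantity being calibrated)
ens = rng.normal(0.0, 1.0, 200)

for t, y in zip(times, obs):
    pred = ens * t                               # forecast observation per member
    cov_ty = np.cov(ens, pred)[0, 1]             # parameter-observation covariance
    gain = cov_ty / (pred.var(ddof=1) + obs_sd ** 2)
    # perturbed-observation analysis step: pull each member toward the data
    ens = ens + gain * (y + rng.normal(0, obs_sd, ens.size) - pred)

theta_hat = ens.mean()                           # calibrated parameter estimate
```

    The same update applies when the forecast step is an expensive dynamic simulation: only ensemble statistics are needed, never model derivatives, which is what makes the approach attractive for power-system models.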

  9. A test of inflated zeros for Poisson regression models.

    PubMed

    He, Hua; Zhang, Hui; Ye, Peng; Tang, Wan

    2017-01-01

    Excessive zeros are common in practice and may cause overdispersion and invalidate inference when fitting Poisson regression models. There is a large body of literature on zero-inflated Poisson models. However, methods for testing whether there are excessive zeros are less well developed. The Vuong test comparing a Poisson and a zero-inflated Poisson model is commonly applied in practice. However, the type I error of the test often deviates seriously from the nominal level, casting serious doubt on the validity of the test in such applications. In this paper, we develop a new approach for testing for inflated zeros under the Poisson model. Unlike the Vuong test for inflated zeros, our method does not require fitting a zero-inflated Poisson model to perform the test. Simulation studies show that, compared with the Vuong test, our approach not only better controls the type I error rate but also yields more power.
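    The flavor of such a test can be sketched with a parametric bootstrap that asks whether the observed number of zeros is compatible with the fitted Poisson. This plug-in sketch is not the authors' test statistic, and all names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def excess_zero_pvalue(x, n_boot=2000, rng=rng):
    """One-sided parametric-bootstrap p-value for 'more zeros than Poisson'."""
    lam = x.mean()                                # plug-in Poisson fit
    obs_zeros = (x == 0).sum()
    boot = rng.poisson(lam, size=(n_boot, x.size))
    boot_zeros = (boot == 0).sum(axis=1)
    return (boot_zeros >= obs_zeros).mean()

# zero-inflated sample: 30% structural zeros mixed into a Poisson(3) sample
zip_x = rng.poisson(3.0, 500) * (rng.random(500) > 0.3)
pois_x = rng.poisson(3.0, 500)
p_zip, p_pois = excess_zero_pvalue(zip_x), excess_zero_pvalue(pois_x)
```

    Note that no zero-inflated model is ever fitted here; the comparison is entirely against the null Poisson fit, which mirrors the motivation for avoiding the Vuong construction.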

  10. Functional mechanisms of probabilistic inference in feature- and space-based attentional systems.

    PubMed

    Dombert, Pascasie L; Kuhns, Anna; Mengotti, Paola; Fink, Gereon R; Vossel, Simone

    2016-11-15

    Humans flexibly attend to features or locations and these processes are influenced by the probability of sensory events. We combined computational modeling of response times with fMRI to compare the functional correlates of (re-)orienting, and the modulation by probabilistic inference in spatial and feature-based attention systems. Twenty-four volunteers performed two task versions with spatial or color cues. Percentage of cue validity changed unpredictably. A hierarchical Bayesian model was used to derive trial-wise estimates of probability-dependent attention, entering the fMRI analysis as parametric regressors. Attentional orienting activated a dorsal frontoparietal network in both tasks, without significant parametric modulation. Spatially invalid trials activated a bilateral frontoparietal network and the precuneus, while invalid feature trials activated the left intraparietal sulcus (IPS). Probability-dependent attention modulated activity in the precuneus, left posterior IPS, middle occipital gyrus, and right temporoparietal junction for spatial attention, and in the left anterior IPS for feature-based and spatial attention. These findings provide novel insights into the generality and specificity of the functional basis of attentional control. They suggest that probabilistic inference can distinctively affect each attentional subsystem, but that there is an overlap in the left IPS, which responds to both spatial and feature-based expectancy violations. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Valid randomization-based p-values for partially post hoc subgroup analyses.

    PubMed

    Lee, Joseph J; Rubin, Donald B

    2015-10-30

    By 'partially post hoc' subgroup analyses, we mean analyses that compare existing data from a randomized experiment (from which a subgroup specification is derived) to new, subgroup-only experimental data. We describe a motivating example in which partially post hoc subgroup analyses instigated statistical debate about a medical device's efficacy. We clarify the source of such analyses' invalidity and then propose a randomization-based approach for generating valid posterior predictive p-values for such partially post hoc subgroups. Lastly, we investigate the approach's operating characteristics in a simple illustrative setting through a series of simulations, showing that it can have desirable properties under both null and alternative hypotheses. Copyright © 2015 John Wiley & Sons, Ltd.
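    The basic ingredient, a randomization-based p-value under Fisher's sharp null, can be sketched as follows. This shows only the generic re-randomization step, not the authors' posterior predictive construction for partially post hoc subgroups; the data-generating setup and names are made up.

```python
import numpy as np

rng = np.random.default_rng(5)

def randomization_pvalue(y, z, n_perm=4000, rng=rng):
    """Two-sided randomization p-value for the sharp null of no treatment
    effect: re-randomize labels and compare difference-in-means statistics."""
    obs = y[z == 1].mean() - y[z == 0].mean()
    stats = np.empty(n_perm)
    for k in range(n_perm):
        zp = rng.permutation(z)                  # a hypothetical re-assignment
        stats[k] = y[zp == 1].mean() - y[zp == 0].mean()
    return (np.abs(stats) >= abs(obs)).mean()

z = np.repeat([0, 1], 50)                        # 50 control, 50 treated
y_effect = z * 1.0 + rng.normal(0, 0.5, 100)     # true treatment effect of 1.0
p = randomization_pvalue(y_effect, z)
```

    The validity of such p-values rests on the re-randomization distribution matching the actual assignment mechanism, which is exactly what a data-derived subgroup specification breaks and what the posterior predictive repair addresses.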

  12. Instrumental variables and Mendelian randomization with invalid instruments

    NASA Astrophysics Data System (ADS)

    Kang, Hyunseung

    Instrumental variables (IV) methods have been widely used to determine the causal effect of a treatment, exposure, policy, or intervention on an outcome of interest. The IV method relies on having a valid instrument: a variable that is (A1) associated with the exposure, (A2) has no direct effect on the outcome, and (A3) is unrelated to the unmeasured confounders associated with the exposure and the outcome. However, in practice, finding a valid instrument, especially one that satisfies (A2) and (A3), can be challenging. For example, in Mendelian randomization studies where genetic markers are used as instruments, complete knowledge about the instruments' validity is equivalent to complete knowledge about the involved genes' functions. The dissertation explores the theory, methods, and application of IV methods when invalid instruments are present. First, when we have multiple candidate instruments, we establish a theoretical bound whereby causal effects are identified as long as fewer than 50% of the instruments are invalid, without knowing which of the instruments are invalid. We also propose a fast penalized method, called sisVIVE, to estimate the causal effect. We find that sisVIVE outperforms traditional IV methods when invalid instruments are present, both in simulation studies and in real data analysis. Second, we propose a robust confidence interval under the multiple invalid IV setting. This work is an extension of our work on sisVIVE. However, unlike sisVIVE, which is robust to violations of (A2) and (A3), our confidence interval procedure provides honest coverage even if all three assumptions, (A1)-(A3), are violated. Third, we study the single IV setting, where the one IV we have may actually be invalid. We propose a nonparametric IV estimation method based on full matching, a technique popular in causal inference for observational data, that leverages observed covariates to make the instrument more valid. We propose an estimator along with inferential results that are robust to mis-specification of the covariate-outcome model. We also provide a sensitivity analysis should the instrument turn out to be invalid, specifically violating (A3). Fourth, in application work, we study the causal effect of malaria on stunting among children in Ghana. Previous studies of the effect of malaria on stunting were observational and contained various unobserved confounders, most notably nutritional deficiencies. To infer causality, we use the sickle cell genotype, a trait that confers some protection against malaria and was randomly assigned at birth, as an IV and apply our nonparametric IV method. We find that the risk of stunting increases by 0.22 (95% CI: 0.044, 1) for every malaria episode and that this estimate is sensitive to unmeasured confounders.
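    The basic IV logic that sisVIVE builds on can be illustrated with a single valid instrument: under confounding, ordinary least squares is biased, while the Wald ratio recovers the causal effect. This is a generic simulated sketch, not sisVIVE itself; the data-generating model and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000
u = rng.normal(size=n)                 # unmeasured confounder
z = rng.binomial(1, 0.5, size=n)       # a valid instrument (e.g., a genotype)
d = 0.5 * z + u + rng.normal(size=n)   # exposure, confounded by u
y = 1.0 * d + u + rng.normal(size=n)   # outcome; true causal effect = 1.0

# naive regression of y on d is biased upward by the confounder u
beta_ols = np.cov(d, y)[0, 1] / d.var(ddof=1)
# Wald ratio: effect of z on y divided by effect of z on d
beta_iv = np.cov(z, y)[0, 1] / np.cov(z, d)[0, 1]
```

    With multiple candidate instruments, each one yields such a ratio; sisVIVE's penalization can be thought of as exploiting agreement among the majority of these ratios when fewer than half the instruments are invalid.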

  13. [Structure of childhood and adolescent invalidity in persons with chronic somatic diseases].

    PubMed

    Korenev, N M; Bogmat, L F; Tolmacheva, S R; Timofeeva, O N

    2002-01-01

    Based on an analysis of statistical data, the prevalence of disorders with patterns of invalidity is estimated among children and young adults under 40 years of age presenting with chronic somatic diseases in Kharkov. Both in children (52.4%) and in young adults (43.9%), diseases of the nervous system held the most prominent place. Invalidity due to established somatic disorders was identified in 10.9% of children and 24.3% of persons under 40 years of age; diseases of the circulatory organs prevailed. The need for rehabilitation of children with somatic disorders to prevent disability is substantiated.

  14. Android Platform Based Smartphones for a Logistical Remote Association Repair Framework

    PubMed Central

    Lien, Shao-Fan; Wang, Chun-Chieh; Su, Juhng-Perng; Chen, Hong-Ming; Wu, Chein-Hsing

    2014-01-01

    The maintenance of large-scale systems is an important issue for logistics support planning. In this paper, we developed a Logistical Remote Association Repair Framework (LRARF) to aid repairmen in keeping the system available. LRARF includes four subsystems: smart mobile phones, a Database Management System (DBMS), a Maintenance Support Center (MSC) and wireless networks. The repairman uses smart mobile phones to capture QR-codes and the images of faulty circuit boards. The captured QR-codes and images are transmitted to the DBMS so the invalid modules can be recognized via the proposed algorithm. In this paper, the Linear Projective Transform (LPT) is employed for fast QR-code calibration. Moreover, the ANFIS-based data mining system is used for module identification and searching automatically for the maintenance manual corresponding to the invalid modules. The inputs of the ANFIS-based data mining system are the QR-codes and image features; the output is the module ID. DBMS also transmits the maintenance manual back to the maintenance staff. If modules are not recognizable, the repairmen and center engineers can obtain the relevant information about the invalid modules through live video. The experimental results validate the applicability of the Android-based platform in the recognition of invalid modules. In addition, the live video can also be recorded synchronously on the MSC for later use. PMID:24967603

  15. Extraction of linear features on SAR imagery

    NASA Astrophysics Data System (ADS)

    Liu, Junyi; Li, Deren; Mei, Xin

    2006-10-01

    Linear features are usually extracted from SAR imagery by a few edge detectors derived from the contrast-ratio edge detector, which has a constant probability of false alarm. The Hough Transform (HT), on the other hand, is an elegant way of extracting global features such as curve segments from binary edge images. The Randomized Hough Transform (RHT) can drastically reduce the computation time and memory usage of the HT, but it renders a great number of accumulator cells invalid during random sampling. In this paper, we propose a new approach to extracting linear features from SAR imagery: an almost automatic algorithm based on edge detection and the Randomized Hough Transform. The improved method makes full use of the directional information of each edge candidate point to mitigate the invalid-accumulation problem. Applied results are in good agreement with the theoretical study, and the main linear features of the SAR imagery are extracted automatically. The method saves storage space and computation time, which shows its effectiveness and applicability.
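    A toy version of the randomized sampling step can make the accumulator idea concrete: random point pairs vote for line parameters, and only the dominant line accumulates many votes. This sketch uses a slope-intercept parameterization and omits the directional-information improvement the paper proposes; all names and data are made up.

```python
import numpy as np

rng = np.random.default_rng(3)

def randomized_hough_lines(points, n_samples=2000, rng=rng):
    """Vote for (slope, intercept) cells defined by random point pairs and
    return the most-voted cell (the strongest line hypothesis)."""
    votes = {}
    pts = np.asarray(points, dtype=float)
    for _ in range(n_samples):
        i, j = rng.choice(len(pts), size=2, replace=False)
        (x1, y1), (x2, y2) = pts[i], pts[j]
        if abs(x2 - x1) < 1e-9:
            continue                      # this sketch skips vertical pairs
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        key = (round(m, 1), round(b, 0))  # coarse accumulator cells
        votes[key] = votes.get(key, 0) + 1
    return max(votes, key=votes.get)

# 60 edge points on the line y = 2x + 1 plus 20 clutter points
line_pts = [(x, 2 * x + 1) for x in range(60)]
noise_pts = list(zip(rng.uniform(0, 60, 20), rng.uniform(0, 120, 20)))
best = randomized_hough_lines(line_pts + noise_pts)
```

    Pairs mixing line and clutter points scatter their votes over many cells; those cells are the "invalid" accumulation the paper's directional screening is designed to suppress.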

  16. An experimental pilot study of response to invalidation in young women with features of borderline personality disorder.

    PubMed

    Woodberry, Kristen A; Gallo, Kaitlin P; Nock, Matthew K

    2008-01-15

    One of the leading biosocial theories of borderline personality disorder (BPD) suggests that individuals with BPD have biologically based abnormalities in emotion regulation contributing to more intense and rapid responses to emotional stimuli, in particular, invalidation [Linehan, M.M., 1993. Cognitive-Behavioral Treatment of Borderline Personality Disorder. Guilford, New York.]. This study used a 2 by 2 experimental design to test whether young women with features of BPD actually show increased physiological arousal in response to invalidation. Twenty-three women ages 18 to 29 who endorsed high levels of BPD symptoms and 18 healthy controls were randomly assigned to hear either a validating or invalidating comment during a frustrating task. Although we found preliminary support for differential response to these stimuli in self-report of valence, we found neither self-report nor physiological evidence of hyperarousal in the BPD features group, either at baseline or in response to invalidation. Interestingly, the BPD features group reported significantly lower comfort with emotion, and comfort was significantly associated with affective valence but not arousal. We discuss implications for understanding and responding to the affective intensity of this population.

  17. Using global sensitivity analysis to evaluate the uncertainties of future shoreline changes under the Bruun rule assumption

    NASA Astrophysics Data System (ADS)

    Le Cozannet, Gonéri; Oliveros, Carlos; Castelle, Bruno; Garcin, Manuel; Idier, Déborah; Pedreros, Rodrigo; Rohmer, Jeremy

    2016-04-01

    Future sandy shoreline changes are often assessed by summing the contributions of longshore and cross-shore effects. In such approaches, a contribution of sea-level rise can be incorporated by adding a supplementary term based on the Bruun rule. Here, our objective is to identify where and when the use of the Bruun rule can be (in)validated, in the case of wave-exposed beaches with gentle slopes. We first provide shoreline change scenarios that account for all uncertain hydrosedimentary processes affecting the idealized low- and high-energy coasts described by Stive (2004) [Stive, M. J. F., 2004. How important is global warming for coastal erosion? An editorial comment. Climatic Change, vol. 64, no. 1-2, doi:10.1023/B:CLIM.0000024785.91858. ISSN 0165-0009]. Then, we generate shoreline change scenarios based on probabilistic sea-level rise projections from the IPCC. For scenarios RCP 6.0 and 8.5, and in the absence of coastal defenses, the model predicts an observable shift toward generalized beach erosion by the middle of the 21st century. On the contrary, the model predictions are unlikely to differ from the current situation under scenario RCP 2.6. To gain insight into the relative importance of each source of uncertainty, we quantify each contribution to the variance of the model outcome using a global sensitivity analysis. This analysis shows that by the end of the 21st century, a large part of the shoreline change uncertainty is due to the climate change scenario, if all anthropogenic greenhouse-gas emission scenarios are considered equiprobable. To conclude, the analysis shows that under the assumptions above, (in)validating the Bruun rule should be straightforward during the second half of the 21st century for the RCP 8.5 scenario. Conversely, for RCP 2.6, the noise in shoreline change evolution should continue to dominate the signal due to the Bruun effect. This last conclusion can be interpreted as an important potential benefit of climate change mitigation.
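    For an additive toy model, the variance shares behind a global sensitivity analysis reduce to simple ratios: each independent term's variance divided by the total. The following sketch illustrates first-order variance shares only; it is not the paper's analysis, and every distribution and number is made up.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
# toy additive shoreline-change model (m), purely illustrative
slr   = rng.uniform(0.3, 1.0, n)     # sea-level rise scenario spread
slope = rng.uniform(0.01, 0.03, n)   # beach slope
noise = rng.normal(0.0, 5.0, n)      # other hydrosedimentary processes
bruun = -slr / slope                 # Bruun-rule retreat term
y = bruun + noise                    # total shoreline change

# first-order variance shares (Sobol-style indices for independent terms)
var_y = y.var()
s_bruun = bruun.var() / var_y
s_noise = noise.var() / var_y
```

    For non-additive models the conditional-expectation (Sobol) estimators are needed instead, but the interpretation is the same: when `s_noise` dominates, the Bruun signal cannot be (in)validated against the background variability.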

  18. The Perceived Invalidation of Emotion Scale (PIES): Development and psychometric properties of a novel measure of current emotion invalidation.

    PubMed

    Zielinski, Melissa J; Veilleux, Jennifer C

    2018-05-24

    Emotion invalidation is theoretically and empirically associated with mental and physical health problems. However, existing measures of invalidation focus on past (e.g., childhood) invalidation and/or do not specifically emphasize invalidation of emotion. In this article, the authors articulate a clarified operational definition of emotion invalidation and use that definition as the foundation for development of a new measure of current perceived emotion invalidation across a series of five studies. Study 1 was a qualitative investigation of people's experiences with emotional invalidation from which we generated items. An initial item pool was vetted by expert reviewers in Study 2 and examined via exploratory factor analysis in Study 3 within both college student and online samples. The scale was reduced to 10 items via confirmatory factor analysis in Study 4, resulting in a brief but psychometrically promising measure, the Perceived Invalidation of Emotion Scale (PIES). A short-term longitudinal investigation (Study 5) revealed that PIES scores had strong test-retest reliability, and that greater perceived emotion invalidation was associated with greater emotion dysregulation, borderline features and symptoms of emotional distress. In addition, the PIES predicted changes in relational health and psychological health over a 1-month period. The current set of studies thus presents a psychometrically promising and practical measure of perceived emotion invalidation that can provide a foundation for future research in this burgeoning area. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  19. Perceived family and peer invalidation as predictors of adolescent suicidal behaviors and self-mutilation.

    PubMed

    Yen, Shirley; Kuehn, Kevin; Tezanos, Katherine; Weinstock, Lauren M; Solomon, Joel; Spirito, Anthony

    2015-03-01

    The present study investigates the longitudinal relationship between perceived family and peer invalidation and adolescent suicidal events (SE) and self-mutilation (SM) in a 6 month follow-up (f/u) study of adolescents admitted to an inpatient psychiatric unit for suicide risk. Adolescents (n=119) and their parent(s) were administered interviews and self-report assessments at baseline and at a 6 month f/u, with 99 (83%) completing both assessments. The Adolescent Longitudinal Interval Follow-Up Evaluation (A-LIFE) was modified to provide weekly ratings (baseline and each week of f/u) for perceived family and peer invalidation. Regression analyses examined whether: 1) Prospectively rated perceived family and peer invalidation at baseline predicted SE and SM during f/u; and 2) chronicity of perceived invalidation operationalized as proportion of weeks at moderate to high invalidation during f/u was associated with SE and SM during f/u. Multiple regression analyses, controlling for previously identified covariates, revealed that perceived family invalidation predicted SE over f/u for boys only and perceived peer invalidation predicted SM over f/u in the overall sample. This was the case for both baseline and f/u ratings of perceived invalidation. Our results demonstrate the adverse impact of perceived family and peer invalidation. Specifically, boys who experienced high perceived family invalidation were more likely to have an SE over f/u. Both boys and girls who experienced high perceived peer invalidation were more likely to engage in SM over f/u.

  20. Connectivity-based, all-hexahedral mesh generation method and apparatus

    DOEpatents

    Tautges, T.J.; Mitchell, S.A.; Blacker, T.D.; Murdoch, P.

    1998-06-16

    The present invention is a computer-based method and apparatus for constructing all-hexahedral finite element meshes for finite element analysis. The present invention begins with a three-dimensional geometry and an all-quadrilateral surface mesh, then constructs hexahedral element connectivity from the outer boundary inward, and then resolves invalid connectivity. The result of the present invention is a complete representation of hex mesh connectivity only; actual mesh node locations are determined later. The basic method of the present invention comprises the step of forming hexahedral elements by making crossings of entities referred to as "whisker chords." This step, combined with a seaming operation in space, is shown to be sufficient for meshing simple block problems. Entities that appear when meshing more complex geometries, namely blind chords, merged sheets, and self-intersecting chords, are described. A method for detecting invalid connectivity in space, based on repeated edges, is also described, along with its application to various cases of invalid connectivity introduced and resolved by the method. 79 figs.

  1. Connectivity-based, all-hexahedral mesh generation method and apparatus

    DOEpatents

    Tautges, Timothy James; Mitchell, Scott A.; Blacker, Ted D.; Murdoch, Peter

    1998-01-01

    The present invention is a computer-based method and apparatus for constructing all-hexahedral finite element meshes for finite element analysis. The present invention begins with a three-dimensional geometry and an all-quadrilateral surface mesh, then constructs hexahedral element connectivity from the outer boundary inward, and then resolves invalid connectivity. The result of the present invention is a complete representation of hex mesh connectivity only; actual mesh node locations are determined later. The basic method of the present invention comprises the step of forming hexahedral elements by making crossings of entities referred to as "whisker chords." This step, combined with a seaming operation in space, is shown to be sufficient for meshing simple block problems. Entities that appear when meshing more complex geometries, namely blind chords, merged sheets, and self-intersecting chords, are described. A method for detecting invalid connectivity in space, based on repeated edges, is also described, along with its application to various cases of invalid connectivity introduced and resolved by the method.

  2. Accounting For Gains And Orientations In Polarimetric SAR

    NASA Technical Reports Server (NTRS)

    Freeman, Anthony

    1992-01-01

    Calibration method accounts for characteristics of real radar equipment invalidating standard 2 × 2 complex-amplitude R (receiving) and T (transmitting) matrices. Overall gain in each combination of transmitting and receiving channels assumed different even when only one transmitter and one receiver used. One characterizes departure of polarimetric Synthetic Aperture Radar (SAR) system from simple 2 × 2 model in terms of single parameter used to transform measurements into format compatible with simple 2 × 2 model. Data processed by applicable one of several prior methods based on simple model.

  3. Identification of growth phases and influencing factors in cultivations with AGE1.HN cells using set-based methods.

    PubMed

    Borchers, Steffen; Freund, Susann; Rath, Alexander; Streif, Stefan; Reichl, Udo; Findeisen, Rolf

    2013-01-01

    Production of bio-pharmaceuticals in cell culture, such as mammalian cells, is challenging. Mathematical models can provide support to the analysis, optimization, and the operation of production processes. In particular, unstructured models are suited for these purposes, since they can be tailored to particular process conditions. To this end, growth phases and the most relevant factors influencing cell growth and product formation have to be identified. Due to noisy and erroneous experimental data, unknown kinetic parameters, and the large number of combinations of influencing factors, currently there are only limited structured approaches to tackle these issues. We outline a structured set-based approach to identify different growth phases and the factors influencing cell growth and metabolism. To this end, measurement uncertainties are taken explicitly into account to bound the time-dependent specific growth rate based on the observed increase of the cell concentration. Based on the bounds on the specific growth rate, we can identify qualitatively different growth phases and (in-)validate hypotheses on the factors influencing cell growth and metabolism. We apply the approach to a mammalian suspension cell line (AGE1.HN). We show that growth in batch culture can be divided into two main growth phases. The initial phase is characterized by exponential growth dynamics, which can be described consistently by a relatively simple unstructured and segregated model. The subsequent phase is characterized by a decrease in the specific growth rate, which, as shown, results from substrate limitation and the pH of the medium. An extended model is provided which describes the observed dynamics of cell growth and main metabolites, and the corresponding kinetic parameters as well as their confidence intervals are estimated. The study is complemented by an uncertainty and outlier analysis. 
Overall, we demonstrate the utility of set-based methods for analyzing cell growth and metabolism under conditions of uncertainty.
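    The core set-based computation, bounding the specific growth rate from uncertain concentration measurements, can be sketched as follows. This is a simplified version of the idea, assuming a known multiplicative measurement error bound; the synthetic culture and all names are illustrative.

```python
import numpy as np

def growth_rate_bounds(t, x, rel_err):
    """Guaranteed interval [mu_lo, mu_hi] for the mean specific growth rate on
    each sampling interval, given measurements x known only up to +/- rel_err."""
    t, x = np.asarray(t, float), np.asarray(x, float)
    lo, hi = x * (1.0 - rel_err), x * (1.0 + rel_err)
    dt = np.diff(t)
    mu_lo = np.log(lo[1:] / hi[:-1]) / dt   # slowest growth consistent with data
    mu_hi = np.log(hi[1:] / lo[:-1]) / dt   # fastest growth consistent with data
    return mu_lo, mu_hi

# synthetic culture: exponential growth (mu = 0.05/h) until 40 h, then a plateau
t = np.arange(0.0, 90.0, 10.0)
x = np.exp(0.05 * np.minimum(t, 40.0))
mu_lo, mu_hi = growth_rate_bounds(t, x, rel_err=0.05)
```

    Because the bounds are guaranteed rather than statistical, non-overlapping intervals on different segments suffice to certify qualitatively different growth phases, and any model whose predicted rate leaves the interval is invalidated.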

  4. Identification of Growth Phases and Influencing Factors in Cultivations with AGE1.HN Cells Using Set-Based Methods

    PubMed Central

    Borchers, Steffen; Freund, Susann; Rath, Alexander; Streif, Stefan; Reichl, Udo; Findeisen, Rolf

    2013-01-01

    Production of bio-pharmaceuticals in cell culture, such as mammalian cells, is challenging. Mathematical models can provide support to the analysis, optimization, and the operation of production processes. In particular, unstructured models are suited for these purposes, since they can be tailored to particular process conditions. To this end, growth phases and the most relevant factors influencing cell growth and product formation have to be identified. Due to noisy and erroneous experimental data, unknown kinetic parameters, and the large number of combinations of influencing factors, currently there are only limited structured approaches to tackle these issues. We outline a structured set-based approach to identify different growth phases and the factors influencing cell growth and metabolism. To this end, measurement uncertainties are taken explicitly into account to bound the time-dependent specific growth rate based on the observed increase of the cell concentration. Based on the bounds on the specific growth rate, we can identify qualitatively different growth phases and (in-)validate hypotheses on the factors influencing cell growth and metabolism. We apply the approach to a mammalian suspension cell line (AGE1.HN). We show that growth in batch culture can be divided into two main growth phases. The initial phase is characterized by exponential growth dynamics, which can be described consistently by a relatively simple unstructured and segregated model. The subsequent phase is characterized by a decrease in the specific growth rate, which, as shown, results from substrate limitation and the pH of the medium. An extended model is provided which describes the observed dynamics of cell growth and main metabolites, and the corresponding kinetic parameters as well as their confidence intervals are estimated. The study is complemented by an uncertainty and outlier analysis. 
Overall, we demonstrate the utility of set-based methods for analyzing cell growth and metabolism under conditions of uncertainty. PMID:23936299

  5. Comparisons of patch-use models for wintering American tree sparrows

    USGS Publications Warehouse

    Tome, M.W.

    1990-01-01

    Optimal foraging theory has stimulated numerous theoretical and empirical studies of foraging behavior for >20 years. These models provide a valuable tool for studying the foraging behavior of an organism. As with any other tool, the models are most effective when properly used. For example, to obtain a robust test of a foraging model, Stephens and Krebs (1986) recommend experimental designs in which four questions are answered in the affirmative. First, do the foragers play the same "game" as the model? Second, are the assumptions of the model met? Third, does the test rule out alternative possibilities? Finally, are the appropriate variables measured? Negative answers to any of these questions could invalidate the model and lead to confusion over the usefulness of foraging theory in conducting ecological studies. Gaines (1989) attempted to determine whether American Tree Sparrows (Spizella arborea) foraged by a time (Krebs 1973) or number expectation rule (Gibb 1962), or in a manner consistent with the predictions of Charnov's (1976) marginal value theorem (MVT). Gaines (1989: 118) noted appropriately that field tests of foraging models frequently involve uncontrollable circumstances; thus, it is often difficult to meet the assumptions of the models. Gaines also states (1989: 118) that "violations of the assumptions are also informative but do not constitute robust tests of predicted hypotheses," and that "the problem can be avoided by experimental analyses which concurrently test mutually exclusive hypotheses so that alternative predictions will be eliminated if falsified." There is a problem with this approach because, when major assumptions of models are not satisfied, it is not justifiable to compare a predator's foraging behavior with the model's predictions. I submit that failing to follow the advice offered by Stephens and Krebs (1986) can invalidate tests of foraging models.

  6. 7 CFR 27.44 - Invalidity of cotton class certificates.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Invalidity of cotton class certificates. 27.44 Section... CONTAINER REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Cotton Class Certificates § 27.44 Invalidity of cotton class certificates. Any cotton class certificate shall become invalid...

  7. 7 CFR 27.44 - Invalidity of cotton class certificates.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Invalidity of cotton class certificates. 27.44 Section... CONTAINER REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Cotton Class Certificates § 27.44 Invalidity of cotton class certificates. Any cotton class certificate shall become invalid...

  8. 7 CFR 27.44 - Invalidity of cotton class certificates.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Invalidity of cotton class certificates. 27.44 Section... CONTAINER REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Cotton Class Certificates § 27.44 Invalidity of cotton class certificates. Any cotton class certificate shall become invalid...

  9. 7 CFR 27.44 - Invalidity of cotton class certificates.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Invalidity of cotton class certificates. 27.44 Section... CONTAINER REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Cotton Class Certificates § 27.44 Invalidity of cotton class certificates. Any cotton class certificate shall become invalid...

  10. 7 CFR 27.44 - Invalidity of cotton class certificates.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Invalidity of cotton class certificates. 27.44 Section... CONTAINER REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Cotton Class Certificates § 27.44 Invalidity of cotton class certificates. Any cotton class certificate shall become invalid...

  11. 36 CFR 1150.114 - Effect of partial invalidity.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Effect of partial invalidity... COMPLIANCE BOARD PRACTICE AND PROCEDURES FOR COMPLIANCE HEARINGS Miscellaneous Provisions § 1150.114 Effect... from the invalid part shall remain in full force and effect. If a part of these regulations is invalid...

  12. 14 CFR 47.43 - Invalid registration.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... REGISTRATION Certificates of Aircraft Registration § 47.43 Invalid registration. (a) The registration of an...) compliance with 49 U.S.C. 44101-44104. (b) If the registration of an aircraft is invalid under paragraph (a) of this section, the holder of the invalid Certificate of Aircraft Registration, AC Form 8050-3, must...

  13. 14 CFR 47.43 - Invalid registration.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... REGISTRATION Certificates of Aircraft Registration § 47.43 Invalid registration. (a) The registration of an...) compliance with 49 U.S.C. 44101-44104. (b) If the registration of an aircraft is invalid under paragraph (a) of this section, the holder of the invalid Certificate of Aircraft Registration, AC Form 8050-3, must...

  14. 14 CFR 47.43 - Invalid registration.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REGISTRATION Certificates of Aircraft Registration § 47.43 Invalid registration. (a) The registration of an...) compliance with 49 U.S.C. 44101-44104. (b) If the registration of an aircraft is invalid under paragraph (a) of this section, the holder of the invalid Certificate of Aircraft Registration shall return it as...

  15. The practice of quality-associated costing: application to transfusion manufacturing processes.

    PubMed

    Trenchard, P M; Dixon, R

    1997-01-01

    This article applies the new method of quality-associated costing (QAC) to the mixture of processes that create red cell and plasma products from whole blood donations. The article compares QAC with two commonly encountered but arbitrary models and illustrates the invalidity of clinical cost-benefit analysis based on these models. The first, an "isolated" cost model, seeks to allocate each whole process cost to only one product class. The other is a "shared" cost model, and it seeks to allocate an approximately equal share of all process costs to all associated products.

  16. 14 CFR 47.43 - Invalid registration.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... REGISTRATION Certificates of Aircraft Registration § 47.43 Invalid registration. Link to an amendment published... registration of an aircraft is invalid if, at the time it is made— (1) The aircraft is registered in a foreign... knowledge) compliance with 49 U.S.C. 44101-44104. (b) If the registration of an aircraft is invalid under...

  17. Working with Missing Values

    ERIC Educational Resources Information Center

    Acock, Alan C.

    2005-01-01

    Less than optimum strategies for missing values can produce biased estimates, distorted statistical power, and invalid conclusions. After reviewing traditional approaches (listwise, pairwise, and mean substitution), selected alternatives are covered including single imputation, multiple imputation, and full information maximum likelihood…
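
    The bias the abstract warns about is easy to demonstrate: mean substitution preserves the sample mean but deflates the variance, which distorts downstream statistics. A small illustration with made-up numbers:

```python
import statistics

# Toy sample with missing observations (None).
data = [2.0, 4.0, None, 8.0, None, 6.0]

# Listwise deletion: analyze complete cases only.
complete = [x for x in data if x is not None]

# Mean substitution: replace each missing value with the observed mean.
mean = statistics.mean(complete)
imputed = [x if x is not None else mean for x in data]

# Mean substitution leaves the mean unchanged but shrinks the variance,
# one route to the biased estimates and distorted power noted above.
var_complete = statistics.variance(complete)
var_imputed = statistics.variance(imputed)
```

Multiple imputation and full information maximum likelihood, the alternatives the article favors, avoid this artificial variance shrinkage.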

  18. Target identification of small molecules based on chemical biology approaches.

    PubMed

    Futamura, Yushi; Muroi, Makoto; Osada, Hiroyuki

    2013-05-01

    Recently, a phenotypic approach (screens that assess the effects of compounds on cells, tissues, or whole organisms) has been reconsidered and reintroduced as a complementary strategy to the target-based approach for drug discovery. Although the finding of novel bioactive compounds from large chemical libraries has become routine, the identification of their molecular targets is still a time-consuming and difficult process, making this step rate-limiting in drug development. In the last decade, we and other researchers have amassed a large amount of phenotypic data through progress in omics research and advances in instrumentation. Accordingly, profiling methodologies that make expert use of these datasets have emerged to identify and validate the specific molecular targets of drug candidates, attaining some progress in current drug discovery (e.g., eribulin). In the case of a compound that shows an unprecedented phenotype, likely by inhibiting a first-in-class target, however, such phenotypic profiling is invalid. Under these circumstances, a photo-crosslinking affinity approach should be beneficial. In this review, we describe and summarize recent progress in both affinity-based (direct) and phenotypic profiling (indirect) approaches to chemical biology target identification.

  19. A Worst-Case Approach for On-Line Flutter Prediction

    NASA Technical Reports Server (NTRS)

    Lind, Rick C.; Brenner, Martin J.

    1998-01-01

    Worst-case flutter margins may be computed for a linear model with respect to a set of uncertainty operators using the structured singular value. This paper considers an on-line implementation to compute these robust margins in a flight test program. Uncertainty descriptions are updated at test points to account for unmodeled time-varying dynamics of the airplane by ensuring the robust model is not invalidated by measured flight data. Robust margins computed with respect to this uncertainty remain conservative to the changing dynamics throughout the flight. A simulation clearly demonstrates this method can improve the efficiency of flight testing by accurately predicting the flutter margin to improve safety while reducing the necessary flight time.

  20. A 3D numerical study of LO2/GH2 supercritical combustion in the ONERA-Mascotte Test-rig configuration

    NASA Astrophysics Data System (ADS)

    Benmansour, Abdelkrim; Liazid, Abdelkrim; Logerais, Pierre-Olivier; Durastanti, Jean-Félix

    2016-02-01

    Cryogenic propellants LOx/H2 are used at very high pressure in rocket engine combustion. The description of the combustion process in such applications is very complex, essentially because of the supercritical regime, in which the ideal gas law becomes invalid. In order to capture the average characteristics of this combustion process, numerical computations are performed using a model based on a one-phase multi-component approach. Such work requires fluid properties and a correct definition of the mixture behavior, generally described by cubic equations of state with appropriate thermodynamic relations validated against the NIST data. In this study we consider an alternative way to capture real-gas effects by testing the volume-weighted mixing law together with component transport properties fitted directly from the NIST library data, including the supercritical regime range. The numerical simulations are carried out using a 3D RANS approach with two turbulence models, the standard k-epsilon model and the realizable k-epsilon one. The combustion model is also paired with two chemical reaction mechanisms: a one-step generic chemical reaction and a two-step chemical reaction. The obtained results, such as temperature profiles, recirculation zones, visible flame lengths and distributions of OH species, are discussed.

  1. Proactive routing mutation against stealthy Distributed Denial of Service attacks: metrics, modeling, and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Qi; Al-Shaer, Ehab; Chatterjee, Samrat

    Infrastructure Distributed Denial of Service (IDDoS) attacks continue to be one of the most devastating challenges facing cyber systems. The new generation of IDDoS attacks exploit the inherent weaknesses of cyber infrastructure, including the deterministic nature of routes, the skewed distribution of flows, and Internet ossification, to discover the network's critical links and launch highly stealthy flooding attacks that are not observable at the victim end. In this paper, first, we propose a new metric to quantitatively measure the potential susceptibility of any arbitrary target server or domain to stealthy IDDoS attacks, and estimate the impact of such susceptibility on enterprises. Second, we develop a proactive route mutation technique to minimize the susceptibility to these attacks by dynamically changing the flow paths periodically to invalidate the adversary's knowledge about the network and avoid targeted critical links. Our proposed approach actively changes these network paths while satisfying security and quality of service requirements. We present an integrated approach to proactive route mutation that combines infrastructure-based mutation, based on reconfiguration of switches and routers, with a middle-box approach that uses an overlay of end-point proxies to construct a virtual network path free of critical links to reach a destination. We implemented the proactive path mutation technique on a Software Defined Network using the OpenDaylight controller to demonstrate a feasible deployment of this approach. Our evaluation validates the correctness, effectiveness, and scalability of the proposed approaches.

  2. 38 CFR 17.151 - Invalid lifts for recipients of aid and attendance allowance or special monthly compensation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Invalid lifts for... Rehabilitative Aids § 17.151 Invalid lifts for recipients of aid and attendance allowance or special monthly compensation. An invalid lift may be furnished if: (a) The applicant is a veteran who is receiving (1) special...

  3. 38 CFR 17.151 - Invalid lifts for recipients of aid and attendance allowance or special monthly compensation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Invalid lifts for... Rehabilitative Aids § 17.151 Invalid lifts for recipients of aid and attendance allowance or special monthly compensation. An invalid lift may be furnished if: (a) The applicant is a veteran who is receiving (1) special...

  4. Attitudes towards emotional expression mediate the relationship between childhood invalidation and adult eating concern.

    PubMed

    Haslam, Michelle; Arcelus, Jon; Farrow, Claire; Meyer, Caroline

    2012-11-01

    Previous research has suggested that invalidating childhood environments are positively related to the symptoms of eating disorders. However, it is unclear how childhood environments might impact upon the development of eating disorder symptoms. This study examined the relationship between parental invalidation and eating disorder-related attitudes in a nonclinical sample and tested the mediating effect of attitudes towards emotional expression. Two hundred women, with a mean age of 21 years, completed measures of invalidating childhood environments, attitudes towards emotional expression, and eating pathology. Eating concerns were positively associated with recollections of an invalidating parental environment. The belief that the expression of emotions is a sign of weakness fully mediated the relationship between childhood maternal invalidation and adult eating concern. Following replication and extension to a clinical sample, these results suggest that targeting the individual's attitude towards emotional expression might reduce eating attitudes among women who have experienced an invalidating childhood environment. Copyright © 2012 John Wiley & Sons, Ltd and Eating Disorders Association.

  5. Fundamental understanding of distracted oxygen delignification efficiency by dissolved lignin during biorefinery process of eucalyptus.

    PubMed

    Zhao, Huifang; Li, Jing; Zhang, Xuejin

    2018-06-01

    In this work, a fundamental understanding of oxygen delignification distracted by dissolved lignin was investigated. In the new biorefinery model of shortened kraft pulping integrated with an extended oxygen delignification process, increasing the content of residual lignin in the original pulp could result in enhanced delignification efficiency, higher pulp viscosity and fewer carbonyl groups. However, the invalid oxygen consumption by dissolved lignin increased with increasing process temperature and alkali dosage. The normalized ultraviolet absorbance (divided by the absorbance at 280 nm) also showed that the content of chromophoric groups in dissolved lignin decreased as oxygen delignification proceeded, both of which indicated that dissolved lignin could enhance the invalid oxygen consumption. It was therefore concluded that replacement of the liquor at the initial phase of the oxygen delignification process would balance the enhancement of delignification efficiency against the invalid oxygen consumption. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Forensic identification of resampling operators: A semi non-intrusive approach.

    PubMed

    Cao, Gang; Zhao, Yao; Ni, Rongrong

    2012-03-10

    Recently, several new resampling operators have been proposed that successfully invalidate existing resampling detectors. However, the reliability of such anti-forensic techniques is unexamined and needs to be investigated. In this paper, we focus on the forensic identification of digital image resampling operators, including both the traditional type and the anti-forensic type that hides the trace of traditional resampling. Various resampling algorithms, involving geometric distortion (GD)-based, dual-path-based and postprocessing-based operators, are investigated. The identification is achieved in a semi non-intrusive manner, assuming the resampling software can be accessed. Given an input pattern of monotone signal, the polarity aberration of the GD-based resampled signal's first derivative is analyzed theoretically and measured by an effective feature metric. Dual-path-based and postprocessing-based resampling can also be identified by feeding proper test patterns. Experimental results on various parameter settings demonstrate the effectiveness of the proposed approach. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  7. Inhibitory mechanism of the matching heuristic in syllogistic reasoning.

    PubMed

    Tse, Ping Ping; Moreno Ríos, Sergio; García-Madruga, Juan Antonio; Bajo Molina, María Teresa

    2014-11-01

    A number of heuristic-based hypotheses have been proposed to explain how people solve syllogisms with automatic processes. In particular, the matching heuristic employs the congruency of the quantifiers in a syllogism—by matching the quantifier of the conclusion with those of the two premises. When the heuristic leads to an invalid conclusion, successful solving of these conflict problems requires the inhibition of automatic heuristic processing. Accordingly, if the automatic processing were based on processing the set of quantifiers, no semantic contents would be inhibited. The mental model theory, however, suggests that people reason using mental models, which always involves semantic processing. Therefore, whatever inhibition occurs in the processing implies the inhibition of the semantic contents. We manipulated the validity of the syllogism and the congruency of the quantifier of its conclusion with those of the two premises according to the matching heuristic. A subsequent lexical decision task (LDT) with related words in the conclusion was used to test any inhibition of the semantic contents after each syllogistic evaluation trial. In the LDT, the facilitation effect of semantic priming diminished after correctly solved conflict syllogisms (match-invalid or mismatch-valid), but was intact after no-conflict syllogisms. The results suggest the involvement of an inhibitory mechanism of semantic contents in syllogistic reasoning when there is a conflict between the output of the syntactic heuristic and actual validity. Our results do not support a uniquely syntactic process of syllogistic reasoning but fit with the predictions based on mental model theory. Copyright © 2014 Elsevier B.V. All rights reserved.
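
    The conflict items described above (match-invalid and mismatch-valid syllogisms) can be sketched programmatically. This is a deliberately simplified congruency check, not the graded matching index used in the literature; it only illustrates how heuristic verdict and logical validity come apart:

```python
def matches(premise_quantifiers, conclusion_quantifier):
    """Matching heuristic, reduced to a binary congruency check:
    endorse a conclusion whose quantifier matches a premise quantifier."""
    return conclusion_quantifier in premise_quantifiers

def is_conflict(premise_quantifiers, conclusion_quantifier, logically_valid):
    """Conflict items are the match-invalid and mismatch-valid cases:
    the heuristic's verdict disagrees with actual validity, so a correct
    answer requires inhibiting the heuristic response."""
    return matches(premise_quantifiers, conclusion_quantifier) != logically_valid

# "Some A are B; Some B are C; therefore Some A are C" is congruent in
# quantifiers but logically invalid, hence a conflict item.
conflict = is_conflict(("some", "some"), "some", logically_valid=False)
```

On this framing, the lexical decision task probes whether solving conflict items (where `is_conflict` is true) suppresses the semantic content of the conclusion.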

  8. Invalid corps.

    PubMed

    Lande, R Gregory

    2008-06-01

    This article explores America's historical experience with medical disability compensation programs during the Revolutionary War and the Civil War. Contemporary newspaper reports, complemented by book and journal articles, provide an understanding of the medical disability compensation programs offered during the Revolutionary War and the Civil War. Military planners, politicians, and service members struggled to develop a fair and balanced medical disability compensation program during the Revolutionary War and the Civil War. Based on America's extensive experience with the Civil War Invalid Corps, an alternative for motivated military personnel could be developed.

  9. Leveraging the BPEL Event Model to Support QoS-aware Process Execution

    NASA Astrophysics Data System (ADS)

    Zaid, Farid; Berbner, Rainer; Steinmetz, Ralf

    Business processes executed using compositions of distributed Web Services are susceptible to different fault types. The Web Services Business Process Execution Language (BPEL) is widely used to execute such processes. While BPEL provides fault handling mechanisms for functional faults like invalid message types, it still lacks a flexible native mechanism for handling non-functional exceptions associated with violations of the QoS levels that are typically specified in a governing Service Level Agreement (SLA). In this paper, we present an approach to complement BPEL's fault handling, where expected QoS levels and the necessary recovery actions are specified declaratively in the form of Event-Condition-Action (ECA) rules. Our main contribution is leveraging BPEL's standard event model, which we use as an event space for the created ECA rules. We validate our approach with an extension to an open source BPEL engine.
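
    The declarative recovery mechanism described above can be sketched as a tiny rule engine. The event name, SLA threshold, and recovery action below are hypothetical placeholders; the paper binds such rules to BPEL's native event model rather than to an ad hoc monitor:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class EcaRule:
    event: str                          # process event name to match
    condition: Callable[[Dict], bool]   # e.g. an SLA threshold check
    action: Callable[[Dict], str]       # recovery action to trigger

@dataclass
class QosMonitor:
    rules: List[EcaRule] = field(default_factory=list)
    log: List[str] = field(default_factory=list)

    def emit(self, event: str, ctx: Dict) -> None:
        """Dispatch a process event: fire every rule whose event name
        matches and whose condition holds."""
        for rule in self.rules:
            if rule.event == event and rule.condition(ctx):
                self.log.append(rule.action(ctx))

monitor = QosMonitor()
# Hypothetical SLA: invocation latency must stay under 2000 ms.
monitor.rules.append(EcaRule(
    event="invoke.completed",
    condition=lambda ctx: ctx["latency_ms"] > 2000,
    action=lambda ctx: f"retry:{ctx['partner']}",
))
monitor.emit("invoke.completed", {"latency_ms": 3500, "partner": "ShippingService"})
```

The appeal of the ECA formulation is that QoS policy lives outside the process definition, so thresholds and recovery actions can change without redeploying the process.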

  10. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    PubMed

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data.
This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
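
    The two information-theoretic quantities named above are straightforward to compute for discrete distributions. A sketch with toy methylation-state distributions (the numbers are illustrative, not from the paper's data):

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def js_distance(p, q):
    """Jensen-Shannon distance: square root of the JS divergence
    H(m) - (H(p) + H(q)) / 2 with m the midpoint distribution.
    With base-2 logs it is a metric bounded in [0, 1]."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    jsd = shannon_entropy(m) - (shannon_entropy(p) + shannon_entropy(q)) / 2
    return math.sqrt(jsd)

# Toy methylation-state distributions for a test and a reference sample.
p_test = [0.7, 0.2, 0.1]
p_ref = [0.1, 0.2, 0.7]
d = js_distance(p_test, p_ref)
```

Because the distance is symmetric and bounded, it gives a comparable dissimilarity score across genomic regions, which is what makes it usable for genome-wide differential analysis.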

  11. Identifying the Safety Factors over Traffic Signs in State Roads using a Panel Quantile Regression Approach.

    PubMed

    Šarić, Željko; Xu, Xuecai; Duan, Li; Babić, Darko

    2018-06-20

    This study investigated the interactions between accident rate and traffic signs on state roads in Croatia, accommodating the heterogeneity attributed to unobserved factors. Data from 130 state roads between 2012 and 2016 were collected from the Traffic Accident Database System maintained by the Republic of Croatia Ministry of the Interior. To address the heterogeneity, a panel quantile regression model was proposed: quantile regression offers a more complete view and a highly comprehensive analysis of the relationship between accident rate and traffic signs, while the panel data model accommodates the heterogeneity attributed to unobserved factors. Results revealed that (1) low visibility increased the rate of accidents involving material damage (MD) and death or injury (DI); (2) the number of mandatory signs and the number of warning signs were more likely to reduce the accident rate; and (3) average speed limit and the number of invalid traffic signs per km were associated with a high accident rate. To our knowledge, this is the first attempt to analyze the interactions between accident consequences and traffic signs with a panel quantile regression model. By involving visibility, the present study demonstrates that low visibility causes a relatively higher risk of MD and DI. It is noteworthy that average speed limit corresponds positively with accident rate, that the numbers of mandatory and warning signs are more likely to reduce it, and that the number of invalid traffic signs per km is significant for accident rate; thus regular maintenance should be kept up for a safer roadway environment.
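
    Quantile regression, the core of the model above, minimizes the asymmetric check (pinball) loss rather than squared error, which is what lets different quantiles of the accident-rate distribution respond differently to the same covariates. A minimal sketch of the loss with made-up data:

```python
def pinball_loss(y_true, y_pred, tau):
    """Average check (pinball) loss minimized by quantile regression at
    quantile level tau: under-predictions cost tau per unit error,
    over-predictions cost (1 - tau)."""
    total = 0.0
    for yt, yp in zip(y_true, y_pred):
        e = yt - yp
        total += tau * e if e >= 0 else (tau - 1) * e
    return total / len(y_true)

# At a high quantile (tau = 0.9), under-prediction is penalized far more
# heavily, so the fit traces the upper tail of the outcome distribution.
y = [1.0, 2.0, 3.0]
low = pinball_loss(y, [0.0, 0.0, 0.0], tau=0.9)   # all under-predictions
high = pinball_loss(y, [4.0, 4.0, 4.0], tau=0.9)  # all over-predictions
```

The panel structure in the paper adds road-level effects on top of this loss; the sketch shows only the quantile component.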

  12. Adaptive Modeling of the International Space Station Electrical Power System

    NASA Technical Reports Server (NTRS)

    Thomas, Justin Ray

    2007-01-01

    Software simulations provide NASA engineers the ability to experiment with spacecraft systems in a computer-imitated environment. Engineers currently develop software models that encapsulate spacecraft system behavior. These models can be inaccurate due to invalid assumptions, erroneous operation, or system evolution. Increasing accuracy requires manual calibration and domain-specific knowledge. This thesis presents a method for automatically learning system models without any assumptions regarding system behavior. Data stream mining techniques are applied to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). We also explore a knowledge fusion approach that uses traditional engineered EPS models to supplement the learned models. We observed that these engineered EPS models provide useful background knowledge to reduce predictive error spikes when confronted with making predictions in situations that are quite different from the training scenarios used when learning the model. Evaluations using ISS sensor data and existing EPS models demonstrate the success of the adaptive approach. Our experimental results show that adaptive modeling provides reductions in model error anywhere from 80% to 96% over these existing models. Final discussions include impending use of adaptive modeling technology for ISS mission operations and the need for adaptive modeling in future NASA lunar and Martian exploration.

  13. A hybrid model for computing nonthermal ion distributions in a long mean-free-path plasma

    NASA Astrophysics Data System (ADS)

    Tang, Xianzhu; McDevitt, Chris; Guo, Zehua; Berk, Herb

    2014-10-01

    Non-thermal ions, especially the suprathermal ones, are known to make a dominant contribution to a number of important physical processes such as the fusion reactivity in controlled fusion, the ion heat flux, and, in the case of a tokamak, the ion bootstrap current. Evaluating the deviation from a local Maxwellian distribution of these non-thermal ions can be a challenging task in the context of a global plasma fluid model that evolves the plasma density, flow, and temperature. Here we describe a hybrid model for coupling such a constrained kinetic calculation to global plasma fluid models. The key ingredient is a non-perturbative treatment of the tail ions where the ion Knudsen number approaches or surpasses order unity. This can be sharply contrasted with the standard Chapman-Enskog approach, which relies on a perturbative treatment that is frequently invalidated. The accuracy of our coupling scheme is controlled by the precise criteria for matching the non-perturbative kinetic model to perturbative solutions in both configuration space and velocity space. Although our specific application examples will be drawn from laboratory controlled fusion experiments, the general approach is applicable to space and astrophysical plasmas as well. Work supported by DOE.
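
    The validity boundary mentioned above is set by the ion Knudsen number, the ratio of the mean free path to the gradient scale length. A trivial sketch with illustrative numbers (the 0.1 threshold is a common rule of thumb, not a figure from the abstract):

```python
def knudsen_number(mean_free_path, gradient_scale_length):
    """Kn = lambda_mfp / L. Chapman-Enskog perturbation theory assumes
    Kn << 1; as Kn approaches or exceeds unity (the regime this work
    targets), a non-perturbative kinetic treatment of the tail ions
    is needed."""
    return mean_free_path / gradient_scale_length

# Illustrative: a 10 m ion mean free path over a 1 m gradient scale length.
kn = knudsen_number(10.0, 1.0)
perturbative_ok = kn < 0.1  # rule-of-thumb validity check
```

In the hybrid model, this kind of check is what decides where the fluid closure hands off to the kinetic tail calculation.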

  14. Time varying prediction of thoughts of death and suicidal ideation in adolescents: weekly ratings over 6-month follow-up.

    PubMed

    Selby, Edward A; Yen, Shirley; Spirito, Anthony

    2013-01-01

    Suicidal ideation (SI) and thoughts of death are often experienced as fluctuating; therefore a dynamic representation of this highly important indicator of suicide risk is warranted. Theoretical accounts have suggested that affective, behavioral, and interpersonal factors may influence the experience of thoughts of death/SI. This study aimed to examine the prospective and dynamic impact of these constructs in relation to thoughts of death and SI. We assessed adolescents with a recent hospitalization for elevated suicide risk over 6 months. Using the methodology of the Longitudinal Interval Follow-Up Evaluation, weekly ratings for SI, course of depressive illness, affect sensitivity, negative affect intensity, behavioral dysregulation, peer invalidation, and family invalidation were obtained. Using multilevel modeling, results indicated that (a) same-week ratings between these constructs and SI were highly correlated at baseline and throughout follow-up; (b) baseline ratings of affect sensitivity, behavioral dysregulation, and peer invalidation were positive prospective predictors of SI at any week of follow-up; (c) weekly ratings of each of these constructs had significant associations with next-week ratings of SI; and (d) ratings of SI had positive significant associations with next-week ratings on each of the constructs. These results suggest that affective sensitivity, behavioral dysregulation, peer invalidation, and SI are highly associated with SI levels both chronically (over months) and acutely (one week to the next), whereas depression, negative affect intensity, and family invalidation were more acutely predictive of SI. Elevated SI may then aggravate all these factors in a reciprocal manner.

  15. Time Varying Prediction of Thoughts of Death and Suicidal Ideation in Adolescents: Weekly Ratings over Six Month Follow-Up

    PubMed Central

    Selby, Edward A.; Yen, Shirley; Spirito, Anthony

    2012-01-01

    Objective Suicidal ideation (SI) and thoughts of death are often experienced as fluctuating; therefore a dynamic representation of this highly important indicator of suicide risk is warranted. Theoretical accounts have suggested that affective, behavioral, and interpersonal factors may influence the experience of thoughts of death/suicidal ideation. This study aimed to examine the prospective and dynamic impact of these constructs in relation to thoughts of death and SI. Method We assessed adolescents with a recent hospitalization for elevated suicide risk over six months. Using the methodology of the Longitudinal Interval Follow-Up Evaluation (LIFE), weekly ratings for SI, course of depressive illness, affect sensitivity, negative affect intensity, behavioral dysregulation, peer invalidation, and family invalidation were obtained. Results Using multilevel modeling, results indicated that: 1) same-week ratings between these constructs and SI were highly correlated at baseline and throughout follow-up; 2) baseline ratings of affect sensitivity, behavioral dysregulation, and peer invalidation were positive prospective predictors of SI at any week of follow-up; 3) weekly ratings of each of these constructs had significant associations with next-week ratings of SI; and 4) ratings of SI had positive significant associations with next-week ratings on each of the constructs. Conclusions These results suggest that affective sensitivity, behavioral dysregulation, peer invalidation, and suicidal ideation are highly associated with SI levels both chronically (over months) and acutely (one week to the next), while depression, negative affect intensity, and family invalidation were more acutely predictive of SI. Elevated SI may then aggravate all these factors in a reciprocal manner. PMID:23148530

  16. Predictors of invalid neuropsychological test performance after traumatic brain injury.

    PubMed

    Moore, Bret A; Donders, Jacobus

    2004-10-01

    To investigate the usefulness of the Test of Memory Malingering (TOMM) and the California Verbal Learning Test-Second Edition (CVLT-II) in assessing invalid test performance after traumatic brain injury (TBI). Consecutive 3-year series of rehabilitation referrals (n = 132). Percentage of participants who failed validity criteria was determined. Hierarchical logistic regression analysis and odds ratios were used to identify predictors of invalid test performance. Twenty patients (15%) performed in the invalid range when held to a priori specified criteria for invalid test performance (i.e. TOMM <45/50 on Trial 2 or CVLT-II <15/16 on Forced-Choice recognition trial). Both psychiatric history and financial compensation seeking were associated with an almost 4-fold increase in likelihood of invalid responding. The TOMM and CVLT-II are sensitive to the potential impact of current financial compensation seeking and prior psychiatric history on neuropsychological test performance after TBI.
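The a priori validity criteria and the roughly 4-fold odds increase reported above can be sketched as follows; the patient scores and the 2x2 counts are hypothetical illustrations, not the study's data.

```python
# Hypothetical illustration of the a priori validity criteria and the odds
# ratio for psychiatric history; scores and 2x2 counts are invented.
def flag_invalid(tomm_trial2, cvlt_forced_choice):
    # Invalid if TOMM Trial 2 < 45/50 or CVLT-II Forced-Choice < 15/16.
    return tomm_trial2 < 45 or cvlt_forced_choice < 15

def odds_ratio(a, b, c, d):
    """2x2 table: a = exposed & invalid, b = exposed & valid,
    c = unexposed & invalid, d = unexposed & valid."""
    return (a * d) / (b * c)

patients = [(50, 16), (44, 16), (49, 14)]  # (TOMM Trial 2, CVLT-II Forced-Choice)
flags = [flag_invalid(t, c) for t, c in patients]
or_psych = odds_ratio(10, 20, 10, 80)  # hypothetical counts giving a 4-fold odds increase
```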

  17. Editorial Decisions May Perpetuate Belief in Invalid Research Findings

    PubMed Central

    Eriksson, Kimmo; Simpson, Brent

    2013-01-01

    Social psychology and related disciplines are seeing a resurgence of interest in replication, as well as actual replication efforts. But prior work suggests that even a clear demonstration that a finding is invalid often fails to shake acceptance of the finding. This threatens the full impact of these replication efforts. Here we show that the actions of two key players – journal editors and the authors of original (invalidated) research findings – are critical to the broader public’s continued belief in an invalidated research conclusion. Across three experiments, we show that belief in an invalidated finding falls sharply when a critical failed replication is published in the same – versus different – journal as the original finding, and when the authors of the original finding acknowledge that the new findings invalidate their conclusions. We conclude by discussing policy implications of our key findings. PMID:24023863

  18. Robust biological parametric mapping: an improved technique for multimodal brain image analysis

    NASA Astrophysics Data System (ADS)

    Yang, Xue; Beason-Held, Lori; Resnick, Susan M.; Landman, Bennett A.

    2011-03-01

    Mapping the quantitative relationship between structure and function in the human brain is an important and challenging problem. Numerous volumetric, surface, region of interest and voxelwise image processing techniques have been developed to statistically assess potential correlations between imaging and non-imaging metrics. Recently, biological parametric mapping has extended the widely popular statistical parametric approach to enable application of the general linear model to multiple image modalities (both for regressors and regressands) along with scalar valued observations. This approach offers great promise for direct, voxelwise assessment of structural and functional relationships with multiple imaging modalities. However, as presented, the biological parametric mapping approach is not robust to outliers and may lead to invalid inferences (e.g., artifactual low p-values) due to slight mis-registration or variation in anatomy between subjects. To enable widespread application of this approach, we introduce robust regression and robust inference in the neuroimaging context of application of the general linear model. Through simulation and empirical studies, we demonstrate that our robust approach reduces sensitivity to outliers without substantial degradation in power. The robust approach and associated software package provides a reliable way to quantitatively assess voxelwise correlations between structural and functional neuroimaging modalities.
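Robust regression of the kind described, in which outlier observations are down-weighted during model fitting, can be sketched with iteratively reweighted least squares using Huber weights. This is a generic one-regressor illustration, not the authors' voxelwise implementation.

```python
# Generic sketch of robust regression via iteratively reweighted least squares
# (IRLS) with Huber weights; illustrative, not the authors' implementation.
def huber_weight(r, k=1.345):
    """Down-weight residuals larger than k scaled units."""
    return 1.0 if abs(r) <= k else k / abs(r)

def robust_fit(x, y, iters=50):
    """Fit y = b0 + b1*x while down-weighting outliers."""
    n = len(x)
    w = [1.0] * n
    b0 = b1 = 0.0
    for _ in range(iters):
        # Weighted least-squares normal equations for intercept and slope.
        sw = sum(w)
        sx = sum(wi * xi for wi, xi in zip(w, x))
        sy = sum(wi * yi for wi, yi in zip(w, y))
        sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
        sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
        det = sw * sxx - sx * sx
        b0 = (sxx * sy - sx * sxy) / det
        b1 = (sw * sxy - sx * sy) / det
        resid = [yi - b0 - b1 * xi for xi, yi in zip(x, y)]
        s = sorted(abs(r) for r in resid)[n // 2] / 0.6745 or 1.0  # MAD-like scale
        w = [huber_weight(r / s) for r in resid]
    return b0, b1

x = list(range(10))
y = [2 * xi + 1 for xi in x]
y[9] = 100  # one gross outlier, e.g. a mis-registered subject
b0, b1 = robust_fit(x, y)  # slope stays close to the true value of 2
```

An ordinary least-squares fit on the same data would be pulled far off the true line by the single outlier; the reweighting confines its influence.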

  19. Prevalence of Invalid Computerized Baseline Neurocognitive Test Results in High School and Collegiate Athletes

    PubMed Central

    Schatz, Philip; Moser, Rosemarie Scolaro; Solomon, Gary S.; Ott, Summer D.; Karpf, Robin

    2012-01-01

    Context: Limited data are available regarding the prevalence and nature of invalid computerized baseline neurocognitive test data. Objective: To identify the prevalence of invalid baselines on the desktop and online versions of ImPACT and to document the utility of correcting for left-right (L-R) confusion on the desktop version of ImPACT. Design: Cross-sectional study of independent samples of high school (HS) and collegiate athletes who completed the desktop or online versions of ImPACT. Participants or Other Participants: A total of 3769 HS (desktop  =  1617, online  =  2152) and 2130 collegiate (desktop  =  742, online  =  1388) athletes completed preseason baseline assessments. Main Outcome Measure(s): Prevalence of 5 ImPACT validity indicators, with correction for L-R confusion (reversing left and right mouse-click responses) on the desktop version, by test version and group. Chi-square analyses were conducted for sex and attentional or learning disorders. Results: At least 1 invalid indicator was present on 11.9% (desktop) versus 6.3% (online) of the HS baselines and 10.2% (desktop) versus 4.1% (online) of collegiate baselines; correcting for L-R confusion (desktop) decreased this overall prevalence to 8.4% (HS) and 7.5% (collegiate). Online Impulse Control scores alone yielded 0.4% (HS) and 0.9% (collegiate) invalid baselines, compared with 9.0% (HS) and 5.4% (collegiate) on the desktop version; correcting for L-R confusion (desktop) decreased the prevalence of invalid Impulse Control scores to 5.4% (HS) and 2.6% (collegiate). Male athletes and HS athletes with attention deficit or learning disorders who took the online version were more likely to have at least 1 invalid indicator. Utility of additional invalidity indicators is reported. Conclusions: The online ImPACT version appeared to yield fewer invalid baseline results than did the desktop version. Identification of L-R confusion reduces the prevalence of invalid baselines (desktop only) and the potency of Impulse Control as a validity indicator. We advise test administrators to be vigilant in identifying invalid baseline results as part of routine concussion management and prevention programs. PMID:22892410

  20. 30 CFR 250.530 - When does my casing pressure request approval become invalid?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... request approval become invalid? A casing pressure request becomes invalid when: (a) The casing or riser... different casing or riser on the same well requires a casing pressure request; or (e) A well has more than...

  1. 30 CFR 250.530 - When does my casing pressure request approval become invalid?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... become invalid? A casing pressure request becomes invalid when: (a) The casing or riser pressure... casing or riser on the same well requires a casing pressure request; or (e) A well has more than one...

  2. 30 CFR 250.531 - When does my casing pressure request approval become invalid?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... request approval become invalid? A casing pressure request becomes invalid when: (a) The casing or riser... different casing or riser on the same well requires a casing pressure request; or (e) A well has more than...

  3. 30 CFR 250.531 - When does my casing pressure request approval become invalid?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... request approval become invalid? A casing pressure request becomes invalid when: (a) The casing or riser... different casing or riser on the same well requires a casing pressure request; or (e) A well has more than...

  4. Medical school libraries' handling of articles that report invalid science.

    PubMed

    Pfeifer, M P; Snodgrass, G L

    1992-02-01

    In 1989-90 the authors conducted a nationwide study to examine how academic medical libraries handled articles that report invalid science and to determine the effectiveness of any policies implemented to limit the use of such articles. Ninety-five of the 127 medical school libraries the authors surveyed completed questionnaires analyzing policy and attitude issues. Eighty-four of these libraries manually reviewed the available copies they held of ten retracted articles. Of the 811 copies of the retracted, invalid articles reviewed, 742 (91.5%) were not tagged as being invalid. Seventy-nine percent of the libraries had tagged none of the retracted studies and only 16% had policies for managing articles that report invalid science. Academic librarians reflected a common attitude against perceived library censorship and emphasized the user's role in assuring validity. The nation's medical libraries, at least in part by intent, do not commonly identify or have policies to handle the invalid articles they hold. The authors conclude that biomedical researchers, clinicians, and teachers should not assume published studies held in libraries are inherently valid. The lack of stated policy and the disparate assumptions about the role libraries play in this area may perpetuate the use of invalid articles.

  5. 5 CFR 1203.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... RULES AND REGULATIONS OF THE OFFICE OF PERSONNEL MANAGEMENT General § 1203.2 Definitions. (a) Invalid... to commit a prohibited personnel practice if any agency implemented the regulation. (b) Invalidly... employee to commit a prohibited personnel practice. A valid regulation may be invalidly implemented. (c...

  6. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Bacon, Diana H.; Fang, Yilin

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  7. An approach to rescheduling activities based on determination of priority and disruptivity

    NASA Technical Reports Server (NTRS)

    Sponsler, Jeffrey L.; Johnston, Mark D.

    1990-01-01

    A constraint-based scheduling system called SPIKE is being used to create long-term schedules for the Hubble Space Telescope. Feedback from the spacecraft or from other ground support systems may invalidate some scheduling decisions, and the activities concerned must be reconsidered. A rescheduling-priority function is defined which, for a given activity, performs a heuristic analysis and produces a relative numerical value used to rank all such activities in the order in which they should be rescheduled. A disruptivity function is also defined, which places a relative numeric value on how much a pre-existing schedule would have to change in order to reschedule an activity. Using these functions, two algorithms (a stochastic neural-network approach and an exhaustive-search approach) are proposed to find the best place to reschedule an activity. Prototypes were implemented, and preliminary testing reveals that the exhaustive technique produces only marginally better results at much greater computational cost.
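The two heuristics described, a rescheduling priority and a disruptivity measure, together with the exhaustive-search placement, can be sketched as follows; the scoring formulas, field names, and data are illustrative assumptions, not SPIKE's actual implementation.

```python
# Illustrative sketch of priority-ranked rescheduling with an exhaustive
# search for the least-disruptive slot; all formulas and data are assumptions.
def rescheduling_priority(activity):
    # Heuristic: high-value, low-slack activities are rescheduled first.
    return activity["value"] / max(activity["slack_days"], 1)

def disruptivity(schedule, slot, duration):
    # How many already-scheduled time units this placement would displace.
    return sum(1 for t in range(slot, slot + duration) if t in schedule)

def best_slot(schedule, duration, horizon):
    # Exhaustive search: evaluate every feasible slot, keep the least disruptive.
    return min(range(horizon - duration + 1),
               key=lambda s: disruptivity(schedule, s, duration))

activities = [{"id": "A", "value": 10, "slack_days": 2},
              {"id": "B", "value": 4, "slack_days": 8}]
ranked = sorted(activities, key=rescheduling_priority, reverse=True)
occupied = {0: "X", 1: "X", 4: "Y"}  # time unit -> already-scheduled activity
slot = best_slot(occupied, duration=2, horizon=6)  # lands on the empty units 2-3
```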

  8. Invalidating childhood environments and core beliefs in women with eating disorders.

    PubMed

    Ford, Gillian; Waller, Glenn; Mountford, Victoria

    2011-01-01

    It can be hypothesised that invalidating environments in childhood influence the negative core beliefs that are found in the eating disorders. This study of eating-disordered women aimed to test the relationships between perceived childhood invalidating environments and negative core beliefs. Forty-one eating-disordered females completed the measures of childhood invalidating experiences and core beliefs. Such core beliefs were most closely related to the individuals' perceptions of having grown up in a 'chaotic' family environment. Future clinical practice should continue to target core beliefs in formulating cases of eating disorders. Explaining those core beliefs may depend on understanding the individual's experiences of invalidation in early years. Copyright © 2010 John Wiley & Sons, Ltd and Eating Disorders Association.

  9. 14 CFR 47.43 - Invalid registration.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Invalid registration. 47.43 Section 47.43 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRCRAFT... knowledge) compliance with 49 U.S.C. 44101-44104. (b) If the registration of an aircraft is invalid under...

  10. 20 CFR 655.209 - Invalidation of temporary labor certifications.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Invalidation of temporary labor... LABOR TEMPORARY EMPLOYMENT OF FOREIGN WORKERS IN THE UNITED STATES Labor Certification Process for Logging Employment and Non-H-2A Agricultural Employment § 655.209 Invalidation of temporary labor...

  11. Traditions, Paradigms and Basic Concepts in Islamic Psychology.

    PubMed

    Skinner, Rasjid

    2018-03-23

    The conceptual tools of psychology aim to explain the complexity of phenomena that psychotherapists observe in their patients and within themselves, as well as to predict the outcome of therapy. Naturally, Muslim psychologists have sought satisfaction in the conceptual tools of their trade and in what has been written in Islamic psychology-notably by Badri (The Dilemma of Muslim Psychologists, MWH London, London, 1979), who critiqued Western psychology from an Islamic perspective, arguing the need to filter out from Western psychology that which is cross-culturally invalid or in conflict with Islamic precepts. In this paper, I advocate an extension of Badri's (1979) approach and present a working model of the self derived from traditional Islamic thought. This model, though rudimentary and incomplete, I believe, makes better sense of my perceptions as a clinician than any other psychological model within my knowledge.

  12. Invalid before impaired: an emerging paradox of embedded validity indicators.

    PubMed

    Erdodi, Laszlo A; Lichtenstein, Jonathan D

    Embedded validity indicators (EVIs) are cost-effective psychometric tools to identify non-credible response sets during neuropsychological testing. As research on EVIs expands, assessors are faced with an emerging contradiction: the range of credible impairment disappears between the 'normal' and 'invalid' ranges of performance. We labeled this phenomenon the invalid-before-impaired paradox. This study was designed to explore the origin of this psychometric anomaly, subject it to empirical investigation, and generate potential solutions. Archival data were analyzed from a mixed clinical sample of 312 patients (mean age = 45.2 years; mean education = 13.6 years) medically referred for neuropsychological assessment. The distribution of scores on eight subtests of the third and fourth editions of the Wechsler Adult Intelligence Scale (WAIS) was examined in relation to the standard normal curve and two performance validity tests (PVTs). Although WAIS subtests varied in their sensitivity to non-credible responding, they were all significant predictors of performance validity. While subtests previously identified as EVIs (Digit Span, Coding, and Symbol Search) were comparably effective at differentiating credible and non-credible response sets, their classification accuracy was driven by their base rate of low scores, requiring different cutoffs to achieve comparable specificity. Invalid performance had a global effect on WAIS scores. Genuine impairment and non-credible performance can co-exist, are often intertwined, and may be psychometrically indistinguishable. A compromise between the alpha and beta bias on PVTs, based on a balanced, objective evaluation of the evidence that requires concessions from both sides, is needed to maintain or restore the credibility of performance validity assessment.
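The base-rate problem described, where subtests need different cutoffs to reach the same specificity, can be illustrated with a small sketch; the score distributions below are hypothetical, not the study's data.

```python
# Hypothetical illustration of cutoff calibration: lower the cutoff until a
# target specificity is reached in the credible group. Score lists are invented.
def cutoff_for_specificity(credible_scores, target=0.90):
    """Highest cutoff c (score <= c flags invalid) keeping specificity >= target."""
    for c in sorted(set(credible_scores), reverse=True):
        # Specificity = proportion of credible patients scoring above the cutoff.
        specificity = sum(s > c for s in credible_scores) / len(credible_scores)
        if specificity >= target:
            return c
    return min(credible_scores) - 1

# Two subtests with different base rates of low scores need different cutoffs:
digit_span_cut = cutoff_for_specificity([4, 5, 5, 6, 6, 7, 7, 8, 9, 10])
coding_cut = cutoff_for_specificity([6, 7, 7, 8, 8, 9, 9, 10, 11, 12])
```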

  13. 32 CFR 538.5 - Conversion of invalidated military payment certificates.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... certificates. (a) When converted. Time limit on filing claims for the conversion of invalidated Series 461, 471... in effects of deceased personnel. Invalidated series of military payment certificates in amounts not... of death or entry into missing status was prior to the date the series of military payment...

  14. 32 CFR 538.5 - Conversion of invalidated military payment certificates.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... certificates. (a) When converted. Time limit on filing claims for the conversion of invalidated Series 461, 471... in effects of deceased personnel. Invalidated series of military payment certificates in amounts not... of death or entry into missing status was prior to the date the series of military payment...

  15. 32 CFR 538.5 - Conversion of invalidated military payment certificates.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... certificates. (a) When converted. Time limit on filing claims for the conversion of invalidated Series 461, 471... in effects of deceased personnel. Invalidated series of military payment certificates in amounts not... of death or entry into missing status was prior to the date the series of military payment...

  16. 32 CFR 538.5 - Conversion of invalidated military payment certificates.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... certificates. (a) When converted. Time limit on filing claims for the conversion of invalidated Series 461, 471... in effects of deceased personnel. Invalidated series of military payment certificates in amounts not... of death or entry into missing status was prior to the date the series of military payment...

  17. 32 CFR 538.5 - Conversion of invalidated military payment certificates.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... certificates. (a) When converted. Time limit on filing claims for the conversion of invalidated Series 461, 471... in effects of deceased personnel. Invalidated series of military payment certificates in amounts not... of death or entry into missing status was prior to the date the series of military payment...

  18. Improved Correction of Misclassification Bias With Bootstrap Imputation.

    PubMed

    van Walraven, Carl

    2018-07-01

    Diagnostic codes used in administrative database research can create bias due to misclassification. Quantitative bias analysis (QBA) can correct for this bias using only code sensitivity and specificity, but it may return invalid results. Bootstrap imputation (BI) can also address misclassification bias but traditionally requires multivariate models to accurately estimate disease probability. This study compared misclassification bias correction using QBA and BI. Serum creatinine measures were used to determine severe renal failure status in 100,000 hospitalized patients. The prevalence of severe renal failure in 86 patient strata and its association with 43 covariates were determined and compared with results in which renal failure status was determined using diagnostic codes (sensitivity 71.3%, specificity 96.2%). Differences in results (misclassification bias) were then corrected with QBA or BI (using progressively more complex methods to estimate disease probability). In total, 7.4% of patients had severe renal failure. Imputing disease status with diagnostic codes exaggerated prevalence estimates [median relative change (range), 16.6% (0.8%-74.5%)] and its association with covariates [median (range) exponentiated absolute parameter estimate difference, 1.16 (1.01-2.04)]. QBA produced invalid results 9.3% of the time and increased bias in estimates of both disease prevalence and covariate associations. BI decreased misclassification bias with increasingly accurate disease probability estimates. QBA can produce invalid results and increase misclassification bias. BI avoids invalid results and can importantly decrease misclassification bias when accurate disease probability estimates are used.
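The standard QBA correction for misclassified prevalence, and the way it can yield out-of-range (invalid) estimates, can be sketched as follows; the observed prevalences are hypothetical, while the sensitivity and specificity match the diagnostic codes described above.

```python
# Sketch of the QBA prevalence correction:
#   corrected = (observed + Sp - 1) / (Se + Sp - 1)
# Observed prevalences here are hypothetical; Se/Sp match the codes above.
def qba_corrected_prevalence(p_obs, sensitivity, specificity):
    p = (p_obs + specificity - 1.0) / (sensitivity + specificity - 1.0)
    return p, 0.0 <= p <= 1.0  # out-of-range estimates are invalid

p_ok, valid = qba_corrected_prevalence(0.10, 0.713, 0.962)
# When observed prevalence falls below the false-positive rate (1 - Sp = 0.038),
# the corrected estimate goes negative, i.e. the correction returns an invalid result:
p_bad, valid_bad = qba_corrected_prevalence(0.02, 0.713, 0.962)
```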

  19. Forgotten but not gone: Retro-cue costs and benefits in a double-cueing paradigm suggest multiple states in visual short-term memory.

    PubMed

    van Moorselaar, Dirk; Olivers, Christian N L; Theeuwes, Jan; Lamme, Victor A F; Sligte, Ilja G

    2015-11-01

    Visual short-term memory (VSTM) performance is enhanced when the to-be-tested item is cued after encoding. This so-called retro-cue benefit is typically accompanied by a cost for the noncued items, suggesting that information is lost from VSTM upon presentation of a retrospective cue. Here we assessed whether noncued items can be restored to VSTM when made relevant again by a subsequent second cue. We presented either 1 or 2 consecutive retro-cues (80% valid) during the retention interval of a change-detection task. Relative to no cue, a valid cue increased VSTM capacity by 2 items, while an invalid cue decreased capacity by 2. Importantly, when a second, valid cue followed an invalid cue, capacity regained 2 items, so that performance was back on par. In addition, when the second cue was also invalid, there was no extra loss of information from VSTM, suggesting that those items that survived a first invalid cue, automatically also survived a second. We conclude that these results are in support of a very versatile VSTM system, in which memoranda adopt different representational states depending on whether they are deemed relevant now, in the future, or not at all. We discuss a neural model that is consistent with this conclusion. (c) 2015 APA, all rights reserved.
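Capacity in change-detection tasks such as this one is commonly estimated with Cowan's K; a minimal sketch, with hypothetical set size, hit rates, and false-alarm rates rather than the study's data.

```python
# Sketch of capacity estimation with Cowan's K = N * (hit rate - false-alarm rate);
# set size and rates below are hypothetical, not the study's data.
def cowans_k(set_size, hit_rate, false_alarm_rate):
    return set_size * (hit_rate - false_alarm_rate)

k_no_cue = cowans_k(8, 0.75, 0.25)       # hypothetical baseline capacity estimate
k_valid_cue = cowans_k(8, 0.875, 0.125)  # a valid retro-cue raises the estimate
```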

  20. Relative Velocity as a Metric for Probability of Collision Calculations

    NASA Technical Reports Server (NTRS)

    Frigm, Ryan Clayton; Rohrbaugh, Dave

    2008-01-01

    Collision risk assessment metrics, such as the probability of collision calculation, are based largely on assumptions about the interaction of two objects during their close approach. Specifically, the approach to probabilistic risk assessment can be performed more easily if the relative trajectories of the two close approach objects are assumed to be linear during the encounter. It is shown in this analysis that one factor in determining linearity is the relative velocity of the two encountering bodies, in that the assumption of linearity breaks down at low relative approach velocities. The first part of this analysis is the determination of the relative velocity threshold below which the assumption of linearity becomes invalid. The second part is a statistical study of conjunction interactions between representative asset spacecraft and the associated debris field environment to determine the likelihood of encountering a low relative velocity close approach. This analysis is performed for both the LEO and GEO orbit regimes. Both parts comment on the resulting effects to collision risk assessment operations.
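The screening idea in the first part of the analysis, accepting the linear relative-motion assumption only above a relative-velocity threshold, can be sketched as follows; the 10 m/s threshold and the size of the conjunction region are illustrative assumptions, not the paper's derived values.

```python
# Hypothetical sketch of the linearity screen: the linear relative-motion
# assumption holds only for brief encounters, i.e. high relative velocity.
# Threshold and region size are illustrative assumptions.
def encounter_duration_s(conjunction_region_m, rel_velocity_ms):
    """Time spent crossing the combined uncertainty region, in seconds."""
    return conjunction_region_m / rel_velocity_ms

def linear_assumption_valid(rel_velocity_ms, threshold_ms=10.0):
    # Below the threshold the encounter is long and relative motion curves.
    return rel_velocity_ms >= threshold_ms

# A typical LEO crossing (km/s scale) passes; a slow drift encounter does not.
fast_ok = linear_assumption_valid(7500.0)
slow_ok = linear_assumption_valid(0.5)
```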

  1. 16 CFR 306.3 - Stayed or invalid parts.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Stayed or invalid parts. 306.3 Section 306.3 Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS AUTOMOTIVE FUEL RATINGS, CERTIFICATION AND POSTING General § 306.3 Stayed or invalid parts. If any part of this rule is...

  2. 16 CFR 306.3 - Stayed or invalid parts.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 1 2012-01-01 2012-01-01 false Stayed or invalid parts. 306.3 Section 306.3 Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS AUTOMOTIVE FUEL RATINGS, CERTIFICATION AND POSTING General § 306.3 Stayed or invalid parts. If any part of this rule is...

  3. 16 CFR 306.3 - Stayed or invalid parts.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Stayed or invalid parts. 306.3 Section 306.3 Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS AUTOMOTIVE FUEL RATINGS, CERTIFICATION AND POSTING General § 306.3 Stayed or invalid parts. If any part of this rule is...

  4. 16 CFR 306.3 - Stayed or invalid parts.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Stayed or invalid parts. 306.3 Section 306.3 Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS AUTOMOTIVE FUEL RATINGS, CERTIFICATION AND POSTING General § 306.3 Stayed or invalid parts. If any part of this rule is...

  5. 16 CFR 306.3 - Stayed or invalid parts.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Stayed or invalid parts. 306.3 Section 306.3 Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS AUTOMOTIVE FUEL RATINGS, CERTIFICATION AND POSTING General § 306.3 Stayed or invalid parts. If any part of this rule is...

  6. Studying Teachers' Mathematical Argumentation in the Context of Refuting Students' Invalid Claims

    ERIC Educational Resources Information Center

    Giannakoulias, Eusthathios; Mastorides, Eleutherios; Potari, Despina; Zachariades, Theodossios

    2010-01-01

    This study investigates teachers' argumentation aiming to convince students about the invalidity of their mathematical claims in the context of calculus. 18 secondary school mathematics teachers were given three hypothetical scenarios of a student's proof that included an invalid algebraic claim. The teachers were asked to identify possible…

  7. 25 CFR 11.603 - Invalid or prohibited marriages.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Invalid or prohibited marriages. 11.603 Section 11.603... LAW AND ORDER CODE Domestic Relations § 11.603 Invalid or prohibited marriages. (a) The following marriages are prohibited: (1) A marriage entered into prior to the dissolution of an earlier marriage of one...

  8. Self-compassion and emotional invalidation mediate the effects of parental indifference on psychopathology.

    PubMed

    Westphal, Maren; Leahy, Robert L; Pala, Andrea Norcini; Wupperman, Peggilee

    2016-08-30

    This study investigated whether self-compassion and emotional invalidation (perceiving others as indifferent to one's emotions) may explain the relationship of childhood exposure to adverse parenting and adult psychopathology in psychiatric outpatients (N=326). Path analysis was used to investigate associations between exposure to adverse parenting (abuse and indifference), self-compassion, emotional invalidation, and mental health when controlling for gender and age. Self-compassion was strongly inversely associated with emotional invalidation, suggesting that a schema that others will be unsympathetic or indifferent toward one's emotions may affect self-compassion and vice versa. Both self-compassion and emotional invalidation mediated the relationship between parental indifference and mental health outcomes. These preliminary findings suggest the potential utility of self-compassion and emotional schemas as transdiagnostic treatment targets. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  9. Updating of Attentional and Premotor Allocation Resources as a function of previous trial outcome

    PubMed Central

    Arjona, Antonio; Escudero, Miguel; Gómez, Carlos M.

    2014-01-01

    The neural bases of the inter-trial validity/invalidity sequential effects in a visuo-auditory modified version of the Central Cue Posner's Paradigm (CCPP) are analyzed by means of the Early Directing Attention Negativity (EDAN), the Contingent Negative Variation (CNV) and the Lateralized Readiness Potential (LRP). ERP results indicated an increase in CNV and LRP in trials preceded by valid trials compared to trials preceded by invalid trials. The CNV and LRP pattern would be highly related to the behavioral pattern of lower RTs and a higher number of anticipations in trials preceded by valid trials with respect to trials preceded by invalid trials. This effect was not preceded by a modulation of the EDAN as a result of the previous trial condition. The results suggest that there is a trial-by-trial dynamic modulation of the attentional system as a function of the validity assigned to the cue, in which conditional probabilities between cue and target are continuously updated. PMID:24681570

  10. A comparison of four embedded validity indices for the RBANS in a memory disorders clinic.

    PubMed

    Paulson, Daniel; Horner, Michael David; Bachman, David

    2015-05-01

    This examination of four embedded validity indices for the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS) explores the potential utility of integrating cognitive and self-reported depressive measures. Examined indices include the proposed RBANS Performance Validity Index (RBANS PVI) and the Charleston Revised Index of Effort for the RBANS (CRIER). The CRIER represented the novel integration of cognitive test performance and depression self-report information. The sample included 234 patients without dementia who could be identified as having demonstrated either valid or invalid responding, based on standardized criteria. Sensitivity and specificity for invalid responding varied widely, with the CRIER emerging as the best all-around index (sensitivity = 0.84, specificity = 0.90, AUC = 0.94). Findings support the use of embedded response validity indices, and suggest that the integration of cognitive and self-report depression data may optimize detection of invalid responding among older Veterans. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Illusory inferences from a disjunction of conditionals: a new mental models account.

    PubMed

    Barrouillet, P; Lecas, J F

    2000-08-14

    Johnson-Laird and Savary (1999, "Illusory inferences: a novel class of erroneous deductions," Cognition, 71, 191-229) have recently presented a mental models account, based on the so-called principle of truth, of the occurrence of inferences that are compelling but invalid. This article presents an alternative account of the illusory inferences resulting from a disjunction of conditionals. In accordance with our modified theory of mental models of the conditional, we show that the way individuals represent conditionals leads them to misinterpret the locus of the disjunction and prevents them from drawing conclusions from a false conditional, thus accounting for the compelling character of the illusory inference.

  12. A study of finite mixture model: Bayesian approach on financial time series data

    NASA Astrophysics Data System (ADS)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-07-01

    Recently, statisticians have emphasized fitting finite mixture models using Bayesian methods. A finite mixture model is a mixture of distributions used to model a statistical distribution, while the Bayesian method is a statistical method used to fit the mixture model. Bayesian methods are widely used because they have asymptotic properties that provide remarkable results. In addition, Bayesian methods also show a consistency characteristic, meaning that the parameter estimates are close to the predictive distributions. In the present paper, the number of components for the mixture model is selected using the Bayesian Information Criterion. Identifying the number of components is important because a wrong choice may lead to an invalid result. The Bayesian method is then utilized to fit the k-component mixture model in order to explore the relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines and Indonesia. The results showed a negative effect between rubber prices and stock market prices for all selected countries.
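The component-selection step via the Bayesian Information Criterion can be sketched as follows; the log-likelihood values and sample size are hypothetical, and the parameter count assumes a one-dimensional Gaussian mixture.

```python
import math

# Sketch of component selection with the Bayesian Information Criterion:
# BIC = p * ln(n) - 2 * lnL; the k with the lowest BIC is chosen.
# Log-likelihoods and n are hypothetical; parameters assume a 1-D Gaussian mixture.
def bic(log_likelihood, n_params, n_obs):
    return n_params * math.log(n_obs) - 2.0 * log_likelihood

def n_params_mixture(k, dim=1):
    # (k - 1) mixing weights + k means + k variances per dimension
    return (k - 1) + k * dim + k * dim

loglik = {1: -1450.0, 2: -1380.0, 3: -1377.0}  # hypothetical fitted models, n = 500
scores = {k: bic(ll, n_params_mixture(k), 500) for k, ll in loglik.items()}
best_k = min(scores, key=scores.get)  # extra components no longer pay for their parameters
```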

  13. Physical Activity in German Adolescents Measured by Accelerometry and Activity Diary: Introducing a Comprehensive Approach for Data Management and Preliminary Results

    PubMed Central

    Pfitzner, Rebecca; Gorzelniak, Lukas; Heinrich, Joachim; von Berg, Andrea; Klümper, Claudia; Bauer, Carl P.; Koletzko, Sibylle; Berdel, Dietrich; Horsch, Alexander; Schulz, Holger

    2013-01-01

    Introduction Surveillance of physical activity (PA) is increasingly based on accelerometry. However, data management guidelines are lacking. We propose an approach for combining accelerometry- and diary-based PA information for the assessment of PA in adolescents and provide an example of this approach using data from German adolescents. Methods The 15-year-old participants comprised a subsample of the GINIplus birth cohort (n = 328, 42.4% male). Data on PA were obtained from hip-worn accelerometers (ActiGraph GT3X) for seven consecutive days, combined with a prospective activity diary. Major aspects of data management were the validity of wear time and the handling of non-wear time and diary comments. After data cleaning, PA and the percentage of adolescents meeting the recommendations for moderate-to-vigorous activity (MVPA) per day were determined. Results Of the 2224 recorded days, 493 (25%) were invalid, mainly due to uncertainties relating to non-wear time (322 days). Ultimately, 269 of 328 subjects (82%) with valid data for at least three weekdays and one weekend day were included in the analysis. Mean MVPA per day was 39.1 minutes (SD ±25.0), with boys being more active than girls (41.8±21.5 minutes vs. 37.1±27.8 minutes, p<0.001). Accordingly, 24.7% of boys and 17.2% of girls (p<0.01) met the WHO recommendations for PA. School sport accounted for only 6% of weekly MVPA. In fact, most MVPA was performed during leisure time, with the majority of adolescents engaging in ball sports (25.4%) and endurance sports (19.7%). Girls also frequently reported dancing and gymnastics (23%). Conclusion For the assessment of PA in adolescents, collecting both accelerometry- and diary-based information is recommended. The diary is vital for the identification of invalid data and non-compliant participants. Preliminary results suggest that four out of five German adolescents do not meet WHO recommendations for PA and that school sport contributes little to MVPA. PMID:23750243
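
    The inclusion rule and WHO check described above can be sketched for a single participant (a minimal illustration; the per-day values are invented, not taken from the study):

    ```python
    # Hypothetical per-day MVPA minutes for one participant, keyed by weekday
    # (0 = Monday .. 6 = Sunday); None marks a day excluded as invalid wear time.
    mvpa_by_day = {0: 45, 1: None, 2: 38, 3: 52, 4: None, 5: 61, 6: 30}

    valid = {d: m for d, m in mvpa_by_day.items() if m is not None}
    weekdays = [m for d, m in valid.items() if d < 5]
    weekend = [m for d, m in valid.items() if d >= 5]

    # Inclusion rule from the study: at least three valid weekdays
    # and at least one valid weekend day.
    included = len(weekdays) >= 3 and len(weekend) >= 1

    mean_mvpa = sum(valid.values()) / len(valid) if included else None
    meets_who = included and mean_mvpa >= 60  # WHO: 60 min MVPA per day
    ```

    This participant clears the inclusion threshold but, like most adolescents in the study, falls short of the 60-minute WHO recommendation.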

  14. The Psychology of Career Theory--A New Perspective?

    ERIC Educational Resources Information Center

    Woodd, Maureen

    2000-01-01

    New perspectives on human behavior have invalidated some assumptions of career theories such as personality type, career stages, and life-cycle models. Other theories, such as Driver's Objective Career Patterns, Schein's Temporal Development Model, and Nicholson's Transition Cycle, are compatible with current psychological understanding. (SK)

  15. Ground-water models: Validate or invalidate

    USGS Publications Warehouse

    Bredehoeft, J.D.; Konikow, Leonard F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification is misleading, at best. These terms should be abandoned by the ground-water community.

  16. Omnidirectional Underwater Camera Design and Calibration

    PubMed Central

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction between the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure a complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707
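
    The refraction that breaks the pinhole model can be illustrated with the vector form of Snell's law, the basic operation in any such ray tracer (a generic sketch under standard optics, not the paper's simulator):

    ```python
    import math

    def refract(d, n_hat, n1, n2):
        """Refract unit direction d at a flat interface with unit normal n_hat
        (pointing back into the incident medium), using the vector form of
        Snell's law. Returns None on total internal reflection."""
        r = n1 / n2
        cos_i = -sum(di * ni for di, ni in zip(d, n_hat))
        sin2_t = r * r * (1.0 - cos_i * cos_i)
        if sin2_t > 1.0:
            return None  # total internal reflection
        cos_t = math.sqrt(1.0 - sin2_t)
        return tuple(r * di + (r * cos_i - cos_t) * ni
                     for di, ni in zip(d, n_hat))

    # Ray travelling straight down from air (n = 1.0) into water (n = 1.33):
    # at normal incidence the direction is unchanged.
    t = refract((0.0, 0.0, -1.0), (0.0, 0.0, 1.0), 1.0, 1.33)
    ```

    At oblique incidence the returned direction bends toward the normal, which is exactly the per-ray effect a flat housing port introduces and the pinhole model cannot absorb.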

  17. On the (In)Validity of Tests of Simple Mediation: Threats and Solutions

    PubMed Central

    Pek, Jolynn; Hoyle, Rick H.

    2015-01-01

    Mediation analysis is a popular framework for identifying underlying mechanisms in social psychology. In the context of simple mediation, we review and discuss the implications of three facets of mediation analysis: (a) conceptualization of the relations between the variables, (b) statistical approaches, and (c) relevant elements of design. We also highlight the issue of equivalent models that are inherent in simple mediation. The extent to which results are meaningful stems directly from choices regarding these three facets of mediation analysis. We conclude by discussing how mediation analysis can be better applied to examine causal processes, highlight the limits of simple mediation, and make recommendations for better practice. PMID:26985234

  18. Blaming the messenger for the bad news about partner violence by women: the methodological, theoretical, and value basis of the purported invalidity of the conflict tactics scales.

    PubMed

    Straus, Murray A

    2012-01-01

    More than 200 studies have found "gender symmetry" in perpetration of violence against a marital or dating partner in the sense that about the same percentage of women as men physically assault a marital or dating partner. Most of these studies obtained the data using the Conflict Tactics Scales (CTS). However, these results have been challenged by numerous articles in the past 25 years that have asserted that the CTS is invalid. This article identifies and responds to 11 purported methodological problems of the CTS, and two other bases for the belief that the CTS is not valid. The discussion argues that the repeated assertion over the past 25 years that the CTS is invalid is not primarily about methodology. Rather, it is primarily about theories and values concerning the results of research showing gender symmetry in perpetration. According to the prevailing "patriarchal dominance" theory, these results cannot be true and therefore the CTS must be invalid. The conclusion suggests that an essential part of the effort to prevent and treat violence against women and by women requires taking into account the dyadic nature of partner violence through use of instruments such as the CTS that measure violence by both partners. Copyright © 2012 John Wiley & Sons, Ltd.

  19. Human judgment vs. quantitative models for the management of ecological resources.

    PubMed

    Holden, Matthew H; Ellner, Stephen P

    2016-07-01

    Despite major advances in quantitative approaches to natural resource management, there has been resistance to using these tools in the actual practice of managing ecological populations. Given a managed system and a set of assumptions, translated into a model, optimization methods can be used to solve for the most cost-effective management actions. However, when the underlying assumptions are not met, such methods can potentially lead to decisions that harm the environment and economy. Managers who develop decisions based on past experience and judgment, without the aid of mathematical models, can potentially learn about the system and develop flexible management strategies. However, these strategies are often based on subjective criteria and equally invalid and often unstated assumptions. Given the drawbacks of both methods, it is unclear whether simple quantitative models improve environmental decision making over expert opinion. In this study, we explore how well students, using their experience and judgment, manage simulated fishery populations in an online computer game and compare their management outcomes to the performance of model-based decisions. We consider harvest decisions generated using four different quantitative models: (1) the model used to produce the simulated population dynamics observed in the game, with the values of all parameters known (as a control), (2) the same model, but with unknown parameter values that must be estimated during the game from observed data, (3) models that are structurally different from those used to simulate the population dynamics, and (4) a model that ignores age structure. Humans on average performed much worse than the models in cases 1-3, but in a small minority of scenarios, models produced worse outcomes than those resulting from students making decisions based on experience and judgment. When the models ignored age structure, they generated poorly performing management decisions, but still outperformed students using experience and judgment 66% of the time. © 2016 by the Ecological Society of America.

  20. An "Infusion" Approach to Critical Thinking: Moore on the Critical Thinking Debate

    ERIC Educational Resources Information Center

    Davies, W. Martin

    2006-01-01

    This paper argues that general skills and the varieties of subject-specific discourse are both important for teaching, learning and practising critical thinking. The former is important because it outlines the principles of good reasoning "simpliciter" (what constitutes sound reasoning patterns, invalid inferences, and so on). The latter is…

  1. Using EEG and stimulus context to probe the modelling of auditory-visual speech.

    PubMed

    Paris, Tim; Kim, Jeesun; Davis, Chris

    2016-02-01

    We investigated whether internal models of the relationship between lip movements and corresponding speech sounds [Auditory-Visual (AV) speech] could be updated via experience. AV associations were indexed by early and late event related potentials (ERPs) and by oscillatory power and phase locking. Different AV experience was produced via a context manipulation. Participants were presented with valid (the conventional pairing) and invalid AV speech items in either a 'reliable' context (80% AVvalid items) or an 'unreliable' context (80% AVinvalid items). The results showed that for the reliable context, there was N1 facilitation for AV compared to auditory only speech. This N1 facilitation was not affected by AV validity. Later ERPs showed a difference in amplitude between valid and invalid AV speech, and there was significant enhancement of power for valid versus invalid AV speech. These response patterns did not change over the context manipulation, suggesting that the internal models of AV speech were not updated by experience. The results also showed that the facilitation of N1 responses did not vary as a function of the salience of visual speech (as previously reported); in post-hoc analyses, it appeared instead that N1 facilitation varied according to the relative time of the acoustic onset, suggesting that for AV events the N1 may be more sensitive to the relative timing of the auditory and visual signals than to their form. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  2. The impact of pediatric neuropsychological consultation in mild traumatic brain injury: a model for providing feedback after invalid performance.

    PubMed

    Connery, Amy K; Peterson, Robin L; Baker, David A; Kirkwood, Michael W

    2016-05-01

    In recent years, pediatric practitioners have increasingly recognized the importance of objectively measuring performance validity during clinical assessments. Yet, no studies have examined the impact of neuropsychological consultation when invalid performance has been identified in pediatric populations and little published guidance exists for clinical management. Here we provide a conceptual model for providing feedback after noncredible performance has been detected. In a pilot study, we examine caregiver satisfaction and postconcussive symptoms following provision of this feedback for patients seen through our concussion program. Participants (N = 70) were 8-17-year-olds with a history of mild traumatic brain injury who underwent an abbreviated neuropsychological evaluation between 2 and 12 months post-injury. We examined postconcussive symptom reduction and caregiver satisfaction after neuropsychological evaluation between groups of patients who were determined to have provided noncredible effort (n = 9) and those for whom no validity concerns were present (n = 61). We found similarly high levels of caregiver satisfaction between groups and greater reduction in self-reported symptoms after feedback was provided using the model with children with noncredible presentations compared to those with credible presentations. The current study lends preliminary support to the idea that the identification and communication of invalid performance can be a beneficial clinical intervention that promotes high levels of caregiver satisfaction and a reduction in self-reported and caregiver-reported symptoms.

  3. Analysis of image formation in optical coherence elastography using a multiphysics approach

    PubMed Central

    Chin, Lixin; Curatolo, Andrea; Kennedy, Brendan F.; Doyle, Barry J.; Munro, Peter R. T.; McLaughlin, Robert A.; Sampson, David D.

    2014-01-01

    Image formation in optical coherence elastography (OCE) results from a combination of two processes: the mechanical deformation imparted to the sample and the detection of the resulting displacement using optical coherence tomography (OCT). We present a multiphysics model of these processes, validated by simulating strain elastograms acquired using phase-sensitive compression OCE, and demonstrating close correspondence with experimental results. Using the model, we present evidence that the approximation commonly used to infer sample displacement in phase-sensitive OCE is invalidated for smaller deformations than has been previously considered, significantly affecting the measurement precision, as quantified by the displacement sensitivity and the elastogram signal-to-noise ratio. We show how the precision of OCE is affected not only by OCT shot-noise, as is usually considered, but additionally by phase decorrelation due to the sample deformation. This multiphysics model provides a general framework that could be used to compare and contrast different OCE techniques. PMID:25401007

  4. Genetic and Pharmacological Inhibition of TREM-1 Limits the Development of Experimental Atherosclerosis.

    PubMed

    Joffre, Jeremie; Potteaux, Stephane; Zeboudj, Lynda; Loyer, Xavier; Boufenzer, Amir; Laurans, Ludivine; Esposito, Bruno; Vandestienne, Marie; de Jager, Saskia C A; Hénique, Carole; Zlatanova, Ivana; Taleb, Soraya; Bruneval, Patrick; Tedgui, Alain; Mallat, Ziad; Gibot, Sebastien; Ait-Oufella, Hafid

    2016-12-27

    Innate immune responses activated through myeloid cells contribute to the initiation, progression, and complications of atherosclerosis in experimental models. However, the critical upstream pathways that link innate immune activation to foam cell formation are still poorly identified. This study sought to investigate the hypothesis that activation of the triggering receptor expressed on myeloid cells (TREM-1) plays a determinant role in macrophage atherogenic responses. After genetically invalidating Trem-1 in chimeric Ldlr -/- Trem-1 -/- mice and double knockout ApoE -/- Trem-1 -/- mice, we pharmacologically inhibited Trem-1 using LR12 peptide. Ldlr -/- mice reconstituted with bone marrow deficient for Trem-1 (Trem-1 -/- ) showed a strong reduction of atherosclerotic plaque size in both the aortic sinus and the thoracoabdominal aorta, and were less inflammatory compared to plaques of Trem-1 +/+ chimeric mice. Genetic invalidation of Trem-1 led to alteration of monocyte recruitment into atherosclerotic lesions and inhibited toll-like receptor 4 (TLR4)-initiated proinflammatory macrophage responses. We identified a critical role for Trem-1 in the upregulation of cluster of differentiation 36 (CD36), thereby promoting the formation of inflammatory foam cells. Genetic invalidation of Trem-1 in ApoE -/- /Trem-1 -/- mice or pharmacological blockade of Trem-1 in ApoE -/- mice using LR-12 peptide also significantly reduced the development of atherosclerosis throughout the vascular tree, and lessened plaque inflammation. TREM-1 was expressed in human atherosclerotic lesions, mainly in lipid-rich areas with significantly higher levels of expression in atheromatous than in fibrous plaques. We identified TREM-1 as a major upstream proatherogenic receptor. We propose that TREM-1 activation orchestrates monocyte/macrophage proinflammatory responses and foam cell formation through coordinated and combined activation of CD36 and TLR4. Blockade of TREM-1 signaling may constitute an attractive novel and double-hit approach for the treatment of atherosclerosis. Copyright © 2016 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  5. Development and necessary norms of reasoning

    PubMed Central

    Markovits, Henry

    2014-01-01

    The question of whether reasoning can, or should, be described by a single normative model is an important one. In the following, I combine epistemological considerations taken from Piaget’s notion of genetic epistemology, a hypothesis about the role of reasoning in communication and developmental data to argue that some basic logical principles are in fact highly normative. I argue here that explicit, analytic human reasoning, in contrast to intuitive reasoning, uniformly relies on a form of validity that allows distinguishing between valid and invalid arguments based on the existence of counterexamples to conclusions. PMID:24904501
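
    The counterexample-based notion of validity described above can be made concrete with a brute-force truth-table search (a generic sketch; `implies` and `is_valid` are illustrative helpers, not from the article):

    ```python
    from itertools import product

    def implies(p, q):
        """Material conditional: 'if p then q' is false only when p and not q."""
        return (not p) or q

    def is_valid(premises, conclusion, n_vars):
        """An argument is valid iff no truth assignment makes every premise
        true while the conclusion is false (i.e. there is no counterexample)."""
        for vals in product([True, False], repeat=n_vars):
            if all(prem(*vals) for prem in premises) and not conclusion(*vals):
                return False  # found a counterexample
        return True

    # Modus ponens (valid): from "if p then q" and "p", infer "q".
    mp_valid = is_valid([lambda p, q: implies(p, q), lambda p, q: p],
                        lambda p, q: q, 2)

    # Affirming the consequent (invalid): from "if p then q" and "q", infer "p".
    ac_valid = is_valid([lambda p, q: implies(p, q), lambda p, q: q],
                        lambda p, q: p, 2)
    ```

    The second argument fails because the assignment p = False, q = True makes both premises true and the conclusion false, which is precisely the counterexample criterion the abstract appeals to.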

  6. Comment on "Scaling regimes and linear/nonlinear responses of last millennium climate to volcanic and solar forcing" by S. Lovejoy and C. Varotsos (2016)

    NASA Astrophysics Data System (ADS)

    Rypdal, Kristoffer; Rypdal, Martin

    2016-07-01

    Lovejoy and Varotsos (2016) (L&V) analyse the temperature response to solar, volcanic, and solar plus volcanic forcing in the Zebiak-Cane (ZC) model, and to solar and solar plus volcanic forcing in the Goddard Institute for Space Studies (GISS) E2-R model. By using a simple wavelet filtering technique they conclude that the responses in the ZC model combine subadditively on timescales from 50 to 1000 years. Nonlinear response on shorter timescales is claimed by analysis of intermittencies in the forcing and the temperature signal for both models. The analysis of additivity in the ZC model suffers from a confusing presentation of results based on an invalid approximation, and from ignoring the effect of internal variability. We present tests without this approximation which are not able to detect nonlinearity in the response, even without accounting for internal variability. We also demonstrate that internal variability will appear as subadditivity if it is not accounted for. L&V's analysis of intermittencies is based on a mathematical result stating that the intermittencies of forcing and response are the same if the response is linear. We argue that there are at least three different factors that may invalidate the application of this result for these data. It is valid only for a power-law response function; it assumes power-law scaling of structure functions of forcing as well as temperature signal; and the internal variability, which is strong at least on the short timescales, will exert an influence on temperature intermittency which is independent of the forcing. We demonstrate by a synthetic example that the differences in intermittencies observed by L&V can easily be accounted for by these effects under the assumption of a linear response. Our conclusion is that the analysis performed by L&V does not present valid evidence for a detectable nonlinear response in the global temperature in these climate models.

  7. FAST TRACK COMMUNICATION The Bel-Robinson tensor for topologically massive gravity

    NASA Astrophysics Data System (ADS)

    Deser, S.; Franklin, J.

    2011-02-01

    We construct, and establish the (covariant) conservation of, a 4-index 'super stress tensor' for topologically massive gravity. Separately, we discuss its invalidity in quadratic curvature models and suggest a generalization.

  8. A Semi-Supervised Learning Approach to Enhance Health Care Community–Based Question Answering: A Case Study in Alcoholism

    PubMed Central

    Klabjan, Diego; Jonnalagadda, Siddhartha Reddy

    2016-01-01

    Background Community-based question answering (CQA) sites play an important role in addressing health information needs. However, a significant number of posted questions remain unanswered. Automatically answering the posted questions can provide a useful source of information for Web-based health communities. Objective In this study, we developed an algorithm to automatically answer health-related questions based on past questions and answers (QA). We also aimed to understand what information embedded within Web-based health content provides good features for identifying valid answers. Methods Our proposed algorithm uses information retrieval techniques to identify candidate answers from resolved QA. To rank these candidates, we implemented a semi-supervised learning algorithm that extracts the best answer to a question. We assessed this approach on a curated corpus from Yahoo! Answers and compared it against a rule-based string similarity baseline. Results On our dataset, the semi-supervised learning algorithm achieved an accuracy of 86.2%. Unified Medical Language System-based (health-related) features used in the model enhanced the algorithm's performance by approximately 8%. A reasonably high rate of accuracy is obtained given that the data are considerably noisy. Important features distinguishing a valid answer from an invalid answer include text length, the number of stop words contained in a test question, the distance between the test question and other questions in the corpus, and the number of overlapping health-related terms between questions. Conclusions Overall, our automated QA system based on historical QA pairs is shown to be effective according to the dataset in this case study. It was developed for general use in the health care domain and can also be applied to other CQA sites. PMID:27485666
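
    The surface features listed in the Results can be sketched as a toy extractor (all word lists, names, and the distance proxy below are illustrative assumptions, not the study's implementation):

    ```python
    # Hypothetical stand-ins for the study's feature inputs.
    STOP_WORDS = {"a", "an", "the", "is", "are", "of", "to", "and", "in", "what"}
    HEALTH_TERMS = {"alcohol", "liver", "withdrawal", "dependence", "tremor"}

    def features(question, answer, corpus_questions):
        q_tokens = question.lower().split()
        a_tokens = answer.lower().split()

        def overlap(s1, s2):
            return len(set(s1) & set(s2))

        return {
            "answer_length": len(a_tokens),
            "question_stop_words": sum(t in STOP_WORDS for t in q_tokens),
            # Crude proxy for question-to-corpus distance:
            # 1 minus the best token-overlap ratio over the corpus.
            "corpus_distance": 1.0 - max(
                (overlap(q_tokens, c.lower().split()) / max(len(q_tokens), 1)
                 for c in corpus_questions), default=1.0),
            "health_term_overlap": overlap(
                [t for t in q_tokens if t in HEALTH_TERMS],
                [t for t in a_tokens if t in HEALTH_TERMS]),
        }

    f = features("what are symptoms of alcohol withdrawal",
                 "common alcohol withdrawal symptoms include tremor",
                 ["what causes alcohol dependence"])
    ```

    A ranker would then score each candidate answer from such a feature vector; the health-term overlap feature plays the role of the UMLS-based features credited with the roughly 8% gain.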

  9. Distinguishing Valid from Invalid Causal Indicator Models

    ERIC Educational Resources Information Center

    Cadogan, John W.; Lee, Nick

    2016-01-01

    In this commentary from Issue 14, n3, authors John Cadogan and Nick Lee applaud the paper by Aguirre-Urreta, Rönkkö, and Marakas "Measurement: Interdisciplinary Research and Perspectives", 14(3), 75-97 (2016), since their explanations and simulations work toward demystifying causal indicator models, which are often used by scholars…

  10. Modeling the Atmospheric Phase Effects of a Digital Antenna Array Communications System

    NASA Technical Reports Server (NTRS)

    Tkacenko, A.

    2006-01-01

    In an antenna array system such as that used in the Deep Space Network (DSN) for satellite communication, it is often necessary to account for the effects due to the atmosphere. Typically, the atmosphere induces amplitude and phase fluctuations on the transmitted downlink signal that invalidate the assumed stationarity of the signal model. The degree to which these perturbations affect the stationarity of the model depends both on parameters of the atmosphere, including wind speed and turbulence strength, and on parameters of the communication system, such as the sampling rate used. In this article, we focus on modeling the atmospheric phase fluctuations in a digital antenna array communications system. Based on a continuous-time statistical model for the atmospheric phase effects, we show how to obtain a related discrete-time model based on sampling the continuous-time process. The effects of the nonstationarity of the resulting signal model are investigated using the sample matrix inversion (SMI) algorithm for minimum mean-squared error (MMSE) equalization of the received signal.
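
    One standard way to obtain a discrete-time model by sampling a continuous-time random process (a generic sketch, not necessarily the article's model) is the exact AR(1) discretization of an Ornstein-Uhlenbeck phase process:

    ```python
    import math
    import random

    def sample_phase(n, tau, sigma, dt, seed=0):
        """Sample an Ornstein-Uhlenbeck phase process with correlation time
        tau and stationary std sigma at interval dt. Sampling such a process
        yields an exact discrete-time AR(1) recursion:
            phi[k+1] = a * phi[k] + w[k],  a = exp(-dt / tau)."""
        rng = random.Random(seed)
        a = math.exp(-dt / tau)                 # per-sample correlation
        w_std = sigma * math.sqrt(1.0 - a * a)  # keeps variance at sigma**2
        phi = [rng.gauss(0.0, sigma)]
        for _ in range(n - 1):
            phi.append(a * phi[-1] + rng.gauss(0.0, w_std))
        return phi

    # A higher sampling rate (dt small relative to tau) makes consecutive
    # phase samples strongly correlated, so the signal looks closer to
    # stationary over any fixed block of samples.
    phi = sample_phase(1000, tau=1.0, sigma=0.5, dt=0.01)
    ```

    This illustrates the dependence the abstract notes: how nonstationary the sampled phase appears depends jointly on the atmospheric correlation time and the system's sampling rate.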

  11. Securing Valid Information for Evaluation of Job Performance of the University Faculty.

    ERIC Educational Resources Information Center

    Donavan, Bruce

    Approaches to obtaining valid information for evaluating faculty and the issue of alcoholism and job performance are addressed. Among the complications to this undertaking is the existence of an invalid self-perception on the part of faculty that they are not employees of the institution, and a tolerance among faculty for deviance or eccentricity.…

  12. Educating Jurors about Forensic Evidence: Using an Expert Witness and Judicial Instructions to Mitigate the Impact of Invalid Forensic Science Testimony.

    PubMed

    Eastwood, Joseph; Caldwell, Jiana

    2015-11-01

    Invalid expert witness testimony that overstated the precision and accuracy of forensic science procedures has been highlighted as a common factor in many wrongful conviction cases. This study assessed the ability of an opposing expert witness and judicial instructions to mitigate the impact of invalid forensic science testimony. Participants (N = 155) acted as mock jurors in a sexual assault trial that contained both invalid forensic testimony regarding hair comparison evidence, and countering testimony from either a defense expert witness or judicial instructions. Results showed that the defense expert witness was successful in educating jurors regarding limitations in the initial expert's conclusions, leading to a greater number of not-guilty verdicts. The judicial instructions were shown to have no impact on verdict decisions. These findings suggest that providing opposing expert witnesses may be an effective safeguard against invalid forensic testimony in criminal trials. © 2015 American Academy of Forensic Sciences.

  13. Moral Dilemmas and Existential Issues Encountered Both in Psychotherapy and Philosophical Counseling Practices.

    PubMed

    Popescu, Beatrice A

    2015-08-01

    This paper stems from clinical observations and empirical data collected in the therapy room over six years. It investigates the relationship between psychotherapy and philosophical counseling, proposing an integrative model of counseling. During cognitive behavior therapy sessions with clients who turn to therapy in order to solve their clinical issues, the author noticed that behind most of the invalidating symptoms classified by the DSM-5 as depression, anxiety, and hypochondriac and phobic complaints there usually lies a lack of existential meaning or scope, and that clients are also tormented by moral dilemmas. Following the anamnestic interview and the psychological evaluation, the depression or anxiety diagnosed on Axis I is rarely just a sum of invalidating symptoms that may disappear if treated symptomatically. When applying the Sentence Completion Test, an 80-item test of psychodynamic origin with high face validity, most clients report an entire plethora of conscious or unconscious motivations, distorted cognitions or irrational thinking, but also grave existential themes such as the scope or meaning of life, professional identity, fear of death, solitude and loneliness, freedom of choice, and liberty. The same issues are approached in philosophical counseling practice, but no systematic research has yet been done in the field. Future research and investigation are needed to assess the importance of moral dilemmas and existential issues in both practices.

  14. Moral Dilemmas and Existential Issues Encountered Both in Psychotherapy and Philosophical Counseling Practices

    PubMed Central

    Popescu, Beatrice A.

    2015-01-01

    This paper stems from clinical observations and empirical data collected in the therapy room over six years. It investigates the relationship between psychotherapy and philosophical counseling, proposing an integrative model of counseling. During cognitive behavior therapy sessions with clients who turn to therapy in order to solve their clinical issues, the author noticed that behind most of the invalidating symptoms classified by the DSM-5 as depression, anxiety, and hypochondriac and phobic complaints there usually lies a lack of existential meaning or scope, and that clients are also tormented by moral dilemmas. Following the anamnestic interview and the psychological evaluation, the depression or anxiety diagnosed on Axis I is rarely just a sum of invalidating symptoms that may disappear if treated symptomatically. When applying the Sentence Completion Test, an 80-item test of psychodynamic origin with high face validity, most clients report an entire plethora of conscious or unconscious motivations, distorted cognitions or irrational thinking, but also grave existential themes such as the scope or meaning of life, professional identity, fear of death, solitude and loneliness, freedom of choice, and liberty. The same issues are approached in philosophical counseling practice, but no systematic research has yet been done in the field. Future research and investigation are needed to assess the importance of moral dilemmas and existential issues in both practices. PMID:27247674

  15. Field‐readable alphanumeric flags are valuable markers for shorebirds: use of double‐marking to identify cases of misidentification

    USGS Publications Warehouse

    Roche, Erin A.; Dovichin, Colin M.; Arnold, Todd W.

    2014-01-01

    Implicit assumptions for most mark-recapture studies are that individuals do not lose their markers and all observed markers are correctly recorded. If these assumptions are violated, e.g., due to loss or extreme wear of markers, estimates of population size and vital rates will be biased. Double-marking experiments have been widely used to estimate rates of marker loss and adjust for associated bias, and we extended this approach to estimate rates of recording errors. We double-marked 309 Piping Plovers (Charadrius melodus) with unique combinations of color bands and alphanumeric flags and used multi-state mark-recapture models to estimate the frequency with which plovers were misidentified. Observers were twice as likely to read and report an invalid color-band combination (2.4% of the time) as an invalid alphanumeric code (1.0%). Observers failed to read matching band combinations or alphanumeric flag codes 4.5% of the time. Unlike previous band resighting studies, use of two resightable markers allowed us to identify when resighting errors resulted in reports of combinations or codes that were valid, but still incorrect; our results suggest this may be a largely unappreciated problem in mark-resight studies. Field-readable alphanumeric flags offer a promising auxiliary marker for identifying and potentially adjusting for false-positive resighting errors that may otherwise bias demographic estimates.
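
    The core logic of using two markers to expose reading errors can be sketched as follows (a simple tally, not the paper's multi-state models; all band combinations and flag codes are invented for illustration):

    ```python
    # Hypothetical resighting reports for double-marked birds: each report is
    # (true_band, true_flag, reported_band, reported_flag). A mismatch between
    # the two markers within one report flags a reading error that a
    # single-marker study could never detect.
    reports = [
        ("RY/GB", "A12", "RY/GB", "A12"),  # both markers read correctly
        ("RY/GB", "A12", "RY/GG", "A12"),  # colour bands misread
        ("WB/RY", "C07", "WB/RY", "C01"),  # flag code misread
        ("WB/RY", "C07", "WB/RY", "C07"),
    ]

    band_errors = sum(r[0] != r[2] for r in reports)
    flag_errors = sum(r[1] != r[3] for r in reports)
    band_error_rate = band_errors / len(reports)
    flag_error_rate = flag_errors / len(reports)
    ```

    The study's finding maps onto this tally: across real reports the band error rate (2.4%) was roughly twice the flag error rate (1.0%), favoring alphanumeric flags as the more reliably read marker.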

  16. Localization of determinants of fertility through measurement adaptations in developing-country settings: The case of Iran: Comment on "Analysis of economic determinants of fertility in Iran: a multilevel approach".

    PubMed

    Erfani, Amir

    2014-12-01

    Studies investigating fertility decline in developing countries often adopt measures of determinants of fertility behavior developed from observations in developed countries, without adapting them to the realities of the study setting. As a result, their findings are usually invalid, anomalous or statistically non-significant. This commentary draws on the research article by Moeeni and colleagues as an example of work that has not adapted measures of two key economic determinants of fertility behavior, namely gender inequality and the opportunity costs of childbearing, to the realities of Iran's economy. Measurement adaptations that can improve the study are discussed.

  17. Anharmonic quantum mechanical systems do not feature phase space trajectories

    NASA Astrophysics Data System (ADS)

    Oliva, Maxime; Kakofengitis, Dimitris; Steuernagel, Ole

    2018-07-01

    Phase space dynamics in classical mechanics is described by transport along trajectories. Anharmonic quantum mechanical systems do not allow for a trajectory-based description of their phase space dynamics. This invalidates some approaches to quantum phase space studies. We first demonstrate the absence of trajectories in general terms. We then give an explicit proof for all quantum phase space distributions with negative values: we show that the generation of coherences in anharmonic quantum mechanical systems is responsible for the occurrence of singularities in their phase space velocity fields, and vice versa. This explains numerical problems repeatedly reported in the literature, and provides deeper insight into the nature of quantum phase space dynamics.

  18. Intuitive Logic Revisited: New Data and a Bayesian Mixed Model Meta-Analysis

    PubMed Central

    Singmann, Henrik; Klauer, Karl Christoph; Kellen, David

    2014-01-01

    Recent research on syllogistic reasoning suggests that the logical status (valid vs. invalid) of even difficult syllogisms can be intuitively detected via differences in conceptual fluency between logically valid and invalid syllogisms when participants are asked to rate how much they like a conclusion following from a syllogism (Morsanyi & Handley, 2012). These claims of an intuitive logic are at odds with most theories on syllogistic reasoning, which posit that detecting the logical status of difficult syllogisms requires effortful and deliberate cognitive processes. We present new data replicating the effects reported by Morsanyi and Handley, but show that this effect is eliminated when controlling for a possible confound in terms of conclusion content. Additionally, we reanalyze three studies without this confound with a Bayesian mixed model meta-analysis (i.e., controlling for participant and item effects), which provides evidence for the null hypothesis and against Morsanyi and Handley's claim. PMID:24755777

  19. Manual editing of automatically recorded data in an anesthesia information management system.

    PubMed

    Wax, David B; Beilin, Yaakov; Hossain, Sabera; Lin, Hung-Mo; Reich, David L

    2008-11-01

    Anesthesia information management systems allow automatic recording of physiologic and anesthetic data, which clinicians may subsequently edit or invalidate. The authors investigated the prevalence of such data modification in an academic medical center. The authors queried their anesthesia information management system database of anesthetics performed in 2006 and tabulated the counts of data points for automatically recorded physiologic and anesthetic parameters as well as the subset of those data that were manually invalidated by clinicians (both with and without alternate values manually appended). Patient, practitioner, data source, and timing characteristics of recorded values were also extracted to determine their associations with editing of various parameters in the anesthesia information management system record. A total of 29,491 cases were analyzed, 19% of which had one or more data points manually invalidated. Among 58 attending anesthesiologists, each invalidated data in a median of 7% of their cases when working as a sole practitioner. A minority of invalidated values were manually appended with alternate values. Pulse rate, blood pressure, and pulse oximetry were the most commonly invalidated parameters. Data invalidation usually resulted in a decrease in parameter variance. Factors independently associated with invalidation included extreme physiologic values, American Society of Anesthesiologists physical status classification, emergency status, timing (phase of the procedure/anesthetic), presence of an intraarterial catheter, resident or certified registered nurse anesthetist involvement, and procedure duration. Editing of physiologic data automatically recorded in an anesthesia information management system is a common practice and results in decreased variability of intraoperative data. Further investigation may clarify the reasons for and consequences of this behavior.

  20. Assessment of First- and Second-Order Wave-Excitation Load Models for Cylindrical Substructures: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pereyra, Brandon; Wendt, Fabian; Robertson, Amy

    2017-03-09

    The hydrodynamic loads on an offshore wind turbine's support structure present unique engineering challenges for offshore wind. Two typical approaches used for modeling these hydrodynamic loads are potential flow (PF) and strip theory (ST), the latter via Morison's equation. This study examines the first- and second-order wave-excitation surge forces on a fixed cylinder in regular waves computed by the PF and ST approaches to (1) verify their numerical implementations in HydroDyn and (2) understand when the ST approach breaks down. The numerical implementation of PF and ST in HydroDyn, a hydrodynamic time-domain solver implemented as a module in the FAST wind turbine engineering tool, was verified by showing the consistency in the first- and second-order force output between the two methods across a range of wave frequencies. ST is known to be invalid at high frequencies, and this study investigates where the ST solution diverges from the PF solution. Regular waves across a range of frequencies were run in HydroDyn for a monopile substructure. As expected, the solutions for the first-order (linear) wave-excitation loads resulting from these regular waves are similar for PF and ST when the diameter of the cylinder is small compared to the length of the waves (generally when the diameter-to-wavelength ratio is less than 0.2). The same finding applies to the solutions for second-order wave-excitation loads, but for much smaller diameter-to-wavelength ratios (based on wavelengths of first-order waves).
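
The diameter-to-wavelength criterion in this abstract is easy to check in practice. A minimal sketch, assuming deep-water linear dispersion (lambda = g*T^2/(2*pi)) and a hypothetical 6 m monopile; the 0.2 threshold is the rule of thumb quoted above:

```python
import math

# Strip theory (Morison) is commonly trusted while D/lambda < ~0.2;
# above that, diffraction effects make potential flow necessary.

def diameter_to_wavelength(diameter_m, wave_period_s, g=9.81):
    """Deep-water linear dispersion: lambda = g*T^2 / (2*pi)."""
    wavelength = g * wave_period_s ** 2 / (2.0 * math.pi)
    return diameter_m / wavelength

D = 6.0  # monopile diameter in metres (hypothetical)
for T in (12.0, 8.0, 4.0):
    r = diameter_to_wavelength(D, T)
    regime = "strip theory OK" if r < 0.2 else "diffraction matters"
    print(f"T={T:4.1f} s  D/lambda={r:.3f}  {regime}")
```

Long swell easily satisfies the criterion, while short wind waves on the same monopile do not, which is where the ST and PF solutions diverge.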

  1. Assessment of First- and Second-Order Wave-Excitation Load Models for Cylindrical Substructures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pereyra, Brandon; Wendt, Fabian; Robertson, Amy

    2016-07-01

    The hydrodynamic loads on an offshore wind turbine's support structure present unique engineering challenges for offshore wind. Two typical approaches used for modeling these hydrodynamic loads are potential flow (PF) and strip theory (ST), the latter via Morison's equation. This study examines the first- and second-order wave-excitation surge forces on a fixed cylinder in regular waves computed by the PF and ST approaches to (1) verify their numerical implementations in HydroDyn and (2) understand when the ST approach breaks down. The numerical implementation of PF and ST in HydroDyn, a hydrodynamic time-domain solver implemented as a module in the FASTmore » wind turbine engineering tool, was verified by showing the consistency in the first- and second-order force output between the two methods across a range of wave frequencies. ST is known to be invalid at high frequencies, and this study investigates where the ST solution diverges from the PF solution. Regular waves across a range of frequencies were run in HydroDyn for a monopile substructure. As expected, the solutions for the first-order (linear) wave-excitation loads resulting from these regular waves are similar for PF and ST when the diameter of the cylinder is small compared to the length of the waves (generally when the diameter-to-wavelength ratio is less than 0.2). The same finding applies to the solutions for second-order wave-excitation loads, but for much smaller diameter-to-wavelength ratios (based on wavelengths of first-order waves).« less

  2. Validity and Reliability of Baseline Testing in a Standardized Environment.

    PubMed

    Higgins, Kathryn L; Caze, Todd; Maerlender, Arthur

    2017-08-11

    The Immediate Postconcussion Assessment and Cognitive Testing (ImPACT) is a computerized neuropsychological test battery commonly used to determine cognitive recovery from concussion based on comparing post-injury scores to baseline scores. This model is based on the premise that ImPACT baseline test scores are a valid and reliable measure of optimal cognitive function at baseline. Growing evidence suggests that this premise may not be accurate and a large contributor to invalid and unreliable baseline test scores may be the protocol and environment in which baseline tests are administered. This study examined the effects of a standardized environment and administration protocol on the reliability and performance validity of athletes' baseline test scores on ImPACT by comparing scores obtained in two different group-testing settings. Three hundred sixty-one Division 1 cohort-matched collegiate athletes' baseline data were assessed using a variety of indicators of potential performance invalidity; internal reliability was also examined. Thirty-one to thirty-nine percent of the baseline cases had at least one indicator of low performance validity, but there were no significant differences in validity indicators based on environment in which the testing was conducted. Internal consistency reliability scores were in the acceptable to good range, with no significant differences between administration conditions. These results suggest that athletes may be reliably performing at levels lower than their best effort would produce. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  3. Evaluating a Novel Eye Tracking Tool to Detect Invalid Responding in Neurocognitive Assessment

    DTIC Science & Technology

    2014-05-07

    Learning Test-II (CVLT-II; 63), Rey Auditory Verbal Learning Test (RAVLT; 231), Warrington's Recognition Memory Test (RMT; 274), and Seashore Rhythm...history of brain injury (BR) and unbiased responders without a history of brain injury (UR). Demographics (e.g., age, sex, race/ethnicity, years of...project (i.e., "true" invalid responding) is rarely observed with certainty or experimentally induced. However, behavior that approximates true invalid

  4. Bad data packet capture device

    DOEpatents

    Chen, Dong; Gara, Alan; Heidelberger, Philip; Vranas, Pavlos

    2010-04-20

    An apparatus and method for capturing data packets for analysis on a network computing system includes a sending node and a receiving node connected by a bi-directional communication link. The sending node sends a data transmission to the receiving node on the bi-directional communication link, and the receiving node receives the data transmission and verifies the data transmission to determine valid data and invalid data and verify retransmissions of invalid data as corresponding valid data. A memory device communicates with the receiving node for storing the invalid data and the corresponding valid data. A computing node communicates with the memory device and receives and performs an analysis of the invalid data and the corresponding valid data received from the memory device.
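
The capture scheme the patent describes (verify each packet, and store a failed packet alongside its successful retransmission for later analysis) can be sketched in a few lines. All names and the byte-sum checksum here are illustrative, not from the patent:

```python
# Sketch of a bad-data capture path: corrupted packets are held and paired
# with the valid retransmission, mimicking the memory device in the abstract.

def checksum(payload: bytes) -> int:
    return sum(payload) & 0xFF  # toy checksum for illustration only

def receive(stream, capture_log):
    """stream yields (payload, claimed_checksum) tuples; on a reliable link
    a corrupted packet is followed by its retransmission."""
    accepted = []
    pending_bad = None
    for payload, claimed in stream:
        if checksum(payload) == claimed:
            if pending_bad is not None:
                capture_log.append((pending_bad, payload))  # bad + good pair
                pending_bad = None
            accepted.append(payload)
        else:
            pending_bad = payload  # hold for pairing with the retransmission
    return accepted

log = []
good = b"hello"
bad = b"hellp"  # one corrupted byte
ok = receive([(bad, checksum(good)), (good, checksum(good))], log)
print(ok, log)  # only the valid copy is delivered; the pair goes to the log
```

Keeping the bad/good pair together is what lets a downstream analysis node diff the two copies and locate the corrupted bits.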

  5. Embedded performance validity testing in neuropsychological assessment: Potential clinical tools.

    PubMed

    Rickards, Tyler A; Cranston, Christopher C; Touradji, Pegah; Bechtold, Kathleen T

    2018-01-01

    The article aims to suggest clinically useful tools in neuropsychological assessment for efficient use of embedded measures of performance validity. To accomplish this, we integrated available validity-related and statistical research from the literature, consensus statements, and survey-based data from practicing neuropsychologists. We provide recommendations for 1) cutoffs for embedded performance validity tests including Reliable Digit Span, California Verbal Learning Test (Second Edition) Forced Choice Recognition, Rey-Osterrieth Complex Figure Test Combination Score, Wisconsin Card Sorting Test Failure to Maintain Set, and the Finger Tapping Test; 2) selecting the number of performance validity measures to administer in an assessment; and 3) hypothetical clinical decision-making models for use of performance validity testing in a neuropsychological assessment, collectively considering behavior, patient reporting, and data indicating invalid or noncredible performance. Performance validity testing helps inform the clinician about an individual's general approach to tasks: response to failure, task engagement and persistence, and compliance with task demands. These data-driven clinical suggestions provide a resource for clinicians, instigate conversation within the field toward more uniform, testable decisions, and guide future research in this area.

  6. Neural correlates of the spatial and expectancy components of endogenous and stimulus-driven orienting of attention in the Posner task.

    PubMed

    Doricchi, Fabrizio; Macci, Enrica; Silvetti, Massimo; Macaluso, Emiliano

    2010-07-01

    Voluntary orienting of visual attention is conventionally measured in tasks with predictive central cues followed by frequent valid targets at the cued location and by infrequent invalid targets at the uncued location. This implies that invalid targets entail both spatial reorienting of attention and breaching of the expected spatial congruency between cues and targets. Here, we used event-related functional magnetic resonance imaging (fMRI) to separate the neural correlates of the spatial and expectancy components of both endogenous orienting and stimulus-driven reorienting of attention. We found that during endogenous orienting with predictive cues, there was a significant deactivation of the right Temporal-Parietal Junction (TPJ). We also discovered that the lack of an equivalent deactivation with nonpredictive cues was matched by a drop in attentional costs and preservation of attentional benefits. The right TPJ showed equivalent responses to invalid targets following predictive and nonpredictive cues. On the contrary, infrequent-unexpected invalid targets following predictive cues specifically activated the right Middle and Inferior Frontal Gyrus (MFG-IFG). Additional comparisons with spatially neutral trials demonstrated that, independently of cue predictiveness, valid targets activate the left TPJ, whereas invalid targets activate both the left and right TPJs. These findings show that the selective right TPJ activation that is found in the comparison between invalid and valid trials results from the reciprocal cancelling of the different activations that in the left TPJ are related to the processing of valid and invalid targets. We propose that left and right TPJs provide "matching and mismatching to attentional template" signals. These signals enable reorienting of attention and play a crucial role in the updating of the statistical contingency between cues and targets.

  7. Correlates of invalid neuropsychological test performance after traumatic brain injury.

    PubMed

    Donders, Jacobus; Boonstra, Tyler

    2007-03-01

    To investigate external correlates of invalid test performance after traumatic brain injury, as assessed by the California Verbal Learning Test - Second Edition (CVLT-II) and Word Memory Test (WMT). Consecutive 2-year series of rehabilitation referrals with a diagnosis of traumatic brain injury (n = 87). Logistic regression analysis was used to determine which demographic and neurological variables best differentiated those with vs. without actuarial CVLT-II or WMT evidence for invalid responding. Twenty-one participants (about 24%) performed in the invalid range. The combination of a premorbid psychiatric history with minimal or no coma was associated with an approximately four-fold increase in the likelihood of invalid performance. Premorbid psychosocial complicating factors constitute a significant threat to validity of neuropsychological test results after (especially mild) traumatic brain injury. At the same time, care should be taken to not routinely assume that all persons with mild traumatic brain injury and premorbid psychiatric histories are simply malingering. The WMT appears to be a promising instrument for the purpose of identifying those cases where neuropsychological test results are confounded by factors not directly related to acquired cerebral impairment.
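
The "four-fold increase in the likelihood" reported above is the kind of effect size a logistic regression expresses as an odds ratio. A minimal sketch of the underlying 2x2 computation, with hypothetical counts (not the study's data):

```python
# Odds ratio from a 2x2 table: (risk factor present?) x (invalid performance?).

def odds_ratio(a, b, c, d):
    """a, b = invalid/valid counts with the factor present;
    c, d = invalid/valid counts with the factor absent."""
    return (a / b) / (c / d)

# Hypothetical: psychiatric history + minimal coma vs. all other referrals
with_factor = (12, 18)     # invalid, valid
without_factor = (9, 48)   # invalid, valid
print(round(odds_ratio(*with_factor, *without_factor), 2))  # → 3.56
```

In the paper the estimate comes from a multivariable logistic model rather than a raw table, so the adjusted odds ratio controls for the other demographic and neurological variables.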

  8. Analytic study of small scale structure on cosmic strings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polchinski, Joseph; Rocha, Jorge V.; Department of Physics, University of California, Santa Barbara, California 93106

    2006-10-15

    The properties of string networks at scales well below the horizon are poorly understood, but they enter critically into many observables. We argue that in some regimes, stretching will be the only relevant process governing the evolution. In this case, the string two-point function is determined up to normalization: the fractal dimension approaches one at short distance, but the rate of approach is characterized by an exponent that plays an essential role in network properties. The smoothness at short distance implies, for example, that cosmic string lensing images are almost undistorted. We then add in loop production as a perturbation and find that it diverges at small scales. This need not invalidate the stretching model, since the loop production occurs in localized regions, but it implies a complicated fragmentation process. Our ability to model this process is limited, but we argue that loop production peaks a few orders of magnitude below the horizon scale, without the inclusion of gravitational radiation. We find agreement with some features of simulations, and interesting discrepancies that must be resolved by future work.

  9. Calibration Transfer Between a Bench Scanning and a Submersible Diode Array Spectrophotometer for In Situ Wastewater Quality Monitoring in Sewer Systems.

    PubMed

    Brito, Rita S; Pinheiro, Helena M; Ferreira, Filipa; Matos, José S; Pinheiro, Alexandre; Lourenço, Nídia D

    2016-03-01

    Online monitoring programs based on spectroscopy have a high application potential for the detection of hazardous wastewater discharges in sewer systems. Wastewater hydraulics poses a challenge for in situ spectroscopy, especially when the system includes storm water connections leading to rapid changes in water depth, velocity, and in the water quality matrix. Thus, there is a need to optimize and fix the location of in situ instruments, limiting their availability for calibration. In this context, the development of calibration models on bench spectrophotometers to estimate wastewater quality parameters from spectra acquired with in situ instruments could be very useful. However, spectra contain information not only from the samples but also from the spectrophotometer, generally invalidating this approach. The use of calibration transfer methods is a promising solution to this problem. In this study, calibration models were developed using interval partial least squares (iPLS), for the estimation of total suspended solids (TSS) and chemical oxygen demand (COD) in sewage from ultraviolet-visible spectra acquired in a bench scanning spectrophotometer. The feasibility of calibration transfer to a submersible diode-array instrument, to be subsequently operated in situ, was assessed using three procedures: slope and bias correction (SBC); single wavelength standardization (SWS) on mean spectra; and local centering (LC). The results showed that SBC was the most adequate for the available data, adding insignificant error to the base model estimates. Single wavelength standardization was a close second best, potentially more robust, and independent of the base iPLS model. Local centering was shown to be inadequate for the samples and instruments used. © The Author(s) 2016.
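
Slope and bias correction, the method this study found most adequate, is the simplest of the transfer procedures: predictions obtained with the slave (submersible) instrument are regressed against the master (bench) model's predictions on a few shared samples, and the fitted line corrects subsequent slave predictions. A minimal sketch with made-up numbers, not the study's data:

```python
# Slope-and-bias correction (SBC) for calibration transfer, via ordinary
# least squares on a handful of transfer samples measured on both instruments.

def fit_sbc(master_pred, slave_pred):
    n = len(master_pred)
    mx = sum(slave_pred) / n
    my = sum(master_pred) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(slave_pred, master_pred))
    sxx = sum((x - mx) ** 2 for x in slave_pred)
    slope = sxy / sxx
    bias = my - slope * mx
    return slope, bias

def apply_sbc(value, slope, bias):
    return slope * value + bias

# Transfer samples: the slave reads ~5% high with a +2 offset (invented drift)
master = [100.0, 150.0, 200.0, 250.0]
slave = [1.05 * v + 2.0 for v in master]
slope, bias = fit_sbc(master, slave)
print(apply_sbc(1.05 * 180.0 + 2.0, slope, bias))  # recovers ≈ 180.0
```

Because SBC only adjusts the final prediction, it leaves the base iPLS model untouched, which is why it adds so little error when the instrument difference really is linear.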

  10. [Consolidating the medical model of disability: on poliomyelitis and constitution of orthopedic surgery and orthopaedics as a speciality in Spain (1930-1950)].

    PubMed

    Martínez-Pérez, José

    2009-01-01

    At the beginning of the 1930s, various factors made it necessary to transform one of the institutions which was renowned for its work regarding the social reinsertion of the disabled, that is, the Instituto de Reeducación Profesional de Inválidos del Trabajo (Institute for Occupational Retraining of Invalids of Work). The economic crisis of 1929 and the legislative reform aimed at regulating occupational accidents highlighted the failings of this institution to fulfill its objectives. After a time of uncertainty, the centre was renamed the Instituto Nacional de Reeducación de Inválidos (National Institute for Retraining of Invalids). This was done to take advantage of its work in championing the recovery of all people with disabilities.This work aims to study the role played in this process by the poliomyelitis epidemics in Spain at this time. It aims to highlight how this disease justified the need to continue the work of a group of professionals and how it helped to reorient the previous programme to re-educate the "invalids." Thus we shall see the way in which, from 1930 to 1950, a specific medical technology helped to consolidate an "individual model" of disability and how a certain cultural stereotype of those affected developed as a result. Lastly, this work discusses the way in which all this took place in the midst of a process of professional development of orthopaedic surgeons.

  11. Theoretical Issues of Validity in the Measurement of Aided Speech Reception Threshold in Noise for Comparing Nonlinear Hearing Aid Systems.

    PubMed

    Naylor, Graham

    2016-07-01

    Adaptive Speech Reception Threshold in noise (SRTn) measurements are often used to make comparisons between alternative hearing aid (HA) systems. Such measurements usually do not constrain the signal-to-noise ratio (SNR) at which testing takes place. Meanwhile, HA systems increasingly include nonlinear features that operate differently in different SNRs, and listeners differ in their inherent SNR requirements. To show that SRTn measurements, as commonly used in comparisons of alternative HA systems, suffer from threats to their validity, to illustrate these threats with examples of potentially invalid conclusions in the research literature, and to propose ways to tackle these threats. An examination of the nature of SRTn measurements in the context of test theory, modern nonlinear HAs, and listener diversity. Examples from the audiological research literature were used to estimate typical interparticipant variation in SRTn and to illustrate cases where validity may have been compromised. There can be no doubt that SRTn measurements, when used to compare nonlinear HA systems, in principle, suffer from threats to their internal and external/ecological validity. Interactions between HA nonlinearities and SNR, and interparticipant differences in inherent SNR requirements, can act to generate misleading results. In addition, SRTn may lie at an SNR outside the range for which the HA system is designed or expected to operate in. Although the extent of invalid conclusions in the literature is difficult to evaluate, examples of studies were nevertheless identified where the risk of each form of invalidity is significant. Reliable data on ecological SNRs is becoming available, so that ecological validity can be assessed. 
Methodological developments that can reduce the risk of invalid conclusions include variations on the SRTn measurement procedure itself, manipulations of stimulus or scoring conditions to place SRTn in an ecologically relevant range, and design and analysis approaches that take account of interparticipant differences. American Academy of Audiology.

  12. The Response of the Left Ventral Attentional System to Invalid Targets and its Implication for the Spatial Neglect Syndrome: a Multivariate fMRI Investigation.

    PubMed

    Silvetti, Massimo; Lasaponara, Stefano; Lecce, Francesca; Dragone, Alessio; Macaluso, Emiliano; Doricchi, Fabrizio

    2016-12-01

    In humans, invalid visual targets that mismatch spatial expectations induced by attentional cues are considered to selectively engage a right hemispheric "reorienting" network that includes the temporal parietal junction (TPJ), the inferior frontal gyrus (IFG), and the medial frontal gyrus (MFG). However, recent findings suggest that this hemispheric dominance is not absolute and that it is rather observed because the TPJ and IFG areas in the left hemisphere are engaged both by invalid and valid cued targets. Because of this, the BOLD response of the left hemisphere to invalid targets is usually cancelled out by the standard "invalid versus valid" contrast used in functional magnetic resonance imaging investigations of spatial attention. Here, we used multivariate pattern recognition analysis (MVPA) to gain finer insight into the role played by the left TPJ and IFG in reorienting to invalid targets. We found that in left TPJ and IFG blood oxygen level-dependent (BOLD) responses to invalid and valid targets were associated to different patterns of neural activity, possibly reflecting the presence of functionally distinct neuronal populations. Pattern segregation was significant at group level, it was present in almost all of the participants to the study and was observed both for targets in the left and right side of space. A control whole-brain MVPA ("Searchlight" analysis) confirmed the results obtained in predefined regions of interest and highlighted that also other areas, that is, superior parietal and frontal-polar cortex, show different patterns of BOLD response to valid and invalid targets. These results confirm and expand previous evidence highlighting the involvement of the left hemisphere in reorienting of visual attention (Doricchi et al. 2010; Dragone et al. 2015). 
These findings suggest that asymmetrical reorienting deficits suffered by right brain damaged patients with left spatial neglect, who have severe impairments in contralesional reorienting and less severe impairments in ipsilesional reorienting, are due to preserved reorienting abilities in the intact left hemisphere. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  13. Can Invalid Bioactives Undermine Natural Product-Based Drug Discovery?

    PubMed Central

    2015-01-01

    High-throughput biology has contributed a wealth of data on chemicals, including natural products (NPs). Recently, attention was drawn to certain, predominantly synthetic, compounds that are responsible for disproportionate percentages of hits but are false actives. Spurious bioassay interference led to their designation as pan-assay interference compounds (PAINS). NPs lack comparable scrutiny, which this study aims to rectify. Systematic mining of 80+ years of the phytochemistry and biology literature, using the NAPRALERT database, revealed that only 39 compounds represent the NPs most reported by occurrence, activity, and distinct activity. Over 50% are not explained by phenomena known for synthetic libraries, and all had manifold ascribed bioactivities, designating them as invalid metabolic panaceas (IMPs). Cumulative distributions of ∼200,000 NPs uncovered that NP research follows power-law characteristics typical for behavioral phenomena. Projection into occurrence–bioactivity–effort space produces the hyperbolic black hole of NPs, where IMPs populate the high-effort base. PMID:26505758

  14. An elastic-plastic contact model for line contact structures

    NASA Astrophysics Data System (ADS)

    Zhu, Haibin; Zhao, Yingtao; He, Zhifeng; Zhang, Ruinan; Ma, Shaopeng

    2018-06-01

    Although numerical simulation tools are now very powerful, the development of analytical models remains important for predicting the mechanical behaviour of line contact structures, both for a deep understanding of contact problems and for engineering applications. For the line contact structures widely used in the engineering field, few analytical models are available for predicting the mechanical behaviour when the structures deform plastically, as classical Hertz theory becomes invalid. Thus, the present study proposed an elastic-plastic model for line contact structures based on an understanding of the yield mechanism. A mathematical expression describing the global relationship between load history and contact width evolution of line contact structures was obtained. The proposed model was verified through an actual line contact test and a corresponding numerical simulation. The results confirmed that this model can be used to accurately predict the elastic-plastic mechanical behaviour of a line contact structure.
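
The elastic baseline that such a model extends is the classical Hertz result for line contact, where the contact half-width grows with the square root of the load per unit length. A sketch of that elastic formula (the paper's elastic-plastic extension itself is not reproduced here; the steel properties are illustrative):

```python
import math

# Hertzian (elastic) line contact: half-width b = sqrt(4*w*R / (pi*E_star)),
# with w the load per unit length, R the cylinder radius, and E_star the
# effective contact modulus. Valid only below yield, which is exactly
# where the elastic-plastic model above takes over.

def effective_modulus(E1, nu1, E2, nu2):
    return 1.0 / ((1 - nu1 ** 2) / E1 + (1 - nu2 ** 2) / E2)

def hertz_half_width(w, R, E_star):
    return math.sqrt(4.0 * w * R / (math.pi * E_star))

# Steel cylinder (R = 10 mm) on a steel flat under 100 kN/m line load
E_star = effective_modulus(210e9, 0.3, 210e9, 0.3)
b = hertz_half_width(100e3, 0.010, E_star)
print(f"contact half-width = {b * 1e6:.1f} um")  # on the order of 100 um
```

Once the subsurface stress reaches yield, the measured contact width departs from this square-root law, which is the regime the proposed load-history expression is meant to capture.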

  15. Singlet model interference effects with high scale UV physics

    DOE PAGES

    Dawson, S.; Lewis, I. M.

    2017-01-06

    One of the simplest extensions of the Standard Model (SM) is the addition of a scalar gauge singlet, S. If S is not forbidden by a symmetry from mixing with the Standard Model Higgs boson, the mixing will generate non-SM rates for Higgs production and decays. Generally, there could also be unknown high energy physics that generates additional effective low energy interactions. We show that interference effects between the scalar resonance of the singlet model and the effective field theory (EFT) operators can have significant effects in the Higgs sector. Here, we examine a non-Z2-symmetric scalar singlet model and demonstrate that a fit to the 125 GeV Higgs boson couplings and to limits on high mass resonances, S, exhibits an interesting structure and possible large cancellations of effects between the resonance contribution and the new EFT interactions, that invalidate conclusions based on the renormalizable singlet model alone.

  16. A nonlinear fracture mechanics approach to the growth of small cracks

    NASA Technical Reports Server (NTRS)

    Newman, J. C., Jr.

    1983-01-01

    An analytical model of crack closure is used to study the crack growth and closure behavior of small cracks in plates and at notches. The calculated crack opening stresses for small and large cracks, together with elastic and elastic plastic fracture mechanics analyses, are used to correlate crack growth rate data. At equivalent elastic stress intensity factor levels, calculations predict that small cracks in plates and at notches should grow faster than large cracks because the applied stress needed to open a small crack is less than that needed to open a large crack. These predictions agree with observed trends in test data. The calculations from the model also imply that many of the stress intensity factor thresholds that are developed in tests with large cracks and with load reduction schemes do not apply to the growth of small cracks. The current calculations are based upon continuum mechanics principles and, thus, some crack size and grain structure exist where the underlying fracture mechanics assumptions become invalid because of material inhomogeneity (grains, inclusions, etc.). Admittedly, much more effort is needed to develop the mechanics of a noncontinuum. Nevertheless, these results indicate the importance of crack closure in predicting the growth of small cracks from large crack data.
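
The closure argument above reduces to a simple comparison: only the portion of the load cycle above the crack-opening stress drives growth, so a small crack that opens at a lower stress sees a larger effective stress-intensity range than a large crack at the same applied load. A sketch with illustrative values (geometry factor taken as 1; not the paper's calculations):

```python
import math

# Effective stress-intensity range under crack closure:
# dK_eff = (S_max - S_open) * sqrt(pi * a) * F, with F a geometry factor.

def delta_K_eff(S_max, S_open, a, F=1.0):
    return (S_max - S_open) * math.sqrt(math.pi * a) * F

S_max = 100.0e6            # applied peak stress, Pa (illustrative)
a = 1.0e-4                 # 0.1 mm crack length
small_crack = delta_K_eff(S_max, S_open=20.0e6, a=a)  # opens at lower stress
large_crack = delta_K_eff(S_max, S_open=50.0e6, a=a)
print(small_crack > large_crack)  # the small crack sees the larger driving force
```

This is why thresholds measured on large cracks with load-reduction schemes, where the opening stress is high, understate the growth of small cracks.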

  17. Incidents Prediction in Road Junctions Using Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Hajji, Tarik; Alami Hassani, Aicha; Ouazzani Jamil, Mohammed

    2018-05-01

    The implementation of an incident detection system (IDS) is an indispensable operation in the analysis of road traffic. The IDS cannot, however, replace classical monitoring by the human eye. The aim of this work is to increase the probability of detecting and predicting incidents in camera-monitored areas, given that these areas are monitored by multiple cameras but few supervisors. Our solution is to use Artificial Neural Networks (ANN) to analyze the trajectories of moving objects on captured images. We first propose a modelling of the trajectories and their characteristics, then build a learning database of valid and invalid trajectories, and finally carry out a comparative study to find the artificial neural network architecture that maximizes the recognition rate of valid and invalid trajectories.

  18. In modelling effects of global warming, invalid assumptions lead to unrealistic projections.

    PubMed

    Lefevre, Sjannie; McKenzie, David J; Nilsson, Göran E

    2018-02-01

    In their recent Opinion, Pauly and Cheung () provide new projections of future maximum fish weight (W∞). Based on criticism by Lefevre et al. (2017), they changed the scaling exponent for anabolism, dG. Here we find that changing both dG and the scaling exponent for catabolism, b, leads to the projection that fish may become as much as 98% smaller with a 1°C increase in temperature. This unrealistic outcome indicates that the current W∞ is unlikely to be explained by the Gill-Oxygen Limitation Theory (GOLT) and, therefore, GOLT cannot be used as a mechanistic basis for model projections about fish size in a warmer world. © 2017 John Wiley & Sons Ltd.
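    The sensitivity driving this result can be seen directly in the asymptotic weight of the underlying growth model, dW/dt = H·W^d − k·W^b, which gives W∞ = (H/k)^(1/(b−d)). A small change in the exponent gap b − d changes W∞ by orders of magnitude (the parameter values below are hypothetical, chosen only to expose the sensitivity):

```python
def w_infinity(H, k, d, b):
    """Asymptotic weight from dW/dt = H*W**d - k*W**b:
    W_inf = (H/k)**(1/(b - d)).  All parameter values used below
    are hypothetical, for illustration only."""
    return (H / k) ** (1.0 / (b - d))

# Classical von Bertalanffy exponents, d = 2/3 and b = 1:
w1 = w_infinity(H=3.0, k=0.3, d=2/3, b=1.0)   # (H/k)**3
# Narrowing the exponent gap (b - d) amplifies the same H/k ratio:
w2 = w_infinity(H=3.0, k=0.3, d=0.9, b=1.0)   # (H/k)**10
```

    With H/k = 10, the first case gives W∞ = 1000 while the second gives 10^10, showing why projections built on these exponents are so fragile.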

  19. Penis invalidating cicatricial outcomes in an enlargement phalloplasty case with polyacrylamide gel (Formacryl).

    PubMed

    Parodi, P C; Dominici, M; Moro, U

    2006-01-01

    The present article reports the case of a patient who underwent cutaneous infiltration of a polyacrylamide-polymer gel in the penis for cosmetic purposes, with severely invalidating outcomes. A significant tissue reaction to the subcutaneous injection of polyacrylamide gel for penis enlargement resulted in permanent and invalidating scars on both the esthetic and functional levels. Such a result must be taken into account, both on its own and in the light of the international literature, to exclude this method from standard uro-andrologic practice.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, S.; Lewis, I. M.

    One of the simplest extensions of the Standard Model (SM) is the addition of a scalar gauge singlet, S. If S is not forbidden by a symmetry from mixing with the Standard Model Higgs boson, the mixing will generate non-SM rates for Higgs production and decays. Generally, there could also be unknown high-energy physics that generates additional effective low-energy interactions. We show that interference effects between the scalar resonance of the singlet model and the effective field theory (EFT) operators can have significant effects in the Higgs sector. Here, we examine a non-Z2-symmetric scalar singlet model and demonstrate that a fit to the 125 GeV Higgs boson couplings and to limits on high-mass resonances, S, exhibits an interesting structure and possibly large cancellations between the resonance contribution and the new EFT interactions, which invalidate conclusions based on the renormalizable singlet model alone.

  1. 16 CFR 305.24 - Stayed or invalid parts.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND WATER USE LABELING FOR CONSUMER PRODUCTS UNDER THE ENERGY POLICY AND CONSERVATION ACT ("ENERGY LABELING RULE") Effect of This Part § 305.24 Stayed or invalid parts. If any section or portion of a...

  2. Validation of measures of biosocial precursors to borderline personality disorder: childhood emotional vulnerability and environmental invalidation.

    PubMed

    Sauer, Shannon E; Baer, Ruth A

    2010-12-01

    Linehan's biosocial theory suggests that borderline personality disorder (BPD) results from a transaction of two childhood precursors: emotional vulnerability and an invalidating environment. Until recently, few empirical studies have explored relationships between these theoretical precursors and symptoms of the disorder. Psychometrically sound assessment tools are essential to this area of research. The present study examined psychometric characteristics of recently developed self-report measures of childhood emotional vulnerability and parental invalidation. A large sample of undergraduates completed these measures; parent reports were collected to examine agreement between young adults' and parents' recollections of their emotional style in childhood and the parenting they received. Both measures were internally consistent, showed clear factor structures, and were significantly correlated with BPD features and related constructs. In addition, both showed modest, yet significant agreement between participants' and parents' reports. Overall, this study supports the utility of these measures of childhood emotional vulnerability and environmental invalidation.

  3. Evaluation of the statutory classification of three-wheeled, motorized invalid vehicles.

    DOT National Transportation Integrated Search

    1978-01-01

    In response to an objection by interested individuals to the fact that Virginia law classifies three-wheeled, motorized invalid vehicles as motorcycles and subjects them to all registration, safety inspection, and operator requirements applicable to ...

  4. Capture-recapture survival models taking account of transients

    USGS Publications Warehouse

    Pradel, R.; Hines, J.E.; Lebreton, J.D.; Nichols, J.D.

    1997-01-01

    The presence of transient animals, common enough in natural populations, invalidates the estimation of survival by traditional capture-recapture (CR) models designed for the study of residents only. Also, the study of transit is interesting in itself. We thus develop here a class of CR models to describe the presence of transients. In order to assess the merits of this approach, we examine the bias of the traditional survival estimators in the presence of transients in relation to the power of different tests for detecting transients. We also compare the relative efficiency of an ad hoc approach to dealing with transients that leaves out the first observation of each animal. We then study a real example using lazuli bunting (Passerina amoena) and, in conclusion, discuss the design of an experiment aiming at the estimation of transience. In practice, the presence of transients is easily detected whenever the risk of bias is high. The ad hoc approach, which yields unbiased estimates for residents only, is satisfactory in a time-dependent context but poorly efficient when parameters are constant. The example shows that intermediate situations between strict 'residence' and strict 'transience' may exist in certain studies. Yet, most of the time, if the study design takes into account the expected length of stay of a transient, it should be possible to efficiently separate the two categories of animals.
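    The bias, and the correction provided by the ad hoc approach, can be illustrated with a toy simulation (perfect detection and hypothetical parameter values; this is a sketch of the bias mechanism, not the CR models of the paper):

```python
import random

random.seed(1)

def simulate(n_animals, p_transient, phi, n_occasions=5):
    """Toy capture histories: transients are present for a single
    occasion; residents survive each interval with probability phi.
    Detection is perfect here, to isolate the transient bias."""
    histories = []
    for _ in range(n_animals):
        first = random.randrange(n_occasions - 1)   # first capture
        h = [0] * n_occasions
        h[first] = 1
        if random.random() > p_transient:           # resident
            t = first
            while t + 1 < n_occasions and random.random() < phi:
                t += 1
                h[t] = 1
        histories.append(h)
    return histories

def apparent_survival(histories, skip_first=False):
    """Fraction of detections followed by a detection next occasion.
    skip_first drops each animal's first capture (ad hoc approach)."""
    seen = survived = 0
    for h in histories:
        occ = [i for i, x in enumerate(h) if x]
        if skip_first:
            occ = occ[1:]
        for i in occ:
            if i < len(h) - 1:
                seen += 1
                survived += h[i + 1]
    return survived / seen

hist = simulate(20000, p_transient=0.4, phi=0.8)
naive = apparent_survival(hist)                    # biased low
adhoc = apparent_survival(hist, skip_first=True)   # close to phi = 0.8
```

    The naive estimate is dragged well below the true resident survival of 0.8 by the transients' one-off captures, while discarding each animal's first observation removes transients entirely and recovers an approximately unbiased estimate.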

  5. Neural network-based sliding mode control for atmospheric-actuated spacecraft formation using switching strategy

    NASA Astrophysics Data System (ADS)

    Sun, Ran; Wang, Jihe; Zhang, Dexin; Shao, Xiaowei

    2018-02-01

    This paper presents an adaptive neural network-based control method for spacecraft formation with coupled translational and rotational dynamics using only aerodynamic forces. It is assumed that each spacecraft is equipped with several large flat plates. A coupled orbit-attitude dynamic model is considered based on the specific configuration of the atmospheric-based actuators. For this model, a neural network-based adaptive sliding mode controller is implemented, accounting for system uncertainties and external perturbations. To prevent failure of the neural networks from destroying the stability of the system, a switching control strategy is proposed that combines an adaptive neural network controller, dominant in its active region, with an adaptive sliding mode controller outside the neural active region. An optimal process is developed to determine the control commands for the plate system. The stability of the closed-loop system is proved by a Lyapunov-based method. Comparative results from numerical simulations illustrate the effectiveness of executing attitude control while maintaining the relative motion, and show that higher control accuracy can be achieved with the proposed neural-based switching control scheme than with the adaptive sliding mode controller alone.

  6. A streamline splitting pore-network approach for computationally inexpensive and accurate simulation of transport in porous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehmani, Yashar; Oostrom, Martinus; Balhoff, Matthew

    2014-03-20

    Several approaches have been developed in the literature for solving flow and transport at the pore-scale. Some authors use a direct modeling approach where the fundamental flow and transport equations are solved on the actual pore-space geometry. Such direct modeling, while very accurate, comes at a great computational cost. Network models are computationally more efficient because the pore-space morphology is approximated. Typically, a mixed cell method (MCM) is employed for solving the flow and transport system which assumes pore-level perfect mixing. This assumption is invalid at moderate to high Peclet regimes. In this work, a novel Eulerian perspective on modeling flow and transport at the pore-scale is developed. The new streamline splitting method (SSM) allows for circumventing the pore-level perfect mixing assumption, while maintaining the computational efficiency of pore-network models. SSM was verified with direct simulations and excellent matches were obtained against micromodel experiments across a wide range of pore-structure and fluid-flow parameters. The increase in the computational cost from MCM to SSM is shown to be minimal, while the accuracy of SSM is much higher than that of MCM and comparable to direct modeling approaches. Therefore, SSM can be regarded as an appropriate balance between incorporating detailed physics and controlling computational cost. The truly predictive capability of the model allows for the study of pore-level interactions of fluid flow and transport in different porous materials. In this paper, we apply SSM and MCM to study the effects of pore-level mixing on transverse dispersion in 3D disordered granular media.
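    The perfect-mixing assumption that SSM circumvents has a simple local form: in MCM, every throat leaving a pore carries the flow-weighted average of the incoming concentrations. A minimal sketch of that rule at a single pore (toy numbers; a real network model solves this coupled over all pores):

```python
def mixed_cell_concentration(in_flows, in_concs):
    """Perfect-mixing (MCM) rule at one pore: each outgoing throat
    carries the flow-weighted average of the incoming concentrations.
    It is exactly this averaging that streamline splitting avoids."""
    total = sum(in_flows)
    return sum(q * c for q, c in zip(in_flows, in_concs)) / total

# Two inlets: flow 2.0 at concentration 1.0, flow 1.0 at 0.0.
c_out = mixed_cell_concentration([2.0, 1.0], [1.0, 0.0])
```

    Here every outlet sees the same concentration 2/3, regardless of which inlet its streamlines actually came from; at high Peclet number this artificially enhances mixing, which is the error the abstract attributes to MCM.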

  7. Estimation of sexual behavior in the 18-to-24-years-old Iranian youth based on a crosswise model study.

    PubMed

    Vakilian, Katayon; Mousavi, Seyed Abbas; Keramat, Afsaneh

    2014-01-13

    In many countries, negative social attitudes towards sensitive issues such as sexual behavior have resulted in false and invalid data concerning this issue. This is an analytical cross-sectional study, in which a total of 1500 single students from universities of Shahroud City were sampled using a multistage technique. The students were assured that the information they disclosed to the researcher would be treated as private and confidential. The results were analyzed using the crosswise model, crosswise regression, t-tests and chi-square tests. The estimated prevalence of sexual behavior among Iranian youth is 41% (CI = 36-53). The findings show that the estimated rate of sexual relationships among single Iranian youth is high. Thus, devising training models in accordance with Islamic-Iranian culture is necessary in order to prevent risky sexual behavior.
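    The crosswise model's appeal is that the prevalence of the sensitive behavior can be recovered from fully anonymous joint answers. A minimal sketch of the standard estimator (the response counts and the nonsensitive-item prevalence below are hypothetical, not the study's data):

```python
def crosswise_estimate(n_same, n_total, p_nonsensitive):
    """Standard crosswise-model prevalence estimator.

    Respondents report only whether their answers to the sensitive
    item and to a nonsensitive item of known prevalence p are the
    same.  With lam = n_same / n_total:
        pi_hat = (lam + p - 1) / (2p - 1).
    """
    lam = n_same / n_total
    return (lam + p_nonsensitive - 1.0) / (2.0 * p_nonsensitive - 1.0)

# e.g. 930 of 1500 respondents answer "same", with a nonsensitive
# item of prevalence 0.2 (hypothetical numbers):
pi_hat = crosswise_estimate(n_same=930, n_total=1500, p_nonsensitive=0.2)
```

    With these inputs the estimator gives pi_hat = 0.30; no individual answer reveals the respondent's sensitive status, which is why the design mitigates the false reporting described in the abstract.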

  8. Cardiorespiratory Fitness Is Associated with Selective Attention in Healthy Male High-School Students.

    PubMed

    Wengaard, Eivind; Kristoffersen, Morten; Harris, Anette; Gundersen, Hilde

    2017-01-01

    Background: Previous studies have shown associations of physical fitness and cognition in children and in younger and older adults. However, knowledge about associations in high-school adolescents and young adults is sparse. Thus, the aim of this study was to evaluate the association of physical fitness, measured as maximal oxygen uptake ([Formula: see text]), muscle mass, weekly training, and cognitive function in the executive domains of selective attention and inhibitory control, in healthy male high-school students. Methods: Fifty-four males (17.9 ± 0.9 years, 72 ± 11 kg and 182 ± 7 cm) completed a [Formula: see text] test, a body composition test and a visual cognitive task based on the Posner cue paradigm with three types of stimuli with different attentional demands (i.e., stimulus presentation following no cue, a valid cue or an invalid cue). The task consisted of 336 target stimuli, where 56 (17%) of the target stimuli appeared without a cue (no cue), 224 (67%) appeared in the same rectangle as the cue (valid cue) and 56 (17%) appeared in the rectangle opposite to the cue (invalid cue). Mean reaction time (RT) and corresponding errors were calculated for each stimulus type. Total task duration was 9 min and 20 s. In addition, relevant background information was obtained in a questionnaire. Results: Linear mixed model analyses showed that higher [Formula: see text] was associated with faster RT for stimuli following an invalid cue (Estimate = -2.69, SE = 1.03, p = 0.011), and for stimuli following a valid cue (Estimate = -2.08, SE = 1.03, p = 0.048). There was no association of muscle mass and stimuli ( F = 1.01, p = 0.397) or of weekly training and stimuli ( F = 0.99, p = 0.405). Conclusion: The results suggest that cardiorespiratory fitness is associated with cognitive performance in healthy male high-school students in the executive domains of selective attention.

  9. Spatio-temporal precipitation climatology over complex terrain using a censored additive regression model.

    PubMed

    Stauffer, Reto; Mayr, Georg J; Messner, Jakob W; Umlauf, Nikolaus; Zeileis, Achim

    2017-06-15

    Flexible spatio-temporal models are widely used to create reliable and accurate estimates for precipitation climatologies. Most models are based on square-root-transformed monthly or annual means, for which a normal distribution seems to be appropriate. This assumption becomes invalid on a daily time scale, as the observations involve large fractions of zeros and are limited to non-negative values. We develop a novel spatio-temporal model to estimate the full climatological distribution of precipitation on a daily time scale over complex terrain using a left-censored normal distribution. The results demonstrate that the new method is able to account for the non-normal distribution and the large fraction of zero observations. The new climatology provides the full climatological distribution at very high spatial and temporal resolution, and is competitive with, or even outperforms, existing methods, even for arbitrary locations.
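    The key modelling device is the left-censored normal likelihood: dry days contribute the probability mass below the censoring point rather than a density value. A minimal sketch of that likelihood (the censoring idea only, not the paper's full spatio-temporal additive model, which also regresses the distribution parameters on space and season):

```python
import math

def left_censored_norm_loglik(y, mu, sigma, threshold=0.0):
    """Log-likelihood of a left-censored normal distribution.

    Observations at or below `threshold` (zero precipitation) are
    censored and contribute log P(Y* <= threshold); positive amounts
    contribute the log of the normal density.
    """
    ll = 0.0
    for v in y:
        z = (max(v, threshold) - mu) / sigma
        if v <= threshold:   # censored: CDF mass below the threshold
            ll += math.log(0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))
        else:                # observed: normal log-density
            ll += -0.5 * z * z - math.log(sigma * math.sqrt(2.0 * math.pi))
    return ll
```

    For example, a single dry day under mu = 0, sigma = 1 contributes log Φ(0) = log 0.5, while a positive amount contributes the usual Gaussian log-density.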

  10. Design of a Field Test for Probability of Hit by Antiaircraft Guns

    DTIC Science & Technology

    1973-02-01

    not available. ... The cost of conducting the numerous field test trials that would be needed to establish the loss rates of aircraft to antiaircraft ... mathematical models provide a readily available and relatively inexpensive way to obtain estimates of aircraft losses to antiaircraft guns. Because these ... aircraft losses to antiaircraft guns, the use of the models can contribute greatly to better decisions. But if the models produce invalid estimates

  11. Stability of the thermodynamic equilibrium - A test of the validity of dynamic models as applied to gyroviscous perpendicular magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Faghihi, Mustafa; Scheffel, Jan; Spies, Guenther O.

    1988-05-01

    Stability of the thermodynamic equilibrium is put forward as a simple test of the validity of dynamic equations, and is applied to perpendicular gyroviscous magnetohydrodynamics (i.e., perpendicular magnetohydrodynamics with gyroviscosity added). This model turns out to be invalid because it predicts exponentially growing Alfven waves in a spatially homogeneous static equilibrium with scalar pressure.

  12. Bio-medicolegal scientific research in Europe. A country-based analysis.

    PubMed

    Viel, Guido; Boscolo-Berto, Rafael; Cecchi, Rossana; Bajanowski, Thomas; Vieira, Nuno Duarte; Ferrara, Santo Davide

    2011-09-01

    The European mosaic of socio-cultural, economic and legal realities is reflected in forensic and legal medicine, where a great variety of operational modes of forensic medical services, organisational systems, structures, functional competences and scientific research strategies can be observed. The present work analyses the European bio-medicolegal scientific output of the last 5.5 years (exact time window, January 1, 2005-June 1, 2010), categorising papers by nationality of the corresponding author and by the forensic sub-discipline in question, in order to identify national sub-specialised competences and to help build up international research projects. This country-based bibliometric analysis, based on the number of articles and the impact factor produced by each European country, and also considering its economic profile (gross domestic product and per capita gross domestic product), highlights the prevailing productive role of Western and Southern Europe (Germany, Great Britain, Italy, Switzerland, Spain and France). When scientific output is categorised by forensic sub-discipline and branch, the contributions most significant in terms of impact factor come from Germany (first in Pathology, Toxicology, Genetics, Anthropology and Biological Criminalistics), Great Britain (first in Clinical Forensic Medicine, Malpractice and Invalidity-Social Insurance), Switzerland (first in Criminology), Italy (second in Toxicology, Anthropology and Invalidity-Social Insurance), The Netherlands (third in Clinical Forensic Medicine and Medical Law and Ethics), Spain (third in Genetics, Criminalistics and Invalidity-Social Insurance) and France (third in Toxicology and Malpractice).
Interestingly, several countries with low gross domestic product, such as Poland, Turkey and other Eastern European nations, show notable scientific production in specific sub-disciplines such as Pathology, Toxicology and Forensic Genetics, suggesting that fruitful international cooperation could be planned and be of interest to funding sources within the European Community, also taking into account funds reserved for depressed areas undergoing development.

  13. The Probabilistic Admissible Region with Additional Constraints

    NASA Astrophysics Data System (ADS)

    Roscoe, C.; Hussein, I.; Wilkins, M.; Schumacher, P.

    The admissible region, in the space surveillance field, is defined as the set of physically acceptable orbits (e.g., orbits with negative energies) consistent with one or more observations of a space object. Given additional constraints on orbital semimajor axis, eccentricity, etc., the admissible region can be constrained, resulting in the constrained admissible region (CAR). Based on known statistics of the measurement process, one can replace hard constraints with a probabilistic representation of the admissible region. This results in the probabilistic admissible region (PAR), which can be used for orbit initiation in Bayesian tracking and prioritization of tracks in a multiple hypothesis tracking framework. The PAR concept was introduced by the authors at the 2014 AMOS conference. In that paper, a Monte Carlo approach was used to show how to construct the PAR in the range/range-rate space based on known statistics of the measurement, semimajor axis, and eccentricity. An expectation-maximization algorithm was proposed to convert the particle cloud into a Gaussian Mixture Model (GMM) representation of the PAR. This GMM can be used to initialize a Bayesian filter. The PAR was found to be significantly non-uniform, invalidating an assumption frequently made in CAR-based filtering approaches. Using the GMM or particle cloud representations of the PAR, orbits can be prioritized for propagation in a multiple hypothesis tracking (MHT) framework. In this paper, the authors focus on expanding the PAR methodology to allow additional constraints, such as a constraint on perigee altitude, to be modeled in the PAR. This requires re-expressing the joint probability density function for the attributable vector as well as the (constrained) orbital parameters and range and range-rate. The final PAR is derived by accounting for any interdependencies between the parameters. 
Noting that the concepts presented are general and can be applied to any measurement scenario, the idea will be illustrated using a short-arc, angles-only observation scenario.
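    The hard constraint at the core of the (C)AR is the bound-orbit energy condition; the PAR replaces such hard cuts with a probability density over range and range-rate. A minimal sketch of the deterministic test (Cartesian state in km and km/s; the sample states are illustrative):

```python
import math

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def specific_energy(r_vec, v_vec):
    """Specific orbital energy eps = v^2/2 - mu/r."""
    r = math.sqrt(sum(c * c for c in r_vec))
    v2 = sum(c * c for c in v_vec)
    return v2 / 2.0 - MU / r

def admissible(r_vec, v_vec):
    """Hard admissible-region test: keep only bound orbits
    (negative specific energy)."""
    return specific_energy(r_vec, v_vec) < 0.0

v_circ = math.sqrt(MU / 7000.0)   # circular speed at r = 7000 km
bound = admissible((7000.0, 0.0, 0.0), (0.0, v_circ, 0.0))
unbound = admissible((7000.0, 0.0, 0.0), (0.0, 1.5 * v_circ, 0.0))
```

    A circular-speed state is bound (energy −μ/2r), while 1.5 times circular speed exceeds the escape speed √2·v_circ and is rejected. Perigee-altitude or semimajor-axis constraints of the kind discussed in the paper would add further inequalities of the same form.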

  14. The Stroop test as a measure of performance validity in adults clinically referred for neuropsychological assessment.

    PubMed

    Erdodi, Laszlo A; Sagar, Sanya; Seke, Kristian; Zuccato, Brandon G; Schwartz, Eben S; Roth, Robert M

    2018-06-01

    This study was designed to develop performance validity indicators embedded within the Delis-Kaplan Executive Function Systems (D-KEFS) version of the Stroop task. Archival data from a mixed clinical sample of 132 patients (50% male; M Age = 43.4; M Education = 14.1) clinically referred for neuropsychological assessment were analyzed. Criterion measures included the Warrington Recognition Memory Test-Words and 2 composites based on several independent validity indicators. An age-corrected scaled score ≤6 on any of the 4 trials reliably differentiated psychometrically defined credible and noncredible response sets with high specificity (.87-.94) and variable sensitivity (.34-.71). An inverted Stroop effect was less sensitive (.14-.29), but comparably specific (.85-.90), to invalid performance. Aggregating the newly developed D-KEFS Stroop validity indicators further improved classification accuracy. Failing the validity cutoffs was unrelated to self-reported depression or anxiety. However, it was associated with elevated somatic symptom report. In addition to processing speed and executive function, the D-KEFS version of the Stroop task can function as a measure of performance validity. A multivariate approach to performance validity assessment is generally superior to univariate models. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
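    Sensitivity and specificity of a cutoff of this kind come directly from the 2×2 classification table. A minimal sketch of the computation (the scores and group labels below are fabricated to illustrate the arithmetic, not D-KEFS data or norms):

```python
def classification_rates(scores, labels, cutoff):
    """Sensitivity/specificity of a validity cutoff: scaled scores at
    or below `cutoff` are flagged as invalid performance."""
    tp = sum(1 for s, l in zip(scores, labels)
             if s <= cutoff and l == "noncredible")
    fn = sum(1 for s, l in zip(scores, labels)
             if s > cutoff and l == "noncredible")
    tn = sum(1 for s, l in zip(scores, labels)
             if s > cutoff and l == "credible")
    fp = sum(1 for s, l in zip(scores, labels)
             if s <= cutoff and l == "credible")
    sensitivity = tp / (tp + fn)   # flagged among noncredible
    specificity = tn / (tn + fp)   # passed among credible
    return sensitivity, specificity

# Hypothetical age-corrected scaled scores and criterion groups:
scores = [4, 5, 6, 7, 8, 9, 10, 11, 5, 12]
labels = (["noncredible"] * 4 + ["credible"] * 4
          + ["noncredible", "credible"])
sens, spec = classification_rates(scores, labels, cutoff=6)
```

    Raising the cutoff trades specificity for sensitivity, which is why the paper reports both across its range of cutoffs.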

  15. Regularized Embedded Multiple Kernel Dimensionality Reduction for Mine Signal Processing.

    PubMed

    Li, Shuang; Liu, Bing; Zhang, Chen

    2016-01-01

    Traditional multiple kernel dimensionality reduction models are generally based on graph embedding and manifold assumption. But such assumption might be invalid for some high-dimensional or sparse data due to the curse of dimensionality, which has a negative influence on the performance of multiple kernel learning. In addition, some models might be ill-posed if the rank of matrices in their objective functions was not high enough. To address these issues, we extend the traditional graph embedding framework and propose a novel regularized embedded multiple kernel dimensionality reduction method. Different from the conventional convex relaxation technique, the proposed algorithm directly takes advantage of a binary search and an alternative optimization scheme to obtain optimal solutions efficiently. The experimental results demonstrate the effectiveness of the proposed method for supervised, unsupervised, and semisupervised scenarios.

  16. Invalidity of the Fermi liquid theory and magnetic phase transition in quasi-1D dopant-induced armchair-edged graphene nanoribbons

    NASA Astrophysics Data System (ADS)

    Hoi, Bui Dinh; Davoudiniya, Masoumeh; Yarmohammadi, Mohsen

    2018-04-01

    Based on theoretical tight-binding calculations considering nearest neighbors and the Green's function technique, we show that a magnetic phase transition in both semiconducting and metallic armchair graphene nanoribbons with widths ranging from 9.83 Å to 69.3 Å can be observed in the presence of electrons injected by doping. This transition is explained by the temperature-dependent static charge susceptibility, through calculation of the correlation function of charge density operators. This work shows that the charge concentration of dopants in such systems plays a crucial role in determining the magnetic phase. A variety of multicritical points, such as transition temperatures and maximum susceptibilities, are compared in the undoped and doped cases. Our findings show that there exist two different transition temperatures and maximum susceptibilities depending on the ribbon width in doped structures. Another remarkable point is the invalidity (validity) of the Fermi liquid theory in nanoribbon-based systems at weak (strong) concentrations of dopants. The obtained results on the magnetic phase transition in such systems create new potential for magnetic graphene nanoribbon-based devices.

  17. Internet of Things Platform for Smart Farming: Experiences and Lessons Learnt.

    PubMed

    Jayaraman, Prem Prakash; Yavari, Ali; Georgakopoulos, Dimitrios; Morshed, Ahsan; Zaslavsky, Arkady

    2016-11-09

    Improving farm productivity is essential for increasing farm profitability and meeting the rapidly growing demand for food that is fuelled by rapid population growth across the world. Farm productivity can be increased by understanding and forecasting crop performance in a variety of environmental conditions. Crop recommendation is currently based on data collected in field-based agricultural studies that capture crop performance under a variety of conditions (e.g., soil quality and environmental conditions). However, crop performance data collection is currently slow, as such crop studies are often undertaken in remote and distributed locations, and such data are typically collected manually. Furthermore, the quality of manually collected crop performance data is very low, because it does not take into account earlier conditions that have not been observed by the human operators but that are essential for filtering out data that would lead to invalid conclusions (e.g., solar radiation readings taken in the afternoon after even a short rain, or after an overcast morning, are invalid and should not be used in assessing crop performance). Emerging Internet of Things (IoT) technologies, such as IoT devices (e.g., wireless sensor networks, network-connected weather stations, cameras, and smart phones), can be used to collate vast amounts of environmental and crop performance data, ranging from time series data from sensors, to spatial data from cameras, to human observations collected and recorded via mobile smart phone applications. Such data can then be analysed to filter out invalid data and compute personalised crop recommendations for any specific farm.
In this paper, we present the design of SmartFarmNet, an IoT-based platform that can automate the collection of environmental, soil, fertilisation, and irrigation data; automatically correlate such data and filter out invalid data from the perspective of assessing crop performance; and compute crop forecasts and personalised crop recommendations for any particular farm. SmartFarmNet can integrate virtually any IoT device, including commercially available sensors, cameras, weather stations, etc., and store their data in the cloud for performance analysis and recommendations. An evaluation of the SmartFarmNet platform and our experiences and lessons learnt in developing this system conclude the paper. SmartFarmNet is the first and currently largest system in the world (in terms of the number of sensors attached, crops assessed, and users it supports) that provides crop performance analysis and recommendations.
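    The solar-radiation example in the abstract amounts to a simple temporal validity rule. A minimal sketch of such a filter (the record layout and the 3-hour window are assumptions of this sketch, not SmartFarmNet's actual rules):

```python
def filter_solar_readings(readings, rain_window_h=3):
    """Drop solar-radiation readings taken within `rain_window_h`
    hours after any rain was recorded, since such readings are
    invalid for assessing crop performance.

    `readings` is a time-sorted list of (hour, solar_w_m2, rain_mm)
    tuples; the layout and window are illustrative assumptions.
    """
    last_rain = None
    valid = []
    for t, solar, rain_mm in readings:
        if rain_mm > 0:
            last_rain = t                      # rain: reading invalid
        elif last_rain is None or t - last_rain > rain_window_h:
            valid.append((t, solar))           # dry long enough: keep
    return valid

readings = [(8, 120.0, 0.0), (9, 40.0, 2.5), (10, 300.0, 0.0),
            (11, 350.0, 0.0), (13, 600.0, 0.0)]
valid = filter_solar_readings(readings)
```

    Here the 9:00 rain invalidates the 10:00 and 11:00 readings, leaving only the 8:00 and 13:00 measurements for crop-performance assessment.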

  18. 20 CFR 655.1132 - When will the Department suspend or invalidate an approved Attestation?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Requirements Must a Facility Meet to Employ H-1C Nonimmigrant Workers as Registered Nurses? § 655.1132 When... is suspended, invalidated or expired, as long as any H-1C nurse is at the facility, unless the...

  19. 20 CFR 655.1132 - When will the Department suspend or invalidate an approved Attestation?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Requirements Must a Facility Meet to Employ H-1C Nonimmigrant Workers as Registered Nurses? § 655.1132 When... is suspended, invalidated or expired, as long as any H-1C nurse is at the facility, unless the...

  20. Effects of invalid feedback on learning and feedback-related brain activity in decision-making.

    PubMed

    Ernst, Benjamin; Steinhauser, Marco

    2015-10-01

    For adaptive decision-making it is important to utilize only relevant, valid and to ignore irrelevant feedback. The present study investigated how feedback processing in decision-making is impaired when relevant feedback is combined with irrelevant and potentially invalid feedback. We analyzed two electrophysiological markers of feedback processing, the feedback-related negativity (FRN) and the P300, in a simple decision-making task, in which participants processed feedback stimuli consisting of relevant and irrelevant feedback provided by the color and meaning of a Stroop stimulus. We found that invalid, irrelevant feedback not only impaired learning, it also altered the amplitude of the P300 to relevant feedback, suggesting an interfering effect of irrelevant feedback on the processing of relevant feedback. In contrast, no such effect on the FRN was obtained. These results indicate that detrimental effects of invalid, irrelevant feedback result from failures of controlled feedback processing. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Underwater Sensor Network Redeployment Algorithm Based on Wolf Search

    PubMed Central

    Jiang, Peng; Feng, Yang; Wu, Feng

    2016-01-01

    This study addresses the optimization of node redeployment coverage in underwater wireless sensor networks. Given that nodes can easily become invalid in a harsh environment and that underwater wireless sensor networks are large in scale, an underwater sensor network redeployment algorithm based on wolf search was developed. The study applies the wolf search algorithm, combined with crowding-degree control, to the deployment of underwater wireless sensor networks. The proposed algorithm uses nodes to ensure coverage of the events and avoids premature convergence of the nodes, yielding good coverage. In addition, considering that obstacles exist in the underwater environment, nodes are prevented from becoming invalid by imitating the mechanism of avoiding predators; thus, the energy consumption of the network is reduced. Comparative analysis shows that the algorithm is simple and effective for wireless sensor network deployment. Compared with the optimized artificial fish swarm algorithm, the proposed algorithm exhibits advantages in network coverage, energy conservation, and obstacle avoidance. PMID:27775659
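    The quantity being optimised during redeployment is event coverage by the node set. A minimal 2-D sketch of the coverage computation (the paper's setting is 3-D underwater; the points and sensing radius here are illustrative):

```python
import math

def coverage_ratio(nodes, sensing_radius, events):
    """Fraction of event points covered by at least one sensor node:
    the objective a redeployment algorithm tries to maximise."""
    covered = 0
    for e in events:
        if any(math.dist(e, n) <= sensing_radius for n in nodes):
            covered += 1
    return covered / len(events)

nodes = [(0.0, 0.0), (4.0, 0.0)]
events = [(1.0, 0.0), (2.0, 0.0), (5.0, 0.0), (9.0, 0.0)]
ratio = coverage_ratio(nodes, sensing_radius=1.5, events=events)
```

    A metaheuristic such as wolf search would iteratively move `nodes` to raise this ratio while penalising overcrowded (redundant) placements.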

  2. Retrieval of volcanic ash composition and particle size using high spatial resolution satellite data

    NASA Astrophysics Data System (ADS)

    Williams, D.; Ramsey, M. S.

    2017-12-01

    Volcanic ash plumes are a complex mixture of glass, mineral and lithic fragments in suspension with multiple gas species. These plumes are rapidly injected into the atmosphere, traveling thousands of kilometers from their source and affecting lives and property. One important use of satellite-based data has been to monitor volcanic plumes and their associated hazards. For distal plumes, the transmissive properties of volcanic ash in the thermal infrared (TIR) region allows the effective radii, composition, and density to be determined using approaches such as radiative transfer modelling. Proximal to the vent, however, the plume remains opaque, rendering this method invalid. We take a new approach to proximal plume analysis by assuming the plume's upper layer behaves spectrally as a solid surface in the TIR, due to the temperature and density of the plume soon after ejection from the vent. If this hypothesis is true, linear mixing models can be employed together with an accurate spectral library to compute both the particle size and petrology of every plume pixel. This method is being applied to high spatial resolution TIR data from the ASTER sensor using the newly developed ASTER Volcanic Ash Library (AVAL). AVAL serves as the spectral end-member suite from which to model plume data of 4 volcanoes: Chaitén, Puyehue-Cordón Caulle, Sakurajima and Soufrière Hills Volcano (SHV). Preliminary results indicate that this approach may be valid. The Sakurajima and SHV AVAL spectra provide an excellent fit to the ASTER data, whereas crushed high silica glass served as an appropriate end-member for both Chaitén and Puyehue-Cordón Caulle. In all cases, the best-fit size fractions are < 45 µm. Analysis of the proximal plume is essential in understanding the volcanic processes occurring within the vent. This study provides unprecedented detail of this region of the plume, further demonstrating the need for the continuation of high spatial resolution TIR satellite missions.
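    The linear mixing step can be illustrated with a two-endmember closed-form least-squares fit under a sum-to-one abundance constraint (the spectra below are made-up numbers, not AVAL entries, and real plume analysis fits many endmembers at once):

```python
def unmix_two_endmembers(mixed, e1, e2):
    """Closed-form least-squares abundance for a two-endmember linear
    mixing model with abundances summing to one:
        mixed ~ a * e1 + (1 - a) * e2.
    Substituting and minimising ||mixed - e2 - a*(e1 - e2)||^2 gives
        a = sum((m - e2)*(e1 - e2)) / sum((e1 - e2)**2).
    """
    num = sum((m - b) * (a - b) for m, a, b in zip(mixed, e1, e2))
    den = sum((a - b) ** 2 for a, b in zip(e1, e2))
    return num / den

# Hypothetical 4-band emissivity spectra for two endmembers:
e_glass = [0.90, 0.80, 0.60, 0.40]
e_mineral = [0.50, 0.60, 0.80, 0.95]
# Synthesise a pixel that is 30% glass, 70% mineral, then recover it:
mixed = [0.3 * g + 0.7 * m for g, m in zip(e_glass, e_mineral)]
a_glass = unmix_two_endmembers(mixed, e_glass, e_mineral)
```

    The recovered abundance is 0.3 by construction; with a full library like AVAL, the same least-squares idea is solved per pixel over many endmembers to estimate both composition and particle-size fractions.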

  3. What can molecular modelling bring to the design of artificial inorganic cofactors?

    PubMed

    Muñoz Robles, Victor; Ortega-Carrasco, Elisabeth; González Fuentes, Eric; Lledós, Agustí; Maréchal, Jean-Didier

    2011-01-01

    In recent years, the development of synthetic metalloenzymes based on the insertion of inorganic catalysts into biological macromolecules has become a vivid field of investigation. The success of the design of these composites is highly dependent on an atomic understanding of the recognition process between inorganic and biological entities. Despite facing several challenging complexities, molecular modelling techniques could be particularly useful in providing such knowledge. This study aims to discuss how the prediction of the structural and energetic properties of the host-cofactor interactions can be performed by computational means. To do so, we designed a protocol that combines several methodologies such as protein-ligand docking and QM/MM techniques. The overall approach considers fundamental bioinorganic questions such as the participation of the receptor's amino acids in the first coordination sphere of the metal, the impact of receptor/cofactor flexibility on the structure of the complex, the cost of inserting the inorganic catalyst in place of the natural ligand/substrate into the host, and how experimental knowledge can improve or invalidate a theoretical model. As a real case system, we studied an artificial metalloenzyme obtained by the insertion of a Fe(Schiff base) moiety into the heme oxygenase of Corynebacterium diphtheriae. The experimental structure of this species shows a distorted cofactor leading to an unusual octahedral configuration of the iron with two proximal residues chelating the metal and no external ligand. This geometry is far from the conformation adopted by similar cofactors in other hosts and shows that a fine tuning exists between the coordination environment of the metal, the deformability of its organic ligand and the conformational adaptability of the receptor.
In a field where very little structural information is yet available, this work should help in building an initial molecular modelling framework for the discovery, design and optimization of inorganic cofactors. Moreover, the approach used in this study also lays the groundwork for the development of computational methods adequate for studying several metal-mediated biological processes, such as the generation of realistic three-dimensional models of metalloproteins bound to their natural cofactor or the folding of metal-containing peptides.

  4. 7 CFR 1230.633 - Canvassing ballots.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... CONSUMER INFORMATION Procedures for the Conduct of Referendum Referendum § 1230.633 Canvassing ballots. (a) Producers. (1) Counting the ballots. Under the supervision of FSA CED, acting on behalf of the Administrator... spoiled ballots. (2) Invalid ballots. Ballots will be declared invalid if a producer voting in-person has...

  5. 7 CFR 1230.633 - Canvassing ballots.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... CONSUMER INFORMATION Procedures for the Conduct of Referendum Referendum § 1230.633 Canvassing ballots. (a) Producers. (1) Counting the ballots. Under the supervision of FSA CED, acting on behalf of the Administrator... spoiled ballots. (2) Invalid ballots. Ballots will be declared invalid if a producer voting in-person has...

  6. 7 CFR 1230.633 - Canvassing ballots.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... CONSUMER INFORMATION Procedures for the Conduct of Referendum Referendum § 1230.633 Canvassing ballots. (a) Producers. (1) Counting the ballots. Under the supervision of FSA CED, acting on behalf of the Administrator... spoiled ballots. (2) Invalid ballots. Ballots will be declared invalid if a producer voting in-person has...

  7. 7 CFR 1230.633 - Canvassing ballots.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... CONSUMER INFORMATION Procedures for the Conduct of Referendum Referendum § 1230.633 Canvassing ballots. (a) Producers. (1) Counting the ballots. Under the supervision of FSA CED, acting on behalf of the Administrator... spoiled ballots. (2) Invalid ballots. Ballots will be declared invalid if a producer voting in-person has...

  8. 7 CFR 1230.633 - Canvassing ballots.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... CONSUMER INFORMATION Procedures for the Conduct of Referendum Referendum § 1230.633 Canvassing ballots. (a) Producers. (1) Counting the ballots. Under the supervision of FSA CED, acting on behalf of the Administrator... spoiled ballots. (2) Invalid ballots. Ballots will be declared invalid if a producer voting in-person has...

  9. Global dynamics of a delay differential equation with spatial non-locality in an unbounded domain

    NASA Astrophysics Data System (ADS)

    Yi, Taishan; Zou, Xingfu

    In this paper, we study the global dynamics of a class of differential equations with temporal delay and spatial non-locality in an unbounded domain. Adopting the compact open topology, we describe the delicate asymptotic properties of the nonlocal delayed effect and establish a priori estimates for nontrivial solutions, which enables us to show the permanence of the equation. Combining these results with a dynamical systems approach, we determine the global dynamics of the equation under appropriate conditions. Applying the main results to the model with Ricker's birth function and Mackey-Glass's hematopoiesis function, we obtain threshold results for the global dynamics of these two models. We explain why our results on the global attractivity of the positive equilibrium in C∖{0} under the compact open topology become invalid in C∖{0} with respect to the usual supremum norm, and we identify a subset of C∖{0} in which the positive equilibrium remains attractive with respect to the supremum norm.
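    For readers who want to see the threshold behaviour concretely, here is a minimal Euler-scheme simulation of the Mackey-Glass hematopoiesis equation (a spatially homogeneous special case); the parameters are illustrative, not drawn from the paper:

```python
import numpy as np

# Mackey-Glass hematopoiesis model with delay tau (illustrative parameters):
#   x'(t) = -a*x(t) + b*x(t - tau) / (1 + x(t - tau)**n)
# Positive equilibrium: x_star = (b/a - 1)**(1/n).
a, b, n, tau = 1.0, 2.0, 2.0, 0.5
dt, T = 0.01, 50.0
lag, steps = int(tau / dt), int(T / dt)

x = np.empty(steps + lag + 1)
x[: lag + 1] = 0.3  # constant positive initial history
for k in range(lag, lag + steps):
    xd = x[k - lag]  # delayed state x(t - tau)
    x[k + 1] = x[k] + dt * (-a * x[k] + b * xd / (1 + xd ** n))

x_star = (b / a - 1) ** (1 / n)  # = 1.0 for these parameters
```

    For this parameter set the solution converges to the positive equilibrium; for other parameters (larger delay or steeper nonlinearity) the same scheme exhibits oscillations, which is the kind of dichotomy the threshold results describe.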

  10. Validation of Groundwater Models: Meaningful or Meaningless?

    NASA Astrophysics Data System (ADS)

    Konikow, L. F.

    2003-12-01

    Although numerical simulation models are valuable tools for analyzing groundwater systems, their predictive accuracy is limited. People who apply groundwater flow or solute-transport models, as well as those who make decisions based on model results, naturally want assurance that a model is "valid." To many people, model validation implies some authentication of the truth or accuracy of the model. History matching is often presented as the basis for model validation. Although such model calibration is a necessary modeling step, it is simply insufficient for model validation. Because of parameter uncertainty and solution non-uniqueness, declarations of validation (or verification) of a model are not meaningful. Post-audits represent a useful means to assess the predictive accuracy of a site-specific model, but they require the existence of long-term monitoring data. Model testing may yield invalidation, but that is an opportunity to learn and to improve the conceptual and numerical models. Examples of post-audits and of the application of a solute-transport model to a radioactive waste disposal site illustrate deficiencies in model calibration, prediction, and validation.

  11. Reflexive Orienting in Response to Short- and Long-Duration Gaze Cues in Young, Young-Old, and Old-Old Adults

    PubMed Central

    Gayzur, Nora D.; Langley, Linda K.; Kelland, Chris; Wyman, Sara V.; Saville, Alyson L.; Ciernia, Annie T.; Padmanabhan, Ganesh

    2013-01-01

    Shifting visual focus based on the perceived gaze direction of another person is one form of joint attention. The present study investigated whether this socially-relevant form of orienting is reflexive and whether it is influenced by age. Green and Woldorff (2012) argued that rapid cueing effects (faster responses to validly-cued targets than to invalidly-cued targets) were limited to conditions in which a cue overlapped in time with a target. They attributed slower responses following invalid cues to the time needed to resolve incongruent spatial information provided by the concurrently-presented cue and target. The present study examined orienting responses of young (18-31 years), young-old (60-74 years), and old-old adults (75-91 years) following uninformative central gaze cues that overlapped in time with the target (Experiment 1) or that were removed prior to target presentation (Experiment 2). When the cue and target overlapped, all three groups localized validly-cued targets faster than invalidly-cued targets, and validity effects emerged earlier for the two younger groups (at 100 ms post cue onset) than for the old-old group (at 300 ms post cue onset). With a short duration cue (Experiment 2), validity effects developed rapidly (by 100 ms) for all three groups, suggesting that validity effects resulted from reflexive orienting based on gaze cue information rather than from cue-target conflict. Thus, although old-old adults may be slow to disengage from persistent gaze cues, attention continues to be reflexively guided by gaze cues late in life. PMID:24170377

  12. Likert or Not, Survey (In)Validation Requires Explicit Theories and True Grit

    ERIC Educational Resources Information Center

    McGrane, Joshua A.; Nowland, Trisha

    2017-01-01

    From the time of Likert (1932) on, attitudes of expediency regarding both theory and methodology became apparent with reference to survey construction and validation practices. In place of theory and more--theoretically minded methods, such as those found in the early work of Thurstone (1928) and Coombs (1964), statistical models and…

  13. 43 CFR 3286.1 - Model Unit Agreement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... continue such drilling diligently until the ___ formation has been tested or until at a lesser depth... Operator shall not in any event be required to drill said well to a depth in excess of ___ feet. 11.5 The... assert any legal or constitutional right or defense pertaining to the validity or invalidity of any law...

  14. Why style matters - uncertainty and structural interpretation in thrust belts.

    NASA Astrophysics Data System (ADS)

    Butler, Rob; Bond, Clare; Watkins, Hannah

    2016-04-01

    Structural complexity together with challenging seismic imaging make for significant uncertainty in developing geometric interpretations of fold and thrust belts. Here we examine these issues and develop more realistic approaches to building interpretations. At all scales, the best tests of the internal consistency of individual interpretations come from structural restoration (section balancing), provided allowance is made for heterogeneity in stratigraphy and strain. However, many existing balancing approaches give misleading perceptions of interpretational risk - both on the scale of individual fold-thrust (trap) structures and in regional cross-sections. At the trap-scale, idealised models are widely cited - fault-bend-fold, fault-propagation folding and trishear. These make entirely arbitrary choices for fault localisation and layer-by-layer deformation: precise relationships between faults and fold geometry are generally invalidated by real-world conditions of stratigraphic variation and distributed strain. Furthermore, subsurface predictions made using these idealisations for hydrocarbon exploration commonly fail the test of drilling. It is rarely acknowledged that the geometric reliability of seismic images depends on the assigned seismic velocity model, which in turn relies on geological interpretation. Thus iterative approaches are required between geology and geophysics. The portfolio of commonly cited outcrop analogues is strongly biased to examples that simply conform to idealised models - apparently abnormal structures are rarely described - or even photographed! Insight can come from gravity-driven deep-water fold-belts where part of the spectrum of fold-thrust complexity is resolved through seismic imaging. This imagery shows deformation complexity in fold forelimbs and backlimbs. However, the applicability of these weakly lithified systems to well-lithified successions (e.g. carbonates) of many foreland thrust belts remains conjectural.
Examples of lithified systems will be drawn from the foothills of the Colombian Andes and the Papuan fold-belt. These show major forelimb structures with segmented steep limbs containing substantial oil columns, suggesting that forelimb complexity in lithified sections may be more common than predicted by idealised models. As with individual fold-thrust structures, regional cross-sections are commonly open to multiple interpretations. To date, over-reliance on comparative approaches with a narrow range of published studies (e.g. Canadian cordilleran foothills) has biased global interpretations of thrust systems. Perhaps the most significant issues relate to establishing a depth to detachment - specifically the involvement of basement at depth - especially the role of pre-existing (rift-originated) faults and their inversion. Not only do these choices impact the local interpretation; the inferred shortening values, obtained by comparing restored section lengths, can also be radically different. Further issues arise for emergent, syn-depositional thrust systems where sedimentation prohibits flat-on-flat thrusting in favour of continuously ramping thrust trajectories. Inappropriate adoption of geometries gathered from buried (duplex) systems can create geometric interpretations that are tectono-stratigraphically invalid. This presentation illustrates these topics using a variety of thrust systems with the aim of promoting discussion on developing better interpretative strategies than those adopted hitherto.

  15. Four Methods for Analyzing Partial Interval Recording Data, with Application to Single-Case Research.

    PubMed

    Pustejovsky, James E; Swan, Daniel M

    2015-01-01

    Partial interval recording (PIR) is a procedure for collecting measurements during direct observation of behavior. It is used in several areas of educational and psychological research, particularly in connection with single-case research. Measurements collected using partial interval recording suffer from construct invalidity because they are not readily interpretable in terms of the underlying characteristics of the behavior. Using an alternating renewal process model for the behavior under observation, we demonstrate that ignoring the construct invalidity of PIR data can produce misleading inferences, such as inferring that an intervention reduces the prevalence of an undesirable behavior when in fact it has the opposite effect. We then propose four different methods for analyzing PIR summary measurements, each of which can be used to draw inferences about interpretable behavioral parameters. We demonstrate the methods by applying them to data from two single-case studies of problem behavior.
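    The construct problem is easy to reproduce in simulation. Below is a minimal sketch (not the authors' code) of an alternating renewal process scored by partial interval recording; it shows how the PIR summary overstates the true prevalence of the behavior:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_pir(mu_event, mu_gap, session=300.0, interval=10.0, reps=200):
    """Simulate an alternating renewal process (exponential event and gap
    durations with means mu_event / mu_gap seconds) and score each session
    with partial interval recording: an interval is scored if the behavior
    occurs at any point within it."""
    n_int = int(session / interval)
    pir_scores, prevalences = [], []
    for _ in range(reps):
        t, in_event, events = 0.0, False, []
        while t < session:
            dur = rng.exponential(mu_event if in_event else mu_gap)
            if in_event:
                events.append((t, min(t + dur, session)))
            t += dur
            in_event = not in_event
        prevalences.append(sum(e - s for s, e in events) / session)
        scored = sum(
            any(s < (k + 1) * interval and e > k * interval for s, e in events)
            for k in range(n_int)
        )
        pir_scores.append(scored / n_int)
    return np.mean(pir_scores), np.mean(prevalences)

pir, prev = simulate_pir(mu_event=4.0, mu_gap=20.0)
# PIR systematically overstates prevalence: any contact with an interval
# scores the whole interval as an occurrence.
```

    With these illustrative parameters the true prevalence is about one sixth of the session, yet the mean PIR score is far higher, which is the construct-invalidity problem the four proposed methods are designed to correct for.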

  16. Validation of Measures of Biosocial Precursors to Borderline Personality Disorder: Childhood Emotional Vulnerability and Environmental Invalidation

    ERIC Educational Resources Information Center

    Sauer, Shannon E.; Baer, Ruth A.

    2010-01-01

    Linehan's biosocial theory suggests that borderline personality disorder (BPD) results from a transaction of two childhood precursors: emotional vulnerability and an invalidating environment. Until recently, few empirical studies have explored relationships between these theoretical precursors and symptoms of the disorder. Psychometrically sound…

  17. The Role of Maternal Emotional Validation and Invalidation on Children's Emotional Awareness

    ERIC Educational Resources Information Center

    Lambie, John A.; Lindberg, Anja

    2016-01-01

    Emotional awareness--that is, accurate emotional self-report--has been linked to positive well-being and mental health. However, it is still unclear how emotional awareness is socialized in young children. This observational study examined how a particular parenting communicative style--emotional validation versus emotional invalidation--was…

  18. Studies of the Seriousness of Three Threats to Passage Dependence.

    ERIC Educational Resources Information Center

    Hanna, Gerald S.; Oaster, Thomas R.

    1980-01-01

    Certain kinds of multiple-choice reading comprehension questions may be answered correctly at the higher-than-chance level when they are administered without the accompanying passage. These high risk questions do not necessarily lead to passage dependence invalidity. They threaten but do not prove invalidity. (Author/CP)

  19. 25 CFR 11.604 - Declaration of invalidity.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... enter a decree declaring the invalidity of a marriage entered into under the following circumstances: (1) A party lacked capacity to consent to the marriage, either because of mental incapacity or infirmity... to enter into a marriage by fraud or duress; or (3) A party lacks the physical capacity to consummate...

  20. Effect of Bending Stiffness of the Electroactive Polymer Element on the Performance of a Hybrid Actuator System (HYBAS)

    NASA Technical Reports Server (NTRS)

    Xu, Tian-Bing; Su, Ji; Jiang, Xiaoning; Rehrig, Paul W.; Zhang, Shujun; Shrout, Thomas R.; Zhang, Qiming

    2006-01-01

    An electroactive polymer (EAP)-ceramic hybrid actuation system (HYBAS) was developed recently at NASA Langley Research Center. This paper focuses on the effect of the bending stiffness of the EAP component on the performance of a HYBAS, in which the actuation of the EAP element can match the theoretical prediction at various length/thickness ratios for a constant elastic modulus of the EAP component. The effects of the elastic modulus and length/thickness ratio of the EAP component on the bending stiffness were studied. A critical bending stiffness required to keep the actuation of the EAP element suitable for rigid-beam-theory-based modeling was found for electron-irradiated P(VDF-TrFE) copolymer. For example, agreement between experimental data and theoretical modeling is demonstrated for a HYBAS with an EAP length/thickness ratio of 375. However, the beam-based theoretical modeling becomes invalid (i.e., the profile of the HYBAS movement does not follow the prediction of theoretical modeling) when the bending stiffness is lower than a critical value.
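    The cubic dependence of bending stiffness on thickness that underlies this critical-stiffness behavior can be sketched with the standard thin-plate formula; the modulus, thickness, and Poisson ratio below are hypothetical, not properties of the actual P(VDF-TrFE) element:

```python
# Thin-plate bending stiffness per unit width: D = E * t**3 / (12 * (1 - nu**2)).
# E, t, nu are hypothetical values chosen only to show the cubic thickness
# scaling that drives the critical-stiffness behavior of the EAP element.
def bending_stiffness(E, t, nu=0.35):
    """Bending stiffness per unit width (N*m) of a thin film."""
    return E * t ** 3 / (12.0 * (1.0 - nu ** 2))

D_full = bending_stiffness(E=4.0e8, t=100e-6)  # 100-micron film
D_half = bending_stiffness(E=4.0e8, t=50e-6)   # half the thickness
ratio = D_full / D_half  # cubic scaling: halving t cuts stiffness ~8x
```

    Because stiffness falls with the cube of thickness, modest increases in the length/thickness ratio can push the element below the critical stiffness at which rigid-beam modeling breaks down.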

  1. Adherence to and effectiveness of highly active antiretroviral treatment for HIV infection: assessing the bidirectional relationship.

    PubMed

    Lamiraud, Karine; Moatti, Jean-Paul; Raffi, François; Carrieri, Maria-Patrizia; Protopopescu, Camelia; Michelet, Christian; Schneider, Luminita; Collin, Fideline; Leport, Catherine; Spire, Bruno

    2012-05-01

    It is well established that high adherence among HIV-infected patients on highly active antiretroviral treatment (HAART) is a major determinant of virological and immunologic success. Furthermore, psychosocial research has identified a wide range of adherence factors, including patients' subjective beliefs about the effectiveness of HAART. Current statistical approaches, mainly based on the separate identification either of factors associated with treatment effectiveness or of those associated with adherence, fail to properly explore the true relationship between adherence and treatment effectiveness. Adherence behavior may be influenced not only by perceived benefits-which are usually the focus of related studies-but also by objective treatment benefits reflected in biological outcomes. Our objective was to assess the bidirectional relationship between adherence and response to treatment among patients enrolled in the ANRS CO8 APROCO-COPILOTE study. We compared a conventional statistical approach based on the separate estimation of an adherence and an effectiveness equation to an econometric approach using a 2-equation simultaneous system based on the same 2 equations. Our results highlight a reciprocal relationship between adherence and treatment effectiveness. After controlling for endogeneity, adherence was positively associated with treatment effectiveness. Furthermore, CD4 count gain after baseline was found to have a positive significant effect on adherence at each observation period. This immunologic parameter was not significant when the adherence equation was estimated separately. In the 2-equation model, the covariances between the disturbances of both equations were found to be significant, thus confirming the statistical appropriateness of studying adherence and treatment effectiveness jointly.
Our results suggest that positive biological outcomes arising from high adherence in turn reinforce continued adherence; they also strengthen the argument that patients who do not experience rapid improvement in their immunologic and clinical statuses after HAART initiation should be prioritized when developing adherence support interventions. Furthermore, they invalidate the hypothesis that HAART leads to "false reassurance" among HIV-infected patients.
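    The econometric idea - estimating the two equations jointly so that the endogeneity of each outcome in the other equation is handled - can be sketched with a two-stage least-squares toy model. The coefficients and simulated data below are hypothetical, not the APROCO-COPILOTE specification:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 20000

# Structural system (hypothetical coefficients, simulated data):
#   adherence:      y1 = 0.5*y2 + z1 + u1
#   effectiveness:  y2 = 0.8*y1 + z2 + u2,  with corr(u1, u2) = 0.6
z1 = rng.normal(size=N)
z2 = rng.normal(size=N)
u = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=N)

# Solve the simultaneity analytically (reduced form) to generate the data.
det = 1 - 0.5 * 0.8
y1 = (z1 + 0.5 * z2 + u[:, 0] + 0.5 * u[:, 1]) / det
y2 = 0.8 * y1 + z2 + u[:, 1]

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Separate (naive) estimation: biased because y2 is correlated with u1.
naive = ols(np.column_stack([y2, z1]), y1)[0]

# Two-stage least squares: instrument y2 with the exogenous z2.
Z = np.column_stack([z2, z1])
y2_hat = Z @ ols(Z, y2)
iv = ols(np.column_stack([y2_hat, z1]), y1)[0]
```

    The naive single-equation estimate is badly biased by the correlated disturbances, while the instrumented estimate recovers the true coefficient - the same reason the study's joint system detects the CD4-on-adherence effect that the separate estimation misses.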

  2. On the validity of the arithmetic-geometric mean method to locate the optimal solution in a supply chain system

    NASA Astrophysics Data System (ADS)

    Chung, Kun-Jen

    2012-08-01

    Cardenas-Barron [Cardenas-Barron, L.E. (2010) 'A Simple Method to Compute Economic order Quantities: Some Observations', Applied Mathematical Modelling, 34, 1684-1688] indicates that there are several functions for which the arithmetic-geometric mean method (AGM) does not give the minimum. This article presents another situation in which the AGM inequality may fail to locate the optimal solution, for Teng, Chen, and Goyal [Teng, J.T., Chen, J., and Goyal S.K. (2009), 'A Comprehensive Note on: An Inventory Model under Two Levels of Trade Credit and Limited Storage Space Derived without Derivatives', Applied Mathematical Modelling, 33, 4388-4396], Teng and Goyal [Teng, J.T., and Goyal S.K. (2009), 'Comment on 'Optimal Inventory Replenishment Policy for the EPQ Model under Trade Credit Derived without Derivatives', International Journal of Systems Science, 40, 1095-1098] and Hsieh, Chang, Weng, and Dye [Hsieh, T.P., Chang, H.J., Weng, M.W., and Dye, C.Y. (2008), 'A Simple Approach to an Integrated Single-vendor Single-buyer Inventory System with Shortage', Production Planning and Control, 19, 601-604]. The main purpose of this article is therefore to adopt the calculus approach, not only to overcome the shortcomings of the arithmetic-geometric mean method of Teng et al. (2009), Teng and Goyal (2009) and Hsieh et al. (2008), but also to develop complete solution procedures for them.
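    The contrast between the two approaches can be illustrated numerically. For the classic two-term cost the AGM equality condition and calculus agree, but adding a term shows why naively equating one pair of terms can miss the optimum. The functions below are illustrative, not those of the cited papers:

```python
import math
from scipy.optimize import minimize_scalar

# Two-term inventory-type cost f(Q) = a*Q + b/Q: the AGM inequality
# a*Q + b/Q >= 2*sqrt(a*b), with equality when a*Q = b/Q, locates the
# minimum at Q* = sqrt(b/a), and calculus agrees. (Illustrative a, b.)
a, b = 2.0, 8.0
q_agm = math.sqrt(b / a)  # = 2.0
res = minimize_scalar(lambda q: a * q + b / q,
                      bounds=(1e-6, 100.0), method="bounded")

# With an extra term, naively equating the same pair of terms fails:
# for g(Q) = a*Q + b/Q + Q**2 the pairwise condition still suggests Q = 2,
# while the true minimizer solves g'(Q) = a - b/Q**2 + 2*Q = 0 (about 1.31).
res2 = minimize_scalar(lambda q: a * q + b / q + q * q,
                       bounds=(1e-6, 100.0), method="bounded")
```

    The calculus route works regardless of how the cost decomposes into terms, which is why it is the safe general procedure when the AGM equality conditions cannot all hold simultaneously.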

  3. 38 CFR 36.4304 - Deviations; changes of identity.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... identity. 36.4304 Section 36.4304 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS... Deviations; changes of identity. A deviation of more than 5 percent between the estimates upon which a... change in the identity of the property upon which the original appraisal was based, will invalidate the...

  4. 38 CFR 36.4304 - Deviations; changes of identity.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... identity. 36.4304 Section 36.4304 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS... Deviations; changes of identity. A deviation of more than 5 percent between the estimates upon which a... change in the identity of the property upon which the original appraisal was based, will invalidate the...

  5. 38 CFR 36.4304 - Deviations; changes of identity.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... identity. 36.4304 Section 36.4304 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS... Deviations; changes of identity. A deviation of more than 5 percent between the estimates upon which a... change in the identity of the property upon which the original appraisal was based, will invalidate the...

  6. 38 CFR 36.4304 - Deviations; changes of identity.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... identity. 36.4304 Section 36.4304 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS... Deviations; changes of identity. A deviation of more than 5 percent between the estimates upon which a... change in the identity of the property upon which the original appraisal was based, will invalidate the...

  7. 38 CFR 36.4304 - Deviations; changes of identity.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... identity. 36.4304 Section 36.4304 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS... Deviations; changes of identity. A deviation of more than 5 percent between the estimates upon which a... change in the identity of the property upon which the original appraisal was based, will invalidate the...

  8. Implication of correlations among some common stability statistics - a Monte Carlo simulation.

    PubMed

    Piepho, H P

    1995-03-01

    Stability analysis of multilocation trials is often based on a mixed two-way model. Two stability measures in frequent use are the environmental variance (Si²) and the ecovalence (Wi). Under the two-way model the rank orders of the expected values of these two statistics are identical for a given set of genotypes. By contrast, empirical rank correlations among these measures are consistently low. This suggests that the two-way mixed model may not be appropriate for describing real data. To check this hypothesis, a Monte Carlo simulation was conducted. It revealed that the low empirical rank correlation between Si² and Wi is most likely due to sampling errors. It is concluded that the observed low rank correlation does not invalidate the two-way model. The paper also discusses tests for homogeneity of Si² as well as implications of the two-way model for the classification of stability statistics.
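    A minimal Monte Carlo sketch of this comparison (not the paper's simulation design) generates two-way data with genotype-specific interaction variances and computes the empirical rank correlation between the environmental variance and the ecovalence:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
G, E = 20, 8  # genotypes x environments (hypothetical trial size)

# Two-way model x_ij = mu + g_i + e_j + (ge)_ij, with a genotype-specific
# interaction SD so that the true stabilities differ between genotypes.
mu = 50.0
g = rng.normal(0.0, 2.0, G)
e = rng.normal(0.0, 3.0, E)
inter_sd = np.linspace(0.5, 4.0, G)
x = mu + g[:, None] + e[None, :] + rng.normal(0.0, inter_sd[:, None], (G, E))

# Environmental variance Si^2: variance across environments per genotype.
S2 = x.var(axis=1, ddof=1)

# Wricke's ecovalence Wi: each genotype's squared GxE contribution.
resid = (x - x.mean(axis=1, keepdims=True)
           - x.mean(axis=0, keepdims=True) + x.mean())
W = (resid ** 2).sum(axis=1)

rho, _ = spearmanr(S2, W)  # empirical rank correlation between Si^2 and Wi
```

    With only a handful of environments per genotype, both statistics are estimated noisily, so the empirical rank correlation falls well below the perfect agreement of their expected rank orders - the sampling-error explanation the paper advances.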

  9. 30 CFR 62.173 - Follow-up evaluation when an audiogram is invalid.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... invalid. 62.173 Section 62.173 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR UNIFORM MINE HEALTH REGULATIONS OCCUPATIONAL NOISE EXPOSURE § 62.173 Follow-up evaluation when an... occupational exposure to noise or the wearing of hearing protectors, the mine operator must refer the miner for...

  10. 30 CFR 62.173 - Follow-up evaluation when an audiogram is invalid.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... invalid. 62.173 Section 62.173 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR UNIFORM MINE HEALTH REGULATIONS OCCUPATIONAL NOISE EXPOSURE § 62.173 Follow-up evaluation when an... occupational exposure to noise or the wearing of hearing protectors, the mine operator must refer the miner for...

  11. 49 CFR 40.96 - What criteria do laboratories use to establish that a specimen is invalid?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing... laboratory, you must use the invalid test result criteria for the initial and confirmation testing as... whether sending the specimen to another HHS certified laboratory for testing would be useful in being able...

  12. 49 CFR 40.96 - What criteria do laboratories use to establish that a specimen is invalid?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing... laboratory, you must use the invalid test result criteria for the initial and confirmation testing as... whether sending the specimen to another HHS certified laboratory for testing would be useful in being able...

  13. 49 CFR 40.96 - What criteria do laboratories use to establish that a specimen is invalid?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing... laboratory, you must use the invalid test result criteria for the initial and confirmation testing as... whether sending the specimen to another HHS certified laboratory for testing would be useful in being able...

  14. 49 CFR 40.96 - What criteria do laboratories use to establish that a specimen is invalid?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing... laboratory, you must use the invalid test result criteria for the initial and confirmation testing as... whether sending the specimen to another HHS certified laboratory for testing would be useful in being able...

  15. 49 CFR 40.96 - What criteria do laboratories use to establish that a specimen is invalid?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing... laboratory, you must use the invalid test result criteria for the initial and confirmation testing as... whether sending the specimen to another HHS certified laboratory for testing would be useful in being able...

  16. An Overlooked Population in Community College: International Students' (In)Validation Experiences With Academic Advising

    ERIC Educational Resources Information Center

    Zhang, Yi

    2016-01-01

    Objective: Guided by validation theory, this study aims to better understand the role that academic advising plays in international community college students' adjustment. More specifically, this study investigated how academic advising validates or invalidates their academic and social experiences in a community college context. Method: This…

  17. 30 CFR 253.50 - How can MMS refuse or invalidate my OSFR evidence?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false How can MMS refuse or invalidate my OSFR evidence? 253.50 Section 253.50 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, REGULATION, AND ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL SPILL FINANCIAL RESPONSIBILITY FOR OFFSHORE FACILITIES...

  18. 30 CFR 553.50 - How can BOEM refuse or invalidate my OSFR evidence?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 2 2013-07-01 2013-07-01 false How can BOEM refuse or invalidate my OSFR evidence? 553.50 Section 553.50 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL SPILL FINANCIAL RESPONSIBILITY FOR OFFSHORE FACILITIES Revocation and Penalties § 553...

  19. 30 CFR 553.50 - How can BOEM refuse or invalidate my OSFR evidence?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 2 2014-07-01 2014-07-01 false How can BOEM refuse or invalidate my OSFR evidence? 553.50 Section 553.50 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL SPILL FINANCIAL RESPONSIBILITY FOR OFFSHORE FACILITIES Revocation and Penalties § 553...

  20. 30 CFR 553.50 - How can BOEM refuse or invalidate my OSFR evidence?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 2 2012-07-01 2012-07-01 false How can BOEM refuse or invalidate my OSFR evidence? 553.50 Section 553.50 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL SPILL FINANCIAL RESPONSIBILITY FOR OFFSHORE FACILITIES Revocation and Penalties § 553...

  1. 30 CFR 253.50 - How can MMS refuse or invalidate my OSFR evidence?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false How can MMS refuse or invalidate my OSFR evidence? 253.50 Section 253.50 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR OFFSHORE OIL SPILL FINANCIAL RESPONSIBILITY FOR OFFSHORE FACILITIES Revocation and Penalties § 253.50 How...

  2. Modeling Addictive Consumption as an Infectious Disease*

    PubMed Central

    Alamar, Benjamin; Glantz, Stanton A.

    2011-01-01

The dominant model of addictive consumption in economics is the theory of rational addiction. The addict in this model chooses how much to consume based upon their level of addiction (past consumption), the current benefits, and all future costs. Several empirical studies of cigarette sales and price data have found a correlation between future prices and current consumption. These studies have argued that this correlation validates the rational addiction model and invalidates any model in which future consumption is not considered. An alternative to the rational addiction model is one in which addiction spreads through a population as if it were an infectious disease, as supported by the large body of empirical research on addictive behaviors. In this model an individual's probability of becoming addicted to a substance is linked to the behavior of their parents, friends, and society. In the infectious disease model, current consumption is based only on the level of addiction and current costs. Price and consumption data from a simulation of the infectious disease model showed a qualitative match to the results of the rational addiction model. The infectious disease model can explain all of the theoretical results of the rational addiction model while additionally explaining initial consumption of the addictive good. PMID:21339848
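A toy version of such an infectious-disease model of addiction can be sketched as follows. This is a minimal illustration, not the authors' simulation; the transmission rate, quit rate, and the inverse-price consumption response are assumptions made purely for demonstration:

```python
import random

def simulate(n=1000, steps=50, beta=0.3, quit_rate=0.05, price=1.0, seed=1):
    """Toy infectious-disease model of addiction: the chance of starting
    depends only on how many peers are already addicted, and aggregate
    consumption depends only on current prevalence and current price."""
    random.seed(seed)
    addicted = [False] * n
    addicted[0] = True                      # a single seed case
    history = []
    for _ in range(steps):
        prevalence = sum(addicted) / n
        for i in range(n):
            if not addicted[i] and random.random() < beta * prevalence:
                addicted[i] = True          # social "transmission"
            elif addicted[i] and random.random() < quit_rate:
                addicted[i] = False         # recovery
        # aggregate consumption: addicts consume more when price is low
        history.append(sum(addicted) * (1.0 / price))
    return history

history = simulate()
```

Unlike a rational-addiction agent, nothing here looks at future prices; any apparent correlation with them must come from the dynamics of prevalence.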

  3. A novel technique for fetal heart rate estimation from Doppler ultrasound signal

    PubMed Central

    2011-01-01

Background The currently used fetal monitoring instrumentation, based on the Doppler ultrasound technique, provides the fetal heart rate (FHR) signal with limited accuracy. This is particularly noticeable as a significant decrease in a clinically important feature - the variability of the FHR signal. The aim of our work was to develop a novel, efficient technique for processing the ultrasound signal that could estimate the cardiac cycle duration with accuracy comparable to direct electrocardiography. Methods We have proposed a new technique which provides true beat-to-beat values of the FHR signal through multiple measurement of a given cardiac cycle in the ultrasound signal. The method consists of three steps: dynamic adjustment of the autocorrelation window, adaptive autocorrelation peak detection, and determination of beat-to-beat intervals. The estimated fetal heart rate values and the calculated indices describing FHR variability were compared to reference data obtained from the direct fetal electrocardiogram, as well as to another method for FHR estimation. Results The results revealed that our method increases the accuracy in comparison to currently used fetal monitoring instrumentation, and thus enables reliable calculation of parameters describing the variability of FHR. Compared with the other method for FHR estimation, our approach rejected a much lower number of measured cardiac cycles as invalid. Conclusions The proposed method for fetal heart rate determination on a beat-to-beat basis offers high accuracy of the heart interval measurement, enabling reliable quantitative assessment of FHR variability while reducing the number of invalid cardiac cycle measurements. PMID:21999764
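The autocorrelation step at the heart of such Doppler-based interval estimation can be sketched as follows. This is a simplified illustration, not the authors' algorithm; the synthetic envelope, sampling rate, and heart-rate search bounds are assumptions:

```python
import numpy as np

def estimate_interval(signal, fs, min_bpm=90, max_bpm=200):
    """Estimate one cardiac-cycle duration (s) from a Doppler envelope
    via the autocorrelation peak; the bpm bounds limit the lag search."""
    x = signal - signal.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags >= 0
    lo = int(fs * 60.0 / max_bpm)   # shortest plausible interval (samples)
    hi = int(fs * 60.0 / min_bpm)   # longest plausible interval (samples)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return lag / fs

# synthetic quasi-periodic envelope at ~2.4 Hz (144 bpm)
fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
envelope = 1 + np.sin(2 * np.pi * 2.4 * t)
interval = estimate_interval(envelope, fs)
```

The paper's contribution lies in adapting the window and peak detection per beat; the fixed-window version above only shows the core lag-of-maximum idea.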

  4. Embedded performance validity tests within the Hopkins Verbal Learning Test - Revised and the Brief Visuospatial Memory Test - Revised.

    PubMed

    Sawyer, R John; Testa, S Marc; Dux, Moira

    2017-01-01

Various research studies and neuropsychology practice organizations have reiterated the importance of developing embedded performance validity tests (PVTs) to detect potentially invalid neurocognitive test data. This study investigated whether measures within the Hopkins Verbal Learning Test - Revised (HVLT-R) and the Brief Visuospatial Memory Test - Revised (BVMT-R) could accurately classify individuals who fail two or more PVTs during routine clinical assessment. The sample consisted of 109 clinically referred United States military veterans (mean age = 52.4, SD = 13.3), each of whom received a battery of neuropsychological tests. Based on performance validity findings, veterans were assigned to valid (n = 86) or invalid (n = 23) groups. Of the 109 patients in the overall sample, 77 were administered the HVLT-R and 75 were administered the BVMT-R, which were examined for classification accuracy. The HVLT-R Recognition Discrimination Index and the BVMT-R Retention Percentage showed good to adequate discrimination, with areas under the curve of .78 and .70, respectively. The HVLT-R Recognition Discrimination Index showed sensitivity of .53 with specificity of .93. The BVMT-R Retention Percentage demonstrated sensitivity of .31 with specificity of .92. When used in conjunction with other PVTs, these new embedded PVTs may be effective in the detection of invalid test data, although they are not intended for use in patients with dementia.
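The reported sensitivity and specificity figures follow the usual definitions; a minimal sketch, using hypothetical scores and an arbitrary cutoff rather than the study's data:

```python
def sens_spec(scores, invalid, cutoff):
    """Sensitivity = fraction of invalid cases at/below the cutoff;
    specificity = fraction of valid cases above it (lower scores are
    taken here to indicate invalid performance)."""
    tp = sum(1 for s, bad in zip(scores, invalid) if bad and s <= cutoff)
    tn = sum(1 for s, bad in zip(scores, invalid) if not bad and s > cutoff)
    p = sum(invalid)
    n = len(invalid) - p
    return tp / p, tn / n

# hypothetical discrimination scores; 1 marks a PVT-failure (invalid) case
scores  = [10, 11, 9, 12, 8, 4, 5, 11, 3, 12]
invalid = [0,  0,  0, 0,  0, 1, 1, 0,  1, 0]
sens, spec = sens_spec(scores, invalid, cutoff=5)
```

Sweeping the cutoff over all observed scores and plotting sensitivity against 1 − specificity yields the ROC curve whose area the study reports.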

  5. The Effect of Differential Motivation on IRT Linking

    ERIC Educational Resources Information Center

    Mittelhaëuser, Marie-Anne; Béguin, Anton A.; Sijtsma, Klaas

    2015-01-01

    The purpose of this study was to investigate whether simulated differential motivation between the stakes for operational tests and anchor items produces an invalid linking result if the Rasch model is used to link the operational tests. This was done for an external anchor design and a variation of a pretest design. The study also investigated…

  6. fMRI-constrained source analysis reveals early top-down modulations of interference processing using a flanker task.

    PubMed

    Siemann, Julia; Herrmann, Manfred; Galashan, Daniela

    2016-08-01

Usually, incongruent flanker stimuli provoke conflict processing, whereas congruent flankers should facilitate task performance. Various behavioral studies have reported reduced or even absent conflict effects with correctly oriented selective attention. In the present study we attempted to reinvestigate these behavioral effects and to disentangle the neuronal activity patterns underlying the attentional cueing effect, taking advantage of the high temporal resolution of electroencephalography (EEG) combined with the high spatial resolution of functional magnetic resonance imaging (fMRI). Data from 20 participants were acquired in separate sessions per method. We expected the conflict-related N200 event-related potential (ERP) component and areas associated with flanker processing to show validity-specific modulations. Additionally, the spatio-temporal dynamics of cued flanker processing were examined using an fMRI-constrained source analysis approach. In the ERP data we found early differences in flanker processing between validity levels. An early centro-parietal relative positivity for incongruent stimuli occurred only with valid cueing during the N200 time window, while a subsequent fronto-central negativity was specific to invalidly cued interference processing. The source analysis additionally pointed to separate neural generators of these effects. Regional sources in visual areas were involved in conflict processing with valid cueing, while a regional source in the anterior cingulate cortex (ACC) seemed to contribute to the ERP differences with invalid cueing. Moreover, the ACC and precentral gyrus demonstrated an early and a late phase of congruency-related activity differences with invalid cueing. We interpret the first effect as reflecting conflict detection and response activation, while the latter more likely originated from conflict monitoring and control processes during response competition. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. The importance of assessing for validity of symptom report and performance in attention deficit/hyperactivity disorder (ADHD): Introduction to the special section on noncredible presentation in ADHD.

    PubMed

    Suhr, Julie A; Berry, David T R

    2017-12-01

    Invalid self-report and invalid performance occur with high base rates in attention deficit/hyperactivity disorder (ADHD; Harrison, 2006; Musso & Gouvier, 2014). Although much research has focused on the development and validation of symptom validity tests (SVTs) and performance validity tests (PVTs) for psychiatric and neurological presentations, less attention has been given to the use of SVTs and PVTs in ADHD evaluation. This introduction to the special section describes a series of studies examining the use of SVTs and PVTs in adult ADHD evaluation. We present the series of studies in the context of prior research on noncredible presentation and call for future research using improved research methods and with a focus on assessment issues specific to ADHD evaluation. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Accurate Grid-based Clustering Algorithm with Diagonal Grid Searching and Merging

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Ye, Chengcheng; Zhu, Erzhou

    2017-09-01

Due to the advent of big data, data mining technology has attracted more and more attention. As an important data analysis method, grid clustering is fast but has relatively low accuracy. This paper presents an improved clustering algorithm that combines grid and density parameters. The algorithm first divides the data space into valid and invalid meshes using the grid parameters. Secondly, starting from the first point of the diagonal of the grids, the algorithm merges the valid meshes in the "horizontal right, vertical down" direction. Furthermore, through boundary grid processing, invalid grids are searched and merged when the adjacent left, above, and diagonal-direction grids are all valid. By doing this, the accuracy of clustering is improved. The experimental results show that the proposed algorithm is accurate and relatively fast when compared with some popularly used algorithms.
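The general grid-and-merge idea can be sketched as follows. This is a simplified illustration rather than the paper's exact algorithm; the cell size and density threshold are assumptions, and the diagonal-direction search is reduced to plain 8-neighbour merging:

```python
from collections import defaultdict, deque

def grid_cluster(points, cell=1.0, min_pts=3):
    """Toy grid clustering: bin 2-D points into cells, keep 'valid' cells
    holding >= min_pts points, then merge 8-adjacent valid cells."""
    cells = defaultdict(list)
    for x, y in points:
        cells[(int(x // cell), int(y // cell))].append((x, y))
    valid = {c for c, pts in cells.items() if len(pts) >= min_pts}
    clusters, seen = [], set()
    for start in valid:
        if start in seen:
            continue
        comp, queue = [], deque([start])
        seen.add(start)
        while queue:                        # flood-fill over valid cells
            cx, cy = queue.popleft()
            comp.extend(cells[(cx, cy)])
            for dx in (-1, 0, 1):           # include diagonal neighbours,
                for dy in (-1, 0, 1):       # echoing the diagonal search
                    nb = (cx + dx, cy + dy)
                    if nb in valid and nb not in seen:
                        seen.add(nb)
                        queue.append(nb)
        clusters.append(comp)
    return clusters

# two well-separated blobs should yield two clusters
points = [(0.1, 0.1), (0.2, 0.2), (0.3, 0.1),
          (5.1, 5.1), (5.2, 5.3), (5.3, 5.2)]
clusters = grid_cluster(points)
```

The paper's boundary-grid step additionally rescues points in sparse cells surrounded by valid ones, which the sketch above simply discards.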

  9. 22 CFR 51.63 - Passports invalid for travel into or through restricted areas; prohibition on passports valid...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Passports invalid for travel into or through restricted areas; prohibition on passports valid only for travel to Israel. 51.63 Section 51.63 Foreign Relations DEPARTMENT OF STATE NATIONALITY AND PASSPORTS PASSPORTS Denial, Revocation, and Restriction of...

  10. 22 CFR 51.63 - Passports invalid for travel into or through restricted areas; prohibition on passports valid...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 22 Foreign Relations 1 2014-04-01 2014-04-01 false Passports invalid for travel into or through restricted areas; prohibition on passports valid only for travel to Israel. 51.63 Section 51.63 Foreign Relations DEPARTMENT OF STATE NATIONALITY AND PASSPORTS PASSPORTS Denial, Revocation, and Restriction of...

  11. 22 CFR 51.63 - Passports invalid for travel into or through restricted areas; prohibition on passports valid...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Passports invalid for travel into or through restricted areas; prohibition on passports valid only for travel to Israel. 51.63 Section 51.63 Foreign Relations DEPARTMENT OF STATE NATIONALITY AND PASSPORTS PASSPORTS Denial, Revocation, and Restriction of...

  12. 22 CFR 51.63 - Passports invalid for travel into or through restricted areas; prohibition on passports valid...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 22 Foreign Relations 1 2013-04-01 2013-04-01 false Passports invalid for travel into or through restricted areas; prohibition on passports valid only for travel to Israel. 51.63 Section 51.63 Foreign Relations DEPARTMENT OF STATE NATIONALITY AND PASSPORTS PASSPORTS Denial, Revocation, and Restriction of...

  13. 22 CFR 51.63 - Passports invalid for travel into or through restricted areas; prohibition on passports valid...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 22 Foreign Relations 1 2012-04-01 2012-04-01 false Passports invalid for travel into or through restricted areas; prohibition on passports valid only for travel to Israel. 51.63 Section 51.63 Foreign Relations DEPARTMENT OF STATE NATIONALITY AND PASSPORTS PASSPORTS Denial, Revocation, and Restriction of...

  14. Special Educational Needs and Art and Design Education: Plural Perspectives on Exclusion

    ERIC Educational Resources Information Center

    Penketh, Claire

    2016-01-01

    Education policy proposals by the UK Coalition government appeared to be based on a process of consultation, participation and representation. However, policy formation seems to prioritise and confirm particular ways of knowing and being in the world. This article recognises the ontological and epistemological invalidation at work in education…

  15. Internet of Things Platform for Smart Farming: Experiences and Lessons Learnt

    PubMed Central

    Jayaraman, Prem Prakash; Yavari, Ali; Georgakopoulos, Dimitrios; Morshed, Ahsan; Zaslavsky, Arkady

    2016-01-01

Improving farm productivity is essential for increasing farm profitability and meeting the rapidly growing demand for food that is fuelled by rapid population growth across the world. Farm productivity can be increased by understanding and forecasting crop performance in a variety of environmental conditions. Crop recommendation is currently based on data collected in field-based agricultural studies that capture crop performance under a variety of conditions (e.g., soil quality and environmental conditions). However, crop performance data collection is currently slow, as such crop studies are often undertaken in remote and distributed locations, and such data are typically collected manually. Furthermore, the quality of manually collected crop performance data is very low, because it does not take into account earlier conditions that human operators did not observe but that are essential for filtering out data that would lead to invalid conclusions (e.g., solar radiation readings taken in the afternoon after even a short rain, or after an overcast morning, are invalid and should not be used in assessing crop performance). Emerging Internet of Things (IoT) technologies, such as IoT devices (e.g., wireless sensor networks, network-connected weather stations, cameras, and smart phones) can be used to collect vast amounts of environmental and crop performance data, ranging from time series data from sensors, to spatial data from cameras, to human observations collected and recorded via mobile smart phone applications. Such data can then be analysed to filter out invalid data and compute personalised crop recommendations for any specific farm. In this paper, we present the design of SmartFarmNet, an IoT-based platform that can automate the collection of environmental, soil, fertilisation, and irrigation data; automatically correlate such data and filter out invalid data from the perspective of assessing crop performance; and compute crop forecasts and personalised crop recommendations for any particular farm. SmartFarmNet can integrate virtually any IoT device, including commercially available sensors, cameras, weather stations, etc., and store their data in the cloud for performance analysis and recommendations. An evaluation of the SmartFarmNet platform and our experiences and lessons learnt in developing this system conclude the paper. SmartFarmNet is the first and currently largest system in the world (in terms of the number of sensors attached, crops assessed, and users it supports) that provides crop performance analysis and recommendations. PMID:27834862
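The rule-based filtering of invalid readings described above can be sketched as follows. The function and field names are hypothetical, and the 2-hour exclusion window is an assumed illustration of the solar-radiation-after-rain example:

```python
def filter_solar(readings, rain_events, window_h=2.0):
    """Drop solar-radiation readings taken within `window_h` hours after
    any rain event; `readings` are (hour, value) pairs, `rain_events`
    are hour stamps. A toy stand-in for context-aware data cleaning."""
    def tainted(hour):
        return any(0 <= hour - rain <= window_h for rain in rain_events)
    return [(h, v) for h, v in readings if not tainted(h)]

# hypothetical day of readings (W/m^2); rain at noon taints early afternoon
readings = [(8, 410), (10, 520), (13, 90), (15, 480)]
rain_events = [12]
clean = filter_solar(readings, rain_events)
```

In a real platform the taint conditions would come from correlated sensor streams (rain gauges, cloud-cover cameras) rather than a hand-written list.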

  16. Study on UKF based federal integrated navigation for high dynamic aviation

    NASA Astrophysics Data System (ADS)

    Zhao, Gang; Shao, Wei; Chen, Kai; Yan, Jie

    2011-08-01

High dynamic aircraft, such as hypersonic vehicles, are an attractive new generation of vehicles that open near-space aviation across a large flight envelope in both speed and altitude. The complex flight environment of high dynamic vehicles requires a navigation scheme of high accuracy and stability. Because the conventional Strapdown Inertial Navigation System (SINS) and Global Positioning System (GPS) federated integration scheme based on the Extended Kalman Filter (EKF) becomes invalid during GPS signal blackouts caused by high-speed flight, a new high-precision, stable integrated navigation approach is presented in this paper, in which SINS, GPS, and a Celestial Navigation System (CNS) are combined in a federated information-fusion configuration based on the nonlinear Unscented Kalman Filter (UKF) algorithm. First, the state-error model of the new integrated system is derived. Based on this error model, the SINS serves as the mathematical platform for the navigation solution. The SINS combined with GPS forms one UKF-based error-estimation subsystem that yields a local optimal estimate, and the SINS combined with CNS forms another. A no-reset federated filter configuration with partial information sharing is proposed to fuse the two local optimal estimates into a global optimal error estimate, which is then used to correct the SINS navigation solution. A χ² fault-detection method detects subsystem faults, and a faulty subsystem is isolated during the fault interval to protect the system from divergence. The integrated system exploits the complementary advantages of SINS, GPS, and CNS, greatly improving accuracy and reliability for high dynamic navigation applications. Simulation results show that the federated fusion of GPS and CNS corrections to the SINS solution is reasonable and effective, with estimation performance that satisfies the demands of high dynamic flight navigation. The UKF-based integrated scheme is also superior to its EKF-based counterpart, with smaller estimation error and a faster convergence rate.
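The fusion of two local optimal estimates into a global one can be sketched with standard inverse-covariance (information) weighting, which underlies federated fusion in general. This is a minimal sketch, not the paper's exact filter; the state vectors and covariances below are hypothetical:

```python
import numpy as np

def fuse(x1, P1, x2, P2):
    """Fuse two local estimates (x1, P1) and (x2, P2) by information
    weighting: the global covariance is the inverse of the summed
    information matrices, and each estimate is weighted accordingly."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(I1 + I2)
    x = P @ (I1 @ x1 + I2 @ x2)
    return x, P

# two hypothetical local SINS-error estimates: SINS/GPS and SINS/CNS
x_gps, P_gps = np.array([1.0, 0.0]), np.diag([0.04, 0.09])
x_cns, P_cns = np.array([0.8, 0.2]), np.diag([0.16, 0.01])
x_glob, P_glob = fuse(x_gps, P_gps, x_cns, P_cns)
```

Each fused component leans toward the subsystem that is more confident about it, and the fused covariance is smaller than either local one, which is why correcting the SINS with the global estimate outperforms either subsystem alone.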

  17. Real-time emergency forecasting technique for situation management systems

    NASA Astrophysics Data System (ADS)

    Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.

    2018-05-01

The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. The computational models are improved by the Improved Brown's method, which applies fractal dimension to forecast the short time series data received from sensors and control systems. The reliability of the emergency forecasting results is ensured by filtering invalid sensed data using methods of correlation analysis.
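Brown's method, which the technique builds on, is double exponential smoothing; a minimal sketch follows. The fractal-dimension refinement of the "Improved" variant is omitted, and the smoothing constant and data are assumptions:

```python
def brown_forecast(series, alpha=0.4, horizon=1):
    """Brown's double exponential smoothing: smooth the series twice with
    the same constant, then extrapolate level + trend * horizon."""
    s1 = s2 = series[0]
    for y in series[1:]:
        s1 = alpha * y + (1 - alpha) * s1      # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2     # second smoothing
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + trend * horizon

# hypothetical short sensor series with an upward trend
data = [10.0, 12.0, 13.0, 15.0, 16.0]
next_val = brown_forecast(data)
```

For a trending series the double-smoothed forecast extrapolates beyond the last observation, which is what makes the method usable on the short series emphasized in the abstract.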

  18. Comparative Analysis of Two Inquiry Observational Protocols: Striving to Better Understand the Quality of Teacher-Facilitated Inquiry-Based Instruction

    ERIC Educational Resources Information Center

    Marshall, Jeff C.; Smart, Julie; Lotter, Christine; Sirbu, Cristina

    2011-01-01

    With inquiry being one of the central tenets of the national and most state standards, it is imperative that we have a solid means to measure the quality of inquiry-based instruction being led in classrooms. Many instruments are available and used for this purpose, but many are either invalid or too global. This study sought to compare two…

  19. 'Add women & stir'--the biomedical approach to cardiac research!

    PubMed

    O'Donnell, Sharon; Condell, Sarah; Begley, Cecily M

    2004-07-01

In conditions shared by women and men, the biomedical model of disease assumes that illness symptoms and outcomes are biologically and socially 'neutral'. Consequently, until a decade ago, white middle-aged men were the model subjects in most funded cardiac trials, on the assumption that whatever the findings, the results would also hold true for women. This 'add women and stir' approach has resulted in imbalances in cardiac care and an image of coronary artery disease that portrays a middle-aged male as its victim. Moreover, cardiac health care has been designed with the male anatomy and male experience of illness in mind, and health promotional measures have been targeted towards men. Women have received these health promotional messages as protecting the hearts of men, and have been less likely to modify their own lifestyles in a cardio-protective manner. However, the biological and social differences that exist between women and men must surely invalidate such biased biomedical assertions, and signify a need to delve beyond the realm of biomedical reductionism for greater insights and understanding. This review examines how scientific reductionism has failed to explore the impact of coronary artery disease on the lives of women and how the gendered image of this disease has privileged the normative frame.

  20. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    NASA Astrophysics Data System (ADS)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most previous studies have considered climate models and scenarios as major sources of uncertainty, but the uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contributions from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) the stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to the overall uncertainty in streamflow projections using an analysis of variance (ANOVA) approach. Most impact assessment studies are generally carried out with hydrologic model parameters held unchanged in the future. It is, however, necessary to address the nonstationarity in model parameters under changing land use and climate. In this paper, a regression-based methodology is presented to obtain the hydrologic model parameters under changing land use and climate scenarios in the future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set up over the basin under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in the UGB under the nonstationary model condition is found to reduce in the future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that the model stationarity assumption and GCMs, along with their interactions with emission scenarios, act as dominant sources of uncertainty. This paper provides a generalized framework for hydrologists to examine the stationarity assumption of models before considering them for future streamflow projections, and to segregate the contributions of various sources to the uncertainty.
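The ANOVA-style segregation of uncertainty sources can be sketched for a two-factor case. This is a simplified illustration with hypothetical projection values; the study partitions more factors, including interactions with land use and model stationarity:

```python
import numpy as np

def partition_variance(proj):
    """Two-way ANOVA-style split of projection spread into GCM main
    effect, scenario main effect, and interaction, as shares of the
    total sum of squares; proj[g, s] is the projection for GCM g under
    emission scenario s."""
    grand = proj.mean()
    gcm_eff = proj.mean(axis=1) - grand            # row (GCM) effects
    scen_eff = proj.mean(axis=0) - grand           # column (scenario) effects
    inter = proj - grand - gcm_eff[:, None] - scen_eff[None, :]
    n_g, n_s = proj.shape
    ss_gcm = n_s * np.sum(gcm_eff ** 2)
    ss_scen = n_g * np.sum(scen_eff ** 2)
    ss_int = np.sum(inter ** 2)
    total = ss_gcm + ss_scen + ss_int
    return {k: v / total for k, v in
            [("gcm", ss_gcm), ("scenario", ss_scen), ("interaction", ss_int)]}

# hypothetical streamflow changes (%): 3 GCMs x 2 emission scenarios
proj = np.array([[ -5.0,  -8.0],
                 [  2.0,  -1.0],
                 [-12.0, -15.0]])
shares = partition_variance(proj)
```

The shares sum to one by construction, so each factor's fraction can be read directly as its contribution to projection uncertainty.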

  1. Avoidance of Affect Mediates the Effect of Invalidating Childhood Environments on Borderline Personality Symptomatology in a Non-Clinical Sample

    ERIC Educational Resources Information Center

    Sturrock, Bonnie A.; Francis, Andrew; Carr, Steven

    2009-01-01

    The aim of this study was to test the Linehan (1993) proposal regarding associations between invalidating childhood environments, distress tolerance (e.g., avoidance of affect), and borderline personality disorder (BPD) symptoms. The sample consisted of 141 non-clinical participants (51 men, 89 women, one gender unknown), ranging in age from 18 to…

  2. The Impacts of Tuition Rate Changes on College Undergraduate Headcounts and Credit Hours Over Time--A Case Study.

    ERIC Educational Resources Information Center

    Chressanthis, George A.

    1986-01-01

    Using 1964-1983 enrollment data for a small Michigan state college, this paper charts tuition rate change impacts on college undergraduate headcounts and credit hours over time. Results indicate that student behavior follows the law of demand, varies with class standing, corroborates human capital investment models, and invalidates uniform tuition…

  3. Perils of categorical thinking: "Oxic/anoxic" conceptual model in environmental remediation

    USGS Publications Warehouse

    Bradley, Paul M.

    2012-01-01

Given ambient atmospheric oxygen concentrations of about 21 percent (by volume), the lower limit for reliable quantitation of dissolved oxygen concentrations in groundwater samples is in the range of 0.1–0.5 mg/L. Frameworks for assessing in situ redox condition are often applied using a simple two-category (oxic/anoxic) model of oxygen condition. The "oxic" category defines the environmental range in which dissolved oxygen concentrations are clearly expected to impact contaminant biodegradation, either by supporting aerobic biodegradation of electron-donor contaminants like petroleum hydrocarbons or by inhibiting anaerobic biodegradation of electron-acceptor contaminants like chloroethenes. The tendency to label the second category "anoxic" leads to an invalid assumption that oxygen is insignificant when, in fact, the dissolved oxygen concentration is less than detection but otherwise unknown. Expressing dissolved oxygen concentrations as numbers of molecules per volume, dissolved oxygen concentrations that fall below the 0.1 mg/L field detection limit range from 1 to 10¹⁷ molecules/L. In light of recent demonstrations of substantial oxygen-linked biodegradation of chloroethene contaminants at dissolved oxygen concentrations well below the 0.1–0.5 mg/L field detection limit, characterizing "less than detection" oxygen concentrations as "insignificant" is invalid.
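The unit conversion behind expressing dissolved oxygen as molecules per litre is a short piece of arithmetic (a back-of-the-envelope sketch, assuming O2 with a molar mass of 32 g/mol):

```python
AVOGADRO = 6.022e23          # molecules per mole
O2_MOLAR_MASS = 32.0         # g/mol

def o2_molecules_per_litre(mg_per_l):
    """Convert a dissolved-oxygen concentration in mg/L to molecules/L:
    mg/L -> g/L -> mol/L -> molecules/L."""
    return mg_per_l / 1000.0 / O2_MOLAR_MASS * AVOGADRO

detection_limit = o2_molecules_per_litre(0.1)
```

Even the 0.1 mg/L field detection limit corresponds to on the order of 10¹⁸ molecules per litre, which is why "below detection" leaves an enormous range of biologically meaningful oxygen concentrations unresolved.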

  4. Geochemistry of dissolved inorganic carbon in a Coastal Plain aquifer. 2. Modeling carbon sources, sinks, and δ13C evolution

    USGS Publications Warehouse

    McMahon, Peter B.; Chapelle, Francis H.

    1991-01-01

    Stable isotope data for dissolved inorganic carbon (DIC), carbonate shell material and cements, and microbial CO2 were combined with organic and inorganic chemical data from aquifer and confining-bed pore waters to construct geochemical reaction models along a flowpath in the Black Creek aquifer of South Carolina. Carbon-isotope fractionation between DIC and precipitating cements was treated as a Rayleigh distillation process. Organic matter oxidation was coupled to microbial fermentation and sulfate reduction. All reaction models reproduced the observed chemical and isotopic compositions of final waters. However, model 1, in which all sources of carbon and electron-acceptors were assumed to be internal to the aquifer, was invalidated owing to the large ratio of fermentation CO2 to respiration CO2 predicted by the model (5–49) compared with measured ratios (two or less). In model 2, this ratio was reduced by assuming that confining beds adjacent to the aquifer act as sources of dissolved organic carbon and sulfate. This assumption was based on measured high concentrations of dissolved organic acids and sulfate in confining-bed pore waters (60–100 μM and 100–380 μM, respectively) relative to aquifer pore waters (from less than 30 μM and 2–80 μM, respectively). Sodium was chosen as the companion ion to organic-acid and sulfate transport from confining beds because it is the predominant cation in confining-bed pore waters. As a result, excessive amounts of Na-for-Ca ion exchange and calcite precipitation (three to four times more cement than observed in the aquifer) were required by model 2 to achieve mass and isotope balance of final water. For this reason, model 2 was invalidated. Agreement between model-predicted and measured amounts of carbonate cement and ratios of fermentation CO2 to respiration CO2 were obtained in a reaction model that assumed confining beds act as sources of DIC, as well as organic acids and sulfate. 
This assumption was supported by measured high concentrations of DIC in confining beds (2.6–2.7 mM). Results from this study show that geochemical models of confined aquifer systems must incorporate the effects of adjacent confining beds to reproduce observed groundwater chemistry accurately.

  6. Extending a multi-scale parameter regionalization (MPR) method by introducing parameter constrained optimization and flexible transfer functions

    NASA Astrophysics Data System (ADS)

    Klotz, Daniel; Herrnegger, Mathew; Schulz, Karsten

    2015-04-01

A multi-scale parameter-estimation method, as presented by Samaniego et al. (2010), is implemented and extended for the conceptual hydrological model COSERO. COSERO is an HBV-type model that is specialized for alpine environments, but has been applied over a wide range of basins all over the world (see Kling et al., 2014 for an overview). Within the methodology, available small-scale information (DEM, soil texture, land cover, etc.) is used to estimate the coarse-scale model parameters by applying a set of transfer functions (TFs) and subsequent averaging methods, whereby only TF hyper-parameters are optimized against available observations (e.g. runoff data). The parameter regionalisation approach was extended in order to allow for a more meta-heuristic handling of the transfer functions. The two main novelties are: 1. An explicit introduction of constraints into the parameter estimation scheme: The constraint scheme replaces invalid parts of the transfer-function solution space with valid solutions. It is inspired by applications in evolutionary algorithms and related to the combination of learning and evolution. This allows the consideration of physical and numerical constraints as well as the incorporation of a priori modeller experience into the parameter estimation. 2. Spline-based transfer functions: Spline-based functions enable arbitrary forms of transfer functions. This is of importance since in many cases the general relationship between sub-grid information and parameters is known, but not the form of the transfer function itself. The contribution presents the results of and experiences with the adopted method and the introduced extensions. Simulations are performed for the pre-alpine/alpine Traisen catchment in Lower Austria. References: Samaniego, L., Kumar, R., Attinger, S. (2010): Multiscale parameter regionalization of a grid-based hydrologic model at the mesoscale, Water Resour. Res., doi: 10.1029/2008WR007327. Kling, H., Stanzel, P., Fuchs, M., and Nachtnebel, H. P. (2014): Performance of the COSERO precipitation-runoff model under non-stationary conditions in basins with different climates, Hydrolog. Sci. J., doi: 10.1080/02626667.2014.959956.
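
    The constrained transfer-function idea can be sketched in a few lines: a fine-scale predictor field is mapped to model parameters through a TF whose hyper-parameters would be calibrated, invalid values are replaced by valid ones, and the result is averaged up to the model cell. The predictor, the linear TF, its hyper-parameters, and the parameter bounds below are all hypothetical placeholders, not values from the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical fine-scale predictor: sand fraction on a 10x10 sub-grid
    sand = rng.uniform(0.1, 0.9, size=(10, 10))

    def transfer_function(sand, a, b):
        """Linear TF mapping sub-grid sand fraction to a storage parameter."""
        return a + b * sand

    def apply_constraints(theta, lo=5.0, hi=500.0):
        """Constraint scheme: replace physically invalid parameter values
        with the nearest valid ones."""
        return np.clip(theta, lo, hi)

    # The TF hyper-parameters (a, b) are what would be optimized against runoff
    theta_fine = apply_constraints(transfer_function(sand, a=20.0, b=300.0))

    # Upscaling: arithmetic averaging of fine-scale parameters to the model cell
    theta_coarse = theta_fine.mean()
    ```

    The key point of the scheme is that calibration acts on the few TF hyper-parameters rather than on one parameter per cell, while the constraint step keeps every derived parameter physically admissible.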

  7. A hierarchical model for spatial capture-recapture data

    USGS Publications Warehouse

    Royle, J. Andrew; Young, K.V.

    2008-01-01

    Estimating density is a fundamental objective of many animal population studies. Application of methods for estimating population size from ostensibly closed populations is widespread, but ineffective for estimating absolute density because most populations are subject to short-term movements or so-called temporary emigration. This phenomenon invalidates the resulting estimates because the effective sample area is unknown. A number of methods involving the adjustment of estimates based on heuristic considerations are in widespread use. In this paper, a hierarchical model of spatially indexed capture-recapture data is proposed for sampling based on area searches of spatial sample units subject to uniform sampling intensity. The hierarchical model contains explicit models for the distribution of individuals and their movements, in addition to an observation model that is conditional on the location of individuals during sampling. Bayesian analysis of the hierarchical model is achieved by the use of data augmentation, which allows for a straightforward implementation in the freely available software WinBUGS. We present results of a simulation study that was carried out to evaluate the operating characteristics of the Bayesian estimator under variable densities and movement patterns of individuals. An application of the model is presented for survey data on the flat-tailed horned lizard (Phrynosoma mcallii) in Arizona, USA.
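
    A minimal simulation of the kind of spatially indexed encounter model described above: latent activity centers, a half-normal detection function that decays with distance to each sample location, and simulated capture histories. The population size, trap grid, and detection parameters are hypothetical, not values from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Latent activity centers of N individuals in a unit square
    N = 50
    centers = rng.uniform(0.0, 1.0, size=(N, 2))

    # Spatial sample units approximated by a 5x5 grid of search locations
    traps = np.array([[x, y] for x in np.linspace(0.1, 0.9, 5)
                             for y in np.linspace(0.1, 0.9, 5)])

    # Half-normal encounter model: detection probability falls off with the
    # distance between an activity center and a sample location
    p0, sigma = 0.8, 0.15
    d = np.linalg.norm(centers[:, None, :] - traps[None, :, :], axis=2)
    p = p0 * np.exp(-d**2 / (2.0 * sigma**2))

    # Simulated capture histories (individuals x sample units), one occasion
    captures = rng.random((N, traps.shape[0])) < p
    ```

    Because the observation model is conditional on location, density is estimated directly rather than recovered from an ad hoc effective sample area.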

  8. Cointegration and why it works for SHM

    NASA Astrophysics Data System (ADS)

    Cross, Elizabeth J.; Worden, Keith

    2012-08-01

    One of the most fundamental problems in Structural Health Monitoring (SHM) is that of projecting out operational and environmental variations from measured feature data. The reason for this is that algorithms used for SHM to detect changes in structural condition should not raise alarms if the structure of interest changes because of benign operational or environmental variations. This is sometimes called the data normalisation problem. Many solutions to this problem have been proposed over the years, but a new approach that uses cointegration, a concept from the field of econometrics, appears to provide a very promising solution. The theory of cointegration is mathematically complex, and its use rests on a number of assumptions holding for the time series to which it is applied. An interesting observation that has emerged from its applications to SHM data is that the approach works very well even though the aforementioned assumptions do not hold in general. The objective of the current paper is to discuss how the cointegration assumptions break down individually in the context of SHM and to explain why this does not invalidate the application of the algorithm.
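
    The core idea, two nonstationary series whose linear combination is stationary, can be illustrated with synthetic data. The shared trend (standing in for a common environmental driver), the loadings, and the noise levels below are invented for the sketch:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    T = 2000

    # Shared stochastic trend, e.g. a common temperature-driven component
    trend = np.cumsum(rng.normal(size=T))

    # Two measured features that both load on the trend plus stationary noise
    y1 = 1.0 * trend + rng.normal(scale=0.5, size=T)
    y2 = 2.0 * trend + rng.normal(scale=0.5, size=T)

    # Estimate the cointegrating vector by least squares: y2 ~ beta * y1
    beta = np.dot(y1, y2) / np.dot(y1, y1)

    # The cointegration residual is stationary even though y1 and y2 are not;
    # for SHM it can be monitored for damage while environmental trends cancel
    residual = y2 - beta * y1
    ```

    Monitoring the residual rather than the raw features is exactly the data normalisation step: benign shared variation is projected out, so a change in the residual points to a change in the structure itself.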

  9. All-paths graph kernel for protein-protein interaction extraction with evaluation of cross-corpus learning.

    PubMed

    Airola, Antti; Pyysalo, Sampo; Björne, Jari; Pahikkala, Tapio; Ginter, Filip; Salakoski, Tapio

    2008-11-19

    Automated extraction of protein-protein interactions (PPI) is an important and widely studied task in biomedical text mining. We propose a graph kernel based approach for this task. In contrast to earlier approaches to PPI extraction, the introduced all-paths graph kernel has the capability to make use of full, general dependency graphs representing the sentence structure. We evaluate the proposed method on five publicly available PPI corpora, providing the most comprehensive evaluation done for a machine learning based PPI-extraction system. We additionally perform a detailed evaluation of the effects of training and testing on different resources, providing insight into the challenges involved in applying a system beyond the data it was trained on. Our method is shown to achieve state-of-the-art performance with respect to comparable evaluations, with 56.4 F-score and 84.8 AUC on the AImed corpus. We show that the graph kernel approach performs on state-of-the-art level in PPI extraction, and note the possible extension to the task of extracting complex interactions. Cross-corpus results provide further insight into how the learning generalizes beyond individual corpora. Further, we identify several pitfalls that can make evaluations of PPI-extraction systems incomparable, or even invalid. These include incorrect cross-validation strategies and problems related to comparing F-score results achieved on different evaluation resources. Recommendations for avoiding these pitfalls are provided.
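
    As a rough illustration of the general idea of aggregating weights over all paths in a dependency graph (not the exact kernel formulation of the paper), the Neumann series (I - W)^{-1} = I + W + W^2 + ... collects the weight of every path between every node pair in one matrix inverse; it converges whenever the spectral radius of W is below 1, and is exact for acyclic dependency graphs:

    ```python
    import numpy as np

    # Toy weighted adjacency matrix for a 3-token dependency graph
    # (edge weights are hypothetical confidence values)
    W = np.array([[0.0, 0.9, 0.0],
                  [0.0, 0.0, 0.3],
                  [0.0, 0.0, 0.0]])

    def all_path_weights(W):
        """Sum of products of edge weights over all paths between every
        node pair, via the Neumann series (I - W)^{-1}."""
        n = W.shape[0]
        return np.linalg.inv(np.eye(n) - W)

    P = all_path_weights(W)
    # P[i, j] now accumulates every path from token i to token j,
    # e.g. the two-edge path 0 -> 1 -> 2 contributes 0.9 * 0.3
    ```

    A kernel between two sentences can then be built by comparing such path-weight matrices, which is what lets the method exploit full dependency graphs rather than single shortest paths.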

  10. Analysis of an ABE Scheme with Verifiable Outsourced Decryption.

    PubMed

    Liao, Yongjian; He, Yichuan; Li, Fagen; Jiang, Shaoquan; Zhou, Shijie

    2018-01-10

    Attribute-based encryption (ABE) is a popular cryptographic technology to protect the security of users' data in cloud computing. In order to reduce its decryption cost, outsourcing the decryption of ciphertexts is an available method, which enables users to outsource a large number of decryption operations to the cloud service provider. To guarantee the correctness of transformed ciphertexts computed by the cloud server via the outsourced decryption, it is necessary to check the correctness of the outsourced decryption to ensure security for the data of users. Recently, Li et al. proposed an ABE scheme with full verifiability of outsourced decryption (ABE-VOD) for authorized and unauthorized users, which can simultaneously check the correctness of the transformed ciphertext for both of them. However, in this paper we show that their ABE-VOD scheme cannot achieve the results they claimed, such as finding out all invalid ciphertexts, and checking the correctness of the transformed ciphertext for the authorized user via checking it for the unauthorized user. We first construct some invalid ciphertexts which can pass the validity checking in the decryption algorithm. That means their "verify-then-decrypt" approach does not work as intended. Next, we show that the method to check the validity of the outsourced decryption for the authorized users via checking it for the unauthorized users is not always correct. That is to say, there exist some invalid ciphertexts which can pass the validity checking for the unauthorized user, but cannot pass the validity checking for the authorized user.

  11. Analysis of an ABE Scheme with Verifiable Outsourced Decryption

    PubMed Central

    He, Yichuan; Li, Fagen; Jiang, Shaoquan; Zhou, Shijie

    2018-01-01

    Attribute-based encryption (ABE) is a popular cryptographic technology to protect the security of users’ data in cloud computing. In order to reduce its decryption cost, outsourcing the decryption of ciphertexts is an available method, which enables users to outsource a large number of decryption operations to the cloud service provider. To guarantee the correctness of transformed ciphertexts computed by the cloud server via the outsourced decryption, it is necessary to check the correctness of the outsourced decryption to ensure security for the data of users. Recently, Li et al. proposed an ABE scheme with full verifiability of outsourced decryption (ABE-VOD) for authorized and unauthorized users, which can simultaneously check the correctness of the transformed ciphertext for both of them. However, in this paper we show that their ABE-VOD scheme cannot achieve the results they claimed, such as finding out all invalid ciphertexts, and checking the correctness of the transformed ciphertext for the authorized user via checking it for the unauthorized user. We first construct some invalid ciphertexts which can pass the validity checking in the decryption algorithm. That means their “verify-then-decrypt” approach does not work as intended. Next, we show that the method to check the validity of the outsourced decryption for the authorized users via checking it for the unauthorized users is not always correct. That is to say, there exist some invalid ciphertexts which can pass the validity checking for the unauthorized user, but cannot pass the validity checking for the authorized user. PMID:29320418

  12. Physical examination tests and imaging studies based on arthroscopic assessment of the long head of biceps tendon are invalid.

    PubMed

    Jordan, Robert W; Saithna, Adnan

    2017-10-01

    The aim of this study was to evaluate whether glenohumeral arthroscopy is an appropriate gold standard for the diagnosis of long head of biceps (LHB) tendon pathology. The objectives were to evaluate whether the length of tendon that can be seen at arthroscopy allows visualisation of areas of predilection of pathology and also to determine the rates of missed diagnoses at arthroscopy when compared to an open approach. A systematic review of cadaveric and clinical studies was performed. The search strategy was applied to MEDLINE, PubMed and Google Scholar databases. All relevant articles were included. Critical appraisal of clinical studies was performed using a validated quality assessment scale. Five articles were identified for inclusion in the review. This included both clinical and cadaveric studies. The overall population comprised 18 cadaveric specimens and 575 patients. Out of the five included studies, three reported the length of LHB tendon visualised during arthroscopy and four reported the rate of missed LHB diagnosis. Cadaveric studies showed that the use of a hook probe allowed arthroscopic visualisation of between 34 and 48 % of the overall length of the LHB. In the clinical series, the rate of missed diagnoses at arthroscopy when compared to open exploration ranged between 33 and 49 %. Arthroscopy allows visualisation of only a small part of the extra-articular LHB tendon. This leads to a high rate of missed pathology in the distal part of the tendon. Published figures for sensitivities and specificities of common physical examination and imaging tests for LHB pathology that are based on arthroscopy as the gold standard are therefore invalid. In clinical practice, it is important to note that a "negative" arthroscopic assessment does not exclude a lesion of the LHB tendon as this technique does not allow visualisation of common sites of distal pathology. IV.

  13. Testing the Usability of Interactive Visualizations for Complex Problem-Solving: Findings Related to Improving Interfaces and Help.

    ERIC Educational Resources Information Center

    Mirel, Barbara

    2001-01-01

    Conducts a scenario-based usability test with 10 data analysts using visual querying (visually analyzing data with interactive graphics). Details a range of difficulties found in visual selection that, at times, gave rise to inaccurate selections, invalid conclusions, and misguided decisions. Argues that support for visual selection must be built…

  14. 78 FR 35051 - Certain Encapsulated Integrated Circuit Devices and Products Containing Same; Commission...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-11

    ... `277 patent are invalid under 35 U.S.C. 102(b) as anticipated by certain prior art references, but... under 35 U.S.C. 102(b) as anticipated by certain prior art references; (3) claims 1, 2, 13 and 14 of the '356 patent are not invalid under 35 U.S.C. 102(b) as anticipated by certain prior art references; (4...

  15. Nondiscrimination on the basis of handicap in the provision of health care to handicapped infants--Office of the Secretary, HHS. Notice of court order declaring rule invalid.

    PubMed

    1983-04-25

    This notice is to advise that the interim final rule issued by the Department of Health and Human Services on March 7, 1983, 48 FR 9630, concerning discrimination in the provision of health care to handicapped infants, has been declared invalid and has no further force and effect.

  16. Modeling aerodynamic discontinuities and the onset of chaos in flight dynamical systems

    NASA Technical Reports Server (NTRS)

    Tobak, M.; Chapman, G. T.; Uenal, A.

    1986-01-01

    Various representations of the aerodynamic contribution to the aircraft's equation of motion are shown to be compatible within the common assumption of their Frechet differentiability. Three forms of invalidating Frechet differentiability are identified, and the mathematical model is amended to accommodate their occurrence. Some of the ways in which chaotic behavior may emerge are discussed, first at the level of the aerodynamic contribution to the equation of motion, and then at the level of the equations of motion themselves.

  17. Modeling aerodynamic discontinuities and onset of chaos in flight dynamical systems

    NASA Technical Reports Server (NTRS)

    Tobak, M.; Chapman, G. T.; Unal, A.

    1987-01-01

    Various representations of the aerodynamic contribution to the aircraft's equation of motion are shown to be compatible within the common assumption of their Frechet differentiability. Three forms of invalidating Frechet differentiability are identified, and the mathematical model is amended to accommodate their occurrence. Some of the ways in which chaotic behavior may emerge are discussed, first at the level of the aerodynamic contribution to the equations of motion, and then at the level of the equations of motion themselves.

  18. Modeling Nonlinear Errors in Surface Electromyography Due To Baseline Noise: A New Methodology

    PubMed Central

    Law, Laura Frey; Krishnan, Chandramouli; Avin, Keith

    2010-01-01

    The surface electromyographic (EMG) signal is often contaminated by some degree of baseline noise. It is customary for scientists to subtract baseline noise from the measured EMG signal prior to further analyses, based on the assumption that baseline noise adds linearly to the observed EMG signal. The stochastic nature of both the baseline and EMG signal, however, may invalidate this assumption. Alternatively, "true" EMG signals may be either minimally or nonlinearly affected by baseline noise. This information is particularly relevant at low contraction intensities, when signal-to-noise ratios (SNR) may be lowest. Thus, the purpose of this simulation study was to investigate the influence of varying levels of baseline noise (approximately 2-40% of maximum EMG amplitude) on mean EMG burst amplitude and to assess the best means to account for signal noise. The simulations indicated that baseline noise had minimal effects on mean EMG activity for maximum contractions, but that its effects increased nonlinearly with increasing noise levels and decreasing signal amplitudes. Thus, simple baseline noise subtraction resulted in substantial error when estimating mean activity during low-intensity EMG bursts. Conversely, correcting the EMG signal as a nonlinear function of both baseline and measured signal amplitude provided highly accurate estimates of EMG amplitude. This novel nonlinear error modeling approach has potential implications for EMG signal processing, particularly when assessing co-activation of antagonist muscles or small-amplitude contractions where the SNR can be low. PMID:20869716
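
    One simple nonlinear correction consistent with the reasoning above is power-domain subtraction: for independent zero-mean signals, powers rather than amplitudes add, so the measured RMS overstates the true burst amplitude by more than a linear subtraction can fix at low SNR. This sketch illustrates that principle with simulated Gaussian signals; it is not the authors' exact error model, and all amplitudes are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000

    # Simulated "true" low-intensity EMG burst and independent baseline noise
    true_rms = 0.05
    noise_rms = 0.04
    emg = rng.normal(scale=true_rms, size=n)
    noise = rng.normal(scale=noise_rms, size=n)
    measured = emg + noise

    def rms(x):
        return np.sqrt(np.mean(x**2))

    # Conventional linear correction: subtract the noise amplitude directly
    linear_est = rms(measured) - noise_rms

    # Power-based (nonlinear) correction: independent signals add in power,
    # so rms_measured^2 = rms_true^2 + rms_noise^2
    nonlinear_est = np.sqrt(max(rms(measured)**2 - noise_rms**2, 0.0))
    ```

    At this SNR the linear estimate lands well below the true burst RMS, while the power-based estimate recovers it closely, mirroring the paper's finding that simple subtraction fails for low-intensity bursts.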

  19. Serum Vitamin D Status in Iranian Fibromyalgia Patients: according to the Symptom Severity and Illness Invalidation

    PubMed Central

    Maafi, Alireza Amir; Haghdoost, Afrooz; Aarabi, Yasaman; Hajiabbasi, Asghar; Shenavar Masooleh, Irandokht; Zayeni, Habib; Ghalebaghi, Babak; Hassankhani, Amir; Bidari, Ali

    2016-01-01

    Background This study was designed to assess serum vitamin D status (25-OHD) in fibromyalgia (FM) patients and to compare it with a healthy control group. It also aimed to investigate the correlation of serum vitamin D level with FM symptom severity and invalidation experiences. Methods A total of 74 consecutive patients with FM and 68 healthy control participants were enrolled. The eligible FM patients completed the Illness Invalidation Inventory (3*I), the Revised Fibromyalgia Impact Questionnaire (FIQR) and a short-form health survey (SF-12). Venous blood samples were drawn from all participants to evaluate serum 25-OHD levels. Mann-Whitney tests and multiple logistic regression analyses were performed and Spearman's correlations were calculated. Results 88.4% of FM patients had low levels of serum 25-OHD. FM patients had significantly higher levels of serum 25-OHD than the control group (17.24 ± 13.50 and 9.91 ± 6.47 respectively, P = 0.0001). There were no significant correlations between serum 25-OHD levels and the clinical measures of disease impact, invalidation dimensions, and health status. Multiple logistic regression analyses revealed that an increased discounting of the disease by the patient's spouse was associated with a 4-fold increased risk for vitamin D deficiency (OR = 4.36; 95% CI, 0.95–19.87, P = 0.05). Conclusions This study showed that although high rates of vitamin D insufficiency or deficiency were seen among both FM patients and healthy non-FM participants, there seems to be no intrinsic association between FM and vitamin D deficiency. Addressing invalidation experiences, especially by the patient's spouse, is important in the management of FM. PMID:27413482

  20. Heuristic and analytic processes in reasoning: an event-related potential study of belief bias.

    PubMed

    Banks, Adrian P; Hope, Christopher

    2014-03-01

    Human reasoning involves both heuristic and analytic processes. This study of belief bias in relational reasoning investigated whether the two processes occur serially or in parallel. Participants evaluated the validity of problems in which the conclusions were either logically valid or invalid and either believable or unbelievable. Problems in which the conclusions presented a conflict between the logically valid response and the believable response elicited a more positive P3 than problems in which there was no conflict. This shows that P3 is influenced by the interaction of belief and logic rather than either of these factors on its own. These findings indicate that belief and logic influence reasoning at the same time, supporting models in which belief-based and logical evaluations occur in parallel but not theories in which belief-based heuristic evaluations precede logical analysis.

  1. A study of changes in middle school teachers' understanding of selected ideas in science as a function of an in-service program focusing on student preconceptions

    NASA Astrophysics Data System (ADS)

    Shymansky, James A.; Woodworth, George; Norman, Obed; Dunkhase, John; Matthews, Charles; Liu, Chin-Tang

    This article examines the impact of a specially designed in-service model on teacher understanding of selected science concepts. The underlying idea of the model is to get teachers to restructure their own understanding of a selected science topic by having them study the structure and evolution of their students' ideas on the same topic. Concepts on topics from the life, earth, and physical sciences served as the content focus and middle school Grades 4-9 served as the context for this study. The in-service experience constituting the main treatment in the study occurred in three distinct phases. In the initial phase, participating teachers interviewed several of their own students to find out what kinds of preconceptions students had about a particular topic. The teachers used concept mapping strategies learned in the in-service to facilitate the interviews. Next the teachers teamed with other teachers with similar topic interests and a science expert to evaluate and explore the scientific merit of the student conceptual frameworks and to develop instructional units, including a summative assessment during a summer workshop. Finally, the student ideas were further evaluated and explored as the teachers taught the topics in their classrooms during the fall term. Concept maps were used to study changes in teacher understanding across the phases of the in-service in a repeated-measures design. Analysis of the maps showed significant growth in the number of valid propositions expressed by teachers between the initial and final mappings in all topic groups. But in half of the groups, this long-term growth was interrupted by a noticeable decline in the number of valid propositions expressed. In addition, analysis of individual teacher maps showed distinctive patterns of initial invalid conceptions being replaced by new invalid conceptions in later mappings. 
The combination of net growth of valid propositions and the patterns of evolving invalid conceptions is discussed in constructivist terms.

  2. Geological constraints for muon tomography: The world beyond standard rock

    NASA Astrophysics Data System (ADS)

    Lechmann, Alessandro; Mair, David; Ariga, Akitaka; Ariga, Tomoko; Ereditato, Antonio; Käser, Samuel; Nishiyama, Ryuichi; Scampoli, Paola; Vladymyrov, Mykhailo; Schlunegger, Fritz

    2017-04-01

    In present day muon tomography practice, one often encounters an experimental setup in which muons propagate several tens to a few hundreds of meters through a material to the detector. The goal of such an undertaking is usually centred on an attempt to make inferences from the measured muon flux to an anticipated subsurface structure. This can either be an underground interface geometry or a spatial material distribution. Inferences in this direction have until now mostly been made using the so-called "standard rock" approximation. This includes a set of empirically determined parameters from several rocks found in the vicinity of physicists' laboratories. While this approach is reasonable to account for the effects of the tens of meters of soil/rock around a particle accelerator, we show that, for material thicknesses beyond that dimension, the elementary composition of the material (average atomic weight and atomic number) has a noticeable effect on the measured muon flux. Accordingly, continued use of this approximation could potentially lead to a serious model bias which, in turn, might invalidate any tomographic inference based on this standard rock approximation. The parameters for standard rock are naturally close to a granitic (SiO2-rich) composition and thus can be safely used in such environments. As geophysical surveys are not restricted to any particular lithology, we investigated the effect of alternative rock compositions (carbonatic, basaltic and even ultramafic) and consequently prefer to replace the standard rock approach with a dedicated geological investigation. Structural field data and laboratory measurements of density (He-pycnometer) and composition (XRD) can be merged into an integrative geological model that can be used as an a priori constraint for the rock parameters of interest (density and composition) in the geophysical inversion. Modelling results show that, when facing a non-granitic lithology, the measured muon flux can vary by up to 20-30% in the case of carbonates, and by up to 100% for peridotites, compared to standard rock data.

  3. Importance and pitfalls of molecular analysis to parasite epidemiology.

    PubMed

    Constantine, Clare C

    2003-08-01

    Molecular tools are increasingly being used to address questions about parasite epidemiology. Parasites represent a diverse group and they might not fit traditional population genetic models. Testing hypotheses depends equally on correct sampling, appropriate tool and/or marker choice, appropriate analysis and careful interpretation. All methods of analysis make assumptions which, if violated, make the results invalid. Some guidelines to avoid common pitfalls are offered here.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Xinfang; White, Ralph E.; Huang, Kevin

    With the assumption that the Fermi level (electrochemical potential of electrons) is uniform across the thickness of a mixed ionic and electronic conducting (MIEC) electrode, the charge-transport model in the electrode domain can be reduced to the modified Fick’s first law, which includes a thermodynamic factor A. A transient numerical solution of the Nernst-Planck theory was obtained for a symmetric cell with MIEC electrodes to illustrate the validity of the assumption of a uniform Fermi level. Subsequently, an impedance numerical solution based on the modified Fick’s first law is compared with that from the Nernst-Planck theory. The results show that the Nernst-Planck charge-transport model is essentially the same as the modified Fick’s first law model as long as the MIEC electrodes have a predominant electronic conductivity. However, because of the invalidity of the uniform Fermi level assumption for a MIEC electrolyte with a predominant ionic conductivity, Nernst-Planck theory is needed to describe the charge transport behaviors.
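
    The reduced transport law can be sketched numerically: a flux driven by the concentration gradient and scaled by the thermodynamic factor A. The electrode thickness, diffusivity, factor A, and concentration profile below are hypothetical placeholders, not values from the study:

    ```python
    import numpy as np

    # Illustrative 1-D concentration profile across a MIEC electrode
    L = 1e-4                              # electrode thickness [m]
    x = np.linspace(0.0, L, 101)
    c = 1000.0 * (1.0 - 0.2 * x / L)      # concentration [mol/m^3], linear

    D = 1e-9    # chemical diffusion coefficient [m^2/s]
    A = 1.5     # thermodynamic factor from the defect model

    # Modified Fick's first law: J = -D * A * dc/dx
    dcdx = np.gradient(c, x)
    J = -D * A * dcdx                     # flux [mol/(m^2 s)]
    ```

    For a linear profile the flux is uniform across the electrode, which is consistent with the steady-state limit of the reduced model; the full Nernst-Planck treatment is needed once the uniform-Fermi-level assumption breaks down.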

  5. Correlates and consequences of the disclosure of pain-related distress to one’s spouse

    PubMed Central

    Cano, Annmarie; Leong, Laura E. M.; Williams, Amy M.; May, Dana K. K.; Lutz, Jillian R.

    2012-01-01

    The communication of pain has received a great deal of attention in the pain literature; however, one form of pain communication, emotional disclosure of pain-related distress (e.g., sadness, worry, anger about pain), has not been studied extensively. The current study examined the extent to which this form of pain communication occurred during an observed conversation with one’s spouse and also investigated the correlates and consequences of disclosure. Individuals with chronic pain (ICPs) and their spouses (N = 95 couples) completed several questionnaires regarding pain, psychological distress, and relationship distress, and took part in video-recorded interactions about the impact of pain on their lives. Approximately two-thirds of ICPs (n = 65) disclosed their pain-related distress to their spouses. ICPs who reported greater pain severity, ruminative catastrophizing and affective distress about pain, and depressive and anxiety symptoms were more likely to disclose their distress to their spouses. Spouses of ICPs who disclosed only once or twice were significantly less likely to invalidate their partners, whereas spouses of ICPs who disclosed at a higher rate were significantly more likely to validate their partners. Furthermore, spouses were more likely to engage in invalidation after attempting more neutral or validating responses, suggesting an erosion of support when ICPs engaged in high rates of disclosure. Correlates of spousal invalidation included both spouses’ helplessness catastrophizing, ICPs’ affective distress about pain, and spouses’ anxiety, suggesting that both partners’ distress are implicated in maladaptive disclosure-response patterns. Findings are discussed in light of pain communication and empathy models of pain. PMID:23059054

  6. 49 CFR 40.159 - What does the MRO do when a drug test result is invalid?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 1 2013-10-01 2013-10-01 false What does the MRO do when a drug test result is invalid? 40.159 Section 40.159 Transportation Office of the Secretary of Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers and the Verification Process § 40.159 What does the MRO do when a...

  7. The Role of Depressive Symptoms, Family Invalidation and Behavioral Impulsivity in the Occurrence and Repetition of Non-Suicidal Self-Injury in Chinese Adolescents: A 2-Year Follow-Up Study

    ERIC Educational Resources Information Center

    You, Jianing; Leung, Freedom

    2012-01-01

    This study used zero-inflated Poisson regression analysis to examine the role of depressive symptoms, family invalidation, and behavioral impulsivity in the occurrence and repetition of non-suicidal self-injury among Chinese community adolescents over a 2-year period. Participants, 4782 high school students, were assessed twice during the…
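
    Zero-inflated Poisson regression rests on a mixture in which zeros arise either from a structural "never" class or from the count process itself, which suits self-injury counts where many adolescents never self-injure at all. The probability mass function can be sketched directly; the rate and mixing weight below are illustrative, not estimates from the study:

    ```python
    import math

    def zip_pmf(k, lam, pi):
        """Zero-inflated Poisson: pi is the probability of a structural zero
        (e.g., adolescents who never self-injure), lam is the Poisson rate
        for counts among the remainder."""
        poisson = math.exp(-lam) * lam**k / math.factorial(k)
        if k == 0:
            # Zeros come from the structural class AND from the count process
            return pi + (1.0 - pi) * poisson
        return (1.0 - pi) * poisson

    # With pi = 0.6 and lam = 2.0, most of the zero mass is structural
    p0 = zip_pmf(0, lam=2.0, pi=0.6)
    ```

    Fitting the two parts separately is what lets the analysis distinguish predictors of whether self-injury ever occurs from predictors of how often it is repeated.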

  8. A Heuristic Approach to Remove the Background Intensity on White-light Solar Images. I. STEREO /HI-1 Heliospheric Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stenborg, Guillermo; Howard, Russell A.

    White-light coronal and heliospheric imagers observe scattering of photospheric light from both dust particles (the F-corona) and free electrons in the corona (the K-corona). The separation of the two coronae is thus vitally important to reveal the faint K-coronal structures (e.g., streamers, co-rotating interaction regions, coronal mass ejections, etc.). However, the separation of the two coronae is very difficult, so we are content with defining a background corona that contains the F- and as little K- as possible. For both the LASCO-C2 and LASCO-C3 coronagraphs aboard the Solar and Heliospheric Observatory (SOHO) and the white-light imagers of the SECCHI suite aboard the Solar Terrestrial Relationships Observatory (STEREO), a time-dependent model of the background corona is generated from about a month of similar images. The creation of such models is possible because the missions carrying these instruments are orbiting the Sun at about 1 au. However, the orbit profiles for the upcoming Solar Orbiter and Solar Probe Plus missions are very different. These missions will have elliptic orbits with a rapidly changing radial distance, hence invalidating the techniques in use for the SOHO/LASCO and STEREO/SECCHI instruments. We have been investigating techniques to generate background models out of just single images that could be used for the Solar Orbiter Heliospheric Imager and the Wide-field Imager for the Solar Probe Plus packages on board the respective spacecraft. In this paper, we introduce a state-of-the-art, heuristic technique to create the background intensity models of STEREO/HI-1 data based solely on individual images, report on new results derived from its application, and discuss its relevance to instrumental and operational issues.

  9. Differences in MMPI-2 FBS and RBS scores in brain injury, probable malingering, and conversion disorder groups: a preliminary study.

    PubMed

    Peck, C P; Schroeder, R W; Heinrichs, R J; Vondran, E J; Brockman, C J; Webster, B K; Baade, L E

    2013-01-01

    This study examined differences in raw scores on the Symptom Validity Scale (FBS) and Response Bias Scale (RBS) from the Minnesota Multiphasic Personality Inventory-2 in three criterion groups: (i) valid traumatic brain injured, (ii) invalid traumatic brain injured, and (iii) psychogenic non-epileptic seizure disorders. Results indicate that a >30 raw score cutoff for the Symptom Validity Scale accurately identified 50% of the invalid traumatic brain injured group, while misclassifying none of the valid traumatic brain injured group and 6% of the psychogenic non-epileptic seizure disorder group. Using a >15 RBS raw cutoff score accurately classified 50% of the invalid traumatic brain injured group and misclassified fewer than 10% of the valid traumatic brain injured and psychogenic non-epileptic seizure disorder groups. These cutoff scores used conjunctively did not misclassify any members of the psychogenic non-epileptic seizure disorder or valid traumatic brain injured groups, while accurately classifying 44% of the invalid traumatic brain injured individuals. Findings from this preliminary study suggest that the conjunctive use of the Symptom Validity Scale and the RBS from the Minnesota Multiphasic Personality Inventory-2 may be useful in differentiating probable malingering from individuals with brain injuries and conversion disorders.
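
    The conjunctive use of the two cutoffs amounts to a simple decision rule: flag a profile only when both scales exceed their cutoffs, trading some sensitivity for the near-zero false-positive rate the study reports. The cutoff values are taken from the abstract; the function name is invented for the sketch:

    ```python
    def flag_probable_malingering(svs_raw, rbs_raw, svs_cut=30, rbs_cut=15):
        """Conjunctive rule: flag only when BOTH the Symptom Validity Scale
        and the RBS raw scores exceed their respective cutoffs."""
        return svs_raw > svs_cut and rbs_raw > rbs_cut
    ```

    A disjunctive rule (either scale above cutoff) would catch more invalid profiles but, per the reported misclassification rates, at the cost of falsely flagging some valid and conversion-disorder cases.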

  10. Mechanisms of contextual risk for adolescent self-injury: invalidation and conflict escalation in mother-child interactions.

    PubMed

    Crowell, Sheila E; Baucom, Brian R; McCauley, Elizabeth; Potapova, Natalia V; Fitelson, Martha; Barth, Heather; Smith, Cindy J; Beauchaine, Theodore P

    2013-01-01

    According to developmental theories of self-injury, both child characteristics and environmental contexts shape and maintain problematic behaviors. Although progress has been made toward identifying biological vulnerabilities to self-injury, mechanisms underlying psychosocial risk have received less attention. In the present study, we compared self-injuring adolescents (n = 17) with typical controls (n = 20) during a mother-child conflict discussion. Dyadic interactions were coded using both global and microanalytic systems, allowing for a highly detailed characterization of mother-child interactions. We also assessed resting state psychophysiological regulation, as indexed by respiratory sinus arrhythmia (RSA). Global coding revealed that maternal invalidation was associated with adolescent anger. Furthermore, maternal invalidation and coerciveness were both related to adolescent opposition/defiance. Results from the microanalytic system indicated that self-injuring dyads were more likely to escalate conflict, suggesting a potential mechanism through which emotion dysregulation is shaped and maintained over time. Finally, mother and teen aversiveness interacted to predict adolescent resting RSA. Low-aversive teens with highly aversive mothers had the highest RSA, whereas teens in high-high dyads showed the lowest RSA. These findings are consistent with theories that emotion invalidation and conflict escalation are possible contextual risk factors for self-injury.

  11. Mechanisms of Contextual Risk for Adolescent Self-Injury: Invalidation and Conflict Escalation in Mother-Child Interactions

    PubMed Central

    Crowell, Sheila E.; Baucom, Brian R.; McCauley, Elizabeth; Potapova, Natalia V.; Fitelson, Martha; Barth, Heather; Smith, Cindy J.; Beauchaine, Theodore P.

    2013-01-01

    OBJECTIVE According to developmental theories of self-injury, both child characteristics and environmental contexts shape and maintain problematic behaviors. Although progress has been made toward identifying biological vulnerabilities to self-injury, mechanisms underlying psychosocial risk have received less attention. METHOD In the present study, we compared self-injuring adolescents (n=17) with typical controls (n=20) during a mother-child conflict discussion. Dyadic interactions were coded using both global and microanalytic systems, allowing for a highly detailed characterization of mother-child interactions. We also assessed resting state psychophysiological regulation, as indexed by respiratory sinus arrhythmia (RSA). RESULTS Global coding revealed that maternal invalidation was associated with adolescent anger. Furthermore, maternal invalidation and coerciveness were both related to adolescent opposition/defiance. Results from the microanalytic system indicated that self-injuring dyads were more likely to escalate conflict, suggesting a potential mechanism through which emotion dysregulation is shaped and maintained over time. Finally, mother and teen aversiveness interacted to predict adolescent resting RSA. Low-aversive teens with highly aversive mothers had the highest RSA, whereas teens in high-high dyads showed the lowest RSA. CONCLUSIONS These findings are consistent with theories that emotion invalidation and conflict escalation are possible contextual risk factors for self-injury. PMID:23581508

  12. Risk-Based Fire Safety Experiment Definition for Manned Spacecraft

    NASA Technical Reports Server (NTRS)

    Apostolakis, G. E.; Ho, V. S.; Marcus, E.; Perry, A. T.; Thompson, S. L.

    1989-01-01

    Risk methodology is used to define experiments to be conducted in space that will help construct and test the models required for accident-sequence identification. The development of accident scenarios is based on the realization that whether damage occurs depends on a competition in time between two processes: the ignition and creation of an adverse environment, and the detection and suppression activities. If the fire grows and causes damage faster than it is detected and suppressed, an accident occurs. The proposed integrated experiments will provide information on the individual models that apply to each of these processes, as well as on previously unidentified interactions and processes, if any. Initially, models used in terrestrial fire risk assessments are considered. These include heat- and smoke-release models, detection and suppression models, and damage models. In cases where the absence of gravity substantially invalidates a model, alternate models will be developed. Models that depend on buoyancy effects, such as multizone compartment fire models, fall into this category. The experiments will be performed in a variety of geometries simulating habitable areas, racks, and other spaces. These simulations will necessitate theoretical studies of scaling effects. Sensitivity studies will also be carried out, including the effects of varying oxygen concentrations, pressures, fuel orientation and geometry, and air flow rates. The experimental apparatus described herein comprises three major modules: the combustion, fluids, and command-and-power modules.

  13. Electrostatic effects in unfolded staphylococcal nuclease

    PubMed Central

    Fitzkee, Nicholas C.; García-Moreno E, Bertrand

    2008-01-01

    Structure-based calculations of pKa values and electrostatic free energies of proteins assume that electrostatic effects in the unfolded state are negligible. In light of experimental evidence showing that this assumption is invalid for many proteins, and with increasing awareness that the unfolded state is more structured and compact than previously thought, a detailed examination of electrostatic effects in unfolded proteins is warranted. Here we address this issue with structure-based calculations of electrostatic interactions in unfolded staphylococcal nuclease. The approach involves the generation of ensembles of structures representing the unfolded state, and calculation of Coulomb energies to Boltzmann weight the unfolded state ensembles. Four different structural models of the unfolded state were tested. Experimental proton binding data measured with a variant of nuclease that is unfolded under native conditions were used to establish the validity of the calculations. These calculations suggest that weak Coulomb interactions are an unavoidable property of unfolded proteins. At neutral pH, the interactions are too weak to organize the unfolded state; however, at extreme pH values, where the protein has a significant net charge, the combined action of a large number of weak repulsive interactions can lead to the expansion of the unfolded state. The calculated pKa values of ionizable groups in the unfolded state are similar but not identical to the values in small peptides in water. These studies suggest that the accuracy of structure-based calculations of electrostatic contributions to stability cannot be improved unless electrostatic effects in the unfolded state are calculated explicitly. PMID:18227429
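    The Boltzmann weighting of unfolded-state ensembles mentioned above can be sketched in a few lines. This is a generic textbook computation, not the authors' code; the energies are hypothetical and the units (kcal/mol) are an assumption:

```python
import numpy as np

def boltzmann_weights(energies_kcal, T=298.15):
    """Boltzmann weights w_i = exp(-E_i / RT) / Z for an ensemble.

    Subtracting the minimum energy first avoids overflow; the shift
    cancels in the normalization. Energies are in kcal/mol.
    """
    R = 0.0019872  # gas constant, kcal/(mol K)
    e = np.asarray(energies_kcal, dtype=float)
    w = np.exp(-(e - e.min()) / (R * T))
    return w / w.sum()

# Three hypothetical Coulomb energies for unfolded-state conformers;
# the lowest-energy conformer receives the largest weight.
w = boltzmann_weights([-2.0, 0.0, 1.5])
```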

  14. Learning toward practical head pose estimation

    NASA Astrophysics Data System (ADS)

    Sang, Gaoli; He, Feixiang; Zhu, Rong; Xuan, Shibin

    2017-08-01

    Head pose is useful information for many face-related tasks, such as face recognition, behavior analysis, human-computer interfaces, etc. Existing head pose estimation methods usually assume that the face images have been well aligned or that sufficient and precise training data are available. In practical applications, however, these assumptions are very likely to be invalid. This paper first investigates the impact of the failure of these assumptions, i.e., misalignment of face images, uncertainty and undersampling of training data, on head pose estimation accuracy of state-of-the-art methods. A learning-based approach is then designed to enhance the robustness of head pose estimation to these factors. To cope with misalignment, instead of using hand-crafted features, it seeks suitable features by learning from a set of training data with a deep convolutional neural network (DCNN), such that the training data can be best classified into the correct head pose categories. To handle uncertainty and undersampling, it employs multivariate labeling distributions (MLDs) with dense sampling intervals to represent the head pose attributes of face images. The correlation between the features and the dense MLD representations of face images is approximated by a maximum entropy model, whose parameters are optimized on the given training data. To estimate the head pose of a face image, its MLD representation is first computed according to the model based on the features extracted from the image by the trained DCNN, and its head pose is then assumed to be the one corresponding to the peak in its MLD. Evaluation experiments on the Pointing'04, FacePix, Multi-PIE, and CASIA-PEAL databases prove the effectiveness and efficiency of the proposed method.
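    The final step above, reading the pose off the peak of the MLD, reduces to an argmax over a densely sampled label axis. The sketch below is a minimal illustration for a single yaw axis; the grid spacing and the synthetic distribution are assumptions, not details from the paper:

```python
import numpy as np

# Densely sampled yaw axis, standing in for the dense MLD sampling
# intervals described above (the 1-degree spacing is an assumption).
angles = np.arange(-90, 91, 1)

def pose_from_mld(mld):
    """Return the pose label at which the distribution peaks."""
    return int(angles[np.argmax(mld)])

# A hypothetical unimodal label distribution peaked at +15 degrees:
mld = np.exp(-0.5 * ((angles - 15) / 10.0) ** 2)
mld /= mld.sum()
estimated_pose = pose_from_mld(mld)
```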

  15. Prevalence of Invalid Performance on Baseline Testing for Sport-Related Concussion by Age and Validity Indicator.

    PubMed

    Abeare, Christopher A; Messa, Isabelle; Zuccato, Brandon G; Merker, Bradley; Erdodi, Laszlo

    2018-03-12

    Estimated base rates of invalid performance on baseline testing (base rates of failure) for the management of sport-related concussion range from 6.1% to 40.0%, depending on the validity indicator used. The instability of this key measure represents a challenge in the clinical interpretation of test results that could undermine the utility of baseline testing. To determine the prevalence of invalid performance on baseline testing and to assess whether the prevalence varies as a function of age and validity indicator. This retrospective, cross-sectional study included data collected between January 1, 2012, and December 31, 2016, from a clinical referral center in the Midwestern United States. Participants included 7897 consecutively tested, equivalently proportioned male and female athletes aged 10 to 21 years, who completed baseline neurocognitive testing for the purpose of concussion management. Baseline assessment was conducted with the Immediate Postconcussion Assessment and Cognitive Testing (ImPACT), a computerized neurocognitive test designed for assessment of concussion. Base rates of failure on published ImPACT validity indicators were compared within and across age groups. Hypotheses were developed after data collection but prior to analyses. Of the 7897 study participants, 4086 (51.7%) were male, mean (SD) age was 14.71 (1.78) years, 7820 (99.0%) were primarily English speaking, and the mean (SD) educational level was 8.79 (1.68) years. The base rate of failure ranged from 6.4% to 47.6% across individual indicators. Most of the sample (55.7%) failed at least 1 of 4 validity indicators. The base rate of failure varied considerably across age groups (117 of 140 [83.6%] for those aged 10 years to 14 of 48 [29.2%] for those aged 21 years), representing a risk ratio of 2.86 (95% CI, 2.60-3.16; P < .001). 
The results for base rate of failure were surprisingly high overall and varied widely depending on the specific validity indicator and the age of the examinee. The strong age association, with 3 of 4 participants aged 10 to 12 years failing validity indicators, suggests that the clinical interpretation and utility of baseline testing in this age group is questionable. These findings underscore the need for close scrutiny of performance validity indicators on baseline testing across age groups.
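    The reported risk ratio follows directly from the counts in the abstract (117 of 140 ten-year-olds vs. 14 of 48 twenty-one-year-olds). The sketch below uses the standard Wald interval on the log scale; the abstract does not state which interval method the authors used, so the confidence limits need not match theirs exactly:

```python
import math

def risk_ratio(a, n1, b, n2):
    """Risk ratio (a/n1)/(b/n2) with a Wald 95% CI on the log scale."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
    return rr, lo, hi

# Base-rate-of-failure counts from the abstract:
rr, lo, hi = risk_ratio(117, 140, 14, 48)  # rr is about 2.87
```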

  16. Response to Comment on "Water harvesting from air with metal-organic frameworks powered by natural sunlight".

    PubMed

    Kim, Hyunho; Rao, Sameer R; Narayanan, Shankar; Kapustin, Eugene A; Yang, Sungwoo; Furukawa, Hiroyasu; Umans, Ari S; Yaghi, Omar M; Wang, Evelyn N

    2017-12-01

    In their comment, Bui et al. argue that the approach we described in our report is vastly inferior in efficiency to alternative off-the-shelf technologies. Their conclusion is invalid, as they compare efficiencies under completely different operating conditions. Here, using heat-transfer and thermodynamics principles, we show how Bui et al.'s conclusions about the efficiencies of off-the-shelf technologies are fundamentally flawed and inaccurate for the operating conditions described in our study. Copyright © 2017, American Association for the Advancement of Science.

  17. Diffusive-light invisibility cloak for transient illumination

    NASA Astrophysics Data System (ADS)

    Orazbayev, B.; Beruete, M.; Martínez, A.; García-Meca, C.

    2016-12-01

    Invisibility in a diffusive-light-scattering medium has been recently demonstrated by employing a scattering-cancellation core-shell cloak. Unlike nondiffusive cloaks, such a device can be simultaneously macroscopic, broadband, passive, polarization independent, and omnidirectional. Unfortunately, it has been verified that this cloak, as well as more sophisticated ones based on transformation optics, fail under pulsed illumination, invalidating their use for a variety of applications. Here, we introduce a different approach based on unimodular transformations that enables the construction of unidirectional diffusive-light cloaks exhibiting a perfect invisibility effect, even under transient conditions. Moreover, we demonstrate that a polygonal cloak can extend this functionality to multiple directions with a nearly ideal behavior, while preserving all other features. We propose and numerically verify a simple cloak realization based on a layered stack of two isotropic materials. The studied devices have several applications not addressable by any of the other cloaks proposed to date, including shielding from pulse-based detection techniques, cloaking undesired scattering elements in time-of-flight imaging or high-speed communication systems for diffusive environments, and building extreme optical security features. The discussed cloaking strategy could also be applied to simplify the implementation of thermal cloaks.

  18. Setting and changing feature priorities in visual short-term memory.

    PubMed

    Kalogeropoulou, Zampeta; Jagadeesh, Akshay V; Ohl, Sven; Rolfs, Martin

    2017-04-01

    Many everyday tasks require prioritizing some visual features over competing ones, both during the selection from the rich sensory input and while maintaining information in visual short-term memory (VSTM). Here, we show that observers can change priorities in VSTM when, initially, they attended to a different feature. Observers reported from memory the orientation of one of two spatially interspersed groups of black and white gratings. Using colored pre-cues (presented before stimulus onset) and retro-cues (presented after stimulus offset) predicting the to-be-reported group, we manipulated observers' feature priorities independently during stimulus encoding and maintenance, respectively. Valid pre-cues reliably increased observers' performance (reduced guessing, increased report precision) as compared to neutral ones; invalid pre-cues had the opposite effect. Valid retro-cues also consistently improved performance (by reducing random guesses), even if the unexpected group suddenly became relevant (invalid-valid condition). Thus, feature-based attention can reshape priorities in VSTM protecting information that would otherwise be forgotten.

  19. Smartphone based automatic organ validation in ultrasound video.

    PubMed

    Vaish, Pallavi; Bharath, R; Rajalakshmi, P

    2017-07-01

    Telesonography involves transmission of ultrasound video from remote areas to doctors for diagnosis. Because remote areas lack trained sonographers, ultrasound videos scanned by untrained persons often do not contain the information a physician requires. Rather than standard video-transmission methods, mHealth-driven systems are needed that transmit valid medical videos. To address this problem, we propose an organ-validation algorithm that evaluates an ultrasound video based on its content, thereby guiding the semi-skilled operator to acquire representative data from the patient. Advances in smartphone technology now allow computationally demanding medical image processing to run on the phone itself. In this paper we have developed a smartphone application (app) that automatically detects the valid frames of an ultrasound video (those with clear organ visibility), ignores the invalid frames (those with no organ visibility), and produces a compressed video. This is done by extracting GIST features from the region of interest (ROI) of each frame and classifying the frame with an SVM classifier using a quadratic kernel. The developed application classified valid and invalid images with an accuracy of 94.93%.
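    The classification stage described above (GIST features fed to an SVM with a quadratic kernel) can be sketched with scikit-learn. The random clusters below stand in for real GIST descriptors; the feature dimensionality, class separation, and coef0 setting are assumptions made for the sake of a runnable example:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins for GIST feature vectors of valid (organ visible)
# and invalid (no organ) frames; real GIST descriptors are much higher
# dimensional, and this separation is purely illustrative.
valid = rng.normal(loc=2.0, scale=0.3, size=(50, 16))
invalid = rng.normal(loc=0.0, scale=0.3, size=(50, 16))
X = np.vstack([valid, invalid])
y = np.array([1] * 50 + [0] * 50)

# Quadratic kernel as named in the abstract; coef0=1 makes the kernel
# inhomogeneous so a linear term is retained alongside the quadratic.
clf = SVC(kernel="poly", degree=2, coef0=1.0).fit(X, y)
acc = clf.score(X, y)
```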

  20. Motivated reflection on attitude-inconsistent information: an exploration of the role of fear of invalidity in self-persuasion.

    PubMed

    Clarkson, Joshua J; Valente, Matthew J; Leone, Christopher; Tormala, Zakary L

    2013-12-01

    The mere thought effect is defined in part by the tendency of self-reflective thought to heighten the generation of and reflection on attitude-consistent thoughts. By focusing on individuals' fears of invalidity, we explored the possibility that the mere opportunity for thought sometimes motivates reflection on attitude-inconsistent thoughts. Across three experiments, dispositional and situational fear of invalidity was shown to heighten reflection on attitude-inconsistent thoughts. This heightened reflection, in turn, interacted with individuals' thought confidence to determine whether attitude-inconsistent thoughts were assimilated or refuted and consequently whether individuals' attitudes and behavioral intentions depolarized or polarized following a sufficient opportunity for thought, respectively. These findings emphasize the impact of motivational influences on thought reflection and generation, the importance of thought confidence in the assimilation and refutation of self-generated thought, and the dynamic means by which the mere thought bias can impact self-persuasion.

  1. [Physicians as Experts of the Integration of war invalids of WWI and WWII].

    PubMed

    Wolters, Christine

    2015-12-01

    After the First World War, the large number of war invalids posed a medical as well as a socio-political problem. This needed to be addressed, at least to some extent, through the pension and welfare authorities (Versorgungsbehörden) and reintegration into the labour market. Due to the demilitarization of Germany, this task was taken on by the civil administration, which was dissolved during the time of National Socialism. In 1950, the Federal Republic of Germany enacted the Federal War Victims Relief Act (Bundesversorgungsgesetz), which created a privileged group of civil and military war invalids, whereas other disabled people and victims of National Socialist persecution were initially excluded. This article examines the continuities and discontinuities of these institutions after the First World War, with a particular focus on the groups of doctors who structured this field. How did doctors become experts, and what was their expertise?

  2. Impact of External Cue Validity on Driving Performance in Parkinson's Disease

    PubMed Central

    Scally, Karen; Charlton, Judith L.; Iansek, Robert; Bradshaw, John L.; Moss, Simon; Georgiou-Karistianis, Nellie

    2011-01-01

    This study sought to investigate the impact of external cue validity on simulated driving performance in 19 Parkinson's disease (PD) patients and 19 healthy age-matched controls. Braking points and distance between deceleration point and braking point were analysed for red traffic signals preceded either by Valid Cues (correctly predicting signal), Invalid Cues (incorrectly predicting signal), and No Cues. Results showed that PD drivers braked significantly later and travelled significantly further between deceleration and braking points compared with controls for Invalid and No-Cue conditions. No significant group differences were observed for driving performance in response to Valid Cues. The benefit of Valid Cues relative to Invalid Cues and No Cues was significantly greater for PD drivers compared with controls. Trail Making Test (B-A) scores correlated with driving performance for PDs only. These results highlight the importance of external cues and higher cognitive functioning for driving performance in mild to moderate PD. PMID:21789275

  3. Accounting for Non-Gaussian Sources of Spatial Correlation in Parametric Functional Magnetic Resonance Imaging Paradigms I: Revisiting Cluster-Based Inferences.

    PubMed

    Gopinath, Kaundinya; Krishnamurthy, Venkatagiri; Sathian, K

    2018-02-01

    In a recent study, Eklund et al. employed resting-state functional magnetic resonance imaging data as a surrogate for null functional magnetic resonance imaging (fMRI) datasets and posited that cluster-wise family-wise error (FWE) rate-corrected inferences made by using parametric statistical methods in fMRI studies over the past two decades may have been invalid, particularly for cluster defining thresholds less stringent than p < 0.001; this was principally because the spatial autocorrelation functions (sACF) of fMRI data had been modeled incorrectly to follow a Gaussian form, whereas empirical data suggested otherwise. Here, we show that accounting for non-Gaussian signal components such as those arising from resting-state neural activity as well as physiological responses and motion artifacts in the null fMRI datasets yields first- and second-level general linear model analysis residuals with nearly uniform and Gaussian sACF. Further comparison with nonparametric permutation tests indicates that cluster-based FWE corrected inferences made with Gaussian spatial noise approximations are valid.

  4. The Applied Mathematics for Power Systems (AMPS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chertkov, Michael

    2012-07-24

    Increased deployment of new technologies, e.g., renewable generation and electric vehicles, is rapidly transforming electrical power networks by crossing previously distinct spatiotemporal scales and invalidating many traditional approaches for designing, analyzing, and operating power grids. This trend is expected to accelerate over the coming years, bringing the disruptive challenge of complexity, but also opportunities to deliver unprecedented efficiency and reliability. Our Applied Mathematics for Power Systems (AMPS) Center will discover, enable, and solve emerging mathematics challenges arising in power systems and, more generally, in complex engineered networks. We will develop foundational applied mathematics resulting in rigorous algorithms and simulation toolboxes for modern and future engineered networks. The AMPS Center deconstruction/reconstruction approach 'deconstructs' complex networks into sub-problems within non-separable spatiotemporal scales, a missing step in 20th century modeling of engineered networks. These sub-problems are addressed within the appropriate AMPS foundational pillar - complex systems, control theory, and optimization theory - and merged or 'reconstructed' at their boundaries into more general mathematical descriptions of complex engineered networks where important new questions are formulated and attacked. These two steps, iterated multiple times, will bridge the growing chasm between the legacy power grid and its future as a complex engineered network.

  5. Philosophy and Sociology of Science Evolution and History

    NASA Astrophysics Data System (ADS)

    Rosen, Joe

    The following sections are included: * Concrete Versus Abstract Theoretical Models * Introduction: concrete and abstract in Kepler's contribution * Einstein's theory of gravitation and Mach's principle * Unitary symmetry and the structure of hadrons * Conclusion * Dedication * Symmetry, Entropy and Complexity * Introduction * Symmetry Implies Abstraction and Loss of Information * Broken Symmetries - Imposed or Spontaneous * Symmetry, Order and Information * References * Cosmological Surrealism: More Than "Eternal Reality" Is Needed * Pythagoreanism in atomic, nuclear and particle physics * Introduction: Pythagoreanism as part of the Greek scientific world view — and the three questions I will tackle * Point 1: the impact of Gersonides and Crescas, two scientific anti-Aristotelian rebels * Point 2: Kepler's spheres to Bohr's orbits — Pythagoreanisms at last! * Point 3: Aristotle to Maupertuis, Emmy Noether, Schwinger * References * Paradigm Completion for Generalized Evolutionary Theory with Application to Epistemology * Evolution Fully Generalized * Entropy: Gravity as Model * Evolution and Entropy: Measures of Complexity * Extinctions and a Balanced Evolutionary Paradigm * The Evolution of Human Society - the Age of Information as Example * High-Energy Physics and the World Wide Web * Twentieth Century Epistemology Has Strong (de facto) Evolutionary Elements * The discoveries towards the beginning of the XXth Century * Summary and Conclusions * References * Evolutionary Epistemology and Invalidation * Introduction * Extinctions and a New Evolutionary Paradigm * Evolutionary Epistemology - Active Mutations * Evolutionary Epistemology: Invalidation as an Extinction * References

  6. Feedback control of combustion instabilities from within limit cycle oscillations using H∞ loop-shaping and the ν-gap metric

    PubMed Central

    Morgans, Aimee S.

    2016-01-01

    Combustion instabilities arise owing to a two-way coupling between acoustic waves and unsteady heat release. Oscillation amplitudes successively grow until nonlinear effects cause saturation into limit cycle oscillations. Feedback control, in which an actuator modifies some combustor input in response to a sensor measurement, can suppress combustion instabilities. Linear feedback controllers are typically designed using linear combustor models. However, when activated from within a limit cycle, the linear model is invalid, and such controllers are not guaranteed to stabilize the system. This work develops a feedback control strategy guaranteed to stabilize from within limit cycle oscillations. A low-order model of a simple combustor, exhibiting the essential features of more complex systems, is presented. Linear plane acoustic wave modelling is combined with a weakly nonlinear describing function for the flame. The latter is determined numerically using a level set approach. Its implication is that the open-loop transfer function (OLTF) needed for controller design varies with oscillation level. The difference between the mean and the rest of the OLTFs is characterized using the ν-gap metric, providing the minimum required ‘robustness margin’ for an H∞ loop-shaping controller. Such controllers are designed and achieve stability both for linear fluctuations and from within limit cycle oscillations. PMID:27493558

  7. New Global 3D Upper to Mid-mantle Electrical Conductivity Model Based on Observatory Data with Realistic Auroral Sources

    NASA Astrophysics Data System (ADS)

    Kelbert, A.; Egbert, G. D.; Sun, J.

    2011-12-01

    Poleward of 45-50 degrees (geomagnetic), observatory data are influenced significantly by auroral ionospheric current systems, invalidating the simplifying zonal-dipole source assumption traditionally used for long-period (T > 2 days) geomagnetic induction studies. Previous efforts to use these data to obtain the global electrical conductivity distribution in Earth's mantle have omitted high-latitude sites (further thinning an already sparse dataset) and/or corrected the affected transfer functions using a highly simplified model of auroral source currents. Although these strategies are partly effective, there remain clear suggestions of source contamination in most recent 3D inverse solutions - specifically, bands of conductive features are found near auroral latitudes. We report on a new approach to this problem, based on adjusting both the external field structure and the 3D Earth conductivity to fit observatory data. As an initial step towards full joint inversion, we use a two-step procedure. In the first stage, we adopt a simplified conductivity model, with a thin sheet of variable conductance (representing the oceans) overlying a 1D Earth, to invert observed magnetic fields for the external source spatial structure. Input data for this inversion are obtained from frequency-domain principal components (PC) analysis of geomagnetic observatory hourly mean values. To make this (essentially linear) inverse problem well posed, we regularize using covariances for source field structure that are consistent with well-established properties of auroral ionospheric (and magnetospheric) current systems and the basic physics of the EM fields. In the second stage, we use a 3D finite difference inversion code, with source fields estimated from the first stage, to further fit the observatory PC modes.
We incorporate higher-latitude data into the inversion and maximize the amount of available information by directly inverting the magnetic field components of the PC modes, instead of transfer functions such as the C-responses used previously. Recent improvements in the accuracy and speed of the forward and inverse finite difference codes (a secondary-field formulation and parallelization over frequencies) allow us to use a finer computational grid for inversion, and thus to model finer-scale features, making full use of the expanded data set. Overall, our approach improves on earlier observatory-data interpretation techniques, making better use of the available data and allowing us to explore the trade-offs between complications in source structure and heterogeneities in mantle conductivity. We will also report on progress towards applying the same approach to simultaneous source/conductivity inversion of shorter-period observatory data, focusing especially on the daily variation band.
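    The frequency-domain principal-components step can be illustrated with a plain SVD. Everything below is synthetic: the matrix of single-frequency Fourier coefficients (observatories × time windows), the two coherent source modes, and the noise level are stand-ins, not real observatory data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Fourier coefficients at one period: rows are
# observatories, columns are time windows. Two coherent source modes
# plus a small amount of incoherent noise.
n_obs, n_win = 30, 200
patterns = rng.normal(size=(n_obs, 2))    # spatial patterns of the modes
amplitudes = rng.normal(size=(2, n_win))  # window-to-window amplitudes
data = patterns @ amplitudes + 0.05 * rng.normal(size=(n_obs, n_win))

# Principal components via SVD of the demeaned data matrix; the two
# leading modes should absorb nearly all of the variance.
demeaned = data - data.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(demeaned, full_matrices=False)
var_explained = s**2 / np.sum(s**2)
```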

  8. Red flags in the clinical interview may forecast invalid neuropsychological testing.

    PubMed

    Keesler, Michael E; McClung, Kirstie; Meredith-Duliba, Tawny; Williams, Kelli; Swirsky-Sacchetti, Thomas

    2017-04-01

    Evaluating assessment validity is expected in neuropsychological evaluation, particularly in cases with identified secondary gain, where malingering or somatization may be present. Validity is typically assessed with standalone measures and embedded indices, all within the testing portion of the examination; research on the validity of self-report in the clinical interview is limited. Based on experience with litigation-involved examinees recovering from mild traumatic brain injury (mTBI), it was hypothesized that an inconsistently reported date of injury (DOI) and/or loss of consciousness (LOC) might predict invalid performance on neurocognitive testing. This archival study examined cases of litigation-involved mTBI patients seen at an outpatient neuropsychological practice in Philadelphia, PA. Coded data included demographic variables, performance validity measures, and consistency between self-report and medicolegal records. A significant relationship was found between the consistency of examinees' self-report with records and their scores on performance validity testing, χ2(1, N = 84) = 24.18, p < .01, Φ = .49. Post hoc testing revealed significant between-group differences in three of four comparisons, with medium to large effect sizes. A final post hoc analysis found a significant association between the number of performance validity tests (PVTs) failed and the extent to which an examinee incorrectly reported the DOI, r(83) = .49, p < .01. Using inconsistently reported LOC and/or DOI to predict an examinee's performance as invalid had 75% sensitivity and 75% specificity. Examinees whose reported DOI or LOC differs from records may be more likely to fail one or more PVTs, suggesting possible symptom exaggeration and/or underperformance on cognitive testing.
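    The reported 75% sensitivity and 75% specificity are simple ratios of a 2×2 classification table. The counts below are hypothetical, chosen only to reproduce those percentages (the abstract does not give the cell counts):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical cell counts consistent with 75% sensitivity and 75%
# specificity for flagging invalid performance from an inconsistently
# reported DOI/LOC:
sens, spec = sens_spec(tp=15, fn=5, tn=45, fp=15)
```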

  9. Equivalent Dynamic Models.

    PubMed

    Molenaar, Peter C M

    2017-01-01

Equivalences of two classes of dynamic models for weakly stationary multivariate time series are discussed: dynamic factor models and autoregressive models. It is shown that exploratory dynamic factor models can be rotated, yielding an infinite set of equivalent solutions for any observed series. It also is shown that dynamic factor models with lagged factor loadings are not equivalent to the currently popular state-space models, and that restriction of attention to the latter type of models may yield invalid results. The known equivalent vector autoregressive model types, standard and structural, are given a new interpretation in which they are conceived of as the extremes of an innovative type of hybrid vector autoregressive model. It is shown that consideration of hybrid models solves many problems, in particular with Granger causality testing.

  10. WTP and WTA: do people think differently?

    PubMed

    Whynes, David K; Sach, Tracey H

    2007-09-01

    Contingent valuation (CV) studies in health care have used the willingness to pay (WTP) approach, to the virtual exclusion of willingness to accept (WTA). Outside the health care field, disparities between WTP and WTA values have been observed. Were such disparities to be demonstrated for health care technologies, the conventional assumption of a linear cost-effectiveness plane would be invalidated. This paper employs data derived from interviews with users of the UK's paediatric cochlear implantation (PCI) programme based in Nottingham (i) to assess the feasibility of estimating WTA for the potential discontinuation of an existing technology, and (ii) to investigate any WTA-WTP disparity which might be revealed. Only one-third of subjects providing WTP values were willing and able to offer a corresponding WTA value. Our qualitative data revealed that modes of response differed between the two valuation approaches. In particular, the presumption of fungibility of the health care intervention was a far more serious obstacle to completing the WTA task than it was for WTP. Among those prepared to offer values under both approaches, mean WTA was approximately four times mean WTP. Until more health studies are conducted, it remains unclear whether or not the findings are specific both to the intervention and to the elicitation format.

  11. Effects of attention bias modification with short and long stimulus-duration: A randomized experiment with individuals with subclinical social anxiety.

    PubMed

    Liang, Chi-Wen; Hsu, Wen-Yau

    2016-06-30

This study investigated the differential effects of two attention bias modification (ABM) procedures with different stimulus durations. Seventy-two undergraduates with subclinical social anxiety were randomly assigned to one of four conditions: an ABM condition with either a 100-ms or a 500-ms stimulus duration (ABM-100/ABM-500) or an attention placebo (AP) condition with either a 100-ms or a 500-ms stimulus duration (AP-100/AP-500). Participants completed the pre-assessments, eight attentional training sessions, and post-assessments. A modified Posner paradigm was used to assess changes in attentional processing. After completion of attentional training, the ABM-100 group responded significantly faster to 100-ms invalid trials, regardless of the word type. The ABM-100 group also exhibited significantly reduced latencies to 500-ms invalid social threat trials and marginally significantly reduced latencies to 500-ms invalid neutral trials. The ABM-500 group showed significantly reduced latencies to 500-ms invalid social threat trials. Both ABMs significantly reduced participants' fear of negative evaluations and interactional anxiousness relative to their comparative AP. The effects on social anxiety did not differ between the two ABMs. This study suggests that although both ABMs using short and long stimulus durations reduce some aspects of social anxiety, they influence participants' attentional disengagement in different ways. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  12. Interactivity Communication and Trust: Further Studies of Leadership in the Electronic Age

    DTIC Science & Technology

    2005-03-01

    composure, (h) favorable evaluation, and (i) decision-making effectiveness. Sample and Method. Participants (N = 146) were undergraduate students in...impairs decision quality when deceit and invalid information are introduced. Sample and Method. Participants (N = 126) were undergraduate students enrolled... undergraduate business students (N = 66) in two geographically distant U.S. universities participated in a four-week project using a web-based computer

  13. Validity in the hiring and evaluation process.

    PubMed

    Gregg, Robert E

    2006-01-01

    Validity means "based on sound principles." Hiring decisions, discharges, and layoffs are often challenged in court. Unfortunately the employer's defenses are too often found "invalid." The Americans With Disabilities Act requires the employer to show a "validated" hiring process. Defense of discharges or layoffs often focuses on validity of the employer's decision. This article explains the elements of validity needed for sound and defendable employment decisions.

  14. Correlates and consequences of the disclosure of pain-related distress to one's spouse.

    PubMed

    Cano, Annmarie; Leong, Laura E M; Williams, Amy M; May, Dana K K; Lutz, Jillian R

    2012-12-01

    The communication of pain has received a great deal of attention in the pain literature; however, one form of pain communication--emotional disclosure of pain-related distress (e.g., sadness, worry, anger about pain)--has not been studied extensively. This study examined the extent to which this form of pain communication occurred during an observed conversation with one's spouse and also investigated the correlates and consequences of disclosure. Individuals with chronic pain (ICP) and their spouses (N=95 couples) completed several questionnaires regarding pain, psychological distress, and relationship distress as well as video recorded interactions about the impact of pain on their lives. Approximately two-thirds of ICPs (n=65) disclosed their pain-related distress to their spouses. ICPs who reported greater pain severity, ruminative catastrophizing and affective distress about pain, and depressive and anxiety symptoms were more likely to disclose their distress to their spouses. Spouses of ICPs who disclosed only once or twice were significantly less likely to invalidate their partners whereas spouses of ICPs who disclosed at a higher rate were significantly more likely to validate their partners. Furthermore, spouses were more likely to engage in invalidation after attempting more neutral or validating responses, suggesting an erosion of support when ICPs engaged in high rates of disclosure. Correlates of spousal invalidation included both spouses' helplessness catastrophizing, ICPs' affective distress about pain, and spouses' anxiety, suggesting that both partners' distress are implicated in maladaptive disclosure-response patterns. Findings are discussed in light of pain communication and empathy models of pain. Copyright © 2012 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  15. From Bilski back to Benson: preemption, inventing around, and the case of genetic diagnostics.

    PubMed

    Dreyfuss, Rochelle; Evans, James P

    2011-06-01

    The long-anticipated decision in Bilski v. Kappos was supposed to end uncertainty regarding the patentability of process claims (or, at the least, business method claims). Instead, the opinion featured a series of anomalies: The Court emphasized strict construction of the Patent Act, but acknowledged three judge-made exceptions to patentability. It disapproved State Street, the Federal Circuit case that had upheld business method patents, but could muster only four votes for the proposition that business methods are in fact unpatentable. But even though the Court upheld business method patents, it invalidated all of Bilski's hedging claims. And while the Justices agreed on one thing - a patent that "preempts" something (a mathematical formula, an approach, a commonly used idea, a wide swath of technological developments, the public's access) is bad - they failed to operationalize the concept. That problem had plagued the law prior to State Street; in the interest of preventing the same set of problems from recurring, this Article uses recent empirical studies on gene patents to tease out indicia ("clues") to supplement the machine-or-transformation test for determining when a claim is preemptive and therefore invalid. Chief among these clues is the inability to invent around claims that cover broad prospects.

  16. LEOPARD: A grid-based dispersion relation solver for arbitrary gyrotropic distributions

    NASA Astrophysics Data System (ADS)

    Astfalk, Patrick; Jenko, Frank

    2017-01-01

    Particle velocity distributions measured in collisionless space plasmas often show strong deviations from idealized model distributions. Despite this observational evidence, linear wave analysis in space plasma environments such as the solar wind or Earth's magnetosphere is still mainly carried out using dispersion relation solvers based on Maxwellians or other parametric models. To enable a more realistic analysis, we present the new grid-based kinetic dispersion relation solver LEOPARD (Linear Electromagnetic Oscillations in Plasmas with Arbitrary Rotationally-symmetric Distributions) which no longer requires prescribed model distributions but allows for arbitrary gyrotropic distribution functions. In this work, we discuss the underlying numerical scheme of the code and we show a few exemplary benchmarks. Furthermore, we demonstrate a first application of LEOPARD to ion distribution data obtained from hybrid simulations. In particular, we show that in the saturation stage of the parallel fire hose instability, the deformation of the initial bi-Maxwellian distribution invalidates the use of standard dispersion relation solvers. A linear solver based on bi-Maxwellians predicts further growth even after saturation, while LEOPARD correctly indicates vanishing growth rates. We also discuss how this complies with former studies on the validity of quasilinear theory for the resonant fire hose. In the end, we briefly comment on the role of LEOPARD in directly analyzing spacecraft data, and we refer to an upcoming paper which demonstrates a first application of that kind.

  17. An interface tracking model for droplet electrocoalescence.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Lindsay Crowl

This report describes an Early Career Laboratory Directed Research and Development (LDRD) project to develop an interface tracking model for droplet electrocoalescence. Many fluid-based technologies rely on electrical fields to control the motion of droplets, e.g. microfluidic devices for high-speed droplet sorting, solution separation for chemical detectors, and purification of biodiesel fuel. Precise control over droplets is crucial to these applications. However, electric fields can induce complex and unpredictable fluid dynamics. Recent experiments (Ristenpart et al. 2009) have demonstrated that oppositely charged droplets bounce rather than coalesce in the presence of strong electric fields. A transient aqueous bridge forms between approaching drops prior to pinch-off. This observation applies to many types of fluids, but neither theory nor experiments have been able to offer a satisfactory explanation. Analytic hydrodynamic approximations for interfaces become invalid near coalescence, and therefore detailed numerical simulations are necessary. This is a computationally challenging problem that involves tracking a moving interface and solving complex multi-physics and multi-scale dynamics, which are beyond the capabilities of most state-of-the-art simulations. An interface-tracking model for electro-coalescence can provide a new perspective to a variety of applications in which interfacial physics are coupled with electrodynamics, including electro-osmosis, fabrication of microelectronics, fuel atomization, oil dehydration, nuclear waste reprocessing and solution separation for chemical detectors. We present a conformal decomposition finite element (CDFEM) interface-tracking method for the electrohydrodynamics of two-phase flow to demonstrate electro-coalescence. CDFEM is a sharp interface method that decomposes elements along fluid-fluid boundaries and uses a level set function to represent the interface.

  18. Apple App Store as a Business Model Supporting U.S. Navy Requirements

    DTIC Science & Technology

    2011-10-25

    credit card identity information for charging the required $99/year fee. Actual payment of the fee is handled by putting the developer program...required for the App Store. Applications are submitted via iTunes Connect (http://itunesconnect.apple.com). Table 1 gives a list of the...still being processed by the iTunes Connect system.  Invalid Binary—Appears when a binary is received through the Application Loader and has been

  19. JPRS Report, Science & Technology USSR: Physics & Mathematics

    DTIC Science & Technology

    1991-03-07

field B < m_W²/e (m_W is the mass of the gauge W boson) does not invalidate this approximation inasmuch as the respective momentum integrals remain...model (sin²θ_W = ¼, where θ_W is the weak mixing angle) indicate that, in an ultrastrong magnetic field, photon fusion produces more electron... boson field throughout the 8^8* range. This study was made within the scope of Project N 344 in the Government Program "High-Temperature

  20. Real-time management of faulty electrodes in electrical impedance tomography.

    PubMed

    Hartinger, Alzbeta E; Guardo, Robert; Adler, Andy; Gagnon, Hervé

    2009-02-01

    Completely or partially disconnected electrodes are a fairly common occurrence in many electrical impedance tomography (EIT) clinical applications. Several factors can contribute to electrode disconnection: patient movement, perspiration, manipulations by clinical staff, and defective electrode leads or electronics. By corrupting several measurements, faulty electrodes introduce significant image artifacts. In order to properly manage faulty electrodes, it is necessary to: 1) account for invalid data in image reconstruction algorithms and 2) automatically detect faulty electrodes. This paper presents a two-part approach for real-time management of faulty electrodes based on the principle of voltage-current reciprocity. The first part allows accounting for faulty electrodes in EIT image reconstruction without a priori knowledge of which electrodes are at fault. The method properly weights each measurement according to its compliance with the principle of voltage-current reciprocity. Results show that the algorithm is able to automatically determine the valid portion of the data and use it to calculate high-quality images. The second part of the approach allows automatic real-time detection of at least one faulty electrode with 100% sensitivity and two faulty electrodes with 80% sensitivity enabling the clinical staff to fix the problem as soon as possible to minimize data loss.
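The reciprocity-based weighting described in this record can be sketched in a few lines. The exponential weighting function and the error scale `tau` below are illustrative assumptions, not the authors' published formula:

```python
import math

def reciprocity_weights(u_fwd, u_rev, tau=0.05):
    """Down-weight EIT measurements that violate voltage-current
    reciprocity. u_fwd[i]: voltage with drive pair A / measurement pair B;
    u_rev[i]: the reciprocal measurement (drive B, measure A).
    The exponential form and the scale tau are illustrative choices."""
    weights = []
    for uf, ur in zip(u_fwd, u_rev):
        denom = max(abs(uf), abs(ur), 1e-12)
        err = abs(uf - ur) / denom            # relative reciprocity error
        weights.append(math.exp(-err / tau))  # ~1 if compliant, ~0 if faulty
    return weights

# A compliant pair keeps full weight; a pair corrupted by a faulty
# electrode is effectively excluded from image reconstruction.
w = reciprocity_weights([1.00, 1.00], [1.00, 0.20])
```

Measurements touching a disconnected electrode produce large reciprocity errors and thus near-zero weights, which is how reconstruction can proceed without first identifying which electrode is at fault.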

  1. The influence of object similarity and orientation on object-based cueing.

    PubMed

    Hein, Elisabeth; Blaschke, Stefan; Rolke, Bettina

    2017-01-01

    Responses to targets that appear at a noncued position within the same object (invalid-same) compared to a noncued position at an equidistant different object (invalid-different) tend to be faster and more accurate. These cueing effects have been taken as evidence that visual attention can be object based (Egly, Driver, & Rafal, Journal of Experimental Psychology: General, 123, 161-177, 1994). Recent findings, however, have shown that the object-based cueing effect is influenced by object orientation, suggesting that the cueing effect might be due to a more general facilitation of attentional shifts across the horizontal meridian (Al-Janabi & Greenberg, Attention, Perception, & Psychophysics, 1-17, 2016; Pilz, Roggeveen, Creighton, Bennet, & Sekuler, PLOS ONE, 7, e30693, 2012). The aim of this study was to investigate whether the object-based cueing effect is influenced by object similarity and orientation. According to the object-based attention account, objects that are less similar to each other should elicit stronger object-based cueing effects independent of object orientation, whereas the horizontal meridian theory would not predict any effect of object similarity. We manipulated object similarity by using a color (Exp. 1, Exp. 2A) or shape change (Exp. 2B) to distinguish two rectangles in a variation of the classic two-rectangle paradigm (Egly et al., 1994). We found that the object-based cueing effects were influenced by the orientation of the rectangles and strengthened by object dissimilarity. We suggest that object-based cueing effects are strongly affected by the facilitation of attention along the horizontal meridian, but that they also have an object-based attentional component, which is revealed when the dissimilarity between the presented objects is accentuated.

  2. Testing Genetic Pleiotropy with GWAS Summary Statistics for Marginal and Conditional Analyses.

    PubMed

    Deng, Yangqing; Pan, Wei

    2017-12-01

    There is growing interest in testing genetic pleiotropy, which is when a single genetic variant influences multiple traits. Several methods have been proposed; however, these methods have some limitations. First, all the proposed methods are based on the use of individual-level genotype and phenotype data; in contrast, for logistical, and other, reasons, summary statistics of univariate SNP-trait associations are typically only available based on meta- or mega-analyzed large genome-wide association study (GWAS) data. Second, existing tests are based on marginal pleiotropy, which cannot distinguish between direct and indirect associations of a single genetic variant with multiple traits due to correlations among the traits. Hence, it is useful to consider conditional analysis, in which a subset of traits is adjusted for another subset of traits. For example, in spite of substantial lowering of low-density lipoprotein cholesterol (LDL) with statin therapy, some patients still maintain high residual cardiovascular risk, and, for these patients, it might be helpful to reduce their triglyceride (TG) level. For this purpose, in order to identify new therapeutic targets, it would be useful to identify genetic variants with pleiotropic effects on LDL and TG after adjusting the latter for LDL; otherwise, a pleiotropic effect of a genetic variant detected by a marginal model could simply be due to its association with LDL only, given the well-known correlation between the two types of lipids. Here, we develop a new pleiotropy testing procedure based only on GWAS summary statistics that can be applied for both marginal analysis and conditional analysis. Although the main technical development is based on published union-intersection testing methods, care is needed in specifying conditional models to avoid invalid statistical estimation and inference. 
In addition to the previously used likelihood ratio test, we also propose using generalized estimating equations under the working independence model for robust inference. We provide numerical examples based on both simulated and real data, including two large lipid GWAS summary association datasets based on ∼100,000 and ∼189,000 samples, respectively, to demonstrate the difference between marginal and conditional analyses, as well as the effectiveness of our new approach. Copyright © 2017 by the Genetics Society of America.
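A union-intersection test of marginal pleiotropy from summary statistics can be sketched as below. This two-trait, max-p formulation is a simplified illustration of the general idea, not the authors' exact procedure:

```python
import math

def norm_sf(z):
    """Upper-tail probability of the standard normal distribution."""
    return 0.5 * math.erfc(z / math.sqrt(2))

def pleiotropy_pvalue(z1, z2):
    """Union-intersection style test for marginal pleiotropy of one SNP
    on two traits, given only GWAS summary z-scores: the SNP is declared
    pleiotropic only if it is associated with *both* traits, so the
    combined p-value is the larger of the two two-sided p-values.
    (A two-trait simplification, not the paper's full procedure.)"""
    p1 = 2 * norm_sf(abs(z1))
    p2 = 2 * norm_sf(abs(z2))
    return max(p1, p2)

p_one_trait = pleiotropy_pvalue(5.0, 0.1)  # associated with one trait only
p_both = pleiotropy_pvalue(6.0, 6.0)       # associated with both traits
```

A SNP strongly associated with only one trait yields a large combined p-value and is not called pleiotropic, which is the intersection-null logic the abstract refers to; the conditional analysis additionally adjusts one subset of traits for another before forming the z-scores.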

  3. Comment on "SU(5) octet scalar at the LHC"

    NASA Astrophysics Data System (ADS)

    Doršner, Ilja

    2015-06-01

I address the validity of results presented in [S. Khalil, S. Salem, and M. Allam, Phys. Rev. D 89, 095011 (2014)] with regard to unification of gauge couplings within a particular SU(5) framework. The scalar sector of the proposed SU(5) model contains one 5-dimensional, one 24-dimensional, and one 45-dimensional representation. The authors discuss one specific unification scenario that supports the case for an LHC-accessible color octet scalar. I show that the unification analysis in question is based on (i) an erroneous assumption related to the issue of nucleon stability and (ii) an incorrect input for the applicable set of renormalization group equations. This, in my view, invalidates the aforementioned gauge coupling unification study. I also question a source of the fermion mass relations presented in that work.

  4. Custom-oriented wavefront sensor for human eye properties measurements

    NASA Astrophysics Data System (ADS)

    Galetskiy, Sergey; Letfullin, Renat; Dubinin, Alex; Cherezova, Tatyana; Belyakov, Alexey; Kudryashov, Alexis

    2005-12-01

The problem of correctly measuring human eye aberrations has become very important with the growing prevalence of LASIK (laser-assisted in situ keratomileusis), a surgical procedure for reducing refractive error in the eye. In this paper we demonstrate the capability to measure aberrations by means of the aberrometer built in our lab together with Active Optics Ltd. We discuss the calibration of the aberrometer and show that the analytical equation based on the thin-lens formula is invalid for ophthalmic calibration purposes. We show that a proper analytical equation suitable for calibration should depend on the square of the distance increment, and we illustrate this both by experiment and by Zemax ray-tracing modeling. We also discuss the error caused by the inhomogeneous intensity distribution of the beam imaged onto the aberrometer's Shack-Hartmann sensor.

  5. Does overprotection cause cardiac invalidism after acute myocardial infarction?

    PubMed

    Riegel, B J; Dracup, K A

    1992-01-01

To determine whether overprotection on the part of the patient's family and friends contributes to the development of cardiac invalidism after acute myocardial infarction. Longitudinal survey. Nine hospitals in the southwestern United States. One hundred eleven patients who had experienced a first acute myocardial infarction. Subjects were predominantly male, older, married, Caucasian, and in functional class I. Eighty-one patients characterized themselves as being overprotected (i.e., receiving more social support from family and friends than desired), and 28 reported receiving inadequate support. Only two patients reported receiving as much support as they desired. Self-esteem, emotional distress, health perceptions, interpersonal dependency, return to work. Overprotected patients experienced less anxiety, depression, anger, and confusion, and reported more vigor and higher self-esteem than inadequately supported patients 1 month after myocardial infarction (p < 0.05). Inadequately supported patients were more dependent 4 months after the event. Overprotection on the part of family and friends may facilitate psychosocial adjustment in the early months after an acute myocardial infarction rather than lead to cardiac invalidism.

  6. Age-dependent impairment of auditory processing under spatially focused and divided attention: an electrophysiological study.

    PubMed

    Wild-Wall, Nele; Falkenstein, Michael

    2010-01-01

Using event-related potentials (ERPs), the present study examines whether age-related differences in preparation and processing emerge especially during divided attention. Binaurally presented auditory cues called for focused (valid and invalid) or divided attention to one or both ears. Responses were required to subsequent monaurally presented valid targets (vowels), but had to be suppressed to non-target vowels or invalidly cued vowels. Middle-aged participants were more impaired under divided attention than young ones, likely due to an age-related decline in preparatory attention following cues, as reflected in a decreased CNV. Under divided attention, target processing was increased in the middle-aged, likely reflecting compensatory effort to meet task requirements in the difficult condition. Additionally, middle-aged participants processed invalidly cued stimuli more intensely, as reflected by stimulus ERPs. The results suggest an age-related impairment in attentional preparation after auditory cues, especially under divided attention, and latent difficulties in suppressing irrelevant information.

  7. Deformable segmentation of 3D MR prostate images via distributed discriminative dictionary and ensemble learning

    PubMed Central

    Guo, Yanrong; Gao, Yaozong; Shao, Yeqin; Price, True; Oto, Aytekin; Shen, Dinggang

    2014-01-01

    Purpose: Automatic prostate segmentation from MR images is an important task in various clinical applications such as prostate cancer staging and MR-guided radiotherapy planning. However, the large appearance and shape variations of the prostate in MR images make the segmentation problem difficult to solve. Traditional Active Shape/Appearance Model (ASM/AAM) has limited accuracy on this problem, since its basic assumption, i.e., both shape and appearance of the targeted organ follow Gaussian distributions, is invalid in prostate MR images. To this end, the authors propose a sparse dictionary learning method to model the image appearance in a nonparametric fashion and further integrate the appearance model into a deformable segmentation framework for prostate MR segmentation. Methods: To drive the deformable model for prostate segmentation, the authors propose nonparametric appearance and shape models. The nonparametric appearance model is based on a novel dictionary learning method, namely distributed discriminative dictionary (DDD) learning, which is able to capture fine distinctions in image appearance. To increase the differential power of traditional dictionary-based classification methods, the authors' DDD learning approach takes three strategies. First, two dictionaries for prostate and nonprostate tissues are built, respectively, using the discriminative features obtained from minimum redundancy maximum relevance feature selection. Second, linear discriminant analysis is employed as a linear classifier to boost the optimal separation between prostate and nonprostate tissues, based on the representation residuals from sparse representation. Third, to enhance the robustness of the authors' classification method, multiple local dictionaries are learned for local regions along the prostate boundary (each with small appearance variations), instead of learning one global classifier for the entire prostate. 
These discriminative dictionaries are located on different patches of the prostate surface and trained to adaptively capture the appearance in different prostate zones, thus achieving better local tissue differentiation. For each local region, multiple classifiers are trained based on the randomly selected samples and finally assembled by a specific fusion method. In addition to this nonparametric appearance model, a prostate shape model is learned from the shape statistics using a novel approach, sparse shape composition, which can model nonGaussian distributions of shape variation and regularize the 3D mesh deformation by constraining it within the observed shape subspace. Results: The proposed method has been evaluated on two datasets consisting of T2-weighted MR prostate images. For the first (internal) dataset, the classification effectiveness of the authors' improved dictionary learning has been validated by comparing it with three other variants of traditional dictionary learning methods. The experimental results show that the authors' method yields a Dice Ratio of 89.1% compared to the manual segmentation, which is more accurate than the three state-of-the-art MR prostate segmentation methods under comparison. For the second dataset, the MICCAI 2012 challenge dataset, the authors' proposed method yields a Dice Ratio of 87.4%, which also achieves better segmentation accuracy than other methods under comparison. Conclusions: A new magnetic resonance image prostate segmentation method is proposed based on the combination of deformable model and dictionary learning methods, which achieves more accurate segmentation performance on prostate T2 MR images. PMID:24989402
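The residual-based classification at the core of the DDD scheme can be sketched as follows. Plain least squares stands in for true sparse coding here, and the dimensions, dictionaries, and data are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the discriminative-dictionary idea: one dictionary per
# tissue class; a feature vector is assigned to the class whose
# dictionary reconstructs it with the smaller residual. Plain least
# squares replaces true sparse coding, and all sizes are hypothetical.
d, k = 20, 8                                # feature dim, atoms per dictionary
D_prostate = rng.normal(size=(d, k))
D_background = rng.normal(size=(d, k))

def residual(D, x):
    """Norm of the reconstruction residual of x in the span of D."""
    coef, *_ = np.linalg.lstsq(D, x, rcond=None)
    return np.linalg.norm(x - D @ coef)

def classify(x):
    return ("prostate" if residual(D_prostate, x) < residual(D_background, x)
            else "background")

# A vector synthesized from the prostate dictionary is reconstructed
# almost exactly by it, but not by the background dictionary.
x = D_prostate @ rng.normal(size=k)
```

In the paper this comparison is made per local patch along the prostate boundary, with LDA on the residuals and an ensemble of such classifiers, rather than the single global pair of dictionaries shown here.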

  8. Deformable segmentation of 3D MR prostate images via distributed discriminative dictionary and ensemble learning.

    PubMed

    Guo, Yanrong; Gao, Yaozong; Shao, Yeqin; Price, True; Oto, Aytekin; Shen, Dinggang

    2014-07-01

    Automatic prostate segmentation from MR images is an important task in various clinical applications such as prostate cancer staging and MR-guided radiotherapy planning. However, the large appearance and shape variations of the prostate in MR images make the segmentation problem difficult to solve. Traditional Active Shape/Appearance Model (ASM/AAM) has limited accuracy on this problem, since its basic assumption, i.e., both shape and appearance of the targeted organ follow Gaussian distributions, is invalid in prostate MR images. To this end, the authors propose a sparse dictionary learning method to model the image appearance in a nonparametric fashion and further integrate the appearance model into a deformable segmentation framework for prostate MR segmentation. To drive the deformable model for prostate segmentation, the authors propose nonparametric appearance and shape models. The nonparametric appearance model is based on a novel dictionary learning method, namely distributed discriminative dictionary (DDD) learning, which is able to capture fine distinctions in image appearance. To increase the differential power of traditional dictionary-based classification methods, the authors' DDD learning approach takes three strategies. First, two dictionaries for prostate and nonprostate tissues are built, respectively, using the discriminative features obtained from minimum redundancy maximum relevance feature selection. Second, linear discriminant analysis is employed as a linear classifier to boost the optimal separation between prostate and nonprostate tissues, based on the representation residuals from sparse representation. Third, to enhance the robustness of the authors' classification method, multiple local dictionaries are learned for local regions along the prostate boundary (each with small appearance variations), instead of learning one global classifier for the entire prostate. 
These discriminative dictionaries are located on different patches of the prostate surface and trained to adaptively capture the appearance in different prostate zones, thus achieving better local tissue differentiation. For each local region, multiple classifiers are trained based on the randomly selected samples and finally assembled by a specific fusion method. In addition to this nonparametric appearance model, a prostate shape model is learned from the shape statistics using a novel approach, sparse shape composition, which can model nonGaussian distributions of shape variation and regularize the 3D mesh deformation by constraining it within the observed shape subspace. The proposed method has been evaluated on two datasets consisting of T2-weighted MR prostate images. For the first (internal) dataset, the classification effectiveness of the authors' improved dictionary learning has been validated by comparing it with three other variants of traditional dictionary learning methods. The experimental results show that the authors' method yields a Dice Ratio of 89.1% compared to the manual segmentation, which is more accurate than the three state-of-the-art MR prostate segmentation methods under comparison. For the second dataset, the MICCAI 2012 challenge dataset, the authors' proposed method yields a Dice Ratio of 87.4%, which also achieves better segmentation accuracy than other methods under comparison. A new magnetic resonance image prostate segmentation method is proposed based on the combination of deformable model and dictionary learning methods, which achieves more accurate segmentation performance on prostate T2 MR images.

  9. Analysis of Parasite and Other Skewed Counts

    PubMed Central

    Alexander, Neal

    2012-01-01

Objective: To review methods for the statistical analysis of parasite and other skewed count data. Methods: Statistical methods for skewed count data are described and compared, with reference to those used over a ten-year period of Tropical Medicine and International Health. Two parasitological datasets are used for illustration. Results: Ninety papers were identified, 89 with descriptive and 60 with inferential analysis. A lack of clarity is noted in identifying measures of location, in particular the Williams and geometric mean. The different measures are compared, emphasizing the legitimacy of the arithmetic mean for skewed data. In the published papers, the t test and related methods were often used on untransformed data, which is likely to be invalid. Several approaches to inferential analysis are described, emphasizing 1) non-parametric methods, while noting that they are not simply comparisons of medians, and 2) generalized linear modelling, in particular with the negative binomial distribution. Additional methods, such as the bootstrap, with potential for greater use are described. Conclusions: Clarity is recommended when describing transformations and measures of location. It is suggested that non-parametric methods and generalized linear models are likely to be sufficient for most analyses. PMID:22943299
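The measures of location discussed in this record are easy to compare directly. A minimal sketch in plain Python, using illustrative right-skewed counts rather than the paper's datasets:

```python
import math

def arithmetic_mean(xs):
    return sum(xs) / len(xs)

def geometric_mean(xs):
    # Defined only for strictly positive values.
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

def williams_mean(xs):
    # Williams mean: geometric mean of (x + 1), minus 1.
    # Handles zero counts, unlike the plain geometric mean.
    return math.exp(sum(math.log(x + 1) for x in xs) / len(xs)) - 1

# Hypothetical parasite-style counts: many low values, a long right tail.
counts = [0, 0, 1, 2, 3, 5, 8, 40, 120]
```

For these counts the arithmetic mean (~19.9) sits far above the Williams mean (~4.7), which is the kind of gap that makes it important to state which measure of location a paper is reporting.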

  10. A Phenomenology Examining High School Teachers' Perceptions of the Effects of Standards-Based Grading on Planning, Instruction, Assessment, Classroom Environment, and Students' Characteristics and Behaviors

    ERIC Educational Resources Information Center

    Knight, Megan E.

    2017-01-01

Today's grading practices mirror those of the early 1900s, and despite myriad studies suggesting they are invalid, unreliable, and a hindrance to student learning, many teachers continue detrimental practices such as using 100-point percentage scales, averaging all academic and nonacademic factors together into a single grade, and using grades to…

  11. Securing BGP Using External Security Monitors

    DTIC Science & Technology

    2006-01-01

forms. In Proc. SOSP, Brighton, UK, Oct. 2005. [19] A. Seshadri, A. Perrig, L. van Doorn, and P. Khosla. SWATT: Software-based Attestation for... Williams, E. G. Sirer, and F. B. Schneider. Nexus: A New Operating System for Trustworthy Computing (extended abstract). In Proc. SOSP, Brighton, UK ...as a distributed database of untrustworthy hosts or messages. An ESM that detects invalid behavior issues a certificate describing the behavior or

  12. Redundancy Analysis of Capacitance Data of a Coplanar Electrode Array for Fast and Stable Imaging Processing

    PubMed Central

    Wen, Yintang; Zhang, Zhenda; Zhang, Yuyan; Sun, Dongtao

    2017-01-01

A coplanar electrode array sensor is established for imaging defects in composite-material adhesive layers. The sensor is based on the capacitive edge effect, which makes the capacitance data considerably weak and susceptible to environmental noise. The inverse problem of coplanar array electrical capacitance tomography (C-ECT) is ill-conditioned, in which a small error in the capacitance data can seriously affect the quality of reconstructed images. In order to achieve a stable image reconstruction process, a redundancy analysis method for capacitance data is proposed. The proposed method is based on contribution rate and anti-interference capability. According to the redundancy analysis, the capacitance data are divided into valid and invalid data. When the image is reconstructed from valid data only, the sensitivity matrix needs to be changed accordingly. In order to evaluate the effectiveness of the sensitivity map, singular value decomposition (SVD) is used. Finally, the two-dimensional (2D) and three-dimensional (3D) images are reconstructed by the Tikhonov regularization method. Compared with images reconstructed from the raw capacitance data, the stability of the image reconstruction process is improved, and the quality of reconstructed images is not degraded. As a result, much of the invalid data need not be collected, and the data acquisition time can be reduced. PMID:29295537
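The role of Tikhonov regularization in an ill-conditioned inversion can be illustrated on a toy nearly-singular system. This is a generic sketch of the technique, not the C-ECT implementation; the hypothetical 2x2 matrix stands in for the sensitivity matrix:

```python
def tikhonov_solve_2x2(A, b, lam):
    """Solve min ||Ax - b||^2 + lam^2 ||x||^2 for a 2x2 system via the
    normal equations (A'A + lam^2 I) x = A'b."""
    m00 = A[0][0] ** 2 + A[1][0] ** 2 + lam ** 2
    m01 = A[0][0] * A[0][1] + A[1][0] * A[1][1]
    m11 = A[0][1] ** 2 + A[1][1] ** 2 + lam ** 2
    c0 = A[0][0] * b[0] + A[1][0] * b[1]
    c1 = A[0][1] * b[0] + A[1][1] * b[1]
    det = m00 * m11 - m01 * m01
    return [(m11 * c0 - m01 * c1) / det, (m00 * c1 - m01 * c0) / det]

# Nearly singular system: a tiny error in the data swings the
# unregularized solution wildly, while regularization keeps it stable.
A = [[1.0, 1.0], [1.0, 1.0001]]
b_noisy = [2.0, 2.0002]  # exact data [2.0, 2.0001] corresponds to x = [1, 1]
x_raw = tikhonov_solve_2x2(A, b_noisy, 0.0)   # blows up to roughly [0, 2]
x_reg = tikhonov_solve_2x2(A, b_noisy, 0.01)  # stays near [1, 1]
```

This is the same stabilizing effect the paper seeks; the redundancy analysis additionally removes the weak (invalid) measurements before the inversion is performed.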

  13. Saccadic eye movements do not disrupt the deployment of feature-based attention.

    PubMed

    Kalogeropoulou, Zampeta; Rolfs, Martin

    2017-07-01

    The tight link of saccades to covert spatial attention has been firmly established, yet their relation to other forms of visual selection remains poorly understood. Here we studied the temporal dynamics of feature-based attention (FBA) during fixation and across saccades. Participants reported the orientation (on a continuous scale) of one of two sets of spatially interspersed Gabors (black or white). We tested performance at different intervals between the onset of a colored cue (black or white, indicating which stimulus was the most probable target; red: neutral condition) and the stimulus. FBA built up after cue onset: Benefits (errors for valid vs. neutral cues), costs (invalid vs. neutral), and the overall cueing effect (valid vs. invalid) increased with the cue-stimulus interval. Critically, we also tested visual performance at different intervals after a saccade, when FBA had been fully deployed before saccade initiation. Cueing effects were evident immediately after the saccade and were predicted most accurately and most precisely by fully deployed FBA, indicating that FBA was continuous throughout saccades. Finally, a decomposition of orientation reports into target reports and random guesses confirmed continuity of report precision and guess rates across the saccade. We discuss the role of FBA in perceptual continuity across saccades.

  14. Does the Current Minimum Validate (or Invalidate) Cycle Prediction Methods?

    NASA Technical Reports Server (NTRS)

    Hathaway, David H.

    2010-01-01

This deep, extended solar minimum and the slow start to Cycle 24 strongly suggest that Cycle 24 will be a small cycle. A wide array of solar cycle prediction techniques have been applied to predicting the amplitude of Cycle 24, with widely different results. Current conditions and new observations indicate that some highly regarded techniques now appear to have doubtful utility. Geomagnetic precursors have been reliable in the past and can be tested with 12 cycles of data. Of the three primary geomagnetic precursors, only one (the minimum level of geomagnetic activity) suggests a small cycle. The Sun's polar field strength has also been used to successfully predict the last three cycles. The current weak polar fields are indicative of a small cycle. For the first time, dynamo models have been used to predict the size of a solar cycle, but with opposite predictions depending on the model and the data assimilation. However, new measurements of the surface meridional flow indicate that the flow was substantially faster on the approach to Cycle 24 minimum than at Cycle 23 minimum. In both dynamo predictions, a faster meridional flow should have given a shorter Cycle 23 with stronger polar fields. This suggests that these dynamo models are not yet ready for solar cycle prediction.

  15. Illness invalidation from spouse and family is associated with depression in diabetic patients with first superficial diabetic foot ulcers.

    PubMed

    Sehlo, Mohammad G; Alzahrani, Owiss H; Alzahrani, Hasan A

(1) To assess the prevalence of depressive disorders in a sample of diabetic patients with their first superficial diabetic foot ulcer. (2) To evaluate the association between illness invalidation from spouse and family and depressive disorders in those patients. Depressive disorders and their severity were diagnosed by the Structured Clinical Interview for DSM-IV Axis I disorders, clinical version, and invalidation was measured with the spouse and family scales of the Illness Invalidation Inventory (3*I). Physical functioning was also assessed using the Physical Component of the Short Form 36 item health-related quality of life questionnaire. The prevalence of depressive disorders was 27.50% (22/80). There was a significant decrease in physical health component summary mean score and a significant increase in ulcer size, Center for Epidemiologic Studies-Depression Scale, spouse discounting, spouse lack of understanding, and family discounting mean scores in the depressed group compared to the non-depressed group. Higher levels of spouse discounting, spouse lack of understanding, and family discounting were significant predictors of a diagnosis of depressive disorders and were strongly associated with increased severity of depressive symptoms in diabetic patients with first superficial diabetic foot ulcers. Poor physical functioning was associated with increased depressive symptom severity. This study demonstrated that illness invalidation from spouse and family is associated with diagnosis of depressive disorders and increased severity of depressive symptoms in diabetic patients with first superficial diabetic foot ulcers.

  16. Which Factors Contribute to False-Positive, False-Negative, and Invalid Results in Fetal Fibronectin Testing in Women with Symptoms of Preterm Labor?

    PubMed

    Bruijn, Merel M C; Hermans, Frederik J R; Vis, Jolande Y; Wilms, Femke F; Oudijk, Martijn A; Kwee, Anneke; Porath, Martina M; Oei, Guid; Scheepers, Hubertina C J; Spaanderman, Marc E A; Bloemenkamp, Kitty W M; Haak, Monique C; Bolte, Antoinette C; Vandenbussche, Frank P H A; Woiski, Mallory D; Bax, Caroline J; Cornette, Jérôme M J; Duvekot, Johannes J; Bijvank, Bas W A N I J; van Eyck, Jim; Franssen, Maureen T M; Sollie, Krystyna M; van der Post, Joris A M; Bossuyt, Patrick M M; Kok, Marjolein; Mol, Ben W J; van Baaren, Gert-Jan

    2017-02-01

Objective: We assessed the influence of external factors on false-positive, false-negative, and invalid fibronectin results in the prediction of spontaneous delivery within 7 days. Methods: We studied symptomatic women between 24 and 34 weeks' gestational age. We performed uni- and multivariable logistic regression to estimate the effect of external factors (vaginal soap, digital examination, transvaginal sonography, sexual intercourse, vaginal bleeding) on the risk of false-positive, false-negative, and invalid results, using spontaneous delivery within 7 days as the outcome. Results: Out of 708 women, 237 (33%) had a false-positive result; none of the factors showed a significant association. Vaginal bleeding increased the proportion of positive fetal fibronectin (fFN) results, but was significantly associated with a lower risk of false-positive test results (odds ratio [OR], 0.22; 95% confidence interval [CI], 0.12-0.39). Ten women (1%) had a false-negative result. None of the investigated factors was significantly associated with a higher risk of false-negative results. Twenty-one tests (3%) were invalid; only vaginal bleeding showed a significant association (OR, 4.5; 95% CI, 1.7-12). Conclusion: The effect of external factors on the performance of qualitative fFN testing is limited, with vaginal bleeding as the only factor that reduces its validity.

  17. Performance evaluation of the Aptima HSV-1 and 2 assay for the detection of HSV in cutaneous and mucocutaneous lesion specimens.

    PubMed

    Sam, Soya S; Caliendo, Angela M; Ingersoll, Jessica; Abdul-Ali, Deborah; Kraft, Colleen S

Timely and precise laboratory diagnosis of herpes simplex virus (HSV) infection is required to guide clinical management. This study evaluated the limit of detection (LOD) and performance characteristics of the Aptima HSV-1 & 2 assay in comparison to four other assays. The multi-center study compared qualitative detection of HSV-1 and 2 by the Aptima HSV-1 and 2 assay (Hologic) to ELVIS culture, Lyra Direct (Quidel), AmpliVue (Quidel), and a laboratory developed test (LDT). LOD was determined using viral dilutions in VTM and STM, and clinical performance was evaluated using 505 swab specimens. The Aptima LOD studies showed a lower detection limit for STM specimens of 1450 copies/mL for HSV-1 and 430 copies/mL for HSV-2; the LOD for VTM specimens was 9370 copies/mL for HSV-1 and 8045 copies/mL for HSV-2. When the assays were analyzed against the established positive consensus result, the Aptima assay had a positive percent agreement (PPA) of 95% and a negative percent agreement (NPA) of 100% for HSV-1. For HSV-2, the PPA and NPA for Aptima were 96% and 100%, respectively. AmpliVue had an invalid rate of 1.8%, while Lyra had no invalid results but an inhibition rate of 0.8%. Aptima and the LDT did not have any invalid or inhibited results. The results indicate that the Aptima HSV-1 & 2 assay is sensitive and that its performance characteristics are comparable to those of the assays analyzed for the detection and differentiation of HSV-1 and 2 from cutaneous and mucocutaneous lesions.

  18. An Artificial Turf-Based Surrogate Surface Collector for the Direct Measurement of Atmospheric Mercury Dry Deposition

    PubMed Central

    Hall, Naima L.; Dvonch, Joseph Timothy; Marsik, Frank J.; Barres, James A.; Landis, Matthew S.

    2017-01-01

    This paper describes the development of a new artificial turf surrogate surface (ATSS) sampler for use in the measurement of mercury (Hg) dry deposition. In contrast to many existing surrogate surface designs, the ATSS utilizes a three-dimensional deposition surface that may more closely mimic the physical structure of many natural surfaces than traditional flat surrogate surface designs (water, filter, greased Mylar film). The ATSS has been designed to overcome several complicating factors that can impact the integrity of samples with other direct measurement approaches by providing a passive system which can be deployed for both short and extended periods of time (days to weeks), and is not contaminated by precipitation and/or invalidated by strong winds. Performance characteristics including collocated precision, in-field procedural and laboratory blanks were evaluated. The results of these performance evaluations included a mean collocated precision of 9%, low blanks (0.8 ng), high extraction efficiency (97%–103%), and a quantitative matrix spike recovery (100%). PMID:28208603

  19. Two-dimensional non-volatile programmable p-n junctions

    NASA Astrophysics Data System (ADS)

    Li, Dong; Chen, Mingyuan; Sun, Zhengzong; Yu, Peng; Liu, Zheng; Ajayan, Pulickel M.; Zhang, Zengxing

    2017-09-01

Semiconductor p-n junctions are the elementary building blocks of most electronic and optoelectronic devices. The need for their miniaturization has fuelled the rapid growth of interest in two-dimensional (2D) materials. However, the performance of a p-n junction considerably degrades as its thickness approaches a few nanometres and traditional technologies, such as doping and implantation, become invalid at the nanoscale. Here we report stable non-volatile programmable p-n junctions fabricated from the vertically stacked all-2D semiconductor/insulator/metal layers (WSe2/hexagonal boron nitride/graphene) in a semifloating gate field-effect transistor configuration. The junction exhibits a good rectifying behaviour with a rectification ratio of 10^4 and photovoltaic properties with a power conversion efficiency up to 4.1% under a 6.8 nW light. Based on the non-volatile programmable properties controlled by gate voltages, the 2D p-n junctions have been exploited for various electronic and optoelectronic applications, such as memories, photovoltaics, logic rectifiers and logic optoelectronic circuits.

  20. Two-dimensional non-volatile programmable p-n junctions.

    PubMed

    Li, Dong; Chen, Mingyuan; Sun, Zhengzong; Yu, Peng; Liu, Zheng; Ajayan, Pulickel M; Zhang, Zengxing

    2017-09-01

Semiconductor p-n junctions are the elementary building blocks of most electronic and optoelectronic devices. The need for their miniaturization has fuelled the rapid growth of interest in two-dimensional (2D) materials. However, the performance of a p-n junction considerably degrades as its thickness approaches a few nanometres and traditional technologies, such as doping and implantation, become invalid at the nanoscale. Here we report stable non-volatile programmable p-n junctions fabricated from the vertically stacked all-2D semiconductor/insulator/metal layers (WSe2/hexagonal boron nitride/graphene) in a semifloating gate field-effect transistor configuration. The junction exhibits a good rectifying behaviour with a rectification ratio of 10^4 and photovoltaic properties with a power conversion efficiency up to 4.1% under a 6.8 nW light. Based on the non-volatile programmable properties controlled by gate voltages, the 2D p-n junctions have been exploited for various electronic and optoelectronic applications, such as memories, photovoltaics, logic rectifiers and logic optoelectronic circuits.

  1. Pediatric Sleep Questionnaires as Diagnostic or Epidemiological Tools: A Review of Currently Available Instruments

    PubMed Central

    Spruyt, Karen; Gozal, David

    2010-01-01

An extensive list of published and unpublished instruments used to investigate or evaluate sleep issues in children was collected and assessed based on the fundamental operational principles of instrument development (11 steps). Of all the available tools identified, only a few were validated and standardized using appropriate psychometric criteria. In fact, only 2 fulfill all desirable criteria and approximately 11 instruments seem to adhere to most of the psychometric tool development requirements, and were therefore assessed in greater detail. Nevertheless, in the rapidly developing scientific world of pediatric sleep, there are too many tools being used that have not undergone careful and methodical psychometric evaluation, and as such may be fraught with biased or invalid findings. It is hoped that this initial effort in categorizing and assessing available tools for pediatric sleep will serve as recognition of the relatively early developmental stage of our field, and provide the necessary impetus for future tool development using multicentered approaches and adequate methodologies. PMID:20934896

  2. Some Behaviorial Science Measurement Concerns and Proposals.

    PubMed

    Nesselroade, John R; Molenaar, Peter C M

    2016-01-01

Primarily from a measurement standpoint, we question some basic beliefs and procedures characterizing the scientific study of human behavior. The relations between observed and unobserved variables are key to an empirical approach to building explanatory theories, and we are especially concerned about how the former are used as proxies for the latter. We believe that behavioral science can profitably reconsider the prevailing version of this arrangement because of its vulnerability to limiting idiosyncratic aspects of observed/unobserved variable relations. We describe a general measurement approach that takes into account idiosyncrasies that should be irrelevant to the measurement process but can intrude and may invalidate it in ways that distort and weaken relations among theoretically important variables. To clarify our major concerns further, we briefly describe one version of the measurement approach that fundamentally supports the individual-as-the-primary-unit-of-analysis orientation that we believe should be preeminent in the scientific study of human behavior.

  3. Feedback-tuned, noise resilient gates for encoded spin qubits

    NASA Astrophysics Data System (ADS)

    Bluhm, Hendrik

Spin 1/2 particles form native two-level systems and thus lend themselves as a natural qubit implementation. However, encoding a single qubit in several spins entails benefits, such as reducing the resources necessary for qubit control and protection from certain decoherence channels. While several varieties of such encoded spin qubits have been implemented, accurate control remains challenging, and leakage out of the subspace of valid qubit states is a potential issue. Optimal performance typically requires large pulse amplitudes for fast control, which is prone to systematic errors and prohibits standard control approaches based on Rabi flopping. Furthermore, the exchange interaction typically used to electrically manipulate encoded spin qubits is inherently sensitive to charge noise. I will discuss all-electrical, high-fidelity single qubit operations for a spin qubit encoded in two electrons in a GaAs double quantum dot. Starting from a set of numerically optimized control pulses, we employ an iterative tuning procedure based on measured error syndromes to remove systematic errors. Randomized benchmarking yields an average gate fidelity exceeding 98% and a leakage rate into invalid states of 0.2%. These gates exhibit a certain degree of resilience to both slow charge and nuclear spin fluctuations due to dynamical correction analogous to a spin echo. Furthermore, the numerical optimization minimizes the impact of fast charge noise. Both types of noise make relevant contributions to gate errors. The general approach is also adaptable to other qubit encodings and exchange-based two-qubit gates.
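Randomized-benchmarking fidelities like the one quoted above come from fitting an exponential decay of sequence fidelity. A minimal sketch of the standard single-qubit model, with synthetic numbers chosen near the regime the abstract reports (not data from this work; in practice A, B, and p are all fit jointly over many sequence lengths):

```python
def rb_fidelity(p):
    """Average gate fidelity from the RB depolarizing parameter p for a
    single qubit (d = 2): F_avg = 1 - (1 - p) * (d - 1) / d."""
    d = 2
    return 1 - (1 - p) * (d - 1) / d

def survival(m, p, A=0.5, B=0.5):
    """Standard RB decay model: sequence fidelity F(m) = A * p**m + B."""
    return A * p ** m + B

# Recover p from two sequence lengths, assuming A and B are known:
p_true = 0.96
f10, f20 = survival(10, p_true), survival(20, p_true)
p_est = ((f20 - 0.5) / (f10 - 0.5)) ** (1 / 10)
```

With p = 0.96 the model gives an average gate fidelity of 0.98, i.e. the two-point ratio isolates the per-gate depolarization from state preparation and measurement errors absorbed into A and B.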

  4. Modality, probability, and mental models.

    PubMed

    Hinterecker, Thomas; Knauff, Markus; Johnson-Laird, P N

    2016-10-01

We report 3 experiments investigating novel sorts of inference, such as: A or B or both; therefore, possibly (A and B). The contents were sensible assertions, for example, "Space tourism will achieve widespread popularity in the next 50 years or advances in material science will lead to the development of antigravity materials in the next 50 years, or both." Most participants accepted the inferences as valid, though they are invalid in modal logic and in probabilistic logic too. But the theory of mental models predicts that individuals should accept them. In contrast, inferences of this sort (A or B but not both; therefore, A or B or both) are both logically valid and probabilistically valid. Yet, as the model theory also predicts, most reasoners rejected them. The participants' estimates of probabilities showed that their inferences tended not to be based on probabilistic validity, but that they did rate acceptable conclusions as more probable than unacceptable conclusions. We discuss the implications of the results for current theories of reasoning.
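The classical (non-modal) core of these two inference forms can be checked by brute-force truth-table enumeration; the modal operator "possibly" itself is outside this sketch:

```python
from itertools import product

def entails(premise, conclusion):
    """Classical entailment: the conclusion holds in every truth
    assignment that satisfies the premise."""
    return all(conclusion(a, b)
               for a, b in product([False, True], repeat=2)
               if premise(a, b))

inclusive_or = lambda a, b: a or b       # "A or B or both"
conjunction = lambda a, b: a and b       # "A and B"
exclusive_or = lambda a, b: a != b       # "A or B but not both"

# "A or B or both" does NOT entail "A and B" (witness: A true, B false),
# while "A or B but not both" DOES entail "A or B or both".
```

The first non-entailment is why concluding even "possibly (A and B)" from an inclusive disjunction is invalid in probabilistic logic: the premise is compatible with P(A and B) = 0.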

  5. ScaleNet: a literature-based model of scale insect biology and systematics

    PubMed Central

    García Morales, Mayrolin; Denno, Barbara D.; Miller, Douglass R.; Miller, Gary L.; Ben-Dov, Yair; Hardy, Nate B.

    2016-01-01

    Scale insects (Hemiptera: Coccoidea) are small herbivorous insects found on all continents except Antarctica. They are extremely invasive, and many species are serious agricultural pests. They are also emerging models for studies of the evolution of genetic systems, endosymbiosis and plant-insect interactions. ScaleNet was launched in 1995 to provide insect identifiers, pest managers, insect systematists, evolutionary biologists and ecologists efficient access to information about scale insect biological diversity. It provides comprehensive information on scale insects taken directly from the primary literature. Currently, it draws from 23 477 articles and describes the systematics and biology of 8194 valid species. For 20 years, ScaleNet ran on the same software platform. That platform is no longer viable. Here, we present a new, open-source implementation of ScaleNet. We have normalized the data model, begun the process of correcting invalid data, upgraded the user interface, and added online administrative tools. These improvements make ScaleNet easier to use and maintain and make the ScaleNet data more accurate and extendable. Database URL: http://scalenet.info PMID:26861659

  6. Mesospheric ozone measurements by SAGE II

    NASA Technical Reports Server (NTRS)

    Chu, D. A.; Cunnold, D. M.

    1994-01-01

SAGE II observations of ozone at sunrise and sunset (solar zenith angle = 90 deg) at approximately the same tropical latitude and on the same day exhibit larger concentrations at sunrise than at sunset between 55 and 65 km. Because of the rapid conversion between atomic oxygen and ozone, the onion-peeling scheme used in SAGE II retrievals, which is based on an assumption of constant ozone, is invalid. A one-dimensional photochemical model is used to simulate the diurnal variation of ozone, particularly within the solar zenith angle range of 80 deg - 100 deg. This model indicates that the retrieved SAGE II sunrise and sunset ozone values are both overestimated. The Chapman reactions produce an adequate simulation of the ozone sunrise/sunset ratio only below 60 km, while above 60 km this ratio is highly affected by the odd oxygen loss due to odd hydrogen reactions, particularly OH. The SAGE II ozone measurements are in excellent agreement with model results to which an onion-peeling procedure is applied. The SAGE II ozone observations provide information on the mesospheric chemistry not only through the ozone profile averages but also from the sunrise/sunset ratio.

  7. Simulating Charge Transport in Solid Oxide Mixed Ionic and Electronic Conductors: Nernst-Planck Theory vs Modified Fick's Law

    DOE PAGES

    Jin, Xinfang; White, Ralph E.; Huang, Kevin

    2016-10-04

With the assumption that the Fermi level (electrochemical potential of electrons) is uniform across the thickness of a mixed ionic and electronic conducting (MIEC) electrode, the charge-transport model in the electrode domain can be reduced to the modified Fick's first law, which includes a thermodynamic factor A. A transient numerical solution of the Nernst-Planck theory was obtained for a symmetric cell with MIEC electrodes to illustrate the validity of the assumption of a uniform Fermi level. Subsequently, an impedance numerical solution based on the modified Fick's first law is compared with that from the Nernst-Planck theory. The results show that the Nernst-Planck charge-transport model is essentially the same as the modified Fick's first law model as long as the MIEC electrodes have a predominant electronic conductivity. However, because of the invalidity of the uniform Fermi level assumption for a MIEC electrolyte with a predominant ionic conductivity, Nernst-Planck theory is needed to describe the charge transport behaviors.

  8. Quantitative kinetic theory of active matter

    NASA Astrophysics Data System (ADS)

    Ihle, Thomas; Chou, Yen-Liang

    2014-03-01

Models of self-driven agents similar to the Vicsek model [Phys. Rev. Lett. 75 (1995) 1226] are studied by means of kinetic theory. In these models, particles try to align their travel directions with the average direction of their neighbours. At strong alignment a globally ordered state of collective motion forms. An Enskog-like kinetic theory is derived from the exact Chapman-Kolmogorov equation in phase space using Boltzmann's mean-field approximation of molecular chaos. The kinetic equation is solved numerically by a nonlocal Lattice-Boltzmann-like algorithm. Steep soliton-like waves are observed that lead to an abrupt jump of the global order parameter if the noise level is changed. The shape of the wave is shown to follow a novel scaling law and to quantitatively agree within 3% with agent-based simulations at large particle speeds. This provides a mean-field mechanism to change the second-order character of the flocking transition to first order. Diagrammatic techniques are used to investigate small particle speeds, where the mean-field assumption of molecular chaos is invalid and where correlation effects need to be included.
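For concreteness, the agent-based model that this kinetic theory coarse-grains can be sketched in a few lines. This is a toy version with hypothetical parameter values; the neighbour search ignores periodic wrap-around for brevity:

```python
import math
import random

def vicsek_step(pos, theta, r=1.0, eta=0.0, v=0.03, L=5.0):
    """One Vicsek update: each particle adopts the mean direction of
    neighbours within radius r, plus uniform angular noise of amplitude
    eta, then moves a distance v in a periodic box of size L."""
    new_theta = []
    for xi, yi in pos:
        sx = sy = 0.0
        for j, (xj, yj) in enumerate(pos):
            if (xi - xj) ** 2 + (yi - yj) ** 2 <= r * r:
                sx += math.cos(theta[j])
                sy += math.sin(theta[j])
        new_theta.append(math.atan2(sy, sx) + random.uniform(-eta / 2, eta / 2))
    new_pos = [((x + v * math.cos(t)) % L, (y + v * math.sin(t)) % L)
               for (x, y), t in zip(pos, new_theta)]
    return new_pos, new_theta

def order_parameter(theta):
    """Polar order: 1 = perfect alignment, near 0 = disorder."""
    return math.hypot(sum(math.cos(t) for t in theta),
                      sum(math.sin(t) for t in theta)) / len(theta)

random.seed(0)
pos0 = [(0.1 * i, 0.0) for i in range(5)]  # tight cluster, all mutual neighbours
theta0 = [0.0, 1.0, 2.0, -1.0, 0.5]        # initially disordered headings
pos1, theta1 = vicsek_step(pos0, theta0, eta=0.0)
```

With zero noise and everyone within interaction range, a single update drives the order parameter to 1; raising eta reintroduces the disorder whose abrupt change the abstract's kinetic theory analyses.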

  9. The utility of the NEO-PI-R validity scales to detect response distortion: a comparison with the MMPI-2.

    PubMed

    Morasco, Benjamin J; Gfeller, Jeffrey D; Elder, Katherine A

    2007-06-01

In this psychometric study, we compared the recently developed Validity Scales from the Revised NEO Personality Inventory (NEO-PI-R; Costa & McCrae, 1992b) with the MMPI-2 (Butcher, Dahlstrom, Graham, Tellegen, & Kaemmer, 1989) Validity Scales. We collected data from clients (n = 74) who completed comprehensive psychological evaluations at a university-based outpatient mental health clinic. Correlations between the Validity Scales of the NEO-PI-R and MMPI-2 were significant and in the expected directions. These relationships provide support for the convergent and discriminant validity of the NEO-PI-R Validity Scales. The percent agreement of invalid responding on the two measures was high, although the diagnostic agreement was modest (kappa = .22-.33). Finally, clients who responded in an invalid manner on the NEO-PI-R Validity Scales produced significantly different clinical profiles on the NEO-PI-R and MMPI-2 than clients with valid protocols. These results provide additional support for the clinical utility of the NEO-PI-R Validity Scales as indicators of response bias.

  10. Neutral location cues and cost/benefit analysis of visual attention shifts.

    PubMed

    Wright, R D; Richard, C M; McDonald, J J

    1995-12-01

    The effects of location cuing on target responses can be examined by comparing informative and neutral cuing conditions. In particular, the magnitudes of costs of invalid location cuing and of benefits of valid location cuing can be determined by comparing invalid and valid cue responses to location-nonspecific neutral cue responses. Cost/benefit analysis is based on the assumption that neutral baseline measures reflect a general warning effect about the impending target's onset but no other specific target information. The experiments we report were carried out to determine the appropriateness of two baseline measures for cost/benefit analyses of direct (nonsymbolic) location cuing effects. We found that a multiple-cue baseline attenuated the benefits of valid cuing, and that a background-flash baseline arbitrarily attenuated costs or benefits depending on flash intensity. It is proposed that a background flash is the more suitable neutral cue because it is target-location-nonspecific, but that its intensity should be adjusted to elicit a target-onset warning signal of the same magnitude as the location cues with which it will be compared.
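The cost/benefit decomposition described in this record is simple arithmetic, and it makes the record's point visible: the choice of neutral baseline moves the split between costs and benefits, while the overall valid-invalid difference is baseline-free. A sketch with illustrative (hypothetical) reaction times in ms:

```python
def cueing_costs_benefits(rt_valid, rt_neutral, rt_invalid):
    """Classic cost/benefit decomposition of location-cueing effects,
    using the neutral condition as the baseline."""
    benefit = rt_neutral - rt_valid  # speed-up from a valid cue
    cost = rt_invalid - rt_neutral   # slow-down from an invalid cue
    total = rt_invalid - rt_valid    # overall cueing effect = cost + benefit
    return benefit, cost, total
```

For example, cueing_costs_benefits(320, 350, 395) splits a 75 ms cueing effect into a 30 ms benefit and a 45 ms cost; shifting the neutral baseline (as an ill-chosen flash intensity would) reallocates between the two terms without changing the 75 ms total.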

  11. A New Viewpoint (The expanding universe, Dark energy and Dark matter)

    NASA Astrophysics Data System (ADS)

    Cwele, Daniel

    2011-10-01

    Just as the relativity paradox once threatened the validity of physics in Albert Einstein's days, the cosmos paradox, the galaxy rotation paradox and the experimental invalidity of the theory of dark matter and dark energy threaten the stability and validity of physics today. These theories and ideas and many others, including the Big Bang theory, all depend almost entirely on the notion of the expanding universe, Edwin Hubble's observations and reports and the observational inconsistencies of modern day theoretical Physics and Astrophysics on related subjects. However, much of the evidence collected in experimental Physics and Astronomy aimed at proving many of these ideas and theories is ambiguous, and can be used to prove other theories, given a different interpretation of its implications. The argument offered here is aimed at providing one such interpretation, attacking the present day theories of dark energy, dark matter and the Big Bang, and proposing a new Cosmological theory based on a modification of Isaac Newton's laws and an expansion on Albert Einstein's theories, without assuming any invalidity or questionability on present day cosmological data and astronomical observations.

  12. Comparison of two weighted integration models for the cueing task: linear and likelihood

    NASA Technical Reports Server (NTRS)

    Shimozaki, Steven S.; Eckstein, Miguel P.; Abbey, Craig K.

    2003-01-01

    In a task in which the observer must detect a signal at two locations, presenting a precue that predicts the location of a signal leads to improved performance with a valid cue (signal location matches the cue), compared to an invalid cue (signal location does not match the cue). The cue validity effect has often been explained with a limited capacity attentional mechanism improving the perceptual quality at the cued location. Alternatively, the cueing effect can also be explained by unlimited capacity models that assume a weighted combination of noisy responses across the two locations. We compare two weighted integration models, a linear model and a sum of weighted likelihoods model based on a Bayesian observer. While qualitatively these models are similar, quantitatively they predict different cue validity effects as the signal-to-noise ratios (SNRs) increase. To test these models, three observers performed a cued discrimination task with Gaussian targets and an 80% valid precue across a broad range of SNRs. A limited-capacity attentional switching model was also analyzed and rejected. The sum of weighted likelihoods model best described the psychophysical results, suggesting that human observers approximate a weighted combination of likelihoods, and not a weighted linear combination.
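    The two decision rules being contrasted can be sketched numerically. This is an illustrative sketch, not the authors' code: each location yields a noisy scalar response assumed unit-variance Gaussian with mean 0 (noise) or d (signal), and the 0.8 weight mirrors the 80% cue validity. The linear observer weights the raw responses; the likelihood observer weights the per-location likelihood ratios.

```python
import math

def linear_decision(r1, r2, w_cued=0.8):
    """Linear rule: weighted sum of the raw responses at the cued (r1)
    and uncued (r2) locations."""
    return w_cued * r1 + (1 - w_cued) * r2

def likelihood_decision(r1, r2, d=1.0, w_cued=0.8):
    """Weighted-likelihoods rule: weighted sum of likelihood ratios.
    Each response is N(0,1) under noise and N(d,1) under signal, so the
    likelihood ratio at response r is exp(d*r - d^2/2)."""
    lr = lambda r: math.exp(d * r - d * d / 2)
    return w_cued * lr(r1) + (1 - w_cued) * lr(r2)
```

The two rules agree qualitatively (cued evidence counts more) but diverge as d grows, because the exponential likelihood ratio amplifies strong responses; this is the quantitative handle the paper exploits across SNRs.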

  13. Vertebral Augmentation can Induce Early Signs of Degeneration in the Adjacent Intervertebral Disc: Evidence from a Rabbit Model.

    PubMed

    Feng, Zhiyun; Chen, Lunhao; Hu, Xiaojian; Yang, Ge; Wang, Yue; Chen, Zhong

    2018-04-11

    An experimental study. The aim of this study was to determine the effect of polymethylmethacrylate (PMMA) augmentation on the adjacent disc. Vertebral augmentation with PMMA reportedly may predispose the adjacent vertebra to fracture. The influence of PMMA augmentation on the adjacent disc, however, remains unclear. Using a retroperitoneal approach, PMMA augmentation was performed on 23 rabbits. For each animal, at least one vertebra was augmented with 0.2 to 0.3 mL PMMA. The disc adjacent to the augmented vertebra and a proximal control disc were studied using magnetic resonance (MR) imaging and histological and molecular-level evaluation at 1, 3, and 6 months postoperatively. Marrow contact channels in the endplate were quantified in histological slices, and the number of invalid channels (those without erythrocytes inside) was rated. Terminal deoxynucleotidyl transferase-mediated dUTP nick-end labeling (TUNEL) was performed to determine disc cell apoptosis. On MR images, the signal and height of the adjacent disc did not change 6 months after vertebral augmentation. Histological scores of the adjacent disc increased over time, particularly for the nucleus pulposus. The adjacent disc had a greater nucleus degeneration score than the control disc at 3 months (5.7 vs. 4.5, P < 0.01) and 6 months (6.9 vs. 4.4, P < 0.001). There were more invalid marrow contact channels in the endplate of the augmented vertebra than the control (43.3% vs. 11.1%, P < 0.01). mRNA levels of ADAMTS-5, MMP-13, HIF-1α, and caspase-3 were significantly upregulated in the adjacent disc at 3 and 6 months (P < 0.05 for all). In addition, there were more TUNEL-positive cells in the adjacent disc than in the control disc (43.4% vs. 24.0%, P < 0.05) at 6 months postoperatively. Vertebral augmentation can induce early degenerative signs in the adjacent disc, which may be due to impaired nutrient supply to the disc. N/A.

  14. Asymptotic theory of intermediate- and high-degree solar acoustic oscillations

    NASA Technical Reports Server (NTRS)

    Brodsky, M.; Vorontsov, S. V.

    1993-01-01

    A second-order asymptotic approximation is developed for adiabatic nonradial p-modes of a spherically symmetric star. The exact solutions of adiabatic oscillations are assumed in the outermost layers, where the asymptotic description becomes invalid, which results in an eigenfrequency equation with a model-dependent surface phase shift. For low-degree modes, the phase shift is a function of frequency alone; for high-degree modes, its dependence on the degree is explicitly taken into account.

  15. Implications of Analytical Investigations about the Semiconductor Equations on Device Modeling Programs.

    DTIC Science & Technology

    1983-04-01

    SIGNIFICANCE AND EXPLANATION: Many different codes for the simulation of semiconductor devices such as transistors, diodes and thyristors are already in circulation. These only partially take into account the consequences introduced by degenerate semiconductors (e.g. the invalidity of Boltzmann statistics, bandgap narrowing). [Equations (2.10)-(2.11), which relate the carrier densities n and p to the intrinsic density ni via exponentials in the thermal voltage Ut, are illegible in the scanned source; the text notes that (2.10) can be physically interpreted as the application of Boltzmann statistics.]

  16. On the distinguishability of HRF models in fMRI.

    PubMed

    Rosa, Paulo N; Figueiredo, Patricia; Silvestre, Carlos J

    2015-01-01

    Modeling the Hemodynamic Response Function (HRF) is a critical step in fMRI studies of brain activity, and it is often desirable to estimate HRF parameters with physiological interpretability. A biophysically informed model of the HRF can be described by a non-linear time-invariant dynamic system. However, the identification of this dynamic system may leave much uncertainty on the exact values of the parameters. Moreover, the high noise levels in the data may hinder the model estimation task. In this context, the estimation of the HRF may be seen as a problem of model falsification or invalidation, where we are interested in distinguishing among a set of eligible models of dynamic systems. Here, we propose a systematic tool to determine the distinguishability among a set of physiologically plausible HRF models. The concept of absolutely input-distinguishable systems is introduced and applied to a biophysically informed HRF model, by exploiting the structure of the underlying non-linear dynamic system. A strategy to model uncertainty in the input time-delay and magnitude is developed and its impact on the distinguishability of two physiologically plausible HRF models is assessed, in terms of the maximum noise amplitude above which it is not possible to guarantee the falsification of one model in relation to another. Finally, a methodology is proposed for the choice of the input sequence, or experimental paradigm, that maximizes the distinguishability of the HRF models under investigation. The proposed approach may be used to evaluate the performance of HRF model estimation techniques from fMRI data.

  17. Development of a new pentaplex real-time PCR assay for the identification of poly-microbial specimens containing Staphylococcus aureus and other staphylococci, with simultaneous detection of staphylococcal virulence and methicillin resistance markers.

    PubMed

    Okolie, Charles E; Wooldridge, Karl G; Turner, David P; Cockayne, Alan; James, Richard

    2015-06-01

    Staphylococcus aureus strains harbouring genes encoding virulence and antibiotic resistance are of public health importance. In clinical samples, pathogenic S. aureus is often mixed with putatively less pathogenic coagulase-negative staphylococci (CoNS), both of which can harbour mecA, the gene encoding staphylococcal methicillin-resistance. There have been previous attempts at distinguishing MRSA from MRCoNS, most of which were based on the detection of one of the pathognomonic markers of S. aureus, such as coa, nuc or spa. That approach might suffice for discrete colonies and mono-microbial samples; it is inadequate for identification of clinical specimens containing mixtures of S. aureus and CoNS. In the present study, a real-time pentaplex PCR assay has been developed which simultaneously detects markers for bacteria (16S rRNA), coagulase-negative staphylococcus (cns), S. aureus (spa), Panton-Valentine leukocidin (pvl) and methicillin resistance (mecA). Staphylococcal and non-staphylococcal bacterial strains (n = 283) were used to validate the new assay. The applicability of this test to clinical samples was evaluated using spiked blood cultures (n = 43) containing S. aureus and CoNS in mono-microbial and poly-microbial models, which showed that the 5 markers were all detected as expected. Cycling completes within 1 h, delivering 100% specificity, NPV and PPV with a detection limit of 1.0 × 10¹ to 3.0 × 10¹ colony forming units (CFU)/ml, suggesting direct applicability in routine diagnostic microbiology. This is the most multiplexed real-time PCR-based PVL-MRSA assay and the first detection of a unique marker for CoNS without recourse to the conventional elimination approach. There was no evidence that this new assay produced invalid/indeterminate test results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. [Invalidation of a girl after 7 years of successful malingering of a wound (author's transl)].

    PubMed

    Reinbacher, L

    1976-01-01

    Provocation of wounds by malingering is one of the most typical artefacts in surgery. This paper documents the successful maintenance of a wound by a girl for over 7 years. During this time the girl underwent 25 surgical operations and was declared an invalid at the age of 18. In patients with non-healing wounds that cannot be explained by the rules of general pathology, doctors are well advised to consider the possibility of malingering at an early stage.

  19. High speed data compactor

    DOEpatents

    Baumbaugh, Alan E.; Knickerbocker, Kelly L.

    1988-06-04

    A method and apparatus for suppressing, from transmission, non-informational data words produced by a source of data words such as a video camera. Data words having values greater than a predetermined threshold are transmitted, whereas data words having values less than the threshold are not transmitted; instead, their occurrences are counted. Before transmission, the valid data words and the counts of invalid data words are appended with flag digits, which a receiving system decodes. The original data stream is fully reconstructable from the stream of valid data words and the counts of invalid data words.
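    The compaction scheme can be sketched as a run-length-style encoding. This is a hypothetical sketch, not the patent's implementation: the 'V'/'C' tuple tags stand in for the patent's flag digits, and suppressed words are restored as zeros on the assumption that sub-threshold words carry no information.

```python
def compact(words, threshold):
    """Emit ('V', word) for each word at or above threshold; collapse each
    run of sub-threshold (invalid) words into a single ('C', run_length)."""
    out, run = [], 0
    for w in words:
        if w >= threshold:
            if run:                      # flush any pending invalid-word count
                out.append(('C', run))
                run = 0
            out.append(('V', w))
        else:
            run += 1
    if run:                              # trailing run of invalid words
        out.append(('C', run))
    return out

def expand(stream, fill=0):
    """Rebuild the word stream; suppressed words are restored as `fill`."""
    out = []
    for flag, val in stream:
        if flag == 'V':
            out.append(val)
        else:
            out.extend([fill] * val)
    return out
```

For example, compact([5, 1, 0, 7, 2], threshold=3) yields [('V', 5), ('C', 2), ('V', 7), ('C', 1)], and expand recovers the stream with the suppressed positions zeroed, preserving word positions as the abstract requires.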

  20. Basic corrections to predictions of solar cell performance required by nonlinearities

    NASA Technical Reports Server (NTRS)

    Lindholm, F. A.; Fossum, J. G.; Burgess, E. L.

    1976-01-01

    The superposition principle is used to derive the approximation that the current-voltage characteristic of an illuminated solar cell is the dark current-voltage characteristic shifted by the short-circuit photocurrent. The derivation requires the linearity of the boundary value problems that underlie the electrical characteristics. The shifting approximation is invalid if considerable photocurrent and considerable dark current both occur within the junction space-charge region; it is invalid also if sizable series resistance is present or if high-injection concentrations of holes and electrons exist within the quasi-neutral regions.
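    The shifting approximation can be stated compactly. Writing I_D(V) for the dark current-voltage characteristic (the ideal-diode form below is assumed purely for illustration, not taken from the paper):

```latex
I(V) = I_D(V) - I_{sc}, \qquad I_D(V) = I_0\left(e^{qV/kT} - 1\right)
```

The illuminated characteristic is the dark curve rigidly shifted by the short-circuit photocurrent I_sc. The conditions listed in the abstract (photocurrent and dark current overlapping in the space-charge region, series resistance, high injection) are precisely those under which the underlying boundary value problems become nonlinear, so superposition, and hence the shift, fails.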

  1. Selecting the right statistical model for analysis of insect count data by using information theoretic measures.

    PubMed

    Sileshi, G

    2006-10-01

    Researchers and regulatory agencies often make statistical inferences from insect count data using modelling approaches that assume homogeneous variance. Such models do not allow a formal appraisal of variability, which in its various forms is a central subject of interest in ecology. Therefore, the objectives of this paper were to (i) compare models suitable for handling variance heterogeneity and (ii) select optimal models to ensure valid statistical inferences from insect count data. The log-normal, standard Poisson, Poisson corrected for overdispersion, zero-inflated Poisson, negative binomial and zero-inflated negative binomial models were compared using six count datasets on foliage-dwelling insects and five families of soil-dwelling insects. Akaike's and Schwarz's Bayesian information criteria were used for comparing the various models. Over 50% of the counts were zeros, even in locally abundant species such as Ootheca bennigseni Weise, Mesoplatys ochroptera Stål and Diaecoderus spp. The Poisson model corrected for overdispersion and the standard negative binomial model provided a better description of the probability distributions of seven of the 11 insect taxa than the log-normal, standard Poisson, zero-inflated Poisson or zero-inflated negative binomial models. It is concluded that excess zeros and variance heterogeneity are common phenomena in insect count data. If not properly modelled, these properties can invalidate normal-distribution assumptions, resulting in biased estimation of ecological effects and jeopardizing the integrity of the scientific inferences. Therefore, it is recommended that statistical models appropriate for handling these data properties be selected using objective criteria to ensure efficient statistical inference.
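    The model-comparison step can be illustrated with a toy computation. This sketch is not the paper's analysis: the counts are invented, the negative binomial is fitted by moments rather than full maximum likelihood, and only the Poisson and negative binomial candidates are compared via Akaike's information criterion (AIC = 2k - 2 ln L).

```python
import math

def poisson_loglik(counts):
    lam = sum(counts) / len(counts)          # MLE of the Poisson mean
    return sum(k * math.log(lam) - lam - math.lgamma(k + 1) for k in counts)

def negbin_loglik(counts):
    m = sum(counts) / len(counts)            # sample mean
    v = sum((k - m) ** 2 for k in counts) / (len(counts) - 1)  # sample variance
    r = m * m / (v - m)                      # moment-based size parameter (v > m)
    p = r / (r + m)
    return sum(math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
               + r * math.log(p) + k * math.log(1 - p) for k in counts)

def aic(loglik, n_params):
    return 2 * n_params - 2 * loglik

# Invented zero-heavy, overdispersed counts (variance far exceeds the mean)
counts = [0, 0, 0, 0, 1, 0, 2, 0, 0, 9, 0, 0, 3, 0, 14]
aic_pois = aic(poisson_loglik(counts), 1)    # 1 parameter: mean
aic_nb = aic(negbin_loglik(counts), 2)       # 2 parameters: mean, size
```

On counts like these the negative binomial attains the lower (better) AIC, mirroring the paper's conclusion that overdispersion and excess zeros must be modelled rather than assumed away.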

  2. Is the Bifactor Model a Better Model or is it Just Better at Modeling Implausible Responses? Application of Iteratively Reweighted Least Squares to the Rosenberg Self-Esteem Scale

    PubMed Central

    Reise, Steven P.; Kim, Dale S.; Mansolf, Maxwell; Widaman, Keith F.

    2017-01-01

    Although the structure of the Rosenberg Self-Esteem Scale (RSES; Rosenberg, 1965) has been exhaustively evaluated, questions regarding dimensionality and direction of wording effects continue to be debated. To shed new light on these issues, we ask: (1) for what percentage of individuals is a unidimensional model adequate, (2) what additional percentage of individuals can be modeled with multidimensional specifications, and (3) what percentage of individuals respond so inconsistently that they cannot be well modeled? To estimate these percentages, we applied iteratively reweighted least squares (IRLS; Yuan & Bentler, 2000) to examine the structure of the RSES in a large, publicly available dataset. A distance measure, d_s, reflecting a distance between a response pattern and an estimated model, was used for case weighting. We found that a bifactor model provided the best overall model fit, with one general factor and two wording-related group factors. But, based on d_r values, a distance measure based on individual residuals, we concluded that approximately 86% of cases were adequately modeled through a unidimensional structure, and only an additional 3% required a bifactor model. Roughly 11% of cases were judged as “unmodelable” due to their significant residuals in all models considered. Finally, analysis of d_s revealed that some, but not all, of the superior fit of the bifactor model is owed to that model’s ability to better accommodate implausible and possibly invalid response patterns, and not necessarily because it better accounts for the effects of direction of wording. PMID:27834509

  3. Memory and disgust: Effects of appearance-congruent and appearance-incongruent information on source memory for food.

    PubMed

    Mieth, Laura; Bell, Raoul; Buchner, Axel

    2016-01-01

    The present study was stimulated by previous findings showing that people preferentially remember person descriptions that violate appearance-based first impressions. Given that until now all studies used faces as stimuli, these findings can be explained by referring to a content-specific module for social information processing that facilitates social orientation within groups via stereotyping and counter-stereotyping. The present study tests whether the same results can be obtained with fitness-relevant stimuli from another domain--pictures of disgusting-looking or tasty-looking food, paired with tasty and disgusting descriptions. A multinomial model was used to disentangle item memory, guessing and source memory. There was an old-new recognition advantage for disgusting-looking food. People had a strong tendency towards guessing that disgusting-looking food had been previously associated with a disgusting description. Source memory was enhanced for descriptions that disconfirmed these negative, appearance-based impressions. These findings parallel the results from the social domain. Heuristic processing of stimuli based on visual appearance may be complemented by intensified processing of incongruent information that invalidates these first impressions.

  4. Wet-spun, porous, orientational graphene hydrogel films for high-performance supercapacitor electrodes

    NASA Astrophysics Data System (ADS)

    Kou, Liang; Liu, Zheng; Huang, Tieqi; Zheng, Bingna; Tian, Zhanyuan; Deng, Zengshe; Gao, Chao

    2015-02-01

    Supercapacitors with porous electrodes of graphene macroscopic assembly are supposed to have high energy storage capacity. However, a great number of "close pores" in porous graphene electrodes are invalid because electrolyte ions cannot infiltrate. A quick method to prepare porous graphene electrodes with reduced "close pores" is essential for higher energy storage. Here we propose a wet-spinning assembly approach based on the liquid crystal behavior of graphene oxide to continuously spin orientational graphene hydrogel films with "open pores", which are used directly as binder-free supercapacitor electrodes. The resulting supercapacitor electrodes show better electrochemical performance than those with disordered graphene sheets. Furthermore, three reduction methods including hydrothermal treatment, hydrazine and hydroiodic acid reduction are used to evaluate the specific capacitances of the graphene hydrogel film. Hydrazine-reduced graphene hydrogel film shows the highest capacitance of 203 F g⁻¹ at 1 A g⁻¹ and maintains 67.1% specific capacitance (140 F g⁻¹) at 50 A g⁻¹. The combination of scalable wet-spinning technology and orientational structure makes graphene hydrogel films an ideal electrode material for supercapacitors.

  5. Comment on: "Bachmann, R. W., M. V. Hoyer, and D. E. Canfield. 2013. The extent that natural lakes in the United States of America have been changed by cultural eutrophication. Limnology and Oceanography 58:945-950."

    EPA Science Inventory

    In a recent paper, Bachmann et al. (2013) conclude, based on paleolimnological reconstructions, that lakes in the conterminous U.S. have undergone very little cultural eutrophication. They go on to suggest that their results invalidate the efforts of the U.S. EPA to establish num...

  6. A New Wide-Range Equation of State for Xenon

    NASA Astrophysics Data System (ADS)

    Carpenter, John H.

    2011-06-01

    We describe the development of a new wide-range equation of state (EOS) for xenon. Three different prior EOS models predicted significant variations in behavior along the high pressure Hugoniot from an initial liquid state at 163.5 K and 2.97 g/cm3, which is near the triple point. Experimental measurements on Sandia's Z machine as well as density functional theory based molecular dynamics calculations both invalidate the prior EOS models in the pressure range from 200 to 840 GPa. The reason behind these EOS model disagreements is found to lie in the contribution from the thermal electronic models. A new EOS, based upon the standard separation of the Helmholtz free energy into ionic and electronic components, is constructed by combining the successful parts of prior models with a semi-empirical electronic model. Both the fluid and fcc solid phases are combined in a wide-range, multi-phase table. The new EOS is tabulated on a fine temperature and density grid, to preserve phase boundary information, and is available as table number 5191 in the LANL SESAME database. Improvements over prior EOS models are found not only along the Hugoniot, but also along the melting curve and in the region of the liquid-vapor critical point. *Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  7. Are there two processes in reasoning? The dimensionality of inductive and deductive inferences.

    PubMed

    Stephens, Rachel G; Dunn, John C; Hayes, Brett K

    2018-03-01

    Single-process accounts of reasoning propose that the same cognitive mechanisms underlie inductive and deductive inferences. In contrast, dual-process accounts propose that these inferences depend upon 2 qualitatively different mechanisms. To distinguish between these accounts, we derived a set of single-process and dual-process models based on an overarching signal detection framework. We then used signed difference analysis to test each model against data from an argument evaluation task, in which induction and deduction judgments are elicited for sets of valid and invalid arguments. Three data sets were analyzed: data from Singmann and Klauer (2011), a database of argument evaluation studies, and the results of an experiment designed to test model predictions. Of the large set of testable models, we found that almost all could be rejected, including all 2-dimensional models. The only testable model able to account for all 3 data sets was a model with 1 dimension of argument strength and independent decision criteria for induction and deduction judgments. We conclude that despite the popularity of dual-process accounts, current results from the argument evaluation task are best explained by a single-process account that incorporates separate decision thresholds for inductive and deductive inferences. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
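    The winning model, a single argument-strength dimension with independent decision criteria, admits a very small sketch. The criterion values below are hypothetical, chosen only to show the qualitative pattern: deduction demands more evidence than induction, so one strength axis suffices to dissociate the two judgments.

```python
def judge(strength, c_induction=0.5, c_deduction=1.5):
    """Single-process account: one argument-strength value is compared
    against two independent (hypothetical) decision criteria, one per
    judgment type. Deduction's criterion sits above induction's."""
    return {'induction': strength > c_induction,
            'deduction': strength > c_deduction}
```

An argument of middling strength (e.g. 1.0) is endorsed under induction instructions but rejected under deduction instructions, reproducing the induction/deduction dissociation without invoking a second process.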

  8. ERP evidence for selective drop in attentional costs in uncertain environments: challenging a purely premotor account of covert orienting of attention.

    PubMed

    Lasaponara, Stefano; Chica, Ana B; Lecce, Francesca; Lupianez, Juan; Doricchi, Fabrizio

    2011-07-01

    Several studies have shown that the reliability of endogenous spatial cues linearly modulates the reaction time advantage in the processing of targets at validly cued vs. invalidly cued locations, i.e. the "validity effect". This would imply that with non-predictive cues, no "validity effect" should be observed. However, contrary to this prediction, one could hypothesize that attentional benefits by valid cuing (i.e. the RT advantage for validly vs. neutrally cued targets) can still be maintained with non-predictive cues, if the brain were endowed with mechanisms allowing the selective reduction in costs of reorienting from invalidly cued locations (i.e. the reduction of the RT disadvantage for invalidly vs. neutrally cued targets). This separate modulation of attentional benefits and costs would be adaptive in uncertain contexts where cues predict the location of targets at chance level. Through the joint recording of manual reaction times and event-related cerebral potentials (ERPs), we have found that this is the case: relying on non-predictive endogenous cues results in the abatement both of attentional costs and of the difference in amplitude of the P1 brain responses evoked by invalidly vs. neutrally cued targets. In contrast, the use of non-predictive cues leaves unaffected attentional benefits and the difference in the amplitude of the N1 responses evoked by validly vs. neutrally cued targets. At the individual level, the drop in costs with non-predictive cues was matched with equivalent lateral biases in RTs to neutrally and invalidly cued targets presented in the left and right visual field. During the cue period, the drop in costs with non-predictive cues was preceded by reduction of the Early Directing Attention Negativity (EDAN) on posterior occipital sites and by enhancement of the frontal Anterior Directing Attention Negativity (ADAN) correlated to preparatory voluntary orienting.
These findings demonstrate, for the first time, that the segregation of mechanisms regulating attentional benefits and costs helps efficiency of orienting in "uncertain" visual spatial contexts characterized by poor probabilistic association between cues and targets. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Increased effect of target eccentricity on covert shifts of visual attention in patients with neglect.

    PubMed

    Hamilton, Roy H; Stark, Marianna; Coslett, H Branch

    2010-01-01

    Debate continues regarding the mechanisms underlying covert shifts of visual attention. We examined the relationship between target eccentricity and the speed of covert shifts of attention in normal subjects and patients with brain lesions using a cued-response task in which cues and targets were presented at 2 degrees or 8 degrees lateral to the fixation point. Normal subjects were slower on invalid trials in the 8 degrees as compared to 2 degrees condition. Patients with right-hemisphere stroke with neglect were slower in their responses to left-sided invalid targets compared to valid targets, and demonstrated a significant increase in the effect of target validity as a function of target eccentricity. Additional data from one neglect patient (JM) demonstrated an exaggerated validity x eccentricity x side interaction for contralesional targets on a cued reaction time task with a central (arrow) cue. We frame these results in the context of a continuous 'moving spotlight' model of attention, and also consider the potential role of spatial saliency maps. By either account, we argue that neglect is characterized by an eccentricity-dependent deficit in the allocation of attention.

  10. Distinct roles of the intraparietal sulcus and temporoparietal junction in attentional capture from distractor features: An individual differences approach.

    PubMed

    Painter, David R; Dux, Paul E; Mattingley, Jason B

    2015-07-01

    Setting attention for an elementary visual feature, such as color or motion, results in greater spatial attentional "capture" from items with target compared with distractor features. Thus, capture is contingent on feature-based control settings. Neuroimaging studies suggest that this contingent attentional capture involves interactions between dorsal and ventral frontoparietal networks. To examine the distinct causal influences of these networks on contingent capture, we applied continuous theta-burst stimulation (cTBS) to alter neural excitability within the dorsal intraparietal sulcus (IPS), the ventral temporoparietal junction (TPJ) and a control site, visual area MT. Participants undertook an attentional capture task before and after stimulation, in which they made speeded responses to color-defined targets that were preceded by spatial cues in the target or distractor color. Cues appeared either at the target location (valid) or at a non-target location (invalid). Reaction times were slower for targets preceded by invalid compared with valid cues, demonstrating spatial attentional capture. Cues with the target color captured attention to a greater extent than those with the distractor color, consistent with contingent capture. Effects of cTBS were not evident at the group level, but emerged instead from analyses of individual differences. Target capture magnitude was positively correlated pre- and post-stimulation for all three cortical sites, suggesting that cTBS did not influence target capture. Conversely, distractor capture was positively correlated pre- and post-stimulation of MT, but uncorrelated for IPS and TPJ, suggesting that stimulation of IPS and TPJ selectively disrupted distractor capture. Additionally, the effects of IPS stimulation were predicted by pre-stimulation attentional capture, whereas the effects of TPJ stimulation were predicted by pre-stimulation distractor suppression. 
The results are consistent with the existence of distinct neural circuits underlying target and distractor capture, as well as distinct roles for the IPS and TPJ. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. On the phantom barrier crossing and the bounds on the speed of sound in non-minimal derivative coupling theories

    NASA Astrophysics Data System (ADS)

    Quiros, Israel; Gonzalez, Tame; Nucamendi, Ulises; García-Salcedo, Ricardo; Horta-Rangel, Francisco Antonio; Saavedra, Joel

    2018-04-01

    In this paper we investigate the so-called ‘phantom barrier crossing’ issue in a cosmological model based on the scalar–tensor theory with non-minimal derivative coupling to the Einstein tensor. Special attention will be paid to the physical bounds on the squared sound speed. The numeric results are geometrically illustrated by means of a qualitative procedure of analysis that is based on the mapping of the orbits in the phase plane onto the surfaces that represent physical quantities in the extended phase space, that is: the phase plane complemented with an additional dimension relative to the given physical parameter. We find that the cosmological model based on the non-minimal derivative coupling theory—this includes both the quintessence and the pure derivative coupling cases—has serious causality problems related to superluminal propagation of the scalar and tensor perturbations. Even more disturbing is the finding that, despite the fact that the underlying theory is free of the Ostrogradsky instability, the corresponding cosmological model is plagued by the Laplacian (classical) instability related with negative squared sound speed. This instability leads to an uncontrollable growth of the energy density of the perturbations that is inversely proportional to their wavelength. We show that, independent of the self-interaction potential, for positive coupling the tensor perturbations propagate superluminally, while for negative coupling a Laplacian instability arises. This latter instability invalidates the possibility for the model to describe the primordial inflation.

  12. Medical-legal issues in headache: penal and civil Italian legislation, working claims, social security, off-label prescription.

    PubMed

    Aguggia, M; Cavallini, M; Varetto, L

    2006-05-01

    Primary headaches can be considered simultaneously as symptom and as disease in themselves, while secondary headaches are expressions of a pathological process that can be systemic or locoregional. Because of its subjective features, headache is often difficult to assess and quantify by severity, frequency and invalidity rate, and for these reasons it has often been implicated in legal controversies. Headache has seldom been considered in criminal law, except when it represents a typical symptom of a disease whose existence can be objectively assessed (i.e. raised intracranial pressure). Accordingly, civil legislation does not yet codify headache as a basis for claiming invalidity compensation. In particular, one of the most debated medical-legal questions is represented by headaches occurring after head injury. Headache is often the principal symptom at the onset of several chronic toxic syndromes, with many implications, especially in working claims, and, more recently, it may be reported as one of the most frequent symptoms by victims of mobbing (i.e. psychological harassment in the workplace). The National Institute for Industrial Accident Insurance (INAIL) scales (instituted by law 38/2000) mention the "Subjective cranial trauma syndrome" and provide an invalidity-rate evaluation. For other headache forms, no legislation exists at present, and headache is only considered as a symptom of a certain coded disease. Requests for an invalidity social pension and the question of off-label prescriptions (drug prescription for a disease without a formal indication for it) are other controversial matters.

  13. Urine specimen validity test for drug abuse testing in workplace and court settings.

    PubMed

    Lin, Shin-Yu; Lee, Hei-Hwa; Lee, Jong-Feng; Chen, Bai-Hsiun

    2018-01-01

    In recent decades, urine drug testing in the workplace has become common in many countries in the world. There have been several studies concerning the use of the urine specimen validity test (SVT) for drug abuse testing administered in the workplace. However, very little data exists concerning the urine SVT on drug abuse tests from court specimens, including dilute, substituted, adulterated, and invalid tests. We investigated 21,696 urine drug test samples submitted for SVT from workplace and court settings in southern Taiwan over 5 years. All immunoassay screen-positive urine specimen drug tests were confirmed by gas chromatography/mass spectrometry. We found that the mean 5-year prevalence of tampering (dilute, substituted, or invalid tests) in urine specimens from the workplace and court settings was 1.09% and 3.81%, respectively. The mean 5-year percentages of dilute, substituted, and invalid urine specimens from the workplace were 89.2%, 6.8%, and 4.1%, respectively. The mean 5-year percentages of dilute, substituted, and invalid urine specimens from the court were 94.8%, 1.4%, and 3.8%, respectively. No adulterated cases were found among the workplace or court samples. The most common drug identified in the workplace specimens was amphetamine, followed by opiates. The most common drug identified in the court specimens was ketamine, followed by amphetamine. We suggest that all urine specimens taken for drug testing in both workplace and court settings be tested for validity. Copyright © 2017. Published by Elsevier B.V.

  14. [Medical-social aspects of multiple sclerosis].

    PubMed

    Vermersch, P; Marissal, J P

    2001-09-01

    On a daily basis, the quality of life of patients suffering from multiple sclerosis (MS) depends partly on social measures, which are not specific to MS. Patients often need help from hospital or municipal social services with the numerous and complicated administrative steps to be taken. The information given to a patient about his rights, particularly his occupational rights, is of prime importance. Many organisations have to be contacted to obtain financial and material aid, even if the latter is considered insufficient in many fields, especially for improvements to accommodation. An invalidity card may entitle its holder to certain tax reductions. The competences of the COTOREP are wide-ranging and include recognition of handicapped-worker status, training and redeployment at work, orientation and admission into a specialised structure, determination of the invalidity rate and, should the handicap justify it, benefits such as the handicapped adults' allowance and the compensatory third-person allowance. It is essential to adopt a multidisciplinary approach when dealing with MS in order to provide better care; experiments with specialised structures and networks are being undertaken. Numerous partners are taking part in these new approaches, and patient associations may find their place in them. Social aspects also have to be taken into account when the cost of the disease is evaluated in monetary and human terms.

  15. Computational models of the Posner simple and choice reaction time tasks

    PubMed Central

    Feher da Silva, Carolina; Baldo, Marcus V. C.

    2015-01-01

    The landmark experiments by Posner in the late 1970s showed that reaction time (RT) is faster when the stimulus appears in an expected location, as indicated by a cue; since then, the so-called Posner task has been considered a “gold standard” test of spatial attention. It is thus fundamental to understand the neural mechanisms involved in performing it. To this end, we developed a Bayesian detection system and small integrate-and-fire neural networks, which modeled sensory and motor circuits, respectively, and optimized them to perform the Posner task under different cue type proportions and noise levels. In doing so, the main findings of experimental research on RT were replicated: the relative frequency effect, suboptimal RTs and significant error rates due to noise and invalid cues, slower RTs for choice RT tasks than for simple RT tasks, and the fastest RTs for valid cues and slowest RTs for invalid cues. Analysis of the optimized systems revealed that the mechanisms employed were consistent with related findings in neurophysiology. Our models predict that (1) the results of a Posner task may be affected by the relative frequency of valid and neutral trials, (2) in simple RT tasks, inputs from multiple locations are added together to compose a stronger signal, and (3) the cue affects motor circuits more strongly in choice RT tasks than in simple RT tasks. In discussing the computational demands of the Posner task, attention has often been described as a filter that protects the nervous system, whose capacity is limited, from information overload. Our models, however, reveal that the main problems that must be overcome to perform the Posner task effectively are distinguishing signal from external noise and selecting the appropriate response in the presence of internal noise. PMID:26190997
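
    The cueing effect on RT can be illustrated with a toy leaky integrate-and-fire (LIF) race to threshold: a valid cue is modeled simply as extra input drive, so threshold is reached sooner. This is a generic sketch under assumed parameter values, not the authors' optimized networks.

```python
# Toy LIF neuron: a valid cue boosts the input current, so the membrane
# potential crosses threshold earlier (shorter reaction time).
def lif_reaction_time(input_current, tau=10.0, threshold=1.0, dt=0.1, t_max=200.0):
    """Euler-integrate dV/dt = (-V + I)/tau; return the time of threshold crossing."""
    v, t = 0.0, 0.0
    while t < t_max:
        v += dt * (-v + input_current) / tau
        t += dt
        if v >= threshold:
            return t
    return None  # no response within t_max

rt_valid = lif_reaction_time(input_current=1.5)    # cue adds drive at the target
rt_invalid = lif_reaction_time(input_current=1.1)  # attention directed elsewhere
print(rt_valid, rt_invalid)  # valid-cue RT is shorter
```

    Analytically, the crossing time is -tau*ln(1 - threshold/I), so weaker drive (the invalid cue) yields a markedly longer RT, mirroring the validity effect.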

  16. An exploration of the impact of invalid MMPI-2 protocols on collateral self-report measure scores.

    PubMed

    Forbey, Johnathan D; Lee, Tayla T C

    2011-11-01

    Although a number of studies have examined the impact of invalid MMPI-2 (Butcher et al., 2001) response styles on MMPI-2 scale scores, limited research has specifically explored the effects that such response styles might have on conjointly administered collateral self-report measures. This study explored the potential impact of 2 invalidating response styles detected by the Validity scales of the MMPI-2, overreporting and underreporting, on scores of collateral self-report measures administered conjointly with the MMPI-2. The final group of participants included in analyses was 1,112 college students from a Midwestern university who completed all measures as part of a larger study. Results of t-test analyses suggested that if either over- or underreporting was indicated by the MMPI-2 Validity scales, the scores of most conjointly administered collateral measures were also significantly impacted. Overall, it appeared that test-takers who were identified as either over- or underreporting relied on such a response style across measures. Limitations and suggestions for future study are discussed.

  17. Altered orientation of spatial attention in depersonalization disorder.

    PubMed

    Adler, Julia; Beutel, Manfred E; Knebel, Achim; Berti, Stefan; Unterrainer, Josef; Michal, Matthias

    2014-05-15

    Difficulties with concentration are frequent complaints of patients with depersonalization disorder (DPD). Standard neuropsychological tests have suggested alterations of the attentional and perceptual systems. To investigate this, the well-validated Spatial Cueing paradigm was used with two different tasks, consisting either in the detection or in the discrimination of visual stimuli. At the start of each trial a cue indicated either the correct (valid) or the incorrect (invalid) position of the upcoming stimulus, or was uninformative (neutral). Differences between DPD patients and controls were observed only under increased task difficulty (the discrimination task): DPD patients showed a smaller total attention-directing effect (RT in valid vs. invalid trials) compared to healthy controls. RT costs (i.e., prolonged RT in neutral vs. invalid trials) mainly accounted for this difference. These results indicate that DPD is associated with altered attentional mechanisms, especially a stronger responsiveness to unexpected events. From an evolutionary perspective this may be advantageous in a dangerous environment; in daily life it may be experienced as high distractibility. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  18. The effects of musical training on movement pre-programming and re-programming abilities: an event-related potential investigation.

    PubMed

    Anatürk, Melis; Jentzsch, Ines

    2015-03-01

    Two response precuing experiments were conducted to investigate the effects of musical skill level on the ability to pre-programme and re-programme simple movements. Participants successfully used advance information to prepare forthcoming responses and showed response slowing when precue information was invalid rather than valid. This slowing was, however, only observed for partially invalid, not fully invalid, precues. Musicians were generally faster than non-musicians, but no group differences in the efficiency of movement pre-programming or re-programming were observed. Interestingly, only musicians exhibited a significant foreperiod lateralized readiness potential (LRP) when the response hand was pre-specified or full advance information was provided. These LRP findings suggest greater effector-specific motor preparation in musicians than in non-musicians. However, the levels of effector-specific preparation did not predict the preparatory advantages observed in behaviour. In sum, combining the response precuing and ERP paradigms serves as a valuable tool to examine influences of musical training on movement pre-programming and re-programming processes. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Cognitive status and profile validity on the Personality Assessment Inventory (PAI) in offenders with serious mental illness.

    PubMed

    Matlasz, Tatiana M; Brylski, Jamie L; Leidenfrost, Corey M; Scalco, Matt; Sinclair, Samuel J; Schoelerman, Ronald M; Tsang, Valerie; Antonius, Daniel

    Cognitive impairment among seriously mentally ill offenders has implications for legal matters (e.g., competency to stand trial), as well as clinical treatment and care. Thus, being able to identify potential cognitive concerns early in the adjudication process can be important when deciding on further interventions. In this study, we examined the validity scales of the Personality Assessment Inventory (PAI), scores on the Wechsler Adult Intelligence Scale-IV (WAIS-IV), and competency findings in male inmates (n=61) diagnosed with a serious mental illness. Lower scores on the WAIS-IV significantly (p=0.001) predicted invalid, versus valid, PAI profiles, with working memory impairment being the most significant (p=0.004) predictor of an invalid profile. Ancillary analyses on a smaller sample (n=18) indicate that those with invalid PAI profiles were more likely to be deemed legally incompetent (p=0.03). These findings suggest that the PAI validity scales may be informative in detecting cognitive concerns and help clinicians make determinations about competency restoration and treatment. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Measuring Image Navigation and Registration Performance at the 3-Sigma Level Using Platinum Quality Landmarks

    NASA Technical Reports Server (NTRS)

    Carr, James L.; Madani, Houria

    2007-01-01

    Geostationary Operational Environmental Satellite (GOES) Image Navigation and Registration (INR) performance is specified at the 3-sigma level, meaning that 99.7% of a collection of individual measurements must comply with specification thresholds. Landmarks are measured by the Replacement Product Monitor (RPM), part of the operational GOES ground system, to assess INR performance and to close the INR loop. The RPM automatically discriminates between valid and invalid measurements, enabling it to run without human supervision. In general, this screening is reliable, but a small population of invalid measurements will be falsely identified as valid. Even a small population of invalid measurements can create problems when assessing performance at the 3-sigma level. This paper describes an additional layer of quality control whereby landmarks of the highest quality ("platinum") are identified by their self-consistency. The platinum screening criteria are not simple statistical outlier tests against sigma values in populations of INR errors. In-orbit INR performance metrics for GOES-12 and GOES-13 are presented using the platinum landmark methodology.
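
    Why a handful of mis-screened measurements matters at the 3-sigma level can be shown numerically. The sketch below uses synthetic, made-up error values (not GOES data): contaminating 0.3% of a Gaussian error population with gross outliers inflates the magnitude below which 99.7% of measurements fall.

```python
# Synthetic illustration: the 99.7% compliance statistic is sensitive to even
# a tiny fraction of invalid measurements falsely accepted as valid.
import random

random.seed(0)
valid = [random.gauss(0.0, 1.0) for _ in range(10_000)]  # well-behaved INR errors
contaminated = valid[:]
contaminated[:30] = [50.0] * 30                          # 0.3% gross outliers

def p997(errors):
    """Error magnitude below which 99.7% of the measurements fall."""
    s = sorted(abs(e) for e in errors)
    return s[int(0.997 * len(s)) - 1]

print(p997(valid), p997(contaminated))  # the contaminated value is clearly larger
```

    With a clean Gaussian population the statistic sits near 3 (as expected for unit sigma); with 0.3% contamination it jumps to the tail of the clean distribution, which is why self-consistency screening of "platinum" landmarks is useful before assessing 3-sigma compliance.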

  1. Self-interest and other-orientation in organizational behavior: implications for job performance, prosocial behavior, and personal initiative.

    PubMed

    De Dreu, Carsten K W; Nauta, Aukje

    2009-07-01

    In this article, the authors develop the self-concern and other-orientation as moderators hypothesis. The authors argue that many theories of work behavior assume humans to be either self-interested or social in nature with a strong other-orientation, but that this assumption is empirically invalid and may lead to overly narrow models of work behavior. The authors instead propose that self-concern and other-orientation are independent. The authors also propose that job performance, prosocial behavior, and personal initiative are a function of (a) individual-level attributes, such as job characteristics, when employees are high in self-concern, and (b) group-level attributes, such as justice climate, when employees are high in other-orientation. Three studies involving 4 samples of employees from a variety of organizations support these propositions. Implications are discussed for theory on work behavior and for interventions geared toward job enrichment and team-based working.

  2. Polar symmetric flow of a viscous compressible atmosphere; an application to Mars

    NASA Technical Reports Server (NTRS)

    Pirraglia, J. A.

    1974-01-01

    The atmosphere is assumed to be driven by a polar symmetric temperature field, and the equations of motion in pressure-ratio coordinates are linearized by considering the zero order in terms of a thermal Rossby number R = δT/(2aω)², where δT is a measure of the latitudinal temperature gradient. When the eddy viscosity is greater than 1 million sq cm/sec, the boundary layer extends far up into the atmosphere, making the geostrophic approximation invalid for the bulk of the atmosphere. A temperature model for Mars was used which was based on Mariner 9 infrared spectral data, with a 30% increase in the depth-averaged temperature from the winter pole to the subsolar point. The results obtained for the increase in surface pressure from the subsolar point to the winter pole, as a function of eddy viscosity and with no-slip conditions imposed at the surface, are given.

  3. Childhood leukemia and cancers near German nuclear reactors: significance, context, and ramifications of recent studies.

    PubMed

    Nussbaum, Rudi H

    2009-01-01

    A government-sponsored study of childhood cancer in the proximity of German nuclear power plants (German acronym KiKK) found that children < 5 years living < 5 km from plant exhaust stacks had twice the risk for contracting leukemia as those residing > 5 km. The researchers concluded that since "this result was not to be expected under current radiation-epidemiological knowledge" and confounders could not be identified, the observed association of leukemia incidence with residential proximity to nuclear plants "remains unexplained." This unjustified conclusion illustrates the dissonance between evidence and assumptions. There exist serious flaws and gaps in the knowledge on which accepted models for population exposure and radiation risk are based. Studies with results contradictory to those of KiKK lack statistical power to invalidate its findings. The KiKK study's ramifications add to the urgency for a public policy debate regarding the health impact of nuclear power generation.

  4. It's Not a Big Sky After All: Justification for a Close Approach Prediction and Risk Assessment Process

    NASA Technical Reports Server (NTRS)

    Newman, Lauri Kraft; Frigm, Ryan; McKinley, David

    2009-01-01

    There is often skepticism about the need for Conjunction Assessment (CA) from mission operators who subscribe to the "big sky theory," which holds that the likelihood of a collision is so small that it can be neglected. On 10 February 2009, the collision between Iridium 33 and Cosmos 2251 demonstrated that this theory is invalid and that a CA process should be considered for all missions. This paper presents statistics on the effect of the Iridium/Cosmos collision on NASA's Earth Science Constellation, as well as results of analyses characterizing the debris environment for NASA's robotic missions.

  5. Application of JAERI quantum molecular dynamics model for collisions of heavy nuclei

    NASA Astrophysics Data System (ADS)

    Ogawa, Tatsuhiko; Hashimoto, Shintaro; Sato, Tatsuhiko; Niita, Koji

    2016-06-01

    The quantum molecular dynamics (QMD) model incorporated into the general-purpose radiation transport code PHITS was revised for accurate prediction of fragment yields in peripheral collisions. For more accurate simulation of peripheral collisions, the ground-state stability of the nuclei was improved and the algorithm used to reject invalid events was modified. In-medium corrections to nucleon-nucleon cross sections were also considered. To clarify the effect of these improvements on the fragmentation of heavy nuclei, the new QMD model coupled with a statistical decay model was used to calculate fragment production cross sections for Ag and Au targets, which were compared with earlier measurements. It is shown that the revised version predicts cross sections more accurately.

  6. Nonlinear modal resonances in low-gravity slosh-spacecraft systems

    NASA Technical Reports Server (NTRS)

    Peterson, Lee D.

    1991-01-01

    Nonlinear models of low gravity slosh, when coupled to spacecraft vibrations, predict intense nonlinear eigenfrequency shifts at zero gravity. These nonlinear frequency shifts are due to internal quadratic and cubic resonances between fluid slosh modes and spacecraft vibration modes. Their existence has been verified experimentally, and they cannot be correctly modeled by approximate, uncoupled nonlinear models, such as pendulum mechanical analogs. These predictions mean that linear slosh assumptions for spacecraft vibration models can be invalid, and may lead to degraded control system stability and performance. However, a complete nonlinear modal analysis will predict the correct dynamic behavior. This paper presents the analytical basis for these results, and discusses the effect of internal resonances on the nonlinear coupled response at zero gravity.

  7. Consistent and efficient processing of ADCP streamflow measurements

    USGS Publications Warehouse

    Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan

    2016-01-01

    The use of Acoustic Doppler Current Profilers (ADCPs) from a moving boat is a common method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers. These differences could result in different discharges being computed from identical data. A consistent computational algorithm, automated filtering, and quality assessment of ADCP streamflow measurements that are independent of the ADCP manufacturer are being developed in a software program that can process ADCP moving-boat discharge measurements independent of the ADCP used to collect the data.
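
    One policy for the invalid-data step above can be sketched as follows. Manufacturers' actual algorithms differ (which is the abstract's point); this illustrative version, with made-up numbers, estimates per-ensemble discharge flagged invalid by linear interpolation between the nearest valid neighbors before summing.

```python
# Generic invalid-ensemble handling: interpolate flagged (None) per-ensemble
# discharges from valid neighbors, then sum to a total discharge.
def fill_invalid(q):
    """Replace None entries by linear interpolation between valid neighbors."""
    q = list(q)
    for i, v in enumerate(q):
        if v is None:
            # nearest valid ensembles on each side (assumes interior gaps only)
            left = next(j for j in range(i - 1, -1, -1) if q[j] is not None)
            right = next(j for j in range(i + 1, len(q)) if q[j] is not None)
            frac = (i - left) / (right - left)
            q[i] = q[left] + frac * (q[right] - q[left])
    return q

ensembles = [10.0, 12.0, None, 14.0, 16.0]  # m^3/s per ensemble (made-up values)
print(sum(fill_invalid(ensembles)))         # -> 65.0 (the gap is filled with 13.0)
```

    Two programs that instead discard invalid ensembles, or extrapolate them differently, would report different totals from the same raw data, which motivates the manufacturer-independent processing described above.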

  8. [Environmental and demographic variables associated with psychiatric morbidity in former prisoners of war].

    PubMed

    Dethienne, F; Donnay, J M

    1976-01-01

    In this study, the present psychiatric morbidity of 100 former prisoners of war is related to 28 environmental and demographic variables grouped into 3 periods: before, during and after WW2. With the exception of the invalidity percentage, all statistically significant relations concern variables of the first two periods. The present results are discussed in the light of former publications by the authors. It appears, among other findings, that the age variable has to be taken into consideration in explaining the psychiatric sequelae of captivity, and that the condition of "war invalid" poorly reflects the degree of psychiatric morbidity.

  9. Two-Point Resistance of a Non-Regular Cylindrical Network with a Zero Resistor Axis and Two Arbitrary Boundaries

    NASA Astrophysics Data System (ADS)

    Tan, Zhi-Zhong

    2017-03-01

    We study the two-point resistance of a non-regular m × n cylindrical network with a zero-resistor axis and two arbitrary boundaries by means of the Recursion-Transform method. This problem has not been solved before; the Green's function technique and the Laplacian matrix approach are invalid in this case. A disordered network with arbitrary boundaries is a basic model of many physical and real-world systems, yet the exact resistance of a binary resistor network is difficult to calculate in the case of arbitrary boundaries: the boundary acts like a wall or trap that affects the behavior of the finite network. In this paper we obtain a general resistance formula for the non-regular m × n cylindrical network, composed of a single summation. Further, the current distribution is given explicitly as a byproduct of the method. As applications, several interesting results are derived as special cases of the general formula. Supported by the Natural Science Foundation of Jiangsu Province under Grant No. BK20161278
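
    For background, the Laplacian-matrix method that the abstract says fails here works as follows for an ordinary finite network of unit resistors: with G the Moore-Penrose pseudoinverse of the network Laplacian, the two-point resistance is R_ab = G_aa + G_bb - 2 G_ab. A minimal sketch on a toy 4-node ring (not the paper's cylindrical network):

```python
# Standard two-point resistance via the pseudoinverse of the graph Laplacian.
import numpy as np

def two_point_resistance(edges, n, a, b):
    """Resistance between nodes a and b in a network of unit resistors."""
    L = np.zeros((n, n))
    for i, j in edges:           # build the Laplacian: degree on the diagonal,
        L[i, i] += 1.0           # -1 for each unit-resistor edge
        L[j, j] += 1.0
        L[i, j] -= 1.0
        L[j, i] -= 1.0
    G = np.linalg.pinv(L)
    return G[a, a] + G[b, b] - 2.0 * G[a, b]

# 4-cycle of unit resistors: adjacent nodes see 1 ohm in parallel with 3 ohms.
ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(two_point_resistance(ring, 4, 0, 1))  # ≈ 0.75
```

    The Recursion-Transform method of the paper replaces this matrix inversion, which becomes intractable or inapplicable when the boundaries are arbitrary, with a closed-form single summation.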

  10. Sketching the pion's valence-quark generalised parton distribution

    DOE PAGES

    Mezrag, C.; Chang, L.; Moutarde, H.; ...

    2015-02-01

    In order to learn effectively from measurements of generalised parton distributions (GPDs), it is desirable to compute them using a framework that can potentially connect empirical information with basic features of the Standard Model. We sketch an approach to such computations, based upon a rainbow-ladder (RL) truncation of QCD's Dyson–Schwinger equations and exemplified via the pion's valence dressed-quark GPD, H_v^π(x, ξ, t). Our analysis focuses primarily on ξ = 0, although we also capitalise on the symmetry-preserving nature of the RL truncation by connecting H_v^π(x, ξ = ±1, t) with the pion's valence-quark parton distribution amplitude. We explain that the impulse approximation used hitherto to define the pion's valence dressed-quark GPD is generally invalid owing to omission of contributions from the gluons which bind dressed-quarks into the pion. A simple correction enables us to identify a practicable improvement to the approximation for H_v^π(x, 0, t), expressed as the Radon transform of a single amplitude. Therewith we obtain results for H_v^π(x, 0, t) and the associated impact-parameter dependent distribution, q_v^π(x, |b_⊥|), which provide a qualitatively sound picture of the pion's dressed-quark structure at a hadronic scale. We evolve the distributions to a scale ζ = 2 GeV, so as to facilitate comparisons in future with results from experiment or other nonperturbative methods.

  11. Mathematical modelling of complex contagion on clustered networks

    NASA Astrophysics Data System (ADS)

    O'sullivan, David J.; O'Keeffe, Gary; Fennell, Peter; Gleeson, James

    2015-09-01

    The spreading of behavior, such as the adoption of a new innovation, is influenced by the structure of the social networks that interconnect the population. In the experiments of Centola (Science, 2010), adoption of new behavior was shown to spread further and faster across clustered-lattice networks than across corresponding random networks. This implies that the “complex contagion” effects of social reinforcement are important in such diffusion, in contrast to “simple” contagion models of disease spread, which predict that epidemics grow more efficiently on random networks than on clustered networks. Accurately modeling complex contagion on clustered networks remains a challenge because the usual assumptions (e.g., of mean-field theory) regarding tree-like networks are invalidated by the presence of triangles in the network; the triangles are, however, crucial to the social reinforcement mechanism, which posits an increased probability of a person adopting behavior that has been adopted by two or more neighbors. In this paper we modify the analytical approach introduced by Hebert-Dufresne et al. (Phys. Rev. E, 2010) to study disease spread on clustered networks. We show how the approximation method can be adapted to a complex contagion model, and confirm the accuracy of the method with numerical simulations. The analytical results of the model enable us to quantify the level of social reinforcement that is required to observe, as in Centola’s experiments, faster diffusion on clustered topologies than on random networks.
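
    The social-reinforcement mechanism described above (adopt only after two or more neighbors have adopted) can be sketched on toy graphs. These are illustrative hand-built networks, not Centola's experimental topologies: on a triangle-rich ring lattice two adjacent seeds cascade everywhere, while on a tree-like path the same seeds stall, because no further node ever sees two adopted neighbors.

```python
# Minimal complex-contagion model: a node adopts once at least `threshold`
# of its neighbors have adopted; iterate to a fixed point.
def spread(adj, seeds, threshold=2):
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in adj.items():
            if node not in adopted and len(adopted & nbrs) >= threshold:
                adopted.add(node)
                changed = True
    return adopted

n = 10
# Clustered ring lattice: links to the 2 nearest neighbors on each side (triangles).
clustered = {i: {(i - 2) % n, (i - 1) % n, (i + 1) % n, (i + 2) % n} for i in range(n)}
# Tree-like path: single links, no triangles.
tree_like = {i: {j for j in (i - 1, i + 1) if 0 <= j < n} for i in range(n)}

print(len(spread(clustered, {0, 1})))  # full adoption on the clustered graph
print(len(spread(tree_like, {0, 1})))  # stuck at the two seeds on the tree
```

    This is exactly why tree-based mean-field assumptions fail for complex contagion: the triangles that such approximations ignore are the channels through which the double reinforcement arrives.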

  12. DyKOSMap: A framework for mapping adaptation between biomedical knowledge organization systems.

    PubMed

    Dos Reis, Julio Cesar; Pruski, Cédric; Da Silveira, Marcos; Reynaud-Delaître, Chantal

    2015-06-01

    Knowledge Organization Systems (KOS) and their associated mappings play a central role in several decision support systems. However, by virtue of knowledge evolution, KOS entities are modified over time, impacting mappings and potentially rendering them invalid. This requires semi-automatic methods to keep such semantic correspondences up-to-date as the KOS evolve. We define a complete and original framework based on formal heuristics that drive the adaptation of KOS mappings. Our approach takes into account the definition of established mappings, the evolution of KOS and the possible changes that can be applied to mappings. This study experimentally evaluates the proposed heuristics and the entire framework on realistic case studies borrowed from the biomedical domain, using official mappings between several biomedical KOS. We demonstrate the overall performance of the approach over biomedical datasets of different characteristics and sizes. Our findings reveal the effectiveness, in terms of precision, recall and F-measure, of the suggested heuristics and methods defining the framework to adapt mappings affected by KOS evolution. The obtained results help maintain and improve the quality of mappings over time. The proposed framework can adapt mappings largely automatically, thus facilitating the maintenance task. The implemented algorithms and tools support and minimize the work of users in charge of KOS mapping maintenance. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Deformable segmentation of 3D MR prostate images via distributed discriminative dictionary and ensemble learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Yanrong; Shao, Yeqin; Gao, Yaozong

    Purpose: Automatic prostate segmentation from MR images is an important task in various clinical applications such as prostate cancer staging and MR-guided radiotherapy planning. However, the large appearance and shape variations of the prostate in MR images make the segmentation problem difficult to solve. The traditional Active Shape/Appearance Model (ASM/AAM) has limited accuracy on this problem, since its basic assumption, i.e., that both shape and appearance of the targeted organ follow Gaussian distributions, is invalid in prostate MR images. To this end, the authors propose a sparse dictionary learning method to model the image appearance in a nonparametric fashion and further integrate the appearance model into a deformable segmentation framework for prostate MR segmentation. Methods: To drive the deformable model for prostate segmentation, the authors propose nonparametric appearance and shape models. The nonparametric appearance model is based on a novel dictionary learning method, namely distributed discriminative dictionary (DDD) learning, which is able to capture fine distinctions in image appearance. To increase the differential power of traditional dictionary-based classification methods, the authors' DDD learning approach takes three strategies. First, two dictionaries for prostate and nonprostate tissues are built, respectively, using the discriminative features obtained from minimum redundancy maximum relevance feature selection. Second, linear discriminant analysis is employed as a linear classifier to boost the optimal separation between prostate and nonprostate tissues, based on the representation residuals from sparse representation. Third, to enhance the robustness of the authors' classification method, multiple local dictionaries are learned for local regions along the prostate boundary (each with small appearance variations), instead of learning one global classifier for the entire prostate. These discriminative dictionaries are located on different patches of the prostate surface and trained to adaptively capture the appearance in different prostate zones, thus achieving better local tissue differentiation. For each local region, multiple classifiers are trained based on randomly selected samples and finally assembled by a specific fusion method. In addition to this nonparametric appearance model, a prostate shape model is learned from the shape statistics using a novel approach, sparse shape composition, which can model non-Gaussian distributions of shape variation and regularize the 3D mesh deformation by constraining it within the observed shape subspace. Results: The proposed method has been evaluated on two datasets consisting of T2-weighted MR prostate images. For the first (internal) dataset, the classification effectiveness of the authors' improved dictionary learning has been validated by comparing it with three other variants of traditional dictionary learning methods. The experimental results show that the authors' method yields a Dice Ratio of 89.1% compared to the manual segmentation, which is more accurate than the three state-of-the-art MR prostate segmentation methods under comparison. For the second dataset, the MICCAI 2012 challenge dataset, the authors' proposed method yields a Dice Ratio of 87.4%, which also achieves better segmentation accuracy than other methods under comparison. Conclusions: A new magnetic resonance image prostate segmentation method is proposed based on the combination of deformable model and dictionary learning methods, which achieves more accurate segmentation performance on prostate T2 MR images.
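
    The core idea behind the two-dictionary scheme, classifying a sample by which dictionary represents it with the smaller residual, can be sketched in a few lines. This is a simplification: a least-squares fit stands in for sparse coding, and the two 2-column dictionaries are made-up bases, so the example is illustrative only, not the authors' DDD pipeline.

```python
# Residual-based classification with two class dictionaries: assign the label
# of the dictionary that reconstructs the feature vector with smaller error.
import numpy as np

def residual(D, x):
    """Norm of the part of x that dictionary D cannot represent."""
    coef, *_ = np.linalg.lstsq(D, x, rcond=None)
    return float(np.linalg.norm(x - D @ coef))

D_prostate = np.array([[1.0, 0.0],
                       [0.0, 1.0],
                       [0.0, 0.0]])    # toy basis spanning the x-y feature plane
D_nonprostate = np.array([[0.0, 0.0],
                          [0.0, 0.0],
                          [1.0, 0.5]])  # toy basis spanning the z feature axis

x = np.array([0.9, 1.1, 0.0])  # made-up "prostate-like" feature vector
label = "prostate" if residual(D_prostate, x) < residual(D_nonprostate, x) else "nonprostate"
print(label)  # -> prostate
```

    The paper's refinements (discriminative feature selection, LDA on the residuals, many local dictionaries fused per boundary patch) all build on this same residual-comparison step.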

  14. Surface-illuminant ambiguity and color constancy: effects of scene complexity and depth cues.

    PubMed

    Kraft, James M; Maloney, Shannon I; Brainard, David H

    2002-01-01

    Two experiments were conducted to study how scene complexity and cues to depth affect human color constancy. Specifically, two levels of scene complexity were compared. The low-complexity scene contained two walls with the same surface reflectance and a test patch which provided no information about the illuminant. In addition to the surfaces visible in the low-complexity scene, the high-complexity scene contained two rectangular solid objects and 24 paper samples with diverse surface reflectances. Observers viewed illuminated objects in an experimental chamber and adjusted the test patch until it appeared achromatic. Achromatic settings made under two different illuminants were used to compute an index that quantified the degree of constancy. Two experiments were conducted: one in which observers viewed the stimuli directly, and one in which they viewed the scenes through an optical system that reduced cues to depth. In each experiment, constancy was assessed for two conditions. In the valid-cue condition, many cues provided valid information about the illuminant change. In the invalid-cue condition, some image cues provided invalid information. Four broad conclusions are drawn from the data: (a) constancy is generally better in the valid-cue condition than in the invalid-cue condition; (b) for the stimulus configuration used, increasing image complexity has little effect in the valid-cue condition but leads to increased constancy in the invalid-cue condition; (c) for the stimulus configuration used, reducing cues to depth has little effect for either constancy condition; and (d) there is moderate individual variation in the degree of constancy exhibited, particularly in the degree to which the complexity manipulation affects performance.

  15. Conditionally Increased Acoustic Pressures in Nonfetal Diagnostic Ultrasound Examinations Without Contrast Agents: A Preliminary Assessment

    PubMed Central

    Nightingale, Kathryn R.; Church, Charles C.; Harris, Gerald; Wear, Keith A.; Bailey, Michael R.; Carson, Paul L.; Jiang, Hui; Sandstrom, Kurt L.; Szabo, Thomas L.; Ziskin, Marvin C.

    2016-01-01

    The mechanical index (MI) has been used by the US Food and Drug Administration (FDA) since 1992 for regulatory decisions regarding the acoustic output of diagnostic ultrasound equipment. Its formula is based on predictions of acoustic cavitation under specific conditions. Since its implementation over 2 decades ago, new imaging modes have been developed that employ unique beam sequences exploiting higher-order acoustic phenomena, and, concurrently, studies of the bioeffects of ultrasound under a range of imaging scenarios have been conducted. In 2012, the American Institute of Ultrasound in Medicine Technical Standards Committee convened a working group of its Output Standards Subcommittee to examine and report on the potential risks and benefits of the use of conditionally increased acoustic pressures (CIP) under specific diagnostic imaging scenarios. The term “conditionally” is included to indicate that CIP would be considered on a per-patient basis for the duration required to obtain the necessary diagnostic information. This document is a result of that effort. In summary, a fundamental assumption in the MI calculation is the presence of a preexisting gas body. For tissues not known to contain preexisting gas bodies, based on theoretical predictions and experimentally reported cavitation thresholds, we find this assumption to be invalid. We thus conclude that exceeding the recommended maximum MI level given in the FDA guidance could be warranted without concern for increased risk of cavitation in these tissues. However, there is limited literature assessing the potential clinical benefit of exceeding the MI guidelines in these tissues. The report proposes a 3-tiered approach for CIP that follows the model for employing elevated output in magnetic resonance imaging and concludes with summary recommendations to facilitate Institutional Review Board (IRB)-monitored clinical studies investigating CIP in specific tissues. PMID:26112617

  16. First Spectra of O Stars in R136A

    NASA Astrophysics Data System (ADS)

    Heap, Sara

    1994-01-01

    Hubble images of the cluster, R136a, in the LMC indicate that the cluster contains 3 Wolf-Rayet stars, R136a1, -a2, and -a3 (Campbell et al. 1992) and numerous O and B-type stars. Although models for WR stars are not well enough developed to infer the basic parameters of the 3 WR stars in R136a, models for O stars are well established, and they suggest that the O stars in R136a are relatively normal, having initial masses no higher than 60 Msun (Heap et al. 1992, Malumuth & Heap 1992, di Marchi et al. 1993); there are no unusual "super-massive" stars in R136a. With HST/GHRS/CoSTAR, it will be possible to obtain spectra of an O star in R136a without contamination by WR stars. These spectra will be able to confirm or invalidate the photometric results. Thus, these spectra will have implications both for the population of R136a and for the validity of stellar population studies of giant extragalactic HII regions and starbursts that are based entirely on photometry.

  17. Optimal harvesting policy of a stochastic two-species competitive model with Lévy noise in a polluted environment

    NASA Astrophysics Data System (ADS)

    Zhao, Yu; Yuan, Sanling

    2017-07-01

    Since sudden environmental shocks and toxicants are known to affect the population dynamics of fish species, a mechanistic understanding of how sudden environmental change and toxicants influence the optimal harvesting policy requires development. This paper presents the optimal harvesting of a stochastic two-species competitive model with Lévy noise in a polluted environment, where the Lévy noise is used to describe the sudden climate change. Due to the discontinuity of the Lévy noise, the classical optimal harvesting methods based on the explicit solution of the corresponding Fokker-Planck equation are invalid. The object of this paper is to fill this gap and establish the optimal harvesting policy. By using aggregation and ergodic methods, the approximation of the optimal harvesting effort and the maximum expectation of sustainable yields are obtained. Numerical simulations are carried out to support these theoretical results. Our analysis shows that the Lévy noise and the mean stress measure of toxicant in organisms may affect the optimal harvesting policy significantly.

  18. Heuristics can produce surprisingly rational probability estimates: Comment on Costello and Watts (2014).

    PubMed

    Nilsson, Håkan; Juslin, Peter; Winman, Anders

    2016-01-01

    Costello and Watts (2014) present a model assuming that people's knowledge of probabilities adheres to probability theory, but that their probability judgments are perturbed by a random noise in the retrieval from memory. Predictions for the relationships between probability judgments for constituent events and their disjunctions and conjunctions, as well as for sums of such judgments, were derived from probability theory. Costello and Watts (2014) report behavioral data showing that subjective probability judgments accord with these predictions. Based on the finding that subjective probability judgments follow probability theory, Costello and Watts (2014) conclude that the results imply that people's probability judgments embody the rules of probability theory and thereby refute theories of heuristic processing. Here, we demonstrate the invalidity of this conclusion by showing that all of the tested predictions follow straightforwardly from an account assuming heuristic probability integration (Nilsson, Winman, Juslin, & Hansson, 2009). We end with a discussion of a number of previous findings that harmonize very poorly with the predictions by the model suggested by Costello and Watts (2014).
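The identity at stake in this debate is the addition law P(A) + P(B) = P(A∨B) + P(A∧B), which survives symmetric read-out noise because each judged probability is pulled toward the noise rate d by the same linear transformation. A toy Monte-Carlo sketch of a simple noisy-counting model (an illustration, not code from either paper):

```python
import random

random.seed(0)

def judged(events, query, d=0.25, n=200_000):
    """Noisy-counting estimate of P(query): sample items from memory and
    read each item's truth value incorrectly with probability d."""
    hits = 0
    for _ in range(n):
        item = random.choice(events)
        truth = query(item)
        if random.random() < d:      # symmetric read-out noise
            truth = not truth
        hits += truth
    return hits / n

# Memory items as (A, B) truth pairs: P(A)=0.6, P(B)=0.5, P(A and B)=0.3
events = [(1, 1)] * 3 + [(1, 0)] * 3 + [(0, 1)] * 2 + [(0, 0)] * 2

p_a   = judged(events, lambda e: e[0] == 1)
p_b   = judged(events, lambda e: e[1] == 1)
p_or  = judged(events, lambda e: e[0] == 1 or e[1] == 1)
p_and = judged(events, lambda e: e[0] == 1 and e[1] == 1)

# Each estimate is biased toward d, yet the addition law
# P(A) + P(B) = P(A or B) + P(A and B) survives the noise in expectation.
print(abs((p_a + p_b) - (p_or + p_and)))  # close to 0, up to sampling error
```

Since E[judged(q)] = (1 − 2d)·P(q) + d, both sides of the addition law pick up the same bias, which is why the pattern alone cannot separate the two accounts.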

  19. Hume, Mill, Hill, and the Sui Generis Epidemiologic Approach to Causal Inference

    PubMed Central

    Morabia, Alfredo

    2013-01-01

    The epidemiologic approach to causal inference (i.e., Hill's viewpoints) consists of evaluating potential causes from the following 2, noncumulative angles: 1) established results from comparative, observational, or experimental epidemiologic studies; and 2) reviews of nonepidemiologic evidence. It does not involve statements of statistical significance. The philosophical roots of Hill's viewpoints are unknown. Superficially, they seem to descend from the ideas of Hume and Mill. Hill's viewpoints, however, use a different kind of evidence and have different purposes than do Hume's rules or Mill's system of logic. In a nutshell, Hume ignores comparative evidence central to Hill's viewpoints. Mill's logic disqualifies as invalid nonexperimental evidence, which forms the bulk of epidemiologic findings reviewed from Hill's viewpoints. The approaches by Hume and Mill cannot corroborate successful implementations of Hill's viewpoints. Besides Hume and Mill, the epidemiologic literature is clueless about a plausible, pre-1965 philosophical origin of Hill's viewpoints. Thus, Hill's viewpoints may be philosophically novel, sui generis, still waiting to be validated and justified. PMID:24071010

  20. Hume, Mill, Hill, and the sui generis epidemiologic approach to causal inference.

    PubMed

    Morabia, Alfredo

    2013-11-15

    The epidemiologic approach to causal inference (i.e., Hill's viewpoints) consists of evaluating potential causes from the following 2, noncumulative angles: 1) established results from comparative, observational, or experimental epidemiologic studies; and 2) reviews of nonepidemiologic evidence. It does not involve statements of statistical significance. The philosophical roots of Hill's viewpoints are unknown. Superficially, they seem to descend from the ideas of Hume and Mill. Hill's viewpoints, however, use a different kind of evidence and have different purposes than do Hume's rules or Mill's system of logic. In a nutshell, Hume ignores comparative evidence central to Hill's viewpoints. Mill's logic disqualifies as invalid nonexperimental evidence, which forms the bulk of epidemiologic findings reviewed from Hill's viewpoints. The approaches by Hume and Mill cannot corroborate successful implementations of Hill's viewpoints. Besides Hume and Mill, the epidemiologic literature is clueless about a plausible, pre-1965 philosophical origin of Hill's viewpoints. Thus, Hill's viewpoints may be philosophically novel, sui generis, still waiting to be validated and justified.

  1. ScaleNet: a literature-based model of scale insect biology and systematics.

    PubMed

    García Morales, Mayrolin; Denno, Barbara D; Miller, Douglass R; Miller, Gary L; Ben-Dov, Yair; Hardy, Nate B

    2016-01-01

    Scale insects (Hemiptera: Coccoidea) are small herbivorous insects found on all continents except Antarctica. They are extremely invasive, and many species are serious agricultural pests. They are also emerging models for studies of the evolution of genetic systems, endosymbiosis and plant-insect interactions. ScaleNet was launched in 1995 to provide insect identifiers, pest managers, insect systematists, evolutionary biologists and ecologists efficient access to information about scale insect biological diversity. It provides comprehensive information on scale insects taken directly from the primary literature. Currently, it draws from 23,477 articles and describes the systematics and biology of 8194 valid species. For 20 years, ScaleNet ran on the same software platform. That platform is no longer viable. Here, we present a new, open-source implementation of ScaleNet. We have normalized the data model, begun the process of correcting invalid data, upgraded the user interface, and added online administrative tools. These improvements make ScaleNet easier to use and maintain and make the ScaleNet data more accurate and extendable. Database URL: http://scalenet.info. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.

  2. Statistical studies of animal response data from USF toxicity screening test method

    NASA Technical Reports Server (NTRS)

    Hilado, C. J.; Machado, A. M.

    1978-01-01

    Statistical examination of animal response data obtained using Procedure B of the USF toxicity screening test method indicates that the data deviate only slightly from a normal or Gaussian distribution. This slight departure from normality is not expected to invalidate conclusions based on theoretical statistics. Comparison of times to staggering, convulsions, collapse, and death as endpoints shows that time to death appears to be the most reliable endpoint because it offers the lowest probability of missed observations and premature judgements.

  3. A neural network-based estimator for the mixture ratio of the Space Shuttle Main Engine

    NASA Astrophysics Data System (ADS)

    Guo, T. H.; Musgrave, J.

    1992-11-01

    In order to properly utilize the available fuel and oxidizer of a liquid propellant rocket engine, the mixture ratio is closed loop controlled during main stage (65 percent - 109 percent power) operation. However, because of the lack of flight-capable instrumentation for measuring mixture ratio, the value of mixture ratio in the control loop is estimated using available sensor measurements such as the combustion chamber pressure and the volumetric flow, and the temperature and pressure at the exit duct on the low pressure fuel pump. This estimation scheme has two limitations. First, the estimation formula is based on an empirical curve fitting which is accurate only within a narrow operating range. Second, the mixture ratio estimate relies on a few sensor measurements and loss of any of these measurements will make the estimate invalid. In this paper, we propose a neural network-based estimator for the mixture ratio of the Space Shuttle Main Engine. The estimator is an extension of a previously developed neural network based sensor failure detection and recovery algorithm (sensor validation). This neural network uses an auto associative structure which utilizes the redundant information of dissimilar sensors to detect inconsistent measurements. Two approaches have been identified for synthesizing mixture ratio from measurement data using a neural network. The first approach uses an auto associative neural network for sensor validation which is modified to include the mixture ratio as an additional output. The second uses a new network for the mixture ratio estimation in addition to the sensor validation network. Although mixture ratio is not directly measured in flight, it is generally available in simulation and in test bed firing data from facility measurements of fuel and oxidizer volumetric flows. The pros and cons of these two approaches will be discussed in terms of robustness to sensor failures and accuracy of the estimate during typical transients using simulation data.
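The auto-associative idea described above can be sketched with a linear stand-in: a PCA bottleneck plays the role of the auto-associative network, the reconstruction residual flags inconsistent sensor readings, and the mixture ratio is regressed on the bottleneck code as the "additional output". All data, gains, and noise levels below are synthetic placeholders, not SSME values:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for engine data (hypothetical numbers): four correlated
# "sensor" channels driven by a hidden operating point, plus a mixture ratio
# that depends on the same point.
op = rng.uniform(0.65, 1.09, size=500)                 # power-level surrogate
gains = np.array([1.0, -0.5, 2.0, 0.8])                # sensor sensitivities
sensors = np.outer(op, gains) + 0.01 * rng.normal(size=(500, 4))
mix_ratio = 6.0 * op + 0.02 * rng.normal(size=500)

# Auto-associative core, reduced to a 1-D PCA bottleneck: encode the sensor
# vector, decode it back, and use the reconstruction residual for validation.
mu = sensors.mean(axis=0)
_, _, vt = np.linalg.svd(sensors - mu, full_matrices=False)
pc = vt[0]                                             # principal direction

codes = (sensors - mu) @ pc
slope, intercept = np.polyfit(codes, mix_ratio, 1)     # extra "output" head

def estimate(s):
    """Return (mixture-ratio estimate, sensor-consistency residual)."""
    code = (s - mu) @ pc
    recon_err = np.linalg.norm(mu + code * pc - s)     # large => suspect sensor
    return slope * code + intercept, recon_err

mr, err = estimate(sensors[0])
bad = sensors[0].copy()
bad[0] += 1.0                                          # simulate a failed sensor
mr_bad, err_bad = estimate(bad)
print(err < err_bad)  # → True: inconsistent reading is flagged
```

The nonlinear network in the paper serves the same two roles at once: redundancy across dissimilar sensors makes inconsistent measurements reconstruct poorly, while the learned code carries enough information to synthesize the unmeasured mixture ratio.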

  4. A neural network-based estimator for the mixture ratio of the Space Shuttle Main Engine

    NASA Technical Reports Server (NTRS)

    Guo, T. H.; Musgrave, J.

    1992-01-01

    In order to properly utilize the available fuel and oxidizer of a liquid propellant rocket engine, the mixture ratio is closed loop controlled during main stage (65 percent - 109 percent power) operation. However, because of the lack of flight-capable instrumentation for measuring mixture ratio, the value of mixture ratio in the control loop is estimated using available sensor measurements such as the combustion chamber pressure and the volumetric flow, and the temperature and pressure at the exit duct on the low pressure fuel pump. This estimation scheme has two limitations. First, the estimation formula is based on an empirical curve fitting which is accurate only within a narrow operating range. Second, the mixture ratio estimate relies on a few sensor measurements and loss of any of these measurements will make the estimate invalid. In this paper, we propose a neural network-based estimator for the mixture ratio of the Space Shuttle Main Engine. The estimator is an extension of a previously developed neural network based sensor failure detection and recovery algorithm (sensor validation). This neural network uses an auto associative structure which utilizes the redundant information of dissimilar sensors to detect inconsistent measurements. Two approaches have been identified for synthesizing mixture ratio from measurement data using a neural network. The first approach uses an auto associative neural network for sensor validation which is modified to include the mixture ratio as an additional output. The second uses a new network for the mixture ratio estimation in addition to the sensor validation network. Although mixture ratio is not directly measured in flight, it is generally available in simulation and in test bed firing data from facility measurements of fuel and oxidizer volumetric flows. The pros and cons of these two approaches will be discussed in terms of robustness to sensor failures and accuracy of the estimate during typical transients using simulation data.

  5. Vaccination coverage survey in Dhaka District.

    PubMed

    Khan, M N A; Rahman, M L; Awal Miah, A; Islam, M S; Musa, S A J M; Tofail, F

    2005-08-01

    A survey was conducted in Dhaka District to measure the level of routine immunization coverage of children (12-23 months), to assess the tetanus toxoid (TT) immunization coverage among mothers of children (12-23 months), and to evaluate EPI program continuity (dropout rates) and quality (percent of invalid doses, vaccination card availability, etc.). For this purpose, a thirty-cluster cross-sectional survey was conducted in October 2002 to assess the immunization coverage in Dhaka. In this survey, 30 clusters were randomly selected from a list of villages in 63 Unions of Dhaka following the probability proportion to size (PPS) sampling procedure. A total of 210 children were studied using a pre-tested structured questionnaire. Descriptive statistics were employed using the SPSS software package for data analysis. The study showed that the routine immunization coverage in Dhaka among children by 12 months of age by card + history was 97% for BCG, 97% for Diphtheria, Pertussis, Tetanus (DPT 1) and Oral Polio Vaccine (OPV 1), 75% for DPT3 and OPV3, and 67% for measles. Sixty-six percent of all children surveyed had received valid doses of all vaccines by 12 months (fully immunized child). Programme access as measured by crude DPT1 coverage was better in Keranigonj (97%). The vaccination card retention rate for children was 84%. Invalid DPT (1, 2 or 3) doses were given to 25% of vaccinated children; 18% of measles doses were invalid. Surprisingly, the major cause of invalid doses was not early immunization or lost cards, but the practice of marking the card with a tick instead of writing a valid date. DPT1-DPT3 and DPT1-measles dropout rates were 5% and 13%, respectively. The major reason parents gave for never vaccinating their children (zero-dose children) was (43%); the major reason for incomplete vaccination was lack of knowledge regarding subsequent doses (46%). TT surveys were also conducted for mothers of the children surveyed for vaccination coverage (mothers between 15 and 49 years old). Valid TT 1-5 coverage by card + history was 97%, 55%, 44%, 24% and 11%, respectively. The card retention rate for TT was 67%. The findings of this study revealed that access to child and TT immunizations was good, but high dropouts and invalid doses reduced the percentage of fully immunized children to 66%. A programmatic strategy must be undertaken to reduce the existing high dropout rate in both child and TT immunizations.

  6. Comment on "Continuum Lowering and Fermi-Surface Rising in Strongly Coupled and Degenerate Plasmas"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iglesias, C. A.; Sterne, P. A.

    In a recent Letter, Hu [1] reported photon absorption cross sections in strongly coupled, degenerate plasmas from quantum molecular dynamics (QMD). The Letter claims that the K-edge shifts as a function of plasma density computed with simple ionization potential depression (IPD) models are in violent disagreement with the QMD results. The QMD calculations displayed an increase in K-edge shift with increasing density while the simpler models yielded a decrease. Here, this Comment shows that the claimed large errors reported by Hu for the widely used Stewart-Pyatt (SP) model [2] stem from an invalid comparison of disparate physical quantities and are largely resolved by including well-known corrections for degenerate systems.

  7. Comment on "Continuum Lowering and Fermi-Surface Rising in Strongly Coupled and Degenerate Plasmas"

    DOE PAGES

    Iglesias, C. A.; Sterne, P. A.

    2018-03-16

    In a recent Letter, Hu [1] reported photon absorption cross sections in strongly coupled, degenerate plasmas from quantum molecular dynamics (QMD). The Letter claims that the K-edge shifts as a function of plasma density computed with simple ionization potential depression (IPD) models are in violent disagreement with the QMD results. The QMD calculations displayed an increase in K-edge shift with increasing density while the simpler models yielded a decrease. Here, this Comment shows that the claimed large errors reported by Hu for the widely used Stewart-Pyatt (SP) model [2] stem from an invalid comparison of disparate physical quantities and are largely resolved by including well-known corrections for degenerate systems.

  8. CRISPR/Cas9 mutagenesis invalidates a putative cancer dependency targeted in on-going clinical trials.

    PubMed

    Lin, Ann; Giuliano, Christopher J; Sayles, Nicole M; Sheltzer, Jason M

    2017-03-24

    The Maternal Embryonic Leucine Zipper Kinase (MELK) has been reported to be a genetic dependency in several cancer types. MELK RNAi and small-molecule inhibitors of MELK block the proliferation of various cancer cell lines, and MELK knockdown has been described as particularly effective against the highly-aggressive basal/triple-negative subtype of breast cancer. Based on these preclinical results, the MELK inhibitor OTS167 is currently being tested as a novel chemotherapy agent in several clinical trials. Here, we report that mutagenizing MELK with CRISPR/Cas9 has no effect on the fitness of basal breast cancer cell lines or cell lines from six other cancer types. Cells that harbor null mutations in MELK exhibit wild-type doubling times, cytokinesis, and anchorage-independent growth. Furthermore, MELK-knockout lines remain sensitive to OTS167, suggesting that this drug blocks cell division through an off-target mechanism. In total, our results undermine the rationale for a series of current clinical trials and provide an experimental approach for the use of CRISPR/Cas9 in preclinical target validation that can be broadly applied.

  9. Optimization Algorithm for Kalman Filter Exploiting the Numerical Characteristics of SINS/GPS Integrated Navigation Systems.

    PubMed

    Hu, Shaoxing; Xu, Shike; Wang, Duhu; Zhang, Aiwu

    2015-11-11

    Aiming at addressing the problem of the high computational cost of the traditional Kalman filter in SINS/GPS, a practical optimization algorithm with offline-derivation and parallel processing methods based on the numerical characteristics of the system is presented in this paper. The algorithm exploits the sparseness and/or symmetry of matrices to simplify the computational procedure. Thus, many invalid operations can be avoided by offline derivation using a block matrix technique. For enhanced efficiency, a new parallel computational mechanism is established by subdividing and restructuring calculation processes after analyzing the extracted "useful" data. As a result, the algorithm saves about 90% of the CPU processing time and 66% of the memory usage needed in a classical Kalman filter. Meanwhile, as a numerical approach, the method needs no precision-losing transformation or approximation of the system model, and the accuracy suffers little in comparison with the filter before computational optimization. Furthermore, since no complicated matrix theories are needed, the algorithm can be easily transplanted into other modified filters as a secondary optimization method to achieve further efficiency.
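The kind of saving described, skipping the work dense matrix algebra wastes on structural zeros, can be illustrated with block-wise covariance propagation for a block-diagonal transition matrix. This is an illustrative sketch of the general idea, not the paper's derivation, and the block sizes are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Block-diagonal transition matrix, as when error states decouple into
# independent groups (an illustrative stand-in for a SINS/GPS model).
blocks = [rng.normal(size=(3, 3)), rng.normal(size=(2, 2)), rng.normal(size=(4, 4))]
n = sum(b.shape[0] for b in blocks)

F = np.zeros((n, n))
offsets, o = [], 0
for b in blocks:
    k = b.shape[0]
    F[o:o + k, o:o + k] = b
    offsets.append((o, k))
    o += k

A = rng.normal(size=(n, n))
P = A @ A.T                                   # SPD covariance
Q = 0.01 * np.eye(n)                          # process noise

# Dense covariance propagation, P' = F P F^T + Q, is O(n^3) and multiplies
# through all the structural zeros of F.
P_dense = F @ P @ F.T + Q

# Block-wise propagation touches only the nonzero blocks, because
# (F P F^T)[i, j] = F_ii @ P[i, j] @ F_jj^T for block-diagonal F.
P_block = Q.copy()
for (oi, ki), bi in zip(offsets, blocks):
    for (oj, kj), bj in zip(offsets, blocks):
        P_block[oi:oi + ki, oj:oj + kj] += bi @ P[oi:oi + ki, oj:oj + kj] @ bj.T

print(np.allclose(P_dense, P_block))  # → True
```

The paper's offline derivation amounts to deciding, once, which of these sub-products are structurally zero or symmetric, so the online filter never computes them.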

  10. Ground Data System Analysis Tools to Track Flight System State Parameters for the Mars Science Laboratory (MSL) and Beyond

    NASA Technical Reports Server (NTRS)

    Allard, Dan; Deforrest, Lloyd

    2014-01-01

    Flight software parameters enable space mission operators fine-tuned control over flight system configurations, enabling rapid and dynamic changes to ongoing science activities in a much more flexible manner than can be accomplished with (otherwise broadly used) configuration file based approaches. The Mars Science Laboratory (MSL), Curiosity, makes extensive use of parameters to support complex, daily activities via commanded changes to said parameters in memory. However, as the loss of Mars Global Surveyor (MGS) in 2006 demonstrated, flight system management by parameters brings with it risks, including the possibility of losing track of the flight system configuration and the threat of invalid command executions. To mitigate this risk, a growing number of missions have funded efforts to implement parameter state tracking software tools and services, including MSL and the Soil Moisture Active Passive (SMAP) mission. This paper will discuss the engineering challenges and resulting software architecture of MSL's onboard parameter state tracking software and discuss the road forward to make parameter management tools suitable for use on multiple missions.

  11. Confident Surgical Decision Making in Temporal Lobe Epilepsy by Heterogeneous Classifier Ensembles

    PubMed Central

    Fakhraei, Shobeir; Soltanian-Zadeh, Hamid; Jafari-Khouzani, Kourosh; Elisevich, Kost; Fotouhi, Farshad

    2015-01-01

    In medical domains with low tolerance for invalid predictions, classification confidence is highly important and traditional performance measures such as overall accuracy cannot provide adequate insight into classification reliability. In this paper, a confident-prediction rate (CPR) which measures the upper limit of confident predictions has been proposed based on receiver operating characteristic (ROC) curves. It has been shown that a heterogeneous ensemble of classifiers improves this measure. This ensemble approach has been applied to lateralization of focal epileptogenicity in temporal lobe epilepsy (TLE) and prediction of surgical outcomes. A goal of this study is to reduce the extraoperative electrocorticography (eECoG) requirement, which is the practice of using electrodes placed directly on the exposed surface of the brain. We have shown that such a goal is achievable with application of data mining techniques. Furthermore, all TLE surgical operations do not result in complete relief from seizures and it is not always possible for human experts to identify such unsuccessful cases prior to surgery. This study demonstrates the capability of data mining techniques in prediction of undesirable outcome for a portion of such cases. PMID:26609547

  12. Search for Effects of an Electrostatic Potential on Clocks in the Frame of Reference of a Charged Particle

    NASA Technical Reports Server (NTRS)

    Ringermacher, Harry I.; Conradi, Mark S.; Cassenti, Brice

    2005-01-01

    Results of experiments to confirm a theory that links classical electromagnetism with the geometry of spacetime are described. The theory, based on the introduction of a Torsion tensor into Einstein's equations and following the approach of Schroedinger, predicts effects on clocks attached to charged particles, subject to intense electric fields, analogous to the effects on clocks in a gravitational field. We show that in order to interpret this theory, one must re-interpret all clock changes, both gravitational and electromagnetic, as arising from changes in potential energy and not merely potential. The clock is provided naturally by proton spins in hydrogen atoms subject to Nuclear Magnetic Resonance trials. No frequency change of clocks was observed to a resolution of 6 x 10(exp -9). A new "Clock Principle" was postulated to explain the null result. There are two possible implications of the experiments: (a) The Clock Principle is invalid and, in fact, no metric theory incorporating electromagnetism is possible; (b) The Clock Principle is valid and it follows that a negative rest mass cannot exist.

  13. Performance goals in conflictual social interactions: towards the distinction between two modes of relational conflict regulation.

    PubMed

    Sommet, Nicolas; Darnon, Céline; Mugny, Gabriel; Quiamzade, Alain; Pulfrey, Caroline; Dompnier, Benoît; Butera, Fabrizio

    2014-03-01

    Socio-cognitive conflict has been defined as a situation of confrontation with a disagreeing other. Previous research suggests that individuals can regulate conflict in a relational way, namely by focusing on social comparison between relative levels of competences. Relational conflict regulation has been described as yielding particularly negative effects on social interactions and learning, but has been understudied. The present research addresses the question of the origin of relational conflict regulation by introducing a fundamental distinction between two types of regulation, one based on the affirmation of one's own point of view and the invalidation of the other's (i.e., 'competitive' regulation), the other corresponding to the protection of self-competence via compliance (i.e., 'protective' regulation). Three studies show that these modes of relational conflict regulation result from the endorsement of distinct performance goals, respectively, performance-approach goals (trying to outperform others) and performance-avoidance goals (avoiding performing more poorly than others). Theoretical implications for the literature on both conflict regulation and achievement goals are discussed. © 2012 The British Psychological Society.

  14. A two-step framework for reconstructing remotely sensed land surface temperatures contaminated by cloud

    NASA Astrophysics Data System (ADS)

    Zeng, Chao; Long, Di; Shen, Huanfeng; Wu, Penghai; Cui, Yaokui; Hong, Yang

    2018-07-01

    Land surface temperature (LST) is one of the most important parameters in land surface processes. Although satellite-derived LST can provide valuable information, the value is often limited by cloud contamination. In this paper, a two-step satellite-derived LST reconstruction framework is proposed. First, a multi-temporal reconstruction algorithm is introduced to recover invalid LST values using multiple LST images with reference to the corresponding remotely sensed vegetation index. Then, all cloud-contaminated areas are temporally filled with hypothetical clear-sky LST values. Second, a surface energy balance equation-based procedure is used to correct for the filled values. With shortwave irradiation data, the clear-sky LST is corrected to the real LST under cloudy conditions. A series of experiments have been performed to demonstrate the effectiveness of the developed approach. Quantitative evaluation results indicate that the proposed method can recover LST for different surface types with mean average errors of 3-6 K. The experiments also indicate that the time interval between the multi-temporal LST images has a greater impact on the results than the size of the contaminated area.
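The first step of such a framework can be sketched as filling cloud-masked pixels from a temporally close reference image via a regression fitted on the clear pixels. The scene, the affine radiometric model, and all noise levels below are invented for illustration, and the energy-balance correction of step two is omitted:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy scene (all numbers invented): a "true" LST field and a temporally close
# reference image related to it by an affine radiometric change plus noise.
true_lst = 290.0 + 5.0 * rng.random((40, 40))
reference = 1.02 * true_lst - 4.0 + 0.2 * rng.normal(size=(40, 40))

# A cloud mask invalidates a patch of the target acquisition.
target = true_lst.copy()
mask = np.zeros(target.shape, dtype=bool)
mask[10:25, 10:25] = True
target[mask] = np.nan

# Step 1 (multi-temporal fill): fit target ~ a*reference + b on clear pixels,
# then predict hypothetical clear-sky values under the cloud. (Step 2, the
# energy-balance correction to cloudy-sky LST, is omitted in this sketch.)
a, b = np.polyfit(reference[~mask].ravel(), target[~mask].ravel(), 1)
filled = target.copy()
filled[mask] = a * reference[mask] + b

print(np.abs(filled[mask] - true_lst[mask]).mean())  # mean absolute error, K
```

The paper's algorithm additionally conditions the fill on a vegetation index and then corrects the hypothetical clear-sky values toward real cloudy-sky LST using shortwave irradiation, which this sketch does not attempt.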

  15. Characterization and Modeling of High Power Microwave Effects in CMOS Microelectronics

    DTIC Science & Technology

    2010-01-01

    [Figure: noise margin measurement] Any voltage above the line marked VIH is considered a valid logic high on the input of the gate. VIH and VIL are defined... can handle any voltage noise level at the input up to VIL without changing state. The region in between VIL and VIH is considered an invalid logic... [Table 2.2: Intrinsic device characteristics derived from SPECTRE simulations; columns: VIH (V), VIL (V), High Noise Margin (V), Low Noise Margin (V)]
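The noise margins named in the table follow the standard static definitions, NMH = VOH − VIH and NML = VIL − VOL; any input between VIL and VIH is an invalid logic level. A small sketch with hypothetical voltage levels (not values taken from the report):

```python
def noise_margins(vol, vil, vih, voh):
    """Static noise margins from the four static-discipline levels:
    NMH = VOH - VIH (high-state margin), NML = VIL - VOL (low-state margin).
    Inputs between VIL and VIH fall in the invalid region."""
    return voh - vih, vil - vol

# Illustrative CMOS-like levels (hypothetical, not from the report)
nmh, nml = noise_margins(vol=0.1, vil=0.8, vih=1.7, voh=2.4)
print(nmh, nml)
```

A gate with these levels tolerates up to NMH of noise on a valid high and NML on a valid low before its input drifts into the invalid region.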

  16. An Enhanced Vacuum Cure Technique for On-Aircraft Repair of Carbon-Bismaleimide Composites

    NASA Astrophysics Data System (ADS)

    Rider, Andrew N.; Baker, Alan A.; Wang, Chun H.; Smith, Graeme

    2011-06-01

    Carbon/bismaleimide (BMI) composite is increasingly employed in critical load carrying aircraft structures designed to operate at temperatures approaching 180°C. The high post-cure temperature (above 220°C) required to fully react the BMI resin, however, renders existing on-aircraft prepreg or wet layup repair methods invalid. This paper presents a new on-aircraft repair technique for carbon/BMI composites. The composite prepregs are first warm-staged to improve the ability to evacuate entrapped air. Then the patch is cured in the scarf cavity using the vacuum bag technique, followed by off-aircraft post-cure. The fully cured patch then can be bonded using a structural adhesive.

  17. Tutorial on Biostatistics: Linear Regression Analysis of Continuous Correlated Eye Data.

    PubMed

    Ying, Gui-Shuang; Maguire, Maureen G; Glynn, Robert; Rosner, Bernard

    2017-04-01

    To describe and demonstrate appropriate linear regression methods for analyzing correlated continuous eye data. We describe several approaches to regression analysis involving both eyes, including mixed effects and marginal models under various covariance structures to account for inter-eye correlation. We demonstrate, with SAS statistical software, applications in a study comparing baseline refractive error between one eye with choroidal neovascularization (CNV) and the unaffected fellow eye, and in a study determining factors associated with visual field in the elderly. When refractive error from both eyes was analyzed with standard linear regression without accounting for inter-eye correlation (adjusting for demographic and ocular covariates), the difference between eyes with CNV and fellow eyes was 0.15 diopters (D; 95% confidence interval, CI -0.03 to 0.32D, p = 0.10). Using a mixed effects model or a marginal model, the estimated difference was the same but with a narrower 95% CI (0.01 to 0.28D, p = 0.03). Standard regression for visual field data from both eyes provided biased estimates of standard error (generally underestimated) and smaller p-values, while analysis of the worse eye provided larger p-values than mixed effects models and marginal models. In research involving both eyes, ignoring inter-eye correlation can lead to invalid inferences. Analysis using only right or left eyes is valid, but decreases power. Worse-eye analysis can provide less power and biased estimates of effect. Mixed effects or marginal models using the eye as the unit of analysis should be used to appropriately account for inter-eye correlation and maximize power and precision.
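The warning about underestimated standard errors can be reproduced with a small simulation: when two eyes share a patient-level effect, the naive SE that treats all eyes as independent comes out smaller than a cluster-aware SE computed from per-patient summaries. The sample sizes and variance components below are illustrative, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate 200 patients with two eyes each; a shared patient effect makes
# the two eyes of a patient correlated (illustrative numbers only).
n_patients = 200
patient_effect = rng.normal(0.0, 1.0, size=n_patients)
eyes = patient_effect[:, None] + rng.normal(0.0, 0.5, size=(n_patients, 2))

flat = eyes.ravel()

# Naive SE treats the 400 eyes as independent observations...
se_naive = flat.std(ddof=1) / np.sqrt(flat.size)

# ...while a cluster-aware SE built from one summary per patient respects
# the inter-eye correlation, and comes out larger.
patient_means = eyes.mean(axis=1)
se_cluster = patient_means.std(ddof=1) / np.sqrt(n_patients)

print(se_naive < se_cluster)  # → True: the naive analysis overstates precision
```

Mixed effects and marginal models achieve the same correction in a regression setting while still using every eye as the unit of analysis, which is why the tutorial recommends them over worse-eye or one-eye analyses.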

  18. Analysis of health economics assessment reports for pharmaceuticals in France – understanding the underlying philosophy of CEESP assessment

    PubMed Central

    Toumi, Mondher; Motrunich, Anastasiia; Millier, Aurélie; Rémuzat, Cécile; Chouaid, Christos; Falissard, Bruno; Aballéa, Samuel

    2017-01-01

    Background: Despite the guidelines for Economic and Public Health Assessment Committee (CEESP) submission having been available for nearly six years, submitted dossiers continue to deviate from them, potentially impacting product prices. Objective: To review the reports published by CEESP, analyse deviations from the guidelines, and discuss their implications for the pricing and reimbursement process. Study design: CEESP reports published up to January 2017 were reviewed, and deviations from the guidelines were extracted. The frequency of deviations was described by type of methodological concern (minor, important or major). Results: In 19 reports, we identified 243 methodological concerns, most often related to modelling, measurement and valuation of health states, and presentation of results and sensitivity analyses; nearly 63% were minor, 33% important and 4.5% major. All reports included minor methodological concerns, and 17 (89%) included at least one important and/or major methodological concern. Global major methodological concerns completely invalidated the analysis in seven dossiers (37%). Conclusion: CEESP submission dossiers fail to adhere to the guidelines, potentially invalidating the health economics analysis and weakening the position of manufacturers in the ensuing pricing negotiations. As these negotiations tend to be unfavourable for the manufacturer, the industry should strive to improve the quality of the analyses submitted to CEESP. PMID:28804600

  19. Inconsistent Responding in a Criminal Forensic Setting: An Evaluation of the VRIN-r and TRIN-r Scales of the MMPI-2-RF.

    PubMed

    Gu, Wen; Reddy, Hima B; Green, Debbie; Belfi, Brian; Einzig, Shanah

    2017-01-01

    Criminal forensic evaluations are complicated by the risk that examinees will respond in an unreliable manner. Unreliable responding can occur due to lack of personal investment in the evaluation, severe mental illness, or low cognitive abilities. In this study, 31% of Minnesota Multiphasic Personality Inventory-2 Restructured Form (MMPI-2-RF; Ben-Porath & Tellegen, 2008/2011) profiles were invalid due to random or fixed responding (T score ≥ 80 on the VRIN-r or TRIN-r scales) in a sample of pretrial criminal defendants evaluated in the context of treatment for competency restoration. Hierarchical regression models showed that symptom exaggeration variables, as measured by inconsistently reported psychiatric symptoms, contributed over and above education and intellectual functioning to the prediction of both random and fixed responding. Psychopathology variables, as measured by mood disturbance, better predicted fixed responding after controlling for estimates of cognitive abilities, but did not improve the prediction of random responding. These findings suggest that random and fixed responding are affected not only by education and intellectual functioning but also by intentional exaggeration and aspects of psychopathology. Measures of intellectual functioning, effort, and response style should be considered for administration in conjunction with self-report personality measures to rule out rival hypotheses for invalid profiles.

  20. Developing self-concept instrument for pre-service mathematics teachers

    NASA Astrophysics Data System (ADS)

    Afgani, M. W.; Suryadi, D.; Dahlan, J. A.

    2018-01-01

    This study aimed to develop a self-concept instrument for undergraduate students of mathematics education in Palembang, Indonesia. The study was development research of a non-test instrument in questionnaire form. Validity was assessed through construct validity testing using the Pearson product-moment correlation and factor analysis, while reliability was assessed using Cronbach's alpha. The instrument was tested on 65 undergraduate students of mathematics education at one of the universities in Palembang, Indonesia. It consisted of 43 items covering 7 aspects of self-concept: individual concern, social identity, individual personality, view of the future, the influence of others who become role models, the influence of the environment inside or outside the classroom, and view of mathematics. The validity test identified one invalid item, belonging to the social identity aspect, because its Pearson's r of 0.107 was below the critical value (0.244; α = 0.05). After the invalid item was removed, construct validity testing with factor analysis yielded a single factor. The Kaiser-Meyer-Olkin (KMO) coefficient was 0.846 and the reliability coefficient was 0.91. From these results, we concluded that the 42-item self-concept instrument for undergraduate students of mathematics education in Palembang, Indonesia is valid and reliable.
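A minimal sketch of the two checks described above, an item-total Pearson correlation and Cronbach's alpha, using only the Python standard library and small hypothetical response data (the 0.244 critical value quoted in the abstract is tied to their n = 65; here it serves purely as an illustrative threshold):

```python
from statistics import mean, pvariance

def pearson_r(x, y):
    # Pearson product-moment correlation coefficient
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def cronbach_alpha(items):
    # items: one list of scores per item, aligned across respondents
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(pvariance(i) for i in items) / pvariance(totals))

# hypothetical responses: 4 items scored by 6 respondents
items = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 4, 2, 4, 3, 5],
    [2, 5, 4, 2, 5, 1],  # deliberately inconsistent item
]
totals = [sum(scores) for scores in zip(*items)]
item_total_r = [pearson_r(item, totals) for item in items]
invalid = [j for j, r in enumerate(item_total_r) if r < 0.244]

alpha_all = cronbach_alpha(items)
alpha_pruned = cronbach_alpha(items[:3])  # drop the flagged item
```

Dropping the item flagged by the correlation check raises alpha, mirroring the study's removal of its one invalid item before re-running the factor analysis.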

  1. Adapting Cognitive Interviewing for Early Adolescent Hispanic Girls and Sensitive Topics

    PubMed Central

    Norris, Anne E.; Torres-Thomas, Sylvia; Williams, Ellita T.

    2015-01-01

    Cognitive interviewing is a research technique commonly used in survey research to improve measurement validity. It is also useful to researchers planning to use self-report measures in intervention research, because invalidity of such measures jeopardizes detection of intervention effects. Little research currently exists regarding the use of cognitive interviewing techniques with adolescent populations, particularly those who are Hispanic. This article describes common challenges in conducting cognitive interviewing with early adolescent girls and how these challenges are shaped by Hispanic culture and sensitive topics. A focus group approach is recommended over the traditional one-on-one cognitive interview format, and experiences from actual focus groups, conducted in preparation for an intervention study, are used to illustrate strategies for accomplishing the goals of cognitive interviewing. Creative and careful planning, attention to developmental considerations, and incorporation of cultural values are essential to the success of this approach. PMID:25239207

  2. Perspectives in Super-resolved Fluorescence Microscopy: What comes next?

    NASA Astrophysics Data System (ADS)

    Cremer, Christoph; Birk, Udo

    2016-04-01

    The Nobel Prize in Chemistry 2014 was awarded to three scientists involved in the development of the STED and PALM super-resolution fluorescence microscopy (SRM) methods. They proved that it is possible to overcome the hundred-year-old theoretical limit on the resolution of light microscopy (about 200 nm for visible light), which for decades had precluded a direct glimpse of the molecular machinery of life. None of the present-day super-resolution techniques has invalidated the Abbe limit for light-optical detection; rather, they have found clever ways around it. In this report, we discuss some of the challenges still to be resolved before emerging SRM approaches are fit to bring about the envisaged revolution in biology and medicine. Among the challenges discussed are the applicability to imaging live and/or large samples, the further enhancement of resolution, future developments of labels, and multi-spectral approaches.

  3. The National Integrated Heat Health Information System (NIHHIS) as a Learning System for Extreme Heat: Evolving Future Resilience from Present Climate Extremes

    NASA Astrophysics Data System (ADS)

    Jones, H.; Trtanj, J.; Pulwarty, R. S.; Higgins, W.

    2016-12-01

    There is presently no consensus indicator for the effect of extreme heat on human health. At the early-warning timescale, a variety of approaches to setting temperature thresholds (minimum, maximum, time-lagged) or more complex indices (Heat Index, thermal comfort, etc.) for issuing alerts and warnings have been recommended in the literature and implemented, leading to much heterogeneity. At longer timescales, efforts have been made to quantify potential future health outcomes using climate projections, but nonstationarity of the climate system, economy, and demography may invalidate many of the assumptions that these studies necessarily made. Furthermore, in pursuing the best models and indicators to represent the impacts of climate extremes, perhaps not enough attention has been paid to what makes them policy-relevant, responsive to changing assumptions, and targeted at elements that can actually be predicted. In response to this concern, a comprehensive approach to improving the impact of these indicators is underway as part of the National Integrated Heat Health Information System (NIHHIS), which was initiated by NOAA and CDC but has grown to include many other federal agency and non-governmental partners. NIHHIS is a framework that integrates what we know about extreme heat and health outcomes within a learning system, simultaneously informing early warning and long-term risk reduction prior to, during, and while recovering from extreme heat events. In this way, NIHHIS develops impactful, evolutionary responses to climate extremes. Through ongoing regional engagements, we are applying the lessons of impact modeling studies to create learning systems in the Southwest, Northeast, Midwest, and soon other regions of the U.S. This session will provide a view of this process as it has been carried out in the Southwest region, focused on the transboundary (US-Mexico) area around El Paso, Texas, and of the NIHHIS approach to indicators overall.

  4. On the universality of power laws for tokamak plasma predictions

    NASA Astrophysics Data System (ADS)

    Garcia, J.; Cambon, D.; Contributors, JET

    2018-02-01

    Significant deviations from well-established power laws for the thermal energy confinement time, such as the IPB98(y,2) scaling obtained from extensive database analysis, have recently been reported in dedicated power scans. In order to assess the adequacy, validity and universality of power laws as tools for predicting plasma performance, a simplified analysis has been carried out in the framework of a minimal model of heat transport which is nevertheless able to account for the interplay between turbulence and collinear effects with the input power, known to play a role in experiments with significant deviations from such power laws. Whereas at low powers the usual scaling laws are recovered with little influence from other plasma parameters, resulting in a robust power-law exponent, at high power the exponents obtained are extremely sensitive to the heating deposition, the q-profile, and even the sampling or the number of points considered, owing to the highly non-linear behavior of the heat transport. In particular circumstances, the thermal energy confinement time can even exhibit a minimum with respect to the input power, which means that representing the energy confinement time as a power law might be intrinsically invalid. Plasma predictions based on a power-law approximation with a constant exponent, obtained from a regression over a broad range of powers and other plasma parameters that can non-linearly affect and suppress heat transport, can therefore be misleading. This approach should be used cautiously and its results continuously compared with modeling that properly captures the underlying physics, such as gyrokinetic simulations.
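The sensitivity of a fitted power-law exponent to the sampled power range can be reproduced with a toy example: a hypothetical confinement time that is a pure power law at low power but acquires an extra non-linear suppression at high power (the functional form and numbers below are illustrative, not the transport model of the paper):

```python
import math

def fit_loglog_slope(powers, taus):
    # Ordinary least-squares slope of log(tau) against log(P),
    # i.e. the power-law exponent a regression would report.
    xs = [math.log(p) for p in powers]
    ys = [math.log(t) for t in taus]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def tau(P):
    # hypothetical confinement time: power law at low P,
    # extra non-linear suppression switching on around P = 20
    return P ** -0.7 * (1 + (P / 20.0) ** 2) ** -0.3

low_scan = [1.0, 2.0, 3.0, 4.0, 5.0]
high_scan = [20.0, 30.0, 40.0, 50.0, 60.0]
slope_low = fit_loglog_slope(low_scan, [tau(p) for p in low_scan])
slope_high = fit_loglog_slope(high_scan, [tau(p) for p in high_scan])
```

The low-power scan recovers an exponent near the nominal -0.7, while the high-power scan reports a markedly steeper exponent, so the "scaling law" depends entirely on where the points were sampled.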

  5. Study of nonlinear interaction between bunched beam and intermediate cavities in a relativistic klystron amplifier

    NASA Astrophysics Data System (ADS)

    Wu, Y.; Xu, Z.; Li, Z. H.; Tang, C. X.

    2012-07-01

    In the intermediate cavities of a relativistic klystron amplifier (RKA) driven by an intense relativistic electron beam, the equivalent circuit model, which is widely adopted to investigate the interaction between the bunched beam and the intermediate cavity in conventional klystron design, is invalid due to the high gap voltage and the nonlinear beam loading in an RKA. Starting from the Maxwell equations and the Lorentz equation, self-consistent equations for beam-wave interaction in the intermediate cavity are introduced to study the nonlinear interaction between the bunched beam and the intermediate cavity in an RKA. Based on these equations, the effects of the modulation depth and modulation frequency of the beam on the gap voltage amplitude and phase are obtained. It is shown that the gap voltage is significantly lower than that estimated by the equivalent circuit model when the beam modulation is high, and that the bandwidth widens as the beam modulation depth increases. An S-band high-gain relativistic klystron amplifier was designed based on these results, and the corresponding experiment was carried out on the linear transformer driver accelerator. The peak output power reached 1.2 GW with an efficiency of 28.6% and a gain of 46 dB.

  6. An automatic iris occlusion estimation method based on high-dimensional density estimation.

    PubMed

    Li, Yung-Hui; Savvides, Marios

    2013-04-01

    Iris masks play an important role in iris recognition. They indicate which parts of the iris texture map are useful and which parts are occluded or contaminated by noisy image artifacts such as eyelashes, eyelids, eyeglass frames, and specular reflections. The accuracy of the iris mask is extremely important: the performance of an iris recognition system decreases dramatically when the iris mask is inaccurate, even when the best recognition algorithm is used. Traditionally, rule-based algorithms have been used to estimate iris masks from iris images, but the accuracy of the masks generated this way is questionable. In this work, we propose to use Figueiredo and Jain's Gaussian Mixture Models (FJ-GMMs) to model the underlying probability distributions of both valid and invalid regions on iris images. We also explored possible features and found that a Gabor Filter Bank (GFB) provides the most discriminative information for our goal. Finally, we applied the Simulated Annealing (SA) technique to optimize the parameters of the GFB in order to achieve the best recognition rate. Experimental results show that the masks generated by the proposed algorithm increase the iris recognition rate on both the ICE2 and UBIRIS datasets, verifying the effectiveness and importance of the proposed method for iris occlusion estimation.
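As a schematic of the generative-classification idea (heavily simplified: a single 1-D Gaussian per class fitted to hypothetical filter responses, standing in for the FJ-GMMs over Gabor Filter Bank features used in the paper):

```python
import math
from statistics import mean, pvariance

def fit_gaussian(samples):
    # maximum-likelihood mean and variance of one class
    return mean(samples), pvariance(samples)

def loglik(x, mu, var):
    # log density of a univariate Gaussian
    return -0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)

# hypothetical training responses of some filter on labeled pixels
valid_train = [0.9, 1.1, 1.0, 0.8, 1.2]    # clean iris texture
invalid_train = [3.0, 2.8, 3.2, 2.9, 3.1]  # eyelash / reflection pixels

g_valid = fit_gaussian(valid_train)
g_invalid = fit_gaussian(invalid_train)

def classify(x):
    # label a pixel by which class gives it higher likelihood
    return "valid" if loglik(x, *g_valid) > loglik(x, *g_invalid) else "invalid"
```

The real method replaces each single Gaussian with a mixture whose component count FJ-GMMs select automatically, and each scalar feature with a vector of GFB responses.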

  7. Possible roles of Peccei-Quinn symmetry in an effective low energy model

    NASA Astrophysics Data System (ADS)

    Suematsu, Daijiro

    2017-12-01

    The strong CP problem is known to be solved by imposing Peccei-Quinn (PQ) symmetry. However, the domain wall problem caused by the spontaneous breaking of its remnant discrete subgroup can invalidate such models in many cases. We propose a model in which the PQ charge is assigned to the quarks so as to escape this problem without introducing any extra colored fermions. In the low-energy effective model resulting after the PQ symmetry breaking, both the quark mass hierarchy and the CKM mixing can be explained through the Froggatt-Nielsen mechanism. If the model is combined with a lepton sector supplemented by an inert doublet scalar and right-handed neutrinos, the effective model reduces to the scotogenic neutrino mass model, in which the origins of neutrino masses and dark matter are closely related. The strong CP problem can thus be related to the quark mass hierarchy, neutrino masses, and dark matter through the PQ symmetry.

  8. The two-mode multi-photon intensity-dependent Rabi model

    NASA Astrophysics Data System (ADS)

    Lo, C. F.

    2014-06-01

    We have investigated the energy eigen-spectrum of the two-mode k-photon intensity-dependent Rabi (IDR) model for k ≥ 2. Our analysis shows that the model does not have eigenstates in the Hilbert space spanned by the eigenstates of the two-mode k-photon intensity-dependent Jaynes-Cummings (IDJC) model, which is obtained by applying the rotating-wave approximation (RWA) to the two-mode k-photon IDR model. That is, the two-mode k-photon IDR model is ill-defined for k ≥ 2, and it is qualitatively different from the RWA counterpart which is valid for all values of k, implying that the counter-rotating term does drastically alter the nature of the RWA counterpart. Hence, the previous study of the effect of the counter-rotating term in the two-mode k-photon IDJC model via the time-dependent perturbation expansion is completely invalid.

  9. Statistical, economic and other tools for assessing natural aggregate

    USGS Publications Warehouse

    Bliss, J.D.; Moyle, P.R.; Bolm, K.S.

    2003-01-01

    Quantitative aggregate resource assessment provides resource estimates useful for explorationists, land managers and those who make decisions about land allocation, which may have long-term implications for the cost and availability of aggregate resources. Aggregate assessment needs to be systematic and consistent, yet flexible enough to allow updating without invalidating other parts of the assessment. Evaluators need to use standard or consistent aggregate classifications and statistical distributions, in other words, models incorporating geological, geotechnical and economic variables and the interrelationships between them. These models can be used, with subjective estimates if needed, to estimate how much aggregate may be present in a region or country, using distributions generated by Monte Carlo computer simulations.
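A toy Monte Carlo sketch of the kind of simulation mentioned above, with hypothetical distributions standing in for real geological, geotechnical and economic models:

```python
import random

random.seed(0)

def simulate_tonnage(n_trials=10000):
    # Toy model: tonnage = area * thickness * density, each variable drawn
    # from an assumed distribution (lognormal area, triangular thickness,
    # Gaussian density); the parameters are purely illustrative.
    estimates = []
    for _ in range(n_trials):
        area_km2 = random.lognormvariate(2.0, 0.5)          # deposit area, km^2
        thickness_m = random.triangular(2.0, 10.0, 5.0)     # min, max, mode
        density_t_m3 = random.gauss(2.2, 0.1)               # tonnes per m^3
        estimates.append(area_km2 * 1e6 * thickness_m * density_t_m3)
    estimates.sort()
    # report percentiles of the simulated tonnage distribution
    return {p: estimates[int(p / 100 * n_trials)] for p in (10, 50, 90)}

pcts = simulate_tonnage()
```

Reporting a 10th/50th/90th percentile band rather than a single number is what lets the assessment be updated later without invalidating its other parts.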

  10. Reasons for Journal Impact Factor Changes: Influence of Changing Source Items.

    PubMed

    Kiesslich, Tobias; Weineck, Silke B; Koelblinger, Dorothea

    2016-01-01

    Both the concept and the application of the impact factor (IF) have been subject to widespread critique, including concerns over its potential manipulation. This study provides a systematic analysis of significant journal impact factor changes, based on the relative contribution of one or both variables of the IF equation (i.e. citations and articles as the numerator and denominator of the quotient). A cohort of JCR-listed journals which faced the most dramatic absolute IF changes between 2013 and 2014 (ΔIF ≥ 3.0, n = 49) was analyzed for the causes of the IF changes these journals have experienced over the last five years. Along with the variation in the number of articles and citations, this analysis includes the relative change of the two variables compared to each other and offers a classification of 'valid' and 'invalid' scenarios of IF variation in terms of the intended goal of the IF to measure journal quality. The sample cohort features a considerable incidence of IF increases (18%) which are qualified as 'invalid' according to this classification because the IF increase is based merely on a favorably changing number of articles (the denominator). The results point out the potentially delusive effect of IF increases gained through effective shrinkage of publication output. Careful consideration of the details of the IF equation, and possible implementation of control mechanisms for the volatile number-of-articles factor, may therefore help to improve the expressiveness of this metric.
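The classification idea can be sketched directly from the IF quotient: hold the denominator fixed to isolate the citation-driven part of a change, and attribute the remainder to the changing article count (a simplification of the actual JCR two-year window; the journal numbers are hypothetical):

```python
def classify_if_change(cit_prev, art_prev, cit_now, art_now):
    # Decompose an impact-factor change (IF = citations / articles) into a
    # numerator-driven part and a denominator-driven part, flagging increases
    # driven mainly by a shrinking article count as 'invalid' in the sense above.
    if_prev = cit_prev / art_prev
    delta = cit_now / art_now - if_prev
    citation_driven = cit_now / art_prev - if_prev  # change if articles stayed fixed
    denominator_driven = delta - citation_driven
    label = "invalid" if delta > 0 and denominator_driven > citation_driven else "valid"
    return delta, label

# journal that shrank output: citations flat, articles nearly halved
delta_a, label_a = classify_if_change(cit_prev=1000, art_prev=200, cit_now=1000, art_now=110)
# journal that gained citations at constant output
delta_b, label_b = classify_if_change(cit_prev=1000, art_prev=200, cit_now=1600, art_now=200)
```

The first journal's IF rises by about 4 points with zero citation growth, exactly the 'invalid' pattern the study flags in 18% of its cohort.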

  11. On geometric distance determination to the Cepheid RS Puppis from its light echoes

    NASA Astrophysics Data System (ADS)

    Bond, H. E.; Sparks, W. B.

    2009-02-01

    Context: The luminous Galactic Cepheid RS Puppis is unique in being surrounded by a dust nebula illuminated by the variable light of the Cepheid. In a recent paper in this journal, Kervella et al. (2008) report a very precise geometric distance to RS Pup, based on measured phase lags of the light variations of individual knots in the reflection nebula. Aims: In this commentary, we examine the validity of the distance measurement, as well as the reality of the spatial structure of the nebula determined by Feast (2008) based upon the phase lags of the knots. Methods: Kervella et al. assumed that the illuminated dust knots lie, on average, in the plane of the sky (otherwise it is not possible to derive a geometric distance from direct imaging of light echoes). We consider the biasing introduced by the high efficiency of forward scattering. Results: We conclude that most of the knots are in fact likely to lie in front of the plane of the sky, thus invalidating the Kervella et al. result. We also show that the flat equatorial disk structure determined by Feast is unlikely; instead, the morphology of the nebula is more probably bipolar, with a significant tilt of its axis with respect to the plane of the sky. Conclusions: Although the Kervella et al. distance result is invalidated, we show that high-resolution polarimetric imaging has the potential to yield a valid geometric distance to this important Cepheid.
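The geometric method under discussion reduces, for a knot assumed to lie exactly in the plane of the sky, to d = c·Δt/θ: the phase lag Δt equals the star-knot separation divided by c, and the angular separation θ converts that separation into a distance. A quick numerical sketch follows (the delay and separation values are hypothetical, not RS Pup measurements); knots lying in front of that plane, as favored by forward scattering, violate the assumption and bias the result:

```python
import math

C = 299792458.0                  # speed of light, m/s
M_PER_PC = 3.0857e16             # metres per parsec
ARCSEC = math.pi / (180 * 3600)  # radians per arcsecond

def echo_distance_pc(delay_days, sep_arcsec):
    # Plane-of-sky assumption: the extra light path equals the star-knot
    # separation r, so r = c * delay; with theta = r / d this gives
    # d = c * delay / theta.
    r = C * delay_days * 86400.0
    theta = sep_arcsec * ARCSEC
    return r / theta / M_PER_PC

# hypothetical knot: 30-day phase lag at 10 arcsec from the star
d_pc = echo_distance_pc(delay_days=30.0, sep_arcsec=10.0)
```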

  12. Applying simple water-energy balance frameworks to predict the climate sensitivity of streamflow over the continental United States

    NASA Astrophysics Data System (ADS)

    Renner, M.; Bernhofer, C.

    2012-08-01

    The prediction of climate effects on terrestrial ecosystems and water resources is one of the major research questions in hydrology. Conceptual water-energy balance models can be used to gain a first-order estimate of how long-term average streamflow changes with a change in water and energy supply. A common framework for investigating this question is based on the Budyko hypothesis, which links hydrological response to aridity. Recently, Renner et al. (2012) introduced the climate change impact hypothesis (CCUW), which is based on the assumption that the total efficiency of the catchment ecosystem in using the available water and energy for actual evapotranspiration remains constant even under climate change. Here, we confront the climate sensitivity approaches (the Budyko approach of Roderick and Farquhar, 2011, and the CCUW) with data from more than 400 basins distributed over the continental United States. We first estimate the sensitivity of streamflow to changes in precipitation using long-term average data for the period 1949 to 2003. This provides a hydro-climatic status of the respective basins as well as their expected proportional response to changes in climate. Next, we test the ability of both approaches to predict climate impacts on streamflow by splitting the data into two periods. We (i) analyse the long-term average changes in hydro-climatology and (ii) derive a statistical classification of potential climate and basin change impacts based on the significance of observed changes in runoff, precipitation and potential evapotranspiration. We then (iii) use the different climate sensitivity methods to predict the change in streamflow given the observed changes in water and energy supply, and (iv) evaluate the predictions using the statistical classification scheme and (v) a conceptual approach to separate the impacts of climate change from those of changes in basin characteristics on streamflow. This allows us to evaluate the observed changes in streamflow as well as to assess the impact of basin changes on the validity of climate sensitivity approaches. The apparent increase of streamflow in the majority of US basins is dominated by an increase in precipitation. It is further evident that impacts of changes in basin characteristics appear simultaneously with climate changes. There are coherent spatial patterns, with catchments where basin changes compensate for climatic changes dominant in the western and central parts of the US. A hot spot of basin changes leading to excessive runoff is found in the US Midwest. The impact of basin changes on the prediction is large and can be twice as large as the observed change signal. Although the CCUW and the Budyko approach yield similar predictions for most basins, the data from water-limited basins support the Budyko framework rather than the CCUW approach, which is known to be invalid under limiting climatic conditions.
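The Budyko-type sensitivity used above can be sketched numerically: take a Budyko curve for long-term evaporation, define streamflow as Q = P - E, and compute the precipitation elasticity of streamflow by finite differences (the curve is the classic Budyko (1974) expression; the basin numbers are hypothetical):

```python
import math

def budyko_evaporation(P, PET):
    # Classic Budyko (1974) curve:
    # E/P = sqrt(phi * tanh(1/phi) * (1 - exp(-phi))), phi = PET/P
    phi = PET / P
    return P * math.sqrt(phi * math.tanh(1.0 / phi) * (1.0 - math.exp(-phi)))

def precip_elasticity(P, PET, dP=1e-3):
    # elasticity = (dQ/dP) * (P/Q) with Q = P - E, by central finite difference
    q = lambda p: p - budyko_evaporation(p, PET)
    dQdP = (q(P + dP) - q(P - dP)) / (2 * dP)
    return dQdP * P / q(P)

eps_humid = precip_elasticity(P=1200.0, PET=600.0)  # energy-limited basin (mm/yr)
eps_arid = precip_elasticity(P=400.0, PET=1200.0)   # water-limited basin
```

The elasticity always exceeds 1 and grows with aridity, which is why a given relative precipitation change is amplified most strongly in the water-limited basins where the two frameworks disagree.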

  13. Evidence-based medicine and epistemological imperialism: narrowing the divide between evidence and illness.

    PubMed

    Crowther, Helen; Lipworth, Wendy; Kerridge, Ian

    2011-10-01

    Evidence-based medicine (EBM) has been rapidly and widely adopted because it claims to provide a method for determining the safety and efficacy of medical therapies and public health interventions more generally. However, as others have noted, EBM may be riven through with cultural bias, both in the generation of evidence and in its translation. We suggest that technological and scientific advances in medicine accentuate and entrench these cultural biases, to the extent that they may invalidate the evidence we have about disease and its treatment. This creates a significant ethical, epistemological and ontological challenge for medicine. © 2011 Blackwell Publishing Ltd.

  14. System and method for forward error correction

    NASA Technical Reports Server (NTRS)

    Cole, Robert M. (Inventor); Bishop, James E. (Inventor)

    2006-01-01

    A system and method are provided for transferring a packet across a data link. The packet may include a stream of data symbols which is delimited by one or more framing symbols. Corruptions of the framing symbol which result in valid data symbols may be mapped to invalid symbols. If it is desired to transfer one of the valid data symbols that has been mapped to an invalid symbol, the data symbol may be replaced with an unused symbol. At the receiving end, these unused symbols are replaced with the corresponding valid data symbols. The data stream of the packet may be encoded with forward error correction information to detect and correct errors in the data stream.
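A toy sketch of the escaping idea described above: data symbols that a corrupted framing symbol could mimic are swapped for reserved unused symbols before transmission and swapped back at the receiver (the symbol values, the conflict set, and the substitution table are invented for illustration; the patent does not specify them here):

```python
FRAME = 0x7E
# data symbols that plausible corruptions of FRAME could produce (hypothetical set)
CONFLICTS = {0x7F, 0x7C, 0x6E}
# one unused symbol reserved for each conflicting data symbol (hypothetical mapping)
SUBSTITUTE = {0x7F: 0xF1, 0x7C: 0xF2, 0x6E: 0xF3}
REVERSE = {v: k for k, v in SUBSTITUTE.items()}

def encode(data):
    # frame the packet; conflicting data symbols never appear on the wire
    return [FRAME] + [SUBSTITUTE.get(s, s) for s in data] + [FRAME]

def decode(stream):
    # strip framing and restore the substituted data symbols
    assert stream[0] == FRAME and stream[-1] == FRAME
    return [REVERSE.get(s, s) for s in stream[1:-1]]

msg = [0x10, 0x7C, 0x22]
```

Because the conflicting values never occur inside a frame, any appearance of one can be treated as an invalid (corrupted) symbol rather than misread as data, which is the property the FEC layer then builds on.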

  15. System and method for transferring data on a data link

    NASA Technical Reports Server (NTRS)

    Cole, Robert M. (Inventor); Bishop, James E. (Inventor)

    2007-01-01

    A system and method are provided for transferring a packet across a data link. The packet may include a stream of data symbols which is delimited by one or more framing symbols. Corruptions of the framing symbol which result in valid data symbols may be mapped to invalid symbols. If it is desired to transfer one of the valid data symbols that has been mapped to an invalid symbol, the data symbol may be replaced with an unused symbol. At the receiving end, these unused symbols are replaced with the corresponding valid data symbols. The data stream of the packet may be encoded with forward error correction information to detect and correct errors in the data stream.

  16. 3D finite element model of the diabetic neuropathic foot: a gait analysis driven approach.

    PubMed

    Guiotto, Annamaria; Sawacha, Zimi; Guarneri, Gabriella; Avogaro, Angelo; Cobelli, Claudio

    2014-09-22

    Diabetic foot is a disabling complication of diabetes that can lead to foot ulcers. Three-dimensional (3D) finite element analysis (FEA) allows characterizing the loads developed in the different anatomical structures of the foot in dynamic conditions. The aim of this study was to develop a subject-specific 3D foot FE model (FEM) of a diabetic neuropathic subject (DNS) and a healthy subject (HS), the subject specificity lying in the foot geometry and boundary conditions. Kinematics, kinetics and plantar pressure (PP) data were extracted from the gait analysis trials of the two subjects for this purpose. The FEMs were developed by segmenting bones, cartilage and skin from MRI and drawing a horizontal plate as ground support. Material properties were adopted from the literature. FE simulations were run with the kinematics and kinetics data of four subphases of the stance phase of gait (heel strike, loading response, midstance and push off). The FEMs were then driven by group gait data from 10 neuropathic and 10 healthy subjects. Model validation focused on the agreement between FEM-simulated and experimental PP, comparing the peak values and the total distribution of the pressures. Results showed that the models were less robust when driven by group data and underestimated the PP in each foot subarea; in particular, for the neuropathic subject's model the mean errors between experimental and simulated data were around 20% of the peak values. This knowledge is crucial for understanding the aetiology of the diabetic foot. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Analytic treatment of nuclear spin-lattice relaxation for diffusion in a cone model

    NASA Astrophysics Data System (ADS)

    Sitnitsky, A. E.

    2011-12-01

    We consider the nuclear spin-lattice relaxation rate resulting from a diffusion equation for rotational wobbling in a cone. We show that the widespread view that no analytical expressions exist for the correlation functions of the wobbling-in-a-cone model is mistaken, and prove that nuclear spin-lattice relaxation in this model is exactly tractable and amenable to a full analytical description. The relaxation mechanism is assumed to be the dipole-dipole interaction of nuclear spins and is treated within the standard Bloembergen-Purcell-Pound (BPP)-Solomon scheme. We consider the general case of arbitrary orientation of the cone axis relative to the magnetic field. The BPP-Solomon scheme is shown to remain valid for systems in which the distribution of cone axes depends only on the tilt relative to the magnetic field but is otherwise isotropic. We consider the case of random isotropic orientation of cone axes relative to the magnetic field, as occurs in powders, as well as the cases of predominant orientation along or opposite to the magnetic field and of predominant orientation transverse to the magnetic field, which may be relevant for, e.g., liquid crystals. We also treat in detail the model case of the cone axis directed along the magnetic field; the latter provides a direct comparison of the limiting case of our formulas with the textbook formulas for free isotropic rotational diffusion. The dependence of the spin-lattice relaxation rate on the cone half-width yields results similar to those predicted by the model-free approach.
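For orientation, the standard model-free (Lipari-Szabo) forms against which such cone-model results are usually compared can be written as follows; these are textbook expressions, not the exact results derived in the paper:

```latex
% Model-free correlation function and spectral density (Lipari-Szabo);
% S is the generalized order parameter, \tau_c the overall tumbling time,
% \tau_e the effective internal correlation time.
C(t) = S^{2} + \left(1 - S^{2}\right) e^{-t/\tau_{e}}, \qquad
J(\omega) = \frac{2}{5}\left[\frac{S^{2}\,\tau_{c}}{1 + (\omega\tau_{c})^{2}}
          + \frac{\left(1 - S^{2}\right)\tau}{1 + (\omega\tau)^{2}}\right],
\qquad \frac{1}{\tau} = \frac{1}{\tau_{c}} + \frac{1}{\tau_{e}}.

% For wobbling in a cone of half-angle \theta_{0}, the order parameter is
S = \tfrac{1}{2}\cos\theta_{0}\left(1 + \cos\theta_{0}\right).
```

The cone half-width enters only through S, which is why the relaxation rate's dependence on the half-width tracks the model-free prediction.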

  18. The assumption of equilibrium in models of migration.

    PubMed

    Schachter, J; Althaus, P G

    1993-02-01

    In recent articles, Evans (1990) and Harrigan and McGregor (1993) (hereafter HM) scrutinized the equilibrium model of migration presented in a 1989 paper by Schachter and Althaus. This model used standard microeconomics to analyze gross interregional migration flows on the assumption that gross flows are in approximate equilibrium. HM criticized the model as theoretically untenable, while Evans summoned empirical as well as theoretical objections. HM claimed that equilibrium of gross migration flows could be ruled out on theoretical grounds: they argued that the absence of net migration requires either that all regions have equal populations or that unsustainable regional migration propensities obtain. In fact, some moves are interregional and others intraregional, and it does not follow that the number of interregional migrants must be larger for the more populous region. Alternatively, a country can be divided into a large number of small analytical regions of equal population; with uniform propensities to move, each such region would experience zero net migration in equilibrium. Hence, the condition that net migration equal zero is entirely consistent with unequal distributions of population across regions. The criticisms of Evans were based both on flawed reasoning and on misinterpretation of the results of a number of econometric studies. His reasoning assumed that the existence of demand shifts, as found by Goldfarb and Yezer (1987) and Topel (1986), invalidated the equilibrium model. The equilibrium never obtains exactly, but economic modeling of migration properly begins with a simple equilibrium model of the system. A careful reading of the papers Evans cited in support of his position shows that they in fact affirm rather than deny the appropriateness of equilibrium modeling. Zero net migration together with nonzero gross migration is not theoretically incompatible with regional heterogeneity of population, wages, or amenities.

  19. The association between reading abilities and visual-spatial attention in Hong Kong Chinese children.

    PubMed

    Liu, Sisi; Liu, Duo; Pan, Zhihui; Xu, Zhengye

    2018-03-25

    A growing body of research suggests that visual-spatial attention is important for reading achievement. However, few studies have been conducted in non-alphabetic orthographies. This study extended the current research to reading development in Chinese, a logographic writing system known for its visual complexity. Eighty Hong Kong Chinese children were selected and divided into poor reader and typical reader groups, based on their performance on measures of reading fluency, Chinese character reading, and reading comprehension. The poor and typical readers were matched on age and nonverbal intelligence. A Posner spatial cueing task was adopted to measure the exogenous and endogenous orienting of visual-spatial attention. Although the typical readers showed the cueing effect in the central cue condition (i.e., responses to targets following valid cues were faster than those to targets following invalid cues), the poor readers did not respond differently in valid and invalid conditions, suggesting an impairment of the endogenous orienting of attention. The two groups, however, showed a similar cueing effect in the peripheral cue condition, indicating intact exogenous orienting in the poor readers. These findings generally supported a link between the orienting of covert attention and Chinese reading, providing evidence for the attentional-deficit theory of dyslexia. Copyright © 2018 John Wiley & Sons, Ltd.

  20. Anthropic tuning of the weak scale and of m_u/m_d in two-Higgs-doublet models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barr, S. M.; Khan, Almas

    2007-08-15

    It is shown that, in a model in which up-type and down-type fermions acquire mass from different Higgs doublets, the anthropic tuning of the Higgs mass parameters can explain the fact that the observed masses of the d and u quarks are nearly the same with d slightly heavier. If Yukawa couplings are assumed not to scan (vary among domains), this would also help explain why t is much heavier than b. It is also pointed out that the existence of dark matter invalidates some earlier anthropic arguments against the viability of domains where the standard model Higgs has positive μ², but makes other, even stronger arguments possible.

  1. Artificial intelligence based approach to forecast PM2.5 during haze episodes: A case study of Delhi, India

    NASA Astrophysics Data System (ADS)

    Mishra, Dhirendra; Goyal, P.; Upadhyay, Abhishek

    2015-02-01

    Delhi has been listed as the worst performer worldwide with respect to its alarmingly high level of haze episodes, which expose residents to a host of diseases including respiratory disease, chronic obstructive pulmonary disorder and lung cancer. This study aimed to analyze the haze episodes in a year and to develop forecasting methodologies for them. The air pollutants, e.g., CO, O3, NO2, SO2 and PM2.5, as well as meteorological parameters (pressure, temperature, wind speed, wind direction index, relative humidity, visibility, dew point temperature, etc.) have been used in the present study to analyze the haze episodes in the Delhi urban area. The nature of these episodes, their possible causes, and their major features are discussed in terms of fine particulate matter (PM2.5) and relative humidity. The correlation matrix shows that temperature, pressure, wind speed, O3, and dew point temperature are the dominant variables for PM2.5 concentrations in Delhi. The hour-by-hour analysis of past data patterns at different monitoring stations suggests that haze hours occurred during approximately 48% of the total observed hours of 2012 over the Delhi urban area. Haze hour forecasting models in terms of PM2.5 concentrations (more than 50 μg/m3) and relative humidity (less than 90%) have been developed through artificial intelligence based Neuro-Fuzzy (NF) techniques and compared with other modeling techniques, e.g., multiple linear regression (MLR) and artificial neural network (ANN). The haze hour data for nine months, January to September 2012, were chosen for training, and the remaining three months, October to December 2012, were chosen for validation of the developed models. The forecasted results are compared with the observed values using different statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA). The analysis indicated that R between the observed and predicted PM2.5 concentrations during haze hours in the validation period was 0.25 for MLR, 0.53 for ANN, and 0.72 for NF. The results show that the artificial intelligence implementations agree more closely with the observed values. Finally, it can be concluded that the artificial intelligence based NF model forecasts haze episodes in the Delhi urban area better than the ANN and MLR models.
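    The evaluation statistics named above (R, NMSE, FB and IOA) have standard definitions; a minimal sketch of how they are computed, using hypothetical PM2.5 values rather than the study's data:

```python
import numpy as np

def haze_metrics(obs, pred):
    """Common air-quality model evaluation statistics.

    R    : Pearson correlation coefficient
    NMSE : normalized mean square error
    FB   : fractional bias
    IOA  : index of agreement (Willmott form)
    """
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    r = np.corrcoef(obs, pred)[0, 1]
    nmse = np.mean((obs - pred) ** 2) / (np.mean(obs) * np.mean(pred))
    fb = 2.0 * (np.mean(obs) - np.mean(pred)) / (np.mean(obs) + np.mean(pred))
    obar = np.mean(obs)
    ioa = 1.0 - np.sum((pred - obs) ** 2) / np.sum(
        (np.abs(pred - obar) + np.abs(obs - obar)) ** 2)
    return {"R": r, "NMSE": nmse, "FB": fb, "IOA": ioa}

# Hypothetical hourly PM2.5 values (ug/m3), for illustration only
obs = [60, 80, 120, 150, 90, 70]
pred = [55, 90, 110, 160, 85, 75]
print(haze_metrics(obs, pred))
```

    A perfect forecast gives R = 1, NMSE = 0, FB = 0 and IOA = 1, which is a convenient sanity check on the implementation.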

  2. Neuronal interactions in areas of spatial attention reflect avoidance of disgust, but orienting to danger.

    PubMed

    Zimmer, Ulrike; Höfler, Margit; Koschutnig, Karl; Ischebeck, Anja

    2016-07-01

    For survival, it is necessary to attend quickly towards dangerous objects, but to turn away from something that is disgusting. We tested whether fear and disgust sounds direct spatial attention differently. Using fMRI, a sound cue (disgust, fear or neutral) was presented to the left or right ear. The cue was followed by a visual target (a small arrow) which was located on the same (valid) or opposite (invalid) side as the cue. Participants were required to decide whether the arrow pointed up- or downwards while ignoring the sound cue. Behaviorally, responses were faster for invalid compared to valid targets when cued by disgust, whereas the opposite pattern was observed for targets after fearful and neutral sound cues. During target presentation, activity in the visual cortex and IPL increased for targets invalidly cued with disgust, but for targets validly cued with fear, which indicated a general modulation of activation due to attention. For the TPJ, an interaction in the opposite direction was observed, consistent with its role in detecting targets at unattended positions and in relocating attention. As a whole, our results indicate that a disgusting sound directs spatial attention away from its location, in contrast to fearful and neutral sounds. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
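    The cueing (validity) effect described above is just a difference of mean reaction times between invalid and valid trials; a minimal sketch with hypothetical RTs (not the study's data):

```python
import statistics

def validity_effect(rt_invalid, rt_valid):
    """Cueing effect: mean invalid-cue RT minus mean valid-cue RT (ms).

    Positive -> attention drawn toward the cued location (classic benefit);
    negative -> responses faster at the uncued location, i.e. attention
    directed away from the cue, as reported here for disgust sounds.
    """
    return statistics.mean(rt_invalid) - statistics.mean(rt_valid)

# Hypothetical reaction times in ms, for illustration only
print(validity_effect([520, 530, 510], [480, 490, 500]))  # positive: orienting toward cue
print(validity_effect([470, 460, 480], [500, 510, 505]))  # negative: avoidance pattern
```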

  3. 3D topography measurements on correlation cells—a new approach to forensic ballistics identifications

    NASA Astrophysics Data System (ADS)

    Song, John; Chu, Wei; Tong, Mingsi; Soons, Johannes

    2014-06-01

    Based on three-dimensional (3D) topography measurements on correlation cells, the National Institute of Standards and Technology (NIST) has developed the ‘NIST Ballistics Identification System (NBIS)’ aimed at accurate ballistics identifications and fast ballistics evidence searches. The 3D topographies are divided into arrays of correlation cells to identify ‘valid correlation areas’ and eliminate ‘invalid correlation areas’ from the matching and identification procedure. A ‘congruent matching cells’ (CMC) method using three types of identification parameters of the paired correlation cells (cross correlation function maximum CCFmax, spatial registration position in x-y and registration angle θ) is used for high accuracy ballistics identifications. ‘Synchronous processing’ is proposed for correlating multiple cell pairs at the same time to increase the correlation speed. The proposed NBIS can be used for correlations of both geometrical topographies and optical intensity images. All the correlation parameters and algorithms are in the public domain and subject to open tests. An error rate reporting procedure has been developed that can greatly add to the scientific support for the firearm and toolmark identification specialty, and give confidence to the trier of fact in court proceedings. The NBIS is engineered to employ transparent identification parameters and criteria, statistical models and correlation algorithms. In this way, interoperability between different ballistics identification systems can be more easily achieved. This interoperability will make the NBIS suitable for ballistics identifications and evidence searches with large national databases, such as the National Integrated Ballistic Information Network in the United States.
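    The CMC idea can be sketched in a few lines: correlate paired cells over candidate registrations, keep the peak correlation, and count cells that exceed a threshold. The shift search, cell handling and threshold below are illustrative stand-ins, not NIST's actual parameters or criteria:

```python
import numpy as np

def ccf_max(cell_a, cell_b):
    """Peak normalized cross-correlation between two topography cells,
    searched over a small set of integer x-y shifts (a simplified stand-in
    for the registration search over position and angle in the CMC method)."""
    best = -1.0
    for dy in range(-2, 3):
        for dx in range(-2, 3):
            b = np.roll(np.roll(cell_b, dy, axis=0), dx, axis=1)
            best = max(best, np.corrcoef(cell_a.ravel(), b.ravel())[0, 1])
    return best

def congruent_matching_cells(cells_a, cells_b, threshold=0.6):
    """Count cell pairs whose CCFmax exceeds a threshold; an identification
    would be declared when enough cells match congruently."""
    return sum(ccf_max(a, b) >= threshold for a, b in zip(cells_a, cells_b))
```

    In the full method the matched cells must also agree congruently in registration position and angle, which is what distinguishes CMC from a plain whole-image correlation.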

  4. Atomistic and molecular effects in electric double layers at high surface charges

    DOE PAGES

    Templeton, Jeremy Alan; Lee, Jonathan; Mani, Ali

    2015-06-16

    Here, the Poisson–Boltzmann theory for electrolytes near a charged surface is known to be invalid due to unaccounted physics associated with high ion concentration regimes. In order to investigate this regime, fluids density functional theory (f-DFT) and molecular dynamics (MD) simulations were used to determine the electric surface potential as a function of surface charge. Based on these detailed computations, for electrolytes with nonpolar solvent, the surface potential is shown to depend quadratically on the surface charge in the high charge limit. We demonstrate that modified Poisson–Boltzmann theories can model this limit if they are augmented with atomic packing densities provided by MD. However, when the solvent is a highly polar molecule such as water, an intermediate regime is identified in which a constant capacitance is realized. Simulation results demonstrate the mechanism underlying this regime, and for the salt water system studied here, it persists throughout the range of physically realistic surface charge densities, so the potential's quadratic surface charge dependence is not obtained.

  5. Exploring rationality in schizophrenia.

    PubMed

    Revsbech, Rasmus; Mortensen, Erik Lykke; Owen, Gareth; Nordgaard, Julie; Jansson, Lennart; Sæbye, Ditte; Flensborg-Madsen, Trine; Parnas, Josef

    2015-06-01

    Empirical studies of rationality (syllogisms) in patients with schizophrenia have obtained different results. One study found that patients reason more logically if the syllogism is presented through an unusual content. To explore syllogism-based rationality in schizophrenia. Thirty-eight first-admitted patients with schizophrenia and 38 healthy controls solved 29 syllogisms that varied in presentation content (ordinary v. unusual) and validity (valid v. invalid). Statistical tests were made of unadjusted and adjusted group differences in models adjusting for intelligence and neuropsychological test performance. Controls outperformed patients on all syllogism types, but the difference between the two groups was only significant for valid syllogisms presented with unusual content. However, when adjusting for intelligence and neuropsychological test performance, all group differences became non-significant. When taking intelligence and neuropsychological performance into account, patients with schizophrenia and controls perform similarly on syllogism tests of rationality. None. © The Royal College of Psychiatrists 2015. This is an open access article distributed under the terms of the Creative Commons Non-Commercial, No Derivatives (CC BY-NC-ND) licence.

  6. Applying lessons learned to enhance human performance and reduce human error for ISS operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, W.R.

    1999-01-01

    A major component of reliability, safety, and mission success for space missions is ensuring that the humans involved (flight crew, ground crew, mission control, etc.) perform their tasks and functions as required. This includes compliance with training and procedures during normal conditions, and successful compensation when malfunctions or unexpected conditions occur. A very significant issue that affects human performance in space flight is human error. Human errors can invalidate carefully designed equipment and procedures. If certain errors combine with equipment failures or design flaws, mission failure or loss of life can occur. The control of human error during operation of the International Space Station (ISS) will be critical to the overall success of the program. As experience from Mir operations has shown, human performance plays a vital role in the success or failure of long duration space missions. The Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) is developing a systematic approach to enhance human performance and reduce human errors for ISS operations. This approach is based on the systematic identification and evaluation of lessons learned from past space missions such as Mir to enhance the design and operation of ISS. This paper will describe previous INEEL research on human error sponsored by NASA and how it can be applied to enhance human reliability for ISS. © 1999 American Institute of Physics.

  7. Applying lessons learned to enhance human performance and reduce human error for ISS operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, W.R.

    1998-09-01

    A major component of reliability, safety, and mission success for space missions is ensuring that the humans involved (flight crew, ground crew, mission control, etc.) perform their tasks and functions as required. This includes compliance with training and procedures during normal conditions, and successful compensation when malfunctions or unexpected conditions occur. A very significant issue that affects human performance in space flight is human error. Human errors can invalidate carefully designed equipment and procedures. If certain errors combine with equipment failures or design flaws, mission failure or loss of life can occur. The control of human error during operation of the International Space Station (ISS) will be critical to the overall success of the program. As experience from Mir operations has shown, human performance plays a vital role in the success or failure of long duration space missions. The Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) is developing a systematic approach to enhance human performance and reduce human errors for ISS operations. This approach is based on the systematic identification and evaluation of lessons learned from past space missions such as Mir to enhance the design and operation of ISS. This paper describes previous INEEL research on human error sponsored by NASA and how it can be applied to enhance human reliability for ISS.

  8. Testing and validating environmental models

    USGS Publications Warehouse

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. 
We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series against data time series, and plotting predicted versus observed values) have little diagnostic power. We propose that it may be more useful to statistically extract the relationships of primary interest from the time series, and test the model directly against them.
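    The point about low diagnostic power can be illustrated with a toy example (synthetic data, not the paper's demonstration): when a shared trend dominates, a model with the wrong dynamics correlates with the observations almost as well as a structurally correct one, so predicted-versus-observed agreement alone cannot discriminate between them.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(100.0)
# Synthetic "observations": trend + oscillation + noise
data = 0.5 * t + 5.0 * np.sin(t / 5.0) + rng.normal(0.0, 2.0, t.size)

model_right = 0.5 * t + 5.0 * np.sin(t / 5.0)   # correct structure
model_trend = 0.5 * t                           # wrong dynamics, right trend

def r2(obs, pred):
    """Squared Pearson correlation between observed and predicted series."""
    return np.corrcoef(obs, pred)[0, 1] ** 2

# Both values are high: the trend swamps the dynamical difference.
print(r2(data, model_right), r2(data, model_trend))
```

    Extracting the relationship of primary interest first (here, removing the trend before comparing) is what restores the test's power, which is the strategy the abstract proposes.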

  9. Self-Interaction Chromatography of mAbs: Accurate Measurement of Dead Volumes.

    PubMed

    Hedberg, S H M; Heng, J Y Y; Williams, D R; Liddell, J M

    2015-12-01

    Measurement of the second virial coefficient B22 for proteins using self-interaction chromatography (SIC) is becoming an increasingly important technique for studying their solution behaviour. In common with all physicochemical chromatographic methods, measuring the dead volume of the SIC packed column is crucial for accurate retention data; this paper examines best practice for dead volume determination. SIC-type experiments using catalase, BSA, lysozyme and a mAb as model systems are reported, as well as a number of dead column measurements. It was observed that lysozyme and mAb interacted specifically with Toyopearl AF-Formyl dead columns depending upon pH and [NaCl], invalidating their dead volume usage. Toyopearl AF-Amino packed dead columns showed no such problems and acted as suitable dead columns without any solution condition dependency. Dead volume determinations using dextran MW standards with protein immobilised SIC columns provided dead volume estimates close to those obtained using Toyopearl AF-Amino dead columns. It is concluded that specific interactions between proteins, including mAbs, and select SIC support phases can compromise the use of some standard approaches for estimating the dead volume of SIC columns. Two other methods were shown to provide good estimates for the dead volume.

  10. Development of Modal Analysis for the Study of Global Modes in High Speed Boundary Layer Flows

    NASA Astrophysics Data System (ADS)

    Brock, Joseph Michael

    Boundary layer transition for compressible flows remains a challenging and unsolved problem. In the context of high-speed compressible flow, transitional and turbulent boundary-layers produce significantly higher surface heating caused by an increase in skin-friction. The higher heating associated with transitional and turbulent boundary layers drives thermal protection systems (TPS) and mission trajectory bounds. Proper understanding of the mechanisms that drive transition is crucial to the successful design and operation of the next generation spacecraft. Currently, prediction of boundary-layer transition is based on experimental efforts and computational stability analysis. Computational analysis, anchored by experimental correlations, offers an avenue to assess/predict stability at a reduced cost. Classical methods of Linearized Stability Theory (LST) and Parabolized Stability Equations (PSE) have proven to be very useful for simple geometries/base flows. Under certain conditions the assumptions that are inherent to classical methods become invalid and the use of LST/PSE is inaccurate. In these situations, a global approach must be considered. A TriGlobal stability analysis code, Global Mode Analysis in US3D (GMAUS3D), has been developed and implemented into the unstructured solver US3D. A discussion of the methodology and implementation will be presented. Two flow configurations are presented in an effort to validate/verify the approach. First, stability analysis for a subsonic cylinder wake is performed and results compared to literature. Second, a supersonic blunt cone is considered to directly compare LST/PSE analysis and results generated by GMAUS3D.

  11. Medical cost analysis: application to colorectal cancer data from the SEER Medicare database.

    PubMed

    Bang, Heejung

    2005-10-01

    Incompleteness is a key feature of most survival data. Numerous well established statistical methodologies and algorithms exist for analyzing life or failure time data. However, induced censoring invalidates the use of those standard analytic tools for some survival-type data such as medical costs. In this paper, some valid methods currently available for analyzing censored medical cost data are reviewed. Some cautionary findings under different assumptions are illustrated through application to medical costs from colorectal cancer patients. Cost analysis should be suitably planned and carefully interpreted under various meaningful scenarios, even with judiciously selected statistical methods. This approach should be of great help to policy makers who seek to prioritize health care expenditures and to assess the elements of resource use.
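    One family of valid methods for censored costs replaces the biased complete-case average with inverse-probability-of-censoring weighting. The sketch below shows only the weighting idea; the function and variable names are ours, and in practice the censoring-survival probabilities K(T_i) would come from a Kaplan-Meier fit to the censoring distribution:

```python
import numpy as np

def ipw_mean_cost(cost, complete, censor_survival):
    """Weighted mean-cost estimator under censoring.

    Average the observed costs of complete (uncensored) subjects, each
    weighted by 1/K(T_i), the probability of remaining uncensored at that
    subject's follow-up time. A naive mean over complete cases is biased
    because censoring is informative for cumulative costs (the "induced
    censoring" problem noted in the abstract).
    """
    cost = np.asarray(cost, float)
    complete = np.asarray(complete, bool)
    k = np.asarray(censor_survival, float)
    return np.sum(cost[complete] / k[complete]) / cost.size
```

    With no censoring the weights are all 1 and the estimator reduces to the ordinary sample mean, which is a useful sanity check.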

  12. [Methodology for the comprehensive evaluation of the quality of performance of activities of medical and social experts].

    PubMed

    Moskalenko, V F; Gorban', Ie M; Marunich, V V; Ipatov, A V; Sergiieni, O V

    2001-01-01

    The paper scientifically substantiates the methodology, approaches, criteria, and control indices for assessing the activities of medical-and-social expert establishments. Most efficiency indices, and certain indices of weak points in the work of the service's establishments, depend on their interaction with curative and prophylactic institutions; the best results in the prevention of disability and the rehabilitation of disabled persons are expected to be achieved through collaborative efforts. Other criteria and intermediate indices affecting the quality of activities reflect the resources and trained personnel available to the service's establishments, the amount of work, and organizational measures designed to raise the quality of medical-and-social expert performance.

  13. System and method for modeling and analyzing complex scenarios

    DOEpatents

    Shevitz, Daniel Wolf

    2013-04-09

    An embodiment of the present invention includes a method for analyzing and solving a possibility tree. A possibility tree having a plurality of programmable nodes is constructed and solved with a solver module executed by a processor element. The solver module executes the programming of said nodes and tracks the state of at least one variable through a branch. When a variable of said branch is out of tolerance with a parameter, the solver disables the remaining nodes of the branch and marks the branch as an invalid solution. The valid solutions are then aggregated and displayed as valid tree solutions.
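    A minimal sketch of the branch-pruning idea in the claim: walk a tree of programmable nodes, track a variable along each branch, and abandon a branch as soon as the variable leaves tolerance. The node structure and names below are illustrative, not the patent's:

```python
def solve(node, value, tolerance, path=()):
    """Return root-to-leaf paths whose accumulated value stays within
    tolerance; out-of-tolerance branches are abandoned without visiting
    their remaining nodes (the "disable remaining nodes" step)."""
    value = value + node["delta"]          # execute this node's "program"
    if abs(value) > tolerance:             # variable out of tolerance:
        return []                          # branch marked invalid, pruned
    path = path + (node["name"],)
    if not node.get("children"):
        return [path]                      # valid leaf solution
    solutions = []
    for child in node["children"]:
        solutions += solve(child, value, tolerance, path)
    return solutions                       # aggregated valid tree solutions

tree = {"name": "root", "delta": 1, "children": [
    {"name": "a", "delta": 1},
    {"name": "b", "delta": 5},             # pushes the variable out of tolerance
]}
print(solve(tree, 0, tolerance=3))
```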

  14. The Need for a Kinetics for Biological Transport

    PubMed Central

    Schindler, A. M.; Iberall, A. S.

    1973-01-01

    The traditional theory of transport across capillary membranes via a laminar Poiseuille flow is shown to be invalid. It is demonstrated that the random, diffusive nature of the molecular flow and interactions with the “pore” walls play an important role in the transport process. Neither the continuum Navier-Stokes theory nor the equivalent theory of irreversible thermodynamics is adequate to treat the problem. Combination of near-continuum hydrodynamic theory, noncontinuum kinetic theory, and the theory of fluctuations provides a first step toward modeling both liquid processes in general and membrane transport processes as a specific application. PMID:4726880

  15. Is the Bifactor Model a Better Model or Is It Just Better at Modeling Implausible Responses? Application of Iteratively Reweighted Least Squares to the Rosenberg Self-Esteem Scale.

    PubMed

    Reise, Steven P; Kim, Dale S; Mansolf, Maxwell; Widaman, Keith F

    2016-01-01

    Although the structure of the Rosenberg Self-Esteem Scale (RSES) has been exhaustively evaluated, questions regarding dimensionality and direction of wording effects continue to be debated. To shed new light on these issues, we ask (a) for what percentage of individuals is a unidimensional model adequate, (b) what additional percentage of individuals can be modeled with multidimensional specifications, and (c) what percentage of individuals respond so inconsistently that they cannot be well modeled? To estimate these percentages, we applied iteratively reweighted least squares (IRLS) to examine the structure of the RSES in a large, publicly available data set. A distance measure, d_s, reflecting a distance between a response pattern and an estimated model, was used for case weighting. We found that a bifactor model provided the best overall model fit, with one general factor and two wording-related group factors. However, on the basis of d_r values, a distance measure based on individual residuals, we concluded that approximately 86% of cases were adequately modeled through a unidimensional structure, and only an additional 3% required a bifactor model. Roughly 11% of cases were judged as "unmodelable" due to their significant residuals in all models considered. Finally, analysis of d_s revealed that some, but not all, of the superior fit of the bifactor model is owed to that model's ability to better accommodate implausible and possibly invalid response patterns, and not necessarily because it better accounts for the effects of direction of wording.

  16. A Secure Dynamic Identity and Chaotic Maps Based User Authentication and Key Agreement Scheme for e-Healthcare Systems.

    PubMed

    Li, Chun-Ta; Lee, Cheng-Chi; Weng, Chi-Yao; Chen, Song-Jhih

    2016-11-01

    Secure user authentication schemes in many e-Healthcare applications try to prevent unauthorized users from intruding into the e-Healthcare systems, so that a remote user and a medical server can establish session keys for securing the subsequent communications. However, many schemes do not mask the users' identity information while constructing a login session between two or more parties, even though personal privacy of users is a significant topic for e-Healthcare systems. In order to preserve personal privacy of users, dynamic identity based authentication schemes hide the user's real identity during the process of network communications, so that only the medical server knows the login user's identity. In addition, most of the existing dynamic identity based authentication schemes ignore input verification during the login condition, and this flaw may lead to inefficiency in the case of incorrect inputs in the login phase. Regarding the use of secure authentication mechanisms for e-Healthcare systems, this paper presents a new dynamic identity and chaotic maps based authentication scheme in which a secure data protection approach is employed in every session to prevent illegal intrusions. The proposed scheme can not only quickly detect incorrect inputs during the phases of login and password change but can also invalidate the future use of a lost/stolen smart card. Compared with other recent authentication schemes in terms of functionality and efficiency, the proposed scheme satisfies desirable security attributes and maintains acceptable efficiency in terms of the computational overheads for e-Healthcare systems.

  17. Tutorial on Biostatistics: Linear Regression Analysis of Continuous Correlated Eye Data

    PubMed Central

    Ying, Gui-shuang; Maguire, Maureen G; Glynn, Robert; Rosner, Bernard

    2017-01-01

    Purpose To describe and demonstrate appropriate linear regression methods for analyzing correlated continuous eye data. Methods We describe several approaches to regression analysis involving both eyes, including mixed effects and marginal models under various covariance structures to account for inter-eye correlation. We demonstrate, with SAS statistical software, applications in a study comparing baseline refractive error between one eye with choroidal neovascularization (CNV) and the unaffected fellow eye, and in a study determining factors associated with visual field data in the elderly. Results When refractive error from both eyes was analyzed with standard linear regression without accounting for inter-eye correlation (adjusting for demographic and ocular covariates), the difference between eyes with CNV and fellow eyes was 0.15 diopters (D; 95% confidence interval, CI −0.03 to 0.32D, P=0.10). Using a mixed effects model or a marginal model, the estimated difference was the same but with narrower 95% CI (0.01 to 0.28D, P=0.03). Standard regression for visual field data from both eyes provided biased estimates of standard error (generally underestimated) and smaller P-values, while analysis of the worse eye provided larger P-values than mixed effects models and marginal models. Conclusion In research involving both eyes, ignoring inter-eye correlation can lead to invalid inferences. Analysis using only right or left eyes is valid, but decreases power. Worse-eye analysis can provide less power and biased estimates of effect. Mixed effects or marginal models using the eye as the unit of analysis should be used to appropriately account for inter-eye correlation and maximize power and precision. PMID:28102741
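    The underestimation of standard errors can be sketched with the usual design-effect formula, DEFF = 1 + (m − 1)ρ with m = 2 eyes and intraclass correlation ρ. The simulation below (illustrative numbers, not the paper's analyses) shows the variance of a per-person mean exceeding the naive "2n independent eyes" variance by exactly this factor:

```python
import numpy as np

def design_effect(rho, m=2):
    """Variance inflation from treating m correlated units as independent."""
    return 1.0 + (m - 1) * rho

rng = np.random.default_rng(1)
n, rho, sigma2 = 5000, 0.6, 1.0
person = rng.normal(0.0, np.sqrt(rho * sigma2), n)             # shared person effect
eyes = person[:, None] + rng.normal(0.0, np.sqrt((1 - rho) * sigma2), (n, 2))

var_person_mean = np.var(eyes.mean(axis=1))   # actual variance of a per-person mean
var_naive = sigma2 / 2                        # variance if the two eyes were independent
ratio = var_person_mean / var_naive           # estimates the design effect, 1 + rho
print(ratio, design_effect(rho))
```

    Mixed effects and marginal models recover valid inference precisely because they estimate this correlation instead of assuming it away.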

  18. Enhancement of wind stress evaluation method under storm conditions

    NASA Astrophysics Data System (ADS)

    Chen, Yingjian; Yu, Xiping

    2016-12-01

    Wind stress is an important driving force for many meteorological and oceanographic processes. However, most of the existing methods for evaluation of the wind stress, including various bulk formulas in terms of the wind speed at a given height and formulas relating the roughness height of the sea surface with wind conditions, predict an ever-increasing tendency of the wind stress coefficient as the wind speed increases, which is inconsistent with the field observations under storm conditions. The wave boundary layer model, which is based on momentum and energy conservation, has the advantage of taking into account the physical details of the air-sea interaction process, but is still invalid under storm conditions without modification. By including the energy dissipation due to the presence of sea spray, which is speculated to be an important aspect of the air-sea interaction under storm conditions, the wave boundary layer model is improved in this study. The improved model is employed to estimate the wind stress caused by an idealized tropical cyclone motion. The computational results show that the wind stress coefficient reaches its maximal value at a wind speed of about 40 m/s and decreases as the wind speed further increases. This is in fairly good agreement with the field data.
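    The bulk formulation being discussed is τ = ρ_air · C_d · U10²; the behaviour the paper recovers is a drag coefficient that peaks near 40 m/s instead of growing without bound. The C_d parameterization below is purely illustrative (its coefficients and the decay law past the peak are our assumptions, not the paper's model):

```python
def drag_coefficient(u10):
    """Illustrative drag coefficient Cd(U10).

    Linear increase at moderate winds (a classic bulk form), then capped
    and decaying above u_peak = 40 m/s to mimic the storm-condition
    behaviour described in the abstract. Values are not from the paper.
    """
    u_peak = 40.0
    if u10 <= u_peak:
        return (0.8 + 0.065 * u10) * 1e-3
    cd_peak = (0.8 + 0.065 * u_peak) * 1e-3
    return cd_peak * (u_peak / u10)          # decays beyond the peak

def wind_stress(u10, rho_air=1.2):
    """Bulk wind stress tau = rho_air * Cd * U10**2 (N/m^2)."""
    return rho_air * drag_coefficient(u10) * u10**2

print(wind_stress(20.0), wind_stress(50.0))
```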

  19. Ground based planetary research

    NASA Technical Reports Server (NTRS)

    1973-01-01

    High spatial resolution spectrophotometric observations made in the wavelength region λλ 0.6-2.0 μm are used to study the Jovian and Saturnian limb darkening. Limb darkening coefficients (k) of the Minnaert function are derived for the cloud layers of both planets. A value of k = 1.0 is found for Jupiter over the entire disk, while values between 0.75 and 0.90 are found for different latitudes on Saturn. These data are used to derive geometric albedos (G) for the various belts, zones, spots and regions observed on Jupiter and Saturn. These values of G and k are in turn used to show that an isotropic scattering model is invalid for Jupiter and that at least an asymmetric scattering function, such as the Euler function, is needed to fit the Jovian data. The Jovian scattering function is found to generally vary between 0.960 and 0.994 as a function of wavelength and the feature observed. The Saturn geometric albedos and values of k indicate that Euler's function fails to adequately model the scattering properties of the Saturnian clouds. As a result, it is suggested that simple scattering theory may not apply to the Saturn clouds or that they are better represented by a cumulus cloud model.
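    The Minnaert function fitted above has the standard form I(μ0, μ) = I0 · μ0^k · μ^(k−1); a minimal sketch of what the coefficient k controls (illustrative values, not the observed photometry):

```python
# Minnaert limb-darkening law: mu0 and mu are the cosines of the incidence
# and emission angles, k is the limb-darkening coefficient. At opposition
# (mu0 = mu) it reduces to I0 * mu**(2k - 1): k = 0.5 gives a uniform disk,
# and larger k gives stronger limb darkening, so Jupiter (k = 1.0) darkens
# toward the limb more steeply than Saturn (k = 0.75-0.90).

def minnaert(mu0, mu, k, i0=1.0):
    return i0 * mu0**k * mu**(k - 1)

# Disk-center-to-limb scan at opposition (illustrative cosines):
for mu in (1.0, 0.6, 0.2):
    print(minnaert(mu, mu, k=1.0), minnaert(mu, mu, k=0.8))
```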

  20. Energy, economic growth, and equity in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kannan, N.P.

    1979-01-01

    Decades of economic growth in the United States, although improving the lot of many, have failed to solve the problem of poverty. Islands of acute poverty persist amidst affluence even today, invalidating the conventional wisdom that a growing economy lifts everyone. For better or for worse, economic growth has been mainly dependent upon energy to solve the problem of poverty, and the insidious energy crisis that confronts us today threatens this economic growth and the dream of an equitable society. For this reason it is important to consider all the potential consequences of energy policies that are designed to help achieve energy self-sufficiency. In this study alternate energy policies are identified and compared for their relative degrees of potential trade-offs. The evaluation of the policies is carried out with the aid of two computer simulation models, ECONOMY1 and FOSSIL1, which are designed to capture the interactions between the energy sector and the rest of the economy of the United States. The study proposes an alternate set of hypotheses that emphasize the dynamics of social conflict over the distributive shares in the economy. The ECONOMY1 model is based on these hypotheses. 103 references, 79 figures, 16 tables.

  1. Rock physics model-based prediction of shear wave velocity in the Barnett Shale formation

    NASA Astrophysics Data System (ADS)

    Guo, Zhiqi; Li, Xiang-Yang

    2015-06-01

    Predicting S-wave velocity is important for reservoir characterization and fluid identification in unconventional resources. A rock physics model-based method is developed for estimating pore aspect ratio and predicting shear wave velocity Vs from P-wave velocity, porosity and mineralogy information in a borehole. A statistical distribution of pore geometry is considered in the rock physics models. In the application to the Barnett formation, we compare the high frequency self-consistent approximation (SCA) method, which corresponds to isolated pore spaces, and the low frequency SCA-Gassmann method, which describes well-connected pore spaces. Inversion results indicate that, compared to the surroundings, the Barnett Shale shows less fluctuation in pore aspect ratio despite the complex constituents of the shale. The high frequency method provides a more robust and accurate prediction of Vs for all three intervals in the Barnett formation, while the low frequency method fails for the Barnett Shale interval. A possible cause for this discrepancy is that poor in situ pore connectivity and low permeability make well-log sonic frequencies act as high frequencies, invalidating the low frequency assumption of the Gassmann theory. In comparison, for the overlying Marble Falls and underlying Ellenburger carbonates, both the high and low frequency methods predict Vs with reasonable accuracy, which may indicate that sonic frequencies fall within the transition frequency zone owing to the higher pore connectivity of the surroundings.
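The low frequency SCA-Gassmann step rests on Gassmann's relation, which modifies only the bulk modulus under fluid substitution and leaves the shear modulus unchanged. A minimal sketch with illustrative moduli (the input values below are generic, not Barnett log data):

```python
import math

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Gassmann saturated bulk modulus (same units as the inputs).
    Valid only under the low frequency, well-connected-pore assumption
    that the abstract argues breaks down in the Barnett Shale."""
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

def vs_from_moduli(mu, rho):
    """Shear velocity (m/s) from shear modulus (Pa) and density (kg/m^3);
    mu is unaffected by fluid substitution in Gassmann theory."""
    return math.sqrt(mu / rho)
```

The saturated bulk modulus always lies between the dry-frame and mineral moduli, so a sanity check on any implementation is k_dry < K_sat < k_min.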

  2. A priori study of subgrid-scale features in turbulent Rayleigh-Bénard convection

    NASA Astrophysics Data System (ADS)

    Dabbagh, F.; Trias, F. X.; Gorobets, A.; Oliva, A.

    2017-10-01

    At the crossroads between flow topology analysis and turbulence modeling, a priori studies are a reliable tool for understanding the underlying physics of subgrid-scale (SGS) motions in turbulent flows. In this paper, properties of the SGS features in the framework of a large-eddy simulation are studied for turbulent Rayleigh-Bénard convection (RBC). To do so, data from direct numerical simulation (DNS) of turbulent air-filled RBC in a rectangular cavity of unit aspect ratio and open-ended spanwise length π are used at two Rayleigh numbers, Ra ∈ {10^8, 10^10} [Dabbagh et al., "On the evolution of flow topology in turbulent Rayleigh-Bénard convection," Phys. Fluids 28, 115105 (2016)]. First, the DNS at Ra = 10^8 is used to assess the performance of eddy-viscosity models such as QR, Wall-Adapting Local Eddy-viscosity (WALE), and the recent S3PQR models proposed by Trias et al. ["Building proper invariants for eddy-viscosity subgrid-scale models," Phys. Fluids 27, 065103 (2015)]. The outcomes imply that eddy-viscosity modeling smooths the coarse-grained viscous straining and retrieves fairly well the effect of the kinetic unfiltered scales in reproducing the coherent large scales. However, these models fail to approach the exact evolution of the SGS heat flux and are incapable of reproducing the dominant rotational enstrophy pertaining to buoyant production. Afterwards, the key ingredients of eddy viscosity, νt, and eddy diffusivity, κt, are calculated a priori and reveal prevalently positive values that maintain a turbulent wind essentially driven by the mean buoyant force at the sidewalls. The topological analysis suggests that the effective turbulent diffusion paradigm and the hypothesis of a constant turbulent Prandtl number are only applicable in the large-scale strain-dominated areas of the bulk.
    It is shown that the bulk-dominated rotational structures of vortex stretching (and their synchronous viscous dissipative structures) hold the highest positive values of νt, whereas the zones of backscatter energy and counter-gradient heat transport are related to areas of compressed focal vorticity. Further evidence is obtained through an a priori investigation of the alignment trends imposed by existing parameterizations for the SGS heat flux, tested here inside RBC. It is shown that parameterizations based linearly on the resolved thermal gradient are invalid in RBC. Alternatively, the tensor-diffusivity approach becomes a crucial choice for modeling the SGS heat flux, in particular the tensorial diffusivity that includes the SGS stress tensor. These and other considerations are critical for future modeling of the SGS heat flux in RBC.
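As a point of reference for the quantities νt and κt discussed above, the simplest closure of this family, with a constant turbulent Prandtl number linking the two, can be sketched as below. This is a generic Smagorinsky-type illustration, not one of the QR/WALE/S3PQR models evaluated in the paper, and the constants are conventional textbook values:

```python
import numpy as np

def smagorinsky_nu_t(grad_u, delta, cs=0.17):
    """Eddy viscosity nu_t = (cs * delta)^2 * |S| from the resolved
    velocity-gradient tensor grad_u and filter width delta."""
    S = 0.5 * (grad_u + grad_u.T)           # resolved strain-rate tensor
    S_mag = np.sqrt(2.0 * np.sum(S * S))    # |S| = sqrt(2 S_ij S_ij)
    return (cs * delta) ** 2 * S_mag

def kappa_t(nu_t, pr_t=0.4):
    """Eddy diffusivity under the constant-turbulent-Prandtl hypothesis,
    the assumption the a priori tests find valid only in the
    strain-dominated bulk regions."""
    return nu_t / pr_t
```

By construction nu_t is non-negative, which is exactly why such closures cannot represent the backscatter (negative-transfer) zones identified in the a priori analysis.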

  3. Informed Consent as a Litigation Strategy in the Field of Aesthetic Surgery: An Analysis Based on Court Precedents.

    PubMed

    Park, Bo Young; Kwon, Jungwoo; Kang, So Ra; Hong, Seung Eun

    2016-09-01

    Doctors are losing an increasing number of lawsuits, despite providing preoperative patient education, because they fail to prove informed consent. We analyzed judicial precedents associated with insufficient informed consent to identify judicial factors and trends in aesthetic surgery medical litigation. We collected data from civil trials between 1995 and 2015 that were related to aesthetic surgery and resulted in findings of insufficient informed consent. Based on these data, we analyzed the lawsuits, including the distribution of surgeries, dissatisfactions, litigation expenses, and their relationship to informed consent. Cases were found involving the following types of surgery: facial rejuvenation (38 cases), facial contouring surgery (27 cases), mammoplasty (16 cases), blepharoplasty (29 cases), rhinoplasty (21 cases), body-contouring surgery (15 cases), and breast reconstruction (2 cases). Common reasons for postoperative dissatisfaction were deformities (22%), scars (17%), asymmetry (14%), and infections (6%). Most of the malpractice lawsuits occurred in Seoul (population 10 million; 54% of all plastic surgeons) and in primary-level local clinics (113 cases, 82.5%). In cases in which only invalid informed consent was recognized, the average consolation award was KRW 9,107,143 (USD 8438). In cases in which both a violation of non-malfeasance and invalid informed consent were recognized, the average consolation award was KRW 12,741,857 (USD 11,806), corresponding to 38.6% of the amount of the judgment. Surgeons should pay special attention to obtaining informed consent, because it is a double-edged sword: it serves clinical purposes for doctors and patients but may also be a litigation strategy for lawyers.

  4. Effects of Data Quality on the Characterization of Aerosol Properties from Multiple Sensors

    NASA Technical Reports Server (NTRS)

    Petrenko, Maksym; Ichoku, Charles; Leptoukh, Gregory

    2011-01-01

    Cross-comparison of aerosol properties between ground-based and spaceborne measurements is an important validation technique that helps to investigate the uncertainties of aerosol products acquired by spaceborne sensors. However, it has been shown that even minor differences in the cross-characterization procedure may significantly impact the results of such validation. Of particular consideration is the quality assurance / quality control (QA/QC) information: auxiliary data indicating a "confidence" level (e.g., Bad, Fair, Good, Excellent) conferred by the retrieval algorithms on the produced data. Depending on the treatment of available QA/QC information, a cross-characterization procedure has the potential of filtering out invalid data points, such as uncertain or erroneous retrievals, which tend to reduce the credibility of such comparisons. However, under certain circumstances, even high QA/QC values may not fully guarantee the quality of the data. For example, retrievals in the proximity of a cloud might be particularly perplexing for an aerosol retrieval algorithm, resulting in invalid data that, nonetheless, could be assigned a high QA/QC confidence. In this presentation, we will study the effects of several QA/QC parameters on the cross-characterization of aerosol properties between data acquired by multiple spaceborne sensors. We will utilize the Multi-sensor Aerosol Products Sampling System (MAPSS), which provides a consistent platform for multi-sensor comparison, including collocation with measurements acquired by the ground-based Aerosol Robotic Network (AERONET). The multi-sensor spaceborne data analyzed include those acquired by the Terra-MODIS, Aqua-MODIS, Terra-MISR, Aura-OMI, Parasol-POLDER, and Calipso-CALIOP satellite instruments.
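A QA/QC screening step of the kind described, combining the confidence flag with an independent physical-plausibility check (since, as noted, a high flag does not guarantee a valid retrieval), might look like the following hypothetical sketch. The field names, confidence labels and AOD bounds are illustrative, not the MAPSS schema:

```python
# Ordered confidence levels, as conferred by a retrieval algorithm.
QUALITY = {"Bad": 0, "Fair": 1, "Good": 2, "Excellent": 3}

def screen_retrievals(records, min_quality="Good"):
    """Keep retrievals whose confidence flag meets the threshold AND
    whose aerosol optical depth (AOD) is physically plausible; the QA
    flag alone is not trusted."""
    floor = QUALITY[min_quality]
    return [r for r in records
            if QUALITY[r["qa"]] >= floor and 0.0 <= r["aod"] < 5.0]

sample = [
    {"qa": "Excellent", "aod": 0.21},
    {"qa": "Good", "aod": -0.05},   # implausible value despite high QA
    {"qa": "Fair", "aod": 0.30},    # plausible value but low confidence
]
kept = screen_retrievals(sample)
```

The point of the two-condition filter mirrors the abstract: the treatment of QA/QC flags, and what is checked in addition to them, directly changes which points enter the cross-comparison.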

  5. Method for Evaluation of Outage Probability on Random Access Channel in Mobile Communication Systems

    NASA Astrophysics Data System (ADS)

    Kollár, Martin

    2012-05-01

    In all mobile communication technologies, a so-called random-access procedure is used to access the cell. For example, in GSM this is represented by sending the CHANNEL REQUEST message from the Mobile Station (MS) to the Base Transceiver Station (BTS), which is consequently forwarded as a CHANNEL REQUIRED message to the Base Station Controller (BSC). If the BTS mistakenly decodes noise on the Random Access Channel (RACH) as a random access (a so-called 'phantom RACH'), then it is a matter of pure coincidence which 'establishment cause' the BTS thinks it has recognized. A typical invalid channel access request, or phantom RACH, is characterized by an IMMEDIATE ASSIGNMENT procedure (assignment of an SDCCH or TCH) that is not followed by an ESTABLISH INDICATION from MS to BTS. In this paper, a mathematical model for evaluating the Power RACH Busy Threshold (RACHBT) so as to guarantee a predetermined outage probability on the RACH is described and discussed. It focuses on the Global System for Mobile Communications (GSM); however, the obtained results can be generalized to the remaining mobile technologies (i.e., WCDMA and LTE).
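The flavour of such a threshold calculation can be sketched under a strongly simplified assumption: if the RACH noise is complex Gaussian, its power is exponentially distributed, so the phantom-access probability and the busy threshold are related in closed form. This is an illustrative toy model, not the paper's full derivation:

```python
import math

def phantom_rach_probability(threshold_db, noise_power_db):
    """P(noise power > busy threshold) for exponentially distributed
    noise power, i.e. the rate at which pure noise is decoded as a
    channel access (phantom RACH)."""
    t = 10.0 ** (threshold_db / 10.0)
    p0 = 10.0 ** (noise_power_db / 10.0)
    return math.exp(-t / p0)

def rach_busy_threshold(target_outage, noise_power_db):
    """Invert the above: the threshold (dB) that pins the phantom-RACH
    rate to a predetermined target probability."""
    p0 = 10.0 ** (noise_power_db / 10.0)
    return 10.0 * math.log10(-p0 * math.log(target_outage))
```

As expected, demanding a lower phantom rate pushes the busy threshold up, which is the design trade-off the RACHBT setting controls.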

  6. A fractal model for nuclear organization: current evidence and biological implications

    PubMed Central

    Bancaud, Aurélien; Lavelle, Christophe; Huet, Sébastien; Ellenberg, Jan

    2012-01-01

    Chromatin is a multiscale structure on which transcription, replication, recombination and repair of the genome occur. To fully understand any of these processes at the molecular level under physiological conditions, a clear picture of the polymorphic and dynamic organization of chromatin in the eukaryotic nucleus is required. Recent studies indicate that a fractal model of chromatin architecture is consistent with both the reaction-diffusion properties of chromatin-interacting proteins and with structural data on chromatin interminglement. In this study, we provide a critical overview of the experimental evidence that supports a fractal organization of chromatin. On this basis, we discuss the functional implications of a fractal chromatin model for biological processes and propose future experiments to probe chromatin organization further, which should make it possible to strongly support or invalidate the fractal hypothesis. PMID:22790985

  7. Insulin response dysregulation explains abnormal fat storage and increased risk of diabetes mellitus type 2 in Cohen Syndrome.

    PubMed

    Limoge, Floriane; Faivre, Laurence; Gautier, Thomas; Petit, Jean-Michel; Gautier, Elodie; Masson, David; Jego, Gaëtan; El Chehadeh-Djebbar, Salima; Marle, Nathalie; Carmignac, Virginie; Deckert, Valérie; Brindisi, Marie-Claude; Edery, Patrick; Ghoumid, Jamal; Blair, Edward; Lagrost, Laurent; Thauvin-Robinet, Christel; Duplomb, Laurence

    2015-12-01

    Cohen Syndrome (CS) is a rare autosomal recessive disorder, with defective glycosylation secondary to mutations in the VPS13B gene, which encodes a protein of the Golgi apparatus. Besides congenital neutropenia, retinopathy and intellectual deficiency, CS patients are faced with truncal obesity. Metabolism investigations showed abnormal glucose tolerance tests and low HDL values in some patients, and these could be risk factors for the development of diabetes mellitus and/or cardiovascular complications. To understand the mechanisms involved in CS fat storage, we used two models of adipogenesis differentiation: (i) SGBS pre-adipocytes with VPS13B invalidation thanks to siRNA delivery and (ii) CS primary fibroblasts. In both models, VPS13B invalidation led to accelerated differentiation into fat cells, which was confirmed by the earlier and increased expression of specific adipogenic genes, consequent to the increased response of cells to insulin stimulation. At the end of the differentiation protocol, these fat cells exhibited decreased AKT2 phosphorylation after insulin stimulation, which suggests insulin resistance. This study, in association with the in-depth analysis of the metabolic status of the patients, thus allowed us to recommend appropriate nutritional education to prevent the occurrence of diabetes mellitus and to put forward recommendations for the follow-up of CS patients, in particular with regard to the development of metabolic syndrome. We also suggest replacing the term obesity by abnormal fat distribution in CS, which should reduce the number of inappropriate diagnoses in patients who are referred only on the basis of intellectual deficiency associated with obesity. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  8. Event related potentials during covert orientation of visual attention: effects of cue validity and directionality.

    PubMed

    Wright, M J; Geffen, G M; Geffen, L B

    1995-10-01

    Covert orientation of attention was studied in 30 adults who fixated warning cues and pressed a button at target onset. Directional cues (arrows) indicated the most probable (p = 0.8) side of target occurrence. Subjects responded fastest when validly cued, slowest to invalidly cued targets, and at an intermediate rate when the cue (a cross) was not directional. Directional cues took longer to evaluate (increased N1 and P2 latencies) and produced more focussed attention and greater response preparation (enhanced CNV and P3 amplitude) than non-directional cues. These findings indicate that the expectancy of a target can be manipulated by a spatial cue at three levels, sensory, attention, and response preparation, and lead to changes in the sensory perceptual processing of the target. Validly cued targets produced an increase in P1 amplitude reflecting attention enhanced sensory processing whereas invalidly cued targets increased N1 and P3 amplitudes reflecting the re-orientation of attention, and further processing and updating of information required of low probability stimuli respectively. P3 latency to invalidly cued targets was also delayed reflecting the additional processes required to shift attention to a new location. The P3 latency validity effect was smaller than that found for response time suggesting response execution may also be affected by spatial attention.

  9. The interference of introversion-extraversion and depressive symptomatology with reasoning performance: a behavioural study.

    PubMed

    Papageorgiou, Charalabos; Rabavilas, Andreas D; Stachtea, Xanthy; Giannakakis, Giorgos A; Kyprianou, Miltiades; Papadimitriou, George N; Stefanis, Costas N

    2012-04-01

    The objective of this study was to investigate the link between Eysenck Personality Questionnaire (EPQ) scores and depressive symptomatology on the one hand and reasoning performance on the other, using a task comprising valid and invalid Aristotelian syllogisms. The EPQ and the Zung Depressive Scale (ZDS) were completed by 48 healthy subjects (27 male, 21 female) aged 33.5 ± 9.0 years. Additionally, the subjects engaged in two reasoning tasks (valid vs. invalid syllogisms). Analysis showed that judging invalid syllogisms is more difficult than judging valid ones (65.1% vs. 74.6% correct judgments respectively, p < 0.01). In both conditions, the subjects' degree of confidence was significantly higher when they made a correct judgment than when they made an incorrect one (83.8 ± 11.2 vs. 75.3 ± 17.3, p < 0.01). Subjects with extraversion as measured by the EPQ and high sexual desire as rated by the relevant ZDS subscale were more prone to make incorrect judgments on the valid syllogisms while, at the same time, being more confident in their responses. The effects of extraversion/introversion and sexual desire on the outcome measures of the valid condition were not commutative but additive. These findings indicate that variations in extraversion/introversion and sexual desire may have a detrimental effect on reasoning performance.

  10. Solving da Vinci stereopsis with depth-edge-selective V2 cells

    PubMed Central

    Assee, Andrew; Qian, Ning

    2007-01-01

    We propose a new model for da Vinci stereopsis based on a coarse-to-fine disparity-energy computation in V1 and disparity-boundary-selective units in V2. Unlike previous work, our model contains only binocular cells, relies on distributed representations of disparity, and has a simple V1-to-V2 feedforward structure. We demonstrate with random dot stereograms that the V2 stage of our model is able to determine the location and the eye-of-origin of monocularly occluded regions and improve disparity map computation. We also examine a few related issues. First, we argue that since monocular regions are binocularly defined, they cannot generally be detected by monocular cells. Second, we show that our coarse-to-fine V1 model for conventional stereopsis explains double matching in Panum’s limiting case. This provides computational support to the notion that the perceived depth of a monocular bar next to a binocular rectangle may not be da Vinci stereopsis per se (Gillam et al., 2003). Third, we demonstrate that some stimuli previously deemed invalid have simple, valid geometric interpretations. Our work suggests that studies of da Vinci stereopsis should focus on stimuli more general than the bar-and-rectangle type and that disparity-boundary-selective V2 cells may provide a simple physiological mechanism for da Vinci stereopsis. PMID:17698163

  11. A new system model for radar polarimeters

    NASA Technical Reports Server (NTRS)

    Freeman, Anthony

    1991-01-01

    The validity of the 2 x 2 receive R and transmit T model for radar polarimeter systems, first proposed by Zebker et al. (1987), is questioned. The model is found to be invalid for many practical realizations of radar polarimeters, which can lead to significant errors in the calibration of polarimetric radar images. A more general model is put forward, which addresses the system defects which cause the 2 x 2 model to break down. By measuring one simple parameter from a polarimetric active radar calibration (PARC), it is possible to transform the scattering matrix measurements made by a radar polarimeter to a format compatible with a 2 x 2 R and T matrix model. Alternatively, the PARC can be used to verify the validity of the 2 x 2 model for any polarimetric radar system. Recommendations for the use of PARCs in polarimetric calibration and to measure the orientation angle of the horizontal (H) and vertical (V) coordinate system are also presented.

  12. A new system model for radar polarimeters

    NASA Astrophysics Data System (ADS)

    Freeman, Anthony

    1991-09-01

    The validity of the 2 x 2 receive R and transmit T model for radar polarimeter systems, first proposed by Zebker et al. (1987), is questioned. The model is found to be invalid for many practical realizations of radar polarimeters, which can lead to significant errors in the calibration of polarimetric radar images. A more general model is put forward, which addresses the system defects which cause the 2 x 2 model to break down. By measuring one simple parameter from a polarimetric active radar calibration (PARC), it is possible to transform the scattering matrix measurements made by a radar polarimeter to a format compatible with a 2 x 2 R and T matrix model. Alternatively, the PARC can be used to verify the validity of the 2 x 2 model for any polarimetric radar system. Recommendations for the use of PARCs in polarimetric calibration and to measure the orientation angle of the horizontal (H) and vertical (V) coordinate system are also presented.
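The distortion model under discussion can be sketched as a matrix product, M = R S T, and its use in calibration as the inversion of that product. The cross-talk and imbalance values below are illustrative assumptions, not measurements:

```python
import numpy as np

# Measured scattering matrix under the 2 x 2 receive/transmit model:
# M = R @ S @ T, where S is the true target scattering matrix.
S = np.eye(2)                               # trihedral-like calibration target
R = np.array([[1.00, 0.05], [0.03, 0.90]])  # receive-path distortion (assumed)
T = np.array([[1.00, 0.02], [0.04, 0.95]])  # transmit-path distortion (assumed)
M = R @ S @ T

# If R and T are known (e.g. characterized with a PARC device), the
# true scattering matrix is recovered by inverting the distortions:
S_hat = np.linalg.inv(R) @ M @ np.linalg.inv(T)
```

The paper's argument is precisely that when the real system's defects cannot be folded into any such R and T pair, this inversion miscalibrates the imagery unless the measurements are first transformed into a 2 x 2-compatible form.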

  13. MISR Level 2 TOA/Cloud Versioning

    Atmospheric Science Data Center

    2017-10-11

    ... public release. Add trap singular matrix condition. Add test for invalid look vectors. Use different metadata to test for validity of time tags. Fix incorrectly addressed array. Introduced bug ...

  14. Arbitrary-step randomly delayed robust filter with application to boost phase tracking

    NASA Astrophysics Data System (ADS)

    Qin, Wutao; Wang, Xiaogang; Bai, Yuliang; Cui, Naigang

    2018-04-01

    Conventional filters such as the extended Kalman filter, unscented Kalman filter and cubature Kalman filter assume that the measurement is available in real time and that the measurement noise is Gaussian white noise. In practice, both assumptions are invalid. To solve this problem, a novel algorithm is proposed in four steps. First, the measurement model is modified with Bernoulli random variables to describe the random delay. Then, the expressions for the predicted measurement and covariance are reformulated, removing both the restriction that the maximum delay must be one or two steps and the assumption that the probabilities of the Bernoulli random variables taking the value one are equal. Next, the arbitrary-step randomly delayed high-degree cubature Kalman filter is derived based on the 5th-degree spherical-radial rule and the reformulated expressions. Finally, this filter is modified into the arbitrary-step randomly delayed high-degree cubature Huber-based filter using the Huber technique, which is essentially an M-estimator. The proposed filter is therefore robust not only to randomly delayed measurements but also to glint noise. The application to a boost phase tracking example demonstrates the superiority of the proposed algorithms.
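The Bernoulli delay mechanism in the modified measurement model can be sketched for the one-step case (the paper generalizes it to arbitrary delay depth and unequal delay probabilities). This is a schematic illustration of the sensor behaviour, not the filter itself:

```python
import random

def delayed_stream(measurements, p_delay, seed=0):
    """Simulate a sensor that, with probability p_delay, delivers the
    previous sample instead of the current one: a one-step random delay
    governed by an independent Bernoulli indicator per time step."""
    rng = random.Random(seed)
    out, prev = [], measurements[0]
    for z in measurements:
        gamma = 1 if rng.random() < p_delay else 0  # Bernoulli variable
        out.append(prev if gamma == 1 else z)
        prev = z
    return out
```

A filter that ignores this mechanism treats the stale sample as a fresh one, which is the model mismatch the reformulated predicted-measurement expressions are designed to remove.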

  15. Mind the Noise When Identifying Computational Models of Cognition from Brain Activity.

    PubMed

    Kolossa, Antonio; Kopp, Bruno

    2016-01-01

    The aim of this study was to analyze how measurement error affects the validity of modeling studies in computational neuroscience. A synthetic validity test was created using simulated P300 event-related potentials as an example. The model space comprised four computational models of single-trial P300 amplitude fluctuations which differed in terms of complexity and dependency. The single-trial fluctuation of simulated P300 amplitudes was computed on the basis of one of the models, at various levels of measurement error and at various numbers of data points. Bayesian model selection was performed based on exceedance probabilities. At very low numbers of data points, the least complex model generally outperformed the data-generating model. Invalid model identification also occurred at low levels of data quality and under low numbers of data points if the winning model's predictors were closely correlated with the predictors from the data-generating model. Given sufficient data quality and numbers of data points, the data-generating model could be correctly identified, even against models which were very similar to the data-generating model. Thus, a number of variables affects the validity of computational modeling studies, and data quality and numbers of data points are among the main factors relevant to the issue. Further, the nature of the model space (i.e., model complexity, model dependency) should not be neglected. This study provided quantitative results which show the importance of ensuring the validity of computational modeling via adequately prepared studies. The accomplishment of synthetic validity tests is recommended for future applications. Beyond that, we propose to render the demonstration of sufficient validity via adequate simulations mandatory to computational modeling studies.
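A toy version of the phenomenon, substituting AIC-based least-squares comparison for the paper's Bayesian model selection via exceedance probabilities, can be sketched as follows. The models and data are illustrative, not the P300 setup:

```python
import numpy as np

def aic(rss, n, k):
    """Akaike information criterion for a Gaussian least-squares fit
    with residual sum of squares rss, n data points, k parameters."""
    return n * np.log(rss / n) + 2 * k

def winning_model(x, y):
    """Compare a constant-mean model (1 parameter) against a linear
    model (2 parameters) on the same data; complexity is penalized."""
    n = len(y)
    rss_const = float(np.sum((y - y.mean()) ** 2))
    coef = np.polyfit(x, y, 1)
    rss_lin = float(np.sum((y - np.polyval(coef, x)) ** 2))
    return "linear" if aic(rss_lin, n, 2) < aic(rss_const, n, 1) else "constant"
```

With ample data and low noise the data-generating (linear) model wins; shrinking n or inflating the noise lets the simpler model overtake it, which is the qualitative behaviour the synthetic validity test quantifies.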

  16. New Penicillium and Talaromyces species from honey, pollen and nests of stingless bees.

    PubMed

    Barbosa, Renan N; Bezerra, Jadson D P; Souza-Motta, Cristina M; Frisvad, Jens C; Samson, Robert A; Oliveira, Neiva T; Houbraken, Jos

    2018-04-13

    Penicillium and Talaromyces species have a worldwide distribution and are isolated from various materials and hosts, including insects and their substrates. The aim of this study was to characterize the Penicillium and Talaromyces species obtained during a survey of honey, pollen and the inside of nests of Melipona scutellaris. A total of 100 isolates were obtained during the survey and 82% of those strains belonged to Penicillium and 18% to Talaromyces. Identification of these isolates was performed based on phenotypic characters and β-tubulin and ITS sequencing. Twenty-one species were identified in Penicillium and six in Talaromyces, including seven new species. These new species were studied in detail using a polyphasic approach combining phenotypic, molecular and extrolite data. The four new Penicillium species belong to sections Sclerotiora (Penicillium fernandesiae sp. nov., Penicillium mellis sp. nov., Penicillium meliponae sp. nov.) and Gracilenta (Penicillium apimei sp. nov.) and the three new Talaromyces species to sections Helici (Talaromyces pigmentosus sp. nov.), Talaromyces (Talaromyces mycothecae sp. nov.) and Trachyspermi (Talaromyces brasiliensis sp. nov.). The invalidly described species Penicillium echinulonalgiovense sp. nov. was also isolated during the survey and this species is validated here.

  17. CRISPR/Cas9 mutagenesis invalidates a putative cancer dependency targeted in on-going clinical trials

    PubMed Central

    Lin, Ann; Giuliano, Christopher J; Sayles, Nicole M; Sheltzer, Jason M

    2017-01-01

    The Maternal Embryonic Leucine Zipper Kinase (MELK) has been reported to be a genetic dependency in several cancer types. MELK RNAi and small-molecule inhibitors of MELK block the proliferation of various cancer cell lines, and MELK knockdown has been described as particularly effective against the highly-aggressive basal/triple-negative subtype of breast cancer. Based on these preclinical results, the MELK inhibitor OTS167 is currently being tested as a novel chemotherapy agent in several clinical trials. Here, we report that mutagenizing MELK with CRISPR/Cas9 has no effect on the fitness of basal breast cancer cell lines or cell lines from six other cancer types. Cells that harbor null mutations in MELK exhibit wild-type doubling times, cytokinesis, and anchorage-independent growth. Furthermore, MELK-knockout lines remain sensitive to OTS167, suggesting that this drug blocks cell division through an off-target mechanism. In total, our results undermine the rationale for a series of current clinical trials and provide an experimental approach for the use of CRISPR/Cas9 in preclinical target validation that can be broadly applied. DOI: http://dx.doi.org/10.7554/eLife.24179.001 PMID:28337968

  18. On the validity of the modified equation approach to the stability analysis of finite-difference methods

    NASA Technical Reports Server (NTRS)

    Chang, Sin-Chung

    1987-01-01

    The validity of the modified equation stability analysis introduced by Warming and Hyett was investigated. It is shown that the procedure used in the derivation of the modified equation is flawed and generally leads to invalid results. Moreover, the interpretation of the modified equation as the exact partial differential equation solved by a finite-difference method generally cannot be justified, even if spatial periodicity is assumed. For a two-level scheme, due to a series of mathematical quirks, the connection between the modified equation approach and the von Neumann method established by Warming and Hyett turns out to be correct despite its questionable original derivation. However, this connection is only partially valid for a scheme involving more than two time levels. In the von Neumann analysis, the complex error multiplication factor associated with a wave number generally has (L-1) roots for an L-level scheme. It is shown that the modified equation provides information about only one of these roots.
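For a two-level scheme the von Neumann analysis yields a single amplification root, which is the case where the modified equation connection holds. A minimal sketch for the first-order upwind discretization of u_t + a u_x = 0, a standard textbook example (not taken from the paper):

```python
import cmath
import math

def amplification(c, theta):
    """Von Neumann amplification factor of first-order upwind:
    g(theta) = 1 - c * (1 - exp(-i*theta)), with Courant number c
    and phase angle theta (wave number times grid spacing)."""
    return 1.0 - c * (1.0 - cmath.exp(-1j * theta))

def is_von_neumann_stable(c, n_samples=256):
    """Check |g| <= 1 over a sweep of phase angles in [0, 2*pi)."""
    return all(
        abs(amplification(c, 2.0 * math.pi * k / n_samples)) <= 1.0 + 1e-12
        for k in range(n_samples)
    )
```

A three-level scheme (e.g. leapfrog) would instead produce a quadratic in g with two roots per wave number, only one of which the modified equation describes: that is the asymmetry the abstract points out.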

  19. Why granular media are thermal after all

    NASA Astrophysics Data System (ADS)

    Liu, Mario; Jiang, Yimin

    2017-06-01

    Two approaches exist to account for granular behavior. The thermal one considers the total entropy, which includes microscopic degrees of freedom such as phonons; the athermal one (as with the Edwards entropy) takes grains as elementary. Granular solid hydrodynamics (GSH) belongs to the first; DEM, granular kinetic theory and athermal statistical mechanics (ASM) to the second. A careful discussion of their conceptual differences is given here. Three noteworthy insights or results are: (1) While DEM and granular kinetic theory are well justified in taking grains as elementary, any athermal entropic consideration is bound to run into trouble. (2) Many general principles are taken to be invalid in granular media. Yet within the thermal approach, energy conservation and the fluctuation-dissipation theorem remain valid, granular temperatures equilibrate, and phase space is well explored in a granular medium at rest. Hence these are abnormalities of the athermal approximation, not of granular media as such. (3) GSH is a wide-ranging continuum mechanical description of granular dynamics.

  20. Simultaneous estimation of multiple phases in digital holographic interferometry using state space analysis

    NASA Astrophysics Data System (ADS)

    Kulkarni, Rishikesh; Rastogi, Pramod

    2018-05-01

    A new approach is proposed for multiple phase estimation from a multicomponent exponential phase signal recorded in multi-beam digital holographic interferometry. It is capable of providing multidimensional measurements simultaneously from a single recording of an exponential phase signal encoding multiple phases. Each phase is approximated within a small window around each pixel by a first-order polynomial function of the spatial coordinates. The problem of accurately estimating the polynomial coefficients, and in turn the unwrapped phases, is formulated as a state space analysis wherein the coefficients and signal amplitudes are set as the elements of a state vector. The state estimation is performed using the extended Kalman filter. An amplitude discrimination criterion is utilized to unambiguously estimate the coefficients associated with the individual signal components. The performance of the proposed method is stable over a wide range of ratios of signal amplitudes. The pixelwise phase estimation approach allows the proposed method to handle fringe patterns that may contain invalid regions.

  1. Metastable decoherence-free subspaces and electromagnetically induced transparency in interacting many-body systems

    NASA Astrophysics Data System (ADS)

    Macieszczak, Katarzyna; Zhou, YanLi; Hofferberth, Sebastian; Garrahan, Juan P.; Li, Weibin; Lesanovsky, Igor

    2017-10-01

    We investigate the dynamics of a generic interacting many-body system under conditions of electromagnetically induced transparency (EIT). This problem is of current relevance due to its connection to nonlinear optical media realized by Rydberg atoms. In an interacting system the structure of the dynamics and the approach to the stationary state becomes far more complex than in the case of conventional EIT. In particular, we discuss the emergence of a metastable decoherence-free subspace, whose dimension for a single Rydberg excitation grows linearly in the number of atoms. On approach to stationarity this leads to a slow dynamics, which renders the typical assumption of fast relaxation invalid. We derive analytically the effective nonequilibrium dynamics in the decoherence-free subspace, which features coherent and dissipative two-body interactions. We discuss the use of this scenario for the preparation of collective entangled dark states and the realization of general unitary dynamics within the spin-wave subspace.

  2. Predicting the Overall Spatial Quality of Automotive Audio Systems

    NASA Astrophysics Data System (ADS)

    Koya, Daisuke

    The spatial quality of automotive audio systems is often compromised due to their far-from-ideal listening environments. Automotive audio systems need to be developed quickly due to industry demands. A suitable perceptual model could evaluate the spatial quality of automotive audio systems with similar reliability to formal listening tests but take less time. Such a model is developed in this research project by adapting an existing model of spatial quality for automotive audio use. The requirements for the adaptation were investigated in a literature review. A perceptual model called QESTRAL was reviewed, which predicts the overall spatial quality of domestic multichannel audio systems. It was determined that automotive audio systems are likely to be impaired in terms of the spatial attributes that were not considered in developing the QESTRAL model, but metrics are available that might predict these attributes. To establish whether the QESTRAL model in its current form can accurately predict the overall spatial quality of automotive audio systems, MUSHRA listening tests using headphone auralisation with head tracking were conducted to collect results to be compared against predictions by the model. Based on guideline criteria, the model in its current form could not accurately predict the overall spatial quality of automotive audio systems. To improve prediction performance, the QESTRAL model was recalibrated and modified using existing metrics of the model, those that were proposed from the literature review, and newly developed metrics. The most important metrics for predicting the overall spatial quality of automotive audio systems included those that were interaural cross-correlation (IACC) based, relate to localisation of the frontal audio scene, and account for the perceived scene width in front of the listener. Modifying the model for automotive audio systems did not invalidate its use for domestic audio systems. The resulting model predicts the overall spatial quality of 2- and 5-channel automotive audio systems with a cross-validation performance of R² = 0.85 and root-mean-square error (RMSE) = 11.03%.
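The IACC-based metrics the abstract singles out all build on one quantity: the peak of the normalized cross-correlation between the two ear signals over a small lag range. The QESTRAL model's actual metrics are more elaborate; this is only a minimal sketch of that core computation on plain sample lists, with hypothetical signals.

```python
import math

def iacc(left, right, max_lag):
    """Interaural cross-correlation: peak magnitude of the normalized
    cross-correlation of the left/right ear signals over lags up to
    max_lag samples (non-overlapping samples are treated as zero)."""
    norm = math.sqrt(sum(x * x for x in left) * sum(x * x for x in right))
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        acc = 0.0
        for i, x in enumerate(left):
            j = i + lag
            if 0 <= j < len(right):
                acc += x * right[j]
        best = max(best, abs(acc) / norm)
    return best

sig = [math.sin(0.2 * n) for n in range(200)]
other = [math.sin(0.37 * n) for n in range(200)]
identical = iacc(sig, sig, 8)      # fully correlated ears -> 1.0
different = iacc(sig, other, 8)    # weakly correlated ears -> well below 1
```

An IACC near 1 corresponds to a narrow perceived source; lower values correspond to greater perceived width, which is why width-related metrics are built on it.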

  3. Preventing disease transmission by deceased tissue donors by testing blood for viral nucleic acid.

    PubMed

    Strong, D Michael; Nelson, Karen; Pierce, Marge; Stramer, Susan L

    2005-01-01

    Nucleic acid testing (NAT) has reduced the risk of transmitting infectious disease through blood transfusion. Currently, NAT for HIV-1 and HCV is FDA-licensed and performed by nearly all blood collection facilities, while HBV NAT is performed under an investigational study protocol. Residual risk estimates indicate that NAT could potentially reduce disease transmission through transplanted tissue. However, tissue donor samples obtained post-mortem have the potential to produce an invalid NAT result due to inhibition of amplification reactions by hemolysis and other factors. The studies reported here summarize the development of protocols that allow NAT of deceased donor samples with reduced rates of invalid results. Using these protocols, inventories from two tissue centers were tested, with greater than 99% of samples producing a valid test result.

  4. Reasoning from an incompatibility: False dilemma fallacies and content effects.

    PubMed

    Brisson, Janie; Markovits, Henry; Robert, Serge; Schaeken, Walter

    2018-03-23

    In the present studies, we investigated inferences from an incompatibility statement. Starting with two propositions that cannot be true at the same time, these inferences consist of deducing the falsity of one from the truth of the other or deducing the truth of one from the falsity of the other. Inferences of this latter form are relevant to human reasoning since they are the formal equivalent of a discourse manipulation called the false dilemma fallacy, often used in politics and advertising in order to force a choice between two selected options. Based on research on content-related variability in conditional reasoning, we predicted that content would have an impact on how reasoners treat incompatibility inferences. Like conditional inferences, they present two invalid forms for which the logical response is one of uncertainty. We predicted that participants would endorse a smaller proportion of the invalid incompatibility inferences when more counterexamples are available. In Study 1, we found the predicted pattern using causal premises translated into incompatibility statements with many and few counterexamples. In Study 2A, we replicated the content effects found in Study 1, but with premises for which the incompatibility statement is a non-causal relation between classes. These results suggest that the tendency to fall into the false dilemma fallacy is modulated by the background knowledge of the reasoner. They also provide additional evidence on the link between semantic information retrieval and deduction.
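The validity claims above can be checked mechanically by enumerating truth assignments: an inference is valid iff its conclusion holds in every model of the premises. A minimal sketch for the two incompatibility forms discussed (all names here are illustrative):

```python
from itertools import product

def follows(models, conclusion):
    """A conclusion follows logically iff it holds in every model of the premises."""
    return all(conclusion(a, b) for a, b in models)

# Models of the incompatibility statement: A and B cannot both be true.
models = [(a, b) for a, b in product([True, False], repeat=2) if not (a and b)]

# Valid form: from the incompatibility and the truth of A, infer not-B.
valid_form = follows([(a, b) for (a, b) in models if a], lambda a, b: not b)

# False dilemma: from the incompatibility and the falsity of A, infer B.
# Invalid: (A=False, B=False) satisfies the premises but not the conclusion.
false_dilemma = follows([(a, b) for (a, b) in models if not a], lambda a, b: b)
```

The counterexample model (A false, B false) is exactly what the abstract's "logical response of uncertainty" refers to: nothing rules out both options being false.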

  5. Harnessing the theoretical foundations of the exponential and beta-Poisson dose-response models to quantify parameter uncertainty using Markov Chain Monte Carlo.

    PubMed

    Schmidt, Philip J; Pintar, Katarina D M; Fazil, Aamir M; Topp, Edward

    2013-09-01

    Dose-response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that is inherent to computed risks because the dose-response model parameters are estimated using limited epidemiological data is rarely quantified. Second-order risk characterization approaches incorporating uncertainty in dose-response model parameters can provide more complete information to decisionmakers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures to sample from posterior distributions describing uncertainty in the parameters of exponential and beta-Poisson dose-response models using Bayes's theorem and Markov Chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta-Poisson dose-response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta-Poisson dose-response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta-Poisson model are proposed, and simple algorithms to evaluate actual beta-Poisson probabilities of infection are investigated. The developed MCMC procedures are applied to analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta-Poisson dose-response model parameters is attributable to the absence of low-dose data. This region includes beta-Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape with questionable plausibility. © Her Majesty the Queen in Right of Canada 2013. Reproduced with the permission of the Minister of the Public Health Agency of Canada.
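The exact beta-Poisson model and the conventional approximation the abstract contrasts can be compared numerically: the exact model averages the exponential model 1 − exp(−r·dose) over r ~ Beta(α, β), while the approximation 1 − (1 + dose/β)^(−α) is reliable only when β ≫ 1 and β ≫ α. A sketch with hypothetical parameter values (not taken from the paper):

```python
import math

def beta_pdf(r, alpha, beta):
    """Density of the Beta(alpha, beta) distribution at r in (0, 1)."""
    B = math.gamma(alpha) * math.gamma(beta) / math.gamma(alpha + beta)
    return r ** (alpha - 1) * (1 - r) ** (beta - 1) / B

def p_beta_poisson_exact(dose, alpha, beta, n=20000):
    """Exact beta-Poisson infection probability: the exponential-model risk
    1 - exp(-r*dose) averaged over r ~ Beta(alpha, beta), via midpoint rule."""
    h = 1.0 / n
    return sum((1 - math.exp(-((k + 0.5) * h) * dose))
               * beta_pdf((k + 0.5) * h, alpha, beta) * h
               for k in range(n))

def p_beta_poisson_approx(dose, alpha, beta):
    """Conventional approximation; reliable only when beta >> 1 and beta >> alpha."""
    return 1 - (1 + dose / beta) ** (-alpha)

# With alpha = 0.2, beta = 40 the approximation conditions hold and the two agree.
exact = p_beta_poisson_exact(1.0, 0.2, 40.0)
approx = p_beta_poisson_approx(1.0, 0.2, 40.0)
```

The paper's point is that MCMC posteriors often include parameter regions where these conditions fail, so risk calculations conditioned on the approximation there can be badly off.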

  6. The Emperors sham - wrong assumption that sham needling is sham.

    PubMed

    Lundeberg, Thomas; Lund, Iréne; Näslund, Jan; Thomas, Moolamanil

    2008-12-01

    During the last five years a large number of randomised controlled clinical trials (RCTs) have been published on the efficacy of acupuncture in different conditions. In most of these studies verum is compared with sham acupuncture. In general both verum and sham have been found to be effective, often with little reported difference in outcome. This has repeatedly led to the conclusion that acupuncture is no more effective than placebo treatment. However, this conclusion rests on the assumption that sham acupuncture is inert. Since sham acupuncture is, from the physiological perspective, evidently merely another form of acupuncture, the assumption that sham is sham is incorrect, and conclusions based on it are therefore invalid. Clinical guidelines based on such conclusions may therefore exclude suffering patients from valuable treatments.

  7. Superadiabatic driving of a three-level quantum system

    NASA Astrophysics Data System (ADS)

    Theisen, M.; Petiziol, F.; Carretta, S.; Santini, P.; Wimberger, S.

    2017-07-01

    We study superadiabatic quantum control of a three-level quantum system whose energy spectrum exhibits multiple avoided crossings. In particular, we investigate the possibility of treating the full control task in terms of independent two-level Landau-Zener problems. We first show that the time profiles of the elements of the full control Hamiltonian are characterized by peaks centered around the crossing times. These peaks decay algebraically for large times. In principle, such a power-law scaling invalidates the hypothesis of perfect separability. Nonetheless, we address the problem from a pragmatic point of view by studying the fidelity obtained through separate control as a function of the intercrossing separation. This procedure may be a good approach to achieve approximate adiabatic driving of a specific instantaneous eigenstate in realistic implementations.

  8. Thermal Stress Analysis of Floating-Gate Tunneling Oxide Electrically Erasable Programmable Read Only Memory During Manufacturing Process

    NASA Astrophysics Data System (ADS)

    Zong, Xiang-fu; Wang, Xu; Weng, Yu-min; Yan, Ren-jin; Tang, Guo-an; Zhang, Zhao-qiang

    1998-10-01

    In this study, finite element modeling was used to evaluate the residual thermal stress arising in the floating-gate tunneling oxide electrically erasable programmable read-only memory (FLOTOX E2PROM) manufacturing process. Special attention is paid to the tunnel oxide region, in which high-field electron injection is the basis of E2PROM operation. Calculated results show the presence of large stresses and stress gradients at the fringe. These may contribute to the failure of E2PROMs. A possible failure mechanism of the E2PROM related to residual thermal stress-induced leakage is proposed.

  9. Mental models and human reasoning

    PubMed Central

    Johnson-Laird, Philip N.

    2010-01-01

    To be rational is to be able to reason. Thirty years ago psychologists believed that human reasoning depended on formal rules of inference akin to those of a logical calculus. This hypothesis ran into difficulties, which led to an alternative view: reasoning depends on envisaging the possibilities consistent with the starting point—a perception of the world, a set of assertions, a memory, or some mixture of them. We construct mental models of each distinct possibility and derive a conclusion from them. The theory predicts systematic errors in our reasoning, and the evidence corroborates this prediction. Yet, our ability to use counterexamples to refute invalid inferences provides a foundation for rationality. On this account, reasoning is a simulation of the world fleshed out with our knowledge, not a formal rearrangement of the logical skeletons of sentences. PMID:20956326

  10. Shortening anomalies in supersymmetric theories

    DOE PAGES

    Gomis, Jaume; Komargodski, Zohar; Ooguri, Hirosi; ...

    2017-01-17

    We present new anomalies in two-dimensional N = (2, 2) superconformal theories. They obstruct the shortening conditions of chiral and twisted chiral multiplets at coincident points. This implies that marginal couplings cannot be promoted to background superfields in short representations. Therefore, standard results that follow from N = (2, 2) spurion analysis are invalidated. These anomalies appear only if supersymmetry is enhanced beyond N = (2, 2). They explain why the conformal manifolds of the K3 and T4 sigma models are not Kähler and do not factorize into chiral and twisted chiral moduli spaces, and why there are no N = (2, 2) gauged linear sigma models that cover these conformal manifolds. We also present these results from the point of view of the Riemann curvature of conformal manifolds.

  11. Shortening anomalies in supersymmetric theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gomis, Jaume; Komargodski, Zohar; Ooguri, Hirosi

    We present new anomalies in two-dimensional N = (2, 2) superconformal theories. They obstruct the shortening conditions of chiral and twisted chiral multiplets at coincident points. This implies that marginal couplings cannot be promoted to background superfields in short representations. Therefore, standard results that follow from N = (2, 2) spurion analysis are invalidated. These anomalies appear only if supersymmetry is enhanced beyond N = (2, 2). They explain why the conformal manifolds of the K3 and T4 sigma models are not Kähler and do not factorize into chiral and twisted chiral moduli spaces, and why there are no N = (2, 2) gauged linear sigma models that cover these conformal manifolds. We also present these results from the point of view of the Riemann curvature of conformal manifolds.

  12. Robust criticality of an Ising model on rewired directed networks

    NASA Astrophysics Data System (ADS)

    Lipowski, Adam; Gontarek, Krzysztof; Lipowska, Dorota

    2015-06-01

    We show that preferential rewiring, which is supposed to mimic the behavior of financial agents, changes a directed-network Ising ferromagnet with a single critical point into a model with robust critical behavior. For the nonrewired random graph version, due to a constant number of out-links for each site, we write a simple mean-field-like equation describing the behavior of magnetization; we argue that it is exact and support the claim with extensive Monte Carlo simulations. For the rewired version, this equation is obeyed only at low temperatures. At higher temperatures, rewiring leads to strong heterogeneities, which apparently invalidates mean-field arguments and induces large fluctuations and divergent susceptibility. Such behavior is traced back to the formation of a relatively small core of agents that influence the entire system.
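For a directed graph in which every site has the same number z of neighbors supplying its local field, the mean-field-like equation mentioned above takes the standard form m = tanh(z·m/T) (in units J = k_B = 1); the paper's exact equation for the nonrewired random graph may differ in detail. A fixed-point-iteration sketch under that assumption:

```python
import math

def magnetization(T, z=4, iters=2000, m0=0.5):
    """Fixed-point iteration of the mean-field equation m = tanh(z*m/T),
    assuming every site receives its field from exactly z neighbors.
    Converges to m = 0 above T_c = z and to a nonzero root below it."""
    m = m0
    for _ in range(iters):
        m = math.tanh(z * m / T)
    return m

low_T = magnetization(2.0)   # T < T_c = 4: ordered, |m| > 0
high_T = magnetization(6.0)  # T > T_c = 4: disordered, m -> 0
```

The paper's point is that this mean-field picture holds throughout only for the nonrewired version; with preferential rewiring it survives only at low temperatures, where heterogeneities have not yet built up.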

  13. Radiometric Characterization Results for the IKONOS, Quickbird, and OrbView-3 Sensor

    NASA Technical Reports Server (NTRS)

    Holekamp, Kara; Aaron, David; Thome, Kurtis

    2006-01-01

    Radiometric calibration of commercial imaging satellite products is required to ensure that science and application communities better understand commercial imaging satellite properties. Inaccurate radiometric calibrations can lead to erroneous decisions and invalid conclusions and can limit intercomparisons with other systems. To address this calibration need, the NASA Applied Sciences Directorate (ASD) at Stennis Space Center established a commercial satellite imaging radiometric calibration team consisting of three independent groups: NASA ASD, the University of Arizona Remote Sensing Group, and South Dakota State University. Each group independently determined the absolute radiometric calibration coefficients of available high-spatial-resolution commercial 4-band multispectral products, in the visible through near-infrared spectrum, from GeoEye™ (formerly Space Imaging®) IKONOS, DigitalGlobe® QuickBird, and GeoEye (formerly ORBIMAGE®) OrbView-3. Each team member employed some variant of the reflectance-based vicarious calibration approach, requiring ground-based measurements coincident with image acquisitions and radiative transfer calculations. Several study sites throughout the United States that covered a significant portion of each sensor's dynamic range were employed. Satellite at-sensor radiance values were compared to those estimated by each independent team member to evaluate each sensor's radiometric accuracy. The combined results of this evaluation provide the user community with an independent assessment of these sensors' absolute calibration values.
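At its core, determining absolute calibration coefficients in a reflectance-based vicarious approach reduces to regressing the at-sensor radiance predicted from ground measurements and radiative transfer against the sensor's recorded digital numbers. A minimal sketch with entirely hypothetical numbers (the teams' actual procedures involve far more than this fit):

```python
def fit_calibration(dns, radiances):
    """Least-squares fit of the linear radiometric model L = gain*DN + offset,
    where L is the predicted at-sensor radiance and DN the recorded
    digital number at each study site."""
    n = len(dns)
    sx, sy = sum(dns), sum(radiances)
    sxx = sum(x * x for x in dns)
    sxy = sum(x * y for x, y in zip(dns, radiances))
    gain = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    offset = (sy - gain * sx) / n
    return gain, offset

# Hypothetical collects over sites spanning a large part of the dynamic range:
dns = [120.0, 480.0, 910.0, 1530.0]
radiances = [0.02 * dn + 1.5 for dn in dns]  # synthetic, exactly linear
gain, offset = fit_calibration(dns, radiances)
```

Spreading the sites across the dynamic range, as the abstract describes, is what keeps this regression well conditioned.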

  14. 16 CFR 305.24 - Stayed or invalid parts.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... CONCERNING DISCLOSURES REGARDING ENERGY CONSUMPTION AND WATER USE OF CERTAIN HOME APPLIANCES AND OTHER PRODUCTS REQUIRED UNDER THE ENERGY POLICY AND CONSERVATION ACT ("APPLIANCE LABELING RULE") Effect of This...

  15. 16 CFR 305.24 - Stayed or invalid parts.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... CONCERNING DISCLOSURES REGARDING ENERGY CONSUMPTION AND WATER USE OF CERTAIN HOME APPLIANCES AND OTHER PRODUCTS REQUIRED UNDER THE ENERGY POLICY AND CONSERVATION ACT ("APPLIANCE LABELING RULE") Effect of This...

  16. 16 CFR 305.24 - Stayed or invalid parts.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... CONCERNING DISCLOSURES REGARDING ENERGY CONSUMPTION AND WATER USE OF CERTAIN HOME APPLIANCES AND OTHER PRODUCTS REQUIRED UNDER THE ENERGY POLICY AND CONSERVATION ACT ("APPLIANCE LABELING RULE") Effect of This...

  17. 16 CFR 305.24 - Stayed or invalid parts.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... CONCERNING DISCLOSURES REGARDING ENERGY CONSUMPTION AND WATER USE OF CERTAIN HOME APPLIANCES AND OTHER PRODUCTS REQUIRED UNDER THE ENERGY POLICY AND CONSERVATION ACT ("APPLIANCE LABELING RULE") Effect of This...

  18. A multi-site cognitive task analysis for biomedical query mediation.

    PubMed

    Hruby, Gregory W; Rasmussen, Luke V; Hanauer, David; Patel, Vimla L; Cimino, James J; Weng, Chunhua

    2016-09-01

    To apply cognitive task analyses of the Biomedical query mediation (BQM) processes for EHR data retrieval at multiple sites towards the development of a generic BQM process model. We conducted semi-structured interviews with eleven data analysts from five academic institutions and one government agency, and performed cognitive task analyses on their BQM processes. A coding schema was developed through iterative refinement and used to annotate the interview transcripts. The annotated dataset was used to reconstruct and verify each BQM process and to develop a harmonized BQM process model. A survey was conducted to evaluate the face and content validity of this harmonized model. The harmonized process model is hierarchical, encompassing tasks, activities, and steps. The face validity evaluation concluded the model to be representative of the BQM process. In the content validity evaluation, out of the 27 tasks for BQM, 19 meet the threshold for semi-valid, including 3 fully valid: "Identify potential index phenotype," "If needed, request EHR database access rights," and "Perform query and present output to medical researcher", and 8 are invalid. We aligned the goals of the tasks within the BQM model with the five components of the reference interview. The similarity between the process of BQM and the reference interview is promising and suggests the BQM tasks are powerful for eliciting implicit information needs. We contribute a BQM process model based on a multi-site study. This model promises to inform the standardization of the BQM process towards improved communication efficiency and accuracy. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. A Multi-Site Cognitive Task Analysis for Biomedical Query Mediation

    PubMed Central

    Hruby, Gregory W.; Rasmussen, Luke V.; Hanauer, David; Patel, Vimla; Cimino, James J.; Weng, Chunhua

    2016-01-01

    Objective To apply cognitive task analyses of the Biomedical query mediation (BQM) processes for EHR data retrieval at multiple sites towards the development of a generic BQM process model. Materials and Methods We conducted semi-structured interviews with eleven data analysts from five academic institutions and one government agency, and performed cognitive task analyses on their BQM processes. A coding schema was developed through iterative refinement and used to annotate the interview transcripts. The annotated dataset was used to reconstruct and verify each BQM process and to develop a harmonized BQM process model. A survey was conducted to evaluate the face and content validity of this harmonized model. Results The harmonized process model is hierarchical, encompassing tasks, activities, and steps. The face validity evaluation concluded the model to be representative of the BQM process. In the content validity evaluation, out of the 27 tasks for BQM, 19 meet the threshold for semi-valid, including 3 fully valid: “Identify potential index phenotype,” “If needed, request EHR database access rights,” and “Perform query and present output to medical researcher”, and 8 are invalid. Discussion We aligned the goals of the tasks within the BQM model with the five components of the reference interview. The similarity between the process of BQM and the reference interview is promising and suggests the BQM tasks are powerful for eliciting implicit information needs. Conclusions We contribute a BQM process model based on a multi-site study. This model promises to inform the standardization of the BQM process towards improved communication efficiency and accuracy. PMID:27435950

  20. Simultaneous estimation of diet composition and calibration coefficients with fatty acid signature data

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; Budge, Suzanne M.; Thiemann, Gregory W.; Rode, Karyn D.

    2017-01-01

    Knowledge of animal diets provides essential insights into their life history and ecology, although diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) has become a popular method of estimating diet composition, especially for marine species. A primary assumption of QFASA is that constants called calibration coefficients, which account for the differential metabolism of individual fatty acids, are known. In practice, however, calibration coefficients are not known, but rather have been estimated in feeding trials with captive animals of a limited number of model species. The impossibility of verifying the accuracy of feeding trial derived calibration coefficients to estimate the diets of wild animals is a foundational problem with QFASA that has generated considerable criticism. We present a new model that allows simultaneous estimation of diet composition and calibration coefficients based only on fatty acid signature samples from wild predators and potential prey. Our model performed almost flawlessly in four tests with constructed examples, estimating both diet proportions and calibration coefficients with essentially no error. We also applied the model to data from Chukchi Sea polar bears, obtaining diet estimates that were more diverse than estimates conditioned on feeding trial calibration coefficients. Our model avoids bias in diet estimates caused by conditioning on inaccurate calibration coefficients, invalidates the primary criticism of QFASA, eliminates the need to conduct feeding trials solely for diet estimation, and consequently expands the utility of fatty acid data to investigate aspects of ecology linked to animal diets.
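The estimation task QFASA solves can be illustrated in miniature: find the diet proportions whose calibration-adjusted prey-signature mixture best matches the predator's fatty acid signature. The sketch below is not the paper's model (which estimates the calibration coefficients jointly, across many prey types); it takes the coefficients as known, uses two prey types, and grid-searches the single mixing proportion. All signatures and coefficients are hypothetical.

```python
def adjust(sig, calib):
    """Apply fatty acid calibration coefficients and renormalize to sum to 1."""
    w = [c * s for c, s in zip(calib, sig)]
    t = sum(w)
    return [v / t for v in w]

def qfasa_two_prey(pred, prey1, prey2, calib, steps=10001):
    """Grid search for the diet proportion pi of prey1 minimizing the squared
    distance between the predator signature and the adjusted prey mixture."""
    a1, a2 = adjust(prey1, calib), adjust(prey2, calib)
    best_pi, best_d = 0.0, float("inf")
    for i in range(steps):
        pi = i / (steps - 1)
        mix = [pi * u + (1 - pi) * v for u, v in zip(a1, a2)]
        d = sum((p - m) ** 2 for p, m in zip(pred, mix))
        if d < best_d:
            best_pi, best_d = pi, d
    return best_pi

prey1 = [0.5, 0.3, 0.2]
prey2 = [0.2, 0.3, 0.5]
calib = [1.2, 0.8, 1.0]
# Construct a predator whose true diet is 30% prey1, 70% prey2.
pred = [0.3 * u + 0.7 * v for u, v in zip(adjust(prey1, calib), adjust(prey2, calib))]
pi_hat = qfasa_two_prey(pred, prey1, prey2, calib)
```

The paper's critique is visible even here: conditioning on wrong values in `calib` shifts `pi_hat` systematically, which is the bias its joint-estimation model is designed to avoid.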
