Science.gov

Sample records for evolving reliable relationships

  1. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, Jamie; Toon, Troy; Adams, Timothy C.; Miranda, David J.

    2016-01-01

    This paper describes the methodology that was developed to allocate reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program's subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. Allocation is an iterative process; as systems moved beyond their conceptual and preliminary design phases, the reliability engineering team had an opportunity to reevaluate allocations based on updated designs and the maintainability characteristics of the components. Trade-offs between reliability and maintainability were essential to ensuring the integrity of the analysis. This paper discusses the value of modifying the reliability and maintainability allocations made for the GSDO subsystems as the program nears the end of its design phase.

  2. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, Troy; Toon, Jamie; Conner, Angelo C.; Adams, Timothy C.; Miranda, David J.

    2016-01-01

    This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program’s subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. These new allocations are based on updated designs and maintainability characteristics of the components. It was found that trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.

  3. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, T.; Toon, J.; Conner, A.; Adams, T.; Miranda, D.

    2016-01-01

    This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program's subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. These new allocations are based on updated designs and maintainability characteristics of the components. It was found that trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.
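    The allocation-and-reallocation loop described above can be sketched numerically. The following is a generic weighted apportionment (an ARINC-style scheme), not the GSDO team's actual method; the system target and subsystem weights are invented for illustration.

```python
import math

def allocate_reliability(r_system: float, weights: list[float]) -> list[float]:
    """Apportion a system reliability target among subsystems.

    Subsystems with a larger weight (e.g. a higher predicted failure
    rate) receive a larger share of the allowed unreliability and hence
    a *lower* individual reliability target.
    """
    total = sum(weights)
    log_r = math.log(r_system)  # split the log-reliability "budget"
    return [math.exp(log_r * w / total) for w in weights]

# Initial allocation: 0.999 system target over three subsystems.
initial = allocate_reliability(0.999, [1.0, 2.0, 1.0])

# Later in the design cycle, hardware data shifts the weights; the
# reallocation moves the budget without changing the system target.
updated = allocate_reliability(0.999, [0.5, 3.0, 0.5])

# The product of the subsystem targets always recovers the system target.
assert abs(math.prod(initial) - 0.999) < 1e-12
```

    Because the allocation is multiplicative, the reallocation is exactly budget-neutral at the system level; a real program would derive the weights from failure-rate predictions, criticality, and maintainability data rather than the bare numbers used here.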

  4. Towards resolving Lamiales relationships: insights from rapidly evolving chloroplast sequences

    PubMed Central

    2010-01-01

    Background In the large angiosperm order Lamiales, a diverse array of highly specialized life strategies such as carnivory, parasitism, epiphytism, and desiccation tolerance occur, and some lineages possess drastically accelerated DNA substitutional rates or miniaturized genomes. However, understanding the evolution of these phenomena in the order, and clarifying borders of and relationships among lamialean families, has been hindered by largely unresolved trees in the past. Results Our analysis of the rapidly evolving trnK/matK, trnL-F and rps16 chloroplast regions enabled us to infer more precise phylogenetic hypotheses for the Lamiales. Relationships among the nine first-branching families in the Lamiales tree are now resolved with very strong support. Subsequent to Plocospermataceae, a clade consisting of Carlemanniaceae plus Oleaceae branches, followed by Tetrachondraceae and a newly inferred clade composed of Gesneriaceae plus Calceolariaceae, which is also supported by morphological characters. Plantaginaceae (incl. Gratioleae) and Scrophulariaceae are well separated in the backbone grade; Lamiaceae and Verbenaceae appear in distant clades, while the recently described Linderniaceae are confirmed to be monophyletic and in an isolated position. Conclusions Confidence about deep nodes of the Lamiales tree is an important step towards understanding the evolutionary diversification of a major clade of flowering plants. The degree of resolution obtained here now provides a first opportunity to discuss the evolution of morphological and biochemical traits in Lamiales. The multiple independent evolution of the carnivorous syndrome, once in Lentibulariaceae and a second time in Byblidaceae, is strongly supported by all analyses and topological tests. The evolution of selected morphological characters such as flower symmetry is discussed. 
    The addition of further sequence data from introns and spacers holds promise to eventually obtain a fully resolved plastid tree of…

  5. Young People and Alcohol in Italy: An Evolving Relationship

    ERIC Educational Resources Information Center

    Beccaria, Franca; Prina, Franco

    2010-01-01

    In Italy, commonly held opinions and interpretations about the relationship between young people and alcohol are often expressed as generalizations and approximations. In order to further understanding of the relationship between young people and alcohol in contemporary Italy, we have gathered, compared and discussed all the available data, both…

  7. Models of Shared Leadership: Evolving Structures and Relationships.

    ERIC Educational Resources Information Center

    Hallinger, Philip; Richardson, Don

    1988-01-01

    Explores potential changes in the power relationships among teachers and principals. Describes and analyzes the following models of teacher decision-making: (1) Instructional Leadership Teams; (2) Principals' Advisory Councils; (3) School Improvement Teams; and (4) Lead Teacher Committees. (FMW)

  8. Risk and responsibility: a complex and evolving relationship.

    PubMed

    Kermisch, Céline

    2012-03-01

    This paper analyses the nature of the relationship between risk and responsibility. Since neither the concept of risk nor the concept of responsibility has an unequivocal definition, it is obvious that there is no single interpretation of their relationship. After introducing the different meanings of responsibility used in this paper, we analyse four conceptions of risk. This allows us to make their link with responsibility explicit and to determine if a shift in the connection between risk and responsibility can be outlined. (1) In the engineer's paradigm, the quantitative conception of risk does not include any concept of responsibility. Their relationship is indirect, the locus of responsibility being risk management. (2) In Mary Douglas' cultural theory, risks are constructed through the responsibilities they engage. (3) Rayner and (4) Wolff go further by integrating forms of responsibility in the definition of risk itself. Analysis of these four frameworks shows that the concepts of risk and responsibility are increasingly intertwined. This tendency is reinforced by increasing public awareness and a call for the integration of a moral dimension in risk management. Therefore, we suggest that a form of virtue-responsibility should also be integrated in the concept of risk. © Springer Science+Business Media B.V. 2010

  9. ELECTRICAL SUBSTATION RELIABILITY EVALUATION WITH EMPHASIS ON EVOLVING INTERDEPENDENCE ON COMMUNICATION INFRASTRUCTURE.

    SciTech Connect

    AZARM,M.A.; BARI,R.; YUE,M.; MUSICKI,Z.

    2004-09-12

    This study developed a probabilistic methodology for assessment of the reliability and security of electrical energy distribution networks. This included consideration of the future grid system, which will rely heavily on the existing digitally based communication infrastructure for monitoring and protection. Event tree and fault tree methods were utilized. The approach extensively modeled the types of faults that a grid could potentially experience, the response of the grid, and the specific design of the protection schemes. We demonstrated the methods by applying them to a small sub-section of a hypothetical grid based on an existing electrical grid system of a metropolitan area. The results showed that for a typical design that relies on a communication network for protection, the communication network reliability could contribute significantly to the frequency of loss of electrical power. The reliability of the communication network could become a more important contributor to the electrical grid reliability as the utilization of the communication network significantly increases in the near future to support "smart" transmission and/or distributed generation.
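    The event-tree/fault-tree logic can be illustrated with a toy cut-set calculation. The gate algebra is standard; the fault frequency and failure probabilities below are assumptions chosen for illustration, not values from the study.

```python
def or_gate(*probs):
    """P(at least one of several independent basic events occurs)."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def and_gate(*probs):
    """P(all independent basic events occur)."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Top event: loss of power = a grid fault arrives AND protection fails.
# Protection fails if the relay fails OR the communication network that
# carries its trip signals is unavailable (assumed numbers).
fault_freq = 2.0    # grid faults per year
p_relay = 1e-3      # relay fails on demand
p_comm = 1e-2       # communication network unavailable on demand

p_protection_fails = or_gate(p_relay, p_comm)
loss_of_power_freq = fault_freq * p_protection_fails

# With these numbers the communication network contributes ~91% of the
# protection failure probability, mirroring the study's conclusion that
# communication reliability can dominate grid reliability.
```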

  10. Do aggressive signals evolve towards higher reliability or lower costs of assessment?

    PubMed

    Ręk, P

    2014-12-01

    It has been suggested that the evolution of signals must be a wasteful process for the signaller, aimed at the maximization of signal honesty. However, the reliability of communication depends not only on the costs paid by signallers but also on the costs paid by receivers during assessment, and less attention has been given to the interaction between these two types of costs during the evolution of signalling systems. A signaller and receiver may accept some level of signal dishonesty by choosing signals that are cheaper in terms of assessment but that are stabilized with less reliable mechanisms. I studied the potential trade-off between signal reliability and the costs of signal assessment in the corncrake (Crex crex). I found that the birds prefer signals that are less costly regarding assessment rather than more reliable. Despite the fact that the fundamental frequency of calls was a strong predictor of male size, it was ignored by receivers unless they could directly compare signal variants. My data revealed a response advantage of costly signals when comparison between calls differing with fundamental frequencies is fast and straightforward, whereas cheap signalling is preferred in natural conditions. These data might improve our understanding of the influence of receivers on signal design because they support the hypothesis that fully honest signalling systems may be prone to dishonesty based on the effects of receiver costs and be replaced by signals that are cheaper in production and reception but more susceptible to cheating.

  11. ELECTRICAL SUBSTATION RELIABILITY EVALUATION WITH EMPHASIS ON EVOLVING INTERDEPENDENCE ON COMMUNICATION INFRASTRUCTURE.

    SciTech Connect

    AZARM, M.A.; BARI, R.A.; MUSICKI, Z.

    2004-01-15

    The objective of this study is to develop a methodology for a probabilistic assessment of the reliability and security of electrical energy distribution networks. This includes consideration of the future grid system, which will rely heavily on the existing digitally based communication infrastructure for monitoring and protection. Another important objective of this study is to provide information and insights from this research to Consolidated Edison Company (Con Edison) that could be useful in the design of the new network segment to be installed in the area of the World Trade Center in lower Manhattan. Our method is microscopic in nature and relies heavily on the specific design of the portion of the grid being analyzed. It extensively models the types of faults that a grid could potentially experience, the response of the grid, and the specific design of the protection schemes. We demonstrate that the existing technology can be extended and applied to the electrical grid and to the supporting communication network. A small subsection of a hypothetical grid based on the existing New York City electrical grid system of Con Edison is used to demonstrate the methods. Sensitivity studies show that in the current design the frequency for the loss of the main station is sensitive to the communication network reliability. The reliability of the communication network could become a more important contributor to the electrical grid reliability as the utilization of the communication network significantly increases in the near future to support "smart" transmission and/or distributed generation. The identification of potential failure modes and their likelihood can support decisions on potential modifications to the network including hardware, monitoring instrumentation, and protection systems.

  12. Considering context: reliable entity networks through contextual relationship extraction

    NASA Astrophysics Data System (ADS)

    David, Peter; Hawes, Timothy; Hansen, Nichole; Nolan, James J.

    2016-05-01

    Existing information extraction techniques can only partially address the problem of exploiting unmanageably large amounts of text. When discussion of events and relationships is limited to simple, past-tense, factual descriptions of events, current NLP-based systems can identify events and relationships and extract a limited amount of additional information. But the simple subset of available information that existing tools can extract from text is only useful to a small set of users and problems. Automated systems need to find and separate information based on whether it is threatened or planned to occur, has occurred in the past, or could potentially occur. We address the problem of advanced event and relationship extraction with our event and relationship attribute recognition system, which labels generic, planned, recurring, and potential events. The approach is based on a combination of new machine learning methods, novel linguistic features, and crowd-sourced labeling. The attribute labeler closes the gap between structured event and relationship models and the complicated and nuanced language that people use to describe them. Our operational-quality event and relationship attribute labeler enables Warfighters and analysts to more thoroughly exploit information in unstructured text. This is made possible through 1) more precise event and relationship interpretation, 2) more detailed information about extracted events and relationships, and 3) more reliable and informative entity networks that acknowledge the different attributes of entity-entity relationships.

  13. Changes of scaling relationships in an evolving population: The example of "sedimentary" stylolites

    NASA Astrophysics Data System (ADS)

    Peacock, D. C. P.; Korneva, I.; Nixon, C. W.; Rotevatn, A.

    2017-03-01

    Bed-parallel ("sedimentary") stylolites are used as an example of a population that evolves by the addition of new components, their growth and their merger. It is shown that this style of growth controls the changes in the scaling relationships of the population. Stylolites tend to evolve in carbonate rocks through time, for example by compaction during progressive burial. The evolution of a population of stylolites, and their likely effects on porosity, are demonstrated using simple numerical models. Starting with a power-law distribution, the adding of new stylolites, the increase in their amplitudes and their merger decrease the slope of magnitude versus cumulative frequency of the population. The population changes to a non-power-law distribution as smaller stylolites merge to form larger stylolites. The results suggest that other populations can be forward- or backward-modelled, such as fault lengths, which also evolve by the addition of components, their growth and merger. Consideration of the ways in which populations change improves understanding of scaling relationships and vice versa, and would assist in the management of geofluid reservoirs.
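    The population model described (new components added, amplitudes grown, pairs merged) can be sketched as a simple Monte Carlo. The parameters below (initial exponent, growth factor, merger count) are invented for illustration, not the paper's values.

```python
import random

random.seed(1)

# Start from a power-law population of amplitudes: N(>= a) ~ a^-D,
# sampled by inverse transform from a Pareto tail with minimum 1.
D = 1.5
pop = [(1.0 - random.random()) ** (-1.0 / D) for _ in range(500)]

def evolve(pop, n_new=50, growth=1.2, n_merge=25):
    pop = [a * growth for a in pop]                  # amplitudes grow
    pop += [(1.0 - random.random()) ** (-1.0 / D)    # new small stylolites
            for _ in range(n_new)]
    for _ in range(n_merge):                         # mergers: two become one
        a = pop.pop(random.randrange(len(pop)))
        b = pop.pop(random.randrange(len(pop)))
        pop.append(a + b)
    return pop

for _ in range(10):
    pop = evolve(pop)

# Plotting log(cumulative frequency) against log(amplitude) before and
# after the loop would show the slope shallowing and the tail departing
# from a pure power law, as the abstract describes.
```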

  14. Craniosacral rhythm: reliability and relationships with cardiac and respiratory rates.

    PubMed

    Hanten, W P; Dawson, D D; Iwata, M; Seiden, M; Whitten, F G; Zink, T

    1998-03-01

    Craniosacral rhythm (CSR) has long been the subject of debate, both over its existence and its use as a therapeutic tool in evaluation and treatment. Origins of this rhythm are unknown, and palpatory findings lack scientific support. The purpose of this study was to determine the intra- and inter-examiner reliabilities of the palpation of the rate of the CSR and the relationship between the rate of the CSR and the heart or respiratory rates of subjects and examiners. The rates of the CSR of 40 healthy adults were palpated twice by each of two examiners. The heart and respiratory rates of the examiners and the subjects were recorded while the rates of the subjects' CSR were palpated by the examiners. Intraclass correlation coefficients were calculated to determine the intra- and inter-examiner reliabilities of the palpation. Two multiple regression analyses, one for each examiner, were conducted to analyze the relationships between the rate of the CSR and the heart and respiratory rates of the subjects and the examiners. The intraexaminer reliability coefficients were 0.78 for examiner A and 0.83 for examiner B, and the interexaminer reliability coefficient was 0.22. The result of the multiple regression analysis for examiner A was R = 0.46 and adjusted R2 = 0.12 (p = 0.078) and for examiner B was R = 0.63 and adjusted R2 = 0.32 (p = 0.001). The highest bivariate correlation was found between the CSR and the subject's heart rate (r = 0.30) for examiner A and between the CSR and the examiner's heart rate (r = 0.42) for examiner B. The results indicated that a single examiner may be able to palpate the rate of the CSR consistently, if that is what we truly measured. It is possible that the perception of CSR is illusory. The rate of the CSR palpated by two examiners is not consistent. The results of the regression analysis of one examiner offered no validation to those of the other. 
    It appears that a subject's CSR is not related to the heart or respiratory rates of the…
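    The intraclass correlation coefficients reported above can be computed directly. The sketch below implements the generic one-way ICC(1,1) estimator on made-up ratings (two palpation trials per subject, as in the study design); it is the textbook formula, not the study's exact analysis software.

```python
def icc_1_1(ratings):
    """ICC(1,1) from a one-way ANOVA decomposition.

    ratings: list of (trial1, trial2) pairs, one pair per subject.
    """
    n = len(ratings)
    k = 2
    grand = sum(a + b for a, b in ratings) / (n * k)
    means = [(a + b) / k for a, b in ratings]
    ms_between = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    ms_within = sum((a - m) ** 2 + (b - m) ** 2
                    for (a, b), m in zip(ratings, means)) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Perfectly repeatable ratings give ICC = 1; disagreement between the
# two trials drives the coefficient toward (or below) zero.
perfect = [(6.0, 6.0), (7.0, 7.0), (8.0, 8.0), (9.0, 9.0)]
print(icc_1_1(perfect))  # 1.0
```

    The same machinery explains the paper's pattern: each examiner agreeing with themselves (intraexaminer ICCs of 0.78 and 0.83) while the two examiners disagree with each other (interexaminer ICC of 0.22).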

  15. How Mentoring Relationships Evolve: A Longitudinal Study of Academic Pediatricians in a Physician Educator Faculty Development Program

    ERIC Educational Resources Information Center

    Balmer, Dorene; D'Alessandro, Donna; Risko, Wanessa; Gusic, Maryellen E.

    2011-01-01

    Introduction: Mentoring is increasingly recognized as central to career development. Less attention has been paid, however, to how mentoring relationships evolve over time. To provide a more complete picture of these complex relationships, the authors explored mentoring from a mentee's perspective within the context of a three-year faculty…

  16. On the relationship between coefficient alpha and composite reliability.

    PubMed

    Peterson, Robert A; Kim, Yeolib

    2013-01-01

    Cronbach's coefficient alpha is the most widely used estimator of the reliability of tests and scales. However, it has been criticized as being a lower bound and hence underestimating true reliability. A popular alternative to coefficient alpha is composite reliability, which is usually calculated in conjunction with structural equation modeling. A quantitative analysis of 2,524 pairs of coefficient alpha and composite reliability values derived from empirical investigations revealed that although the average composite reliability value (.86) exceeded the average corresponding coefficient alpha value (.84), the difference was relatively inconsequential for practical applications such as meta-analysis.
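    The two statistics compared in this abstract are straightforward to compute side by side. The sketch below uses the textbook formulas on invented data: coefficient alpha from item and total-score variances, and composite reliability from assumed standardized loadings (with error variances 1 - λ²).

```python
def cronbach_alpha(items):
    """items: list of score lists, one list per item (same respondents)."""
    k = len(items)
    n = len(items[0])
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

def composite_reliability(loadings):
    """Composite reliability from standardized factor loadings."""
    s = sum(loadings)
    err = sum(1 - l * l for l in loadings)  # error variances 1 - lambda^2
    return s * s / (s * s + err)

# Invented 3-item scale scored by 5 respondents, plus assumed loadings.
items = [[3, 4, 5, 2, 4], [2, 4, 5, 2, 5], [3, 5, 4, 2, 4]]
print(round(cronbach_alpha(items), 2))
print(round(composite_reliability([0.7, 0.8, 0.75]), 2))
```

    As in the meta-analysis, the two estimators tend to land close together for reasonably homogeneous items, which is why the .84 vs. .86 gap is described as practically inconsequential.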

  17. The Unidimensional Relationship Closeness Scale (URCS): Reliability and Validity Evidence for a New Measure of Relationship Closeness

    ERIC Educational Resources Information Center

    Dibble, Jayson L.; Levine, Timothy R.; Park, Hee Sun

    2012-01-01

    A fundamental dimension along which all social and personal relationships vary is closeness. The Unidimensional Relationship Closeness Scale (URCS) is a 12-item self-report scale measuring the closeness of social and personal relationships. The reliability and validity of the URCS were assessed with college dating couples (N = 192), female friends…

  18. Reliability assurance program and its relationship to other regulations

    SciTech Connect

    Polich, T.J.

    1994-12-31

    The need for a safety-oriented reliability effort for the nuclear industry was identified by the U.S. Nuclear Regulatory Commission (NRC) in the Three Mile Island Action Plan (NUREG-0660) Item II.C.4. In SECY-89-013, "Design Requirements Related to the Evolutionary ALWR," the staff stated that the reliability assurance program (RAP) would be required for design certification to ensure that the design reliability of safety-significant structures, systems, and components (SSCs) is maintained over the life of a plant. In November 1988, the staff informed the advanced light water reactor (ALWR) vendors and the Electric Power Research Institute (EPRI) that it was considering this matter. Since that time, the staff has had numerous interactions with industry regarding RAP. These include discussions and subsequent safety evaluation reports on the EPRI utilities requirements document and for both Evolutionary Designs. The RAP has also been discussed in SECY-93-087, "Policy, Technical, and Licensing Issues Pertaining to Evolutionary and Advanced Light-Water Reactor (ALWR) Designs" and SECY-94-084, "Policy and Technical Issues Associated With the Regulatory Treatment of Non-Safety Systems in Passive Plant Designs."

  19. Generalized storage-reliability-yield relationships for rainwater harvesting systems

    NASA Astrophysics Data System (ADS)

    Hanson, L. S.; Vogel, R. M.

    2014-07-01

    Sizing storage for rainwater harvesting (RWH) systems is often a difficult design consideration, as the system must be designed specifically for the local rainfall pattern. We introduce a generally applicable method for estimating the required storage by using regional regression equations to account for climatic differences in the behavior of RWH systems across the entire continental United States. A series of simulations for 231 locations with continuous daily precipitation records enables the development of storage-reliability-yield (SRY) relations at four useful reliabilities: 0.8, 0.9, 0.95, and 0.98. Multivariate, log-linear regression results in storage equations that include demand, collection area and local precipitation statistics. The continental regression equations demonstrated excellent goodness-of-fit (R² = 0.96-0.99) using only two precipitation parameters, and fits improved when three geographic regions with more homogeneous rainfall characteristics were considered. The SRY models can be used to obtain a preliminary estimate of how large to build a storage tank almost anywhere in the United States based on desired yield and reliability, collection area, and local rainfall statistics. Our methodology could be extended to other regions of the world, and the equations presented herein could be used to investigate how RWH systems would respond to changes in climatic variability. The resulting model may also prove useful in regional planning studies to evaluate the net benefits that result from the broad use of RWH to meet water supply requirements. We outline numerous other possible extensions to our work, which, when taken together, illustrate the value of our initial generalized SRY model for RWH systems.
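    The storage-reliability-yield relation behind those regressions can be reproduced in miniature by direct simulation. Everything below (the rainfall generator, runoff coefficient, demand, and tank sizes) is an invented illustration, not the paper's 231-site dataset or its regression equations.

```python
import random

random.seed(0)

def rwh_reliability(storage, demand, area, rain, runoff_coeff=0.85):
    """Fraction of days on which demand is fully met.

    storage/demand in m3, area in m2, rain as daily depths in mm.
    """
    s, met = 0.0, 0
    for r in rain:
        s = min(storage, s + runoff_coeff * area * r / 1000.0)  # inflow
        if s >= demand:
            s -= demand
            met += 1
    return met / len(rain)

# Synthetic 10-year daily rainfall: ~30% wet days, exponential depths.
rain = [random.expovariate(1 / 12.0) if random.random() < 0.3 else 0.0
        for _ in range(3650)]

# For a fixed demand, reliability rises monotonically with tank size;
# sweeping storage traces out one slice of the SRY surface.
for storage in (1.0, 2.0, 5.0, 10.0):  # m3
    print(storage, round(rwh_reliability(storage, 0.1, 50.0, rain), 3))
```

    The regression approach in the paper effectively fits a closed-form surface through many such simulated slices, indexed by local precipitation statistics.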

  20. The relationship between reliability and bonding techniques in hybrid systems

    NASA Technical Reports Server (NTRS)

    Kinser, D. L.; Graff, S. M.; Caruso, S. V.

    1975-01-01

    Differential thermal expansion has been shown to be responsible for many observed failures in ceramic chip capacitors mounted on alumina substrates. The present work has shown that the mounting techniques used in bonding the capacitors have a marked effect upon the thermally induced mechanical stress and thus the failure rate. A mathematical analysis of a composite model of the capacitor-substrate system to predict the magnitude of thermally induced stresses has been conducted. It has been observed that the stresses in more compliant bonding systems such as soft lead/tin and indium solders are significantly lower than those in hard solder and epoxy systems. The marked dependence upon heating and cooling rate has proven to be a determining factor in the prediction of failure in both the indium and tin/lead solder systems. This study has shown that the harder or higher melting solders are less susceptible to thermal cycling effects but that they are more likely to fail during initial processing operations. Recommendations are made concerning the optimum bonding system for the achievement of maximum reliability.

  21. The relationship between reliability and bonding techniques in hybrid microcircuits

    NASA Technical Reports Server (NTRS)

    Caruso, S. V.; Kinser, D. L.; Graff, S. M.; Allen, R. V.

    1975-01-01

    Differential thermal expansion was shown to be responsible for many observed failures in ceramic chip capacitors mounted on alumina substrates. It is shown that the mounting techniques used in bonding the capacitors have a marked effect upon the thermally induced mechanical stress and thus the failure rate. A mathematical analysis was conducted of a composite model of the capacitor-substrate system to predict the magnitude of thermally induced stresses. It was experimentally observed that the stresses in more compliant bonding systems such as soft lead/tin and indium solders are significantly lower than those in hard solder and epoxy systems. The marked dependence upon heating and cooling rate proved to be a determining factor in the prediction of failure in solder systems. It was found that the harder or higher melting solders are less susceptible to thermal cycling effects but that they are more likely to fail during initial processing operations. Strain gage techniques were used to determine thermally induced expansion stresses of the capacitors and the alumina substrates. The compliance of the different bonding mediums was determined. From the data obtained, several recommendations are made concerning the optimum bonding system for the achievement of maximum reliability.
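    The failure driver in these two records can be put in first-order numbers. The sketch below uses the simplest mismatch-stress estimate, σ ≈ E·Δα·ΔT, with typical (assumed) material constants and a crude compliance factor standing in for the papers' full composite model.

```python
def mismatch_stress(e_mod, alpha_chip, alpha_sub, d_temp, compliance=0.0):
    """First-order thermal mismatch stress, Pa.

    sigma ~ E * |d_alpha| * dT * (1 - compliance); compliance in [0, 1)
    is a crude stand-in for how much strain a soft bond absorbs.
    """
    return e_mod * abs(alpha_chip - alpha_sub) * d_temp * (1.0 - compliance)

E = 120e9          # ceramic capacitor modulus, Pa (typical, assumed)
a_chip = 10.5e-6   # ceramic-capacitor CTE, 1/K (assumed)
a_sub = 6.8e-6     # alumina CTE, 1/K (assumed)
dT = 100.0         # processing-to-ambient temperature swing, K

rigid = mismatch_stress(E, a_chip, a_sub, dT)        # hard solder / epoxy
soft = mismatch_stress(E, a_chip, a_sub, dT, 0.7)    # indium / soft lead/tin

# With these assumptions the rigid bond transmits ~44 MPa while the
# compliant bond cuts the transmitted stress by the assumed 70%,
# echoing the papers' observation about compliant bonding systems.
```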

  22. Quantitative structure-activity relationships by evolved neural networks for the inhibition of dihydrofolate reductase by pyrimidines.

    PubMed

    Landavazo, Dana G; Fogel, Gary B; Fogel, David B

    2002-02-01

    Evolutionary computation provides a useful method for training neural networks in the face of multiple local optima. This paper begins with a description of methods for quantitative structure activity relationships (QSAR). An overview of artificial neural networks for pattern recognition problems such as QSAR is presented and extended with the description of how evolutionary computation can be used to evolve neural networks. Experiments are conducted to examine QSAR for the inhibition of dihydrofolate reductase by pyrimidines using evolved neural networks. Results indicate the utility of evolutionary algorithms and neural networks for the predictive task at hand. Furthermore, results that are comparable or perhaps better than those published previously were obtained using only a small fraction of the previously required degrees of freedom.
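    The pairing of evolutionary computation with neural networks can be sketched generically. The toy below evolves the weights of a one-hidden-layer network with a simple (1+λ) evolution strategy on a stand-in regression task (y = x²); it illustrates the training method only, not the paper's dihydrofolate reductase QSAR data or architecture.

```python
import math
import random

random.seed(42)

def net(w, x):
    """1 input, 3 tanh hidden units, 1 linear output; w has 10 weights."""
    h = [math.tanh(w[i] * x + w[3 + i]) for i in range(3)]
    return sum(w[6 + i] * h[i] for i in range(3)) + w[9]

def mse(w, data):
    return sum((net(w, x) - y) ** 2 for x, y in data) / len(data)

data = [(x / 10.0, (x / 10.0) ** 2) for x in range(-10, 11)]  # y = x^2

best = [random.gauss(0, 1) for _ in range(10)]
best_err = mse(best, data)
for _ in range(2000):                       # generations
    for _ in range(5):                      # lambda = 5 offspring
        child = [wi + random.gauss(0, 0.1) for wi in best]
        err = mse(child, data)
        if err < best_err:                  # elitist selection
            best, best_err = child, err

# best_err only ever decreases, and no gradients were computed: this is
# how evolution sidesteps the local optima that trap gradient training.
```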

  23. Animals Used in Research and Education, 1966-2016: Evolving Attitudes, Policies, and Relationships.

    PubMed

    Lairmore, Michael D; Ilkiw, Jan

    2015-01-01

    Since the inception of the Association of American Veterinary Medical Colleges (AAVMC), the use of animals in research and education has been a central element of the programs of member institutions. As veterinary education and research programs have evolved over the past 50 years, so too have societal views and regulatory policies. AAVMC member institutions have continually responded to these events by exchanging best practices in training their students in the framework of comparative medicine and the needs of society. Animals provide students and faculty with the tools to learn the fundamental knowledge and skills of veterinary medicine and scientific discovery. The study of animal models has contributed extensively to medicine, veterinary medicine, and basic sciences as these disciplines seek to understand life processes. Changing societal views over the past 50 years have provided active examination and continued refinement of the use of animals in veterinary medical education and research. The future use of animals to educate and train veterinarians will likely continue to evolve as technological advances are applied to experimental design and educational systems. Natural animal models of both human and animal health will undoubtedly continue to serve a significant role in the education of veterinarians and in the development of new treatments of animal and human disease. As it looks to the future, the AAVMC as an organization will need to continue to support and promote best practices in the humane care and appropriate use of animals in both education and research.

  24. On the relationships between generative encodings, regularity, and learning abilities when evolving plastic artificial neural networks.

    PubMed

    Tonelli, Paul; Mouret, Jean-Baptiste

    2013-01-01

    A major goal of bio-inspired artificial intelligence is to design artificial neural networks with abilities that resemble those of animal nervous systems. It is commonly believed that two keys for evolving nature-like artificial neural networks are (1) the developmental process that links genes to nervous systems, which enables the evolution of large, regular neural networks, and (2) synaptic plasticity, which allows neural networks to change during their lifetime. So far, these two topics have been mainly studied separately. The present paper shows that they are actually deeply connected. Using a simple operant conditioning task and a classic evolutionary algorithm, we compare three ways to encode plastic neural networks: a direct encoding, a developmental encoding inspired by computational neuroscience models, and a developmental encoding inspired by morphogen gradients (similar to HyperNEAT). Our results suggest that using a developmental encoding could improve the learning abilities of evolved, plastic neural networks. Complementary experiments reveal that this result is likely the consequence of the bias of developmental encodings towards regular structures: (1) in our experimental setup, encodings that tend to produce more regular networks yield networks with better general learning abilities; (2) whatever the encoding is, networks that are the more regular are statistically those that have the best learning abilities.

  5. Do phytoplankton communities evolve through a self-regulatory abundance-diversity relationship?

    PubMed

    Roy, Shovonlal

    2009-02-01

A small group of phytoplankton species that produce toxic or allelopathic chemicals has a significant effect on plankton dynamics in marine ecosystems. The non-toxic phytoplankton species, which are far more numerous, are affected by the toxin-allelopathy of those species. By analysis of the abundance data of marine phytoplankton collected from the North-West coast of the Bay of Bengal, an empirical relationship between the abundance of the potential toxin-producing species and the species diversity of the non-toxic phytoplankton is formulated. A change-point analysis demonstrates that the diversity of non-toxic phytoplankton increases with the abundance of toxic species up to a certain level. However, with a massive increase in the toxin-producing species, the species-level diversity of phytoplankton declines gradually. Following these results, a deterministic relationship between the abundance of toxic phytoplankton and the diversity of non-toxic phytoplankton is developed. The abundance-diversity relationship traces a unimodal pathway through which the abundance of toxic species regulates the diversity of phytoplankton. These results contribute to the current understanding of the coexistence and biodiversity of phytoplankton, the top-down vs. bottom-up debate, and the abundance-diversity relationship in marine ecosystems.

  6. Suprafamilial relationships among Rodentia and the phylogenetic effect of removing fast-evolving nucleotides in mitochondrial, exon and intron fragments

    PubMed Central

    2008-01-01

Background The number of rodent clades identified above the family level is contentious, and to date, no consensus has been reached on the basal evolutionary relationships among all rodent families. Rodent suprafamilial phylogenetic relationships are investigated in the present study using ~7600 nucleotide characters derived from two mitochondrial genes (Cytochrome b and 12S rRNA), two nuclear exons (IRBP and vWF) and four nuclear introns (MGF, PRKC, SPTBN, THY). Because increasing the number of nucleotides does not necessarily increase phylogenetic signal (especially if the data are saturated), we assess the potential impact of saturation for each dataset by removing the fastest-evolving positions that have been recognized as sources of inconsistencies in phylogenetics. Results Taxonomic sampling included multiple representatives of all five rodent suborders described. Fast-evolving positions for each dataset were identified individually using a discrete gamma rate category and sites belonging to the most rapidly evolving eighth gamma category were removed. Phylogenetic tree reconstructions were performed on individual and combined datasets using Parsimony, Bayesian, and partitioned Maximum Likelihood criteria. Removal of fast-evolving positions enhanced the phylogenetic signal-to-noise ratio, but the improvement in resolution was not consistent across different data types. The results suggested that elimination of the fastest sites only improved the support for nodes moderately affected by homoplasy (the deepest nodes for introns and more recent nodes for exons and mitochondrial genes). Conclusion The present study based on eight DNA fragments supports a fully resolved higher-level rodent phylogeny with moderate to significant nodal support. Two inter-suprafamilial associations emerged. The first comprised a monophyletic assemblage containing the Anomaluromorpha (Anomaluridae + Pedetidae) + Myomorpha (Muridae + Dipodidae) as sister clade to the Castorimorpha
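
A rough sketch of the fast-site removal step described above. Real analyses assign sites to discrete gamma rate categories with maximum-likelihood software and drop the eighth (fastest) category; here, column entropy on a toy alignment stands in as a crude per-site rate proxy, and all data are hypothetical.

```python
import numpy as np

# Toy alignment: rows = taxa, columns = sites (hypothetical sequences).
alignment = np.array([list(s) for s in [
    "ACGTACGTAC",
    "ACGTACTTAC",
    "ACGAACGTCC",
    "ACGTACGGAC",
]])

# Crude per-site rate proxy: Shannon entropy of each column's base frequencies.
def site_entropy(col):
    _, counts = np.unique(col, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

rates = np.array([site_entropy(alignment[:, j]) for j in range(alignment.shape[1])])

# Drop the fastest-evolving eighth of sites (highest-entropy columns).
n_drop = max(1, alignment.shape[1] // 8)
keep = np.argsort(rates)[:-n_drop]
filtered = alignment[:, np.sort(keep)]
print("kept", filtered.shape[1], "of", alignment.shape[1], "sites")
```

The entropy proxy is only illustrative; it cannot distinguish saturation from informative variation the way a model-based gamma-rate assignment can.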

  7. The evolving role of third parties in the hospital physician relationship.

    PubMed

    Burns, Lawton R; Nash, David B; Wholey, Douglas R

    2007-01-01

    Hospital-physician relationships (HPRs) are a key concern for both parties. Hospital interest has been driven historically by the desire for the physician's clinical business, the need to combat managed care, and now the threats posed by single specialty hospitals, medical device vendors, and consumerism. Physician interest has been driven by fears of managed care and desires for new sources of revenue. The dyadic relationships between hospitals and physicians are thus motivated and influenced by the role of third parties. This article analyzes the history of HPRs and the succession of third parties. The analysis illustrates that the role of third parties has shifted from a unifying one to one that divides hospitals and physicians. This shift presents both opportunities and problems.

  8. Evolving Relationship Structures in Multi-sourcing Arrangements: The Case of Mission Critical Outsourcing

    NASA Astrophysics Data System (ADS)

    Heitlager, Ilja; Helms, Remko; Brinkkemper, Sjaak

Information Technology Outsourcing practice and research mainly consider the outsourcing phenomenon as a generic fulfilment of the IT function by external parties. Inspired by the logic of commodity, core competencies, and economies of scale, assets, existing departments, and IT functions are transferred to external parties. Although the generic approach might work for desktop outsourcing, where standardisation is the dominant factor, it does not work for the management of mission-critical applications. Managing mission-critical applications requires a different approach, one in which building relationships is critical. These relationships involve inter- and intra-organisational parties in a multi-sourcing arrangement, called an IT service chain, consisting of multiple (specialist) parties that have to collaborate closely to deliver high-quality services.

  9. Pudor, honor, and autoridad: the evolving patient-physician relationship in Spain.

    PubMed

    Epstein, R M; Borrell i Carrió, F

    2001-10-01

    The expression of emotion and the sharing of information are determined by cultural factors, consultation time, and the structure of the health care system. Two emblematic situations in Spain - the expression of aggression in the patient-physician encounter, and the withholding of diagnostic information from the patient - have not been well-described in their sociocultural context. To explore these, the authors observed and participated in clinical practice and teaching in several settings throughout Spain and analyzed field notes using qualitative methods. In this paper, we explore three central constructs - modesty (pudor), dignity (honor), and authority (autoridad) - and their expressions in patient-physician encounters. We define two types of emotions in clinical settings - public, extroverted expressions of anger and exuberance; and private, deeply held feelings of fear and grief that tend to be expressed through the arts and religion. Premature reassurance and withholding of information are interpreted as attempts to reconstruct the honor and pudor of the patient. Physician authority and perceived loyalty to the government-run health care system generate conflict and aggression in the patient-physician relationship. These clinical behaviors are contextualized within cultural definitions of effective communication, an ideal patient-physician relationship, the role of the family, and ethical behavior. Despite agreement on the goals of medicine, the behavioral manifestations of empathy and caring in Spain contrast substantially with northern European and North American cultures.

  10. Assessing the Complex and Evolving Relationship between Charges and Payments in US Hospitals: 1996 – 2012

    PubMed Central

    Bulchis, Anne G.; Lomsadze, Liya; Joseph, Jonathan; Baral, Ranju; Bui, Anthony L.; Horst, Cody; Johnson, Elizabeth; Dieleman, Joseph L.

    2016-01-01

    Background In 2013 the United States spent $2.9 trillion on health care, more than in any previous year. Much of the debate around slowing health care spending growth focuses on the complicated pricing system for services. Our investigation contributes to knowledge of health care spending by assessing the relationship between charges and payments in the inpatient hospital setting. In the US, charges and payments differ because of a complex set of incentives that connect health care providers and funders. Our methodology can also be applied to adjust charge data to reflect actual spending. Methods We extracted cause of health care encounter (cause), primary payer (payer), charge, and payment information for 50,172 inpatient hospital stays from 1996 through 2012. We used linear regression to assess the relationship between charges and payments, stratified by payer, year, and cause. We applied our estimates to a large, nationally representative hospital charge sample to estimate payments. Results The average amount paid per $1 charged varies significantly across three dimensions: payer, year, and cause. Among the 10 largest causes of health care spending, average payments range from 23 to 55 cents per dollar charged. Over time, the amount paid per dollar charged is decreasing for those with private or public insurance, signifying that inpatient charges are increasing faster than the amount insurers pay. Conversely, the amount paid by out-of-pocket payers per dollar charged is increasing over time for several causes. Applying our estimates to a nationally representative hospital charge sample generates payment estimates which align with the official US estimates of inpatient spending. Conclusions The amount paid per $1 charged fluctuates significantly depending on the cause of a health care encounter and the primary payer. In addition, the amount paid per charge is changing over time. Transparent accounting of hospital spending requires a detailed assessment of the
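
The payment-per-dollar-charged estimate described above can be sketched as a linear regression of payments on charges through the origin, within one payer-year-cause stratum. The figures below are hypothetical, not drawn from the study's 50,172 stays.

```python
import numpy as np

# Hypothetical charge/payment pairs (dollars) for one payer-year-cause stratum.
charges = np.array([1200.0, 3400.0, 560.0, 8900.0, 2100.0])
payments = np.array([430.0, 1150.0, 210.0, 3300.0, 700.0])

# Regression through the origin: payments ≈ beta * charges,
# so beta is the average amount paid per $1 charged.
beta = (charges @ payments) / (charges @ charges)
print(f"paid per $1 charged: {beta:.2f}")
```

Applying such stratum-specific slopes to a charge sample, as the study does, converts charge data into payment estimates.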

  11. Review: The evolving placenta: different developmental paths to a hemochorial relationship.

    PubMed

    Enders, A C; Carter, A M

    2012-02-01

The way in which maternal blood is associated with trophoblast prior to the formation of the different types of hemochorial placenta may be conveniently grouped into four main patterns: a transitory endotheliochorial condition; maternal blood released into a mass of trophoblast; maternal blood confined to lacunae; and fetal villi entering preexisting maternal blood sinuses. Although it might be considered logical that developing placentas would pass through an endotheliochorial stage to become hemochorial, this developmental pattern is seen only as a transient stage in several species of bats and sciuromorph rodents. More commonly a mass of trophoblast at the junction with the endometrium serves as a meshwork through which maternal blood passes, with subsequent organization of a labyrinth when the fetal vascular component is organized. The initial trophoblast meshwork may be cellular or syncytial, often leading to a similar relationship in the spongy zone and labyrinth. Old World monkeys, apes and humans have a lacunar stage prior to establishing a villous hemochorial condition. New World monkeys lack a true lacunar stage, retaining portions of maternal vessels for some time and initially forming a trabecular arrangement similar to, though differently arrived at from, that in the tarsier. In armadillos, preexisting maternal venous sinuses are converted into an intervillous blood space by intruding fetal villi. Variations from the major patterns of development also occur. The way in which the definitive placental form is achieved developmentally should be considered when using placental structure to extrapolate the evolution of placentation.

  12. The linearity and reliability of the mechanomyographic amplitude versus submaximal isometric force relationship.

    PubMed

    Beck, Travis W; DeFreitas, Jason M; Stock, Matt S

    2009-10-01

    The purpose of this study was to investigate the linearity and reliability of the mechanomyographic (MMG) amplitude versus submaximal isometric force relationship for the vastus lateralis. Twenty healthy subjects (mean +/- SD age = 24.0 +/- 4.3 years) volunteered to perform submaximal isometric muscle actions of the dominant leg extensors from 10 to 50% of the maximum voluntary contraction (MVC) on two separate occasions. During each muscle action, the surface MMG signal was detected from the vastus lateralis. The coefficients of determination for the MMG amplitude versus isometric force relationship ranged from r(2) = 0.001 to 0.962, thus indicating a wide range of linearity between subjects. In addition, the linear MMG amplitude versus force slope coefficient was not particularly reliable, with an intraclass correlation coefficient of 0.743 and a standard error of the measurement of 50.66% of the mean value. These findings indicated that the MMG amplitude versus submaximal isometric force relationship did not demonstrate sufficient linearity and reliability to be used for examining the effects of interventions (e.g. training, detraining, stretching, etc). Future studies need to be done to determine the cause(s) for this lack of linearity and reliability and possible techniques that can be used to improve it.
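
The reliability statistics reported above (an intraclass correlation coefficient and a standard error of measurement expressed as a percentage of the mean) can be sketched as follows; the slope coefficients below are hypothetical, and a one-way random-effects ICC is used for simplicity.

```python
import numpy as np

# Hypothetical slope coefficients from two test occasions (one row per subject).
scores = np.array([
    [0.82, 0.79],
    [0.45, 0.61],
    [1.10, 0.95],
    [0.30, 0.42],
    [0.77, 0.70],
])
n, k = scores.shape

# One-way random-effects ICC(1): between-subject vs within-subject mean squares.
subject_means = scores.mean(axis=1)
grand_mean = scores.mean()
ms_between = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
ms_within = np.sum((scores - subject_means[:, None]) ** 2) / (n * (k - 1))
icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Standard error of measurement, expressed as a percentage of the mean.
sem = scores.std(ddof=1) * np.sqrt(1.0 - icc)
sem_pct = 100.0 * sem / grand_mean
print(f"ICC = {icc:.3f}, SEM = {sem_pct:.1f}% of mean")
```

A large SEM relative to the mean, as the study found, means the slope coefficient cannot distinguish real intervention effects from measurement noise.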

  13. The Neighborhood Environment Walkability Scale for the Republic of Korea: Reliability and Relationship with Walking

    PubMed Central

    KIM, Hyunshik; CHOI, Younglae; MA, Jiameng; HYUNG, Kuam; MIYASHITA, Masashi; LEE, Sunkyoung

    2016-01-01

Background: The aim of the study was to analyze the reliability of the Korean version of the NEWS and to investigate the relationship between walking and environmental factors by gender. Methods: A total of 1407 Korean adults, aged 20–59 yr, participated in the study. Data were collected between Sep 2013 and Oct 2013. To examine the test-retest reliability, 281 of the 1407 participants were asked to answer the same questionnaire (Korean NEWS-A scale) after a 7-d interval. Results: The ICC range of the entire questionnaire was 0.71–0.88. The item on land use mix-diversity had the highest ICC, and that on physical barriers had the lowest. In addition, partial correlation coefficients between walking and the NEWS-A scores were computed, adjusted for sociodemographic variables. Overall, land use mix-diversity (P<0.034) and land use mix-access (P<0.014) showed a positive relationship with walking. Discussion: Examination of the reliability of the Korean NEWS-A scale based on Korean adults who reside in large cities showed that all items had statistically satisfactory reliability. The Korean NEWS-A scale may be a useful measure for assessing environmental correlates of walking in the Korean population. PMID:28032060

  14. The Neighborhood Environment Walkability Scale for the Republic of Korea: Reliability and Relationship with Walking.

    PubMed

    Kim, Hyunshik; Choi, Younglae; Ma, Jiameng; Hyung, Kuam; Miyashita, Masashi; Lee, Sunkyoung

    2016-11-01

The aim of the study was to analyze the reliability of the Korean version of the NEWS and to investigate the relationship between walking and environmental factors by gender. A total of 1407 Korean adults, aged 20-59 yr, participated in the study. Data were collected between Sep 2013 and Oct 2013. To examine the test-retest reliability, 281 of the 1407 participants were asked to answer the same questionnaire (Korean NEWS-A scale) after a 7-d interval. The ICC range of the entire questionnaire was 0.71-0.88. The item on land use mix-diversity had the highest ICC, and that on physical barriers had the lowest. In addition, partial correlation coefficients between walking and the NEWS-A scores were computed, adjusted for sociodemographic variables. Overall, land use mix-diversity (P<0.034) and land use mix-access (P<0.014) showed a positive relationship with walking. Examination of the reliability of the Korean NEWS-A scale based on Korean adults who reside in large cities showed that all items had statistically satisfactory reliability. The Korean NEWS-A scale may be a useful measure for assessing environmental correlates of walking in the Korean population.

  15. Frontier orbital engineering of photo-hydrogen-evolving molecular devices: a clear relationship between the H2-evolving activity and the energy level of the LUMO.

    PubMed

    Masaoka, Shigeyuki; Mukawa, Yuichiro; Sakai, Ken

    2010-07-07

Two new Ru(II)Pt(II) dimers, [Ru(bpy)(2)(mu-L2)PtCl(2)](2+) (5) and [Ru(bpy)(2)(mu-L3)PtCl(2)](2+) (6), were synthesized and characterized, and their electrochemical and spectroscopic properties together with their photo-hydrogen-evolving activities were evaluated (bpy = 2,2'-bipyridine; L2 = 4'-([1,10]phenanthrolin-5-ylcarbamoyl)-[2,2']bipyridinyl-4-carboxylic acid ethyl ester; L3 = 4'-methyl-[2,2']bipyridinyl-4-carboxylic acid [1,10]phenanthrolin-5-ylamide). The structures of 5 and 6 are basically identical with that of the first active model of a photo-hydrogen-evolving molecular device developed in our group, [Ru(bpy)(2)(mu-L1)PtCl(2)](2+) (4) (L1 = 4'-([1,10]phenanthrolin-5-ylcarbamoyl)-[2,2']bipyridinyl-4-carboxylic acid), except for the difference in the substituent group at the 4-position of the bpy moiety bound to Pt(II) (-COOH for 4; -COOEt for 5; -CH(3) for 6). Electrochemical studies revealed that the first reduction potential of 5 (E(1/2) = -1.23 V) is nearly consistent with that of 4 (E(1/2) = -1.20 V) but is more positive than that of 6 (E(1/2) = -1.39 V), where the first reduction is associated with the reduction of the bpy moiety bound to Pt(II), consistent with the general tendency that the first reduction of bpy shows an anodic shift upon introduction of an electron-withdrawing group. Density functional theory (DFT) calculations for 5 and 6 also show that the lowest unoccupied molecular orbital (LUMO) corresponds to the pi* orbital of the bpy moiety bound to Pt(II) for all the Ru(II)Pt(II) dimers, and the energy level of the LUMO of 6 is destabilized compared with those of 4 and 5, consistent with the results of the electrochemical studies. The photochemical hydrogen evolution from water driven by 4-6 in the presence of a sacrificial electron donor (EDTA) was investigated. 5 was found to be active as an H(2)-evolving catalyst, while 6 shows no activity at all. However, 6 was found to drive photochemical H(2) evolution in the presence of both EDTA and

  16. Visual perspective in autobiographical memories: reliability, consistency, and relationship to objective memory performance.

    PubMed

    Siedlecki, Karen L

    2015-01-01

    Visual perspective in autobiographical memories was examined in terms of reliability, consistency, and relationship to objective memory performance in a sample of 99 individuals. Autobiographical memories may be recalled from two visual perspectives--a field perspective in which individuals experience the memory through their own eyes, or an observer perspective in which individuals experience the memory from the viewpoint of an observer in which they can see themselves. Participants recalled nine word-cued memories that differed in emotional valence (positive, negative and neutral) and rated their memories on 18 scales. Results indicate that visual perspective was the most reliable memory characteristic overall and is consistently related to emotional intensity at the time of recall and amount of emotion experienced during the memory. Visual perspective is unrelated to memory for words, stories, abstract line drawings or faces.

  17. Structural and reliability analysis of quality of relationship index in cancer patients.

    PubMed

    Cousson-Gélie, Florence; de Chalvron, Stéphanie; Zozaya, Carole; Lafaye, Anaïs

    2013-01-01

Among psychosocial factors affecting emotional adjustment and quality of life, social support is one of the most important and widely studied in cancer patients, but little is known about the perception of support in specific significant relationships in patients with cancer. This study examined the psychometric properties of the Quality of Relationship Inventory (QRI) by evaluating its factor structure and its convergent and discriminant validity in a sample of cancer patients. A total of 388 patients completed the QRI. Convergent validity was evaluated by testing the correlations between the QRI subscales and measures of general social support, anxiety and depression symptoms. Discriminant validity was examined by testing group comparisons. The QRI's longitudinal invariance across time was also tested. Principal axis factor analysis with promax rotation identified three factors accounting for 42.99% of the variance: perceived social support, depth, and interpersonal conflict. Estimates of reliability with McDonald's ω coefficient were satisfactory for all the QRI subscales (ω ranging from 0.75 to 0.85). Satisfaction from general social support was negatively correlated with the interpersonal conflict subscale and positively with the depth subscale. The interpersonal conflict and social support scales were correlated with depression and anxiety scores. We also found a relative stability of QRI subscales (measured 3 months after the first evaluation) and differences between partner status and gender groups. The Quality of Relationship Inventory is a valid tool for assessing the quality of social support within a particular relationship in cancer patients.
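
The ω reliabilities reported above are McDonald's omega coefficients. Under a one-factor model, omega can be computed from standardized factor loadings; the loadings below are hypothetical, not the study's actual values.

```python
import numpy as np

# Hypothetical standardized loadings for the items of one QRI subscale.
loadings = np.array([0.72, 0.65, 0.58, 0.70, 0.61])
uniquenesses = 1.0 - loadings**2  # residual variances under standardization

# McDonald's omega: shared variance over shared-plus-residual variance.
omega = loadings.sum() ** 2 / (loadings.sum() ** 2 + uniquenesses.sum())
print(f"omega = {omega:.2f}")
```

Unlike Cronbach's alpha, omega does not assume equal loadings across items, which is why it is often preferred for multi-factor instruments like the QRI.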

  18. Impact of relationships between test and training animals and among training animals on reliability of genomic prediction.

    PubMed

    Wu, X; Lund, M S; Sun, D; Zhang, Q; Su, G

    2015-10-01

    One of the factors affecting the reliability of genomic prediction is the relationship among the animals of interest. This study investigated the reliability of genomic prediction in various scenarios with regard to the relationship between test and training animals, and among animals within the training data set. Different training data sets were generated from EuroGenomics data and a group of Nordic Holstein bulls (born in 2005 and afterwards) as a common test data set. Genomic breeding values were predicted using a genomic best linear unbiased prediction model and a Bayesian mixture model. The results showed that a closer relationship between test and training animals led to a higher reliability of genomic predictions for the test animals, while a closer relationship among training animals resulted in a lower reliability. In addition, the Bayesian mixture model in general led to a slightly higher reliability of genomic prediction, especially for the scenario of distant relationships between training and test animals. Therefore, to prevent a decrease in reliability, constant updates of the training population with animals from more recent generations are required. Moreover, a training population consisting of less-related animals is favourable for reliability of genomic prediction. © 2015 Blackwell Verlag GmbH.
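
As a hedged illustration of the genomic best linear unbiased prediction (GBLUP) model named above: the sketch below builds a VanRaden-style genomic relationship matrix from simulated genotypes and predicts breeding values for test animals. All data, sample sizes, and the variance ratio are hypothetical; the study used real EuroGenomics data and also a Bayesian mixture model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated marker genotypes (0/1/2) for training and test animals (hypothetical).
n_train, n_test, n_snp = 60, 10, 200
M = rng.integers(0, 3, size=(n_train + n_test, n_snp)).astype(float)
true_effects = rng.normal(0, 0.1, n_snp)
tbv = M @ true_effects                            # true breeding values
y = tbv[:n_train] + rng.normal(0, 1.0, n_train)   # training phenotypes

# VanRaden genomic relationship matrix G = ZZ' / (2 * sum p(1-p)).
p = M.mean(axis=0) / 2.0
Z = M - 2.0 * p
G = Z @ Z.T / (2.0 * np.sum(p * (1.0 - p)))

# GBLUP prediction: gebv = G_test,train (G_train + lambda*I)^-1 (y - mean).
lam = 1.0  # assumed ratio of residual to genetic variance
G_tt = G[:n_train, :n_train]
G_xt = G[n_train:, :n_train]
gebv = G_xt @ np.linalg.solve(G_tt + lam * np.eye(n_train), y - y.mean())

# Accuracy proxied here by correlation with the simulated true values.
r = np.corrcoef(gebv, tbv[n_train:])[0, 1]
print(f"accuracy (corr with true breeding values): {r:.2f}")
```

The off-diagonal blocks of G are where the test-training relationships enter: closer relatives give larger entries in G_xt and hence more informative predictions, matching the study's finding.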

  19. Palmar Creases: Classification, Reliability and Relationships to Fetal Alcohol Spectrum Disorders (FASD).

    PubMed

    Mattison, Siobhán M; Brunson, Emily K; Holman, Darryl J

    2015-09-01

    A normal human palm contains 3 major creases: the distal transverse crease; the proximal transverse crease; and the thenar crease. Because permanent crease patterns are thought to be laid down during the first trimester, researchers have speculated that deviations in crease patterns could be indicative of insults during fetal development. The purpose of this study was twofold: (1) to compare the efficacy and reliability of two coding methods, the first (M1) classifying both "simiana" and Sydney line variants and the second (M2) counting the total number of crease points of origin on the radial border of the hand; and (2) to ascertain the relationship between palmar crease patterns and fetal alcohol spectrum disorders (FASD). Bilateral palm prints were taken using the carbon paper and tape method from 237 individuals diagnosed with FASD and 190 unexposed controls. All prints were coded for crease variants under M1 and M2. Additionally, a random sample of 98 matched (right and left) prints was selected from the controls to determine the reliabilities of M1 and M2. For this analysis, each palm was read twice, at different times, by two readers. Intra-observer Kappa coefficients were similar under both methods, ranging from 0.804-0.910. Inter-observer Kappa coefficients ranged from 0.582-0.623 under M1 and from 0.647-0.757 under M2. Using data from the entire sample of 427 prints and controlling for sex and ethnicity (white v. non-white), no relationship was found between palmar crease variants and FASD. Our results suggest that palmar creases can be classified reliably, but palmar crease patterns may not be affected by fetal alcohol exposure.
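
The intra- and inter-observer agreement statistics reported above are Cohen's kappa coefficients. A minimal sketch, using hypothetical crease classifications rather than the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two categorical ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / (n * n)
    return (observed - expected) / (1.0 - expected)

# Hypothetical crease classifications from two readings of the same palms.
read1 = ["normal", "simian", "normal", "sydney", "normal", "simian", "normal", "normal"]
read2 = ["normal", "simian", "normal", "normal", "normal", "simian", "normal", "sydney"]
print(f"kappa = {cohens_kappa(read1, read2):.3f}")
```

Kappa values in the 0.6-0.9 range, as the study reports, indicate substantial to almost-perfect agreement under common interpretive conventions.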

  20. Neighborhood Environment Walkability Scale for Youth (NEWS-Y): reliability and relationship with physical activity.

    PubMed

    Rosenberg, Dori; Ding, Ding; Sallis, James F; Kerr, Jacqueline; Norman, Gregory J; Durant, Nefertiti; Harris, Sion K; Saelens, Brian E

    2009-01-01

    To examine the psychometric properties of the Neighborhood Environment Walkability Scale-Youth (NEWS-Y) and explore its associations with context-specific and overall physical activity (PA) among youth. In 2005, parents of children ages 5-11 (n=116), parents of adolescents ages 12-18 (n=171), and adolescents ages 12-18 (n=171) from Boston, Cincinnati, and San Diego, completed NEWS-Y surveys regarding perceived land use mix-diversity, recreation facility availability, pedestrian/automobile traffic safety, crime safety, aesthetics, walking/cycling facilities, street connectivity, land use mix-access, and residential density. A standardized neighborhood environment score was derived. Self-reported activity in the street and in parks, and walking to parks, shops, school, and overall physical activity were assessed. The NEWS-Y subscales had acceptable test-retest reliability (ICC range .56-.87). Being active in a park, walking to a park, walking to shops, and walking to school were related to multiple environmental attributes in all three participant groups. Total neighborhood environment, recreation facilities, walking and cycling facilities, and land use mix-access had the most consistent relationships with specific types of activity. The NEWS-Y has acceptable reliability and subscales were significantly correlated with specific types of youth PA. The NEWS-Y can be used to examine neighborhood environment correlates of youth PA.

  1. Establishing a Reliable Depth-Age Relationship for the Denali Ice Core

    NASA Astrophysics Data System (ADS)

    Wake, C. P.; Osterberg, E. C.; Winski, D.; Ferris, D.; Kreutz, K. J.; Introne, D.; Dalton, M.

    2015-12-01

    Reliable climate reconstruction from ice core records requires the development of a reliable depth-age relationship. We have established a sub-annual resolution depth-age relationship for the upper 198 meters of a 208 m ice core recovered in 2013 from Mt. Hunter (3,900 m asl), Denali National Park, central Alaska. The dating of the ice core was accomplished via annual layer counting of glaciochemical time-series combined with identification of reference horizons from volcanic eruptions and atmospheric nuclear weapons testing. Using the continuous ice core melter system at Dartmouth College, sub-seasonal samples have been collected and analyzed for major ions, liquid conductivity, particle size and concentration, and stable isotope ratios. Annual signals are apparent in several of the chemical species measured in the ice core samples. Calcium and magnesium peak in the spring, ammonium peaks in the summer, methanesulfonic acid (MSA) peaks in the autumn, and stable isotopes display a strong seasonal cycle with the most depleted values occurring during the winter. Thin ice layers representing infrequent summertime melt were also used to identify summer layers in the core. Analysis of approximately one meter sections of the core via nondestructive gamma spectrometry over depths from 84 to 124 m identified a strong radioactive cesium-137 peak at 89 m which corresponds to the 1963 layer deposited during extensive atmospheric nuclear weapons testing. Peaks in the sulfate and chloride record have been used for the preliminary identification of volcanic signals preserved in the ice core, including ten events since 1883. We are confident that the combination of robust annual layers combined with reference horizons provides a timescale for the 20th century that has an error of less than 0.5 years, making calibrations between ice core records and the instrumental climate data particularly robust. Initial annual layer counting through the entire 198 m suggests the Denali Ice
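
Annual-layer counting on a seasonal chemistry series can be sketched as peak detection. The signal below is synthetic (a sine with noise standing in for a species such as calcium, which peaks each spring), not Denali data.

```python
import numpy as np

# Synthetic "calcium" depth series: one spring peak per year plus noise.
samples_per_year, n_years = 12, 20
i = np.arange(samples_per_year * n_years)
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * i / samples_per_year) + rng.normal(0, 0.1, i.size)

def count_annual_peaks(x, threshold=0.5):
    """Count local maxima above a threshold as annual (spring) layers."""
    return sum(
        1
        for j in range(1, len(x) - 1)
        if x[j] > threshold and x[j] > x[j - 1] and x[j] >= x[j + 1]
    )

print("annual layers counted:", count_annual_peaks(signal))
```

In practice the count is anchored to reference horizons (e.g. the 1963 cesium-137 peak) so that miscounted noisy years cannot accumulate into a dating error.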

  2. Study on Precipitation Anomalies of North of China in April and Its relationship to Sea Surface Temperature Evolvement

    NASA Astrophysics Data System (ADS)

    Song, Y.; Li, Z.; Guan, Y.

    2012-04-01

Using monthly precipitation data for North China for 1960-2007, NCEP/NCAR monthly reanalysis data, NOAA SST (sea surface temperature) data, and the SST indices from the Climate System Monitoring Bulletin compiled by the National Climate Center, this paper studied the general circulation and large-scale weather-system anomalies, and the SSTA evolution, associated with above-normal April rainfall in North China. The results showed that precipitation differences between the spring months in North China were quite pronounced, and the correlation coefficients between April precipitation and that of March and May were not significant. The linear trend of April precipitation was also out of phase with that of spring as a whole, so it is meaningful to study April precipitation on its own. The spatial pattern of the first leading EOF mode of April precipitation indicated that rainfall varied synchronously across the region. Years with more April rainfall showed a negative phase of the EU pattern in the 500 hPa geopotential height field at high latitudes of the Northern Hemisphere; North China lay where cold and warm air masses met, which favored stronger southerly winds and ascending motion. Zonal circulation prevailed in the middle and high latitudes, and North China was controlled by a warm ridge and a zonal large-scale frontal zone. In years of less rainfall, meridional circulation prevailed, the large-scale frontal zone was displaced northward with a meridional orientation, and North China was affected by cold air masses. In wet years, water vapor was transported strongly from the Pacific, the South China Sea, and southwest China, reaching Northeast China; in dry years this water vapor transport was quite weak.
The rainfall was closely related to sea surface temperature anomalies, especially in the Indian Ocean, the central and eastern Pacific, the central and southern Pacific, and the northwest Pacific, where there were

  3. An Examination of Coach and Player Relationships According to the Adapted LMX 7 Scale: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Caliskan, Gokhan

    2015-01-01

The current study aims to test the reliability and validity of the Leader-Member Exchange (LMX 7) scale with regard to coach-player relationships in sports settings. A total of 330 professional soccer players from the Turkish Super League as well as from the First and Second Leagues participated in this study. Factor analyses were performed to…

  4. Merlino-Perkins Father-Daughter Relationship Inventory (MP-FDI): Construction, Reliability, Validity, and Implications for Counseling and Research

    ERIC Educational Resources Information Center

    Merlino Perkins, Rose J.

    2008-01-01

    The Merlino-Perkins Father-Daughter Relationship Inventory, a self-report instrument, assesses women's childhood interactions with supportive, doting, distant, controlling, tyrannical, physically abusive, absent, and seductive fathers. Item and scale development, psychometric findings drawn from factor analyses, reliability assessments, and…

  6. Reliability and Validity of a Self-Concept Scale for Researchers in Family Relationships

    ERIC Educational Resources Information Center

    Rathus, Spencer A.; Siegel, Larry J.

    1976-01-01

    Self-concept questionnaire was shown to have high test-retest reliability, but only fair to moderate split-half (odd-even) reliability. Validity was adequate. The scale will serve as a heuristic device for family counselors who require a rapid assessment of a child's self-esteem. (Author)

  7. The reliability-quality relationship for quality systems and quality risk management.

    PubMed

    Claycamp, H Gregg; Rahaman, Faiad; Urban, Jason M

    2012-01-01

    Engineering reliability typically refers to the probability that a system, or any of its components, will perform a required function for a stated period of time and under specified operating conditions. As such, reliability is inextricably linked with time-dependent quality concepts, such as maintaining a state of control and predicting the chances of losses from failures for quality risk management. Two popular current good manufacturing practice (cGMP) and quality risk management tools, failure mode and effects analysis (FMEA) and root cause analysis (RCA), are examples of engineering reliability evaluations that link reliability with quality and risk. Current concepts in pharmaceutical quality and quality management systems call for more predictive systems for maintaining quality; yet, the current pharmaceutical manufacturing literature and guidelines are curiously silent on engineering quality. This commentary discusses the meaning of engineering reliability while linking the concept to quality systems and quality risk management. The essay also discusses the difference between engineering reliability and statistical (assay) reliability. The assurance of quality in a pharmaceutical product is no longer measured only "after the fact" of manufacturing. Rather, concepts of quality systems and quality risk management call for designing quality assurance into all stages of the pharmaceutical product life cycle. Interestingly, most assays for quality are essentially static and inform product quality over the life cycle only by being repeated over time. Engineering process reliability is the fundamental concept that is meant to anticipate quality failures over the life cycle of the product. Reliability is a well-developed theory and practice for other types of manufactured products and manufacturing processes. Thus, it is well known to be an appropriate index of manufactured product quality.
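The engineering definition above (the probability that a system performs its required function for a stated time under specified conditions) is often illustrated, as a first approximation, with a constant-failure-rate exponential model. The sketch below is purely illustrative; the component failure rates are invented, not drawn from any study in this listing:

```python
import math

def reliability(failure_rate: float, t: float) -> float:
    """Probability of surviving to time t under a constant failure rate
    (exponential model): R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate * t)

def series_reliability(rates, t):
    """A series system fails if any component fails, so component
    reliabilities multiply."""
    r = 1.0
    for lam in rates:
        r *= reliability(lam, t)
    return r

# Hypothetical component failure rates (per hour) and a 100-hour mission
rates = [0.001, 0.0005, 0.002]
print(round(series_reliability(rates, 100), 4))
```

Because the exponential rates simply add in a series system, the same result follows from `reliability(sum(rates), 100)`; more realistic models (e.g., Weibull) would replace the constant hazard assumption.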

  8. The enduring and evolving relationship between social class and breast cancer burden: a review of the literature.

    PubMed

    Klassen, Ann C; Smith, Katherine C

    2011-06-01

    Breast cancer in women has historically been seen as a "cancer of affluence," and there is a well-documented higher incidence among women of higher social class, as well as in societies with higher resources. However, the relationship between social class and breast cancer disease characteristics, especially those associated with poorer prognosis, is less well documented, and the overall relationship between breast cancer mortality and social class has been shown to vary. Furthermore, rapid changes in women's health and health-related behaviors in societies around the world may have an impact on both incidence and mortality patterns for breast cancer in the future. A PubMed search on breast cancer and social class (incorporating the MeSH-nested concept of SES) yielded 403 possible studies published between 1978 and 2009, of which 90 met criteria for review. Our review discusses the conceptualization and measurement of women's social class in each study, as well as findings on breast cancer incidence, tumor biology, and mortality associated with social class. We found mostly consistent evidence that breast cancer incidence continues to be higher in higher social class groups, with some modification of risk with adjustment for known risk factors, including physical activity and reproductive history. However, biologic characteristics associated with poorer prognosis were negatively associated with social class (i.e., greater occurrence among disadvantaged women), and mortality from breast cancer showed an inconsistent relationship to social class. We discuss these studies in relation to the growing burden of breast cancer among low-resource groups and countries, and the need for cancer control strategies reflecting the emerging demographics of breast cancer risk. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. The Relationship Quality Interview: Evidence of Reliability, Convergent and Divergent Validity, and Incremental Utility

    ERIC Educational Resources Information Center

    Lawrence, Erika; Barry, Robin A.; Brock, Rebecca L.; Bunde, Mali; Langer, Amie; Ro, Eunyoe; Fazio, Emily; Mulryan, Lorin; Hunt, Sara; Madsen, Lisa; Dzankovic, Sandra

    2011-01-01

    Relationship satisfaction and adjustment have been the target outcome variables for almost all couple research and therapies. In contrast, far less attention has been paid to the assessment of relationship quality. The present study introduces the Relationship Quality Interview (RQI), a semistructured, behaviorally anchored individual interview.…

  10. Reliability of a Field Test of Defending and Attacking Agility in Australian Football and Relationships to Reactive Strength.

    PubMed

    Young, Warren B; Murray, Mitch P

    2017-02-01

    Young, WB and Murray, MP. Reliability of a field test of defending and attacking agility in Australian football and relationships to reactive strength. J Strength Cond Res 31(2): 509-516, 2017-Defending and attacking agility tests for Australian football do not exist, and it is unknown whether any physical qualities correlate with these types of agility. The purposes of this study were to develop new field tests of defending and attacking agility for Australian Rules football, to determine whether they were reliable, and to describe the relationship between the agility tests to determine their specificity. Because the reactive strength (RS) of the lower limb muscles has been previously correlated with change-of-direction speed, we also investigated the relationship between this quality and the agility tests. Nineteen male competitive recreational-level Australian Rules football players were assessed on the agility tests and a drop jump test to assess RS. Interday and interrater reliability was also assessed. The agility tests involved performing 10 trials of one-on-one agility tasks against 2 testers (opponents), in which the objective was to be in a position to tackle (defending) or to evade (attacking) the opponent. Both agility tests had good reliability (intraclass correlation > 0.8, %CV < 3, and no significant differences between test occasions [p > 0.05], and interrater reliability was very high [r = 0.997, p < 0.001]). The common variance between the agility tests was 45%, indicating that they represented relatively independent skills. There was a large correlation between RS and defending agility (r = 0.625, p = 0.004), and a very large correlation with attacking agility (r = 0.731, p < 0.001). Defending and attacking agility have different characteristics, possibly related to the footwork, physical, and cognitive demands of each. Nonetheless, RS seems to be important for agility, especially for attacking agility.

  11. Reliable Attention Network Scores and Mutually Inhibited Inter-network Relationships Revealed by Mixed Design and Non-orthogonal Method.

    PubMed

    Wang, Yi-Feng; Jing, Xiu-Juan; Liu, Feng; Li, Mei-Ling; Long, Zhi-Liang; Yan, Jin H; Chen, Hua-Fu

    2015-05-21

    The attention system can be divided into alerting, orienting, and executive control networks. The efficiency and independence of attention networks have been widely tested with the attention network test (ANT) and its revised versions. However, many studies have failed to find effects of attention network scores (ANSs) and inter-network relationships (INRs). Moreover, the low reliability of ANSs cannot meet the demands of theoretical and empirical investigations. Two methodological factors (the inter-trial influence in the event-related design and the inter-network interference in orthogonal contrast) may be responsible for the unreliability of the ANT. In this study, we combined the mixed design and non-orthogonal method to explore ANSs and directional INRs. With a small number of trials, we obtained reliable and independent ANSs (split-half reliability of alerting: 0.684; orienting: 0.588; and executive control: 0.616), suggesting an individual and specific attention system. Furthermore, mutual inhibition was observed when two networks were operated simultaneously, indicating a differentiated but integrated attention system. Overall, the reliable and individually specific ANSs and mutually inhibited INRs provide novel insight into the understanding of the developmental, physiological and pathological mechanisms of attention networks, and can benefit future experimental and clinical investigations of attention using the ANT.
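Split-half reliabilities like those reported above are conventionally obtained by correlating scores from the two halves of the trials (e.g., odd vs even) and applying the Spearman-Brown step-up. A minimal sketch, with made-up per-subject scores rather than the study's data:

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def split_half_reliability(odd_scores, even_scores):
    """Correlate the two halves, then step up with Spearman-Brown:
    r_full = 2r / (1 + r), the reliability of the full-length test."""
    r = pearson(odd_scores, even_scores)
    return 2 * r / (1 + r)

# Hypothetical network scores computed from odd vs even trials
odd = [31, 28, 40, 25, 33, 37]
even = [29, 30, 38, 27, 31, 36]
print(round(split_half_reliability(odd, even), 3))
```

The step-up corrects for the fact that each half contains only half the trials; without it, split-half correlations systematically understate full-test reliability.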

  12. Resolution of phylogenetic relationships among recently evolved species as a function of amount of DNA sequence: an empirical study based on woodpeckers (Aves: Picidae).

    PubMed

    DeFilippis, V R; Moore, W S

    2000-07-01

    Synonymous substitutions in the 13 mitochondrially encoded protein genes form a large pool of characters that should approach the ideal for phylogenetic analysis of being independently and identically distributed. Pooling sequences from multiple mitochondrial protein-coding genes should result in statistically more powerful estimates of relationships among species that diverged sufficiently recently that most nucleotide substitutions are synonymous. Cytochrome oxidase I (COI) was sequenced for woodpecker species for which cytochrome b (cyt b) sequences were available. A pairing-design test based on the normal distribution indicated that cyt b evolves more rapidly than COI when all nucleotides are compared, but their rates are equal for synonymous substitutions. Nearly all of the phylogenetically informative substitutions among woodpeckers are synonymous. Statistical support for relationships, as measured by bootstrap proportions, increased as the number of nucleotides increased from 1047 (cyt b) to 1512 (COI) to 2559 nucleotides (aggregate data set). Pseudo-bootstrap replicates showed the same trend, and increasing the amount of sequence beyond the actual length of 2559 nucleotides to 5120 (2x) resulted in stronger bootstrap support, even though the amount of phylogenetic information was the same. However, the amount of sequence required to resolve an internode depends on the length of the internode and its depth in the phylogeny.
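The pseudo-bootstrap manipulation described above amounts to resampling alignment columns with replacement, optionally drawing more columns than the original alignment contains (the "2x" replicates). A sketch of that resampling step, using a tiny invented alignment rather than the woodpecker data:

```python
import random

def bootstrap_alignment(alignment, n_sites=None, rng=None):
    """Resample alignment columns with replacement (the standard
    phylogenetic bootstrap). `alignment` maps taxon -> sequence string;
    setting n_sites above the original length mimics a pseudo-replicate
    longer than the real data."""
    rng = rng or random.Random(0)
    length = len(next(iter(alignment.values())))
    n_sites = n_sites or length
    # Draw column indices once so every taxon is resampled identically
    cols = [rng.randrange(length) for _ in range(n_sites)]
    return {taxon: "".join(seq[c] for c in cols)
            for taxon, seq in alignment.items()}

# Invented 8-site alignment for three taxa
aln = {"sp1": "ACGTACGT", "sp2": "ACGTACGA", "sp3": "ACGAACGT"}
rep = bootstrap_alignment(aln, n_sites=16)  # a "2x" pseudo-replicate
print(len(rep["sp1"]))  # 16 resampled columns
```

Each replicate would then be fed to tree estimation; the bootstrap proportion of a clade is the fraction of replicates recovering it.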

  13. Relationship between evolving epileptiform activity and delayed loss of mitochondrial activity after asphyxia measured by near-infrared spectroscopy in preterm fetal sheep

    PubMed Central

    Bennet, L; Roelfsema, V; Pathipati, P; Quaedackers, J S; Gunn, A J

    2006-01-01

    Early onset cerebral hypoperfusion after birth is highly correlated with neurological injury in premature infants, but the relationship with the evolution of injury remains unclear. We studied changes in cerebral oxygenation and cytochrome oxidase (CytOx) using near-infrared spectroscopy in preterm fetal sheep (103–104 days of gestation, term is 147 days) during recovery from a profound asphyxial insult (n = 7) that we have shown produces severe subcortical injury, or sham asphyxia (n = 7). From 1 h after asphyxia there was a significant secondary fall in carotid blood flow (P < 0.001), and total cerebral blood volume, as reflected by total haemoglobin (P < 0.005), which only partially recovered after 72 h. Intracerebral oxygenation (difference between oxygenated and deoxygenated haemoglobin concentrations) fell transiently at 3 and 4 h after asphyxia (P < 0.01), followed by a substantial increase to well over sham control levels (P < 0.001). CytOx levels were normal in the first hour after occlusion, were greater than sham control values at 2–3 h (P < 0.05), but then progressively fell, and became significantly suppressed from 10 h onward (P < 0.01). In the early hours after reperfusion the fetal EEG was highly suppressed, with a superimposed mixture of fast and slow epileptiform transients; overt seizures developed from 8 ± 0.5 h. These data strongly indicate that severe asphyxia leads to delayed, evolving loss of mitochondrial oxidative metabolism, accompanied by late seizures and relative luxury perfusion. In contrast, the combination of relative cerebral deoxygenation with evolving epileptiform transients in the early recovery phase raises the possibility that these early events accelerate or worsen the subsequent mitochondrial failure. PMID:16484298

  14. Relationship between evolving epileptiform activity and delayed loss of mitochondrial activity after asphyxia measured by near-infrared spectroscopy in preterm fetal sheep.

    PubMed

    Bennet, L; Roelfsema, V; Pathipati, P; Quaedackers, J S; Gunn, A J

    2006-04-01

    Early onset cerebral hypoperfusion after birth is highly correlated with neurological injury in premature infants, but the relationship with the evolution of injury remains unclear. We studied changes in cerebral oxygenation and cytochrome oxidase (CytOx) using near-infrared spectroscopy in preterm fetal sheep (103-104 days of gestation, term is 147 days) during recovery from a profound asphyxial insult (n = 7) that we have shown produces severe subcortical injury, or sham asphyxia (n = 7). From 1 h after asphyxia there was a significant secondary fall in carotid blood flow (P < 0.001), and total cerebral blood volume, as reflected by total haemoglobin (P < 0.005), which only partially recovered after 72 h. Intracerebral oxygenation (difference between oxygenated and deoxygenated haemoglobin concentrations) fell transiently at 3 and 4 h after asphyxia (P < 0.01), followed by a substantial increase to well over sham control levels (P < 0.001). CytOx levels were normal in the first hour after occlusion, were greater than sham control values at 2-3 h (P < 0.05), but then progressively fell, and became significantly suppressed from 10 h onward (P < 0.01). In the early hours after reperfusion the fetal EEG was highly suppressed, with a superimposed mixture of fast and slow epileptiform transients; overt seizures developed from 8 +/- 0.5 h. These data strongly indicate that severe asphyxia leads to delayed, evolving loss of mitochondrial oxidative metabolism, accompanied by late seizures and relative luxury perfusion. In contrast, the combination of relative cerebral deoxygenation with evolving epileptiform transients in the early recovery phase raises the possibility that these early events accelerate or worsen the subsequent mitochondrial failure.

  15. The Relationship between Scoring Procedures and Focus and the Reliability of Direct Writing Assessment Scores.

    ERIC Educational Resources Information Center

    Wolfe, Edward W.; Kao, Chi-Wen

    This paper reports the results of an analysis of the relationship between scorer behaviors and score variability. Thirty-six essay scorers were interviewed and asked to perform a think-aloud task as they scored 24 essays. Each comment made by a scorer was coded according to its content focus (i.e. appearance, assignment, mechanics, communication,…

  16. Linearity and reliability of the mechanomyographic amplitude versus concentric dynamic constant external resistance relationships for the bench press exercise.

    PubMed

    Stock, Matt S; Beck, Travis W; DeFreitas, Jason M; Dillon, Michael A

    2010-03-01

    The purpose of the present study was to examine the linearity and reliability of the mechanomyographic (MMG) amplitude versus concentric dynamic constant external resistance (DCER) relationships for the bench press exercise. Twenty-one resistance-trained men (mean +/- SD age = 23.5 +/- 2.7 yr; 1 repetition maximum [1RM] bench press = 125.4 +/- 18.4 kg) volunteered to perform submaximal bench press muscle actions as explosively as possible from 10% to 90% of the 1RM on 2 separate occasions. During each muscle action, surface MMG signals were detected from both the right and left pectoralis major and triceps brachii, and the concentric portion of the range of motion was selected for analysis. The coefficients of determination for the MMG amplitude versus concentric DCER relationships ranged from r2 = 0.010 to 0.980 for the right pectoralis major, r2 = 0.010 to 0.943 for the left pectoralis major, r2 = 0.010 to 0.920 for the right triceps brachii, and r2 = 0.020 to 0.915 for the left triceps brachii, thus indicating a wide range of linearity between subjects. The intraclass correlation coefficients (ICC) and corresponding standard error of measurements (SEM) for the linear slope coefficients for these relationships were 0.592 (39.3% of the mean value), 0.537 (41.9% of the mean value), 0.625 (42.0% of the mean value), and 0.460 (60.2% of the mean value) for the right pectoralis major, the left pectoralis major, the right triceps brachii, and the left triceps brachii, respectively. These data demonstrated that these relationships were neither linear nor reliable enough to be used for assessing issues such as the neural versus hypertrophic contributions to training-induced strength gains and the mechanisms underlying cross-education.
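Standard errors of measurement expressed as a percentage of the mean, as quoted above, are conventionally derived from the ICC and the between-subject standard deviation via SEM = SD * sqrt(1 - ICC). A minimal sketch with invented SD and mean values (only the ICC of 0.592 comes from the abstract):

```python
def sem_from_icc(sd: float, icc: float) -> float:
    """Standard error of measurement: SEM = SD * sqrt(1 - ICC)."""
    return sd * (1 - icc) ** 0.5

def sem_percent_of_mean(sd: float, icc: float, mean_value: float) -> float:
    """SEM expressed as a percentage of the group mean."""
    return 100 * sem_from_icc(sd, icc) / mean_value

# Hypothetical slope-coefficient SD and mean (not the study's raw data)
print(round(sem_percent_of_mean(sd=0.30, icc=0.592, mean_value=0.49), 1))
```

The formula makes the trade-off explicit: as the ICC approaches 1, the SEM shrinks toward zero, so a low ICC paired with modest between-subject spread yields the large percentage errors reported.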

  17. Linearity and reliability of the EMG amplitude versus dynamic torque relationships for the superficial quadriceps femoris muscles.

    PubMed

    Stock, M S; Beck, T W; DeFreitas, J M; Dillon, M A

    2010-03-01

    The purpose of the present investigation was to determine the linearity and reliability of the electromyographic (EMG) amplitude versus dynamic torque relationships for the vastus lateralis (VL), rectus femoris (RF), and vastus medialis (VM). Nine healthy men (mean +/- SD age = 25.3 +/- 4.7 years) and eleven healthy women (mean +/- SD age = 22.0 +/- 1.3 years) performed a series of randomly ordered, submaximal to maximal, concentric isokinetic muscle actions of the leg extensors at 30 degrees x s(-1) on two occasions separated by at least 48 hours. During each muscle action, surface EMG signals were detected from the VL, RF and VM of the dominant thigh with bipolar surface electrode arrangements. The coefficients of determination for the EMG amplitude versus dynamic torque relationships ranged from r2 = 0.75-0.98 and 0.64-0.99 for the VL, r2 = 0.79-0.99 and 0.60-0.98 for the RF, and r2 = 0.44-0.98 and 0.51-0.98 for the VM for trials 1 and 2, respectively. In some cases, the linear EMG amplitude versus torque slope coefficient for trial 1 was significantly different from that for trial 2 for the VL and RF, but not for the VM. The intraclass correlation coefficients for the linear EMG amplitude versus torque slope coefficients were 0.730 (VL), 0.709 (RF), and 0.888 (VM). These results indicated that the EMG amplitude versus dynamic torque relationships for the superficial quadriceps femoris muscles did not demonstrate enough linearity and reliability to be used for examining the contributions of neural versus hypertrophic factors to training-induced strength gains.

  18. Self Evolving Modular Network

    NASA Astrophysics Data System (ADS)

    Tokunaga, Kazuhiro; Kawabata, Nobuyuki; Furukawa, Tetsuo

    We propose a novel modular network called the Self-Evolving Modular Network (SEEM). The SEEM has a modular network architecture with a graph structure and the following advantages: (1) new modules are added incrementally, allowing the network to adapt in a self-organizing manner, and (2) the graph's paths are formed based on the relationships between the models represented by the modules. The SEEM is expected to be applicable to evolving the functions of an autonomous robot in a self-organizing manner through interaction with the robot's environment, and to categorizing large-scale information. This paper presents the architecture and an algorithm for the SEEM. Moreover, the performance characteristics and effectiveness of the network are shown by simulations using cubic functions and a set of 3D objects.

  19. Linearity and reliability of the mechanomyographic amplitude versus dynamic torque relationships for the superficial quadriceps femoris muscles.

    PubMed

    Stock, Matthew S; Beck, Travis W; Defreitas, Jason M; Dillon, Michael A

    2010-03-01

    The purpose of this investigation was to examine the linearity and reliability of the mechanomyographic (MMG) amplitude versus dynamic torque relationships for the vastus lateralis (VL), rectus femoris (RF), and vastus medialis (VM) muscles. Nine healthy men and 11 healthy women performed submaximal to maximal, concentric, isokinetic muscle actions of the leg extensors at 30 degrees s(-1) on two occasions. Surface MMG signals were detected from the VL, RF, and VM of the dominant thigh during both trials. The ranges of the coefficients of determination for the MMG amplitude versus dynamic torque relationships were 0.01-0.94 for the VL, 0.01-0.84 for the RF, and 0.19-0.96 for the VM. The intraclass correlation coefficients for the linear MMG amplitude versus torque slope coefficients were 0.823 (VL), 0.792 (RF), and 0.927 (VM). These results indicate that, when analyzed for individual subjects, the MMG amplitude versus dynamic torque relationships demonstrated inconsistent linearity. When using MMG in the clinical setting, dynamic muscle actions of the superficial quadriceps femoris muscles do not appear to be appropriate for assessing changes in muscle function during strength training.

  20. The Inventory of Teacher-Student Relationships: Factor Structure, Reliability, and Validity among African American Youth in Low-Income Urban Schools

    ERIC Educational Resources Information Center

    Murray, Christopher; Zvoch, Keith

    2011-01-01

    This study investigates the factor structure, reliability, and validity of the Inventory of Teacher-Student Relationships (IT-SR), a measure that was developed by adapting the widely used Inventory of Parent and Peer Attachments (Armsden & Greenberg, 1987) for use in the context of teacher-student relationships. The instrument was field tested…

  2. Flexed Truncal Posture in Parkinson Disease: Measurement Reliability and Relationship With Physical and Cognitive Impairments, Mobility, and Balance.

    PubMed

    Forsyth, Aimi L; Paul, Serene S; Allen, Natalie E; Sherrington, Catherine; Fung, Victor S C; Canning, Colleen G

    2017-04-01

    Flexed truncal posture is common in people with Parkinson disease (PD); however, little is known about the mechanisms responsible or its effect on physical performance. This cross-sectional study aimed to establish the reliability of a truncal posture measurement and explore relationships between PD impairments and truncal posture, as well as truncal posture and balance and mobility. A total of 82 people with PD participated. Truncal posture was measured in standing as the distance between vertebra C7 and a wall. Univariate and multivariate regression analyses were performed with truncal posture and impairments, including global axial symptoms, tremor, bradykinesia, rigidity, freezing of gait (FOG), reactive stepping and executive function, as well as truncal posture with balance and mobility measures. The truncal posture measure had excellent test-retest reliability (ICC3,1 0.79, 95% CI 0.60-0.89, P < 0.001). Global axial symptoms had the strongest association with truncal posture (adjusted R = 0.08, P = 0.01), although the majority of the variance remains unexplained. Post hoc analysis revealed that several impairments were associated with truncal posture only in those who did not report FOG. Flexed truncal posture was associated with poorer performance of most balance and mobility tasks after adjustment for age, gender, disease severity, and duration (adjusted R = 0.24-0.33, P < 0.001-0.03). The C7 to wall measurement is highly reliable in people with PD. Global axial symptoms were independently associated with truncal posture. Greater flexed truncal posture was associated with poorer balance and mobility. Further studies are required to elucidate the mechanisms responsible for flexed truncal posture and the impact on activity.Video Abstract available for more insights from the authors (see Video, Supplemental Digital Content 1, http://links.lww.com/JNPT/A164).

  3. Relative and absolute reliability of a modified agility T-test and its relationship with vertical jump and straight sprint.

    PubMed

    Sassi, Radhouane Haj; Dardouri, Wajdi; Yahmed, Mohamed Haj; Gmada, Nabil; Mahfoudhi, Mohamed Elhedi; Gharbi, Zied

    2009-09-01

    The aims of this study were to evaluate the reliability of a modified agility T-test (MAT) and to examine its relationship to the free countermovement jump (FCMJ) and the 10-m straight sprint (10mSS). In this new version, we preserved the same movement pattern as the T-test but reduced the total distance to be covered. A total of 86 subjects (34 women: age = 22.6 +/- 1.4 years; weight = 63.7 +/- 10.2 kg; height = 1.65 +/- 0.05 m; body mass index = 23.3 +/- 3.3 kg x m(-2) and 52 men: age = 22.4 +/- 1.5 years; weight = 68.7 +/- 8.0 kg; height = 1.77 +/- 0.06 m; body mass index = 22.0 +/- 2.0 kg x m(-2)) performed the MAT, T-test, FCMJ, and 10mSS. Our results showed no difference between test-retest MAT scores. Intraclass reliability of the MAT was greater than 0.90 across the trials (0.92 and 0.95 for women and men, respectively). The mean difference (bias) +/- the 95% limits of agreement was 0.03 +/- 0.37 seconds for women and 0.03 +/- 0.33 seconds for men. The MAT was correlated with the T-test (r = 0.79, p < 0.001 and r = 0.75, p < 0.001 for women and men, respectively). Significant correlations were found between the MAT and both the FCMJ and the 10mSS for women (r = -0.47, p < 0.01 and r = 0.34, p < 0.05, respectively). No significant correlations were found between the MAT and the other tests for men. These results indicate that the MAT is a reliable test for assessing agility. The weak relationships between the MAT and both strength and straight sprinting speed suggest that agility depends on other performance determinants, such as coordination. Considering that field sports generally include sprints with changes of direction over short distances, the MAT seems to be more specific than the T-test for assessing agility.
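The "mean difference (bias) +/- 95% limits of agreement" statistic above follows the Bland-Altman approach: the mean of the test-retest differences plus or minus 1.96 standard deviations of those differences. A minimal sketch with invented trial times, not the study's data:

```python
from statistics import mean, stdev

def limits_of_agreement(trial1, trial2):
    """Bland-Altman agreement: bias = mean paired difference;
    95% limits = bias +/- 1.96 * SD of the differences."""
    diffs = [a - b for a, b in zip(trial1, trial2)]
    bias = mean(diffs)
    half_width = 1.96 * stdev(diffs)  # sample SD of differences
    return bias, bias - half_width, bias + half_width

# Hypothetical agility-test times (seconds) on two sessions
t1 = [5.81, 6.02, 5.65, 5.90, 6.11, 5.74]
t2 = [5.78, 6.00, 5.70, 5.85, 6.05, 5.76]
bias, lo, hi = limits_of_agreement(t1, t2)
print(round(bias, 3), round(lo, 3), round(hi, 3))
```

A bias near zero with narrow limits, as in the abstract's 0.03 +/- 0.33 s, indicates that retest scores track the first session closely at the individual level, which the ICC alone does not guarantee.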

  4. Changing and Evolving Relationships between Two- and Four-Year Colleges and Universities: They're Not Your Parents' Community Colleges Anymore

    ERIC Educational Resources Information Center

    Labov, Jay B.

    2012-01-01

    This paper describes a summit on Community Colleges in the Evolving STEM Education Landscape organized by a committee of the National Research Council (NRC) and the National Academy of Engineering (NAE) and held at the Carnegie Institution for Science on December 15, 2011. This summit followed a similar event organized by Dr. Jill Biden, spouse of…

  5. How Does the Strength of the Relationships between Cognitive Abilities Evolve over the Life Span for Low-IQ vs High-IQ Adults?

    ERIC Educational Resources Information Center

    Facon, Bruno

    2008-01-01

    The present study was designed to examine how the correlations between cognitive abilities evolve during adulthood. Data from 1104 participants on the French version of the Wechsler Adult Intelligence Scale-Third Edition were analyzed. The entire sample was divided into four age groups (16-24 years; 25-44 years; 45-69 years and 70-89 years), which…

  7. How reliable are randomised controlled trials for studying the relationship between diet and disease? A narrative review.

    PubMed

    Temple, Norman J

    2016-08-01

    Large numbers of randomised controlled trials (RCT) have been carried out in order to investigate diet-disease relationships. This article examines eight sets of studies and compares the findings with those from epidemiological studies (cohort studies in seven of the cases). The studies cover the role of dietary factors in blood pressure, body weight, cancer and heart disease. In some cases, the findings from the two types of study are consistent, whereas in other cases the findings appear to be in conflict. A critical evaluation of this evidence suggests factors that may account for conflicting findings. Very often RCT recruit subjects with a history of the disease under study (or at high risk of it) and have a follow-up of only a few weeks or months. Cohort studies, in contrast, typically recruit healthy subjects and have a follow-up of 5-15 years. Owing to these differences, findings from RCT are not necessarily more reliable than those from well-designed prospective cohort studies. We cannot assume that the results of RCT can be freely applied beyond the specific features of the studies.

  8. Relationship of lung function loss to level of initial function: correcting for measurement error using the reliability coefficient.

    PubMed Central

    Irwig, L; Groeneveld, H; Becklake, M

    1988-01-01

    The regression of lung function change on the initial lung function level is biased when the initial level is measured with random error. Several methods have been proposed to obtain unbiased estimates of regression coefficients in such circumstances. We apply these methods to examine the relationship between lung function loss over 11 years and its initial level in 433 men aged about 20 when first seen. On theoretical and practical grounds the best method is the correction of the regression coefficient using the reliability coefficient. This is defined as the ratio of the error-free variance to the variance of the variable measured with error, and is easily estimated as the correlation between repeat measurements of the underlying level. In young men the loss of some lung functions (forced vital capacity [FVC], forced expiratory volume in one second [FEV1], forced expiratory flow in the middle half of expiration, and the ratio FEV1/FVC) does not appear to be related to the initial level. PMID:3256581
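The correction described, dividing the observed regression coefficient by the reliability coefficient (error-free variance over observed variance), is the classic disattenuation formula. A minimal sketch with invented numbers; neither the slope nor the reliability value below comes from the study:

```python
def corrected_slope(observed_slope: float, reliability: float) -> float:
    """Disattenuate a regression slope for measurement error in the
    predictor: beta_true = beta_observed / reliability, where
    reliability = error-free variance / observed variance, in (0, 1]."""
    if not 0 < reliability <= 1:
        raise ValueError("reliability must be in (0, 1]")
    return observed_slope / reliability

# Hypothetical: observed slope of lung function loss on initial level,
# with a test-retest reliability of 0.85 for the initial measurement
print(round(corrected_slope(-0.12, 0.85), 3))
```

Since the reliability is at most 1, the correction always moves the slope away from zero, which is why ignoring measurement error biases the observed coefficient toward the null.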

  9. Evolvable synthetic neural system

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A. (Inventor)

    2009-01-01

    An evolvable synthetic neural system includes an evolvable neural interface operably coupled to at least one neural basis function. Each neural basis function includes an evolvable neural interface operably coupled to a heuristic neural system to perform high-level functions and an autonomic neural system to perform low-level functions. In some embodiments, the evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy.

  10. The Assessment of Positivity and Negativity in Social Networks: The Reliability and Validity of the Social Relationships Index

    ERIC Educational Resources Information Center

    Campo, Rebecca A.; Uchino, Bert N.; Holt-Lunstad, Julianne; Vaughn, Allison; Reblin, Maija; Smith, Timothy W.

    2009-01-01

    The Social Relationships Index (SRI) was designed to examine positivity and negativity in social relationships. Unique features of this scale include its brevity and the ability to examine relationship positivity and negativity at the level of the specific individual and social network. The SRI's psychometric properties were examined in three…

  11. Making Reliability Arguments in Classrooms

    ERIC Educational Resources Information Center

    Parkes, Jay; Giron, Tilia

    2006-01-01

    Reliability methodology needs to evolve as validity has done into an argument supported by theory and empirical evidence. Nowhere is the inadequacy of current methods more visible than in classroom assessment. Reliability arguments would also permit additional methodologies for evidencing reliability in classrooms. It would liberalize methodology…

  12. Reprocessing the Hipparcos data of evolved stars. III. Revised Hipparcos period-luminosity relationship for galactic long-period variable stars

    NASA Astrophysics Data System (ADS)

    Knapp, G. R.; Pourbaix, D.; Platais, I.; Jorissen, A.

    2003-06-01

    We analyze the K band luminosities of a sample of galactic long-period variables using parallaxes measured by the Hipparcos mission. The parallaxes are in most cases re-computed from the Hipparcos Intermediate Astrometric Data using improved astrometric fits and chromaticity corrections. The K band magnitudes are taken from the literature and from measurements by COBE, and are corrected for interstellar and circumstellar extinction. The sample contains stars of several spectral types: M, S and C, and of several variability classes: Mira, semiregular SRa, and SRb. We find that the distribution of stars in the period-luminosity plane is independent of circumstellar chemistry, but that the different variability types have different P-L distributions. Both the Mira variables and the SRb variables have reasonably well-defined period-luminosity relationships, but with very different slopes. The SRa variables are distributed between the two classes, suggesting that they are a mixture of Miras and SRb, rather than a separate class of stars. New period-luminosity relationships are derived based on our revised Hipparcos parallaxes. The Miras show a similar period-luminosity relationship to that found for Large Magellanic Cloud Miras by Feast et al. (1989). The maximum absolute K magnitude of the sample is about -8.2 for both Miras and semi-regular stars, only slightly fainter than the expected AGB limit. We show that the stars with the longest periods (P>400 d) have high mass loss rates and are almost all Mira variables. Based on observations from the Hipparcos astrometric satellite operated by the European Space Agency (ESA). The data table is only available in electronic form at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsweb.u-strasbg.fr/cgi-bin/qcat?J/A+A/403/993

  13. Evolving Sensitivity Balances Boolean Networks

    PubMed Central

    Luo, Jamie X.; Turner, Matthew S.

    2012-01-01

    We investigate the sensitivity of Boolean Networks (BNs) to mutations. We are interested in Boolean Networks as a model of Gene Regulatory Networks (GRNs). We adopt Ribeiro and Kauffman’s Ergodic Set and use it to study the long term dynamics of a BN. We define the sensitivity of a BN to be the mean change in its Ergodic Set structure under all possible loss of interaction mutations. In silico experiments were used to selectively evolve BNs for sensitivity to losing interactions. We find that maximum sensitivity was often achievable and resulted in the BNs becoming topologically balanced, i.e. they evolve towards network structures in which they have a similar number of inhibitory and excitatory interactions. In terms of the dynamics, the dominant sensitivity strategy that evolved was to build BNs with Ergodic Sets dominated by a single long limit cycle which is easily destabilised by mutations. We discuss the relevance of our findings in the context of Stem Cell Differentiation and propose a relationship between pluripotent stem cells and our evolved sensitive networks. PMID:22586459
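The idea of scoring a Boolean network's sensitivity to loss-of-interaction mutations can be illustrated with a toy synchronous BN. This is a simplified sketch, not the authors' model: it uses a threshold update rule and synchronous attractors in place of the Ergodic Set, and the network wiring is invented for illustration.

```python
from itertools import product

# Toy synchronous Boolean network: 3 genes. Each interaction is (source, sign);
# a node's next state is 1 if the sum of signed inputs is positive
# (a simple threshold rule -- an assumption, not the paper's update rule).
INTERACTIONS = {
    0: [(1, +1), (2, -1)],
    1: [(0, +1)],
    2: [(0, +1), (1, -1)],
}

def step(state, interactions):
    """One synchronous update of all nodes."""
    nxt = []
    for node in range(len(state)):
        total = sum(sign * state[src] for src, sign in interactions.get(node, []))
        nxt.append(1 if total > 0 else 0)
    return tuple(nxt)

def attractors(interactions, n=3):
    """Set of attractors (limit cycles / fixed points) over all initial states."""
    found = set()
    for start in product((0, 1), repeat=n):
        seen, s = {}, start
        while s not in seen:
            seen[s] = len(seen)
            s = step(s, interactions)
        # States from the first recurrence of s onward form the cycle.
        cycle_start = seen[s]
        found.add(frozenset(st for st, i in seen.items() if i >= cycle_start))
    return found

def sensitivity(interactions):
    """Fraction of single-interaction deletions that change the attractor set."""
    base = attractors(interactions)
    edges = [(node, i) for node, ins in interactions.items() for i in range(len(ins))]
    changed = 0
    for node, idx in edges:
        mutant = {k: list(v) for k, v in interactions.items()}
        del mutant[node][idx]
        if attractors(mutant) != base:
            changed += 1
    return changed / len(edges)
```

Sensitivity here is the fraction of single-interaction deletions that alter the attractor set; the paper's measure is analogous but defined on the Ergodic Set of an asynchronous network.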

  14. Reliability, Validity, and Associations with Sexual Behavior among Ghanaian Teenagers of Scales Measuring Four Dimensions of Relationships with Parents and Other Adults

    PubMed Central

    Bingenheimer, Jeffrey B.; Asante, Elizabeth; Ahiadeke, Clement

    2013-01-01

    Little research has been done on the social contexts of adolescent sexual behaviors in sub-Saharan Africa. As part of a longitudinal cohort study (N=1275) of teenage girls and boys in two Ghanaian towns, interviewers administered a 26-item questionnaire module intended to assess four dimensions of youth-adult relationships: monitoring, conflict, emotional support, and financial support. Confirmatory factor and traditional psychometric analyses showed the four scales to be reliable. Known-groups comparisons provided evidence of their validity. All four scales had strong bivariate associations with self-reported sexual behavior (odds ratios = 1.66, 0.74, 0.47, and 0.60 for conflict, support, monitoring, and financial support). The instrument is practical for use in sub-Saharan African settings and produces measures that are reliable, valid, and predictive of sexual behavior in youth. PMID:25821286

  15. Traditional vs. Sport-Specific Vertical Jump Tests: Reliability, Validity, and Relationship With the Legs Strength and Sprint Performance in Adult and Teen Soccer and Basketball Players.

    PubMed

    Rodríguez-Rosell, David; Mora-Custodio, Ricardo; Franco-Márquez, Felipe; Yáñez-García, Juan M; González-Badillo, Juan J

    2017-01-01

    Rodríguez-Rosell, D, Mora-Custodio, R, Franco-Márquez, F, Yáñez-García, JM, González-Badillo, JJ. Traditional vs. sport-specific vertical jump tests: reliability, validity, and relationship with the legs strength and sprint performance in adult and teen soccer and basketball players. J Strength Cond Res 31(1): 196-206, 2017-The vertical jump is considered an essential motor skill in many team sports. Many protocols have been used to assess vertical jump ability. However, controversy regarding test selection still exists based on the reliability and specificity of the tests. The main aim of this study was to analyze the reliability and validity of 2 standardized (countermovement jump [CMJ] and Abalakov jump [AJ]) and 2 sport-specific (run-up with 2 [2-LEGS] or 1 leg [1-LEG] take-off jump) vertical jump tests, and their usefulness as predictors of sprint and strength performance for soccer (n = 127) and basketball (n = 59) players in 3 different categories (Under-15, Under-18, and Adults). Three attempts for each of the 4 jump tests were recorded. Twenty-meter sprint time and estimated 1 repetition maximum in full squat were also evaluated. All jump tests showed high intraclass correlation coefficients (0.969-0.995) and low coefficients of variation (1.54-4.82%), although 1-LEG was the jump test with the lowest absolute and relative reliability. All selected jump tests were significantly correlated (r = 0.580-0.983). Factor analysis resulted in the extraction of one principal component, which explained 82.90-95.79% of the variance of all jump tests. The 1-LEG test showed the lowest associations with sprint and strength performance. The results of this study suggest that CMJ and AJ are the most reliable tests for the estimation of explosive force in soccer and basketball players in different age categories.
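The intraclass correlation coefficients reported above can be computed from raw trial data. Below is a minimal sketch of the single-measure, consistency-type ICC(3,1) often used for test-retest designs; the paper does not state which ICC form it used, so this form and the example data are assumptions.

```python
import numpy as np

def icc_3_1(scores):
    """Two-way mixed, consistency, single-measure ICC(3,1).

    scores: (n_subjects, k_trials) array of repeated measurements.
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)   # per-subject means
    col_means = scores.mean(axis=0)   # per-trial means
    ss_total = ((scores - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
```

For duplicate jump heights where subjects differ far more than their own trials do, the ICC approaches 1, which is the situation behind the 0.969-0.995 range reported above.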

  16. [Reliability evaluation for thirteen parameters describing anteroposterior apical base relationship in Angle Class II division 1 patients].

    PubMed

    Liu, Dong-Xu; Zhang, Lei; Wang, Chun-Ling; Zhang, Xiao-Yan; Guo, Jing

    2006-08-01

    To study 13 different cephalometric measurements of the anteroposterior jaw relationship in Angle Class II division 1 statistically and geometrically, and to discuss the effects of various factors on this relationship, and then to choose the most adequate measurements. 120 patients with skeletal malocclusion Angle Class II division 1 were selected. The samples consisted of 60 men and 60 women between 20 to 28 years of age. Lateral cephalometric radiograph for each patient was taken in natural head posture by the same operator. A wire plumb line and suspended weight recorded the true vertical on each radiograph. The subject was then asked to determine the self-balanced position of the head. After determining the self-balanced neutral position, the subject was asked to look into his/her own eyes in the mirror. ANB angle, A-B plane angle, Wits appraisal, AF-BF distance, AXB angle, AB/SN4 distance, AB/PP distance, AXD angle, AD/SN distance, SGn/AB angle, APDI, FABA angle, beta angle and AB/HP distance were measured. Coefficient correlations among measurements were tabulated to determine which combination would produce a higher value. Fuzzy grouping analysis was made. Statistically significant and highly correlated relationships were found among many measurements except Wits appraisal and SGn/AB angle. The thirteen measurements could be divided into five clusters. The cephalometric measurements except Wits appraisal and SGn/AB angle can be used to evaluate anteroposterior jaw relationship. The AB/SN4 distance is the most adequate measurement. ANB angle, A-B plane angle, AF-BF distance, AXB angle, AB/PP distance, APDI, FABA angle and beta angle are similar in describing anteroposterior apical base relationship in Angle Class II division 1 patients.

  17. Reliability, Factor Structure, and Associations With Measures of Problem Relationship and Behavior of the Personality Inventory for DSM-5 in a Sample of Italian Community-Dwelling Adolescents.

    PubMed

    Somma, Antonella; Borroni, Serena; Maffei, Cesare; Giarolli, Laura E; Markon, Kristian E; Krueger, Robert F; Fossati, Andrea

    2017-01-10

    In order to assess the reliability, factorial validity, and criterion validity of the Personality Inventory for DSM-5 (PID-5) among adolescents, 1,264 Italian high school students were administered the PID-5. Participants were also administered the Questionnaire on Relationships and Substance Use as a criterion measure. In the full sample, McDonald's ω values were adequate for the PID-5 scales (median ω = .85, SD = .06), except for Suspiciousness. However, all PID-5 scales showed average inter-item correlation values in the .20-.55 range. Exploratory structural equation modeling analyses provided moderate support for the a priori model of PID-5 trait scales. Ordinal logistic regression analyses showed that selected PID-5 trait scales predicted a significant, albeit moderate (Cox & Snell R² values ranged from .08 to .15, all ps < .001) amount of variance in Questionnaire on Relationships and Substance Use variables.

  18. An examination of the linearity and reliability of the electromyographic amplitude versus dynamic constant external resistance relationships using monopolar and bipolar recording methods.

    PubMed

    Stock, Matt S; Beck, Travis W; Defreitas, Jason M; Dillon, Michael A

    2010-12-15

    The purpose of this study was to examine the linearity and reliability of the electromyographic (EMG) amplitude versus dynamic constant external resistance (DCER) relationships for monopolar and bipolar recording techniques during concentric and eccentric muscle actions. Nineteen healthy men (mean ± SD age = 22.9 ± 2.5 years) performed a series of randomly ordered, submaximal to maximal, unilateral DCER muscle actions of the dominant forearm flexors on two occasions separated by at least 48 h. Specifically, the subjects lifted and lowered weights corresponding to 10-100% of the one repetition maximum (1-RM) in 10% increments. During each muscle action, monopolar and bipolar surface EMG signals were detected simultaneously from the biceps brachii. For the monopolar and bipolar methods, the coefficients of determination for the EMG amplitude versus DCER relationships ranged from 0.64-0.98 and 0.38-0.98 for the concentric muscle actions and 0.45-0.98 and 0.45-0.98 for the eccentric muscle actions, respectively. The intraclass correlation coefficients (ICC) and corresponding standard errors of measurement (SEM) for the linear slope coefficients for the EMG amplitude versus DCER relationships were 0.682 (18.4%) and 0.594 (21.8%) with the monopolar method and 0.810 (25.6%) and 0.774 (17.6%) with the bipolar method for the concentric and eccentric muscle actions, respectively. These findings indicated that monopolar and bipolar recording techniques may be used with a similar degree of linearity and reliability for the EMG amplitude versus concentric and eccentric DCER relationships. Copyright © 2010 Elsevier B.V. All rights reserved.

  19. Genomic medicine: evolving science, evolving ethics

    PubMed Central

    Soden, Sarah E; Farrow, Emily G; Saunders, Carol J; Lantos, John D

    2012-01-01

    Genomic medicine is rapidly evolving. Next-generation sequencing is changing the diagnostic paradigm by allowing genetic testing to be carried out more quickly, less expensively and with much higher resolution, pushing the envelope on existing moral norms and legal regulations. Early experience with implementation of next-generation sequencing to diagnose rare genetic conditions in symptomatic children suggests ways that genomic medicine might come to be used and some of the ethical issues that arise, impacting test design, patient selection, consent, sequencing analysis and communication of results. The ethical issues that arise from use of new technologies cannot be satisfactorily analyzed until they are understood and they cannot be understood until the technologies are deployed in the real world. PMID:23173007

  20. VLSI reliability

    SciTech Connect

    Sabnis, A.G.

    1990-01-01

    This book presents major topics in IC reliability from basic concepts to packaging issues. Other topics covered include failure analysis techniques, radiation effects, and reliability assurance and qualification. This book offers insight into the practical aspects of VLSI reliability.

  1. Reproducibility and Reliability Of QTc and QTcd Measurements and Their Relationships with Left Ventricular Hypertrophy in Hemodialysis Patients.

    PubMed

    Alonso, Maria Angélica Gonçalves; Lima, Valentine de Almeida Costa de Castro; Carreira, Maria Angela Magalhães de Queiroz; Lugon, Jocemir Ronaldo

    2017-08-07

    Left ventricular hypertrophy (LVH) is very common in hemodialysis patients and an independent risk factor for mortality in this population. The myocardial remodeling underlying the LVH can affect ventricular repolarization causing abnormalities in QT interval. To evaluate the reproducibility and reliability of measurements of corrected QT interval (QTc) and its dispersion (QTcd) and correlate these parameters with LVH in hemodialysis patients. Case-control study involving hemodialysis patients and a control group. Clinical examination, blood sampling, transthoracic echocardiogram, and electrocardiogram were performed. Intra- and interobserver correlation and concordance tests were performed by Pearson's correlation, Cohen's Kappa coefficient and Bland-Altman diagram. Linear regression was used to analyze association of QTc or QTcd with LVH. Forty-one HD patients and 37 controls completed the study. Hemodialysis patients tended to have higher values of QTc, QTcd and left ventricular mass index (LVMi) than controls, but the differences did not reach statistical significance. Correlation and concordance tests depicted better results for QTc than for QTcd. In HD patients, a poor but significant correlation was found between QTc and LVMi (R2 = 0.12; p = 0.03). No correlation was found between values of QTcd and LVMi (R2 = 0.00; p = 0.94). For the control group, the correspondent values were R2 = 0.00; p = 0.67 and R2 = 0.00; p = 0.94, respectively. We found that QTc interval, in contrast to QTcd, is a reproducible and reliable measure and had a weak but positive correlation with LVMi in HD patients.
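For readers unfamiliar with the quantities being measured: QTc is the heart-rate-corrected QT interval and QT dispersion is the spread of the interval across ECG leads. The sketch below uses Bazett's formula, which is one common correction; the paper does not state which formula it applied, so this choice is an assumption.

```python
import math

def qtc_bazett(qt_ms, rr_ms):
    """Bazett-corrected QT in ms: QTc = QT / sqrt(RR in seconds).

    Bazett's formula is one common convention; other corrections
    (Fridericia, Framingham) exist and give different values.
    """
    return qt_ms / math.sqrt(rr_ms / 1000.0)

def qt_dispersion(qtc_values):
    """QT(c) dispersion: maximum minus minimum interval across ECG leads."""
    return max(qtc_values) - min(qtc_values)
```

At a heart rate of 60 bpm (RR = 1000 ms) the correction is neutral, so `qtc_bazett(400, 1000)` returns 400 ms.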

  2. Feeding practice among 6-36 months old in Tanzania and Uganda: reliability and relationship with early childhood caries, ECC.

    PubMed

    Masumo, Ray; Bardsen, Asgeir; Mashoto, Kijakazi; Åstrøm, Anne Nordrehaug

    2013-09-01

    To assess the reproducibility of caregivers' responses to dietary recall from birth and 24-h dietary recall with respect to infants' intake of sugared snacks and to assess whether those assessment methods provide comparable results for groups of infants. Re-test reliability and clinical covariates of time to first exposure of sugared snacks and time to termination of breastfeeding were also examined. It was hypothesized that time to first exposure/termination would vary according to socio-demographic profile and ECC. Interviews and clinical oral examinations were carried out in Kampala and Manyara, including 1221 and 816 child-caregiver pairs. Reproducibility was assessed using Cohen's kappa and Intra Class Correlation Coefficient, ICC. Adjusted Cox regression was used to model time to first exposure of sugared snacks and time to termination of breastfeeding. Cohen's kappa for intake of sugar items ranged from 0.40-1.0, with no differences observed between average intakes at test-re-test. Mean sugar score based on 24-h recall increased significantly by increasing quartiles of the sugar score based on recall from birth. Cox regression revealed that the odds ratio, OR, for early exposure to various sugared snacks and the ORs for early termination of breastfeeding were significantly smaller in infants with than without ECC. Fair-to-good reproducibility was established. Infants' sugar consumption emerges as early as 6 months of age. Survival of any breastfeeding and non-exposure to sugared snacks were most prolonged among infants with ECC. This has implications for interventions needed to improve feeding habits of infants and toddlers.
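Cohen's kappa, used above for the test-retest comparison, corrects the observed agreement between two sets of categorical ratings for the agreement expected by chance. A minimal implementation (illustrative, not the study's code):

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two categorical ratings of the same items."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    categories = sorted(set(rater1) | set(rater2))
    # Observed proportion of exact agreement.
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement from each rater's marginal category frequencies.
    p_exp = sum(
        (rater1.count(c) / n) * (rater2.count(c) / n) for c in categories
    )
    return (p_obs - p_exp) / (1 - p_exp)
```

A kappa of 1.0 indicates perfect agreement, while values around 0.4-0.6 are conventionally read as fair to moderate, consistent with the "fair-to-good" interpretation of the 0.40-1.0 range reported above.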

  3. An Evolving Astrobiology Glossary

    NASA Astrophysics Data System (ADS)

    Meech, K. J.; Dolci, W. W.

    2009-12-01

    One of the resources that evolved from the Bioastronomy 2007 meeting was an online interdisciplinary glossary of terms that might not be universally familiar to researchers in all sub-disciplines feeding into astrobiology. In order to facilitate comprehension of the presentations during the meeting, a database driven web tool for online glossary definitions was developed and participants were invited to contribute prior to the meeting. The glossary was downloaded and included in the conference registration materials for use at the meeting. The glossary web tool has now been delivered to the NASA Astrobiology Institute so that it can continue to grow as an evolving resource for the astrobiology community.

  4. Microbiota and diabetes: an evolving relationship.

    PubMed

    Tilg, Herbert; Moschen, Alexander R

    2014-09-01

    The gut microbiota affects numerous biological functions throughout the body and its characterisation has become a major research area in biomedicine. Recent studies have suggested that gut bacteria play a fundamental role in diseases such as obesity, diabetes and cardiovascular disease. Data are accumulating in animal models and humans suggesting that obesity and type 2 diabetes (T2D) are associated with a profound dysbiosis. The first human metagenome-wide association studies demonstrated highly significant correlations of specific intestinal bacteria, certain bacterial genes and respective metabolic pathways with T2D. Importantly, concentrations of butyrate-producing bacteria such as Roseburia intestinalis and Faecalibacterium prausnitzii were lower in T2D subjects. This supports the increasing evidence that butyrate and other short-chain fatty acids are able to exert profound immunometabolic effects. Endotoxaemia, most likely gut-derived, has also been observed in patients with metabolic syndrome and T2D and might play a key role in metabolic inflammation. A further hint towards an association between microbiota and T2D has been derived from studies in pregnancy showing that major gut microbial shifts occurring during pregnancy affect host metabolism. Interestingly, certain antidiabetic drugs such as metformin also interfere with the intestinal microbiota. Specific members of the microbiota such as Akkermansia muciniphila might be decreased in diabetes and, when administered to mice, exerted antidiabetic effects. Therefore, as a 'gut signature' becomes more evident in T2D, a better understanding of the role of the microbiota in diabetes might provide new aspects regarding its pathophysiological relevance and pave the way for new therapeutic principles.

  5. Genetics and Medicine: An Evolving Relationship

    ERIC Educational Resources Information Center

    Scriver, Charles R.; And Others

    1978-01-01

    This article describes the importance of genetic factors in health and disease and calls for the development of services for genetic screening, diagnosis, and counseling. Such services presently available in Canada are described. (BB)

  6. Methods Evolved by Observation

    ERIC Educational Resources Information Center

    Montessori, Maria

    2016-01-01

    Montessori's idea of the child's nature and the teacher's perceptiveness begins with amazing simplicity, and when she speaks of "methods evolved," she is unveiling a methodological system for observation. She begins with the early childhood explosion into writing, which is a familiar child phenomenon that Montessori has written about…

  7. The Skin Cancer and Sun Knowledge (SCSK) Scale: Validity, Reliability, and Relationship to Sun-Related Behaviors Among Young Western Adults.

    PubMed

    Day, Ashley K; Wilson, Carlene; Roberts, Rachel M; Hutchinson, Amanda D

    2014-08-01

    Increasing public knowledge remains one of the key aims of skin cancer awareness campaigns, yet diagnosis rates continue to rise. It is essential we measure skin cancer knowledge adequately so as to determine the nature of its relationship to sun-related behaviors. This study investigated the psychometric properties of a new measure of skin cancer knowledge, the Skin Cancer and Sun Knowledge (SCSK) scale. A total of 514 Western young adults (females n = 320, males n = 194) aged 18 to 26 years completed measures of skin type, skin cancer knowledge, tanning behavior, sun exposure, and sun protection. Two-week test-retest of the SCSK was conducted with 52 participants. Internal reliability of the SCSK scale was acceptable (KR-20 = .69), test-retest reliability was high (r = .83, n = 52), and acceptable levels of face, content, and incremental validity were demonstrated. Skin cancer knowledge (as measured by SCSK) correlated with sun protection, sun exposure, and tanning behaviors in the female sample, but not in the males. Skin cancer knowledge appears to be more relevant to the behavior of young women than that of young males. We recommend that future research establish the validity of the SCSK across a range of participant groups.

  8. Evolvability Characterization in the Context of SOA

    NASA Astrophysics Data System (ADS)

    Arciniegas H., Jose L.; Dueñas L., Juan C.

    Service-Oriented Architecture (SOA) is an architectural style which promotes reuse of self-contained services. These self-contained services allow a better consideration of software quality characteristics as they can be independently analyzed. Our work considers the evolvability quality characteristic because of its impact on the Maintenance and Evolution (M&E) stages in software enterprises. This paper has three goals: first, to relate SOA and quality characteristics, focusing on a precise definition of the evolvability of a software product from the SOA perspective; second, to propose an M&E model for SOA; and finally, to present experiences in assessing evolvability in real software products. Two case studies were carried out: the first analyzes the evolvability of the OSGi framework; in the second, the model is applied in local Small and Medium Enterprises (SMEs), where an improvement process was executed.

  9. Relationship Between Agility Tests and Short Sprints: Reliability and Smallest Worthwhile Difference in National Collegiate Athletic Association Division-I Football Players.

    PubMed

    Mann, J Bryan; Ivey, Pat A; Mayhew, Jerry L; Schumacher, Richard M; Brechue, William F

    2016-04-01

    The Pro-Agility test (I-Test) and 3-cone drill (3-CD) are widely used in football to assess quickness in change of direction. Likewise, the 10-yard (yd) sprint, a test of sprint acceleration, is gaining popularity for testing physical competency in football players. Despite their frequent use, little information exists on the relationship between agility and sprint tests as well as the reliability and degree of change necessary to indicate meaningful improvement resulting from training. The purpose of this study was to determine the reliability and smallest worthwhile difference (SWD) of the I-Test and 3-CD and the relationship of sprint acceleration to their performance. Division-I football players (n = 64, age = 20.5 ± 1.2 years, height = 185.2 ± 6.1 cm, body mass = 107.8 ± 20.7 kg) performed duplicate trials in each test during 2 separate weeks at the conclusion of a winter conditioning period. The better time of the 2 trials for each week was used for comparison. The 10-yd sprint was timed electronically, whereas the I-Test and 3-CD were hand timed by experienced testers. Each trial was performed on an indoor synthetic turf, with players wearing multicleated turf shoes. There was no significant difference (p > 0.06) between test weeks for the I-Test (4.53 ± 0.35 vs. 4.54 ± 0.31 seconds), 3-CD (7.45 ± 0.06 vs. 7.49 ± 0.06 seconds), or 10-yd sprint (1.85 ± 0.12 vs. 1.84 ± 0.12 seconds). The intraclass correlation coefficients (ICC) for 3-CD (ICC = 0.962) and 10-yd sprint (ICC = 0.974) were slightly higher than for the I-Test (ICC = 0.914). These values lead to acceptable levels of the coefficient of variation for each test (1.2, 1.2, and 1.9%, respectively). The SWD% indicated that a meaningful improvement due to training would require players to decrease their times by 6.6% for I-Test, 3.7% for 3-CD, and 3.8% for 10-yd sprint. Performance in agility and short sprint tests are highly related and reliable in college football players, providing quantifiable
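The reliability statistics used above (typical error, CV%, and the smallest worthwhile difference) can be derived from duplicate trials. This sketch follows common Hopkins-style conventions; the SWD threshold of 0.2 × between-subject SD is an assumption, since the paper reports SWD% without giving its exact formula.

```python
import numpy as np

def reliability_stats(trial1, trial2):
    """Typical error, CV%, and smallest worthwhile difference from
    duplicate trials. SWD = 0.2 x between-subject SD is one common
    convention, assumed here rather than taken from the paper."""
    t1 = np.asarray(trial1, dtype=float)
    t2 = np.asarray(trial2, dtype=float)
    diff = t2 - t1
    te = diff.std(ddof=1) / np.sqrt(2)            # typical error of measurement
    grand_mean = np.concatenate([t1, t2]).mean()
    cv_pct = 100 * te / grand_mean                # coefficient of variation, %
    between_sd = ((t1 + t2) / 2).std(ddof=1)      # between-subject SD
    swd = 0.2 * between_sd                        # smallest worthwhile difference
    return te, cv_pct, swd
```

For sprint or agility times, a change larger than the SWD (and larger than the typical error) is the usual criterion for a meaningful training effect.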

  10. Evolvable Neural Software System

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.

  11. Evolving Robust Gene Regulatory Networks

    PubMed Central

    Noman, Nasimul; Monjo, Taku; Moscato, Pablo; Iba, Hitoshi

    2015-01-01

    Design and implementation of robust network modules is essential for construction of complex biological systems through hierarchical assembly of ‘parts’ and ‘devices’. The robustness of gene regulatory networks (GRNs) is ascribed chiefly to the underlying topology. The automatic designing capability of GRN topology that can exhibit robust behavior can dramatically change the current practice in synthetic biology. A recent study shows that Darwinian evolution can gradually develop higher topological robustness. Subsequently, this work presents an evolutionary algorithm that simulates natural evolution in silico, for identifying network topologies that are robust to perturbations. We present a Monte Carlo-based method for quantifying topological robustness, along with a fitness approximation approach for its efficient calculation, since the exact computation is very intensive. The proposed framework was verified using two classic GRN behaviors: oscillation and bistability, although the framework is generalized for evolving other types of responses. The algorithm identified robust GRN architectures, which were verified through further analyses and comparisons. Analysis of the results also shed light on the relationship among robustness, cooperativity and complexity. This study also shows that nature has already evolved very robust architectures for its crucial systems; hence simulation of this natural process can be very valuable for designing robust biological systems. PMID:25616055

  12. Highly-evolved stars

    NASA Technical Reports Server (NTRS)

    Heap, S. R.

    1981-01-01

    The ways in which the IUE has proved useful in studying highly evolved stars are reviewed. The importance of high dispersion spectra for abundance analyses of the sdO stars and for studies of the wind from the central star of NGC 6543 and the wind from the O-type component of Vela X-1 is shown. Low dispersion spectra are used for absolute spectrophotometry of the dwarf nova, EX Hya. Angular resolution is important for detecting and locating UV sources in globular clusters.

  13. Regolith Evolved Gas Analyzer

    NASA Technical Reports Server (NTRS)

    Hoffman, John H.; Hedgecock, Jud; Nienaber, Terry; Cooper, Bonnie; Allen, Carlton; Ming, Doug

    2000-01-01

    The Regolith Evolved Gas Analyzer (REGA) is a high-temperature furnace and mass spectrometer instrument for determining the mineralogical composition and reactivity of soil samples. REGA provides key mineralogical and reactivity data needed to understand the soil chemistry of an asteroid, which in turn aids in determining, in situ, which materials should be selected for return to Earth. REGA is capable of conducting a number of direct soil measurements that are unique to this instrument. These experimental measurements include: (1) mass spectrum analysis of evolved gases from soil samples as they are heated from ambient temperature to 900°C; and (2) identification of liberated chemicals, e.g., water, oxygen, sulfur, chlorine, and fluorine. REGA would be placed on the surface of a near-Earth asteroid. It is an autonomous instrument that is controlled from Earth but analyzes regolith materials automatically. The REGA instrument consists of four primary components: (1) a flight-proven mass spectrometer, (2) a high-temperature furnace, (3) a soil handling system, and (4) a microcontroller. An external arm containing a scoop or drill gathers regolith samples. A sample is placed in the inlet orifice, where the finest-grained particles are sifted into a metering volume and subsequently moved into a crucible. A movable arm then places the crucible in the furnace. The furnace is closed, sealing the inner volume to collect the evolved gases for analysis. Owing to the very low g forces on an asteroid compared to Mars or the Moon, the sample must be moved from inlet to crucible by mechanical means rather than by gravity. As the soil sample is heated through a programmed pattern, the gases evolved at each temperature are passed through a transfer tube to the mass spectrometer for analysis and identification. Return data from the instrument will lead to new insights and discoveries including: (1) Identification of the molecular masses of all of the gases

  14. Validity and reliability of the Patient Centred Assessment Method for patient complexity and relationship with hospital length of stay: a prospective cohort study.

    PubMed

    Yoshida, Shuhei; Matsushima, Masato; Wakabayashi, Hidetaka; Mutai, Rieko; Murayama, Shinichi; Hayashi, Tetsuro; Ichikawa, Hiroko; Nakano, Yuko; Watanabe, Takamasa; Fujinuma, Yasuki

    2017-05-09

    Several instruments for evaluating patient complexity have been developed from a biopsychosocial perspective. Although relationships between the results obtained by these instruments and the length of stay in hospital have been examined, many instruments are complicated and not easy to use. The Patient Centred Assessment Method (PCAM) is a candidate for practical use. This study aimed to test the validity and reliability of the PCAM and to examine the correlations between length of hospital stay and PCAM scores in a regional secondary care hospital in Japan. Prospective cohort study. Two hundred and one patients admitted to Ouji Coop Hospital between July 2014 and September 2014. PCAM total score in the initial phase of hospital admission. Length of stay in hospital. Among 201 patients (98 female, 103 male) with a mean (SD) age of 77.4 (11.9) years, the mean (SD) PCAM score was 25 (7.3) and the mean (SD) length of stay in hospital (LOS) was 34.1 (40.9) days. Using exploratory factor analysis to examine construct validity, the PCAM showed a two-factor structure, comprising medicine-oriented and patient-oriented complexity. The Spearman rank correlation coefficient for evaluating criterion-based validity between PCAM and INTERMED was 0.90. For reliability, Cronbach's alpha was 0.85. According to negative binomial regression analyses, PCAM scores are a statistically significant predictor (p<0.001) of LOS after adjusting for age, gender, Mini Nutritional Assessment Short-Form, Charlson Comorbidity Index, serum sodium concentration, total number of medications and whether public assistance was required. In another model, each factor in PCAM was independently correlated with length of stay in hospital after adjustment (medicine-oriented complexity: p=0.001, patient-oriented complexity: p=0.014). PCAM is a reliable and valid measurement of patient complexity and PCAM scores have a significant correlation with hospital length of stay.

  15. Psychometric development and reliability analysis of a patient satisfaction with interpersonal relationship with navigator measure: a multi-site patient navigation research program study.

    PubMed

    Jean-Pierre, Pascal; Fiscella, Kevin; Winters, Paul C; Post, Douglas; Wells, Kristen J; McKoy, June M; Battaglia, Tracy; Simon, Melissa A; Kilbourn, Kristin

    2012-09-01

    Patient navigation (PN) is a method for addressing racial-ethnic and socioeconomically based disparities in cancer-related care. Patient navigators provide logistic and emotional support to underserved patients to facilitate successful completion of diagnostic and treatment care. Yet, little is known about patient satisfaction with the relationship with a navigator due to a dearth of instruments measuring satisfaction. The objective of this study was to validate the Patient Satisfaction with Interpersonal Relationship with Navigator (PSN-I) measure for patients undergoing diagnostic and/or therapeutic cancer care. We administered the PSN-I to 783 participants from the nine different sites of the National Cancer Institute sponsored Patient Navigation Research Program. We evaluated the latent structure and internal consistency of the PSN-I using principal components analysis (PCA) and Cronbach coefficient alpha (α), respectively. We used correlation analyses to examine divergence and convergence of the PSN-I with the Patient Satisfaction with Cancer-related Care (PSCC), the Rapid Estimate of Adult Literacy in Medicine (REALM) Long Form, and patients' demographics. The PCA revealed a coherent set of items that explains 76.6% of the variance in PSN-I. Reliability assessment revealed high internal consistency (α ranging from 0.95 to 0.96). The PSN-I had good face validity as well as convergent and divergent validity, as indicated by moderate correlations with score on the PSCC (all ps < 0.0001) and non-significant correlations with primary language, marital status, and scores on the REALM Long Form (all ps > 0.05). The PSN-I is a valid and suitable measure of satisfaction with a patient navigator for the present sample. Copyright © 2011 John Wiley & Sons, Ltd.

  16. Our evolving universe

    NASA Astrophysics Data System (ADS)

    Longair, Malcolm S.

    Our Evolving Universe is a lucid, non-technical and infectiously enthusiastic introduction to current astronomy and cosmology. Highly illustrated throughout with the latest colour images from the world's most advanced telescopes, it also provides a colourful view of our Universe. Malcolm Longair takes us on a breathtaking tour of the most dramatic recent results astronomers have obtained on the birth of stars, the hunt for black holes and dark matter, gravitational lensing, and the latest tests of the Big Bang. He leads the reader right up to the key questions that future research in astronomy and cosmology must answer. A clear and comprehensive glossary of technical terms is also provided. For the general reader, student or professional wishing to understand the key questions today's astronomers and cosmologists are trying to answer, this is an invaluable and inspiring read.

  17. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1990-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.
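
    The u-plot idea can be sketched in a few lines (a hedged illustration, not the authors' code; the exponential model and the failure times are made up for the example): if each one-step-ahead predicted distribution F_i were accurate, the values u_i = F_i(t_i) would behave like a uniform sample on [0, 1], so their maximum deviation from uniformity summarizes predictive bias.

```python
import math

def u_plot_values(predicted_cdfs, observed_times):
    """u_i = F_i(t_i): each model's predicted CDF evaluated at the
    observed inter-failure time it was predicting."""
    return sorted(F(t) for F, t in zip(predicted_cdfs, observed_times))

def ks_distance_from_uniform(u):
    """Largest gap between the empirical CDF of the sorted u_i and the
    uniform CDF on [0, 1] -- a summary of how biased the predictions are."""
    n = len(u)
    return max(max(abs((i + 1) / n - ui), abs(ui - i / n))
               for i, ui in enumerate(u))

# Made-up example: an exponential model whose failure rate is too optimistic.
rate = 0.5
cdfs = [lambda t, r=rate: 1 - math.exp(-r * t)] * 5
times = [0.2, 0.5, 1.1, 1.9, 3.0]
u = u_plot_values(cdfs, times)
print(round(ks_distance_from_uniform(u), 3))
```

    In the recalibration step described above, the estimated bias (the shape of the u-plot) would then be used to transform future predictions; here only the diagnostic half is sketched.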

  18. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1989-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.

  20. Why did heterospory evolve?

    PubMed

    Petersen, Kurt B; Burd, Martin

    2016-10-11

    The primitive land plant life cycle featured the production of spores of unimodal size, a condition called homospory. The evolution of bimodal size distributions with small male spores and large female spores, known as heterospory, was an innovation that occurred repeatedly in the history of land plants. The importance of desiccation-resistant spores for colonization of the land is well known, but the adaptive value of heterospory has never been well established. It was an addition to a sexual life cycle that already involved male and female gametes. Its role as a precursor to the evolution of seeds has received much attention, but this is an evolutionary consequence of heterospory that cannot explain the transition from homospory to heterospory (and the lack of evolutionary reversal from heterospory to homospory). Enforced outcrossing of gametophytes has often been mentioned in connection to heterospory, but we review the shortcomings of this argument as an explanation of the selective advantage of heterospory. Few alternative arguments concerning the selective forces favouring heterospory have been proposed, a paucity of attention that is surprising given the importance of this innovation in land plant evolution. In this review we highlight two ideas that may lead us to a better understanding of why heterospory evolved. First, models of optimal resource allocation - an approach that has been used for decades in evolutionary ecology to help understand parental investment and other life-history patterns - suggest that an evolutionary increase in spore size could reach a threshold at which small spores yielding small, sperm-producing gametophytes would return greater fitness per unit of resource investment than would large spores and bisexual gametophytes. With the advent of such microspores, megaspores would evolve under frequency-dependent selection. This argument can account for the appearance of heterospory in the Devonian, when increasingly tall and complex

  1. Evolving a photosynthetic organelle.

    PubMed

    Nakayama, Takuro; Archibald, John M

    2012-04-24

    The evolution of plastids from cyanobacteria is believed to represent a singularity in the history of life. The enigmatic amoeba Paulinella and its 'recently' acquired photosynthetic inclusions provide a fascinating system through which to gain fresh insight into how endosymbionts become organelles. The plastids, or chloroplasts, of algae and plants evolved from cyanobacteria by endosymbiosis. This landmark event conferred on eukaryotes the benefits of photosynthesis--the conversion of solar energy into chemical energy--and in so doing had a huge impact on the course of evolution and the climate of Earth [1]. From the present state of plastids, however, it is difficult to trace the evolutionary steps involved in this momentous development, because all modern-day plastids have fully integrated into their hosts. Paulinella chromatophora is a unicellular eukaryote that bears photosynthetic entities called chromatophores that are derived from cyanobacteria and has thus received much attention as a possible example of an organism in the early stages of organellogenesis. Recent studies have unlocked the genomic secrets of its chromatophore [2,3] and provided concrete evidence that the Paulinella chromatophore is a bona fide photosynthetic organelle [4]. The question is how Paulinella can help us to understand the process by which an endosymbiont is converted into an organelle.

  2. Communicability across evolving networks.

    PubMed

    Grindrod, Peter; Parsons, Mark C; Higham, Desmond J; Estrada, Ernesto

    2011-04-01

    Many natural and technological applications generate time-ordered sequences of networks, defined over a fixed set of nodes; for example, time-stamped information about "who phoned who" or "who came into contact with who" arise naturally in studies of communication and the spread of disease. Concepts and algorithms for static networks do not immediately carry through to this dynamic setting. For example, suppose A and B interact in the morning, and then B and C interact in the afternoon. Information, or disease, may then pass from A to C, but not vice versa. This subtlety is lost if we simply summarize using the daily aggregate network given by the chain A-B-C. However, using a natural definition of a walk on an evolving network, we show that classic centrality measures from the static setting can be extended in a computationally convenient manner. In particular, communicability indices can be computed to summarize the ability of each node to broadcast and receive information. The computations involve basic operations in linear algebra, and the asymmetry caused by time's arrow is captured naturally through the noncommutativity of matrix-matrix multiplication. Illustrative examples are given for both synthetic and real-world communication data sets. We also discuss the use of the new centrality measures for real-time monitoring and prediction.
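
    The time-ordered walk construction above can be illustrated with a small example (a sketch in the spirit of the abstract, not the authors' code; the downweighting parameter a and the truncated-series inverse are assumptions of this sketch): each network snapshot contributes a resolvent-style factor, and because matrix multiplication does not commute, information can pass from A to C through the morning-then-afternoon sequence but not in the reversed order.

```python
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def resolvent(A, a=0.3, terms=20):
    """(I - aA)^{-1} approximated by the truncated Neumann series
    I + aA + (aA)^2 + ...; counts walks of all lengths, downweighted by a."""
    n = len(A)
    Q = [[float(i == j) for j in range(n)] for i in range(n)]
    P = [row[:] for row in Q]
    aA = [[a * A[i][j] for j in range(n)] for i in range(n)]
    for _ in range(terms):
        P = matmul(P, aA)
        Q = [[Q[i][j] + P[i][j] for j in range(n)] for i in range(n)]
    return Q

# Morning: A-B interact; afternoon: B-C interact (undirected snapshots).
A1 = [[0, 1, 0], [1, 0, 0], [0, 0, 0]]
A2 = [[0, 0, 0], [0, 0, 1], [0, 1, 0]]

Q = matmul(resolvent(A1), resolvent(A2))     # time-ordered product
Qrev = matmul(resolvent(A2), resolvent(A1))  # reversed order

print(Q[0][2] > 0, Qrev[0][2] == 0)  # -> True True
```

    The (A, C) entry of the time-ordered product is positive while the reversed-order entry is exactly zero, which is the asymmetry caused by time's arrow that the abstract describes.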

  3. Evolving Concepts of Asthma

    PubMed Central

    Ray, Anuradha; Wenzel, Sally E.

    2015-01-01

    Our understanding of asthma has evolved over time from a singular disease to a complex of various phenotypes, with varied natural histories, physiologies, and responses to treatment. Early therapies treated most patients with asthma similarly, with bronchodilators and corticosteroids, but these therapies had varying degrees of success. Similarly, despite initial studies that identified an underlying type 2 inflammation in the airways of patients with asthma, biologic therapies targeted toward these type 2 pathways were unsuccessful in all patients. These observations led to increased interest in phenotyping asthma. Clinical approaches, both biased and later unbiased/statistical approaches to large asthma patient cohorts, identified a variety of patient characteristics, but they also consistently identified the importance of age of onset of disease and the presence of eosinophils in determining clinically relevant phenotypes. These paralleled molecular approaches to phenotyping that developed an understanding that not all patients share a type 2 inflammatory pattern. Using biomarkers to select patients with type 2 inflammation, repeated trials of biologics directed toward type 2 cytokine pathways saw newfound success, confirming the importance of phenotyping in asthma. Further research is needed to clarify additional clinical and molecular phenotypes, validate predictive biomarkers, and identify new areas for possible interventions. PMID:26161792

  4. Evolving synergetic interactions

    PubMed Central

    Wu, Bin; Arranz, Jordi; Du, Jinming; Zhou, Da; Traulsen, Arne

    2016-01-01

    Cooperators forgo their own interests to benefit others. This reduces their fitness and thus cooperators are not likely to spread based on natural selection. Nonetheless, cooperation is widespread on every level of biological organization ranging from bacterial communities to human society. Mathematical models can help to explain under which circumstances cooperation evolves. Evolutionary game theory is a powerful mathematical tool to depict the interactions between cooperators and defectors. Classical models typically involve either pairwise interactions between individuals or a linear superposition of these interactions. For interactions within groups, however, synergetic effects may arise: their outcome is not just the sum of its parts. This is because the payoffs via a single group interaction can be different from the sum of any collection of two-player interactions. Assuming that all interactions start from pairs, how can such synergetic multiplayer games emerge from simpler pairwise interactions? Here, we present a mathematical model that captures the transition from pairwise interactions to synergetic multiplayer ones. We assume that different social groups have different breaking rates. We show that non-uniform breaking rates do foster the emergence of synergy, even though individuals always interact in pairs. Our work sheds new light on the mechanisms underlying such synergetic interactions. PMID:27466437

  5. Evolving Crashworthiness Design Criteria

    DTIC Science & Technology

    1988-12-01

    occupants in crash velocity changes of the severity cited in Figure 3. Moreover, the structure and equipment shall allow deformation in a controlled ... governed by the stopping distance and pulse duration. Figure 4 illustrates this relationship and indicates the importance of controlled energy ... fatigue life during the initial design phase of the helicopter. ... Figure 5 depicts the systems approach required relative to management of the crash

  6. Reliability training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  7. Person Reliability

    ERIC Educational Resources Information Center

    Lumsden, James

    1977-01-01

    Person changes can be of three kinds: developmental trends, swells, and tremors. Person unreliability in the tremor sense (momentary fluctuations) can be estimated from person characteristic curves. Average person reliability for groups can be compared from item characteristic curves. (Author)

  8. Photovoltaic performance and reliability workshop

    SciTech Connect

    Mrig, L.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986--1993. PV performance and PV reliability are at least as important as PV cost, if not more so. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange technical knowledge and field experience related to current information in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop, held in September 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  9. Disgust: Evolved Function and Structure

    ERIC Educational Resources Information Center

    Tybur, Joshua M.; Lieberman, Debra; Kurzban, Robert; DeScioli, Peter

    2013-01-01

    Interest in and research on disgust has surged over the past few decades. The field, however, still lacks a coherent theoretical framework for understanding the evolved function or functions of disgust. Here we present such a framework, emphasizing 2 levels of analysis: that of evolved function and that of information processing. Although there is…

  10. Evolving virtual creatures and catapults.

    PubMed

    Chaumont, Nicolas; Egli, Richard; Adami, Christoph

    2007-01-01

    We present a system that can evolve the morphology and the controller of virtual walking and block-throwing creatures (catapults) using a genetic algorithm. The system is based on Sims' work, implemented as a flexible platform with an off-the-shelf dynamics engine. Experiments aimed at evolving Sims-type walkers resulted in the emergence of various realistic gaits while using fairly simple objective functions. Due to the flexibility of the system, drastically different morphologies and functions evolved with only minor modifications to the system and objective function. For example, various throwing techniques evolved when selecting for catapults that propel a block as far as possible. Among the strategies and morphologies evolved, we find the drop-kick strategy, as well as the systematic invention of the principle behind the wheel, when allowing mutations to the projectile.

  11. Reliability physics

    NASA Technical Reports Server (NTRS)

    Cuddihy, E. F.; Ross, R. G., Jr.

    1984-01-01

    Speakers whose topics relate to the reliability physics of solar arrays are listed and their topics briefly reviewed. Nine reports are reviewed ranging in subjects from studies of photothermal degradation in encapsulants and polymerizable ultraviolet stabilizers to interface bonding stability to electrochemical degradation of photovoltaic modules.

  12. The Reliability of Density Measurements.

    ERIC Educational Resources Information Center

    Crothers, Charles

    1978-01-01

    Data from a land-use study of small- and medium-sized towns in New Zealand are used to ascertain the relationship between official and effective density measures. It was found that the reliability of official measures of density is very low overall, although reliability increases with community size. (Author/RLV)

  13. How did the cilium evolve?

    PubMed

    Satir, Peter; Mitchell, David R; Jékely, Gáspár

    2008-01-01

    The cilium is a characteristic organelle of eukaryotes constructed from over 600 proteins. Bacterial flagella are entirely different. 9 + 2 motile cilia evolved before the divergence of the last eukaryotic common ancestor (LECA). This chapter explores, compares, and contrasts two potential pathways of evolution: (1) via invasion of a centriolar-like virus and (2) via autogenous formation from a pre-existing microtubule-organizing center (MTOC). In either case, the intraflagellar transport (IFT) machinery that is nearly universally required for the assembly and maintenance of cilia derived from the evolving intracellular vesicular transport system. The sensory function of cilia evolved first and the ciliary axoneme evolved gradually with ciliary motility, an important selection mechanism, as one of the driving forces.

  14. Natural selection promotes antigenic evolvability.

    PubMed

    Graves, Christopher J; Ros, Vera I D; Stevenson, Brian; Sniegowski, Paul D; Brisson, Dustin

    2013-01-01

    The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide an experimentally tractable system to test whether natural selection has favored mechanisms that increase evolvability. Many antigenic variation systems consist of paralogous unexpressed 'cassettes' that recombine into an expression site to rapidly alter the expressed protein. Importantly, the magnitude of antigenic change is a function of the genetic diversity among the unexpressed cassettes. Thus, evidence that selection favors among-cassette diversity is direct evidence that natural selection promotes antigenic evolvability. We used the Lyme disease bacterium, Borrelia burgdorferi, as a model to test the prediction that natural selection favors amino acid diversity among unexpressed vls cassettes and thereby promotes evolvability in a primary surface antigen, VlsE. The hypothesis that diversity among vls cassettes is favored by natural selection was supported in each B. burgdorferi strain analyzed using both classical (dN/dS ratios) and Bayesian population genetic analyses of genetic sequence data. This hypothesis was also supported by the conservation of highly mutable tandem-repeat structures across B. burgdorferi strains despite a near complete absence of sequence conservation. Diversification among vls cassettes due to natural selection and mutable repeat structures promotes long-term antigenic evolvability of VlsE. These findings provide a direct demonstration that molecular mechanisms that enhance evolvability of surface antigens are an evolutionary adaptation. The molecular evolutionary processes identified here can serve as a model for the evolution of antigenic evolvability in many pathogens which utilize similar strategies to establish chronic infections.

  15. Spacetimes containing slowly evolving horizons

    SciTech Connect

    Kavanagh, William; Booth, Ivan

    2006-08-15

    Slowly evolving horizons are trapping horizons that are ''almost'' isolated horizons. This paper reviews their definition and discusses several spacetimes containing such structures. These include certain Vaidya and Tolman-Bondi solutions as well as (perturbatively) tidally distorted black holes. Taking into account the mass scales and orders of magnitude that arise in these calculations, we conjecture that slowly evolving horizons are the norm rather than the exception in astrophysical processes that involve stellar-scale black holes.

  16. A Note on the Relationship between the Number of Indicators and Their Reliability in Detecting Regression Coefficients in Latent Regression Analysis

    ERIC Educational Resources Information Center

    Dolan, Conor V.; Wicherts, Jelte M.; Molenaar, Peter C. M.

    2004-01-01

    We consider the question of how variation in the number and reliability of indicators affects the power to reject the hypothesis that the regression coefficients are zero in latent linear regression analysis. We show that power remains constant as long as the coefficient of determination remains unchanged. Any increase in the number of indicators…

  17. On the Discovery of Evolving Truth

    PubMed Central

    Li, Yaliang; Li, Qi; Gao, Jing; Su, Lu; Zhao, Bo; Fan, Wei; Han, Jiawei

    2015-01-01

    In the era of big data, information regarding the same objects can be collected from increasingly more sources. Unfortunately, there usually exist conflicts among the information coming from different sources. To tackle this challenge, truth discovery, i.e., to integrate multi-source noisy information by estimating the reliability of each source, has emerged as a hot topic. In many real world applications, however, the information may come sequentially, and as a consequence, the truth of objects as well as the reliability of sources may be dynamically evolving. Existing truth discovery methods, unfortunately, cannot handle such scenarios. To address this problem, we investigate the temporal relations among both object truths and source reliability, and propose an incremental truth discovery framework that can dynamically update object truths and source weights upon the arrival of new data. Theoretical analysis is provided to show that the proposed method is guaranteed to converge at a fast rate. The experiments on three real world applications and a set of synthetic data demonstrate the advantages of the proposed method over state-of-the-art truth discovery methods. PMID:26705502
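
    A minimal sketch of the incremental idea (not the authors' algorithm; the voting rule, the learning rate and the data are invented for illustration): estimate each object's truth by a vote weighted by current source reliabilities, then update each source's weight toward its agreement with those estimates, so truths and weights co-evolve as new batches of claims arrive.

```python
from collections import defaultdict

def truth_discovery_step(claims, weights, lr=0.3):
    """One incremental update. claims maps source -> {object: value}.
    Estimate each object's truth by weighted vote, then nudge each
    source's weight toward its agreement rate with those estimates."""
    votes = defaultdict(lambda: defaultdict(float))
    for src, obj_vals in claims.items():
        for obj, val in obj_vals.items():
            votes[obj][val] += weights.get(src, 1.0)
    truths = {obj: max(vals, key=vals.get) for obj, vals in votes.items()}
    for src, obj_vals in claims.items():
        agree = sum(truths[o] == v for o, v in obj_vals.items()) / len(obj_vals)
        weights[src] = (1 - lr) * weights.get(src, 1.0) + lr * agree
    return truths, weights

weights = {"s1": 1.0, "s2": 1.0, "s3": 1.0}
batch = {"s1": {"x": "A", "y": "B"},
         "s2": {"x": "A", "y": "B"},
         "s3": {"x": "C", "y": "D"}}
truths, weights = truth_discovery_step(batch, weights)
print(truths["x"], weights["s3"] < weights["s1"])
```

    After one batch the outvoted source s3 has already lost weight, so its claims count for less in the next batch; repeating the step as data arrives gives the dynamically evolving truths and source reliabilities the abstract describes.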

  18. On the Discovery of Evolving Truth.

    PubMed

    Li, Yaliang; Li, Qi; Gao, Jing; Su, Lu; Zhao, Bo; Fan, Wei; Han, Jiawei

    2015-08-01

    In the era of big data, information regarding the same objects can be collected from increasingly more sources. Unfortunately, there usually exist conflicts among the information coming from different sources. To tackle this challenge, truth discovery, i.e., to integrate multi-source noisy information by estimating the reliability of each source, has emerged as a hot topic. In many real world applications, however, the information may come sequentially, and as a consequence, the truth of objects as well as the reliability of sources may be dynamically evolving. Existing truth discovery methods, unfortunately, cannot handle such scenarios. To address this problem, we investigate the temporal relations among both object truths and source reliability, and propose an incremental truth discovery framework that can dynamically update object truths and source weights upon the arrival of new data. Theoretical analysis is provided to show that the proposed method is guaranteed to converge at a fast rate. The experiments on three real world applications and a set of synthetic data demonstrate the advantages of the proposed method over state-of-the-art truth discovery methods.

  19. Time evolving fluid from Vaidya spacetime

    NASA Astrophysics Data System (ADS)

    Wu, Bin; Hao, Xin; Zhao, Liu

    2017-08-01

    A time evolving fluid system is constructed on a timelike boundary hypersurface at finite cutoff in Vaidya spacetime. The approach used to construct the fluid equations is a direct extension of the ordinary gravity/fluid correspondence under the constrained fluctuation obeying Petrov type I conditions. The explicit relationships between the time dependent fluctuation modes and the fluid quantities such as density, velocity field and kinematic viscosity parameters are established, and the resulting fluid system is governed by a system of a sourced continuity equation and a compressible Navier-Stokes equation with nontrivial time evolution.

  20. Reliability and Validity of the Persian Version of Compulsive Eating Scale (CES) in Overweight or Obese Women and Its Relationship with Some Body Composition and Dietary Intake Variables

    PubMed Central

    Mostafavi, Seyed-Ali; Keshavarz, Seyed Ali; Mohammadi, Mohammad Reza; Hosseini, Saeed; Eshraghian, Mohammad Reza; Hosseinzadeh, Payam; Chamari, Maryam; Sari, Zeinab; Akhondzadeh, Shahin

    2016-01-01

    Objective: Compulsive or binge eating is a kind of disturbed eating behavior, mostly observed among dieting women, that involves appetite disorder and uncontrolled eating of large amounts of junk food. The Compulsive Eating Scale (CES), first created by Kagan and Squires in 1984, is an eight-item self-report instrument designed to measure the severity of binge eating disorder. The aim of this study was to establish the reliability and validity of the Persian version of the Compulsive Eating Scale (CES) among overweight and obese women in Iran. Method: One hundred and twenty-six (N = 126) overweight and obese women consented to participate in this study. We measured anthropometric indices, including body weight, height, waist and hip circumferences, total body fat percentage, and visceral fat level, with a body analyzer under standard conditions. The participants then completed the CES. Next, to assess concurrent validity, the Beck Depression Inventory, the Spielberger anxiety scale, an appetite visual analogue rating scale, the Food Craving questionnaire, the Three-Factor Eating Questionnaire-R18, and a restrained eating visual analogue rating scale were administered simultaneously. To assess test-retest reliability, the CES was repeated for all participants two weeks later. We also report the internal consistency and factor analysis of this questionnaire, and the concurrent correlations of the CES with logically relevant questionnaires and with body composition and anthropometric indices. Results: Based on the reliability analysis and principal component factor analysis with Varimax rotation, we extracted two factors: eating because of negative feelings, and overeating. Internal consistency (Cronbach's alpha) of the CES was 0.85 (Cronbach's alpha of the factors was 0.85 and 0.74, respectively). The test-retest correlation of the CES was 0.89. The split-half reliability of the questionnaire was also established with the correlation coefficient…
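
    The internal-consistency figure reported above can be computed for any item-score matrix; a minimal sketch of Cronbach's alpha follows (the respondent rows and item scores below are made up for illustration, not study data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items score matrix
    (list of rows), using sample variances (ddof = 1)."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    k = len(items[0])                                  # number of items
    item_vars = [var([row[j] for row in items]) for j in range(k)]
    total_var = var([sum(row) for row in items])       # variance of total scores
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Four hypothetical respondents scoring three items.
scores = [[3, 2, 4], [1, 1, 2], [4, 3, 5], [2, 2, 3]]
alpha = cronbach_alpha(scores)
```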

  1. Robustness to Faults Promotes Evolvability: Insights from Evolving Digital Circuits

    PubMed Central

    Nolfi, Stefano

    2016-01-01

    We demonstrate how the need to cope with operational faults enables evolving circuits to find more fit solutions. The analysis of the results obtained in different experimental conditions indicates that, in the absence of faults, evolution tends to select circuits that are small and have low phenotypic variability and evolvability. The need to face operational faults, instead, drives evolution toward the selection of larger circuits that are truly robust with respect to genetic variations and that have a greater level of phenotypic variability and evolvability. Overall, our results indicate that the need to cope with operational faults leads to the selection of circuits that have a greater probability of generating better circuits as a result of genetic variation, with respect to a control condition in which circuits are not subjected to faults. PMID:27409589

  2. Robustness to Faults Promotes Evolvability: Insights from Evolving Digital Circuits.

    PubMed

    Milano, Nicola; Nolfi, Stefano

    2016-01-01

    We demonstrate how the need to cope with operational faults enables evolving circuits to find more fit solutions. The analysis of the results obtained in different experimental conditions indicates that, in the absence of faults, evolution tends to select circuits that are small and have low phenotypic variability and evolvability. The need to face operational faults, instead, drives evolution toward the selection of larger circuits that are truly robust with respect to genetic variations and that have a greater level of phenotypic variability and evolvability. Overall, our results indicate that the need to cope with operational faults leads to the selection of circuits that have a greater probability of generating better circuits as a result of genetic variation, with respect to a control condition in which circuits are not subjected to faults.

  3. Reliability of the maximal resisted sprint load test and relationships with performance measures and anthropometric profile in female field sport athletes.

    PubMed

    Petrakos, George; Tynan, Nicola C; Vallely-Farrell, Adam M; Kiely, Cillian; Boudhar, Abdelhak; Egan, Brendan

    2017-09-06

    Resisted sled sprint (RSS) training is an effective modality for the improvement of linear sprint speed. Previous methods of RSS load prescription, e.g., an absolute load or a percentage of body mass (%BM), do not account for inter-individual differences in strength, power or speed characteristics, although the 'maximum resisted sled load' (MRSL) method of RSS load prescription may provide a solution. MRSL is defined as the final RSS load before an athlete can no longer accelerate between two phases (10-15 m and 15-20 m) of a 20 m linear sprint. However, the MRSL test has not been analysed for reliability. Additionally, MRSL performance has not been compared to the outcome of other performance tests. The primary aim of this study was to investigate the reliability of the MRSL testing protocol in female field sport athletes. Participants (age, 20.8 ± 1.9 y; body mass, 64.3 ± 8.4 kg; height, 1.66 ± 0.65 m) underwent anthropometric measurement, strength and power performance testing, and two MRSL test sessions. MRSL values ranged from 20.7 to 58.9%BM. The MRSL test-retest intraclass correlation coefficient, confidence interval and coefficient of variation were 0.95, 0.85-0.98 and 7.6%, respectively. MRSL was 'moderately' and 'strongly' correlated with a number of anthropometric and performance tests (p < 0.05), including % fat free mass, countermovement jump, loaded countermovement jump, rate of force development, horizontal jump and horizontal bound performance. MRSL is a reliable measure for determining the RSS load at which an individual can no longer accelerate during a single RSS effort over 0-20 m. MRSL also accounts for inter-individual variation in body composition, power and speed characteristics of female field sport players.
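
    The MRSL definition above, the final load at which the athlete is still accelerating between the 10-15 m and 15-20 m phases, can be sketched as follows (the loads and split times below are hypothetical, not study data):

```python
def maximal_resisted_sled_load(trials):
    """trials: (load %BM, 10-15 m split s, 15-20 m split s) tuples in
    ascending load order. The athlete is still accelerating while the
    15-20 m split is faster than the 10-15 m split; MRSL is the last
    load for which that holds."""
    mrsl = None
    for load, t10_15, t15_20 in trials:
        if t15_20 < t10_15:          # still accelerating at this load
            mrsl = load
        else:
            break                    # acceleration lost: stop the test
    return mrsl

# Hypothetical split times: acceleration is lost between 30 and 40 %BM.
trials = [(20, 0.82, 0.78), (30, 0.88, 0.85), (40, 0.95, 0.96)]
mrsl = maximal_resisted_sled_load(trials)
```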

  4. Network reliability

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1985-01-01

    Network control (or network management) functions are essential for efficient and reliable operation of a network. Some control functions are currently included as part of the Open System Interconnection model. For local area networks, it is widely recognized that there is a need for additional control functions, including fault isolation functions, monitoring functions, and configuration functions. These functions can be implemented in either a central or distributed manner. The Fiber Distributed Data Interface Medium Access Control and Station Management protocols provide an example of distributed implementation. The relevant information is presented here in outline form.

  5. Reliability Prediction

    NASA Technical Reports Server (NTRS)

    1993-01-01

    RELAV, a NASA-developed computer program, enables Systems Control Technology, Inc. (SCT) to predict the performance of aircraft subsystems. RELAV provides a system-level evaluation of a technology. Systems, the mechanism of a landing gear for example, are first described as a set of components performing a specific function. RELAV analyzes the total system and the individual subsystem probabilities to predict success probability and reliability. This information is then translated into operational support and maintenance requirements. SCT provides research and development services in support of government contracts.

  6. The genotype-phenotype map of an evolving digital organism.

    PubMed

    Fortuna, Miguel A; Zaman, Luis; Ofria, Charles; Wagner, Andreas

    2017-02-01

    To understand how evolving systems bring forth novel and useful phenotypes, it is essential to understand the relationship between genotypic and phenotypic change. Artificial evolving systems can help us understand whether the genotype-phenotype maps of natural evolving systems are highly unusual, and they may help create evolvable artificial systems. Here we characterize the genotype-phenotype map of digital organisms in Avida, a platform for digital evolution. We consider digital organisms from a vast space of 10^141 genotypes (instruction sequences), which can form 512 different phenotypes. These phenotypes are distinguished by different Boolean logic functions they can compute, as well as by the complexity of these functions. We observe several properties with parallels in natural systems, such as connected genotype networks and asymmetric phenotypic transitions. The likely common cause is robustness to genotypic change. We describe an intriguing tension between phenotypic complexity and evolvability that may have implications for biological evolution. On the one hand, genotypic change is more likely to yield novel phenotypes in more complex organisms. On the other hand, the total number of novel phenotypes reachable through genotypic change is highest for organisms with simple phenotypes. Artificial evolving systems can help us study aspects of biological evolvability that are not accessible in vastly more complex natural systems. They can also help identify properties, such as robustness, that are required for both human-designed artificial systems and synthetic biological systems to be evolvable.

  7. The genotype-phenotype map of an evolving digital organism

    PubMed Central

    Zaman, Luis; Wagner, Andreas

    2017-01-01

    To understand how evolving systems bring forth novel and useful phenotypes, it is essential to understand the relationship between genotypic and phenotypic change. Artificial evolving systems can help us understand whether the genotype-phenotype maps of natural evolving systems are highly unusual, and they may help create evolvable artificial systems. Here we characterize the genotype-phenotype map of digital organisms in Avida, a platform for digital evolution. We consider digital organisms from a vast space of 10^141 genotypes (instruction sequences), which can form 512 different phenotypes. These phenotypes are distinguished by different Boolean logic functions they can compute, as well as by the complexity of these functions. We observe several properties with parallels in natural systems, such as connected genotype networks and asymmetric phenotypic transitions. The likely common cause is robustness to genotypic change. We describe an intriguing tension between phenotypic complexity and evolvability that may have implications for biological evolution. On the one hand, genotypic change is more likely to yield novel phenotypes in more complex organisms. On the other hand, the total number of novel phenotypes reachable through genotypic change is highest for organisms with simple phenotypes. Artificial evolving systems can help us study aspects of biological evolvability that are not accessible in vastly more complex natural systems. They can also help identify properties, such as robustness, that are required for both human-designed artificial systems and synthetic biological systems to be evolvable. PMID:28241039

  8. Proposed reliability cost model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements demand understanding and communication between the technical disciplines on one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends on the use of a series of subsystem-oriented CERs and, where possible, CTRs in devising a suitable cost-effective policy.

  9. Evolving MEMS Resonator Designs for Fabrication

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.; Kraus, William F.; Lohn, Jason D.

    2008-01-01

    Because of their small size and high reliability, microelectromechanical (MEMS) devices have the potential to revolutionize many areas of engineering. As with conventionally sized engineering design, there is likely to be a demand for the automated design of MEMS devices. This paper describes our current status as we progress toward our ultimate goal of using an evolutionary algorithm and a generative representation to produce designs of a MEMS device and successfully demonstrate its transfer to an actual chip. To produce designs that are likely to transfer to reality, we present two ways to modify the evaluation of designs. The first is to add location noise: differences between the actual dimensions of the design and the design blueprint, a technique we have used in our work on evolving antennas and robots. The second is to add prestress to model the warping that occurs during the extreme heat of fabrication. In the future we expect to fabricate and test some MEMS resonators evolved in this way.

  10. How evolvable are polarization machines?

    NASA Astrophysics Data System (ADS)

    Laan, Liedewij; Murray, Andrew

    2012-02-01

    In many different cell types proper polarization is essential for cell function. Polarization mechanisms, however, differ between cell types, and even closely related species use a variety of polarization machines. Budding yeast, for example, depends on several parallel mechanisms to establish polarity. One mechanism (i) depends on reaction and diffusion of proteins in the membrane. Another (ii) depends on reorganization of the actin cytoskeleton. So why does yeast use several mechanisms simultaneously? Can yeast also polarize robustly in the absence of one of them? We addressed these questions by evolving budding yeast in the absence of mechanism (i) or (ii). We deleted a mechanism by deleting one or two genes that are essential for its function. After the deletion of either mechanism the growth rate of cells was greatly decreased (2-5 fold) and their cell shape was highly perturbed. Subsequently, we evolved these cells for 10 days. Surprisingly, the evolved cells rapidly overcame most of their polarity defects: they grow at 0.9× the wild-type growth rate and their cell shape is significantly less perturbed. We will now study how these cells rescued polarization. Did they fix the deleted mechanism, strengthen other mechanisms, or evolve a completely new one?

  11. Slippery Texts and Evolving Literacies

    ERIC Educational Resources Information Center

    Mackey, Margaret

    2007-01-01

    The idea of "slippery texts" provides a useful descriptor for materials that mutate and evolve across different media. Eight adult gamers, encountering the slippery text "American McGee's Alice," demonstrate a variety of ways in which players attempt to manage their attention as they encounter a new text with many resonances. The range of their…

  12. Thermal and evolved gas analyzer

    NASA Technical Reports Server (NTRS)

    Williams, M. S.; Boynton, W. V.; James, R. L.; Verts, W. T.; Bailey, S. H.; Hamara, D. K.

    1998-01-01

    The Thermal and Evolved Gas Analyzer (TEGA) instrument will perform calorimetry and evolved gas analysis on soil samples collected from the Martian surface. TEGA is one of three instruments, along with a robotic arm, that form the Mars Volatile and Climate Survey (MVACS) payload. The other instruments are a stereo surface imager, built by Peter Smith of the University of Arizona, and a meteorological station, built by JPL. The MVACS lander will investigate a Martian landing site at approximately 70 deg south latitude. Launch will take place from Kennedy Space Center in January 1999. The TEGA project started in February 1996. In the intervening 24 months, a flight instrument concept has been designed, prototyped, built as an engineering model and flight model, and tested. The instrument performs laboratory-quality differential scanning calorimetry (DSC) over the temperature range of Mars ambient to 1400 K. Low-temperature volatiles (water and carbon dioxide ices) and the carbonates will be analyzed in this temperature range. Carbonates melt and evolve carbon dioxide at temperatures above 600 °C. Evolved oxygen (down to a concentration of 1 ppm) is detected, and CO2 and water vapor and the isotopic variations of CO2 and water vapor are detected and their concentrations measured. The isotopic composition provides important tests of the theory of solar system formation.

  13. The Evolving Demand for Skills.

    ERIC Educational Resources Information Center

    Greenspan, Alan

    From a macroeconomic perspective, the evolving demand for skills in the United States has been triggered by the accelerated expansion of computer and information technology, which has, in turn, brought significant changes to the workplace. Technological advances have made some wholly manual jobs obsolete. But even for many other workers, a rapidly…

  14. Signing Apes and Evolving Linguistics.

    ERIC Educational Resources Information Center

    Stokoe, William C.

    Linguistics retains from its antecedents, philology and the study of sacred writings, some of their apologetic and theological bias. Thus it has not been able to face squarely the question how linguistic function may have evolved from animal communication. Chimpanzees' use of signs from American Sign Language forces re-examination of language…

  15. The Skin Cancer and Sun Knowledge (SCSK) Scale: Validity, Reliability, and Relationship to Sun-Related Behaviors among Young Western Adults

    ERIC Educational Resources Information Center

    Day, Ashley K.; Wilson, Carlene; Roberts, Rachel M.; Hutchinson, Amanda D.

    2014-01-01

    Increasing public knowledge remains one of the key aims of skin cancer awareness campaigns, yet diagnosis rates continue to rise. It is essential we measure skin cancer knowledge adequately so as to determine the nature of its relationship to sun-related behaviors. This study investigated the psychometric properties of a new measure of skin cancer…

  17. Non-uniform Evolving Hypergraphs and Weighted Evolving Hypergraphs

    PubMed Central

    Guo, Jin-Li; Zhu, Xin-Yun; Suo, Qi; Forrest, Jeffrey

    2016-01-01

    Firstly, this paper proposes a non-uniform evolving hypergraph model with nonlinear preferential attachment and an attractiveness term. This model allows nodes to arrive in batches according to a Poisson process and to form hyperedges with existing batches of nodes. Both the number of arriving nodes and the number of chosen existing nodes are random variables, so the size of each hyperedge is non-uniform. This paper establishes the characteristic equation of hyperdegrees, calculates changes in the hyperdegree of each node, and obtains the stationary average hyperdegree distribution of the model by employing Poisson process theory and the characteristic equation. Secondly, this paper constructs a model for weighted evolving hypergraphs that couples the establishment of new hyperedges and nodes with the dynamical evolution of the weights. Using the hyperdegree distribution of the unweighted model above, the stationary average hyperdegree and hyperstrength distributions are then obtained, showing that the weighted evolving hypergraph exhibits scale-free behavior in both the hyperdegree and hyperstrength distributions. PMID:27845334
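
    The batch-arrival, preferential-attachment growth described above can be sketched in simplified form (fixed batch and choice sizes rather than the paper's random ones; all parameter values are illustrative):

```python
import random

def evolve_hypergraph(steps, batch=2, pick=2, seed=7):
    """Toy evolving hypergraph: at each step a batch of new nodes arrives
    and forms one hyperedge with `pick` existing nodes chosen with
    probability proportional to hyperdegree (the paper additionally
    randomizes both sizes and uses Poisson-timed arrivals)."""
    rng = random.Random(seed)
    hyperdegree = {0: 1, 1: 1}            # two seed nodes in one hyperedge
    hyperedges = [(0, 1)]
    next_id = 2
    for _ in range(steps):
        newcomers = tuple(range(next_id, next_id + batch))
        next_id += batch
        population = list(hyperdegree)
        wts = [hyperdegree[n] for n in population]
        chosen = set()
        while len(chosen) < min(pick, len(population)):
            chosen.add(rng.choices(population, weights=wts)[0])
        edge = newcomers + tuple(chosen)
        hyperedges.append(edge)
        for n in edge:                    # every member's hyperdegree grows by 1
            hyperdegree[n] = hyperdegree.get(n, 0) + 1
    return hyperdegree, hyperedges

degrees, hedges = evolve_hypergraph(steps=50)
```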

  18. The evolved function of the oedipal conflict.

    PubMed

    Josephs, Lawrence

    2010-08-01

    Freud based his oedipal theory on three clinical observations of adult romantic relationships: (1) Adults tend to split love and lust; (2) There tend to be sex differences in the ways that men and women split love and lust; (3) Adult romantic relationships are unconsciously structured by the dynamics of love triangles in which dramas of seduction and betrayal unfold. Freud believed that these aspects of adult romantic relationships were derivative expressions of a childhood oedipal conflict that has been repressed. Recent research conducted by evolutionary psychologists supports many of Freud's original observations and suggests that Freud's oedipal conflict may have evolved as a sexually selected adaptation for reproductive advantage. The evolution of bi-parental care based on sexually exclusive romantic bonds made humans vulnerable to the costs of sexual infidelity, a situation of danger that seriously threatens monogamous bonds. A childhood oedipal conflict enables humans to better adapt to this longstanding evolutionary problem by providing the child with an opportunity to develop working models of love triangles. On the one hand, the oedipal conflict facilitates monogamous resolutions by creating intense anxiety about the dangers of sexual infidelity and mate poaching. On the other hand, the oedipal conflict in humans may facilitate successful cheating and mate poaching by cultivating a talent for hiding our true sexual intentions from others and even from ourselves. The oedipal conflict in humans may be disguised by evolutionary design in order to facilitate tactical deception in adult romantic relationships.

  19. An Investigation into Reliability of Knee Extension Muscle Strength Measurements, and into the Relationship between Muscle Strength and Means of Independent Mobility in the Ward: Examinations of Patients Who Underwent Femoral Neck Fracture Surgery

    PubMed Central

    Katoh, Munenori; Kaneko, Yoshihiro

    2014-01-01

    [Purpose] The purpose of the present study was to investigate the reliability of isometric knee extension muscle strength measurement of patients who underwent femoral neck fracture surgery, as well as the relationship between independent mobility in the ward and knee muscle strength. [Subjects] The subjects were 75 patients who underwent femoral neck fracture surgery. [Methods] We used a hand-held dynamometer and a belt to measure isometric knee extension muscle strength three times, and used intraclass correlation coefficients (ICCs) to investigate the reliability of the measurements. We used a receiver operating characteristic curve to investigate the cutoff values for independent walking with walking sticks and non-independent mobility. [Results] ICCs (1, 1) were 0.9 or higher. The cutoff value for independent walking with walking sticks was 0.289 kgf/kg on the non-fractured side, 0.193 kgf/kg on the fractured side, and the average of both limbs was 0.238 kgf/kg. [Conclusion] We consider that the test-retest reliability of isometric knee extension muscle strength measurement of patients who have undergone femoral neck fracture surgery is high. We also consider that isometric knee extension muscle strength is useful for investigating means of independent mobility in the ward. PMID:24567667
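
    A receiver operating characteristic cutoff like the one reported above is commonly chosen by maximizing Youden's J; a minimal sketch with made-up strength values (not the study's data):

```python
def best_cutoff(strength, independent):
    """Strength cutoff maximizing Youden's J = sensitivity + specificity - 1.
    `independent` holds 1 for patients who achieved independent walking."""
    best, best_j = None, -1.0
    for c in sorted(set(strength)):
        tp = sum(1 for s, y in zip(strength, independent) if s >= c and y)
        fn = sum(1 for s, y in zip(strength, independent) if s < c and y)
        tn = sum(1 for s, y in zip(strength, independent) if s < c and not y)
        fp = sum(1 for s, y in zip(strength, independent) if s >= c and not y)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best, best_j = c, j
    return best

# Hypothetical knee-extension strength (kgf/kg) and walking independence.
kgf_per_kg = [0.15, 0.18, 0.20, 0.25, 0.30, 0.35]
walks      = [0,    0,    0,    1,    1,    1]
cutoff = best_cutoff(kgf_per_kg, walks)
```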

  20. Photovoltaic performance and reliability workshop

    SciTech Connect

    Mrig, L.

    1995-11-01

    This conference proceedings volume is the compilation of papers presented at the eighth PV Performance and Reliability Workshop, held on September 7-8, 1995. Twenty-two of the twenty-six presentations that were submitted in a timely manner are included. This Workshop is one of the technology transfer mechanisms employed by the PV Performance and Engineering Project for exchanging technical information in the field of PV component and system reliability. At this time, 10- to 20-year warranties on many of the commercial modules are routinely being provided by PV module manufacturers. Warranties of several years are also being provided by inverter manufacturers. This is very encouraging, but there are at present no general warranties on system performance or system reliability, which is what I believe will be necessary in the future. As you will read in several of the presentations, many of the attributes that help define PV performance and reliability are still evolving. Specifically, the database on the performance and reliability of thin-film PV technologies is still very much under development. I believe a forum like this is an appropriate setting to exchange technical information in an open format. Toward that end, we were able to persuade many of the major players in the PV performance and reliability field to speak at this conference. These proceedings provide a good description of what these experts believe the major issues in this area are, from their perspective.

  1. Coupled oscillators on evolving networks

    NASA Astrophysics Data System (ADS)

    Singh, R. K.; Bagarti, Trilochan

    2016-12-01

    In this work we study coupled oscillators on evolving networks. We find that the steady-state behavior of the system is governed by the relative values of the spread in natural frequencies and the global coupling strength. For coupling that is strong in comparison to the spread in frequencies, the system of oscillators synchronizes; when both the coupling strength and the spread in frequencies are large, a phenomenon similar to amplitude death is observed. The network evolution provides a mechanism to build inter-oscillator connections, and once a dynamic equilibrium is achieved, oscillators evolve according to their local interactions. We also find that the steady-state properties change in the presence of additional time scales. We demonstrate these results with numerical calculations studying the dynamical evolution of limit-cycle and van der Pol oscillators.
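
    A standard way to see the synchronization-versus-spread behavior described above is the globally coupled Kuramoto model; a minimal sketch (parameter values are illustrative, and this is not the authors' exact model):

```python
import math, random

def kuramoto_step(theta, omega, k, dt=0.02):
    """One Euler step of the globally coupled Kuramoto model:
    dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    n = len(theta)
    return [t + dt * (w + k / n * sum(math.sin(s - t) for s in theta))
            for t, w in zip(theta, omega)]

def order_parameter(theta):
    """r in [0, 1]; r near 1 means the phases are synchronized."""
    n = len(theta)
    re = sum(math.cos(t) for t in theta) / n
    im = sum(math.sin(t) for t in theta) / n
    return math.hypot(re, im)

rng = random.Random(0)
theta = [rng.uniform(0, 2 * math.pi) for _ in range(30)]
omega = [rng.gauss(0, 0.5) for _ in range(30)]   # narrow frequency spread
for _ in range(1500):
    theta = kuramoto_step(theta, omega, k=4.0)   # coupling >> spread: synchrony
r = order_parameter(theta)
```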

  2. Evolvable Hardware for Space Applications

    NASA Technical Reports Server (NTRS)

    Lohn, Jason; Globus, Al; Hornby, Gregory; Larchev, Gregory; Kraus, William

    2004-01-01

    This article surveys the research of the Evolvable Systems Group at NASA Ames Research Center. Over the past few years, our group has developed the ability to use evolutionary algorithms in a variety of NASA applications, ranging from spacecraft antenna design, fault tolerance for programmable logic chips, atomic force field parameter fitting, and analog circuit design to Earth-observing satellite scheduling. In some of these applications, evolutionary algorithms match or improve on human performance.

  3. Ranking in evolving complex networks

    NASA Astrophysics Data System (ADS)

    Liao, Hao; Mariani, Manuel Sebastian; Medo, Matúš; Zhang, Yi-Cheng; Zhou, Ming-Yang

    2017-05-01

    Complex networks have emerged as a simple yet powerful framework to represent and analyze a wide range of complex systems. The problem of ranking the nodes and the edges in complex networks is critical for a broad range of real-world problems because it affects how we access online information and products, how success and talent are evaluated in human activities, and how scarce resources are allocated by companies and policymakers, among others. This calls for a deep understanding of how existing ranking algorithms perform and of the possible biases that may impair their effectiveness. Many popular ranking algorithms (such as Google's PageRank) are static in nature and, as a consequence, exhibit important shortcomings when applied to real networks that rapidly evolve in time. At the same time, recent advances in the understanding and modeling of evolving networks have enabled the development of a wide and diverse range of ranking algorithms that take the temporal dimension into account. The aim of this review is to survey the existing ranking algorithms, both static and time-aware, and their applications to evolving networks. We emphasize both the impact of network evolution on well-established static algorithms and the benefits of including the temporal dimension for tasks such as prediction of network traffic, prediction of future links, and identification of significant nodes.
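
    For reference, the static PageRank algorithm mentioned above can be sketched by power iteration on a tiny hypothetical graph (on a growing network this static score favors old nodes, which is the bias time-aware variants try to correct):

```python
def pagerank(edges, nodes, damping=0.85, iters=100):
    """Static PageRank by power iteration; edges are (src, dst) pairs.
    Dangling nodes spread their rank uniformly over all nodes."""
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    out = {v: [d for s, d in edges if s == v] for v in nodes}
    for _ in range(iters):
        nxt = {v: (1 - damping) / n for v in nodes}
        for v in nodes:
            targets = out[v] or list(nodes)
            share = damping * rank[v] / len(targets)
            for d in targets:
                nxt[d] += share
        rank = nxt
    return rank

nodes = ["a", "b", "c"]
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "a")]
ranks = pagerank(edges, nodes)
```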

  4. Evolving Systems and Adaptive Key Component Control

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Balas, Mark J.

    2009-01-01

    We propose a new framework called Evolving Systems to describe the self-assembly, or autonomous assembly, of actively controlled dynamical subsystems into an Evolved System with a higher purpose. An introduction to Evolving Systems and an exploration of the essential topics of their control and stability properties are provided. This chapter defines a framework for Evolving Systems, develops theory and control solutions for fundamental characteristics of Evolving Systems, and provides illustrative examples of Evolving Systems and their control with adaptive key component controllers.

  5. The emotion system promotes diversity and evolvability

    PubMed Central

    Giske, Jarl; Eliassen, Sigrunn; Fiksen, Øyvind; Jakobsen, Per J.; Aksnes, Dag L.; Mangel, Marc; Jørgensen, Christian

    2014-01-01

    Studies on the relationship between the optimal phenotype and its environment have had limited focus on genotype-to-phenotype pathways and their evolutionary consequences. Here, we study how multi-layered trait architecture and its associated constraints prescribe diversity. Using an idealized model of the emotion system in fish, we find that trait architecture yields genetic and phenotypic diversity even in the absence of frequency-dependent selection or environmental variation. That is, for a given environment, phenotype frequency distributions are predictable while gene pools are not. The conservation of phenotypic traits among these genetically different populations is due to the multi-layered trait architecture, in which one adaptation at a higher architectural level can be achieved by several different adaptations at a lower level. Our results emphasize the role of convergent evolution and the organismal level of selection. While trait architecture makes individuals more constrained than what has been assumed in optimization theory, the resulting populations are genetically more diverse and adaptable. The emotion system in animals may thus have evolved by natural selection because it simultaneously enhances three important functions: the behavioural robustness of individuals, the evolvability of gene pools and the rate of evolutionary innovation at several architectural levels. PMID:25100697

  6. You 3.0: The Most Important Evolving Technology

    ERIC Educational Resources Information Center

    Tamarkin, Molly; Bantz, David A.; Childs, Melody; diFilipo, Stephen; Landry, Stephen G.; LoPresti, Frances; McDonald, Robert H.; McGuthry, John W.; Meier, Tina; Rodrigo, Rochelle; Sparrow, Jennifer; Diggs, D. Teddy; Yang, Catherine W.

    2010-01-01

    That technology evolves is a given. Not as well understood is the impact of technological evolution on each individual--on oneself, one's skill development, one's career, and one's relationship with the work community. The authors believe that everyone in higher education will become an IT worker and that IT workers will be managing a growing…

  8. The evolvability of programmable hardware

    PubMed Central

    Raman, Karthik; Wagner, Andreas

    2011-01-01

    In biological systems, individual phenotypes are typically adopted by multiple genotypes. Examples include protein structure phenotypes, where each structure can be adopted by a myriad individual amino acid sequence genotypes. These genotypes form vast connected ‘neutral networks’ in genotype space. The size of such neutral networks endows biological systems not only with robustness to genetic change, but also with the ability to evolve a vast number of novel phenotypes that occur near any one neutral network. Whether technological systems can be designed to have similar properties is poorly understood. Here we ask this question for a class of programmable electronic circuits that compute digital logic functions. The functional flexibility of such circuits is important in many applications, including applications of evolutionary principles to circuit design. The functions they compute are at the heart of all digital computation. We explore a vast space of 10^45 logic circuits (‘genotypes’) and 10^19 logic functions (‘phenotypes’). We demonstrate that circuits that compute the same logic function are connected in large neutral networks that span circuit space. Their robustness or fault-tolerance varies very widely. The vicinity of each neutral network contains circuits with a broad range of novel functions. Two circuits computing different functions can usually be converted into one another via few changes in their architecture. These observations show that properties important for the evolvability of biological systems exist in a commercially important class of electronic circuitry. They also point to generic ways to generate fault-tolerant, adaptable and evolvable electronic circuitry. PMID:20534598

  9. Regolith Evolved Gas Analyzer (REGA)

    NASA Technical Reports Server (NTRS)

    Allen, Carlton C.; McKay, David S.

    1997-01-01

    The instrument consists of five subsystems: (1) a programmable furnace which can be loaded with samples of regolith, (2) a mass spectrometer which detects and measures atmospheric gases or gases evolved during heating, (3) a tank of pressurized gas which can be introduced to the regolith material while detecting and measuring volatile reaction products, (4) a mechanism for dumping the regolith sample and repeating the experiment on a fresh sample, and (5) a data system which controls and monitors the furnace, gas system, and mass spectrometer.

  10. The 'E' factor -- evolving endodontics.

    PubMed

    Hunter, M J

    2013-03-01

    Endodontics is a constantly developing field, with new instruments, preparation techniques and sealants competing with trusted and traditional approaches to tooth restoration. Thus general dental practitioners must question and understand the significance of these developments before adopting new practices. In view of this, the aim of this article, and the associated presentation at the 2013 British Dental Conference & Exhibition, is to provide an overview of endodontic methods and constantly evolving best practice. The presentation will review current preparation techniques, comparing rotary versus reciprocation, and question current trends in restoration of the endodontically treated tooth.

  12. Improving Evolvability through Generative Representations

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2004-01-01

    One of the main limitations of computer automated design systems is the representation used for encoding designs. Using computer programs as an analogy, representations can be thought of as having the properties of combination, control-flow and abstraction. Generative representations are those which have the ability to reuse elements in an encoding through either iteration, a form of control-flow, or abstraction. Here we argue that generative representations improve the evolvability of designs by capturing design dependencies in a way that makes them easier to change, and we support this with examples from two design substrates.

  13. Languages evolve in punctuational bursts.

    PubMed

    Atkinson, Quentin D; Meade, Andrew; Venditti, Chris; Greenhill, Simon J; Pagel, Mark

    2008-02-01

    Linguists speculate that human languages often evolve in rapid or punctuational bursts, sometimes associated with their emergence from other languages, but this phenomenon has never been demonstrated. We used vocabulary data from three of the world's major language groups-Bantu, Indo-European, and Austronesian-to show that 10 to 33% of the overall vocabulary differences among these languages arose from rapid bursts of change associated with language-splitting events. Our findings identify a general tendency for increased rates of linguistic evolution in fledgling languages, perhaps arising from a linguistic founder effect or a desire to establish a distinct social identity.

  14. A slowly evolving host moves first in symbiotic interactions

    NASA Astrophysics Data System (ADS)

    Damore, James; Gore, Jeff

    2011-03-01

    Symbiotic relationships, both parasitic and mutualistic, are ubiquitous in nature. Understanding how these symbioses evolve, from bacteria and their phages to humans and our gut microflora, is crucial in understanding how life operates. Often, symbioses consist of a slowly evolving host species with each host only interacting with its own sub-population of symbionts. The Red Queen hypothesis describes coevolutionary relationships as constant arms races with each species rushing to evolve an advantage over the other, suggesting that faster evolution is favored. Here, we use a simple game theoretic model of host-symbiont coevolution that includes population structure to show that if the symbionts evolve much faster than the host, the equilibrium distribution is the same as it would be if it were a sequential game where the host moves first against its symbionts. For the slowly evolving host, this will prove to be advantageous in mutualisms and a handicap in antagonisms. The model allows for symbiont adaptation to its host, a result that is robust to changes in the parameters and generalizes to continuous and multiplayer games. Our findings provide insight into a wide range of symbiotic phenomena and help to unify the field of coevolutionary theory.

  15. Transport on randomly evolving trees

    NASA Astrophysics Data System (ADS)

    Pál, L.

    2005-11-01

    The time process of transport on randomly evolving trees is investigated. By introducing the notions of living and dead nodes, a model of random tree evolution is constructed which describes the spreading in time of objects corresponding to nodes. It is assumed that at t=0 the tree consists of a single living node (root), from which the evolution may begin. At a certain time instant τ⩾0, the root produces ν⩾0 living nodes connected by lines to the root which becomes dead at the moment of the offspring production. In the evolution process each of the new living nodes evolves further like a root independently of the others. By using the methods of the age-dependent branching processes we derive the joint distribution function of the numbers of living and dead nodes, and determine the correlation between these node numbers as a function of time. It is proved that the correlation function converges to √3/2 independently of the distributions of ν and τ when q1→1 and t→∞. Also analyzed are the stochastic properties of the end nodes; and the correlation between the numbers of living and dead end nodes is shown to change its character suddenly at the very beginning of the evolution process. The survival probability of random trees is investigated and expressions are derived for this probability.
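The living/dead node process described above is easy to explore numerically. The sketch below is a Monte Carlo toy assuming exponential lifetimes and a Poisson offspring number; both distributions are illustrative assumptions, not the general ones analyzed in the paper.

```python
import math
import random

def simulate_tree(t_max, mean_offspring=1.0, rate=1.0, seed=0):
    """Simulate one random evolving tree up to time t_max.

    Each living node waits an Exp(rate) time, then dies and produces a
    Poisson(mean_offspring) number of living children, mirroring the
    root's behaviour. Returns (living, dead) node counts at t_max.
    """
    rng = random.Random(seed)
    living = [rng.expovariate(rate)]   # pending death times; root at t=0
    dead = 0
    while living and min(living) <= t_max:
        t = min(living)
        living.remove(t)
        dead += 1
        # Poisson sample via Knuth's multiplication method
        limit, k, p = math.exp(-mean_offspring), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                break
            k += 1
        for _ in range(k):
            living.append(t + rng.expovariate(rate))
    return len(living), dead

living, dead = simulate_tree(2.0)
```

Averaging such runs over many seeds would estimate the joint living/dead distribution whose exact form the paper derives analytically.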

  16. Transport on randomly evolving trees.

    PubMed

    Pál, L

    2005-11-01

    The time process of transport on randomly evolving trees is investigated. By introducing the notions of living and dead nodes, a model of random tree evolution is constructed which describes the spreading in time of objects corresponding to nodes. It is assumed that at t=0 the tree consists of a single living node (root), from which the evolution may begin. At a certain time instant τ⩾0, the root produces ν⩾0 living nodes connected by lines to the root which becomes dead at the moment of the offspring production. In the evolution process each of the new living nodes evolves further like a root independently of the others. By using the methods of the age-dependent branching processes we derive the joint distribution function of the numbers of living and dead nodes, and determine the correlation between these node numbers as a function of time. It is proved that the correlation function converges to √3/2 independently of the distributions of ν and τ when q1→1 and t→∞. Also analyzed are the stochastic properties of the end nodes; and the correlation between the numbers of living and dead end nodes is shown to change its character suddenly at the very beginning of the evolution process. The survival probability of random trees is investigated and expressions are derived for this probability.

  17. Stability of Evolving Multiagent Systems.

    PubMed

    De Wilde, P; Briscoe, G

    2011-08-01

    A multiagent system is a distributed system where the agents or nodes perform complex functions that cannot be written down in analytic form. Multiagent systems are highly connected, and the information they contain is mostly stored in the connections. When agents update their state, they take into account the state of the other agents, and they have access to those states via the connections. There is also external user-generated input into the multiagent system. As so much information is stored in the connections, agents are often memoryless. This memoryless property, together with the randomness of the external input, has allowed us to model multiagent systems using Markov chains. In this paper, we look at multiagent systems that evolve, i.e., the number of agents varies according to the fitness of the individual agents. We extend our Markov chain model and define stability. This is the start of a methodology to control multiagent systems. We then build upon this to construct an entropy-based definition for the degree of instability (entropy of the limit probabilities), which we use to perform a stability analysis. We then investigate the stability of evolving agent populations through simulation and show that the results are consistent with the original definition of stability in nonevolving multiagent systems, proposed by Chli and De Wilde. This paper forms the theoretical basis for the construction of digital business ecosystems, and applications have been reported elsewhere.
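The entropy-of-limit-probabilities idea can be illustrated on a toy chain. The sketch below computes the stationary distribution of a small row-stochastic matrix by power iteration and its Shannon entropy as a degree-of-instability measure; the two-state chain is an invented example, not one from the paper.

```python
import math

def stationary(P, tol=1e-12):
    """Stationary distribution of a row-stochastic matrix P (power iteration)."""
    n = len(P)
    pi = [1.0 / n] * n
    while True:
        nxt = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if sum(abs(a - b) for a, b in zip(nxt, pi)) < tol:
            return nxt
        pi = nxt

def entropy(pi):
    """Shannon entropy (in nats) of the limit probabilities."""
    return -sum(p * math.log(p) for p in pi if p > 0)

# Illustrative two-state chain: state 0 is "sticky", state 1 is transient-ish.
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary(P)
H = entropy(pi)
```

A chain that concentrates its limit probability on few states yields low entropy (more stable by this measure); a uniform limit distribution yields the maximum, log n.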

  18. canEvolve: A Web Portal for Integrative Oncogenomics

    PubMed Central

    Yan, Zhenyu; Wang, Xujun; Cao, Qingyi; Munshi, Nikhil C.; Li, Cheng

    2013-01-01

    Background & Objective Genome-wide profiles of tumors obtained using functional genomics platforms are being deposited to the public repositories at an astronomical scale, as a result of focused efforts by individual laboratories and large projects such as the Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium. Consequently, there is an urgent need for reliable tools that integrate and interpret these data in light of current knowledge and disseminate results to biomedical researchers in a user-friendly manner. We have built the canEvolve web portal to meet this need. Results canEvolve query functionalities are designed to fulfill the most frequent analysis needs of cancer researchers with a view to generating novel hypotheses. canEvolve stores gene, microRNA (miRNA) and protein expression profiles, copy number alterations for multiple cancer types, and protein-protein interaction information. canEvolve allows querying of results of primary analysis, integrative analysis and network analysis of oncogenomics data. The querying for primary analysis includes differential gene and miRNA expression as well as changes in gene copy number measured with SNP microarrays. canEvolve provides results of integrative analysis of gene expression profiles with copy number alterations and with miRNA profiles as well as generalized integrative analysis using gene set enrichment analysis. The network analysis capability includes storage and visualization of gene co-expression, inferred gene regulatory networks and protein-protein interaction information. Finally, canEvolve provides correlations between gene expression and clinical outcomes in terms of univariate survival analysis. Conclusion At present canEvolve provides different types of information extracted from 90 cancer genomics studies comprising more than 10,000 patients. 
The presence of multiple data types, novel integrative analysis for identifying regulators of oncogenesis, network analysis and ability

  19. Neural control hierarchy of the heart has not evolved to deal with myocardial ischemia.

    PubMed

    Kember, G; Armour, J A; Zamir, M

    2013-08-01

    The consequences of myocardial ischemia are examined from the standpoint of the neural control system of the heart, a hierarchy of three neuronal centers residing in central command, intrathoracic ganglia, and intrinsic cardiac ganglia. The basis of the investigation is the premise that while this hierarchical control system has evolved to deal with "normal" physiological circumstances, its response in the event of myocardial ischemia is unpredictable because the singular circumstances of this event are as yet not part of its evolutionary repertoire. The results indicate that the harmonious relationship between the three levels of control breaks down, because of a conflict between the priorities that they have evolved to deal with. Essentially, while the main priority in central command is blood demand, the priority at the intrathoracic and cardiac levels is heart rate. As a result of this breakdown, heart rate becomes less predictable and therefore less reliable as a diagnostic guide to the traumatic state of the heart, a role it commonly serves following an ischemic event. On the basis of these results it is proposed that under the singular conditions of myocardial ischemia a determination of neural control indexes in addition to cardiovascular indexes has the potential of enhancing clinical outcome.

  20. Primordial evolvability: Impasses and challenges.

    PubMed

    Vasas, Vera; Fernando, Chrisantha; Szilágyi, András; Zachár, István; Santos, Mauro; Szathmáry, Eörs

    2015-09-21

    While it is generally agreed that some kind of replicating non-living compounds were the precursors of life, there is much debate over their possible chemical nature. Metabolism-first approaches propose that mutually catalytic sets of simple organic molecules could be capable of self-replication and rudimentary chemical evolution. In particular, the graded autocatalysis replication domain (GARD) model, depicting assemblies of amphiphilic molecules, has received considerable interest. The system propagates compositional information across generations and is suggested to be a target of natural selection. However, evolutionary simulations indicate that the system lacks selectability (i.e. selection has negligible effect on the equilibrium concentrations). We elaborate on the lessons learnt from the example of the GARD model and, more widely, on the issue of evolvability, and discuss the implications for similar metabolism-first scenarios. We found that simple incorporation-type chemistry based on non-covalent bonds, as assumed in GARD, is unlikely to result in alternative autocatalytic cycles when catalytic interactions are randomly distributed. An even more serious problem stems from the lognormal distribution of catalytic factors, causing inherent kinetic instability of such loops, due to the dominance of efficiently catalyzed components that fail to return catalytic aid. Accordingly, the dynamics of the GARD model is dominated by strongly catalytic, but not auto-catalytic, molecules. Without effective autocatalysis, stable hereditary propagation is not possible. Many repetitions and different scaling of the model come to no rescue. Despite all attempts to show the contrary, the GARD model is not evolvable, in contrast to reflexively autocatalytic networks, complemented by rare uncatalyzed reactions and compartmentation. 
The latter networks, resting on the creation and breakage of chemical bonds, can generate novel ('mutant') autocatalytic loops from a given set of

  1. Reliability computation from reliability block diagrams

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.; Eckstein, E. Y.

    1975-01-01

    Computer program computes system reliability for very general class of reliability block diagrams. Four factors are considered in calculating probability of system success: active block redundancy, standby block redundancy, partial redundancy, and presence of equivalent blocks in the diagram.
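The redundancy factors listed above map onto textbook reliability formulas. The sketch below assumes independent blocks with known success probabilities; the example diagram is invented for illustration, not taken from the program described.

```python
from math import comb

def series(rs):
    """Series blocks: the system works only if every block works."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(rs):
    """Active redundancy: the system works if at least one block works."""
    q = 1.0
    for r in rs:
        q *= (1.0 - r)
    return 1.0 - q

def k_of_n(k, n, r):
    """Partial redundancy: at least k of n identical independent blocks work."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

# Illustrative diagram: two redundant pumps (r = 0.9 each) in series
# with a single controller (r = 0.99).
system = series([parallel([0.9, 0.9]), 0.99])
```

Standby redundancy, the fourth factor, additionally depends on switch reliability and the dormant failure rate, so it needs a failure-time model rather than a single success probability.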

  2. Evolving phenotype of Marfan's syndrome

    PubMed Central

    Lipscomb, K.; Clayton-Smith, J.; Harris, R.

    1997-01-01

    Accepted 20 August 1996
AIM—To examine evolution of the physical characteristics of Marfan's syndrome throughout childhood.
METHODS—40 children were ascertained during the development of a regional register for Marfan's syndrome. Evolution of the clinical characteristics was determined by repeat evaluation of 10 patients with sporadic Marfan's syndrome and 30 with a family history of the condition. DNA marker studies were used to facilitate diagnosis in those with the familial condition.
RESULTS—Musculoskeletal features predominated and evolved throughout childhood. Gene tracking enabled early diagnosis in children with familial Marfan's syndrome.
CONCLUSIONS—These observations may aid the clinical diagnosis of Marfan's syndrome in childhood, especially in those with the sporadic condition. Gene tracking has a role in the early diagnosis of familial Marfan's syndrome, allowing appropriate follow up and preventive care.

 PMID:9059160

  3. Isotopic Analysis and Evolved Gases

    NASA Technical Reports Server (NTRS)

    Swindle, Timothy D.; Boynton, William V.; Chutjian, Ara; Hoffman, John H.; Jordan, Jim L.; Kargel, Jeffrey S.; McEntire, Richard W.; Nyquist, Larry

    1996-01-01

    Precise measurements of the chemical, elemental, and isotopic composition of planetary surface material and gases, and observed variations in these compositions, can contribute significantly to our knowledge of the source(s), ages, and evolution of solar system materials. The analyses discussed in this paper are mostly made by mass spectrometers or some other type of mass analyzer, and address three broad areas of interest: (1) atmospheric composition - isotopic, elemental, and molecular, (2) gases evolved from solids, and (3) solids. Current isotopic data on nine elements, mostly from in situ analysis, but also from meteorites and telescopic observations, are summarized. Potential instruments for isotopic analysis of the surfaces of the Moon, Mars, Venus, Mercury, and Pluto, as well as of asteroids, comets, and icy satellites, are discussed.

  4. [The evolving of cardiac interventions].

    PubMed

    Billinger, Michael

    2014-12-01

    Treatment modalities for heart diseases have evolved considerably during the last 20 years. Coronary and valvular heart disease are increasingly treated by less invasive percutaneous catheter-based procedures instead of open-heart surgery. In addition, new cutting-edge interventions make it possible to cure heart diseases for which until recently only medical treatment options were available. Whilst many patients benefit from these innovative therapies, rapidly developing technologies potentially carry the risk of overtreatment. In order to select patients for the most appropriate treatment, intensive interdisciplinary teamwork between cardiologists and cardiac surgeons is a mandatory requirement. Additionally, knowledge transfer between cardiologists, their growing subspecialties and practitioners should be encouraged. Finally, timely scientific evaluation of new therapies and their subsequent incorporation in guidelines remains crucial.

  5. Rapidly evolving homing CRISPR barcodes.

    PubMed

    Kalhor, Reza; Mali, Prashant; Church, George M

    2017-02-01

    We present an approach for engineering evolving DNA barcodes in living cells. A homing guide RNA (hgRNA) scaffold directs the Cas9-hgRNA complex to the DNA locus of the hgRNA itself. We show that this homing CRISPR-Cas9 system acts as an expressed genetic barcode that diversifies its sequence and that the rate of diversification can be controlled in cultured cells. We further evaluate these barcodes in cell populations and show that they can be used to record lineage history and that the barcode RNA can be amplified in situ, a prerequisite for in situ sequencing. This integrated approach will have wide-ranging applications, such as in deep lineage tracing, cellular barcoding, molecular recording, dissecting cancer biology, and connectome mapping.

  6. The evolving Gleason grading system.

    PubMed

    Chen, Ni; Zhou, Qiao

    2016-02-01

    The Gleason grading system for prostate adenocarcinoma has evolved from its original scheme established in the 1960s-1970s, to a significantly modified system after two major consensus meetings conducted by the International Society of Urologic Pathology (ISUP) in 2005 and 2014, respectively. The Gleason grading system has been incorporated into the WHO classification of prostate cancer, the AJCC/UICC staging system, and the NCCN guidelines as one of the key factors in treatment decisions. Both pathologists and clinicians need to fully understand the principles and practice of this grading system. We here briefly review the historical aspects of the original scheme and the recent developments of the Gleason grading system, focusing on major changes over the years that resulted in the modern Gleason grading system, which has led to a new "Grade Group" system proposed by the 2014 ISUP consensus, and adopted by the 2016 WHO classification of tumours of the prostate.

  7. The evolving Gleason grading system

    PubMed Central

    Chen, Ni

    2016-01-01

    The Gleason grading system for prostate adenocarcinoma has evolved from its original scheme established in the 1960s–1970s, to a significantly modified system after two major consensus meetings conducted by the International Society of Urologic Pathology (ISUP) in 2005 and 2014, respectively. The Gleason grading system has been incorporated into the WHO classification of prostate cancer, the AJCC/UICC staging system, and the NCCN guidelines as one of the key factors in treatment decisions. Both pathologists and clinicians need to fully understand the principles and practice of this grading system. We here briefly review the historical aspects of the original scheme and the recent developments of the Gleason grading system, focusing on major changes over the years that resulted in the modern Gleason grading system, which has led to a new “Grade Group” system proposed by the 2014 ISUP consensus, and adopted by the 2016 WHO classification of tumours of the prostate. PMID:27041927

  8. Evolving networks by merging cliques

    NASA Astrophysics Data System (ADS)

    Takemoto, Kazuhiro; Oosawa, Chikoo

    2005-10-01

    We propose a model for evolving networks by merging building blocks represented as complete graphs, reminiscent of modules in biological systems or communities in sociology. The model shows power-law degree distributions, power-law clustering spectra, and high average clustering coefficients independent of network size. The analytical solutions indicate that the degree exponent is determined by the ratio of the number of merging nodes to that of all nodes in the blocks, demonstrating that the exponent is tunable; the solutions also apply when the blocks are classical networks such as Erdős-Rényi or regular graphs. Under a specific condition, our model reduces to the Barabási-Albert model.
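A model of this kind is straightforward to simulate. The sketch below grows a network by attaching complete graphs and merging a fixed number of each new block's nodes with randomly chosen existing nodes; the parameter names and merging-rule details are assumptions for illustration, not necessarily the authors' exact model.

```python
import random

def merge_cliques(n_blocks, block_size, n_merge, seed=0):
    """Grow a network by repeatedly attaching cliques of `block_size` nodes,
    merging `n_merge` of each new clique's nodes with randomly chosen
    existing nodes. Returns an adjacency mapping node -> set of neighbours.
    """
    rng = random.Random(seed)
    # start from a single clique on nodes 0..block_size-1
    adj = {i: set(range(block_size)) - {i} for i in range(block_size)}
    for _ in range(n_blocks - 1):
        old = rng.sample(sorted(adj), n_merge)            # merge targets
        new = [max(adj) + 1 + i for i in range(block_size - n_merge)]
        for v in new:
            adj[v] = set()
        members = old + new
        for a in members:                                 # wire the clique
            for b in members:
                if a != b:
                    adj[a].add(b)
    return adj

g = merge_cliques(n_blocks=50, block_size=4, n_merge=1)
```

Nodes that happen to be chosen as merge targets repeatedly accumulate degree, which is the mechanism behind the heavy-tailed degree distributions the abstract reports.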

  9. Behavioural plasticity in evolving robots.

    PubMed

    Carvalho, Jônata Tyska; Nolfi, Stefano

    2016-12-01

    In this paper, we show how the development of plastic behaviours, i.e., behaviour displaying a modular organisation characterised by behavioural subunits that are alternated in a context-dependent manner, can enable evolving robots to solve their adaptive task more efficiently also when it does not require the accomplishment of multiple conflicting functions. The comparison of the results obtained in different experimental conditions indicates that the most important prerequisites for the evolution of behavioural plasticity are: the possibility to generate and perceive affordances (i.e., opportunities for behaviour execution), the possibility to rely on flexible regulatory processes that exploit both external and internal cues, and the possibility to realise smooth and effective transitions between behaviours.

  10. Speech processing: An evolving technology

    SciTech Connect

    Crochiere, R.E.; Flanagan, J.L.

    1986-09-01

    As we enter the information age, speech processing is emerging as an important technology for making machines easier and more convenient for humans to use. It is both an old and a new technology - dating back to the invention of the telephone and forward, at least in aspirations, to the capabilities of HAL in 2001. Explosive advances in microelectronics now make it possible to implement economical real-time hardware for sophisticated speech processing - processing that formerly could be demonstrated only in simulations on main-frame computers. As a result, fundamentally new product concepts - as well as new features and functions in existing products - are becoming possible and are being explored in the marketplace. As the introductory piece to this issue, the authors draw a brief perspective on the evolving field of speech processing and assess the technology in the three constituent sectors: speech coding, synthesis, and recognition.

  11. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  12. Carl Thoresen: The Evolving Pioneer

    ERIC Educational Resources Information Center

    Harris, Alex H. S.

    2009-01-01

    This interview with Carl E. Thoresen highlights the experiences, relationships, and ideas that have influenced this pioneering psychologist throughout the past half century. His scholarly work, professional service, teaching, and mentorship have motivated many counseling psychologists to radically expand their areas of inquiry. He was among the…

  14. Evolving toward Laughter in Learning

    ERIC Educational Resources Information Center

    Strean, William B.

    2008-01-01

    Lowman (1995) described the relationship between teacher and student and student engagement as the two most important ingredients in learning in higher education. Humour builds teacher-student connection (Berk, 1998) and engages students in the learning process. The bond between student and teacher is essential for learning, satisfaction, and…

  15. Increased longevity evolves from grandmothering.

    PubMed

    Kim, Peter S; Coxworth, James E; Hawkes, Kristen

    2012-12-22

    Postmenopausal longevity may have evolved in our lineage when ancestral grandmothers subsidized their daughters' fertility by provisioning grandchildren, but the verbal hypothesis has lacked mathematical support until now. Here, we present a formal simulation in which life spans similar to those of modern chimpanzees lengthen into the modern human range as a consequence of grandmother effects. Greater longevity raises the chance of living through the fertile years but is opposed by costs that differ for the sexes. Our grandmother assumptions are restrictive. Only females who are no longer fertile themselves are eligible, and female fertility extends to age 45 years. Initially, there are very few eligible grandmothers and effects are small. Grandmothers can support only one dependent at a time and do not care selectively for their daughters' offspring. They must take the oldest juveniles still relying on mothers; and infants under the age of 2 years are never eligible for subsidy. Our model includes no assumptions about brains, learning or pair bonds. Grandmother effects alone are sufficient to propel the doubling of life spans in less than sixty thousand years.

  16. Circumstellar Crystalline Silicates: Evolved Stars

    NASA Astrophysics Data System (ADS)

    Tartar, Josh; Speck, A. K.

    2008-05-01

    One of the most exciting developments in astronomy in the last 15 years was the discovery of crystalline silicate stardust by the Short Wavelength Spectrometer (SWS) on board ISO; the discovery of the crystalline grains was indeed one of the biggest surprises of the ISO mission. Initially discovered around AGB stars (evolved stars in the range 0.8 < M/M⊙ < 8) at far-infrared (IR) wavelengths, crystalline silicates have since been seen in many astrophysical environments, including young stellar objects (T Tauri and Herbig Ae/Be), comets and Ultra Luminous Infrared Galaxies. Low- and intermediate-mass stars (LIMS) comprise 95% of the contributors to the ISM, so study of the formation of crystalline silicates is critical to our understanding of the ISM, which is thought to be primarily amorphous (one would expect an almost exact match between the composition of AGB dust shells and the dust in the ISM). Whether the crystalline dust is merely undetectable or amorphized remains a mystery. The FORCAST instrument on SOFIA as well as the PACS instrument on Herschel will provide exciting observing opportunities for the further study of crystalline silicates.

  17. Multicopy Suppression Underpins Metabolic Evolvability

    PubMed Central

    Patrick, Wayne M.; Quandt, Erik M.; Swartzlander, Dan B.; Matsumura, Ichiro

    2009-01-01

    Our understanding of the origins of new metabolic functions is based upon anecdotal genetic and biochemical evidence. Some auxotrophies can be suppressed by overexpressing substrate-ambiguous enzymes (i.e., those that catalyze the same chemical transformation on different substrates). Other enzymes exhibit weak but detectable catalytic promiscuity in vitro (i.e., they catalyze different transformations on similar substrates). Cells adapt to novel environments through the evolution of these secondary activities, but neither their chemical natures nor their frequencies of occurrence have been characterized en bloc. Here, we systematically identified multifunctional genes within the Escherichia coli genome. We screened 104 single-gene knockout strains and discovered that many (20%) of these auxotrophs were rescued by the overexpression of at least one noncognate E. coli gene. The deleted gene and its suppressor were generally unrelated, suggesting that promiscuity is a product of contingency. This genome-wide survey demonstrates that multifunctional genes are common and illustrates the mechanistic diversity by which their products enhance metabolic robustness and evolvability. PMID:17884825

  18. Magnetic fields around evolved stars

    NASA Astrophysics Data System (ADS)

    Leal-Ferreira, M.; Vlemmings, W.; Kemball, A.; Amiri, N.; Maercker, M.; Ramstedt, S.; Olofsson, G.

    2014-04-01

    A number of mechanisms, such as magnetic fields, (binary) companions and circumstellar disks, have been suggested as the cause of non-spherical PNe and in particular collimated outflows. This work investigates one of these mechanisms: magnetic fields. While MHD simulations show that such fields can indeed be important, few observations of magnetic fields have been made so far. We used the VLBA to observe five evolved stars, with the goal of detecting the magnetic field by means of water maser polarization. The sample consists of four AGB stars (IK Tau, RT Vir, IRC+60370 and AP Lyn) and one pPN (OH231.8+4.2). In four of the five sources, several strong maser features were detected, allowing us to measure the linear and/or circular polarization. Based on the circular polarization detections, we infer the strength of the component of the field along the line of sight to be between ~30 mG and ~330 mG in the water maser regions of these four sources. When extrapolated to the surface of the stars, the magnetic field strength would be between a few hundred mG and a few Gauss when assuming a toroidal field geometry, and higher when assuming more complex magnetic fields. We conclude that the magnetic energy we derived in the water maser regions is higher than the thermal and kinetic energy, leading to the conclusion that magnetic fields probably do play an important role in shaping planetary nebulae.

  19. How do drumlin patterns evolve?

    NASA Astrophysics Data System (ADS)

    Ely, Jeremy; Clark, Chris; Spagnolo, Matteo; Hughes, Anna

    2016-04-01

    The flow of a geomorphic agent over a sediment bed creates patterns in the substrate composed of bedforms. Ice is no exception to this, organising soft sedimentary substrates into subglacial bedforms. As we are yet to fully observe their initiation and evolution beneath a contemporary ice mass, little is known about how patterns in subglacial bedforms develop. Here we study 36,222 drumlins, divided into 72 flowsets, left behind by the former British-Irish Ice sheet. These flowsets provide us with 'snapshots' of drumlin pattern development. The probability distribution functions of the size and shape metrics of drumlins within these flowsets were analysed to determine whether behaviour that is common of other patterned phenomena has occurred. Specifically, we ask whether drumlins i) are printed at a specific scale; ii) grow or shrink after they initiate; iii) stabilise at a specific size and shape; and iv) migrate. Our results indicate that drumlins initiate at a minimum size and spacing. After initiation, the log-normal distribution of drumlin size and shape metrics suggests that drumlins grow, or possibly shrink, as they develop. We find no evidence for stabilisation in drumlin length, supporting the idea of a subglacial bedform continuum. Drumlin migration is difficult to determine from the palaeo-record. However, there are some indications that a mixture of static and mobile drumlins occurs, which could potentially lead to collisions, cannibalisation and coarsening. Further images of modern drumlin fields evolving beneath ice are required to capture stages of drumlin pattern evolution.

  20. Increased longevity evolves from grandmothering

    PubMed Central

    Kim, Peter S.; Coxworth, James E.; Hawkes, Kristen

    2012-01-01

    Postmenopausal longevity may have evolved in our lineage when ancestral grandmothers subsidized their daughters' fertility by provisioning grandchildren, but the verbal hypothesis has lacked mathematical support until now. Here, we present a formal simulation in which life spans similar to those of modern chimpanzees lengthen into the modern human range as a consequence of grandmother effects. Greater longevity raises the chance of living through the fertile years but is opposed by costs that differ for the sexes. Our grandmother assumptions are restrictive. Only females who are no longer fertile themselves are eligible, and female fertility extends to age 45 years. Initially, there are very few eligible grandmothers and effects are small. Grandmothers can support only one dependent at a time and do not care selectively for their daughters' offspring. They must take the oldest juveniles still relying on mothers; and infants under the age of 2 years are never eligible for subsidy. Our model includes no assumptions about brains, learning or pair bonds. Grandmother effects alone are sufficient to propel the doubling of life spans in less than sixty thousand years. PMID:23097518

  1. Recommendation in evolving online networks

    NASA Astrophysics Data System (ADS)

    Hu, Xiao; Zeng, An; Shang, Ming-Sheng

    2016-02-01

    A recommender system is an effective tool for finding the most relevant information for online users. By analyzing the historical selection records of users, a recommender system predicts the most likely future links in the user-item network and accordingly constructs a personalized recommendation list for each user. So far, the recommendation process has mostly been investigated in static user-item networks. In this paper, we propose a model that allows us to examine the performance of state-of-the-art recommendation algorithms in evolving networks. We find that recommendation accuracy in general decreases with time if the evolution of the online network fully depends on the recommendation. Interestingly, some randomness in users' choices can significantly improve the long-term accuracy of the recommendation algorithm. When a hybrid recommendation algorithm is applied, we find that the optimal parameter gradually shifts towards the diversity-favoring recommendation algorithm, indicating that recommendation diversity is essential to keeping a high long-term recommendation accuracy. Finally, we confirm our conclusions by studying recommendation on networks with real evolution data.
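
    As a toy illustration of the link-prediction task described above (not the paper's model), unseen items in a small user-item network can be scored by how often they co-occur with a user's past selections; the user and item names below are invented for the example.

```python
# Hypothetical sketch: common-neighbours link prediction in a user-item network.
history = {
    "alice": {"a", "b"},
    "bob": {"a", "b", "c"},
    "carol": {"b", "c", "d"},
}

def recommend(user, k=2):
    seen = history[user]
    scores = {}
    for other, items in history.items():
        if other == user:
            continue
        overlap = len(seen & items)  # shared past selections with the other user
        for item in items - seen:    # score only items the user has not chosen yet
            scores[item] = scores.get(item, 0) + overlap
    # rank by score (ties broken alphabetically) and keep the top-k
    return sorted(scores, key=lambda i: (-scores[i], i))[:k]

print(recommend("alice"))  # ['c', 'd']
```

    Real recommenders weight these counts (e.g. by degree) and, as the abstract argues, must also account for how their own suggestions reshape the network over time.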

  2. The evolving defense communications system

    NASA Astrophysics Data System (ADS)

    Testa, Ann M.; Jones, Walter I.

    1992-05-01

    Command, control, and communications (C3) systems 'help lift the fog of war that adds uncertainty to any military operation.' They multiply the effectiveness of weapon systems and are critical components of our nation's warfighting capability. One of these critical systems is the Defense Communications System (DCS) which evolved over the past 30 years. Several factors drove this evolution, including constrained budgets, the need to improve the effectiveness and efficiency of the service provided, compatibility and interoperability, and technological advances. Based on lessons learned from Desert Shield/Desert Storm and the changing environment, force structure, and strategy, it is time to advance the DCS to its next stage. The future DCS must be flexible enough to adapt to any situation anywhere in the world. Mobile, modular building block packages of communications equipment must be available to provide effective communications capability to deployed units immediately upon arrival. Total integration and interoperability among military, commercial, and other government agencies' communication systems is a must if survivable, robust connectivity is going to be available when needed. Integration planning must begin now.

  3. Multiscale modelling of evolving foams

    NASA Astrophysics Data System (ADS)

    Saye, R. I.; Sethian, J. A.

    2016-06-01

    We present a set of multi-scale interlinked algorithms to model the dynamics of evolving foams. These algorithms couple the key effects of macroscopic bubble rearrangement, thin film drainage, and membrane rupture. For each of the mechanisms, we construct consistent and accurate algorithms, and couple them together to work across the wide range of space and time scales that occur in foam dynamics. These algorithms include second order finite difference projection methods for computing incompressible fluid flow on the macroscale, second order finite element methods to solve thin film drainage equations in the lamellae and Plateau borders, multiphase Voronoi Implicit Interface Methods to track interconnected membrane boundaries and capture topological changes, and Lagrangian particle methods for conservative liquid redistribution during rearrangement and rupture. We derive a full set of numerical approximations that are coupled via interface jump conditions and flux boundary conditions, and show convergence for the individual mechanisms. We demonstrate our approach by computing a variety of foam dynamics, including coupled evolution of three-dimensional bubble clusters attached to an anchored membrane and collapse of a foam cluster.

  4. Reliability model generator

    NASA Technical Reports Server (NTRS)

    McMann, Catherine M. (Inventor); Cohen, Gerald C. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
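
    The aggregation idea above can be sketched with the two classic composition rules; the series/parallel structure and the component values below are illustrative assumptions, not the patented generator's models.

```python
# Hypothetical sketch: aggregate low-level component reliabilities according
# to a simple architecture description built from series and parallel groups.
def series(*rs):
    # A series chain works only if every component works.
    r = 1.0
    for x in rs:
        r *= x
    return r

def parallel(*rs):
    # A redundant (parallel) group fails only if every component fails.
    q = 1.0
    for x in rs:
        q *= (1.0 - x)
    return 1.0 - q

# Example architecture: two redundant pumps feeding a single valve.
system_reliability = series(parallel(0.90, 0.90), 0.99)
print(round(system_reliability, 4))  # 0.9801
```

    A model generator of the kind described would walk the architecture description and apply rules like these (plus repair and dependency models) automatically.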

  5. Standardized Conditional "SEM": A Case for Conditional Reliability

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Price, Larry R.; Oshima, T. C.; Nering, Michael L.

    2007-01-01

    An examinee-level (or conditional) reliability is proposed for use in both classical test theory (CTT) and item response theory (IRT). The well-known group-level reliability is shown to be the average of conditional reliabilities of examinees in a group or a population. This relationship is similar to the known relationship between the square of…
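
    The stated relationship - group-level reliability as the average of examinee-level reliabilities - can be illustrated numerically; the conditional values below are made up for the example.

```python
# Hypothetical sketch: group-level reliability as the mean of
# examinee-level (conditional) reliabilities.
conditional_reliabilities = [0.95, 0.88, 0.91, 0.79, 0.87]
group_reliability = sum(conditional_reliabilities) / len(conditional_reliabilities)
print(round(group_reliability, 2))  # 0.88
```

    Note that a respectable group-level value (0.88 here) can hide individual examinees measured much less reliably (0.79), which is the case for reporting conditional reliability.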

  6. The Evolving Relationship between Researchers and Public Policy

    ERIC Educational Resources Information Center

    Henig, Jeffrey R.

    2008-01-01

    When it comes to the role of research in shaping public policy and debate, one might reasonably argue that this is the best of times. No Child Left Behind (NCLB), with its frequent mention of evidence-based decision making, has underscored the role that objective knowledge should play in a democratic society. The Institute of Education Sciences,…

  7. HIV and HLA Class I: an evolving relationship

    PubMed Central

    Goulder, Philip J.R.; Walker, Bruce D

    2014-01-01

    Successful vaccine development for infectious diseases has largely been achieved in settings where natural immunity to the pathogen results in clearance in at least some individuals. HIV presents an additional challenge in that natural clearance of infection does not occur, and the correlates of immune protection are still uncertain. However, partial control of viremia and markedly different outcomes of disease are observed in HIV infected persons. Here we examine the antiviral mechanisms implicated by one variable that has been consistently associated with extremes of outcome, namely HLA class I alleles, and in particular HLA-B, and examine the mechanisms by which this modulation is likely to occur, and the impact of these interactions on evolution of the virus and the host. Studies to date provide evidence for both HLA-dependent and epitope-dependent influences on viral control and viral evolution, and have important implications for the continued quest for an effective HIV vaccine. PMID:22999948

  8. Self-regulating and self-evolving particle swarm optimizer

    NASA Astrophysics Data System (ADS)

    Wang, Hui-Min; Qiao, Zhao-Wei; Xia, Chang-Liang; Li, Liang-Yu

    2015-01-01

    In this article, a novel self-regulating and self-evolving particle swarm optimizer (SSPSO) is proposed. Learning from the idea of direction reversal, self-regulating behaviour is a modified position update rule for particles, according to which the algorithm improves the best position to accelerate convergence in situations where the traditional update rule does not work. Borrowing the idea of mutation from evolutionary computation, self-evolving behaviour acts on the current best particle in the swarm to prevent the algorithm from prematurely converging. The performance of SSPSO and four other improved particle swarm optimizers is numerically evaluated by unimodal, multimodal and rotated multimodal benchmark functions. The effectiveness of SSPSO in solving real-world problems is shown by the magnetic optimization of a Halbach-based permanent magnet machine. The results show that SSPSO has good convergence performance and high reliability, and is well matched to actual problems.
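
    The general shape of such an optimizer can be sketched as a standard PSO loop with a mutation step applied to the swarm's best solution; the parameter values, the Gaussian perturbation, and all names below are illustrative assumptions, not the SSPSO update rules.

```python
import random

# Hypothetical sketch: PSO with a mutation ("self-evolving") step on the global best.
def pso(f, dim=2, n=20, iters=200, lo=-5.0, hi=5.0, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pb = [x[:] for x in xs]          # personal best positions
    pbf = [f(x) for x in xs]         # personal best values
    g = min(range(n), key=lambda i: pbf[i])
    gbest, gbest_f = pb[g][:], pbf[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                # inertia + cognitive + social terms, then clamped position update
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pb[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] = min(hi, max(lo, xs[i][d] + vs[i][d]))
            fi = f(xs[i])
            if fi < pbf[i]:
                pb[i], pbf[i] = xs[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = xs[i][:], fi
        # mutation step: perturb the global best, keep it only if it improves
        cand = [g_d + rng.gauss(0.0, 0.1) for g_d in gbest]
        if f(cand) < gbest_f:
            gbest, gbest_f = cand, f(cand)
    return gbest, gbest_f

sphere = lambda x: sum(v * v for v in x)
best, val = pso(sphere)  # converges near the origin on this convex test function
```

    Accepting the mutated best only when it improves keeps the step harmless near convergence while still offering an escape from premature stagnation.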

  9. Systems Issues In Terrestrial Fiber Optic Link Reliability

    NASA Astrophysics Data System (ADS)

    Spencer, James L.; Lewin, Barry R.; Lee, T. Frank S.

    1990-01-01

    This paper reviews fiber optic system reliability issues from three different viewpoints - availability, operating environment, and evolving technologies. Present availability objectives for interoffice links and for the distribution loop must be re-examined for applications such as the Synchronous Optical Network (SONET), Fiber-to-the-Home (FTTH), and analog services. The hostile operating environments of emerging applications (such as FTTH) must be carefully considered in system design as well as reliability assessments. Finally, evolving technologies might require the development of new reliability testing strategies.

  10. Submillimeter observations of evolved stars

    SciTech Connect

    Sopka, R.J.; Hildebrand, R.; Jaffe, D.T.; Gatley, I.; Roellig, T.; Werner, M.; Jura, M.; Zuckerman, B.

    1985-07-01

    Broad-band submillimeter observations of the thermal emission from evolved stars have been obtained with the United Kingdom Infrared Telescope on Mauna Kea, Hawaii. These observations, at an effective wavelength of 400 μm, provide the most direct method for estimating the mass loss rate in dust from these stars and also help to define the long-wavelength thermal spectrum of the dust envelopes. The mass loss rates in dust that we derive range from 10^-9 to 10^-6 M⊙ yr^-1 and are compared with mass loss rates derived from molecular line observations to estimate gas-to-dust ratios in outflowing envelopes. These values are found to be generally compatible with the interstellar gas-to-dust ratio of approximately 100 if submillimeter emissivities appropriate to amorphous grain structures are assumed. Our analysis of the spectrum of IRC+10216 confirms previous suggestions that the grain emissivity varies as λ^-1.2 rather than as λ^-2 for 10

  11. Idiopathic pulmonary fibrosis: evolving concepts.

    PubMed

    Ryu, Jay H; Moua, Teng; Daniels, Craig E; Hartman, Thomas E; Yi, Eunhee S; Utz, James P; Limper, Andrew H

    2014-08-01

    Idiopathic pulmonary fibrosis (IPF) occurs predominantly in middle-aged and older adults and accounts for 20% to 30% of interstitial lung diseases. It is usually progressive, resulting in respiratory failure and death. Diagnostic criteria for IPF have evolved over the years, and IPF is currently defined as a disease characterized by the histopathologic pattern of usual interstitial pneumonia occurring in the absence of an identifiable cause of lung injury. Understanding of the pathogenesis of IPF has shifted away from chronic inflammation and toward dysregulated fibroproliferative repair in response to alveolar epithelial injury. Idiopathic pulmonary fibrosis is likely a heterogeneous disorder caused by various interactions between genetic components and environmental exposures. High-resolution computed tomography can be diagnostic in the presence of typical findings such as bilateral reticular opacities associated with traction bronchiectasis/bronchiolectasis in a predominantly basal and subpleural distribution, along with subpleural honeycombing. In other circumstances, a surgical lung biopsy may be needed. The clinical course of IPF can be unpredictable and may be punctuated by acute deteriorations (acute exacerbation). Although progress continues in unraveling the mechanisms of IPF, effective therapy has remained elusive. Thus, clinicians and patients need to reach informed decisions regarding management options including lung transplant. The findings in this review were based on a literature search of PubMed using the search terms idiopathic pulmonary fibrosis and usual interstitial pneumonia, limited to human studies in the English language published from January 1, 2000, through December 31, 2013, and supplemented by key references published before the year 2000. Copyright © 2014 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.

  12. Voyages Through Time: Everything Evolves

    NASA Astrophysics Data System (ADS)

    Pendleton, Y. J.; Tarter, J. C.; DeVore, E. K.; O'Sullivan, K. A.; Taylor, S. M.

    2001-12-01

    Evolutionary change is a powerful framework for studying our world and our place therein. It is a recurring theme in every realm of science: over time, the universe, the planet Earth, life, and human technologies all change, albeit on vastly different scales. Evolution offers scientific explanations for the age-old question, "Where did we come from?" In addition, historical perspectives of science show how our understanding has evolved over time. The complexities of all of these systems will never reveal a "finished" story. But it is a story of epic size, capable of inspiring awe and of expanding our sense of time and place, and eminently worthy of investigating. This story is the basis of Voyages Through Time. Voyages Through Time (VTT) provides teachers not only with background science content and pedagogy, but also with materials and resources for the teaching of evolution. The six modules, Cosmic Evolution, Planetary Evolution, Origin of Life, Evolution of Life, Hominid Evolution, and Evolution of Technology, emphasize student inquiry and promote the nature of science, as recommended in the NSES and BSL. The modules are unified by the overarching theme of evolution and the meta questions: "What is changing?" "What is the rate of change?" and "What is the mechanism of change?" Determination of student outcomes for the project required effective collaboration of scientists, teachers, students and media specialists. The broadest curricular student outcomes are 1) an enjoyment of science, 2) an understanding of the nature of science, especially the understanding of evidence and re-evaluation, and 3) key science content.
The curriculum is being developed by the SETI Institute, NASA Ames Research Center, California Academy of Sciences, and San Francisco State University, and is funded by the NSF (IMD 9730693), with support from Hewlett-Packard Company, The Foundation for Microbiology, Combined Federated Charities, NASA Astrobiology Institute, and NASA Fundamental

  13. Exploring Evolving Media Discourse Through Event Cueing.

    PubMed

    Lu, Yafeng; Steptoe, Michael; Burke, Sarah; Wang, Hong; Tsai, Jiun-Yi; Davulcu, Hasan; Montgomery, Douglas; Corman, Steven R; Maciejewski, Ross

    2016-01-01

    Online news, microblogs and other media documents all contain valuable insight regarding events and responses to events. Underlying these documents is the concept of framing, a process in which communicators act (consciously or unconsciously) to construct a point of view that encourages facts to be interpreted by others in a particular manner. As media discourse evolves, how topics and documents are framed can undergo change, shifting the discussion to different viewpoints or rhetoric. What causes these shifts can be difficult to determine directly; however, by linking secondary datasets and enabling visual exploration, we can enhance the hypothesis generation process. In this paper, we present a visual analytics framework for event cueing using media data. As discourse develops over time, our framework applies a time series intervention model which tests to see if the level of framing is different before or after a given date. If the model indicates that the times before and after are statistically significantly different, this cues an analyst to explore related datasets to help enhance their understanding of what (if any) events may have triggered these changes in discourse. Our framework consists of entity extraction and sentiment analysis as lenses for data exploration and uses two different models for intervention analysis. To demonstrate the usage of our framework, we present a case study on exploring potential relationships between climate change framing and conflicts in Africa.
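
    The before/after cue described above can be illustrated with a simple two-sample comparison of framing levels around a candidate date; this t-statistic stand-in and the scores below are illustrative assumptions, not the paper's intervention models.

```python
import math
import statistics

# Hypothetical sketch: flag a candidate intervention date when mean framing
# levels differ markedly before vs. after it (Welch-style t statistic).
def intervention_t(before, after):
    m1, m2 = statistics.mean(before), statistics.mean(after)
    v1, v2 = statistics.variance(before), statistics.variance(after)
    se = math.sqrt(v1 / len(before) + v2 / len(after))
    return (m2 - m1) / se  # large |t| cues the analyst to inspect this date

before = [0.21, 0.19, 0.23, 0.20, 0.22]  # framing scores before the date
after = [0.35, 0.38, 0.33, 0.36, 0.37]   # framing scores after the date
t = intervention_t(before, after)
print(abs(t) > 2.0)  # True: a notable shift worth exploring in linked datasets
```

    In the framework, a statistically significant shift like this would trigger exploration of secondary datasets (events, conflicts) around the flagged date rather than serve as a conclusion in itself.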

  14. Evolving Galaxies in a Hierarchical Universe

    NASA Astrophysics Data System (ADS)

    Hahn, Changhoon

    2017-01-01

    Observations of galaxies using large surveys (SDSS, COSMOS, PRIMUS, etc.) have firmly established a global view of galaxy properties out to z~1. Galaxies are broadly divided into two classes: blue, typically disk-like star forming galaxies and red, typically elliptical quiescent ones with little star formation. The star formation rates (SFR) and stellar masses of star forming galaxies form an empirical relationship referred to as the "star formation main sequence". Over cosmic time, this sequence undergoes significant decline in SFR and causes the overall cosmic star formation decline. Simultaneously, physical processes cause significant fractions of star forming galaxies to "quench" their star formation. Hierarchical structure formation and cosmological models provide precise predictions of the evolution of the underlying dark matter, which serve as the foundation for these detailed trends and their evolution. Whatever trends we observe in galaxy properties can be interpreted within the narrative of the underlying dark matter and halo occupation framework. More importantly, through careful statistical treatment and precise measurements, this connection can be utilized to better constrain and understand key elements of galaxy evolution. In this spirit, for my dissertation I connect observations of evolving galaxy properties to the framework of the hierarchical Universe and use it to better understand physical processes responsible for the cessation of star formation in galaxies. For instance, through this approach, I constrain the quenching timescale of central galaxies and find that it is significantly longer than the quenching timescale of satellite galaxies.

  15. Reliability Generalization: "Lapsus Linguae"

    ERIC Educational Resources Information Center

    Smith, Julie M.

    2011-01-01

    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  17. Reliability. ERIC Digest.

    ERIC Educational Resources Information Center

    Rudner, Lawrence M.; Schafer, William D.

    This digest discusses sources of error in testing, several approaches to estimating reliability, and several ways to increase test reliability. Reliability has been defined in different ways by different authors, but the best way to look at reliability may be the extent to which measurements resulting from a test are characteristics of those being…

  18. Inherent randomness of evolving populations.

    PubMed

    Harper, Marc

    2014-03-01

    The entropy rates of the Wright-Fisher process, the Moran process, and generalizations are computed and used to compare these processes and their dependence on standard evolutionary parameters. Entropy rates are measures of the variation dependent on both short-run and long-run behaviors and allow the relationships between mutation, selection, and population size to be examined. Bounds for the entropy rate are given for the Moran process (independent of population size) and for the Wright-Fisher process (bounded for fixed population size). A generational Moran process is also presented for comparison to the Wright-Fisher Process. Results include analytic results and computational extensions.
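
    The quantity studied above is the standard entropy rate of a finite Markov chain, H = -Σᵢ πᵢ Σⱼ Pᵢⱼ log Pᵢⱼ with π the stationary distribution; the generic computation can be sketched as follows (the two-state example matrix is an illustrative stand-in, not a Moran or Wright-Fisher transition matrix).

```python
import math

# Hypothetical sketch: entropy rate of a finite Markov chain from its
# row-stochastic transition matrix P.
def stationary(P, iters=10000):
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):  # power iteration: pi <- pi P
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_rate(P):
    pi = stationary(P)
    return -sum(pi[i] * P[i][j] * math.log(P[i][j])
                for i in range(len(P))
                for j in range(len(P)) if P[i][j] > 0)

# Two-state example: a symmetric chain that switches state with probability 0.1.
P = [[0.9, 0.1], [0.1, 0.9]]
print(round(entropy_rate(P), 4))  # 0.3251 (nats per step)
```

    For an evolving population, P would encode the one-step allele-count transitions (which requires mutation, so the chain has no absorbing states), and the entropy rate then summarizes how unpredictable the process is per generation.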

  19. Inherent randomness of evolving populations

    NASA Astrophysics Data System (ADS)

    Harper, Marc

    2014-03-01

    The entropy rates of the Wright-Fisher process, the Moran process, and generalizations are computed and used to compare these processes and their dependence on standard evolutionary parameters. Entropy rates are measures of the variation dependent on both short-run and long-run behaviors and allow the relationships between mutation, selection, and population size to be examined. Bounds for the entropy rate are given for the Moran process (independent of population size) and for the Wright-Fisher process (bounded for fixed population size). A generational Moran process is also presented for comparison to the Wright-Fisher Process. Results include analytic results and computational extensions.

  20. Evolving evolutionary algorithms using linear genetic programming.

    PubMed

    Oltean, Mihai

    2005-01-01

    A new model for evolving Evolutionary Algorithms is proposed in this paper. The model is based on the Linear Genetic Programming (LGP) technique. Every LGP chromosome encodes an EA which is used for solving a particular problem. Several Evolutionary Algorithms for function optimization, the Traveling Salesman Problem and the Quadratic Assignment Problem are evolved by using the considered model. Numerical experiments show that the evolved Evolutionary Algorithms perform similarly and sometimes even better than standard approaches for several well-known benchmarking problems.

  1. Acquiring Evolving Technologies: Web Services Standards

    DTIC Science & Technology

    2016-06-30

    Acquiring Evolving Technologies: Web Services Standards. Harry L. Levinson, Software Engineering Institute, Carnegie Mellon University, 2006.

  2. Water in evolved lunar rocks

    NASA Astrophysics Data System (ADS)

    Robinson, Katharine Lynn

    The Moon was thought to be completely anhydrous until indigenous water was found in lunar samples in 2008. This discovery raised two fundamental questions about the Moon: how much water is present in the bulk Moon, and is water uniformly distributed in the lunar interior? To address these questions, I studied a suite of lunar samples rich in a chemical component called KREEP (K, Rare Earth Elements, P), all of which are incompatible elements. Water behaves as an incompatible element in magmas, so KREEP-rich lunar samples are potentially water rich. In this dissertation, I present the results of a petrologic study of KREEP-rich lunar rocks, measurements of their water contents and deuterium (D) to hydrogen (H) ratios (D/H), and an examination of where these rocks fit into our understanding of water in the Moon as a whole. We performed a study of highly evolved, KREEP-rich lunar rocks called felsites and determined that they contain quartz. Using cooling rates derived from quartz-Ti thermometry, we show that the felsites originated at a minimum pressure of ˜1 kbar, corresponding to a minimum depth of 20-25 km in the lunar crust. We calculate that at that pressure water would have been soluble in the melt, indicating that degassing of H2O from the felsite parental melts was likely minimal and hydrogen isotopes in intrusive rocks are likely unfractionated. We then measured D/H in apatite in KREEP-rich intrusive rocks to clarify the solar system source of the Moon's water. When viewed in the context of other lunar D/H studies, our results indicate there are at least three distinctive reservoirs in the lunar interior, including an ultra-low D reservoir that could represent a primitive component in the Moon's interior. Furthermore, our measurements of residual glass in a KREEP basalt show that the KREEP basaltic magmas contained 10 times less water than the source of the Apollo 17 pyroclastic glass beads, indicating that, though wetter than previously thought, the concentration of

  3. Can There Be Reliability without "Reliability?"

    ERIC Educational Resources Information Center

    Mislevy, Robert J.

    2004-01-01

    An "Educational Researcher" article by Pamela Moss (1994) asks the title question, "Can there be validity without reliability?" Yes, she answers, if by reliability one means "consistency among independent observations intended as interchangeable" (Moss, 1994, p. 7), quantified by internal consistency indices such as…

  4. Can there be reliability without reliability?

    NASA Astrophysics Data System (ADS)

    Mislevy, Robert J.

    1994-10-01

    A recent article by Pamela Moss asks the title question, 'Can there be validity without reliability?' If by reliability we mean only KR-20 coefficients or inter-rater correlations, the answer is yes. Sometimes these particular indices for evaluating evidence suit the problem we encounter; sometimes they don't. If by reliability we mean credibility of evidence, where credibility is defined as 'appropriate to the intended inference', the answer is no, we cannot have validity without reliability. Because 'validity' encompasses the process of reasoning as well as the data, uncritically accepting observations as strong evidence, when they may be incorrect, misleading, unrepresentative, or fraudulent, may lead coincidentally to correct conclusions but not to valid ones. This paper discusses and illustrates a broader conception of 'reliability' in educational assessment, to ground a deeper understanding of the issues raised by Professor Moss's question.

  5. Protein sites with more coevolutionary connections tend to evolve slower, while more variable protein families acquire higher coevolutionary connections.

    PubMed

    Mandloi, Sapan; Chakrabarti, Saikat

    2017-01-01

    Background: Amino acid exchanges within proteins sometimes compensate for one another and could therefore be co-evolved. It is essential to investigate the intricate relationship between the extent of coevolution and the evolutionary variability exerted at individual protein sites, as well as the whole protein. Methods: In this study, we have used a reliable set of coevolutionary connections (sites within 10 Å spatial distance) and investigated their correlation with the evolutionary diversity within the respective protein sites. Results: Based on our observations, we propose an interesting hypothesis that higher numbers of coevolutionary connections are associated with less evolutionarily variable protein sites, while higher numbers of coevolutionary connections can be observed for a protein family that has higher evolutionary variability. Our findings also indicate that highly coevolved sites located in a solvent-accessible state tend to be less evolutionarily variable. This relationship reverses at the whole-protein level, where cytoplasmic and extracellular proteins show moderately higher anti-correlation between the number of coevolutionary connections and the average evolutionary conservation of the whole protein. Conclusions: The observations and hypothesis presented in this study provide intriguing insights towards understanding the critical relationship between coevolutionary and evolutionary changes observed within proteins. Our observations encourage further investigation to find out the reasons behind subtle variations in the relationship between coevolutionary connectivity and evolutionary diversity for proteins located at various cellular localizations and/or involved in different molecular-biological functions.

  6. Compound estimation procedures in reliability

    NASA Technical Reports Server (NTRS)

    Barnes, Ron

    1990-01-01

    At NASA, components and subsystems of components in the Space Shuttle and Space Station generally go through a number of redesign stages. While data on failures for various design stages are sometimes available, the classical procedures for evaluating reliability only utilize the failure data on the present design stage of the component or subsystem. Often, few or no failures have been recorded on the present design stage. Previously, Bayesian estimators for the reliability of a single component, conditioned on the failure data for the present design, were developed. These new estimators permit NASA to evaluate the reliability even when few or no failures have been recorded. Point estimates for the latter evaluation were not possible with the classical procedures. Since different design stages of a component (or subsystem) generally have a good deal in common, the development of new statistical procedures for evaluating the reliability, which consider the entire failure record for all design stages, has great intuitive appeal. A typical subsystem consists of a number of different components, and each component has evolved through a number of redesign stages. The present investigations considered compound estimation procedures and related models. Such models permit the statistical consideration of all design stages of each component and thus incorporate all the available failure data to obtain estimates for the reliability of the present version of the component (or subsystem). A number of models were considered to estimate the reliability of a component conditioned on its total failure history from two design stages. It was determined that reliability estimators for the present design stage, conditioned on the complete failure history for two design stages, have lower risk than the corresponding estimators conditioned only on the most recent design failure data. Several models were explored and preliminary models involving bivariate Poisson distribution and the

  7. The Problem of Evolving a Genetic Code

    ERIC Educational Resources Information Center

    Woese, Carl R.

    1970-01-01

    Proposes models for the evolution of the genetic code and translation mechanisms. Suggests that the translation process is so complex and precise that it must have evolved in many stages, and that the evolution of the code was influenced by the constraints imposed by the evolving translation mechanism. (EB)

  8. Evolving Technologies: A View to Tomorrow

    ERIC Educational Resources Information Center

    Tamarkin, Molly; Rodrigo, Shelley

    2011-01-01

    Technology leaders must participate in strategy creation as well as operational delivery within higher education institutions. The future of higher education--the view to tomorrow--is irrevocably integrated and intertwined with evolving technologies. This article focuses on two specific evolving technologies: (1) alternative IT sourcing; and (2)…

  10. What Technology? Reflections on Evolving Services

    ERIC Educational Resources Information Center

    Collins, Sharon

    2009-01-01

    Each year, the members of the EDUCAUSE Evolving Technologies Committee identify and research the evolving technologies that are having--or are predicted to have--the most direct impact on higher education institutions. The committee members choose the relevant topics, write white papers, and present their findings at the EDUCAUSE annual…

  11. Assuring reliability program effectiveness.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1973-01-01

    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.

  13. [Emergencies evolving from local anesthesia].

    PubMed

    Kaufman, E; Garfunkel, A; Findler, M; Elad, S; Zusman, S P; Malamed, S F; Galili, D

    2002-01-01

    Local anesthesia is without doubt the most frequently used drug in dentistry and in medicine. In spite of the safety record set by these drugs, there is evidence of adverse reactions ranging from 2.5%-11%. Most of the reactions originate from the autonomic system. A recent, well-planned study indicates that adverse reactions are highly correlated with the medical status of the patient: the higher the medical risk, the greater the chance of experiencing an adverse reaction. This study also found that adverse reactions were highly correlated with the concentration of adrenalin. Another recent study found a direct relationship between adverse reactions and the level of anxiety experienced by the patient and the dental procedure. Most of the reactions in this study occurred either immediately at injection time or within 2 hours following the injection. Since the beginning of the last century, vasoconstrictors have been added to local anesthesia solutions in order to reduce toxicity and prolong the activity of the LA. However, today it is commonly agreed that this addition to local anesthesia should not be administered to cardiac patients, especially those suffering from refractory dysrhythmias, angina pectoris, post myocardial infarction (6 months) and uncontrolled hypertension. Other contraindications to vasoconstrictors are endocrine disorders such as hyperthyroidism, hyperfunction of the adrenal medulla (pheochromocytoma) and uncontrolled diabetes mellitus. Cross reactivity of local anesthetic solutions can occur with MAO inhibitors, non-specific beta adrenergic blockers, tricyclic antidepressants, phenothiazides and cocaine abusers. Noradrenaline added to local anesthetics as a vasoconstrictor has been described as a trigger for a great increase in blood pressure and has therefore been forbidden for use in many countries. This paper describes 4 cases of severe complications following the injection of local anesthesia, of which three ended in fatality.

  14. Reliability computation from reliability block diagrams

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.; Eckstein, R. E.

    1971-01-01

    A method and a computer program are presented to calculate probability of system success from an arbitrary reliability block diagram. The class of reliability block diagrams that can be handled include any active/standby combination of redundancy, and the computations include the effects of dormancy and switching in any standby redundancy. The mechanics of the program are based on an extension of the probability tree method of computing system probabilities.
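
    For the purely active series/parallel portions of such a diagram, probability of system success reduces by two elementary rules; standby redundancy, dormancy, and switching require the fuller probability-tree treatment. A minimal sketch with hypothetical component reliabilities (not values from the paper):

```python
def series(*rs):
    """Series blocks: all must succeed, so R is the product of reliabilities."""
    r = 1.0
    for x in rs:
        r *= x
    return r

def parallel(*rs):
    """Active redundancy: the path fails only if every branch fails."""
    q = 1.0
    for x in rs:
        q *= 1.0 - x
    return 1.0 - q

# Hypothetical diagram: block A, then two redundant branches B1/B2, then C.
R = series(0.99, parallel(0.90, 0.90), 0.95)
print(round(R, 4))  # → 0.9311
```

    Nesting these two functions mirrors the structure of the block diagram itself, which is why arbitrary series/parallel diagrams can be evaluated recursively.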

  15. Human Reliability Program Overview

    SciTech Connect

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  16. Power electronics reliability analysis.

    SciTech Connect

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
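
    Assuming independent basic events, a fault tree like the one described reduces with two gate formulas: an AND gate multiplies failure probabilities, and an OR gate combines them as 1 − Π(1 − pᵢ). A minimal sketch (hypothetical tree and values, not the report's fictitious device):

```python
def gate_and(*ps):
    """AND gate: the event occurs only if all independent inputs occur."""
    p = 1.0
    for x in ps:
        p *= x
    return p

def gate_or(*ps):
    """OR gate: the event occurs if any independent input occurs."""
    q = 1.0
    for x in ps:
        q *= 1.0 - x
    return 1.0 - q

# Hypothetical top event: both cooling fans fail, or the controller fails.
p_top = gate_or(gate_and(0.02, 0.02), 0.001)
system_reliability = 1.0 - p_top  # system reliability from component data
print(round(system_reliability, 6))
```

    Evaluating the tree bottom-up this way is what lets component-level reliability data roll up into a system-level figure, as the report's analysis process describes.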

  17. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  18. Evolvable Cryogenics (ECRYO) Pressure Transducer Calibration Test

    NASA Technical Reports Server (NTRS)

    Diaz, Carlos E., Jr.

    2015-01-01

    This paper provides a summary of the findings of recent activities conducted by Marshall Space Flight Center's (MSFC) In-Space Propulsion Branch and MSFC's Metrology and Calibration Lab to assess the performance of current "state of the art" pressure transducers for use in long duration storage and transfer of cryogenic propellants. A brief historical narrative in this paper describes the Evolvable Cryogenics program and the relevance of these activities to the program. This paper also provides a review of three separate test activities performed throughout this effort, including: (1) the calibration of several pressure transducer designs in a liquid nitrogen cryogenic environmental chamber, (2) the calibration of a pressure transducer in a liquid helium Dewar, and (3) the calibration of several pressure transducers at temperatures ranging from 20 to 70 kelvin (K) using a "cryostat" environmental chamber. These three separate test activities allowed for study of the sensors along a temperature range from 4 to 300 K. The combined data shows that both the slope and intercept of the sensor's calibration curve vary as a function of temperature. This behavior is contrary to the linearly decreasing relationship assumed at the start of this investigation. Consequently, the data demonstrates the need for lookup tables to change the slope and intercept used by any data acquisition system. This ultimately would allow for more accurate pressure measurements at the desired temperature range. This paper concludes with a review of a request for information (RFI) survey conducted amongst different suppliers to determine the availability of current "state of the art" flight-qualified pressure transducers. The survey identifies requirements that are most difficult for the suppliers to meet, most notably the capability to validate the sensor's performance at temperatures below 70 K.

  19. Reliable Design Versus Trust

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  20. Integrated avionics reliability

    NASA Technical Reports Server (NTRS)

    Alikiotis, Dimitri

    1988-01-01

    The integrated avionics reliability task is an effort to build credible reliability and/or performability models for multisensor integrated navigation and flight control. The research was initiated by the reliability analysis of a multisensor navigation system consisting of the Global Positioning System (GPS), the Long Range Navigation system (Loran C), and an inertial measurement unit (IMU). Markov reliability models were developed based on system failure rates and mission time.
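
    A Markov reliability model of the kind described tracks the probability of occupying each degradation state, with transitions driven by failure rates. As an illustration (not the study's actual GPS/Loran C/IMU model), a duplex sensor with failure rate λ per unit time has states {2 up, 1 up, failed}; integrating the state equations numerically recovers the closed-form reliability R(t) = 2e^(−λt) − e^(−2λt):

```python
import math

def duplex_reliability(lam, t, steps=100_000):
    """Euler integration of the CTMC with states (2 up, 1 up, failed)."""
    p2, p1, p0 = 1.0, 0.0, 0.0   # start with both units working
    dt = t / steps
    for _ in range(steps):
        p2, p1, p0 = (p2 - 2 * lam * p2 * dt,
                      p1 + 2 * lam * p2 * dt - lam * p1 * dt,
                      p0 + lam * p1 * dt)
    return p2 + p1               # reliability = P(not yet failed)

lam, t = 0.001, 1000.0
numeric = duplex_reliability(lam, t)
closed = 2 * math.exp(-lam * t) - math.exp(-2 * lam * t)
print(round(numeric, 4), round(closed, 4))  # both ≈ 0.6004
```

    For larger models without closed forms (such as a multisensor navigation suite), the same numerical integration of the state equations is the standard route to mission-time reliability.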

  1. Warning signals evolve to disengage Batesian mimics.

    PubMed

    Franks, Daniel W; Ruxton, Graeme D; Sherratt, Thomas N

    2009-01-01

    Prey that are unprofitable to attack are typically conspicuous in appearance. Conventional theory assumes that these warning signals have evolved in response to predator receiver biases. However, such biases might be a symptom rather than a cause of warning signals. We therefore examine an alternative theory: that conspicuousness evolves in unprofitable prey to avoid confusion with profitable prey. One might wonder why unprofitable prey do not find a cryptic means to be distinct from profitable prey, reducing both their risk of confusion with profitable prey and their rate of detection by predators. Here we present the first coevolutionary model to allow for Batesian mimicry and signals with different levels of detectability. We find that unprofitable prey do indeed evolve ways of distinguishing themselves using cryptic signals, particularly when appearance traits can evolve in multiple dimensions. However, conspicuous warning signals readily evolve in unprofitable prey when there are more ways to look different from the background than to match it. Moreover, the more unprofitable the prey species, the higher its evolved conspicuousness. Our results provide strong support for the argument that unprofitable species evolve conspicuous signals to avoid confusion with profitable prey and indicate that peak shift in conspicuousness-linked traits is a major factor in its establishment.

  2. Theory of reliable systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1975-01-01

    An attempt was made to refine the current notion of system reliability by identifying and investigating attributes of a system which are important to reliability considerations. Techniques which facilitate analysis of system reliability are included. Special attention was given to fault tolerance, diagnosability, and reconfigurability characteristics of systems.

  3. Reliability as Argument

    ERIC Educational Resources Information Center

    Parkes, Jay

    2007-01-01

    Reliability consists of both important social and scientific values and methods for evidencing those values, though in practice methods are often conflated with the values. With the two distinctly understood, a reliability argument can be made that articulates the particular reliability values most relevant to the particular measurement situation…

  4. Viking Lander reliability program

    NASA Technical Reports Server (NTRS)

    Pilny, M. J.

    1978-01-01

    The Viking Lander reliability program is reviewed with attention given to the development of the reliability program requirements, reliability program management, documents evaluation, failure modes evaluation, production variation control, failure reporting and correction, and the parts program. Lander hardware failures which have occurred during the mission are listed.

  5. Reliability model generator specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Mccann, Catherine

    1990-01-01

    The Reliability Model Generator (RMG) is described: a program which produces reliability models from block diagrams for ASSIST, the interface to the reliability evaluation tool SURE. An account is given of the motivation for RMG, and the implemented algorithms are discussed. The appendices contain the algorithms and two detailed traces of examples.

  6. Reliability and structural integrity

    NASA Technical Reports Server (NTRS)

    Davidson, J. R.

    1976-01-01

    An analytic model is developed to calculate the reliability of a structure after it is inspected for cracks. The model accounts for the growth of undiscovered cracks between inspections and their effect upon the reliability after subsequent inspections. The model is based upon a differential form of Bayes' Theorem for reliability, and upon fracture mechanics for crack growth.
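
    The Bayesian step in such a model can be illustrated with a single update: the probability that a crack is present after an inspection finds nothing, given the inspection's probability of detection (POD). A minimal sketch with hypothetical numbers, not values from the paper:

```python
def crack_prob_after_clean_inspection(prior, pod):
    """Bayes' theorem: P(crack present | inspection found nothing).

    prior -- P(crack present) before the inspection
    pod   -- probability the inspection detects a crack that is present
    """
    missed = (1.0 - pod) * prior   # crack present but not found
    no_crack = 1.0 - prior         # nothing there to find
    return missed / (missed + no_crack)

p = crack_prob_after_clean_inspection(prior=0.05, pod=0.90)
print(round(p, 4))  # → 0.0052
```

    Between inspections, a crack-growth model would raise the prior again before the next update, which is the coupling between growth and inspection that the abstract describes.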

  7. Predicting software reliability

    NASA Technical Reports Server (NTRS)

    Littlewood, B.

    1989-01-01

    A detailed look is given to software reliability techniques. A conceptual model of the failure process is examined, and some software reliability growth models are discussed. Problems for which no current solutions exist are addressed, emphasizing the very difficult problem of safety-critical systems for which the reliability requirements can be enormously demanding.

  8. Properties of artificial networks evolved to contend with natural spectra.

    PubMed

    Morgenstern, Yaniv; Rostami, Mohammad; Purves, Dale

    2014-07-22

    Understanding why spectra that are physically the same appear different in different contexts (color contrast), whereas spectra that are physically different appear similar (color constancy) presents a major challenge in vision research. Here, we show that the responses of biologically inspired neural networks evolved on the basis of accumulated experience with spectral stimuli automatically generate contrast and constancy. The results imply that these phenomena are signatures of a strategy that biological vision uses to circumvent the inverse optics problem as it pertains to light spectra, and that double-opponent neurons in early-level vision evolve to serve this purpose. This strategy provides a way of understanding the peculiar relationship between the objective world and subjective color experience, as well as rationalizing the relevant visual circuitry without invoking feature detection or image representation.

  9. Properties of artificial networks evolved to contend with natural spectra

    PubMed Central

    Morgenstern, Yaniv; Rostami, Mohammad; Purves, Dale

    2014-01-01

    Understanding why spectra that are physically the same appear different in different contexts (color contrast), whereas spectra that are physically different appear similar (color constancy) presents a major challenge in vision research. Here, we show that the responses of biologically inspired neural networks evolved on the basis of accumulated experience with spectral stimuli automatically generate contrast and constancy. The results imply that these phenomena are signatures of a strategy that biological vision uses to circumvent the inverse optics problem as it pertains to light spectra, and that double-opponent neurons in early-level vision evolve to serve this purpose. This strategy provides a way of understanding the peculiar relationship between the objective world and subjective color experience, as well as rationalizing the relevant visual circuitry without invoking feature detection or image representation. PMID:25024184

  10. The reliability of clinical diagnoses: state of the art.

    PubMed

    Kraemer, Helena Chmura

    2014-01-01

    Reliability of clinical diagnosis is essential for good clinical decision making as well as productive clinical research. The current review emphasizes the distinction between a disorder and a diagnosis and between validity and reliability of diagnoses, and the relationships that exist between them. What is crucial is that reliable diagnoses are essential to establishing valid diagnoses. The present review discusses the theoretical background underlying the evaluation of diagnoses, possible designs of reliability studies, estimation of the reliability coefficient, the standards for assessment of reliability, and strategies for improving reliability without compromising validity.
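
    For categorical diagnoses, the reliability coefficient discussed here is commonly estimated with Cohen's kappa, which corrects observed rater agreement for agreement expected by chance. A minimal sketch (hypothetical diagnostic labels; real studies would use the designs the review covers):

```python
def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' categorical diagnoses."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # observed agreement
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # agreement expected by chance from each rater's marginal frequencies
    p_chance = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in categories)
    return (p_obs - p_chance) / (1.0 - p_chance)

a = ["dep", "dep", "anx", "none", "dep", "anx"]
b = ["dep", "anx", "anx", "none", "dep", "none"]
print(round(cohens_kappa(a, b), 3))  # → 0.5
```

    Kappa of 1 means perfect agreement and 0 means chance-level agreement, which is why it is preferred over raw percent agreement when evaluating diagnostic reliability.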

  11. An Evolvable Approach to Launch Vehicles for Exploration

    NASA Technical Reports Server (NTRS)

    Cheuvront, David L.; Nguyen, Tri X.

    2005-01-01

    This paper presents ideas that may be used individually or in combination to mitigate high costs for separate developments of new crew and heavy-lift cargo launch vehicles, while providing the foundation for a highly reliable and evolvable approach to exploration. Consideration is given to reclassification of cargo for launch purposes into high-value versus low-value categories, rather than the presently-defined crew versus cargo categories. Objectives for the reclassification are to reduce the gap between payload mass requirements for crew and cargo payloads to better allow closure on a single moderately-sized common core vehicle to reduce development cost, achieve an economical balance between launch frequency and payload mass, and to improve total mission reliability and safety, as compared to a light-weight crew vehicle and heavy cargo lift approach. Concepts to reduce design and flight qualification costs for a common core vehicle with derivatives are presented. Appropriate types and mass of cargo for each class of vehicle are identified. Utilization of existing infrastructure and flight hardware is considered to reduce costs and build on proven capabilities. The approach enables low-risk incorporation of international and commercial launches of relatively low-cost, easily replaceable assets as a means to evolve toward longer-duration and more distant missions. Benefits are identified for ground infrastructure, personnel, training, logistics, spares, and system evolution. Technology needs are compared with needs for other aspects of exploration. Technology development phasing, demonstration, and reliability growth opportunities are considered. Flexibility to adapt to future technologies such as advanced in-space propulsion is contrasted with an approach of sizing the cargo launch vehicle based on today's in-space propellants.

  12. Neural mechanisms underlying the evolvability of behaviour

    PubMed Central

    Katz, Paul S.

    2011-01-01

    The complexity of nervous systems alters the evolvability of behaviour. Complex nervous systems are phylogenetically constrained; nevertheless particular species-specific behaviours have repeatedly evolved, suggesting a predisposition towards those behaviours. Independently evolved behaviours in animals that share a common neural architecture are generally produced by homologous neural structures, homologous neural pathways and even in the case of some invertebrates, homologous identified neurons. Such parallel evolution has been documented in the chromatic sensitivity of visual systems, motor behaviours and complex social behaviours such as pair-bonding. The appearance of homoplasious behaviours produced by homologous neural substrates suggests that there might be features of these nervous systems that favoured the repeated evolution of particular behaviours. Neuromodulation may be one such feature because it allows anatomically defined neural circuitry to be re-purposed. The developmental, genetic and physiological mechanisms that contribute to nervous system complexity may also bias the evolution of behaviour, thereby affecting the evolvability of species-specific behaviour. PMID:21690127

  13. Neural mechanisms underlying the evolvability of behaviour.

    PubMed

    Katz, Paul S

    2011-07-27

    The complexity of nervous systems alters the evolvability of behaviour. Complex nervous systems are phylogenetically constrained; nevertheless particular species-specific behaviours have repeatedly evolved, suggesting a predisposition towards those behaviours. Independently evolved behaviours in animals that share a common neural architecture are generally produced by homologous neural structures, homologous neural pathways and even in the case of some invertebrates, homologous identified neurons. Such parallel evolution has been documented in the chromatic sensitivity of visual systems, motor behaviours and complex social behaviours such as pair-bonding. The appearance of homoplasious behaviours produced by homologous neural substrates suggests that there might be features of these nervous systems that favoured the repeated evolution of particular behaviours. Neuromodulation may be one such feature because it allows anatomically defined neural circuitry to be re-purposed. The developmental, genetic and physiological mechanisms that contribute to nervous system complexity may also bias the evolution of behaviour, thereby affecting the evolvability of species-specific behaviour.

  14. Evolving communicative complexity: insights from rodents and beyond.

    PubMed

    Pollard, Kimberly A; Blumstein, Daniel T

    2012-07-05

    Social living goes hand in hand with communication, but the details of this relationship are rarely simple. Complex communication may be described by attributes as diverse as a species' entire repertoire, signallers' individualistic signatures, or complex acoustic phenomena within single calls. Similarly, attributes of social complexity are diverse and may include group size, social role diversity, or networks of interactions and relationships. How these different attributes of social and communicative complexity co-evolve is an active question in behavioural ecology. Sciurid rodents (ground squirrels, prairie dogs and marmots) provide an excellent model system for studying these questions. Sciurid studies have found that demographic role complexity predicts alarm call repertoire size, while social group size predicts alarm call individuality. Along with other taxa, sciurids reveal an important insight: different attributes of sociality are linked to different attributes of communication. By breaking social and communicative complexity down to different attributes, focused studies can better untangle the underlying evolutionary relationships and move us closer to a comprehensive theory of how sociality and communication evolve.

  15. Evolving communicative complexity: insights from rodents and beyond

    PubMed Central

    Pollard, Kimberly A.; Blumstein, Daniel T.

    2012-01-01

    Social living goes hand in hand with communication, but the details of this relationship are rarely simple. Complex communication may be described by attributes as diverse as a species' entire repertoire, signallers' individualistic signatures, or complex acoustic phenomena within single calls. Similarly, attributes of social complexity are diverse and may include group size, social role diversity, or networks of interactions and relationships. How these different attributes of social and communicative complexity co-evolve is an active question in behavioural ecology. Sciurid rodents (ground squirrels, prairie dogs and marmots) provide an excellent model system for studying these questions. Sciurid studies have found that demographic role complexity predicts alarm call repertoire size, while social group size predicts alarm call individuality. Along with other taxa, sciurids reveal an important insight: different attributes of sociality are linked to different attributes of communication. By breaking social and communicative complexity down to different attributes, focused studies can better untangle the underlying evolutionary relationships and move us closer to a comprehensive theory of how sociality and communication evolve. PMID:22641825

  16. Cancer stem cells: constantly evolving and functionally heterogeneous therapeutic targets.

    PubMed

    Yang, Tao; Rycaj, Kiera; Liu, Zhong-Min; Tang, Dean G

    2014-06-01

    Elucidating the origin of and dynamic interrelationship between intratumoral cell subpopulations has clear clinical significance in helping to understand the cellular basis of treatment response, therapeutic resistance, and tumor relapse. Cancer stem cells (CSC), together with clonal evolution driven by genetic alterations, generate cancer cell heterogeneity commonly observed in clinical samples. The 2013 Shanghai International Symposium on Cancer Stem Cells brought together leaders in the field to highlight the most recent progress in phenotyping, characterizing, and targeting CSCs and in elucidating the relationship between the cell-of-origin of cancer and CSCs. Discussions from the symposium emphasize the urgent need in developing novel therapeutics to target the constantly evolving CSCs.

  17. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?" Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based on hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.
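The non-parametric, distribution-free direction the abstract points to can be sketched with a Kaplan-Meier product-limit estimate of the reliability (survival) function built directly from failure times, with no parametric model assumed. The failure times below are invented for illustration, not GSFC project data.

```python
# Distribution-free reliability estimate from complete (uncensored)
# failure-time data via the Kaplan-Meier product-limit estimator.

def kaplan_meier(failure_times):
    """Return (time, estimated reliability) pairs after each failure."""
    times = sorted(failure_times)
    n = len(times)
    survival, s = [], 1.0
    for i, t in enumerate(times):
        at_risk = n - i              # units still operating just before t
        s *= (at_risk - 1) / at_risk
        survival.append((t, s))
    return survival

curve = kaplan_meier([12.0, 30.0, 47.0, 81.0])
# With complete data this reduces to the empirical survival function:
# after the 2nd of 4 failures, the estimated reliability is 0.5.
```

With censored observations (subsystems still under test when others finish, as the abstract describes) the at-risk count would simply exclude censored units, which is where the estimator earns its keep over the empirical distribution.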

  18. The transcriptomics of an experimentally evolved plant-virus interaction

    PubMed Central

    Hillung, Julia; García-García, Francisco; Dopazo, Joaquín; Cuevas, José M.; Elena, Santiago F.

    2016-01-01

Models of plant-virus interaction assume that the ability of a virus to infect a host genotype depends on the matching between virulence and resistance genes. Recently, we evolved tobacco etch potyvirus (TEV) lineages on different ecotypes of Arabidopsis thaliana, and found that some ecotypes selected for specialist viruses whereas others selected for generalists. Here we sought to evaluate the transcriptomic basis of such relationships. We have characterized the transcriptomic responses of five ecotypes infected with the ancestral and evolved viruses. Genes and functional categories differentially expressed by plants infected with local TEV isolates were identified, showing heterogeneous responses among ecotypes, although significant parallelism existed among lineages evolved in the same ecotype. Although genes involved in immune responses were altered upon infection, other functional groups were also pervasively over-represented, suggesting that plant resistance genes were not the only drivers of viral adaptation. Finally, the transcriptomic consequences of infection with the generalist and specialist lineages were compared. Whilst the generalist induced very similar perturbations in the transcriptomes of the different ecotypes, the perturbations induced by the specialist were divergent. Plant defense mechanisms were activated when the infecting virus was the specialist but were down-regulated when it was the generalist. PMID:27113435

  19. The transcriptomics of an experimentally evolved plant-virus interaction.

    PubMed

    Hillung, Julia; García-García, Francisco; Dopazo, Joaquín; Cuevas, José M; Elena, Santiago F

    2016-04-26

Models of plant-virus interaction assume that the ability of a virus to infect a host genotype depends on the matching between virulence and resistance genes. Recently, we evolved tobacco etch potyvirus (TEV) lineages on different ecotypes of Arabidopsis thaliana, and found that some ecotypes selected for specialist viruses whereas others selected for generalists. Here we sought to evaluate the transcriptomic basis of such relationships. We have characterized the transcriptomic responses of five ecotypes infected with the ancestral and evolved viruses. Genes and functional categories differentially expressed by plants infected with local TEV isolates were identified, showing heterogeneous responses among ecotypes, although significant parallelism existed among lineages evolved in the same ecotype. Although genes involved in immune responses were altered upon infection, other functional groups were also pervasively over-represented, suggesting that plant resistance genes were not the only drivers of viral adaptation. Finally, the transcriptomic consequences of infection with the generalist and specialist lineages were compared. Whilst the generalist induced very similar perturbations in the transcriptomes of the different ecotypes, the perturbations induced by the specialist were divergent. Plant defense mechanisms were activated when the infecting virus was the specialist but were down-regulated when it was the generalist.

  20. Perspectives on evolving dental care payment and delivery models.

    PubMed

    Rubin, Marcie S; Edelstein, Burton L

    2016-01-01

    Health care reform is well under way in the United States as reflected in evolving delivery, financing, and payment approaches that are affecting medicine ahead of dentistry. The authors explored health systems changes under way, distinguished historical and organizational differences between medicine and dentistry, and developed alternative models to characterize the relationships between these professions. The authors explored a range of medical payment approaches, including those tied to objective performance metrics, and their potential application to dentistry. Advances in understanding the essential role of oral health in general health have pulled dentistry into the broader discussion of care integration and payment reform. Dentistry's fit with primary and specialty medical care may take a variety of forms. Common provider payment approaches in dentistry-fee-for-service, capitation, and salary-are tied insufficiently to performance when measured as either health processes or health outcomes. Dentistry can anticipate potential payment reforms by observing changes already under way in medicine and by understanding alternative payment approaches that are tied to performance metrics, such as those now in development by the Dental Quality Alliance and others. Novel forms of dental practice may be expected to evolve continuously as medical-dental integration and payment reforms that promote accountability evolve. Copyright © 2016 American Dental Association. Published by Elsevier Inc. All rights reserved.

  1. Evolving treatment plan quality criteria from institution-specific experience.

    PubMed

    Ruan, D; Shao, W; Demarco, J; Tenn, S; King, C; Low, D; Kupelian, P; Steinberg, M

    2012-05-01

The dosimetric aspects of radiation therapy treatment plan quality are usually evaluated and reported with dose volume histogram (DVH) endpoints. For clinical practicality, a small number of representative quantities derived from the DVH are often used as dose endpoints to summarize the plan quality. National guidelines on reference values for such quantities for some standard treatment approaches are often used as acceptance criteria to trigger treatment plan review. On the other hand, treatment prescription and planning approaches specific to each institution warrant reporting plan quality in terms of practice consistency and with respect to institution-specific experience. The purpose of this study is to investigate and develop a systematic approach to record and characterize the institution-specific plan experience and use such information to guide the design of plan quality criteria. In the clinical setting, this approach will assist in (1) improving overall plan quality and consistency and (2) detecting abnormal plan behavior for retrospective analysis. The authors propose a self-evolving methodology and have developed an in-house prototype software suite that (1) extracts the dose endpoints from a treatment plan and evaluates them against both national standard and institution-specific criteria and (2) evolves the statistics for the dose endpoints and updates institution-specific criteria. The validity of the proposed methodology was demonstrated with a database of prostate stereotactic body radiotherapy cases. As more data sets are accumulated, the evolving institution-specific criteria can serve as a reliable and stable consistency measure for plan quality and reveal the potential to use criteria "tighter" than national standards or projected criteria, leading to practice that may shrink the gap between plans deemed acceptable and the underlying unknown optimality. The authors have developed a rationale to improve plan quality and
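The self-evolving criteria idea can be sketched as follows: each new plan's dose endpoint is checked against a fixed national limit and against institution-specific bounds derived from the running statistics of prior plans, after which the statistics are updated. The class name, endpoint values, and limits below are all hypothetical, invented purely to illustrate the two-step evaluate-then-evolve loop the abstract describes.

```python
import statistics

class EvolvingCriterion:
    """Flag a plan endpoint against a national limit and against
    institution-specific experience (mean +/- n_sigma * stdev)."""

    def __init__(self, national_limit, n_sigma=2.0):
        self.national_limit = national_limit
        self.n_sigma = n_sigma
        self.history = []

    def evaluate(self, endpoint_value):
        flags = []
        if endpoint_value < self.national_limit:
            flags.append("below national standard")
        if len(self.history) >= 2:
            mu = statistics.mean(self.history)
            sd = statistics.stdev(self.history)
            if abs(endpoint_value - mu) > self.n_sigma * sd:
                flags.append("outside institutional experience")
        self.history.append(endpoint_value)   # evolve the statistics
        return flags

crit = EvolvingCriterion(national_limit=0.95)
for v in (0.97, 0.98, 0.975, 0.96):   # hypothetical target-coverage values
    crit.evaluate(v)
print(crit.evaluate(0.90))            # flagged on both counts
```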

  2. Recent advances in evolvable systems--ICES 96 (International Conference on Evolvable Systems).

    PubMed

    Frank, I; Manderick, B; Higuchi, T

    1997-01-01

    This paper reviews the developments in evolvable hardware systems presented at the First International Conference on Evolvable Systems (ICES 96). The main body of the review gives an overview of the 34 papers presented orally, splitting them into three broad groups according to whether they involve (1) evolving a fit solution to a problem as a member of a population of competing candidates, (2) evolving solutions that can individually learn from and adapt to their environments, or (3) the embryonic growth of solutions. We also review the discussion sessions of the conference and give pointers to related upcoming events.

  3. Power Quality and Reliability Project

    NASA Technical Reports Server (NTRS)

    Attia, John O.

    2001-01-01

One area where universities and industry can link is in the area of power systems reliability and quality - key concepts in the commercial, industrial and public sector engineering environments. Prairie View A&M University (PVAMU) has established a collaborative relationship with the University of Texas at Arlington (UTA), NASA/Johnson Space Center (JSC), and EP&C Engineering and Technology Group (EP&C), a small disadvantaged business that specializes in power quality and engineering services. The primary goal of this collaboration is to facilitate the development and implementation of a Strategic Integrated Power/Systems Reliability and Curriculum Enhancement Program. The objectives of the first phase of this work are: (a) to develop a course in power quality and reliability, (b) to use the campus of Prairie View A&M University as a laboratory for the study of systems reliability and quality issues, (c) to provide students with NASA/EP&C shadowing and internship experience. In this work, a course titled "Reliability Analysis of Electrical Facilities" was developed and taught for two semesters. About thirty-seven students have benefited directly from this course. A laboratory accompanying the course was also developed. Four facilities at Prairie View A&M University were surveyed. Some tests that were performed are (i) earth-ground testing, (ii) measurement of voltage, amperage and harmonics of various panels in the buildings, (iii) checking the wire sizes to see if they were the right size for the load that they were carrying, (iv) vibration tests to assess the status of the engines or chillers and water pumps, and (v) infrared testing to detect arcing or misfiring of electrical or mechanical systems.

  4. Access to space: The Space Shuttle's evolving role

    NASA Astrophysics Data System (ADS)

    Duttry, Steven R.

    1993-04-01

    Access to space is of extreme importance to our nation and the world. Military, civil, and commercial space activities all depend on reliable space transportation systems for access to space at a reasonable cost. The Space Transportation System or Space Shuttle was originally planned to provide transportation to and from a manned Earth-orbiting space station. To justify the development and operations costs, the Space Shuttle took on other space transportation requirements to include DoD, civil, and a growing commercial launch market. This research paper or case study examines the evolving role of the Space Shuttle as our nation's means of accessing space. The case study includes a review of the events leading to the development of the Space Shuttle, identifies some of the key players in the decision-making process, examines alternatives developed to mitigate the risks associated with sole reliance on the Space Shuttle, and highlights the impacts of this national space policy following the Challenger accident.

  5. Ergonomics in consumer product evaluation: an evolving process.

    PubMed

    Butters, L M; Dixon, R T

    1998-02-01

    As part of its commitment to empowering people to make informed consumer decisions, the Consumers' Association investigates the convenience aspects of a vast range of products, from cars to garden spades. Evaluation approaches include user trials, convenience checklists and expert appraisals. Our methodology is subject to constant review and refinement to ensure the highest levels of reliability, validity and auditability. We have a distinctive approach: our tests are designed to reflect consumer usage and to provide comparative data which is absolutely fair to all products. This paper discusses the evolving nature of that methodology within the "lifetime" of a product. Reasons for choosing each method are given as practical guidance for those involved in comparative testing.

  6. Evolving networks in the human epileptic brain

    NASA Astrophysics Data System (ADS)

    Lehnertz, Klaus; Ansmann, Gerrit; Bialonski, Stephan; Dickten, Henning; Geier, Christian; Porz, Stephan

    2014-01-01

    Network theory provides novel concepts that promise an improved characterization of interacting dynamical systems. Within this framework, evolving networks can be considered as being composed of nodes, representing systems, and of time-varying edges, representing interactions between these systems. This approach is highly attractive to further our understanding of the physiological and pathophysiological dynamics in human brain networks. Indeed, there is growing evidence that the epileptic process can be regarded as a large-scale network phenomenon. We here review methodologies for inferring networks from empirical time series and for a characterization of these evolving networks. We summarize recent findings derived from studies that investigate human epileptic brain networks evolving on timescales ranging from few seconds to weeks. We point to possible pitfalls and open issues, and discuss future perspectives.

  7. Quantifying evolvability in small biological networks

    SciTech Connect

    Nemenman, Ilya; Mugler, Andrew; Ziv, Etay; Wiggins, Chris H

    2008-01-01

    The authors introduce a quantitative measure of the capacity of a small biological network to evolve. The measure is applied to a stochastic description of the experimental setup of Guet et al. (Science 2002, 296, pp. 1466), treating chemical inducers as functional inputs to biochemical networks and the expression of a reporter gene as the functional output. The authors take an information-theoretic approach, allowing the system to set parameters that optimise signal processing ability, thus enumerating each network's highest-fidelity functions. All networks studied are highly evolvable by the measure, meaning that change in function has little dependence on change in parameters. Moreover, each network's functions are connected by paths in the parameter space along which information is not significantly lowered, meaning a network may continuously change its functionality without completely losing it along the way. This property further underscores the evolvability of the networks.
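The measure is information-theoretic, so its core quantity can be sketched as the mutual information between a network's functional input and output. The joint distributions below are invented for illustration, not taken from the paper's biochemical model.

```python
# Mutual information I(X;Y) in bits from a joint distribution
# joint[x][y] = P(X=x, Y=y), for discrete input X and output Y.

from math import log2

def mutual_information(joint):
    px = [sum(row) for row in joint]          # marginal of X
    py = [sum(col) for col in zip(*joint)]    # marginal of Y
    mi = 0.0
    for x, row in enumerate(joint):
        for y, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * log2(pxy / (px[x] * py[y]))
    return mi

# A perfectly faithful binary input-output map carries 1 bit:
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
# An output independent of the input carries 0 bits:
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

Optimising network parameters to maximise a quantity like this, then asking how far function changes when parameters change, is the shape of the evolvability analysis the abstract summarises.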

  8. Metanetworks of artificially evolved regulatory networks

    NASA Astrophysics Data System (ADS)

    Danacı, Burçin; Erzan, Ayşe

    2016-04-01

    We study metanetworks arising in genotype and phenotype spaces, in the context of a model population of Boolean graphs evolved under selection for short dynamical attractors. We define the adjacency matrix of a graph as its genotype, which gets mutated in the course of evolution, while its phenotype is its set of dynamical attractors. Metanetworks in the genotype and phenotype spaces are formed, respectively, by genetic proximity and by phenotypic similarity, the latter weighted by the sizes of the basins of attraction of the shared attractors. We find that evolved populations of Boolean graphs form tree-like giant clusters in genotype space, while random populations of Boolean graphs are typically so far removed from each other genetically that they cannot form a metanetwork. In phenotype space, the metanetworks of evolved populations are super robust both under the elimination of weak connections and random removal of nodes.
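The genotype/phenotype split described above can be illustrated concretely: the adjacency matrix is the genotype, and the dynamical attractor reached from an initial state is (part of) the phenotype. The update rule below, each node taking the XOR of its inputs, is an assumption made for the sake of a runnable example; the paper's Boolean functions may differ.

```python
def step(adj, state):
    """Synchronous update: node i becomes the XOR (sum mod 2) of the
    states of its inputs j, where adj[i][j] == 1."""
    n = len(state)
    return tuple(
        sum(state[j] for j in range(n) if adj[i][j]) % 2
        for i in range(n)
    )

def attractor(adj, state):
    """Iterate until a state repeats; return the attractor cycle."""
    seen, trajectory = {}, []
    while state not in seen:
        seen[state] = len(trajectory)
        trajectory.append(state)
        state = step(adj, state)
    return tuple(trajectory[seen[state]:])

adj = [[0, 1, 0],
       [0, 0, 1],
       [1, 0, 0]]                    # genotype: a 3-node directed ring
cycle = attractor(adj, (1, 0, 0))    # phenotype: the attractor reached
print(len(cycle))                    # period 3: the ring cycles the bit around
```

Selection for short attractors, as in the model above, would score this genotype by the lengths of the cycles its state space falls into.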

  9. Hawaii electric system reliability.

    SciTech Connect

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  10. Hawaii Electric System Reliability

    SciTech Connect

    Loose, Verne William; Silva Monroy, Cesar Augusto

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
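The cost-integration method mentioned in both records can be sketched as a toy optimization: total societal cost is capacity cost (rising with reserve margin) plus expected outage cost (falling with reserve margin), and the optimal adequacy is the margin that minimizes their sum. The cost functions and numbers below are invented for illustration and are not drawn from the report.

```python
def total_cost(margin, capacity_cost_per_pct=1.0, outage_value=100.0):
    """Toy total cost as a function of reserve margin (in percent)."""
    capacity_cost = capacity_cost_per_pct * margin
    expected_outage_cost = outage_value * (0.5 ** margin)  # falls with margin
    return capacity_cost + expected_outage_cost

# Grid search over margins from 0% to 20% in 0.1% steps:
margins = [m / 10 for m in range(0, 201)]
best = min(margins, key=total_cost)
print(best)   # the margin where marginal capacity cost meets marginal outage value
```

The qualitative point survives the toy numbers: more reserve is worth buying only until its marginal cost exceeds the outage value customers place on the reliability it adds.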

  11. Reliability, Recursion, and Risk.

    ERIC Educational Resources Information Center

    Henriksen, Melvin, Ed.; Wagon, Stan, Ed.

    1991-01-01

The discrete mathematics topics of trees and computational complexity are implemented in a simple reliability program which illustrates the process advantages of the PASCAL programming language. The discussion focuses on the impact that reliability research can provide in assessment of the risks found in complex technological ventures. (Author/JJK)
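The pairing of trees, recursion, and reliability can be shown in a few lines: a system modeled as a tree of series and parallel compositions of independent components is evaluated by a recursive walk. This sketch is in Python rather than the PASCAL of the original, and the structure and numbers are illustrative.

```python
def reliability(node):
    """node is either a float (a component's reliability) or a pair
    ('series' | 'parallel', [children])."""
    if isinstance(node, float):
        return node
    kind, children = node
    probs = [reliability(c) for c in children]
    if kind == "series":              # all children must work
        r = 1.0
        for p in probs:
            r *= p
        return r
    else:                             # parallel: at least one child works
        q = 1.0
        for p in probs:
            q *= 1.0 - p
        return 1.0 - q

# A 0.9-reliable unit in series with a redundant pair of 0.8-reliable units:
system = ("series", [0.9, ("parallel", [0.8, 0.8])])
print(reliability(system))   # 0.9 * (1 - 0.2 * 0.2) = 0.864
```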

  12. Monte Carlo Reliability Analysis.

    DTIC Science & Technology

    1987-10-01

    (3) E. E. Lewis and Z. Tu, "Monte Carlo Reliability Modeling by Inhomogeneous Markov Processes," Reliab. Engr. 16, 277-296 (1986). (4) E. Cinlar, Introduction to Stochastic Processes, Prentice-Hall, Englewood Cliffs, NJ, 1975. (5) R. E. Barlow and F. Proschan, Statistical Theory of Reliability and Life...
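The technique named in the record's title is simple to sketch: estimate the reliability of a system by sampling component lifetimes and counting the fraction of trials in which the system survives the mission time. The system structure (2-out-of-3), failure rate, and mission time below are invented for illustration.

```python
import math
import random

def simulate(n_trials=100_000, rate=0.1, mission=5.0, seed=1):
    """Monte Carlo estimate of mission reliability for a 2-out-of-3
    system whose components have exponential lifetimes."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_trials):
        lifetimes = [rng.expovariate(rate) for _ in range(3)]
        if sum(t > mission for t in lifetimes) >= 2:
            survived += 1
    return survived / n_trials

estimate = simulate()
# Closed-form check: each component survives with p = exp(-rate * mission),
# and a 2-out-of-3 system survives with R = 3 p^2 (1 - p) + p^3.
p = math.exp(-0.5)
exact = 3 * p**2 * (1 - p) + p**3
print(estimate, exact)   # the estimate should sit within ~1% of the exact value
```

Monte Carlo earns its keep precisely when no such closed form exists, e.g. for the inhomogeneous Markov processes of the cited Lewis and Tu paper.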

  13. JavaGenes: Evolving Graphs with Crossover

    NASA Technical Reports Server (NTRS)

    Globus, Al; Atsatt, Sean; Lawton, John; Wipke, Todd

    2000-01-01

Genetic algorithms usually use string or tree representations. We have developed a novel crossover operator for a directed and undirected graph representation, and used this operator to evolve molecules and circuits. Unlike strings or trees, a single point in the representation cannot divide every possible graph into two parts, because graphs may contain cycles. Thus, the crossover operator is non-trivial. A steady-state, tournament selection genetic algorithm code (JavaGenes) was written to implement and test the graph crossover operator. All runs were executed by cycle-scavenging on networked workstations using the Condor batch processing system. The JavaGenes code has evolved pharmaceutical drug molecules and simple digital circuits. Results to date suggest that JavaGenes can evolve moderate sized drug molecules and very small circuits in reasonable time. The algorithm has greater difficulty with somewhat larger circuits, suggesting that directed graphs (circuits) are more difficult to evolve than undirected graphs (molecules), although necessary differences in the crossover operator may also explain the results. In principle, JavaGenes should be able to evolve other graph-representable systems, such as transportation networks, metabolic pathways, and computer networks. However, large graphs evolve significantly slower than smaller graphs, presumably because the space-of-all-graphs explodes combinatorially with graph size. Since the representation strongly affects genetic algorithm performance, adding graphs to the evolutionary programmer's bag-of-tricks should be beneficial. Also, since graph evolution operates directly on the phenotype, the genotype-phenotype translation step, common in genetic algorithm work, is eliminated.

  14. Co-evolving prisoner's dilemma: Performance indicators and analytic approaches

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Choi, C. W.; Li, Y. S.; Xu, C.; Hui, P. M.

    2017-02-01

Understanding the intrinsic relation between the dynamical processes in a co-evolving network and the necessary ingredients in formulating a reliable theory is an important question and a challenging task. Using two slightly different definitions of performance indicator in the context of a co-evolving prisoner's dilemma game, it is shown that very different cooperative levels result and theories of different complexity are required to understand the key features. When the payoff per opponent is used as the indicator (Case A), non-cooperative strategy has an edge and dominates in a large part of the parameter space formed by the cutting-and-rewiring probability and the strategy imitation probability. When the payoff from all opponents is used (Case B), cooperative strategy has an edge and dominates the parameter space. Two distinct phases, one homogeneous and dynamical and another inhomogeneous and static, emerge and the phase boundary in the parameter space is studied in detail. A simple theory assuming an average competing environment for cooperative agents and another for non-cooperative agents is shown to perform well in Case A. The same theory, however, fails badly for Case B. It is necessary to include more spatial correlation into a theory for Case B. We show that the local configuration approximation, which takes into account the different competing environments for agents with different strategies and degrees, is needed to give reliable results for Case B. The results illustrate that formulating a proper theory requires both a conceptual understanding of the effects of the adaptive processes in the problem and a delicate balance between simplicity and accuracy.
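The two performance indicators contrasted above can be computed directly: for an agent playing a prisoner's dilemma against each network neighbour, Case A scores the mean payoff per opponent while Case B scores the total payoff over all opponents, so a high-degree agent is rewarded in Case B but not in Case A. The payoff values follow a standard PD matrix, an assumption for illustration since the abstract does not give one.

```python
# payoff[(my_move, their_move)] with 'C' = cooperate, 'D' = defect
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

def indicators(my_move, neighbour_moves):
    """Return (Case A, Case B) = (payoff per opponent, total payoff)."""
    payoffs = [PAYOFF[(my_move, m)] for m in neighbour_moves]
    case_a = sum(payoffs) / len(payoffs)   # per-opponent average
    case_b = sum(payoffs)                  # total over all opponents
    return case_a, case_b

# A cooperator with four cooperating neighbours:
print(indicators('C', ['C', 'C', 'C', 'C']))   # (3.0, 12)
# A defector exploiting a single cooperating neighbour:
print(indicators('D', ['C']))                  # (5.0, 5)
```

Under Case B the well-connected cooperator outscores the isolated defector (12 vs 5); under Case A the ranking flips (3.0 vs 5.0), which is the seed of the very different phase behaviour the paper reports.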

  15. Chapter 9: Reliability

    SciTech Connect

    Algora, Carlos; Espinet-Gonzalez, Pilar; Vazquez, Manuel; Bosco, Nick; Miller, David; Kurtz, Sarah; Rubio, Francisca; McConnell,Robert

    2016-04-15

This chapter describes the accumulated knowledge on CPV reliability, covering its fundamentals and qualification. It explains the reliability of solar cells, modules (including optics) and plants. The chapter discusses the relevant statistical distributions, namely exponential, normal and Weibull. The treatment of solar cell reliability covers the issues in accelerated aging tests of CPV solar cells, the types of failure, and failures in real-time operation. The chapter explores accelerated life tests, namely qualitative life tests (mainly HALT) and quantitative accelerated life tests (QALT). It examines other well-proven PV cells and semiconductor devices that share similar semiconductor materials, manufacturing techniques or operating conditions, namely III-V space solar cells and light-emitting diodes (LEDs). It addresses each of the identified reliability issues and presents the current state-of-the-art knowledge for their testing and evaluation. Finally, the chapter summarizes the CPV qualification and reliability standards.
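Of the distributions the chapter names, the Weibull is the workhorse of reliability testing because its shape parameter distinguishes failure regimes; it reduces to the exponential case when the shape is 1. The parameter values below are illustrative, not taken from the chapter.

```python
import math

def weibull_reliability(t, shape, scale):
    """Weibull survival function R(t) = exp(-(t/scale)**shape)."""
    return math.exp(-((t / scale) ** shape))

# shape = 1 recovers the exponential (constant-hazard) case:
print(weibull_reliability(2.0, shape=1.0, scale=4.0))  # exp(-0.5) ~ 0.6065
# shape > 1 models wear-out: early reliability is higher, then falls faster:
print(weibull_reliability(2.0, shape=3.0, scale=4.0))  # exp(-0.125) ~ 0.8825
```

Fitting the shape parameter to accelerated-life-test failure times (HALT/QALT, as discussed above) is what lets a small test population say something about infant mortality versus wear-out in the field.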

  16. A Stefan problem on an evolving surface

    PubMed Central

    Alphonse, Amal; Elliott, Charles M.

    2015-01-01

    We formulate a Stefan problem on an evolving hypersurface and study the well posedness of weak solutions given L1 data. To do this, we first develop function spaces and results to handle equations on evolving surfaces in order to give a natural treatment of the problem. Then, we consider the existence of solutions for data; this is done by regularization of the nonlinearity. The regularized problem is solved by a fixed point theorem and then uniform estimates are obtained in order to pass to the limit. By using a duality method, we show continuous dependence, which allows us to extend the results to L1 data. PMID:26261364

  17. How the first biopolymers could have evolved.

    PubMed Central

    Abkevich, V I; Gutin, A M; Shakhnovich, E I

    1996-01-01

    In this work, we discuss a possible origin of the first biopolymers with stable unique structures. We suggest that at the prebiotic stage of evolution, long organic polymers had to be compact to avoid hydrolysis and had to be soluble and thus must not be exceedingly hydrophobic. We present an algorithm that generates such sequences for model proteins. The evolved sequences turn out to have a stable unique structure, into which they quickly fold. This result illustrates the idea that the unique three-dimensional native structures of first biopolymers could have evolved as a side effect of nonspecific physicochemical factors acting at the prebiotic stage of evolution. PMID:8570645

  18. Sequence classification with side effect machines evolved via ring optimization.

    PubMed

    McEachern, Andrew; Ashlock, Daniel; Schonfeld, Justin

    2013-07-01

The explosion of available sequence data necessitates the development of sophisticated machine learning tools with which to analyze them. This study introduces a sequence-learning technology called side effect machines. It also applies a model of evolution which simulates the evolution of a ring species to the training of the side effect machines. A comparison is done between side effect machines evolved in the ring structure and side effect machines evolved using a standard evolutionary algorithm based on tournament selection. At the core of the training of side effect machines is a nearest neighbor classifier. A parameter study was performed to investigate the impact of the division of training data into examples for nearest neighbor assessment and training cases. The parameter study demonstrates that parameter setting is important in the baseline runs but had little impact in the ring-optimization runs. The ring optimization technique was also found to exhibit improved and also more reliable training performance. Side effect machines are tested on two types of synthetic data, one based on GC-content and the other checking for the ability of side effect machines to recognize an embedded motif. Three types of biological data are used, a data set with different types of immune-system genes, a data set with normal and retro-virally derived human genomic sequences, and standard and nonstandard initiation regions from the cytochrome-oxidase subunit one in the mitochondrial genome. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
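The simplest synthetic task mentioned above can be sketched end to end: classify DNA sequences by a GC-content feature using a one-nearest-neighbour rule, the classifier at the core of side-effect-machine training. The training sequences and labels below are invented; a real side effect machine would emit richer features than the single GC fraction used here.

```python
def gc_content(seq):
    """Fraction of bases that are G or C."""
    return sum(base in "GC" for base in seq) / len(seq)

def nearest_neighbour(query, training):
    """training: list of (sequence, label); classify the query by the
    training sequence whose GC content is closest."""
    q = gc_content(query)
    _, label = min(
        ((abs(gc_content(s) - q), lab) for s, lab in training),
        key=lambda pair: pair[0],
    )
    return label

training = [("ATATATAT", "AT-rich"), ("GCGCGCGC", "GC-rich")]
print(nearest_neighbour("GGCCATGC", training))   # 'GC-rich' (GC fraction 0.75)
```

The parameter study in the abstract concerns exactly the split of labelled data between the `training` list (neighbours) and held-out assessment cases.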

  19. Structural Analysis of an Evolved Transketolase Reveals Divergent Binding Modes.

    PubMed

    Affaticati, Pierre E; Dai, Shao-Bo; Payongsri, Panwajee; Hailes, Helen C; Tittmann, Kai; Dalby, Paul A

    2016-10-21

The S385Y/D469T/R520Q variant of E. coli transketolase was evolved previously with three successive smart libraries, each guided by different structural, bioinformatic or computational methods. Substrate-walking progressively shifted the target acceptor substrate from phosphorylated aldehydes towards a non-phosphorylated polar aldehyde, a non-polar aliphatic aldehyde, and finally a non-polar aromatic aldehyde. Kinetic evaluations on three benzaldehyde derivatives suggested that their active-site binding was differentially sensitive to the S385Y mutation. Docking into mutants generated in silico from the wild-type crystal structure was not wholly satisfactory, as errors accumulated with successive mutations and hampered further smart-library designs. Here we report the crystal structure of the S385Y/D469T/R520Q variant, and molecular docking of three substrates. This now supports our original hypothesis that directed evolution had generated an evolutionary intermediate with divergent binding modes for the three aromatic aldehydes tested. The new active site contained two binding pockets supporting π-π stacking interactions, sterically separated by the D469T mutation. While 3-formylbenzoic acid (3-FBA) preferred one pocket, and 4-FBA the other, the less well-accepted substrate 3-hydroxybenzaldehyde (3-HBA) was caught in limbo with equal preference for the two pockets. This work highlights the value of obtaining crystal structures of evolved enzyme variants, for continued and reliable use of smart library strategies.

  20. An Evolvable Multi-Agent Approach to Space Operations Engineering

    NASA Technical Reports Server (NTRS)

    Mandutianu, Sanda; Stoica, Adrian

    1999-01-01

A complex system of spacecraft and ground tracking stations, or a constellation of satellites or spacecraft, has to be able to reliably withstand sudden environment changes, resource fluctuations, dynamic resource configuration, limited communication bandwidth, etc., while maintaining the consistency of the system as a whole. It is not known in advance when a change in the environment might occur or when a particular exchange will happen. A higher degree of sophistication is required for the communication mechanisms between different parts of the system. The actual behavior has to be determined while the system is performing, and the course of action can be decided at the individual level. Under such circumstances, the solution will benefit greatly from increased on-board and on-the-ground adaptability and autonomy. An evolvable architecture based on intelligent agents that communicate and cooperate with each other can offer advantages in this direction. This paper presents an architecture for an evolvable agent-based system (software and software/hardware hybrids), as well as some plans for further implementation.

  1. Structural Analysis of an Evolved Transketolase Reveals Divergent Binding Modes

    PubMed Central

    Affaticati, Pierre E.; Dai, Shao-Bo; Payongsri, Panwajee; Hailes, Helen C.; Tittmann, Kai; Dalby, Paul A.

    2016-01-01

The S385Y/D469T/R520Q variant of E. coli transketolase was evolved previously with three successive smart libraries, each guided by different structural, bioinformatic or computational methods. Substrate-walking progressively shifted the target acceptor substrate from phosphorylated aldehydes towards a non-phosphorylated polar aldehyde, a non-polar aliphatic aldehyde, and finally a non-polar aromatic aldehyde. Kinetic evaluations on three benzaldehyde derivatives suggested that their active-site binding was differentially sensitive to the S385Y mutation. Docking into mutants generated in silico from the wild-type crystal structure was not wholly satisfactory, as errors accumulated with successive mutations and hampered further smart-library designs. Here we report the crystal structure of the S385Y/D469T/R520Q variant, and molecular docking of three substrates. This now supports our original hypothesis that directed evolution had generated an evolutionary intermediate with divergent binding modes for the three aromatic aldehydes tested. The new active site contained two binding pockets supporting π-π stacking interactions, sterically separated by the D469T mutation. While 3-formylbenzoic acid (3-FBA) preferred one pocket, and 4-FBA the other, the less well-accepted substrate 3-hydroxybenzaldehyde (3-HBA) was caught in limbo with equal preference for the two pockets. This work highlights the value of obtaining crystal structures of evolved enzyme variants, for continued and reliable use of smart library strategies. PMID:27767080

  2. Zygomorphy evolved from disymmetry in Fumarioideae (Papaveraceae, Ranunculales): new evidence from an expanded molecular phylogenetic framework

    PubMed Central

    Sauquet, Hervé; Carrive, Laetitia; Poullain, Noëlie; Sannier, Julie; Damerval, Catherine; Nadot, Sophie

    2015-01-01

    Background and Aims Fumarioideae (20 genera, 593 species) is a clade of Papaveraceae (Ranunculales) characterized by flowers that are either disymmetric (i.e. two perpendicular planes of bilateral symmetry) or zygomorphic (i.e. one plane of bilateral symmetry). In contrast, the other subfamily of Papaveraceae, Papaveroideae (23 genera, 230 species), has actinomorphic flowers (i.e. more than two planes of symmetry). Understanding of the evolution of floral symmetry in this clade has so far been limited by the lack of a reliable phylogenetic framework. Pteridophyllum (one species) shares similarities with Fumarioideae but has actinomorphic flowers, and the relationships among Pteridophyllum, Papaveroideae and Fumarioideae have remained unclear. This study reassesses the evolution of floral symmetry in Papaveraceae based on new molecular phylogenetic analyses of the family. Methods Maximum likelihood, Bayesian and maximum parsimony phylogenetic analyses of Papaveraceae were conducted using six plastid markers and one nuclear marker, sampling Pteridophyllum, 18 (90 %) genera and 73 species of Fumarioideae, 11 (48 %) genera and 11 species of Papaveroideae, and a wide selection of outgroup taxa. Floral characters recorded from the literature were then optimized onto phylogenetic trees to reconstruct ancestral states using parsimony, maximum likelihood and reversible-jump Bayesian approaches. Key Results Pteridophyllum is not nested in Fumarioideae. Fumarioideae are monophyletic and Hypecoum (18 species) is the sister group of the remaining genera. Relationships within the core Fumarioideae are well resolved and supported. Dactylicapnos and all zygomorphic genera form a well-supported clade nested among disymmetric taxa. Conclusions Disymmetry of the corolla is a synapomorphy of Fumarioideae and is strongly correlated with changes in the androecium and differentiation of middle and inner tepal shape (basal spurs on middle tepals). Zygomorphy subsequently evolved from

  3. Anomaly Detection in Time-Evolving Climate Graphs

    NASA Astrophysics Data System (ADS)

    Liess, S.; Agrawal, S.; Das, K.; Atluri, G.; Steinbach, M.; Steinhaeuser, K.; Kumar, V.

    2016-12-01

The spatio-temporal observations available for different climate variables such as pressure, temperature, wind and humidity have been studied to understand how changes in one variable at a location exhibit similarity with changes in a different variable at a location thousands of kilometers away. These non-trivial long-distance relationships, called teleconnections, are often useful in understanding the underlying physical phenomena driving extreme events, which are becoming more common with the changing climate. Networks constructed using these data sets can capture these relationships at a global scale. Such networks have been analyzed using a variety of network-based approaches, such as community detection and anomaly detection, that have shown promise in capturing interesting climate phenomena. In this research we plan to construct time-evolving climate networks whose edges represent causal relationships, and then discover anomalies in these 'causal' climate networks. As part of this research, we will address several limitations of previous work on anomaly detection using climate networks. First, we will take into account spatial and temporal dependencies while constructing the networks, which existing work has largely ignored. Second, we will learn Granger causality to define causal relationships among different nodes. Third, we will build heterogeneous climate networks that involve nodes from different climate variables. Fourth, we will construct a Granger graphical model to understand the long-range temporal dependency in the data. Finally, we will use a community-evolution-based notion of anomaly detection on the time-evolving causal networks to discover deviations from expected behavior.

  4. Surveying The Digital Landscape: Evolving Technologies 2004. The EDUCAUSE Evolving Technologies Committee

    ERIC Educational Resources Information Center

    EDUCAUSE Review, 2004

    2004-01-01

    Each year, the members of the EDUCAUSE Evolving Technologies Committee identify and research the evolving technologies that are having the most direct impact on higher education institutions. The committee members choose the relevant topics, write white papers, and present their findings at the EDUCAUSE annual conference. This year, under the…

  6. Toward an Evolved Concept of Landrace

    PubMed Central

    Casañas, Francesc; Simó, Joan; Casals, Joan; Prohens, Jaime

    2017-01-01

    The term “landrace” has generally been defined as a cultivated, genetically heterogeneous variety that has evolved in a certain ecogeographical area and is therefore adapted to the edaphic and climatic conditions and to its traditional management and uses. Despite being considered by many to be inalterable, landraces have been and are in a constant state of evolution as a result of natural and artificial selection. Many landraces have disappeared from cultivation but are preserved in gene banks. Using modern selection and breeding technology tools to shape these preserved landraces together with the ones that are still cultivated is a further step in their evolution in order to preserve their agricultural significance. Adapting historical landraces to present agricultural conditions using cutting-edge breeding technology represents a challenging opportunity to use them in a modern sustainable agriculture, as an immediate return on the investment is highly unlikely. Consequently, we propose a more inclusive definition of landraces, namely that they consist of cultivated varieties that have evolved and may continue evolving, using conventional or modern breeding techniques, in traditional or new agricultural environments within a defined ecogeographical area and under the influence of the local human culture. This includes adaptation of landraces to new management systems and the unconscious or conscious selection made by farmers or breeders using available technology. In this respect, a mixed selection system might be established in which farmers and other social agents develop evolved landraces from the variability generated by public entities. PMID:28228769

  7. Apollo 16 Evolved Lithology Sodic Ferrogabbro

    NASA Technical Reports Server (NTRS)

    Zeigler, Ryan; Jolliff, B. L.; Korotev, R. L.

    2014-01-01

Evolved lunar igneous lithologies, often referred to as the alkali suite, are a minor but important component of the lunar crust. These samples are rich in incompatible elements and are, not surprisingly, most common at the Apollo sites in (or near) the incompatible-element-rich region of the Moon known as the Procellarum KREEP Terrane (PKT). The most commonly occurring lithologies are granites (A12, A14, A15, A17), monzogabbro (A14, A15), alkali anorthosites (A12, A14), and KREEP basalts (A15, A17). The Feldspathic Highlands Terrane is not entirely devoid of evolved lithologies, and rare clasts of alkali gabbronorite and sodic ferrogabbro (SFG) have been identified in Apollo 16 station 11 breccias 67915 and 67016. Curiously, nearly all pristine evolved lithologies have been found as small clasts or soil particles, the exceptions being KREEP basalts 15382/6 and granitic sample 12013 (which is itself a breccia). Here we reexamine the petrography and geochemistry of two SFG-like particles found in a survey of Apollo 16 2-4 mm particles from the Cayley Plains, 62283,7-15 and 62243,10-3 (hereafter 7-15 and 10-3, respectively). We compare these to previously reported SFG samples, including recent analyses of the type specimen of SFG from lunar breccia 67915.

  8. Did Language Evolve Like the Vertebrate Eye?

    ERIC Educational Resources Information Center

    Botha, Rudolf P.

    2002-01-01

    Offers a critical appraisal of the way in which the idea that human language or some of its features evolved like the vertebrate eye by natural selection is articulated in Pinker and Bloom's (1990) selectionist account of language evolution. Argues that this account is less than insightful because it fails to draw some of the conceptual…

  9. The Evolving Leadership Path of Visual Analytics

    SciTech Connect

    Kluse, Michael; Peurrung, Anthony J.; Gracio, Deborah K.

    2012-01-02

This is a requested book chapter for an internationally authored book on visual analytics and related fields, coordinated by a UK university and to be published by Springer in 2012. The chapter is an overview of the leadership strategies that PNNL's Jim Thomas and other stakeholders used to establish visual analytics as a field, and how those strategies may evolve in the future.

  10. Evolving Neural Networks for Nonlinear Control.

    DTIC Science & Technology

    1996-09-30

An approach to creating Amorphous Recurrent Neural Networks (ARNN) using Genetic Algorithms (GA), called 2pGA, has been developed and shown to be effective in evolving neural networks for the control and stabilization of both linear and nonlinear plants, and for the optimal control of a nonlinear regulator.

  11. Toward an Evolved Concept of Landrace.

    PubMed

    Casañas, Francesc; Simó, Joan; Casals, Joan; Prohens, Jaime

    2017-01-01

    The term "landrace" has generally been defined as a cultivated, genetically heterogeneous variety that has evolved in a certain ecogeographical area and is therefore adapted to the edaphic and climatic conditions and to its traditional management and uses. Despite being considered by many to be inalterable, landraces have been and are in a constant state of evolution as a result of natural and artificial selection. Many landraces have disappeared from cultivation but are preserved in gene banks. Using modern selection and breeding technology tools to shape these preserved landraces together with the ones that are still cultivated is a further step in their evolution in order to preserve their agricultural significance. Adapting historical landraces to present agricultural conditions using cutting-edge breeding technology represents a challenging opportunity to use them in a modern sustainable agriculture, as an immediate return on the investment is highly unlikely. Consequently, we propose a more inclusive definition of landraces, namely that they consist of cultivated varieties that have evolved and may continue evolving, using conventional or modern breeding techniques, in traditional or new agricultural environments within a defined ecogeographical area and under the influence of the local human culture. This includes adaptation of landraces to new management systems and the unconscious or conscious selection made by farmers or breeders using available technology. In this respect, a mixed selection system might be established in which farmers and other social agents develop evolved landraces from the variability generated by public entities.

  13. Thermal and Evolved-Gas Analyzer Illustration

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This is a computer-aided drawing of the Thermal and Evolved-Gas Analyzer, or TEGA, on NASA's Phoenix Mars Lander.

    The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  14. The Evolving Office of the Registrar

    ERIC Educational Resources Information Center

    Pace, Harold L.

    2011-01-01

    A healthy registrar's office will continue to evolve as it considers student, faculty, and institutional needs; staff talents and expectations; technological opportunities; economic realities; space issues; work environments; and where the strategic plan is taking the institution in support of the mission. Several recognized leaders in the field…

  15. A Course Evolves-Physical Anthropology.

    ERIC Educational Resources Information Center

    O'Neil, Dennis

    2001-01-01

    Describes the development of an online physical anthropology course at Palomar College (California) that evolved from online tutorials. Discusses the ability to update materials on the Web more quickly than in traditional textbooks; creating Web pages that are readable by most Web browsers; test security issues; and clarifying ownership of online…

  16. Reliability Analysis Model

    NASA Technical Reports Server (NTRS)

    1970-01-01

    RAM program determines probability of success for one or more given objectives in any complex system. Program includes failure mode and effects, criticality and reliability analyses, and some aspects of operations, safety, flight technology, systems design engineering, and configuration analyses.

  17. The rating reliability calculator

    PubMed Central

    Solomon, David J

    2004-01-01

Background Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e., every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, and the program will upload them to the server for calculating the reliability and other statistics describing the ratings. Results When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion This simple web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to provide complete rating data. I would welcome other researchers revising and enhancing the program. PMID:15117416
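The Spearman-Brown prophecy formula used in the final step predicts the reliability of an average of k ratings from the single-rating reliability r. A minimal sketch (illustrative values, not the utility's PHP source):

```python
def spearman_brown(r, k):
    """Predicted reliability of the average of k ratings, given the
    single-rating reliability r (Spearman-Brown prophecy formula):
    r_k = k*r / (1 + (k-1)*r)."""
    return k * r / (1 + (k - 1) * r)

# A single-judge reliability of 0.50 rises to 0.75 when averaging
# the ratings of 3 judges.
print(round(spearman_brown(0.50, 3), 2))  # 0.75
```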

  18. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Wilson, Larry W.

    1989-01-01

The long-term goal of this research is to identify or create a model for use in analyzing the reliability of flight control software. The immediate tasks addressed are the creation of data useful to the study of software reliability and the production of results pertinent to software reliability through the analysis of existing reliability models and data. The completed data creation portion of this research consists of a Generic Checkout System (GCS) design document created in cooperation with NASA and Research Triangle Institute (RTI) experimenters. This will lead to design and code reviews, with the resulting product being one of the versions used in the Terminal Descent Experiment being conducted by the Systems Validations Methods Branch (SVMB) of NASA/Langley. An appended paper details an investigation of the Jelinski-Moranda and Geometric models for software reliability. The models were given data from a process that they have correctly simulated and asked to make predictions about the reliability of that process. It was found that either model will usually fail to make good predictions. These problems were attributed to randomness in the data, and replication of data was recommended.
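The Jelinski-Moranda model examined in the appended paper assumes the program's hazard rate is proportional to the number of faults remaining: before the i-th failure, z_i = φ(N − i + 1). A minimal sketch with assumed illustrative parameters (N and φ are not data from the study):

```python
def jm_hazard(N, phi, i):
    """Jelinski-Moranda hazard before the i-th failure (1-indexed):
    proportional to the number of faults still in the program."""
    return phi * (N - i + 1)

def jm_expected_interfailure_times(N, phi, k):
    """Expected times between the first k failures. The hazard is
    constant between failures, so the mean inter-failure time is
    the reciprocal of the hazard."""
    return [1.0 / jm_hazard(N, phi, i) for i in range(1, k + 1)]

# With N=10 assumed faults and phi=0.01, the expected gap between
# failures grows as faults are found and removed.
times = jm_expected_interfailure_times(10, 0.01, 3)
print([round(t, 1) for t in times])  # [10.0, 11.1, 12.5]
```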

  19. Metrology automation reliability

    NASA Astrophysics Data System (ADS)

    Chain, Elizabeth E.

    1996-09-01

At Motorola's MOS-12 facility, automated measurements on 200-mm diameter wafers proceed in a hands-off 'load-and-go' mode requiring only wafer loading, measurement recipe loading, and a 'run' command for processing. Upon completion of all sample measurements, the data is uploaded to the factory's data collection software system via a SECS II interface, eliminating the requirement of manual data entry. The scope of in-line measurement automation has been extended to the entire metrology scheme, from job file generation to measurement and data collection. Data analysis and comparison to part specification limits is also carried out automatically. Successful integration of automated metrology into the factory measurement system requires that automated functions, such as autofocus and pattern recognition algorithms, display a high degree of reliability. In the 24-hour factory, reliability data can be collected automatically on every part measured. This reliability data is then uploaded to the factory data collection software system at the same time as the measurement data. Analysis of the metrology reliability data permits improvements to be made as needed, and provides an accurate accounting of automation reliability. This reliability data has so far been collected for the CD-SEM (critical dimension scanning electron microscope) metrology tool, and examples are presented. This analysis method can be applied to such automated in-line measurements as CD, overlay, particle and film thickness measurements.

  20. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by exploiting the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer and fluid flow disciplines.

  1. Stable unstable reliability theory.

    PubMed

    Thomas, Hoben; Lohaus, Arnold; Domsch, Holger

    2012-05-01

Classical reliability theory assumes that individuals have identical true scores on both testing occasions, a condition described as stable. If some individuals' true scores differ across testing occasions, described as unstable, the estimated reliability can be misleading. A model called stable unstable reliability theory (SURT) frames stability or instability as an empirically testable question. SURT assumes a mixed population of stable and unstable individuals in unknown proportions, with w(i) the probability that individual i is stable. w(i) becomes individual i's test-score weight, which is used to form a weighted correlation coefficient r(w), the reliability under SURT. If all w(i) = 1 then r(w) is the classical reliability coefficient; thus classical theory is a special case of SURT. Typically r(w) is larger than the conventional reliability r, and confidence intervals on true scores are typically shorter than conventional intervals. r(w) is computed with routines in a publicly available R package. ©2011 The British Psychological Society.
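The weighted correlation coefficient r(w) can be illustrated with a standard weighted Pearson correlation in which each individual's pair of scores contributes with weight w(i). This is a sketch of the general idea only; the exact SURT estimator in the cited R package may differ in detail:

```python
def weighted_correlation(x, y, w):
    """Weighted Pearson correlation: pair (x_i, y_i) contributes with
    weight w_i (here, the probability that individual i is stable)."""
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    cov = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y)) / sw
    vx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x)) / sw
    vy = sum(wi * (yi - my) ** 2 for wi, yi in zip(w, y)) / sw
    return cov / (vx * vy) ** 0.5

# With all weights equal to 1 this reduces to the ordinary Pearson
# correlation, mirroring how classical theory is a special case of SURT.
x = [1.0, 2.0, 3.0, 4.0]
y = [1.1, 2.0, 2.9, 4.2]
print(round(weighted_correlation(x, y, [1, 1, 1, 1]), 3))  # 0.995
```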

  2. Reliable aluminum contact formation by electrostatic bonding

    NASA Astrophysics Data System (ADS)

    Kárpáti, T.; Pap, A. E.; Radnóczi, Gy; Beke, B.; Bársony, I.; Fürjes, P.

    2015-07-01

The paper presents a detailed study of a reliable method developed for aluminum fusion wafer bonding, assisted by the electrostatic force evolving during the anodic bonding process. The IC-compatible procedure described allows the parallel formation of electrical and mechanical contacts, facilitating reliable packaging of electromechanical systems with backside electrical contacts. This fusion bonding method supports the fabrication of complex microelectromechanical systems (MEMS) and micro-opto-electromechanical systems (MOEMS) structures with enhanced temperature stability, which is crucial in mechanical sensor applications such as pressure or force sensors. Due to the applied electrical potential of -1000 V, the Al metal layers are compressed by electrostatic force, and at the bonding temperature of 450 °C intermetallic diffusion causes aluminum ions to migrate between the metal layers.

  3. Mission Reliability Estimation for Repairable Robot Teams

    NASA Technical Reports Server (NTRS)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the
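The trade-off between spare robots and spare components can be illustrated with textbook series/parallel reliability combination. This sketch is not the paper's module-combination method, and the module reliabilities below are assumed values:

```python
def series(reliabilities):
    """A series system works only if every module works, so its
    reliability is the product of the module reliabilities."""
    p = 1.0
    for r in reliabilities:
        p *= r
    return p

def parallel(r, n):
    """n redundant copies of a module: the group fails only if all
    n copies fail."""
    return 1.0 - (1.0 - r) ** n

# A robot needing all three modules to work, with the least reliable
# module duplicated as a spare:
baseline = series([0.99, 0.95, 0.90])
with_spare = series([0.99, 0.95, parallel(0.90, 2)])
print(round(baseline, 3), round(with_spare, 3))  # 0.846 0.931
```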

  4. Scaled CMOS Technology Reliability Users Guide

    NASA Technical Reports Server (NTRS)

    White, Mark

    2010-01-01

The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect to the high reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is
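The two soft-error populations are distinguished by their Weibull slope: beta = 1 gives a constant hazard (randomly distributed weak bits), while beta > 1 gives an increasing failure rate (the main wear-out population). A sketch of the Weibull hazard function with assumed, illustrative parameters:

```python
def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1),
    with shape beta (the Weibull slope) and scale eta."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# beta = 1: hazard is constant in time (random weak-bit population);
# beta = 2: hazard grows linearly in time (wear-out population).
for t in (1.0, 10.0, 100.0):
    print(t, weibull_hazard(t, 1.0, 1000.0), weibull_hazard(t, 2.0, 1000.0))
```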

  5. Dust obscuration by an evolving galaxy population

    NASA Technical Reports Server (NTRS)

    Najita, Joan; Silk, Joseph; Wachter, Kenneth W.

    1990-01-01

    The effect of an evolving luminosity function (LF) on the ability of foreground galaxies to obscure background sources is discussed, using the Press-Schechter/CDM standard evolving LF model. Galaxies are modeled as simplified versions of local spirals, and Poisson statistics are used to estimate the fraction of sky covered by intervening dusty galaxies and the mean optical depths due to these galaxies. The results are compared to those obtained for a nonevolving luminosity function in a low-density universe. It is found that evolution of the galaxy LF does not allow the quasar dust obscuration hypothesis to be sustained for dust disks with plausible sizes. Even in a low-density universe, where evolution at z < 10 is unimportant, large disk radii are needed to achieve the desired obscuring effect. The mean fraction of sky covered is presented as a function of redshift z, along with illustrative diagrams.
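    The Poisson covering argument can be sketched numerically. All densities, disk radii, and path lengths below are illustrative assumptions, not the paper's values:

```python
import math

def covered_fraction(mean_intersections):
    """Poisson probability that a sight line hits at least one
    foreground disk, given the mean number of intersections."""
    return 1.0 - math.exp(-mean_intersections)

def mean_intersections(n_per_mpc3, disk_radius_mpc, path_mpc):
    """Mean intersections = number density * geometric cross-section * path length.
    (Illustrative only; a real calculation integrates over the LF and redshift.)"""
    sigma = math.pi * disk_radius_mpc ** 2
    return n_per_mpc3 * sigma * path_mpc

N = mean_intersections(0.01, 0.03, 3000.0)
f_sky = covered_fraction(N)
```

    Because the covered fraction saturates as 1 - exp(-N), only large disk radii (large cross-sections) can push the obscured fraction high enough to matter, consistent with the conclusion above.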

  6. The management of evolving bronchopulmonary dysplasia.

    PubMed

    Schulzke, Sven M; Pillow, J Jane

    2010-09-01

    Bronchopulmonary dysplasia (BPD) is associated with increased mortality and significant long-term cardiorespiratory and neurodevelopmental sequelae. Treatment of evolving BPD in the neonatal intensive care unit (NICU) is challenging due to the complex interplay of contributing risk factors, which include preterm birth per se, supplemental oxygen, positive pressure ventilation, patent ductus arteriosus, and pre- and postnatal infection. Management of evolving BPD requires a multimodal approach including adequate nutrition, careful fluid management, effective and safe pharmacotherapy, and respiratory support aiming at minimal lung injury. Among pharmacological interventions, caffeine has the best risk-benefit profile. Systemic postnatal corticosteroids should be reserved for ventilated infants at highest risk of BPD who cannot be weaned from the ventilator. Several ongoing randomised trials are evaluating optimal oxygen saturation targets in preterm infants. The most beneficial respiratory support strategy to minimise lung injury remains unclear and requires further investigation.

  7. Evolved gas analysis of secondary organic aerosols

    SciTech Connect

    Grosjean, D.; Williams, E.L., II; Grosjean, E.; Novakov, T.

    1994-11-01

    Secondary organic aerosols have been characterized by evolved gas analysis (EGA). Hydrocarbons selected as aerosol precursors were representative of anthropogenic emissions (cyclohexene, cyclopentene, 1-decene and 1-dodecene, n-dodecane, o-xylene, and 1,3,5-trimethylbenzene) and of biogenic emissions (the terpenes α-pinene, β-pinene and d-limonene and the sesquiterpene trans-caryophyllene). Also analyzed by EGA were samples of secondary, primary (highway tunnel), and ambient (urban) aerosols before and after exposure to ozone and other photochemical oxidants. The major features of the EGA thermograms (amount of CO₂ evolved as a function of temperature) are described. The usefulness and limitations of EGA data for source apportionment of atmospheric particulate carbon are briefly discussed. 28 refs., 7 figs., 4 tabs.

  8. [Families and psychiatry: models and evolving links].

    PubMed

    Frankhauser, Adeline

    2016-01-01

    The role of the families of persons with severe psychiatric disorders (schizophrenia in particular) in the care of their relatives has recently evolved: once seen as pathogenic and kept at a distance, the family is now recognised by professionals as a partner in the care process. The links between families and psychiatric institutions nevertheless remain complex, marked by ambivalence and paradoxes.

  9. Design Space Issues for Intrinsic Evolvable Hardware

    NASA Technical Reports Server (NTRS)

    Hereford, James; Gwaltney, David

    2004-01-01

    This paper discusses the problem of increased programming time for intrinsic evolvable hardware (EHW) as the complexity of the circuit grows. As the circuit becomes more complex, more components are required and a longer programming string, L, is needed. We develop equations for the size of the population, n, and the number of generations required for the population to converge, based on L. Our analytical results show that even though the design search space grows as 2^L (assuming a binary programming string), the number of circuit evaluations, n*n_gen, grows only slightly less than O(L). This makes evolvable techniques a good tool for exploring large design spaces. The major hurdle for intrinsic EHW is the evaluation time for each candidate circuit. The evaluation involves downloading the bit string to the device, updating the device configuration, measuring the output, and then transferring the output data to the control processor. Each of these steps must be done for each member of the population. The processing time of the computer becomes negligible, since the selection/crossover/mutation steps are done only once per generation. Evaluation time presently limits intrinsic evolvable hardware techniques to designing only small or medium-sized circuits. To evolve large or complicated circuits, several researchers have proposed using hierarchical design or reuse techniques in which submodules are combined to form complex circuits. However, these practical approaches limit the search space of available designs and preclude exploiting parasitic coupling or other effects within the programmable device. They also raise the question of why intrinsic EHW techniques do not easily apply to large design spaces, since the analytical results show only O(L) complexity growth.
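    The gap between exhaustive search and evolutionary search can be illustrated with a toy evaluation count. The sqrt(L) scalings for population size and generations are assumptions for illustration, not the paper's derived equations:

```python
import math

def search_space(L):
    """Size of the design space for a binary programming string of length L."""
    return 2 ** L

def evaluations(L):
    """Total circuit evaluations n * n_gen under an assumed sqrt(L) scaling
    for both population size and generations to converge."""
    n = 10 * math.ceil(math.sqrt(L))      # population size (assumed)
    n_gen = 20 * math.ceil(math.sqrt(L))  # generations (assumed)
    return n * n_gen

# Evaluations grow roughly linearly in L while the space grows as 2^L.
for L in (16, 32, 64):
    assert evaluations(L) < search_space(L)
```

    Even with generous constants, the evaluation count stays trivially small next to 2^L, which is why the per-evaluation hardware time, not the search itself, becomes the bottleneck.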

  10. The evolving definition of systemic arterial hypertension.

    PubMed

    Ram, C Venkata S; Giles, Thomas D

    2010-05-01

    Systemic hypertension is an important risk factor for premature cardiovascular disease. Hypertension also contributes to excessive morbidity and mortality. Whereas excellent therapeutic options are available to treat hypertension, there is an unsettled issue about the very definition of hypertension. At what level of blood pressure should we treat hypertension? Does the definition of hypertension change in the presence of co-morbid conditions? This article covers in detail the evolving concepts in the diagnosis and management of hypertension.

  11. Quantum games on evolving random networks

    NASA Astrophysics Data System (ADS)

    Pawela, Łukasz

    2016-09-01

    We study the advantages of quantum strategies in evolutionary social dilemmas on evolving random networks. We focus on three two-player games: the prisoner's dilemma, snowdrift, and stag-hunt games. The results show the benefits of quantum strategies for the prisoner's dilemma game. For the other two games, we obtain regions of parameters where the quantum strategies dominate, as well as regions where the classical strategies coexist.
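    For reference, the classical prisoner's dilemma structure underlying such studies can be written down directly. The payoffs are the standard textbook values (T > R > P > S), not values from the paper:

```python
# Row-player payoffs: T (temptation), R (reward), P (punishment), S (sucker).
T, R, P, S = 5, 3, 1, 0
payoff = {
    ("C", "C"): (R, R),
    ("C", "D"): (S, T),
    ("D", "C"): (T, S),
    ("D", "D"): (P, P),
}

def best_response(opponent):
    """Defection dominates: it maximizes the row payoff against either move."""
    return max("CD", key=lambda m: payoff[(m, opponent)][0])

# The dilemma: defection is always the best response, yet mutual
# cooperation (R, R) beats mutual defection (P, P).
assert best_response("C") == "D" and best_response("D") == "D"
```

    Quantum strategy extensions enlarge this move set with superposed and entangled strategies, which is what allows the dilemma to be escaped in parts of parameter space.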

  12. Nursing administration research: an evolving science.

    PubMed

    Murphy, Lyn Stankiewicz; Scott, Elaine S; Warshawsky, Nora E

    2014-12-01

    The nature and focus of nursing administrative research have evolved over time. Recently, the research agenda has primarily reflected the national health policy agenda. Although nursing research has traditionally been dominated by clinical interests, nursing administrative research has historically addressed the interface of reimbursement, quality, and care delivery systems. This article traces the evolution of nursing administrative research to answer questions relevant to scope, practice, and policy and suggests future directions.

  13. Continuous Evaluation of Evolving Behavioral Intervention Technologies

    PubMed Central

    Mohr, David C.; Cheung, Ken; Schueller, Stephen M.; Brown, C. Hendricks; Duan, Naihua

    2013-01-01

    Behavioral intervention technologies (BITs) are web-based and mobile interventions intended to support patients and consumers in changing behaviors related to health, mental health, and well-being. BITs are provided to patients and consumers in clinical care settings and commercial marketplaces, frequently with little or no evaluation. Current evaluation methods, including RCTs and implementation studies, can require years to validate an intervention. This timeline is fundamentally incompatible with the BIT environment, where technology advancement and changes in consumer expectations occur quickly, necessitating rapidly evolving interventions. However, BITs can routinely and iteratively collect data in a planned and strategic manner and generate evidence through systematic prospective analyses, thereby creating a system that can “learn.” A methodologic framework, Continuous Evaluation of Evolving Behavioral Intervention Technologies (CEEBIT), is proposed that can support the evaluation of multiple BITs or evolving versions, eliminating those that demonstrate poorer outcomes, while allowing new BITs to be entered at any time. CEEBIT could be used to ensure the effectiveness of BITs provided through deployment platforms in clinical care organizations or BIT marketplaces. The features of CEEBIT are described, including criteria for the determination of inferiority, determination of BIT inclusion, methods of assigning consumers to BITs, definition of outcomes, and evaluation of the usefulness of the system. CEEBIT offers the potential to collapse initial evaluation and postmarketing surveillance, providing ongoing assurance of safety and efficacy to patients and consumers, payers, and policymakers. PMID:24050429

  14. Transistor Level Circuit Experiments using Evolvable Hardware

    NASA Technical Reports Server (NTRS)

    Stoica, A.; Zebulum, R. S.; Keymeulen, D.; Ferguson, M. I.; Daud, Taher; Thakoor, A.

    2005-01-01

    The Jet Propulsion Laboratory (JPL) performs research in fault tolerant, long life, and space survivable electronics for the National Aeronautics and Space Administration (NASA). With that focus, JPL has been involved in Evolvable Hardware (EHW) technology research for the past several years. We have advanced the technology not only by simulation and evolution experiments, but also by designing, fabricating, and evolving a variety of transistor-based analog and digital circuits at the chip level. EHW refers to self-configuration of electronic hardware by evolutionary/genetic search mechanisms, thereby maintaining existing functionality in the presence of degradations due to aging, temperature, and radiation. In addition, EHW has the capability to reconfigure itself for new functionality when required for mission changes or encountered opportunities. Evolution experiments are performed using a genetic algorithm running on a DSP as the reconfiguration mechanism, controlling the evolvable hardware mounted on a self-contained circuit board. Rapid reconfiguration allows convergence to circuit solutions on the order of seconds. The paper illustrates hardware evolution results for electronic circuits and their ability to perform at temperatures up to 230 °C and under radiation doses of up to 250 krad.

  15. Continuous evaluation of evolving behavioral intervention technologies.

    PubMed

    Mohr, David C; Cheung, Ken; Schueller, Stephen M; Hendricks Brown, C; Duan, Naihua

    2013-10-01

    Behavioral intervention technologies (BITs) are web-based and mobile interventions intended to support patients and consumers in changing behaviors related to health, mental health, and well-being. BITs are provided to patients and consumers in clinical care settings and commercial marketplaces, frequently with little or no evaluation. Current evaluation methods, including RCTs and implementation studies, can require years to validate an intervention. This timeline is fundamentally incompatible with the BIT environment, where technology advancement and changes in consumer expectations occur quickly, necessitating rapidly evolving interventions. However, BITs can routinely and iteratively collect data in a planned and strategic manner and generate evidence through systematic prospective analyses, thereby creating a system that can "learn." A methodologic framework, Continuous Evaluation of Evolving Behavioral Intervention Technologies (CEEBIT), is proposed that can support the evaluation of multiple BITs or evolving versions, eliminating those that demonstrate poorer outcomes, while allowing new BITs to be entered at any time. CEEBIT could be used to ensure the effectiveness of BITs provided through deployment platforms in clinical care organizations or BIT marketplaces. The features of CEEBIT are described, including criteria for the determination of inferiority, determination of BIT inclusion, methods of assigning consumers to BITs, definition of outcomes, and evaluation of the usefulness of the system. CEEBIT offers the potential to collapse initial evaluation and postmarketing surveillance, providing ongoing assurance of safety and efficacy to patients and consumers, payers, and policymakers.

  16. Evolving specialization of the arthropod nervous system.

    PubMed

    Jarvis, Erin; Bruce, Heather S; Patel, Nipam H

    2012-06-26

    The diverse array of body plans possessed by arthropods is created by generating variations upon a design of repeated segments formed during development, using a relatively small "toolbox" of conserved patterning genes. These attributes make the arthropod body plan a valuable model for elucidating how changes in development create diversity of form. As increasingly specialized segments and appendages evolved in arthropods, the nervous systems of these animals also evolved to control the function of these structures. Although there is a remarkable degree of conservation in neural development both between individual segments in any given species and between the nervous systems of different arthropod groups, the differences that do exist are informative for inferring general principles about the holistic evolution of body plans. This review describes developmental processes controlling neural segmentation and regionalization, highlighting segmentation mechanisms that create both ectodermal and neural segments, as well as recent studies of the role of Hox genes in generating regional specification within the central nervous system. We argue that this system generates a modular design that allows the nervous system to evolve in concert with the body segments and their associated appendages. This information will be useful in future studies of macroevolutionary changes in arthropod body plans, especially in understanding how these transformations can be made in a way that retains the function of appendages during evolutionary transitions in morphology.

  17. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  18. Crystalline-silicon reliability lessons for thin-film modules

    NASA Technical Reports Server (NTRS)

    Ross, Ronald G., Jr.

    1985-01-01

    Key reliability and engineering lessons learned from the 10-year history of the Jet Propulsion Laboratory's Flat-Plate Solar Array Project are presented and analyzed. Particular emphasis is placed on lessons applicable to the evolving new thin-film cell and module technologies and the organizations involved with these technologies. The user-specific demand for reliability is a strong function of the application, its location, and its expected duration. Lessons relative to effective means of specifying reliability are described, and commonly used test requirements are assessed from the standpoint of which are the most troublesome to pass, and which correlate best with field experience. Module design lessons are also summarized, including the significance of the most frequently encountered failure mechanisms and the role of encapsulant and cell reliability in determining module reliability. Lessons pertaining to research, design, and test approaches include the historical role and usefulness of qualification tests and field tests.

  19. Evolving networks: using past structure to predict the future

    NASA Astrophysics Data System (ADS)

    Shang, Ke-ke; Yan, Wei-sheng; Small, Michael

    2016-08-01

    Many previous studies on link prediction have focused on using common neighbors to predict the existence of links between pairs of nodes. More broadly, research into the structural properties of evolving temporal networks and temporal link prediction methods has recently attracted increasing attention. In this study, for the first time, we examine the use of links between a pair of nodes to predict their common neighbors, and we analyze the relationship between weight and structure in static networks, evolving networks, and the corresponding randomized networks. We propose new unweighted and weighted prediction methods and use six kinds of real networks to test our algorithms. In unweighted networks, we find that if a pair of nodes are connected in the current network, they have a higher probability of connecting to common nodes in both the current and future networks, and this probability decreases as the number of neighbors increases. Furthermore, we find that the original networks have particular structural and statistical characteristics that benefit link prediction. In weighted networks, the prediction performance for networks dominated by human factors decreases with decreasing weight and is in general better in static networks. We also find that geographical position and link weight both have a significant influence on the transport network, and that the evolving financial network has the lowest predictability. In addition, the structure of non-social networks is more robust than that of social networks, and the structure of engineering networks has both the best predictability and the best robustness.
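    The common-neighbor baseline mentioned above is simple to state. A minimal sketch on a toy undirected graph (the graph itself is made up; the study's own methods go further, using existing links to predict common neighbors):

```python
# Adjacency sets for a small undirected example graph.
adj = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b", "d"},
    "d": {"a", "c"},
}

def common_neighbors(u, v):
    """Nodes adjacent to both u and v."""
    return adj[u] & adj[v]

def cn_score(u, v):
    """Common-neighbor link-prediction score: more shared neighbors
    suggests a higher chance of a current or future link."""
    return len(common_neighbors(u, v))

score_bd = cn_score("b", "d")  # b and d share neighbors a and c -> score 2
```

    Candidate node pairs are then ranked by this score, and the top-ranked non-edges are predicted as future links.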

  20. Orbiter Autoland reliability analysis

    NASA Technical Reports Server (NTRS)

    Welch, D. Phillip

    1993-01-01

    The Space Shuttle Orbiter is the only space reentry vehicle in which the crew is seated upright. This position presents some physiological effects requiring countermeasures to prevent a crewmember from becoming incapacitated, and it introduces a potential need for automated vehicle landing capability. Autoland is a primary procedure that was identified as a requirement for landing following an extended-duration Orbiter mission. This report documents the results of the reliability analysis performed on the hardware required for an automated landing. A reliability block diagram was used to evaluate system reliability. The analysis considers the manual and automated landing modes currently available on the Orbiter. (Autoland is presently a backup system only.) Results of this study indicate a +/- 36 percent probability of successfully extending a nominal mission to 30 days. Enough variations were evaluated to verify that the reliability could be altered with mission planning and procedures. If the crew is modeled as being fully capable after 30 days, the probability of a successful manual landing is comparable to that of Autoland, because much of the hardware is used for both landing modes. The analysis indicates that the reliability of the manual mode is limited by the hardware and depends greatly on crew capability. Crew capability for a successful landing after 30 days has not yet been determined.
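    A reliability block diagram reduces to products of series and parallel (redundant) blocks. A minimal sketch with hypothetical element reliabilities, not the Orbiter analysis's numbers:

```python
def series(*rs):
    """A series block works only if every element works: R = product of R_i."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(*rs):
    """A redundant block fails only if every element fails:
    R = 1 - product of (1 - R_i)."""
    out = 1.0
    for r in rs:
        out *= (1.0 - r)
    return 1.0 - out

# Hypothetical example: two redundant 0.95 sensors in series with a 0.99 actuator.
system_r = series(parallel(0.95, 0.95), 0.99)
```

    Nesting these two operators over the block diagram's topology yields the overall system reliability; redundancy raises the sensor block from 0.95 to 0.9975 before the series penalty is applied.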

  1. Photovoltaic module reliability workshop

    SciTech Connect

    Mrig, L.

    1990-01-01

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986--1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties on commercial modules of as long as 12 years. However, substantial research and testing are still required to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience. The papers presented here reflect this effort.

  2. Photovoltaic module reliability workshop

    NASA Astrophysics Data System (ADS)

    Mrig, L.

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986 to 1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties on commercial modules of as long as 12 years. However, substantial research and testing are still required to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the U.S., PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience. The papers presented here reflect this effort.

  3. Ultra reliability at NASA

    NASA Technical Reports Server (NTRS)

    Shapiro, Andrew A.

    2006-01-01

    Ultra reliable systems are critical to NASA, particularly as consideration is being given to extended lunar missions and manned missions to Mars. NASA has formulated a program designed to improve the reliability of NASA systems; the long-term goal is to ultimately improve NASA systems by an order of magnitude. The approach outlined in this presentation involves the steps used in developing a strategic plan to achieve that long-term objective. Consideration is given to: complex systems; hardware (including aircraft, aerospace craft, and launch vehicles); software; human interactions; long-life missions; infrastructure development; and cross-cutting technologies. Several NASA-wide workshops have been held, identifying issues for reliability improvement and providing mitigation strategies for these issues. In addition to representation from all of the NASA centers, experts from government (NASA and non-NASA), universities, and industry participated. Highlights of a strategic plan being developed from the results of these workshops are presented.

  4. eVolver: an optimization engine for evolving protein sequences to stabilize the respective structures.

    PubMed

    Brylinski, Michal

    2013-07-31

    Many structural bioinformatics approaches employ sequence profile-based threading techniques. To improve fold recognition rates, homology searching may include artificially evolved amino acid sequences, which have been demonstrated to enhance the sensitivity of protein threading in targeting midnight-zone templates. We describe implementation details of eVolver, an optimization algorithm that evolves protein sequences to stabilize the respective structures using a variety of potentials compatible with those commonly used in protein threading. In a case study focusing on the LARG PDZ domain, we show that artificially evolved sequences are quite capable of recognizing the correct protein structures using standard sequence profile-based fold recognition. Computationally designed protein sequences can be incorporated into existing sequence profile-based threading approaches to increase their sensitivity. They also provide a desired linkage between protein structure and function in in silico experiments relating to, e.g., the completeness of protein structure space, the origin of folds, and the protein universe. eVolver is freely available as a user-friendly web server and a well-documented stand-alone software distribution at http://www.brylinski.org/evolver.

  5. High reliability organizations (HROs).

    PubMed

    Sutcliffe, Kathleen M

    2011-06-01

    Academic and professional disciplines, such as organisation and management theory, psychology, sociology and engineering, have, for years, grappled with the multidisciplinary issues of safety and accident prevention. However, these ideas are just beginning to enrich research on safety in medicine. This article examines a domain of research on system safety - the High Reliability Organization (HRO) paradigm. HROs operate in hazardous conditions, but have fewer than their fair share of adverse events. HROs are committed to safety at the highest level and adopt a special approach to its pursuit. The attributes and operating dynamics of the best HROs provide a template on which to better understand how safe and reliable performance can be achieved under trying conditions, and this may be useful to researchers and caregivers who seek to improve safety and reliability in health care.

  6. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  7. Salt tolerance evolves more frequently in C4 grass lineages.

    PubMed

    Bromham, L; Bennett, T H

    2014-03-01

    Salt tolerance has evolved many times in the grass family, and yet few cereal crops are salt tolerant. Why has it been so difficult to develop crops tolerant of saline soils when salt tolerance has evolved so frequently in nature? One possible explanation is that some grass lineages have traits that predispose them to developing salt tolerance and that without these background traits, salt tolerance is harder to achieve. One candidate background trait is photosynthetic pathway, which has also been remarkably labile in grasses. At least 22 independent origins of the C4 photosynthetic pathway have been suggested to occur within the grass family. It is possible that the evolution of C4 photosynthesis aids exploitation of saline environments, because it reduces transpiration, increases water-use efficiency and limits the uptake of toxic ions. But the observed link between the evolution of C4 photosynthesis and salt tolerance could simply be due to biases in phylogenetic distribution of halophytes or C4 species. Here, we use a phylogenetic analysis to investigate the association between photosynthetic pathway and salt tolerance in the grass family Poaceae. We find that salt tolerance is significantly more likely to occur in lineages with C4 photosynthesis than in C3 lineages. We discuss the possible links between C4 photosynthesis and salt tolerance and consider the limitations of inferring the direction of causality of this relationship.

  8. Where the Wild Things Are: The Evolving Iconography of Rural Fauna

    ERIC Educational Resources Information Center

    Buller, Henry

    2004-01-01

    This paper explores the changing relationship between "nature" and rurality through an examination of the shifting iconography of animals, and particularly "wild" animals, in a rural setting. Drawing upon a set of examples, the paper argues that the faunistic icons of rural areas are evolving as alternative conceptions of the countryside, of…

  10. Gearbox Reliability Collaborative Update (Presentation)

    SciTech Connect

    Sheng, S.; Keller, J.; Glinsky, C.

    2013-10-01

    This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.

  11. Reliability Engineering Handbook

    DTIC Science & Technology

    1964-06-01

    Excerpt (OCR): Figure 6-4. TWT Reliability Function, Showing the 90% Confidence Interval (operating time in hours). The lower one-sided 90% confidence limit on θ is (.704)(530) = 373 hours; a companion statement gives 90% confidence that θ is greater than 977 hours, or that θ lies between these two bounds. Contents: 6-2-2 Measurement of Reliability (Application of Confidence Limits); 6-2-3 Procedural Steps.
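    The (.704)(530) = 373 h figure in the excerpt is the standard one-sided lower confidence limit on an exponential MTBF from a failure-truncated test. A sketch using tabulated multipliers; the r = 5 and r = 20 entries are computed values of 2r/χ²(0.90; 2r) and are assumed, not taken from the handbook:

```python
# 90% one-sided lower-limit multipliers on MTBF for a failure-truncated
# exponential test with r observed failures:
#   theta_L = (2r / chi2_{0.90; 2r}) * theta_hat
# The r = 10 entry reproduces the excerpt's (.704)(530) = 373 h example.
LOWER_90 = {5: 0.625, 10: 0.704, 20: 0.772}

def mtbf_lower_limit(theta_hat, r):
    """Lower one-sided 90% confidence limit on the true MTBF theta,
    given the point estimate theta_hat and r failures."""
    return LOWER_90[r] * theta_hat

theta_l = mtbf_lower_limit(530.0, 10)  # about 373 hours
```

    As r grows the multiplier approaches 1: more observed failures tighten the confidence bound around the point estimate.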

  12. The organization and control of an evolving interdependent population

    PubMed Central

    Vural, Dervis C.; Isakov, Alexander; Mahadevan, L.

    2015-01-01

    Starting with Darwin, biologists have asked how populations evolve from a low fitness state that is evolutionarily stable to a high fitness state that is not. Specifically of interest is the emergence of cooperation and multicellularity where the fitness of individuals often appears in conflict with that of the population. Theories of social evolution and evolutionary game theory have produced a number of fruitful results employing two-state two-body frameworks. In this study, we depart from this tradition and instead consider a multi-player, multi-state evolutionary game, in which the fitness of an agent is determined by its relationship to an arbitrary number of other agents. We show that populations organize themselves in one of four distinct phases of interdependence depending on one parameter, selection strength. Some of these phases involve the formation of specialized large-scale structures. We then describe how the evolution of interdependence can be manipulated through various external perturbations. PMID:26040593

  13. Wild Origins: The Evolving Nature of Animal Behavior

    NASA Astrophysics Data System (ADS)

    Flores, Ifigenia

    For billions of years, evolution has been the driving force behind the incredible range of biodiversity on our planet. Wild Origins is a concept plan for an exhibition at the National Zoo that uses case studies of animal behavior to explain the theory of evolution. Behaviors evolve, just as physical forms do. Understanding natural selection can help us interpret animal behavior and vice-versa. A living collection, digital media, interactives, fossils, and photographs will relay stories of social behavior, sex, navigation and migration, foraging, domestication, and relationships between different species. The informal learning opportunities visitors are offered at the zoo will create a connection with the exhibition's teaching points. Visitors will leave with an understanding and sense of wonder at the evolutionary view of life.

  14. Reliability Degradation Due to Stockpile Aging

    SciTech Connect

    Robinson, David G.

    1999-04-01

    The objective of this research is the investigation of alternative methods for characterizing the reliability of systems with time-dependent failure modes associated with stockpile aging. Reference to 'reliability degradation' has, unfortunately, come to be associated with all types of aging analyses: both deterministic and stochastic. In this research, in keeping with the true theoretical definition, reliability is defined as a probabilistic description of system performance as a function of time. Traditional reliability methods used to characterize stockpile reliability depend on the collection of a large number of samples or observations. Clearly, after the experiments have been performed and the data have been collected, critical performance problems can be identified. A major goal of this research is to identify existing methods and/or develop new mathematical techniques and computer analysis tools to anticipate stockpile problems before they become critical issues. One of the most popular methods for characterizing the reliability of components, particularly electronic components, assumes that failures occur in a completely random fashion, i.e. uniformly across time. This method is based primarily on the use of constant failure rates for the various elements that constitute the weapon system, i.e. the systems do not degrade while in storage. Experience has shown that predictions based upon this approach should be regarded with great skepticism, since the relationship between the life predicted and the observed life has been difficult to validate. In addition to this fundamental problem, the approach does not recognize that there are time-dependent material properties and variations associated with the manufacturing process and the operational environment. To appreciate the uncertainties in predicting system reliability, a number of alternative methods are explored in this report. All of the methods are very different from those currently used to assess stockpile reliability.
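
    The contrast the report draws, between a constant-failure-rate model and one with time-dependent (aging) failure modes, can be sketched numerically. The sketch below is illustrative only: the exponential and Weibull forms are standard reliability models, and every parameter value is invented, not taken from the stockpile analysis.

```python
import math

def reliability_constant(t, lam):
    """Constant failure rate: R(t) = exp(-lambda * t); no degradation in storage."""
    return math.exp(-lam * t)

def reliability_weibull(t, eta, beta):
    """Weibull model: R(t) = exp(-(t/eta)**beta); beta > 1 gives an
    increasing (aging) hazard rate."""
    return math.exp(-((t / eta) ** beta))

# Hypothetical parameters, chosen so both models agree at t = 10 years.
lam = 0.01                                  # failures per year (constant-rate assumption)
eta, beta = 10 / (0.1 ** (1 / 2.5)), 2.5    # Weibull scale and shape

for t in (5, 10, 20):
    print(t, round(reliability_constant(t, lam), 4),
          round(reliability_weibull(t, eta, beta), 4))
```

    With the parameters tuned to agree at t = 10 years, the two models diverge sharply afterwards, which is exactly the kind of gap between predicted and observed life the report warns about.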

  15. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection.

    PubMed

    Janković, Srdja; Ćirković, Milan M

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.

  16. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection

    NASA Astrophysics Data System (ADS)

    Janković, Srdja; Ćirković, Milan M.

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.

  17. Administration to innovation: the evolving management challenge in primary care.

    PubMed

    Laing, A; Marnoch, G; McKee, L; Joshi, R; Reid, J

    1997-01-01

    The concept of the primary health-care team involving an increasingly diverse range of health care professionals is widely recognized as central to the pursuit of a primary care-led health service in the UK. Although GPs are formally recognized as the team leaders, there is little by way of policy prescription as to how team roles and relationships should be developed, or evidence as to how their roles have in fact evolved. Thus the notion of the primary health-care team, while commonly employed, is in reality lacking definition, with the current contribution of practice managers to the operation of this team being poorly understood. Focusing on the career backgrounds of practice managers, their range of responsibilities, and their involvement in innovation in general practice, this paper presents a preliminary account of a chief scientist office-funded project examining the role being played by practice managers in primary health-care innovation. More specifically, utilizing data gained from the ongoing study, it contextualizes the role played by practice managers in the primary health-care team. By exploring the business environment surrounding NHS general practice, the research seeks to understand the evolving world of the practice manager. Drawing on questionnaire data, reinforced by qualitative data from the current interview phase, the paper describes the role played by practice managers in differing practice contexts. This facilitates a discussion of a set of ideal-type general practice organizational and managerial structures, and of the relationships and skills required by practice managers in each of these organizational types, with reference to data gathered to date in the research.

  18. Production and decay of evolving horizons

    NASA Astrophysics Data System (ADS)

    Nielsen, Alex B.; Visser, Matt

    2006-07-01

    We consider a simple physical model for an evolving horizon that is strongly interacting with its environment, exchanging arbitrarily large quantities of matter with its environment in the form of both infalling material and outgoing Hawking radiation. We permit fluxes of both lightlike and timelike particles to cross the horizon, and ask how the horizon grows and shrinks in response to such flows. We place a premium on providing a clear and straightforward exposition with simple formulae. To be able to handle such a highly dynamical situation in a simple manner we make one significant physical restriction—that of spherical symmetry—and two technical mathematical restrictions: (1) we choose to slice the spacetime in such a way that the spacetime foliations (and hence the horizons) are always spherically symmetric; (2) we adopt Painlevé-Gullstrand coordinates (which are well suited to the problem because they are nonsingular at the horizon) in order to simplify the relevant calculations. Of course, physics results are ultimately independent of the choice of coordinates, but this particular coordinate system yields a clean physical interpretation of the relevant physics. We find particularly simple forms for surface gravity, and for the first and second law of black hole thermodynamics, in this general evolving horizon situation. Furthermore, we relate our results to Hawking's apparent horizon, Ashtekar and co-worker's isolated and dynamical horizons, and Hayward's trapping horizon. The evolving black hole model discussed here will be of interest, both from an astrophysical viewpoint in terms of discussing growing black holes and from a purely theoretical viewpoint in discussing black hole evaporation via Hawking radiation.

  19. Risky prey behavior evolves in risky habitats

    PubMed Central

    Urban, Mark C.

    2007-01-01

    Longstanding theory in behavioral ecology predicts that prey should evolve decreased foraging rates under high predation threat. However, an alternative perspective suggests that growth into a size refuge from gape-limited predation and the future benefits of large size can outweigh the initial survival costs of intense foraging. Here, I evaluate the relative contributions of selection from a gape-limited predator (Ambystoma opacum) and spatial location to explanations of variation in foraging, growth, and survival in 10 populations of salamander larvae (Ambystoma maculatum). Salamander larvae from populations naturally exposed to intense A. opacum predation risk foraged more actively under common garden conditions. Higher foraging rates were associated with low survival in populations exposed to free-ranging A. opacum larvae. Results demonstrate that risky foraging activity can evolve in high predation-risk habitats when the dominant predators are gape-limited. This finding invites the further exploration of diverse patterns of prey foraging behavior that depends on natural variation in predator size-selectivity. In particular, prey should adopt riskier behaviors under predation threat than expected under existing risk allocation models if foraging effort directly reduces the duration of risk by growth into a size refuge. Moreover, evidence from this study suggests that foraging has evolved over microgeographic scales despite substantial modification by regional gene flow. This interaction between local selection and spatial location suggests a joint role for adaptation and maladaptation in shaping species interactions across natural landscapes, which is a finding with implications for dynamics at the population, community, and metacommunity levels. PMID:17724339

  20. Risky prey behavior evolves in risky habitats.

    PubMed

    Urban, Mark C

    2007-09-04

    Longstanding theory in behavioral ecology predicts that prey should evolve decreased foraging rates under high predation threat. However, an alternative perspective suggests that growth into a size refuge from gape-limited predation and the future benefits of large size can outweigh the initial survival costs of intense foraging. Here, I evaluate the relative contributions of selection from a gape-limited predator (Ambystoma opacum) and spatial location to explanations of variation in foraging, growth, and survival in 10 populations of salamander larvae (Ambystoma maculatum). Salamander larvae from populations naturally exposed to intense A. opacum predation risk foraged more actively under common garden conditions. Higher foraging rates were associated with low survival in populations exposed to free-ranging A. opacum larvae. Results demonstrate that risky foraging activity can evolve in high predation-risk habitats when the dominant predators are gape-limited. This finding invites the further exploration of diverse patterns of prey foraging behavior that depends on natural variation in predator size-selectivity. In particular, prey should adopt riskier behaviors under predation threat than expected under existing risk allocation models if foraging effort directly reduces the duration of risk by growth into a size refuge. Moreover, evidence from this study suggests that foraging has evolved over microgeographic scales despite substantial modification by regional gene flow. This interaction between local selection and spatial location suggests a joint role for adaptation and maladaptation in shaping species interactions across natural landscapes, which is a finding with implications for dynamics at the population, community, and metacommunity levels.

  1. Reliable solar cookers

    SciTech Connect

    Magney, G.K.

    1992-12-31

    The author describes the activities of SERVE, a Christian relief and development agency, to introduce solar ovens to the Afghan refugees in Pakistan. It has provided 5,000 solar cookers since 1984. The experience has demonstrated the potential of the technology and the need for a durable and reliable product. Common complaints about the cookers are discussed and the ideal cooker is described.

  2. Designing reliability into accelerators

    SciTech Connect

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the "factories," reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: Concept; design; motivation; management techniques; and fault diagnosis.

  3. Designing reliability into accelerators

    SciTech Connect

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the "factories," reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: Concept; design; motivation; management techniques; and fault diagnosis.

  4. Parametric Mass Reliability Study

    NASA Technical Reports Server (NTRS)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon boards of PCBs, typically are the most reliable. Meanwhile, components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs and the mass of ORU subcomponents to reliability.

  5. Sequential Reliability Tests.

    ERIC Educational Resources Information Center

    Eiting, Mindert H.

    1991-01-01

    A method is proposed for sequential evaluation of reliability of psychometric instruments. Sample size is unfixed; a test statistic is computed after each person is sampled and a decision is made in each stage of the sampling process. Results from a series of Monte-Carlo experiments establish the method's efficiency. (SLD)
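
    Eiting's test statistic is not given in this abstract; as a generic illustration of sequential evaluation with an unfixed sample size, a Wald sequential probability ratio test (SPRT) on Bernoulli outcomes shows the mechanism of computing a statistic and making a decision after each sampled person. The hypothesized rates p0 and p1 and the error rates are assumptions for the sketch, not values from the study.

```python
import math
import random

def sprt_bernoulli(observations, p0, p1, alpha=0.05, beta=0.05):
    """Wald SPRT for a Bernoulli parameter: update the log-likelihood ratio
    after each observation and stop as soon as a boundary is crossed.
    Returns ('accept_p1' | 'accept_p0' | 'continue', n_observations_used)."""
    lower = math.log(beta / (1 - alpha))      # cross below -> accept p0
    upper = math.log((1 - beta) / alpha)      # cross above -> accept p1
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "accept_p1", n
        if llr <= lower:
            return "accept_p0", n
    return "continue", len(observations)

random.seed(1)
data = [random.random() < 0.9 for _ in range(200)]   # simulated "consistent" outcomes
print(sprt_bernoulli(data, p0=0.7, p1=0.9))
```

    The appeal mirrored in the abstract is that the sample size is determined by the data: clear-cut cases stop after a handful of observations instead of a fixed, worst-case sample.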

  6. Grid reliability management tools

    SciTech Connect

    Eto, J.; Martinez, C.; Dyer, J.; Budhraja, V.

    2000-10-01

    To summarize, the Consortium for Electric Reliability Technology Solutions (CERTS) is engaged in a multi-year program of public interest R&D to develop and prototype software tools that will enhance system reliability during the transition to competitive markets. The core philosophy embedded in the design of these tools is the recognition that in the future reliability will be provided through market operations, not the decisions of central planners. Embracing this philosophy calls for tools that: (1) Recognize that the game has moved from machine modeling and engineering analysis to simulating markets to understand the impacts on reliability (and vice versa); (2) Provide real-time data and support information transparency toward enhancing the ability of operators and market participants to quickly grasp, analyze, and act effectively on information; (3) Allow operators, in particular, to measure, monitor, assess, and predict both system performance as well as the performance of market participants; and (4) Allow rapid incorporation of the latest sensing, data communication, computing, visualization, and algorithmic techniques and technologies.

  7. Quantifying Human Performance Reliability.

    ERIC Educational Resources Information Center

    Askren, William B.; Regulinski, Thaddeus L.

    Human performance reliability for tasks in the time-space continuous domain is defined and a general mathematical model presented. The human performance measurement terms time-to-error and time-to-error-correction are defined. The model and measurement terms are tested using laboratory vigilance and manual control tasks. Error and error-correction…

  8. Reliability of semiology description.

    PubMed

    Heo, Jae-Hyeok; Kim, Dong Wook; Lee, Seo-Young; Cho, Jinwhan; Lee, Sang-Kun; Nam, Hyunwoo

    2008-01-01

    Seizure semiology is important for classifying patients' epilepsy. Physicians usually get most of the seizure information from observers though there have been few reports on the reliability of the observers' description. This study aims at determining the reliability of observers' description of the semiology. We included 92 patients who had their habitual seizures recorded during video-EEG monitoring. We compared the semiology described by the observers with that recorded on the videotape, and reviewed which characteristics of the observers affected the reliability of their reported data. The classification of seizures and the individual components of the semiology based only on the observer-description was somewhat discordant compared with the findings from the videotape (correct classification, 85%). The descriptions of some ictal behaviors such as oroalimentary automatism, tonic/dystonic limb posturing, and head versions were relatively accurate, but those of motionless staring and hand automatism were less accurate. The specified directions by the observers were relatively correct. The accuracy of the description was related to the educational level of the observers. Much of the information described by well-educated observers is reliable. However, every physician should keep in mind the limitations of this information and use this information cautiously.

  9. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world, thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost effective manner. The context of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data which is then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens which are in AIR-LAB to measure the performance of reliability models.
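
    The report's simulation of the Basic model can be illustrated with a hedged stand-in: the Jelinski-Moranda form of the basic model, in which a program starts with N faults and, with i faults remaining, the time to the next failure is exponential with rate i·φ. The fault count and rate below are invented; running several replications of the same "program" makes visible the run-to-run variance the report discusses.

```python
import random

def simulate_jelinski_moranda(n_faults, phi, seed=0):
    """One replication of debugging data under a basic reliability model:
    with i faults remaining, the time to the next failure is exponential
    with rate i * phi, so failures arrive ever more slowly as debugging proceeds."""
    rng = random.Random(seed)
    failure_times, t = [], 0.0
    for remaining in range(n_faults, 0, -1):
        t += rng.expovariate(remaining * phi)
        failure_times.append(t)
    return failure_times

# Replicated runs: identical model and parameters, very different totals.
runs = [simulate_jelinski_moranda(30, 0.01, seed=s) for s in range(5)]
print([round(r[-1], 1) for r in runs])   # total debugging time per replication
```

    Fitting a reliability model to any single one of these runs and extrapolating, as the report notes for real debugging data, yields predictions with considerable variance across replications.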

  10. Reliability Design Handbook

    DTIC Science & Technology

    1976-03-01

    Only scanned fragments of the handbook survive in this record: reliability prediction, failure modes and effects analysis (FMEA), and reliability growth techniques are named as the prediction and design evaluation methods covered; a design-assessment through production, operation, and maintenance flow cites MIL-HDBK-217, Bayesian techniques, probabilistic design, FMEA, and reliability growth; and a fragment on environmental stress notes that devices suffer thermal aging, that oxidation and other chemical reactions are enhanced, and that viscosity reduction and evaporation of lubricants are problems.

  11. Evolvable circuit with transistor-level reconfigurability

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian (Inventor); Salazar-Lazaro, Carlos Harold (Inventor)

    2004-01-01

    An evolvable circuit includes a plurality of reconfigurable switches, a plurality of transistors within a region of the circuit, the plurality of transistors having terminals, the plurality of transistors being coupled between a power source terminal and a power sink terminal so as to be capable of admitting power between the power source terminal and the power sink terminal, the plurality of transistors being coupled so that every transistor terminal to transistor terminal coupling within the region of the circuit comprises a reconfigurable switch.

  12. Present weather and climate: evolving conditions

    USGS Publications Warehouse

    Hoerling, Martin P; Dettinger, Michael; Wolter, Klaus; Lukas, Jeff; Eischeid, Jon K.; Nemani, Rama; Liebmann, Brant; Kunkel, Kenneth E.

    2013-01-01

    This chapter assesses weather and climate variability and trends in the Southwest, using observed climate and paleoclimate records. It analyzes the last 100 years of climate variability in comparison to the last 1,000 years, and links the important features of evolving climate conditions to river flow variability in four of the region’s major drainage basins. The chapter closes with an assessment of the monitoring and scientific research needed to increase confidence in understanding when climate episodes, events, and phenomena are attributable to human-caused climate change.

  13. Cobalt-phosphate oxygen-evolving compound.

    PubMed

    Kanan, Matthew W; Surendranath, Yogesh; Nocera, Daniel G

    2009-01-01

    The utilization of solar energy on a large scale requires efficient storage. Solar-to-fuels has the capacity to meet large scale storage needs as demonstrated by natural photosynthesis. This process uses sunlight to rearrange the bonds of water to furnish O2 and an H2-equivalent. We present a tutorial review of our efforts to develop an amorphous cobalt-phosphate catalyst that oxidizes water to O2. The use of earth-abundant materials, operation in water at neutral pH, and the formation of the catalyst in situ captures functional elements of the oxygen evolving complex of Photosystem II.

  14. An evolving paradigm for the secretory pathway?

    PubMed Central

    Lippincott-Schwartz, Jennifer

    2011-01-01

    The paradigm that the secretory pathway consists of a stable endoplasmic reticulum and Golgi apparatus, using discrete transport vesicles to exchange their contents, gained important support from groundbreaking biochemical and genetic studies during the 1980s. However, the subsequent development of new imaging technologies with green fluorescent protein introduced data on dynamic processes not fully accounted for by the paradigm. As a result, we may be seeing an example of how a paradigm is evolving to account for the results of new technologies and their new ways of describing cellular processes. PMID:22039065

  15. Mobile computing acceptance grows as applications evolve.

    PubMed

    Porn, Louis M; Patrick, Kelly

    2002-01-01

    Handheld devices are becoming more cost-effective to own, and their use in healthcare environments is increasing. Handheld devices currently are being used for e-prescribing, charge capture, and accessing daily schedules and reference tools. Future applications may include education on medications, dictation, order entry, and test-results reporting. Selecting the right handheld device requires careful analysis of current and future applications, as well as vendor expertise. It is important to recognize the technology will continue to evolve over the next three years.

  16. SALT Spectroscopy of Evolved Massive Stars

    NASA Astrophysics Data System (ADS)

    Kniazev, A. Y.; Gvaramadze, V. V.; Berdnikov, L. N.

    2017-06-01

    Long-slit spectroscopy with the Southern African Large Telescope (SALT) of central stars of mid-infrared nebulae detected with the Spitzer Space Telescope and Wide-Field Infrared Survey Explorer (WISE) led to the discovery of numerous candidate luminous blue variables (cLBVs) and other rare evolved massive stars. With the recent advent of the SALT fiber-fed high-resolution echelle spectrograph (HRS), a new perspective for the study of these interesting objects has appeared. Using the HRS we obtained spectra of a dozen newly identified massive stars. Some results on the recently identified cLBV Hen 3-729 are presented.

  17. Investigating Evolved Compositions Around Wolf Crater

    NASA Technical Reports Server (NTRS)

    Greenhagen, B. T.; Cahill, J. T. S.; Jolliff, B. L.; Lawrence, S. J.; Glotch, T. D.

    2017-01-01

    Wolf crater is an irregularly shaped, approximately 25 km crater in the south-central portion of Mare Nubium on the lunar nearside. While not previously identified as a lunar "red spot", Wolf crater was identified as a Th anomaly by Lawrence and coworkers. We have used data from the Lunar Reconnaissance Orbiter (LRO) to determine that the area surrounding Wolf crater has a composition more similar to that of highly evolved, non-mare volcanic structures than to typical lunar crustal lithology. In this presentation, we will investigate the geomorphology and composition of Wolf crater and discuss implications for the origin of the anomalous terrain.

  18. Scar State on Time-evolving Wavepacket

    NASA Astrophysics Data System (ADS)

    Tomiya, Mitsuyoshi; Tsuyuki, Hiroyoshi; Kawamura, Kentaro; Sakamoto, Shoichi; Heller, Eric J.

    2015-09-01

    The scar-like enhancement is found in the accumulation of the time-evolving wavepacket in the stadium billiard. It appears close to unstable periodic orbits when the wavepackets are launched along the orbits. The enhancement is essentially due to the same mechanism as the well-known scar states in stationary eigenstates. The weighted spectral function reveals that the enhancement is the pileup of contributions from scar states on the same periodic orbit. The availability of the weighted spectrum to the semiclassical approximation is also discussed.

  19. f(R) gravity solutions for evolving wormholes

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Subhra; Chakraborty, Subenoy

    2017-08-01

    The scalar-tensor f(R) theory of gravity is considered in the framework of a simple inhomogeneous space-time model. In this research we use the reconstruction technique to look for possible evolving wormhole solutions within the viable f(R) gravity formalism. These f(R) models are then constrained so that they are consistent with existing experimental data. Energy conditions related to the matter threading the wormhole are analyzed graphically and are in general found to obey the null energy condition (NEC) in regions around the throat, while in the limit f(R)=R the NEC can be violated at large in regions around the throat.

  20. Reliability Generalization (RG) Analysis: The Test Is Not Reliable

    ERIC Educational Resources Information Center

    Warne, Russell

    2008-01-01

    Literature shows that most researchers are unaware of some of the characteristics of reliability. This paper clarifies some misconceptions by describing the procedures, benefits, and limitations of reliability generalization while using it to illustrate the nature of score reliability. Reliability generalization (RG) is a meta-analytic method…

  1. Bioharness™ Multivariable Monitoring Device: Part. II: Reliability

    PubMed Central

    Johnstone, James A.; Ford, Paul A.; Hughes, Gerwyn; Watson, Tim; Garrett, Andrew T.

    2012-01-01

    The Bioharness™ monitoring system may provide physiological information on human performance, but the reliability of these data is fundamental for confidence in the equipment being used. The objective of this study was to assess the reliability of each of the 5 Bioharness™ variables using a treadmill-based protocol. 10 healthy males participated. A between- and within-subject design to assess the reliability of heart rate (HR), breathing frequency (BF), accelerometry (ACC) and infra-red skin temperature (ST) was completed via a repeated, discontinuous, incremental treadmill protocol. Posture (P) was assessed by a tilt table moved through 160°. Between-subject data reported low coefficients of variation (CV) and strong correlations (r) for ACC and P (CV < 7.6; r = 0.99, p < 0.01). In contrast, HR and BF (CV ~19.4; r ~0.70, p < 0.01) and ST (CV 3.7; r = 0.61, p < 0.01) present more variable data. Intra- and inter-device data presented strong relationships (r > 0.89, p < 0.01) and low CV (< 10.1) for HR, ACC, P and ST. BF produced weaker relationships (r < 0.72) and higher CV (< 17.4). In comparison to the other variables, BF consistently presents less reliability. Global results suggest that the Bioharness™ is a reliable multivariable monitoring device during laboratory testing within the limits presented. Key points: Heart rate and breathing frequency data increased in variance at higher velocities (i.e. ≥ 10 km.h-1). In comparison to the between-subject testing, the intra- and inter-device testing presented good reliability, suggesting that the placement or position of the device relative to the performer could be important for data collection. Understanding a device's variability in measurement is important before it can be used within an exercise testing or monitoring setting. PMID:24149347
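
    The reliability statistics reported in this study (coefficient of variation and Pearson's r) are straightforward to compute; a minimal sketch with invented heart-rate readings (illustrative numbers only, no Bioharness™ data involved):

```python
import math
import statistics

def cv_percent(values):
    """Coefficient of variation (%): 100 * sample SD / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

# Hypothetical repeated heart-rate readings (bpm) for one subject at one speed.
trial_1 = [142, 145, 141, 148, 144]
trial_2 = [140, 147, 143, 149, 142]
print(round(cv_percent(trial_1), 2))       # within-trial variability
print(round(pearson(trial_1, trial_2), 2)) # test-retest association
```

    A low CV with a high r, as for ACC and P in the study, indicates both stable readings within a trial and consistent readings across repeated trials.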

  2. The complex network reliability and influential nodes

    NASA Astrophysics Data System (ADS)

    Li, Kai; He, Yongfeng

    2017-08-01

    To study node importance and reliability in complex networks, semi-local centrality, betweenness centrality, and the PageRank algorithm were compared by simulation: nodes were gradually removed and their importance recalculated in random, small-world, and scale-free networks. Studying the relationship between the size of the largest connected component and the proportion of nodes removed, the results show that betweenness centrality and the PageRank algorithm, which are based on global network information, are more effective for evaluating the importance of nodes, and that the reliability of the network is related to its topology.
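The gradual-removal experiment described in this record can be sketched in plain Python: rank nodes by a power-iteration PageRank, remove the top-ranked node, re-rank, and track the largest connected component. The helper names and the toy star network below are illustrative, not taken from the paper.

```python
# Sketch of a node-removal robustness experiment (illustrative, not the paper's code).
from collections import deque

def pagerank(adj, d=0.85, iters=50):
    """Power-iteration PageRank on an undirected graph {node: set(neighbors)}."""
    n = len(adj)
    pr = {v: 1.0 / n for v in adj}
    for _ in range(iters):
        new = {v: (1 - d) / n for v in adj}
        for v, nbrs in adj.items():
            if nbrs:
                share = d * pr[v] / len(nbrs)
                for u in nbrs:
                    new[u] += share
            else:  # dangling node: spread its rank uniformly
                for u in adj:
                    new[u] += d * pr[v] / n
        pr = new
    return pr

def largest_cc_fraction(adj):
    """Fraction of surviving nodes inside the largest connected component (BFS)."""
    if not adj:
        return 0.0
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        seen.add(start)
        queue, size = deque([start]), 0
        while queue:
            v = queue.popleft()
            size += 1
            for u in adj[v]:
                if u not in seen:
                    seen.add(u)
                    queue.append(u)
        best = max(best, size)
    return best / len(adj)

def attack_curve(adj, removals):
    """Gradually remove top-PageRank nodes, re-ranking after every removal."""
    adj = {v: set(nbrs) for v, nbrs in adj.items()}
    curve = [largest_cc_fraction(adj)]
    for _ in range(removals):
        pr = pagerank(adj)
        target = max(pr, key=pr.get)
        for u in adj.pop(target):
            adj[u].discard(target)
        curve.append(largest_cc_fraction(adj))
    return curve

# A star network: removing the hub fragments the graph completely.
star = {"hub": {1, 2, 3, 4, 5}, **{i: {"hub"} for i in range(1, 6)}}
curve = attack_curve(star, 1)  # [1.0, 0.2]
```

The sharper the drop in the curve, the more the network's reliability depends on its top-ranked nodes, which is the effect the abstract attributes to scale-free topologies.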

  3. Early formation of evolved asteroidal crust.

    PubMed

    Day, James M D; Ash, Richard D; Liu, Yang; Bellucci, Jeremy J; Rumble, Douglas; McDonough, William F; Walker, Richard J; Taylor, Lawrence A

    2009-01-08

    Mechanisms for the formation of crust on planetary bodies remain poorly understood. It is generally accepted that Earth's andesitic continental crust is the product of plate tectonics, whereas the Moon acquired its feldspar-rich crust by way of plagioclase flotation in a magma ocean. Basaltic meteorites provide evidence that, like the terrestrial planets, some asteroids generated crust and underwent large-scale differentiation processes. Until now, however, no evolved felsic asteroidal crust has been sampled or observed. Here we report age and compositional data for the newly discovered, paired and differentiated meteorites Graves Nunatak (GRA) 06128 and GRA 06129. These meteorites are feldspar-rich, with andesite bulk compositions. Their age of 4.52 +/- 0.06 Gyr demonstrates formation early in Solar System history. The isotopic and elemental compositions, degree of metamorphic re-equilibration and sulphide-rich nature of the meteorites are most consistent with an origin as partial melts from a volatile-rich, oxidized asteroid. GRA 06128 and 06129 are the result of a newly recognized style of evolved crust formation, bearing witness to incomplete differentiation of their parent asteroid and to previously unrecognized diversity of early-formed materials in the Solar System.

  4. The evolving role of the transfusion practitioner.

    PubMed

    Miller, Kristy; Akers, Christine; Davis, Amanda K; Wood, Erica; Hennessy, Clare; Bielby, Linley

    2015-04-01

    Much of the recent work in transfusion practice has shifted to focus on the patient, after efforts over previous decades to ensure the quality and safety of blood products. After the commencement of hemovigilance and transfusion practice improvement programs, the introduction of transfusion practitioners (TP) into health care services and blood centers has continued to increase worldwide. Since this relatively new role was introduced, much work of the TP has focused on patient and staff education, adverse events, transfusion governance, and monitoring of transfusion practices within organizations. The complex nature of the transfusion process makes the TP an integral link in the transfusion chain. Together with hospital transfusion teams and committees, the TP works collaboratively to facilitate the transfusion change management programs and initiatives. Recently, the TP role has evolved to include an emphasis on patient blood management and, to some extent, is shaped by national standards and regulations. These established roles of the TP, together with the ever-changing field of transfusion medicine, provide new opportunities and challenges for a role that is continuing to evolve worldwide. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Evolvability of an Optimal Recombination Rate

    PubMed Central

    Lobkovsky, Alexander E.; Wolf, Yuri I.; Koonin, Eugene V.

    2016-01-01

    Evolution and maintenance of genetic recombination and its relation to the mutational process is a long-standing, fundamental problem in evolutionary biology that is linked to the general problem of evolution of evolvability. We explored a stochastic model of the evolution of recombination using additive fitness and infinite allele assumptions but no assumptions on the sign or magnitude of the epistasis and the distribution of mutation effects. In this model, fluctuating negative epistasis and predominantly deleterious mutations arise naturally as a consequence of the additive fitness and a reservoir from which new alleles arrive with a fixed distribution of fitness effects. Analysis of the model revealed a nonmonotonic effect of recombination intensity on fitness, with an optimal recombination rate value which maximized fitness in steady state. The optimal recombination rate depended on the mutation rate and was evolvable, that is, subject to selection. The predictions of the model were compatible with the observations on the dependence between genome rearrangement rate and gene flux in microbial genomes. PMID:26660159

  6. BOOK REVIEW: OPENING SCIENCE, THE EVOLVING GUIDE ...

    EPA Pesticide Factsheets

    The way we get our funding, collaborate, do our research, and get the word out has evolved over hundreds of years but we can imagine a more open science world, largely facilitated by the internet. The movement towards this more open way of doing and presenting science is coming, and it is not taking hundreds of years. If you are interested in these trends, and would like to find out more about where this is all headed and what it means to you, consider downloading Opening Science, edited by Sönke Bartling and Sascha Friesike, subtitled The Evolving Guide on How the Internet is Changing Research, Collaboration, and Scholarly Publishing. In 26 chapters by various authors from a range of disciplines the book explores the developing world of open science, starting from the first scientific revolution and bringing us to the next scientific revolution, sometimes referred to as “Science 2.0”. Some of the articles deal with the impact of the changing landscape of how science is done, looking at the impact of open science on Academia, or journal publishing, or medical research. Many of the articles look at the uses, pitfalls, and impact of specific tools, like microblogging (think Twitter), social networking, and reference management. There is lots of discussion and definition of terms you might use or misuse like “altmetrics” and “impact factor”. Science will probably never be completely open, and Twitter will probably never replace the journal article,

  7. Evolving Dark Energy with w ≠ -1

    SciTech Connect

    Hall, Lawrence J.; Nomura, Yasunori; Oliver, Steven J.

    2005-03-31

    Theories of evolving quintessence are constructed that generically lead to deviations from the w = -1 prediction of non-evolving dark energy. The small mass scale that governs evolution, m_φ ≈ 10^-33 eV, is radiatively stable, and the "Why Now?" problem is solved. These results rest crucially on seesaw cosmology: in broad outline, fundamental physics and cosmology can be understood from only two mass scales, the weak scale, v, and the Planck scale, M. Requiring a scale of dark energy ρ_DE^(1/4) governed by v²/M, and a radiatively stable evolution rate m_φ given by v⁴/M³, leads to a distinctive form for the equation of state w(z) that follows from a cosine quintessence potential. An explicit hidden axion model is constructed. Dark energy resides in the potential of the axion field, which is generated by a new QCD-like force that gets strong at the scale Λ ≈ v²/M ≈ ρ_DE^(1/4). The evolution rate is given by a second seesaw that leads to the axion mass, m_φ ≈ Λ²/f, with f ≈ M.

  8. Novel cooperation experimentally evolved between species.

    PubMed

    Harcombe, William

    2010-07-01

    Cooperation violates the view of "nature red in tooth and claw" that prevails in our understanding of evolution, yet examples of cooperation abound. Most work has focused on maintenance of cooperation within a single species through mechanisms such as kin selection. The factors necessary for the evolutionary origin of aiding unrelated individuals such as members of another species have not been experimentally tested. Here, I demonstrate that cooperation between species can be evolved in the laboratory if (1) there is preexisting reciprocation or feedback for cooperation, and (2) reciprocation is preferentially received by cooperative genotypes. I used a two species system involving Salmonella enterica ser. Typhimurium and an Escherichia coli mutant unable to synthesize an essential amino acid. In lactose media Salmonella consumes metabolic waste from E. coli, thus creating a mechanism of reciprocation for cooperation. Growth in a spatially structured environment assured that the benefits of cooperation were preferentially received by cooperative genotypes. Salmonella evolved to aid E. coli by excreting a costly amino acid, however this novel cooperation disappeared if the waste consumption or spatial structure were removed. This study builds on previous work to demonstrate an experimental origin of interspecific cooperation, and to test the factors necessary for such interactions to arise.

  9. Shaping the outflows of evolved stars

    NASA Astrophysics Data System (ADS)

    Mohamed, Shazrene

    2015-08-01

    Both hot and cool evolved stars, e.g., red (super)giants and Wolf-Rayet stars, lose copious amounts of mass, momentum and mechanical energy through powerful, dense stellar winds. The interaction of these outflows with their surroundings results in highly structured and complex circumstellar environments, often featuring knots, arcs, shells and spirals. Recent improvements in computational power and techniques have led to the development of detailed, multi-dimensional simulations that have given new insight into the origin of these structures, and better understanding of the physical mechanisms driving their formation. In this talk, I will discuss three of the main mechanisms that shape the outflows of evolved stars: (1) interaction with the interstellar medium (ISM), i.e., wind-ISM interactions; (2) interaction with a stellar wind, either from a previous phase of evolution or the wind from a companion star, i.e., wind-wind interactions; and (3) interaction with a companion star that has a weak or insignificant outflow (e.g., a compact companion such as a neutron star or black hole), i.e., wind-companion interactions. I will also highlight the broader implications and impact of these stellar wind interactions for other phenomena, e.g., for symbiotic and X-ray binaries, supernovae and Gamma-ray bursts.

  10. Hierarchical decomposition of dynamically evolving regulatory networks.

    PubMed

    Ay, Ahmet; Gong, Dihong; Kahveci, Tamer

    2015-05-15

    Gene regulatory networks describe the interplay between genes and their products. These networks control almost every biological activity in the cell through interactions. The hierarchy of genes in these networks as defined by their interactions gives important insights into how these functions are governed. Accurately determining the hierarchy of genes is however a computationally difficult problem. This problem is further complicated by the fact that an intrinsic characteristic of regulatory networks is that the wiring of interactions can change over time. Determining how the hierarchy in the gene regulatory networks changes with dynamically evolving network topology remains an unsolved challenge. In this study, we develop a new method, named D-HIDEN (Dynamic-HIerarchical DEcomposition of Networks) to find the hierarchy of the genes in dynamically evolving gene regulatory network topologies. Unlike earlier methods, which recompute the hierarchy from scratch when the network topology changes, our method adapts the hierarchy based on the wiring of the interactions only for the nodes which have the potential to move in the hierarchy. We compare D-HIDEN to five currently available hierarchical decomposition methods on synthetic and real gene regulatory networks. Our experiments demonstrate that D-HIDEN significantly outperforms existing methods in running time, accuracy, or both. Furthermore, our method is robust against dynamic changes in hierarchy. Our experiments on human gene regulatory networks suggest that our method may be used to reconstruct hierarchy in gene regulatory networks.

  11. Have plants evolved to self-immolate?

    PubMed Central

    Bowman, David M. J. S.; French, Ben J.; Prior, Lynda D.

    2014-01-01

    By definition, fire-prone ecosystems have highly combustible plants, leading to the hypothesis, first formally stated by Mutch in 1970, that community flammability is the product of natural selection of flammable traits. However, proving the “Mutch hypothesis” has presented an enormous challenge for fire ecologists, given the difficulty in establishing cause and effect between landscape fire and flammable plant traits. Individual plant traits (such as leaf moisture content, retention of dead branches and foliage, and oil-rich foliage) are known to affect the flammability of plants, but there is no evidence these characters evolved specifically to self-immolate, although some of these traits may have been secondarily modified to increase the propensity to burn. Demonstrating individual benefits from self-immolation is extraordinarily difficult, given the intersection of the physical environmental factors that control landscape fire (fuel production, dryness and ignitions) with community flammability properties that emerge from numerous traits of multiple species (canopy cover and litter bed bulk density). It is more parsimonious to conclude that plants have evolved mechanisms to tolerate, but not promote, landscape fire. PMID:25414710

  13. The Comet Cometh: Evolving Developmental Systems.

    PubMed

    Jaeger, Johannes; Laubichler, Manfred; Callebaut, Werner

    In a recent opinion piece, Denis Duboule has claimed that the increasing shift towards systems biology is driving evolutionary and developmental biology apart, and that a true reunification of these two disciplines within the framework of evolutionary developmental biology (EvoDevo) may easily take another 100 years. He identifies methodological, epistemological, and social differences as causes for this supposed separation. Our article provides a contrasting view. We argue that Duboule's prediction is based on a one-sided understanding of systems biology as a science that is only interested in functional, not evolutionary, aspects of biological processes. Instead, we propose a research program for an evolutionary systems biology, which is based on local exploration of the configuration space in evolving developmental systems. We call this approach, which is based on reverse engineering, simulation, and mathematical analysis, the natural history of configuration space. We discuss a number of illustrative examples that demonstrate the past success of local exploration, as opposed to global mapping, in different biological contexts. We argue that this pragmatic mode of inquiry can be extended and applied to the mathematical analysis of the developmental repertoire and evolutionary potential of evolving developmental mechanisms, and that evolutionary systems biology so conceived provides a pragmatic epistemological framework for the EvoDevo synthesis.

  14. Evolvability of an Optimal Recombination Rate.

    PubMed

    Lobkovsky, Alexander E; Wolf, Yuri I; Koonin, Eugene V

    2015-12-10

    Evolution and maintenance of genetic recombination and its relation to the mutational process is a long-standing, fundamental problem in evolutionary biology that is linked to the general problem of evolution of evolvability. We explored a stochastic model of the evolution of recombination using additive fitness and infinite allele assumptions but no assumptions on the sign or magnitude of the epistasis and the distribution of mutation effects. In this model, fluctuating negative epistasis and predominantly deleterious mutations arise naturally as a consequence of the additive fitness and a reservoir from which new alleles arrive with a fixed distribution of fitness effects. Analysis of the model revealed a nonmonotonic effect of recombination intensity on fitness, with an optimal recombination rate value which maximized fitness in steady state. The optimal recombination rate depended on the mutation rate and was evolvable, that is, subject to selection. The predictions of the model were compatible with the observations on the dependence between genome rearrangement rate and gene flux in microbial genomes.

  15. Netgram: Visualizing Communities in Evolving Networks

    PubMed Central

    Mall, Raghvendra; Langone, Rocco; Suykens, Johan A. K.

    2015-01-01

    Real-world complex networks are dynamic in nature and change over time. The change is usually observed in the interactions within the network over time. Complex networks exhibit community like structures. A key feature of the dynamics of complex networks is the evolution of communities over time. Several methods have been proposed to detect and track the evolution of these groups over time. However, there is no generic tool which visualizes all the aspects of group evolution in dynamic networks including birth, death, splitting, merging, expansion, shrinkage and continuation of groups. In this paper, we propose Netgram: a tool for visualizing evolution of communities in time-evolving graphs. Netgram maintains evolution of communities over two consecutive time-stamps in tables which are used to create a query database using the SQL outer-join operation. It uses a line-based visualization technique which adheres to certain design principles and aesthetic guidelines. Netgram uses a greedy solution to order the initial community information provided by the evolutionary clustering technique such that we have fewer line cross-overs in the visualization. This makes it easier to track the progress of individual communities in time evolving graphs. Netgram is a generic toolkit which can be used with any evolutionary community detection algorithm as illustrated in our experiments. We use Netgram for visualization of topic evolution in the NIPS conference over a period of 11 years and observe the emergence and merging of several disciplines in the field of information processing systems. PMID:26356538
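The table-plus-outer-join bookkeeping this record describes can be sketched with the standard library's sqlite3: store each timestamp's node-to-community assignment in a table, then outer-join on node identity to see which communities continue, split, or die. Table and column names here are illustrative, not Netgram's actual schema.

```python
# Hedged sketch of community tracking across two timestamps via an SQL outer join.
# The schema (t1, t2, node, community) is an illustrative assumption.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t1 (node TEXT, community INTEGER)")  # communities at time 1
cur.execute("CREATE TABLE t2 (node TEXT, community INTEGER)")  # communities at time 2
cur.executemany("INSERT INTO t1 VALUES (?, ?)", [("a", 1), ("b", 1), ("c", 2)])
cur.executemany("INSERT INTO t2 VALUES (?, ?)", [("a", 1), ("b", 3), ("d", 3)])

# Outer join on node: each row links an old community to a new one (or to NULL
# if the node disappeared), and the counts give the strength of each link.
cur.execute("""
    SELECT t1.community, t2.community, COUNT(*)
    FROM t1 LEFT OUTER JOIN t2 ON t1.node = t2.node
    GROUP BY t1.community, t2.community
""")
links = cur.fetchall()
conn.close()
```

Here community 1 partly continues as 1 and partly moves to 3, while community 2's only node vanishes (the NULL row), which is exactly the birth/death/split information a line-based visualization would draw.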

  16. Caterpillars evolved from onychophorans by hybridogenesis.

    PubMed

    Williamson, Donald I

    2009-11-24

    I reject the Darwinian assumption that larvae and their adults evolved from a single common ancestor. Rather I posit that, in animals that metamorphose, the basic types of larvae originated as adults of different lineages, i.e., larvae were transferred when, through hybridization, their genomes were acquired by distantly related animals. "Caterpillars," the name for eruciforms with thoracic and abdominal legs, are larvae of lepidopterans, hymenopterans, and mecopterans (scorpionflies). Grubs and maggots, including the larvae of beetles, bees, and flies, evolved from caterpillars by loss of legs. Caterpillar larval organs are dismantled and reconstructed in the pupal phase. Such indirect developmental patterns (metamorphoses) did not originate solely by accumulation of random mutations followed by natural selection; rather they are fully consistent with my concept of evolution by hybridogenesis. Members of the phylum Onychophora (velvet worms) are proposed as the evolutionary source of caterpillars and their grub or maggot descendants. I present a molecular biological research proposal to test my thesis. By my hypothesis 2 recognizable sets of genes are detectable in the genomes of all insects with caterpillar grub- or maggot-like larvae: (i) onychophoran genes that code for proteins determining larval morphology/physiology and (ii) sequentially expressed insect genes that code for adult proteins. The genomes of insects and other animals that, by contrast, entirely lack larvae comprise recognizable sets of genes from single animal common ancestors.

  17. Collapse of cooperation in evolving games

    PubMed Central

    Stewart, Alexander J.; Plotkin, Joshua B.

    2014-01-01

    Game theory provides a quantitative framework for analyzing the behavior of rational agents. The Iterated Prisoner’s Dilemma in particular has become a standard model for studying cooperation and cheating, with cooperation often emerging as a robust outcome in evolving populations. Here we extend evolutionary game theory by allowing players’ payoffs as well as their strategies to evolve in response to selection on heritable mutations. In nature, many organisms engage in mutually beneficial interactions and individuals may seek to change the ratio of risk to reward for cooperation by altering the resources they commit to cooperative interactions. To study this, we construct a general framework for the coevolution of strategies and payoffs in arbitrary iterated games. We show that, when there is a tradeoff between the benefits and costs of cooperation, coevolution often leads to a dramatic loss of cooperation in the Iterated Prisoner’s Dilemma. The collapse of cooperation is so extreme that the average payoff in a population can decline even as the potential reward for mutual cooperation increases. Depending upon the form of tradeoffs, evolution may even move away from the Iterated Prisoner’s Dilemma game altogether. Our work offers a new perspective on the Prisoner’s Dilemma and its predictions for cooperation in natural populations; and it provides a general framework to understand the coevolution of strategies and payoffs in iterated interactions. PMID:25422421

  18. Collapse of cooperation in evolving games.

    PubMed

    Stewart, Alexander J; Plotkin, Joshua B

    2014-12-09

    Game theory provides a quantitative framework for analyzing the behavior of rational agents. The Iterated Prisoner's Dilemma in particular has become a standard model for studying cooperation and cheating, with cooperation often emerging as a robust outcome in evolving populations. Here we extend evolutionary game theory by allowing players' payoffs as well as their strategies to evolve in response to selection on heritable mutations. In nature, many organisms engage in mutually beneficial interactions and individuals may seek to change the ratio of risk to reward for cooperation by altering the resources they commit to cooperative interactions. To study this, we construct a general framework for the coevolution of strategies and payoffs in arbitrary iterated games. We show that, when there is a tradeoff between the benefits and costs of cooperation, coevolution often leads to a dramatic loss of cooperation in the Iterated Prisoner's Dilemma. The collapse of cooperation is so extreme that the average payoff in a population can decline even as the potential reward for mutual cooperation increases. Depending upon the form of tradeoffs, evolution may even move away from the Iterated Prisoner's Dilemma game altogether. Our work offers a new perspective on the Prisoner's Dilemma and its predictions for cooperation in natural populations; and it provides a general framework to understand the coevolution of strategies and payoffs in iterated interactions.
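The Iterated Prisoner's Dilemma that both records above build on can be sketched in a few lines. This shows only the basic game, not the paper's coevolving-payoff framework, and the payoff values R, S, T, P are the conventional textbook choices, not values from the paper.

```python
# Minimal Iterated Prisoner's Dilemma (illustrative, conventional payoffs).
R, S, T, P = 3, 0, 5, 1  # reward, sucker's payoff, temptation, punishment

def play(strategy_a, strategy_b, rounds=100):
    """Run an IPD match; a strategy maps the opponent's move history to 'C' or 'D'."""
    payoff = {('C', 'C'): (R, R), ('C', 'D'): (S, T),
              ('D', 'C'): (T, S), ('D', 'D'): (P, P)}
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strategy_a(hist_b), strategy_b(hist_a)
        pa, pb = payoff[(a, b)]
        score_a += pa
        score_b += pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

tit_for_tat = lambda opp: 'C' if not opp else opp[-1]  # cooperate, then mirror
always_defect = lambda opp: 'D'
```

For example, two tit-for-tat players cooperate every round and split the mutual-cooperation payoff, while a defector beats tit-for-tat in a single match; the coevolutionary result in the paper concerns what happens when R, S, T, and P themselves become heritable.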

  19. The Ames Philosophical Belief Inventory: Reliability and Validity

    ERIC Educational Resources Information Center

    Sawyer, R. N.

    1971-01-01

    This study investigated the reliability and validity of the Philosophical Belief Inventory (PBI). With the exception of the relationship between idealism and pragmatism and realism and existentialism, the PBI scales appear to be assessing independent facets of belief. (Author)

  1. Conceptualizing Essay Tests' Reliability and Validity: From Research to Theory

    ERIC Educational Resources Information Center

    Badjadi, Nour El Imane

    2013-01-01

    The current paper on writing assessment surveys the literature on the reliability and validity of essay tests. The paper aims to examine the two concepts in relationship with essay testing as well as to provide a snapshot of the current understandings of the reliability and validity of essay tests as drawn in recent research studies. Bearing in…

  2. Space Shuttle Propulsion System Reliability

    NASA Technical Reports Server (NTRS)

    Welzyn, Ken; VanHooser, Katherine; Moore, Dennis; Wood, David

    2011-01-01

    This session includes the following presentations: (1) External Tank (ET) System Reliability and Lessons, (2) Space Shuttle Main Engine (SSME) Reliability Validated by a Million Seconds of Testing, (3) Reusable Solid Rocket Motor (RSRM) Reliability via Process Control, and (4) Solid Rocket Booster (SRB) Reliability via Acceptance and Testing.

  3. Reliability after inspection. [of flaws on laminate surface

    NASA Technical Reports Server (NTRS)

    Davidson, J. R.

    1975-01-01

    The investigation is concerned with the derivation of relationships between the probability of having manufacturing defects, the probability of detecting a flaw, and the final reliability. Equations for the simple situation in which only one flaw can be present are used to introduce the relationships in a Bayes' theorem approach to the assessment of the final reliability. Situations which are prevalent in composites manufacturing are considered. Attention is given to a case involving the random occurrence of flaws on a laminate surface.
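The single-flaw Bayes relationship this record describes can be written out directly. The symbols p, q, and f below are illustrative labels for the three probabilities the abstract names, not notation from the report itself, and the simplifying assumption that unflawed parts never fail is mine.

```python
# Hedged sketch of the single-flaw Bayes relationship (symbols are illustrative):
#   p = probability a part was manufactured with a flaw
#   q = probability inspection detects a flaw that is present
#   f = probability a flawed part fails in service
def flaw_prob_after_passing(p, q):
    """P(flaw | part passed inspection), by Bayes' theorem."""
    return p * (1 - q) / ((1 - p) + p * (1 - q))

def reliability_after_inspection(p, q, f):
    """Probability an accepted part survives, assuming unflawed parts never fail."""
    return 1 - f * flaw_prob_after_passing(p, q)
```

With p = 0.05 and q = 0.9, the flaw probability of an accepted part drops from 5% to about 0.52%, which is the sense in which inspection raises final reliability even though it cannot catch every flaw.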

  4. Human Reliability Program Workshop

    SciTech Connect

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat to include HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  5. Reliable broadcast protocols

    NASA Technical Reports Server (NTRS)

    Joseph, T. A.; Birman, Kenneth P.

    1989-01-01

    A number of broadcast protocols that are reliable subject to a variety of ordering and delivery guarantees are considered. Developing applications that are distributed over a number of sites and/or must tolerate the failures of some of them becomes a considerably simpler task when such protocols are available for communication. Without such protocols the kinds of distributed applications that can reasonably be built will have a very limited scope. As the trend towards distribution and decentralization continues, it will not be surprising if reliable broadcast protocols have the same role in distributed operating systems of the future that message passing mechanisms have in the operating systems of today. On the other hand, the problems of engineering such a system remain large. For example, deciding which protocol is the most appropriate to use in a certain situation or how to balance the latency-communication-storage costs is not an easy question.

  6. Waste package reliability analysis

    SciTech Connect

    Pescatore, C.; Sastre, C.

    1983-01-01

    Proof of future performance of a complex system such as a high-level nuclear waste package over a period of hundreds to thousands of years cannot be had in the ordinary sense of the word. The general method of probabilistic reliability analysis could provide an acceptable framework to identify, organize, and convey the information necessary to satisfy the criterion of reasonable assurance of waste package performance according to the regulatory requirements set forth in 10 CFR 60. General principles which may be used to evaluate the qualitative and quantitative reliability of a waste package design are indicated and illustrated with a sample calculation of a repository concept in basalt. 8 references, 1 table.

  7. Reliability of digital reactor protection system based on extenics.

    PubMed

    Zhao, Jing; He, Ya-Nan; Gu, Peng-Fei; Chen, Wei-Hua; Gao, Feng

    2016-01-01

    After the Fukushima nuclear accident, the safety of nuclear power plants (NPPs) has drawn widespread concern. The reliability of the reactor protection system (RPS) is directly related to the safety of NPPs; however, it is difficult to accurately evaluate the reliability of a digital RPS. Methods based on probability estimation involve uncertainties and cannot reflect the reliability status of the RPS dynamically or support maintenance and troubleshooting. In this paper, a quantitative reliability analysis method based on extenics is proposed for the digital RPS (safety-critical), by which the relationship between the reliability and response time of the RPS is constructed. As an example, the reliability of the RPS for a CPR1000 NPP is modeled and analyzed by the proposed method. The results show that the proposed method is capable of estimating the RPS reliability effectively and provides support for maintenance and troubleshooting of digital RPS systems.

  8. Laser Reliability Prediction

    DTIC Science & Technology

    1975-08-01

data, and formulating quantitative reliability prediction models based on the data. In this way, models have been constructed for the six laser...C-0091 The purpose of the contract was to formulate models for predicting the failure rates of coherent light emitting devices such as lasers and...with high quality lens tissue using moisture from breath (if necessary). 3. Flush with distilled water and a mild laboratory detergent (if necessary

  9. Reliability and testing

    NASA Technical Reports Server (NTRS)

    Auer, Werner

    1996-01-01

    Reliability and its interdependence with testing are important topics for development and manufacturing of successful products. This generally accepted fact is not only a technical statement, but must be also seen in the light of 'Human Factors.' While the background for this paper is the experience gained with electromechanical/electronic space products, including control and system considerations, it is believed that the content could be also of interest for other fields.

  10. Spacecraft transmitter reliability

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A workshop on spacecraft transmitter reliability was held at the NASA Lewis Research Center on September 25 and 26, 1979, to discuss present knowledge and to plan future research areas. Since formal papers were not submitted, this synopsis was derived from audio tapes of the workshop. The following subjects were covered: users' experience with space transmitters; cathodes; power supplies and interfaces; and specifications and quality assurance. A panel discussion ended the workshop.

  11. Compact, Reliable EEPROM Controller

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Kleyner, Igor

    2010-01-01

A compact, reliable controller for an electrically erasable, programmable read-only memory (EEPROM) has been developed specifically for a space-flight application. The design may be adaptable to other applications in which there are requirements for reliability in general and, in particular, for prevention of inadvertent writing of data in EEPROM cells. Inadvertent writes pose risks of loss of reliability in the original space-flight application and could pose such risks in other applications. Prior EEPROM controllers are large and complex and do not provide all reasonable protections (in many cases, few or no protections) against inadvertent writes. In contrast, the present controller provides several layers of protection against inadvertent writes. The controller also incorporates a write-time monitor, enabling determination of trends in the performance of an EEPROM through all phases of testing. The controller has been designed as an integral subsystem of a system that includes not only the controller and the controlled EEPROM aboard a spacecraft but also computers in a ground control station, relatively simple onboard support circuitry, and an onboard communication subsystem that utilizes the MIL-STD-1553B protocol. (MIL-STD-1553B is a military standard that encompasses a method of communication and electrical-interface requirements for digital electronic subsystems connected to a data bus. MIL-STD-1553B is commonly used in defense and space applications.) The intent was to maximize reliability while minimizing the size and complexity of onboard circuitry. In operation, control of the EEPROM is effected via the ground computers, the MIL-STD-1553B communication subsystem, and the onboard support circuitry, all of which, in combination, provide the multiple layers of protection against inadvertent writes. There is no controller software, unlike in many prior EEPROM controllers; software can be a major contributor to unreliability, particularly in fault

  12. Laser System Reliability

    DTIC Science & Technology

    1977-03-01

NEALE, CAPT. RANDALL D. GODFREY, CAPT. JOHN E. ACTON, MR. DAVE B. LEMMING (ASD) SECTION III RELIABILITY PREDICTION...Data Exchange Program) failure rate data bank. In addition, some data have been obtained from Hughes, Rocketdyne, Garrett, and the AFWL's APT Failure...Central Ave, Suite 306, Albuq, NM 87108 R/M Systems, Inc (Dr. K. Blemel), 10801 Lomas Blvd NE, Albuquerque, NM 87112 Rocketdyne Div, Rockwell

  13. Designing reliability into accelerators

    NASA Astrophysics Data System (ADS)

    Hutton, A.

    1992-07-01

    Future accelerators will have to provide a high degree of reliability. Quality must be designed in right from the beginning and must remain a central theme throughout the project. The problem is similar to the problems facing US industry today, and examples of the successful application of quality engineering will be given. Different aspects of an accelerator project will be addressed: Concept, Design, Motivation, Management Techniques, and Fault Diagnosis. The importance of creating and maintaining a coherent team will be stressed.

  14. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
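
    The abstract names no particular model; as a hypothetical illustration of the kind of prediction such models make from debugging data, here is a minimal maximum-likelihood fit of the classic Jelinski-Moranda model to invented inter-failure times (all values below are illustrative, not from the study):

    ```python
    import math

    def jm_loglik(N, times):
        """Log-likelihood of the Jelinski-Moranda model for a candidate
        total fault count N; the hazard after i-1 fixes is phi*(N - i + 1)."""
        n = len(times)
        weights = [N - i for i in range(n)]           # N, N-1, ..., N-n+1
        s = sum(w * t for w, t in zip(weights, times))
        phi = n / s                                    # MLE of phi given N
        ll = sum(math.log(phi * w) for w in weights) - phi * s
        return ll, phi

    def fit_jm(times, n_max=200):
        """Grid-search the integer MLE of N (total faults ever present)."""
        n = len(times)
        best = max(range(n, n_max + 1), key=lambda N: jm_loglik(N, times)[0])
        return best, jm_loglik(best, times)[1]

    times = [3, 5, 7, 8, 12, 15, 20, 30, 45, 70]  # invented inter-failure times
    N_hat, phi_hat = fit_jm(times)
    print(N_hat, N_hat - len(times))  # estimated total faults; faults remaining
    ```

    Refitting after each debugging path, as the abstract suggests, would show how sensitive such estimates are to the fault recovery order.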

  15. General Aviation Aircraft Reliability Study

    NASA Technical Reports Server (NTRS)

    Pettit, Duane; Turnbull, Andrew; Roelant, Henk A. (Technical Monitor)

    2001-01-01

    This reliability study was performed in order to provide the aviation community with an estimate of Complex General Aviation (GA) Aircraft System reliability. To successfully improve the safety and reliability for the next generation of GA aircraft, a study of current GA aircraft attributes was prudent. This was accomplished by benchmarking the reliability of operational Complex GA Aircraft Systems. Specifically, Complex GA Aircraft System reliability was estimated using data obtained from the logbooks of a random sample of the Complex GA Aircraft population.

  16. Crystalline-silicon reliability lessons for thin-film modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1985-01-01

    The reliability of crystalline silicon modules has been brought to a high level with lifetimes approaching 20 years, and excellent industry credibility and user satisfaction. The transition from crystalline modules to thin film modules is comparable to the transition from discrete transistors to integrated circuits. New cell materials and monolithic structures will require new device processing techniques, but the package function and design will evolve to a lesser extent. Although there will be new encapsulants optimized to take advantage of the mechanical flexibility and low temperature processing features of thin films, the reliability and life degradation stresses and mechanisms will remain mostly unchanged. Key reliability technologies in common between crystalline and thin film modules include hot spot heating, galvanic and electrochemical corrosion, hail impact stresses, glass breakage, mechanical fatigue, photothermal degradation of encapsulants, operating temperature, moisture sorption, circuit design strategies, product safety issues, and the process required to achieve a reliable product from a laboratory prototype.

  17. Life and reliability models for helicopter transmissions

    NASA Technical Reports Server (NTRS)

    Savage, M.; Knorr, R. J.; Coy, J. J.

    1982-01-01

Computer models of life and reliability are presented for planetary gear trains with a fixed ring gear, input applied to the sun gear, and output taken from the planet arm. For this transmission the input and output shafts are coaxial, and the input and output torques are assumed to be coaxial with these shafts. Thrust and side loading are neglected. The reliability model is based on the Weibull distributions of the individual reliabilities of the transmission components. The system model is also a Weibull distribution. The load-versus-life model for the system is a power relationship, as are the models for the individual components. The load-life exponent and basic dynamic capacity are developed as functions of the component capacities. The models are used to compare three- and four-planet, 150 kW (200 hp), 5:1 reduction transmissions with 1500 rpm input speed to illustrate their use.
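
    The paper's component capacities and Weibull parameters are not given in the abstract; as a hypothetical sketch of the series-system idea it describes (the reliability of the transmission is the product of the independent component reliabilities, each Weibull-distributed), with invented gear- and bearing-like parameters:

    ```python
    import math

    def weibull_rel(t, beta, theta):
        """Two-parameter Weibull reliability (probability of surviving life t)."""
        return math.exp(-((t / theta) ** beta))

    def system_rel(t, components):
        """Strict series system: product of independent component reliabilities."""
        r = 1.0
        for beta, theta in components:
            r *= weibull_rel(t, beta, theta)
        return r

    # invented parameters: two gear meshes (beta ~ 2.5, a common gear value)
    # and two planet bearings (beta = 10/9, the classical rolling-element value)
    components = [(2.5, 4000.0), (10.0 / 9.0, 9000.0),
                  (10.0 / 9.0, 9000.0), (2.5, 6000.0)]
    print(round(system_rel(1000.0, components), 4))
    ```

    With a common shape parameter across components, the system is itself Weibull, which is what lets the paper express a system load-life exponent and basic dynamic capacity in terms of the component capacities.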

  18. Synchronization in evolving snowdrift game model

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Wu, L.; Zhu, S. Q.

    2009-06-01

The interaction between the evolution of the game and the underlying network structure in an evolving snowdrift game model is investigated. The constructed network follows a power-law degree distribution, showing the typical scale-free feature. The topological features of average path length, clustering coefficient and degree-degree correlations, and the dynamical feature of synchronizability, are studied. The synchronizability of the constructed networks changes with the interaction, and converges to a certain value when sufficient new nodes are added. It is found that the initial payoffs of nodes greatly affect the synchronizability. When the initial payoffs of players are equal, low common initial payoffs may lead to more heterogeneity in the network and better synchronizability. When initial payoffs follow certain distributions, better synchronizability is obtained compared with equal initial payoffs. The result also holds for phase synchronization of nonidentical oscillators.

  19. Evolving resistance among Gram-positive pathogens.

    PubMed

    Munita, Jose M; Bayer, Arnold S; Arias, Cesar A

    2015-09-15

    Antimicrobial therapy is a key component of modern medical practice and a cornerstone for the development of complex clinical interventions in critically ill patients. Unfortunately, the increasing problem of antimicrobial resistance is now recognized as a major public health threat jeopardizing the care of thousands of patients worldwide. Gram-positive pathogens exhibit an immense genetic repertoire to adapt and develop resistance to virtually all antimicrobials clinically available. As more molecules become available to treat resistant gram-positive infections, resistance emerges as an evolutionary response. Thus, antimicrobial resistance has to be envisaged as an evolving phenomenon that demands constant surveillance and continuous efforts to identify emerging mechanisms of resistance to optimize the use of antibiotics and create strategies to circumvent this problem. Here, we will provide a broad perspective on the clinical aspects of antibiotic resistance in relevant gram-positive pathogens with emphasis on the mechanistic strategies used by these organisms to avoid being killed by commonly used antimicrobial agents.

  20. Evolving unipolar memristor spiking neural networks

    NASA Astrophysics Data System (ADS)

    Howard, David; Bull, Larry; De Lacy Costello, Ben

    2015-10-01

Neuromorphic computing - brain-like computing in hardware - typically requires myriad complementary metal oxide semiconductor spiking neurons interconnected by a dense mesh of nanoscale plastic synapses. Memristors are frequently cited as strong synapse candidates due to their statefulness and potential for low-power implementations. To date, plentiful research has focused on the bipolar memristor synapse, which is capable of incremental weight alterations and can provide adaptive self-organisation under a Hebbian learning scheme. In this paper, we consider the unipolar memristor synapse - a device capable of non-Hebbian switching between only two states (conductive and resistive) through application of a suitable input voltage - and discuss its suitability for neuromorphic systems. A self-adaptive evolutionary process is used to autonomously find highly fit network configurations. Experimentation on two robotics tasks shows that unipolar memristor networks evolve task-solving controllers faster than both bipolar memristor networks and networks containing constant non-plastic connections whilst performing at least comparably.

  1. Life cycle planning: An evolving concept

    SciTech Connect

    Moore, P.J.R.; Gorman, I.G.

    1994-12-31

Life-cycle planning is an evolving concept in the management of oil and gas projects. BHP Petroleum now interprets this idea to include all development planning from discovery and field appraisal to final abandonment, covering safety, environmental, technical, plant, regulatory, and staffing issues. This article describes, in the context of the Timor Sea, how, despite initial successes and continuing facilities upgrades, BHPP came to perceive that current operations could be the victim of early development successes, particularly in the areas of corrosion and maintenance. The search for analogies elsewhere led to the UK North Sea, including the experiences of Britoil and BP, both of which performed detailed Life of Field studies in the late eighties. These materials have been used to construct a format and content for total life-cycle plans in general, and to identify the social changes required to ensure their successful application in Timor Sea operations and deployment throughout Australia.

  2. Properties of evolving e-mail networks

    NASA Astrophysics Data System (ADS)

    Wang, Juan; de Wilde, Philippe

    2004-12-01

    Computer viruses spread by attaching to an e-mail message and sending themselves to users whose addresses are in the e-mail address book of the recipients. Here we investigate a simple model of an evolving e-mail network, with nodes as e-mail address books of users and links as the records of e-mail addresses in the address books. Within specific periods, some new links are generated and some old links are deleted. We study the statistical properties of this e-mail network and observe the effect of the evolution on the structure of the network. We also find that the balance between the generation procedure and deletion procedure is dependent on different parameters of the model.
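
    The paper's exact generation and deletion rules are not given in the abstract; a toy sketch of the idea it describes (address-book links appear and disappear each period, and the balance between the two processes sets the steady-state density) might look like:

    ```python
    import random

    def evolve(n_nodes, steps, p_add, p_del, seed=1):
        """Toy evolving e-mail network: nodes are address books, directed
        links are recorded addresses; each step may add one random record
        and may delete one existing record."""
        rng = random.Random(seed)
        links = set()
        for _ in range(steps):
            if rng.random() < p_add:               # generation procedure
                a, b = rng.sample(range(n_nodes), 2)
                links.add((a, b))
            if links and rng.random() < p_del:     # deletion procedure
                links.discard(rng.choice(sorted(links)))
        return links

    net = evolve(n_nodes=50, steps=2000, p_add=0.9, p_del=0.3)
    print(len(net))  # link count reflects the add/delete balance
    ```

    Sweeping `p_add` against `p_del` is the kind of experiment the abstract alludes to when it says the balance depends on the model's parameters.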

  3. Design Space Issues for Intrinsic Evolvable Hardware

    NASA Technical Reports Server (NTRS)

    Hereford, James; Gwaltney, David

    2004-01-01

This paper discusses the problem of increased programming time for intrinsic evolvable hardware (EHW) as the complexity of the circuit grows. We develop equations for the size of the population, n, and the number of generations required for the population to converge, ngen, based on L, the length of the programming string. We show that the processing time of the computer becomes negligible for intrinsic EHW since the selection/crossover/mutation steps are only done once per generation, suggesting there is room for more complex evolutionary algorithms in intrinsic EHW. Finally, we review the state of the practice and discuss the notion of a system design approach for intrinsic EHW.

  4. Analysis of an evolving email network

    NASA Astrophysics Data System (ADS)

    Zhu, Chaopin; Kuh, Anthony; Wang, Juan; de Wilde, Philippe

    2006-10-01

In this paper we study an evolving email network model first introduced, to the best of our knowledge, by Wang and De Wilde. The model is analyzed by formulating the network topology as a random process and studying the dynamics of the process. Our analytical results establish a number of steady-state properties of the email traffic between different nodes and of the aggregate networking behavior (i.e., degree distribution, clustering coefficient, average path length, and phase transition), and also confirm the empirical results obtained by Wang and De Wilde. We also conducted simulations confirming the analytical results. Extensive simulations were run to evaluate email traffic behavior at the link and network levels, phase transition phenomena, and the behavior of email traffic in a hierarchical network. The methods established here are also applicable to many other practical networks, including sensor networks and social networks.

  5. Tracking correlated, simultaneously evolving target populations, II

    NASA Astrophysics Data System (ADS)

    Mahler, Ronald

    2017-05-01

This paper is the sixth in a series aimed at weakening the independence assumptions that are typically presumed in multitarget tracking. Earlier papers investigated Bayes filters that propagate the correlations between two evolving multitarget systems. Last year at this conference we attempted to derive PHD filter-type approximations that account for both spatial correlation and cardinality correlation (i.e., correlation between the target numbers of the two systems). Unfortunately, this approach required heuristic models of both clutter and target appearance in order to incorporate both spatial and cardinality correlation. This paper describes a fully rigorous approach, provided, however, that spatial correlation between the two populations is ignored and only their cardinality correlations are taken into account. We derive the time-update and measurement-update equations for a CPHD filter describing the evolution of such correlated multitarget populations.

  6. Evolvement of molecular nanomagnets in China.

    PubMed

    Wang, Bing-Wu; Wang, Xin-Yi; Sun, Hao-Ling; Jiang, Shang-Da; Gao, Song

    2013-10-13

Molecular nanomagnets have been under development for 20 years, since the first single-molecule magnet (SMM), Mn₁₂Ac, was characterized as a magnet that behaves at the molecular level. Multidisciplinary efforts have tuned their magnetic characteristics to be more suitable for use in information science and spintronics. The concept of molecular nanomagnets has also evolved to include single-chain magnets (SCMs), single-ion magnets (SIMs) and even magnetic molecules that show only slow magnetic relaxation, in addition to the initial cluster-type SMMs. In this review, several aspects, including SMMs, SCMs and SIMs, are introduced briefly through representative examples. In particular, the contribution of Chinese chemists is highlighted in the design, synthesis and understanding of various types of molecular nanomagnets.

  7. Renal cell carcinoma: Evolving and emerging subtypes

    PubMed Central

    Crumley, Suzanne M; Divatia, Mukul; Truong, Luan; Shen, Steven; Ayala, Alberto G; Ro, Jae Y

    2013-01-01

    Our knowledge of renal cell carcinoma (RCC) is rapidly expanding. For those who diagnose and treat RCC, it is important to understand the new developments. In recent years, many new renal tumors have been described and defined, and our understanding of the biology and clinical correlates of these tumors is changing. Evolving concepts in Xp11 translocation carcinoma, mucinous tubular and spindle cell carcinoma, multilocular cystic clear cell RCC, and carcinoma associated with neuroblastoma are addressed within this review. Tubulocystic carcinoma, thyroid-like follicular carcinoma of kidney, acquired cystic disease-associated RCC, and clear cell papillary RCC are also described. Finally, candidate entities, including RCC with t(6;11) translocation, hybrid oncocytoma/chromophobe RCC, hereditary leiomyomatosis and RCC syndrome, and renal angiomyoadenomatous tumor are reviewed. Knowledge of these new entities is important for diagnosis, treatment and subsequent prognosis. This review provides a targeted summary of new developments in RCC. PMID:24364021

  8. The distances of highly evolved planetary nebulae

    NASA Astrophysics Data System (ADS)

    Phillips, J. P.

    2005-02-01

The central stars of highly evolved planetary nebulae (PNe) are expected to have closely similar absolute visual magnitudes MV. This enables us to determine approximate distances to these sources where one knows their central star visual magnitudes and levels of extinction. We find that such an analysis implies values of D which are similar to those determined by Phillips; Cahn, Kaler & Stanghellini; Acker; and Daub. However, our distances are very much smaller than those of Zhang; Bensby & Lundstrom; and van de Steene & Zijlstra. The reasons for these differences are discussed, and can be traced to errors in the assumed relation between brightness temperature and radius. Finally, we determine that the binary companions of such stars can be no brighter than MV ~ 6 mag, implying a spectral type of K0 or later in the case of main-sequence stars.

  9. Modelling of the Evolving Stable Boundary Layer

    NASA Astrophysics Data System (ADS)

    Sorbjan, Zbigniew

    2014-06-01

    A single-column model of the evolving stable boundary layer (SBL) is tested for self-similar properties of the flow and effects of ambient forcing. The turbulence closure of the model is diagnostic, based on the K-theory approach, with a semi-empirical form of the mixing length, and empirical stability functions of the Richardson number. The model results, expressed in terms of local similarity scales, are universal functions, satisfied in the entire SBL. Based on similarity expression, a realizability condition is derived for the minimum allowable turbulent heat flux in the SBL. Numerical experiments show that the development of "horse-shoe" shaped, fixed-elevation hodographs in the interior of the SBL around sunrise is controlled by effects imposed by surface thermal forcing.

  10. Pulmonary Sporotrichosis: An Evolving Clinical Paradigm.

    PubMed

    Aung, Ar K; Spelman, Denis W; Thompson, Philip J

    2015-10-01

In recent decades, sporotrichosis, caused by the thermally dimorphic fungi of the Sporothrix schenckii complex, has become an emerging infection in many parts of the world. Pulmonary infection with S. schenckii still remains relatively uncommon, possibly due to underrecognition. Pulmonary sporotrichosis presents with distinct clinical and radiological patterns in both immunocompetent and immunocompromised hosts and can often result in significant morbidity and mortality despite treatment. Current understanding regarding S. schenckii biology, epidemiology, immunopathology, clinical diagnostics, and treatment options has been evolving in recent years with the increased availability of molecular sequencing techniques. However, this changing knowledge has not yet been fully translated into a better understanding of the clinical aspects of pulmonary sporotrichosis; as such, current management guidelines remain unsupported by high-level clinical evidence. This article examines recent advances in the knowledge of sporotrichosis and its application to the difficult challenges of managing pulmonary sporotrichosis.

  11. Mach cones in an evolving medium

    SciTech Connect

    Renk, Thorsten; Ruppert, Joerg

    2006-01-15

The energy and momentum lost by a hard parton propagating through hot and dense matter has to be redistributed in the nuclear medium. Apart from heating the medium, there is the possibility that collective modes are excited. We outline a formalism that can be used to track the propagation of such a mode through the evolving medium if its dispersion relation is known. Under the assumption that a sound wave is created, we track the jet energy loss as a function of spacetime and follow the resulting Mach cone throughout the fireball evolution. We compare with the angular correlation pattern of hard hadrons as obtained by the PHENIX Collaboration and find good agreement with the data, provided that a substantial fraction of the jet energy (~90%) is deposited into a propagating mode and that the hot matter can be characterized by an equation of state with a soft point (not necessarily a mixed phase).

  12. Finch: A System for Evolving Java (Bytecode)

    NASA Astrophysics Data System (ADS)

    Orlov, Michael; Sipper, Moshe

    The established approach in genetic programming (GP) involves the definition of functions and terminals appropriate to the problem at hand, after which evolution of expressions using these definitions takes place. We have recently developed a system, dubbed FINCH (Fertile Darwinian Bytecode Harvester), to evolutionarily improve actual, extant software, which was not intentionally written for the purpose of serving as a GP representation in particular, nor for evolution in general. This is in contrast to existing work that uses restricted subsets of the Java bytecode instruction set as a representation language for individuals in genetic programming. The ability to evolve Java programs will hopefully lead to a valuable new tool in the software engineer's toolkit.

  13. Language as a coordination tool evolves slowly

    PubMed Central

    2016-01-01

    Social living ultimately depends on coordination between group members, and communication is necessary to make this possible. We suggest that this might have been the key selection pressure acting on the evolution of language in humans and use a behavioural coordination model to explore the impact of communication efficiency on social group coordination. We show that when language production is expensive but there is an individual benefit to the efficiency with which individuals coordinate their behaviour, the evolution of efficient communication is selected for. Contrary to some views of language evolution, the speed of evolution is necessarily slow because there is no advantage in some individuals evolving communication abilities that much exceed those of the community at large. However, once a threshold competence has been achieved, evolution of higher order language skills may indeed be precipitate. PMID:28083091

  14. Evolving surgical approaches in liver transplantation.

    PubMed

    Petrowsky, Henrik; Busuttil, Ronald W

    2009-02-01

The growing discrepancy between the need for and the availability of donor livers has resulted in evolving surgical approaches in liver transplantation during the last two decades to expand the donor pool. One approach is to transplant partial grafts, obtained either from a living donor or by splitting a cadaveric donor liver. For both surgical methods, it is important to obtain a minimal viable graft volume to prevent small-for-size syndrome and graft failure. This minimal volume, expressed as graft-to-whole-body ratio, must be between 0.8 and 1%. Living donor liver transplantation (LDLT) became the primary transplant option in many Asian countries and is increasingly performed as an adjunct transplant option in countries with low donation rates. Split liver transplantation (SLT) is a surgical method that creates two allografts from one deceased donor. The most widely used splitting technique is the division of the liver into a left lateral sectoral graft (segments 2 and 3) for a pediatric patient and a right trisegmental graft (segments 1 and 4 to 8) for an adult patient. Both LDLT and SLT are also important and established methods for the treatment of pediatric patients. Another evolving surgical approach is auxiliary liver transplantation, which describes transplanting a whole or partial graft with preservation of the partial native liver. This bridging technique is applied in patients with fulminant liver failure and should allow the regeneration of the injured liver, with the potential to discontinue immunosuppression. Other methods such as xenotransplantation, as well as hepatocyte and stem cell transplantation, are promising approaches that are still in experimental phases.

  15. Evolving roles of highly successful mentors.

    PubMed

    Melanson, Mark A

    2007-01-01

This article was written to groom mentors, old and new, by identifying the evolving roles that highly successful mentors assume in order to share wisdom with their protégés. First, aspiring mentors need to become subject matter experts in their profession by achieving the relevant benchmarks of mastery in their career fields. Next, they must win the coveted role of respected leader by being trustworthy and putting the development of others first. The mastery of knowledge and earning of respect must also be balanced by genuine humility if the mentor is to be effective in sharing what he or she knows. Highly successful mentors are teachers, plain and simple, and they are most effective in this role when they have a deep passion for teaching. By properly using reflection, adroit mentors look back upon their many adventures and carefully select critical stories to share with their protégés. While outstanding mentors have a lot to say, they need to spend most of their time listening to those they mentor to ensure that they understand their protégés' individualized needs and goals. To help protégés through the setbacks and disappointments that can accompany an Army career, high-speed mentors need to don the role of optimistic cheerleaders, encouraging those they mentor and restoring hope. Once a striving mentor has finally reached the summit of mentoring and is a trusted counselor, he or she must continue to nurture and protect this fragile, but most influential, role. Lastly, great mentors need to be consummate students of both their profession and mentoring so that they remain vital and responsive to those they mentor. In closing, it is hoped that by considering and embracing these evolving roles, dedicated students of mentoring will derive a deeper satisfaction and have greater success in this critically important leadership responsibility.

  16. Studying evolved stars with Herschel observations

    NASA Astrophysics Data System (ADS)

    da Silva Santos, João Manuel

    2016-07-01

A systematic inspection of the far-infrared (FIR) properties of evolved stars allows us not only to constrain physical models but also to understand the chemical evolution that takes place at the end of their lives. In this work we study the circumstellar envelopes (CSEs) of a sample of stars in the THROES catalogue, from AGB/post-AGB stars to planetary nebulae, using photometry and spectroscopy provided by the PACS instrument on board the Herschel telescope. In the first part we obtain an estimate of the size of the FIR-emitting region and sort our targets into two classes: point-like and extended. Secondly, we focus on the molecular component of the envelope traced by carbon monoxide (CO) rotational lines. We conduct a line survey on a sample of evolved stars, identifying and measuring the flux of both 12CO and 13CO isotopologues in the PACS range, while looking at the overall properties of the sample. Lastly, we obtain physical parameters of the CSE, namely gas temperature, mass and mass-loss rate, for a sample of carbon stars. For that, we make use of the large wavelength coverage of PACS, which enables the simultaneous study of a large number of CO transitions, to perform a rotational diagram analysis. We report the detection of CO emission in a large number of stars from the catalogue, mostly classified as point-like targets, with a few exceptions of planetary nebulae. High-J rotational transitions were detected in a number of targets, revealing the presence of a significant amount of hot gas (T ~ 400-900 K) and high mass-loss rates. We conclude that Herschel/PACS is in a privileged position to detect a new population of warmer gas, typically missed in sub-mm/mm observations.
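
    A rotational diagram analysis fits a straight line to ln(N_u/g_u) versus upper-level energy E_u, whose slope gives -1/(k T_rot). A minimal sketch with invented, perfectly thermalized level populations (the catalogue's actual fluxes are not reproduced here):

    ```python
    import math

    K_CM = 0.695  # Boltzmann constant in cm^-1 per K (k/hc, approx.)

    def rotation_temperature(points):
        """Least-squares line through ln(N_u/g_u) vs E_u (in cm^-1);
        the slope equals -1/(k * T_rot)."""
        xs = [e for e, _ in points]
        ys = [math.log(n) for _, n in points]
        n = len(points)
        xbar, ybar = sum(xs) / n, sum(ys) / n
        slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
                / sum((x - xbar) ** 2 for x in xs)
        return -1.0 / (slope * K_CM)

    # hypothetical (E_u [cm^-1], N_u/g_u) pairs consistent with T = 500 K
    T = 500.0
    pts = [(E, math.exp(30.0 - E / (K_CM * T))) for E in (50, 150, 400, 800, 1500)]
    print(round(rotation_temperature(pts)))  # recovers the input T of ~500 K
    ```

    With real PACS line fluxes, N_u/g_u would first be derived from each line's flux, frequency and Einstein A coefficient; curvature in the diagram then signals gas at more than one temperature, as with the hot component reported above.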

  17. How evolved psychological mechanisms empower cultural group selection.

    PubMed

    Henrich, Joseph; Boyd, Robert

    2016-01-01

    Driven by intergroup competition, social norms, beliefs, and practices can evolve in ways that more effectively tap into a wide variety of evolved psychological mechanisms to foster group-beneficial behavior. The more powerful such evolved mechanisms are, the more effectively culture can potentially harness and manipulate them to generate greater phenotypic variation across groups, thereby fueling cultural group selection.

  18. Conditional Reliability Coefficients for Test Scores.

    PubMed

    Nicewander, W Alan

    2017-04-06

    The most widely used general index of measurement precision for psychological and educational test scores is the reliability coefficient: a ratio of the true variance of a test score to its true-plus-error variance. In item response theory (IRT) models for test scores, the information function is the central, conditional index of measurement precision. In this inquiry, conditional reliability coefficients for a variety of score types are derived as simple transformations of information functions. It is shown, for example, that the conditional reliability coefficient for an ordinary number-correct score, X, is equal to ρ(X,X′|θ) = I(X,θ)/[I(X,θ)+1], where θ is a latent variable measured by an observed test score X, ρ(X,X′|θ) is the conditional reliability of X at a fixed value of θ, and I(X,θ) is the score information function. This is a surprisingly simple relationship between the two basic indices of measurement precision from IRT and classical test theory (CTT). The relationship holds for item scores as well as test scores based on sums of item scores, and for dichotomous as well as polytomous items, or a mix of both item types. Conditional reliabilities are also derived for computerized adaptive test scores and for θ-estimates used as alternatives to number-correct scores; these conditional reliabilities are all related to information in a manner similar or identical to the one given above for the number-correct (NC) score.
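    The identity reported in this abstract is easy to verify numerically. A minimal sketch, assuming a Rasch model purely for illustration (the item difficulties below are hypothetical):

```python
import math

def conditional_reliability(info):
    """The abstract's IRT-CTT link: rho(X, X'|theta) = I(X,theta) / [I(X,theta) + 1]."""
    return info / (info + 1.0)

def rasch_test_information(theta, item_difficulties):
    """Test information for a Rasch model: sum of p_i * (1 - p_i) over items."""
    total = 0.0
    for b in item_difficulties:
        p = 1.0 / (1.0 + math.exp(-(theta - b)))
        total += p * (1.0 - p)
    return total

items = [-1.0, -0.5, 0.0, 0.5, 1.0]        # hypothetical item difficulties
info = rasch_test_information(0.0, items)  # score information at theta = 0
rel = conditional_reliability(info)        # conditional reliability at theta = 0
```

    Information of 1 maps to a conditional reliability of 0.5, and reliability approaches 1 as information grows, matching the transformation in the abstract.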

  19. Interrelation Between Safety Factors and Reliability

    NASA Technical Reports Server (NTRS)

    Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    An evaluation was performed to establish relationships between safety factors and reliability. The results show that use of the safety factor is not contradictory to the employment of probabilistic methods; in many cases safety factors can be expressed directly in terms of the required reliability levels. However, there is a major difference that must be emphasized: whereas safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. Establishing the interrelation between the two concepts opens an avenue to specify safety factors based on reliability. In cases where there are several failure modes, the allocation of safety factors should be based on having the same reliability associated with each failure mode. This immediately suggests that probabilistic methods can eliminate existing over-design or under-design. The report includes three parts: Part 1, Random Actual Stress and Deterministic Yield Stress; Part 2, Deterministic Actual Stress and Random Yield Stress; Part 3, Both Actual Stress and Yield Stress Are Random.
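    The configuration of Part 1 (random actual stress, deterministic yield stress) can be sketched directly; the normal stress model and all numbers below are illustrative assumptions, not values from the report:

```python
import math
import random

def reliability_normal_stress(mu_s, sigma_s, yield_stress):
    """P(stress < yield) for normally distributed actual stress and a
    deterministic yield stress, via the standard normal CDF."""
    z = (yield_stress - mu_s) / sigma_s
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu_s, sigma_s, y = 100.0, 15.0, 145.0  # hypothetical stress statistics and yield (MPa)
safety_factor = y / mu_s               # central safety factor: 1.45
reliability = reliability_normal_stress(mu_s, sigma_s, y)

# Monte Carlo check of the same probability
random.seed(0)
n = 200_000
hits = sum(1 for _ in range(n) if random.gauss(mu_s, sigma_s) < y)
mc_reliability = hits / n
```

    Holding the stress distribution fixed, each required reliability level picks out a unique yield margin, i.e. the safety factor can be expressed directly in terms of the required reliability, as the report argues.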

  20. Static and Evolving Norovirus Genotypes: Implications for Epidemiology and Immunity.

    PubMed

    Parra, Gabriel I; Squires, R Burke; Karangwa, Consolee K; Johnson, Jordan A; Lepore, Cara J; Sosnovtsev, Stanislav V; Green, Kim Y

    2017-01-01

    Noroviruses are major pathogens associated with acute gastroenteritis worldwide. Their RNA genomes are diverse, with two major genogroups (GI and GII) comprising at least 28 genotypes associated with human disease. To elucidate mechanisms underlying norovirus diversity and evolution, we used a large-scale genomics approach to analyze human norovirus sequences. Comparison of over 2000 nearly full-length ORF2 sequences representing most of the known GI and GII genotypes infecting humans showed a limited number (≤5) of distinct intra-genotypic variants within each genotype, with the exception of GII.4. The non-GII.4 genotypes comprised one or more intra-genotypic variants, with each variant containing strains that differed by only a few residues over several decades (remaining "static") and that have co-circulated with no clear epidemiologic pattern. In contrast, the GII.4 genotype presented the largest number of variants (>10) that have evolved over time with a clear pattern of periodic variant replacement. To expand our understanding of these two patterns of diversification ("static" versus "evolving"), we used next-generation sequencing (NGS) to analyze the nearly full-length norovirus genome in healthy individuals infected with GII.4, GII.6 or GII.17 viruses in different outbreak settings. The GII.4 viruses accumulated mutations rapidly within and between hosts, while the GII.6 and GII.17 viruses remained relatively stable, consistent with their diversification patterns. Further analysis of genetic relationships and natural history patterns identified groupings of certain genotypes into larger related clusters designated here as "immunotypes". We propose that "immunotypes" and their evolutionary patterns influence the prevalence of a particular norovirus genotype in the human population.

  1. Capturing the Interpersonal Implications of Evolved Preferences? Frequency of Sex Shapes Automatic, But Not Explicit, Partner Evaluations

    PubMed Central

    Hicks, Lindsey L.; McNulty, James K.; Meltzer, Andrea L.; Olson, Michael A.

    2016-01-01

    Sex is crucial to reproduction, and thus humans likely evolved a strong predisposition to engage in sexual intercourse. Given that meeting interpersonal preferences tends to promote positive relationship evaluations, sex within a relationship should be positively associated with relationship satisfaction. Nevertheless, prior research has been inconclusive in demonstrating such a link, with longitudinal and experimental studies showing no association between sexual frequency and relationship satisfaction. Crucially, though, all prior research has utilized explicit reports of satisfaction, which reflect deliberative processes that may override the more automatic implications of phylogenetically older evolved preferences. Accordingly, capturing the implications of sexual frequency for relationship evaluations may require implicit measurements that bypass deliberative reasoning. Consistent with this idea, one cross-sectional and one three-year study of newlywed couples revealed a positive association between sexual frequency and automatic partner evaluations but not explicit satisfaction. These findings highlight the importance of automatic measurements to understanding interpersonal relationships. PMID:27084851

  2. Ultimately Reliable Pyrotechnic Systems

    NASA Technical Reports Server (NTRS)

    Scott, John H.; Hinkel, Todd

    2015-01-01

    This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high-reliability operation in extreme environments and illustrates potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the more successful in the history of human spaceflight: no pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably at temperatures as low as -420 F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, without a single failure. The recent landing on Mars of the Curiosity rover fired 174 of NASA's pyrotechnic initiators to complete the famous '7 minutes of terror.' Even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have also fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world. Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing

  3. CR reliability testing

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Rill, Lynn; Frost, Meryll M.; Staab, Edward V.

    1998-07-01

    The purpose of this work was to develop a method for systematically testing the reliability of a CR system under realistic daily loads in a non-clinical environment prior to its clinical adoption. Once digital imaging replaces film, it will be very difficult to revert should the digital system become unreliable. Prior to the beginning of the test, a formal evaluation was performed to set benchmarks for performance and functionality. A formal protocol was established that included all 62 imaging plates in the inventory in each 24-hour period of the study. Imaging plates were exposed using different combinations of collimation, orientation, and SID. Anthropomorphic phantoms were used to acquire images of different sizes. Each combination was chosen randomly to simulate the differences that could occur in clinical practice. The tests were performed over a wide range of times, with batches of plates processed to simulate the temporal constraints imposed by portable radiographs taken in the Intensive Care Unit (ICU). Current patient demographics were used for the test studies so automatic routing algorithms could be tested. During the test, only three minor reliability problems occurred, two of which were not directly related to the CR unit. One plate was discovered to cause a segmentation error that essentially reduced the image to only black and white with no gray levels; this plate was removed from the inventory to be replaced. Another problem was a PACS routing failure that occurred when the DICOM server with which the CR was communicating ran low on disk space. The final problem was a network printing failure to the laser cameras. Although the units passed the reliability test, problems with interfacing to workstations were discovered. The two issues identified were the interpretation of what constitutes a study for CR and the construction of the look-up table for proper gray scale display.

  4. Blade reliability collaborative :

    SciTech Connect

    Ashwill, Thomas D.; Ogilvie, Alistair B.; Paquette, Joshua A.

    2013-04-01

    The Blade Reliability Collaborative (BRC) was started by the Wind Energy Technologies Department of Sandia National Laboratories and DOE in 2010 with the goal of gaining insight into planned and unplanned O&M issues associated with wind turbine blades. A significant part of BRC is the Blade Defect, Damage and Repair Survey task, which will gather data from blade manufacturers, service companies, operators and prior studies to determine details about the largest sources of blade unreliability. This report summarizes the initial findings from this work.

  5. Reliability Growth Prediction

    DTIC Science & Technology

    1986-09-01

    the Duane model because the reliability growth data analyzed were reflective of a single test for each equipment as opposed to a series of tests ...fabrication) and costs which are a function of test length (e.g., chamber operations). A life-cycle cost model, Ref. 14 for example, can be exercised to ... J. Gibson and K. K. Mcain. Approved for public release; distribution unlimited. Rome Air Development Center, Air Force Systems Command, Griffiss

  6. Reliable VLSI sequential controllers

    NASA Technical Reports Server (NTRS)

    Whitaker, S.; Maki, G.; Shamanna, M.

    1990-01-01

    A VLSI architecture for synchronous sequential controllers is presented that has attractive qualities for producing reliable circuits. In these circuits, one hardware implementation can realize any flow table with a maximum of 2^n internal states and m inputs, and all design equations are identical. A real-time fault detection mechanism is presented along with a strategy for verifying the correctness of the checking hardware; this self-check feature can be employed with no increase in hardware. The architecture can be modified to achieve fail-safe designs. With no increase in hardware, an adaptable circuit can be realized that allows replacement of faulty transitions with fault-free transitions.

  7. Reliability, synchrony and noise

    PubMed Central

    Ermentrout, G. Bard; Galán, Roberto F.; Urban, Nathaniel N.

    2008-01-01

    The brain is noisy. Neurons receive tens of thousands of highly fluctuating inputs and generate spike trains that appear highly irregular. Much of this activity is spontaneous—uncoupled to overt stimuli or motor outputs—leading to questions about the functional impact of this noise. Although noise is most often thought of as disrupting patterned activity and interfering with the encoding of stimuli, recent theoretical and experimental work has shown that noise can play a constructive role—leading to increased reliability or regularity of neuronal firing in single neurons and across populations. These results raise fundamental questions about how noise can influence neural function and computation. PMID:18603311

  8. [How Reliable is Neuronavigation?].

    PubMed

    Stieglitz, Lennart Henning

    2016-02-17

    Neuronavigation plays a central role in modern neurosurgery. It allows instruments and three-dimensional image data to be visualized intraoperatively and supports spatial orientation, which can reduce surgical risks and speed up complex surgical procedures. The growing availability and importance of neuronavigation make clear how relevant it is to know about its reliability and accuracy. Different factors may influence accuracy during surgery unnoticed, misleading the surgeon. Besides the best possible optimization of the systems themselves, a good knowledge of their weaknesses is mandatory for every neurosurgeon.

  9. Ferrite logic reliability study

    NASA Technical Reports Server (NTRS)

    Baer, J. A.; Clark, C. B.

    1973-01-01

    Development and use of digital circuits called all-magnetic logic are reported. In these circuits the magnetic elements and their windings comprise the active circuit devices in the logic portion of a system. The ferrite logic (FLO) device belongs to the all-magnetic class of logic circuits. The FLO device is novel in that it makes use of a dual or bimaterial ferrite composition in one physical ceramic body. This bimaterial feature, coupled with its potential for relatively high-speed operation (maximum speed of operation approximately 50 kHz), makes it attractive for high-reliability applications.

  10. Testing of reliability - Analysis tools

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1989-01-01

    An outline is presented of issues raised in verifying the accuracy of reliability analysis tools. State-of-the-art reliability analysis tools implement various decomposition, aggregation, and estimation techniques to compute the reliability of a diversity of complex fault-tolerant computer systems. However, no formal methodology has been formulated for validating the reliability estimates produced by these tools. The author presents three stages of testing that can be performed on most reliability analysis tools to effectively increase confidence in a tool. These testing stages were applied to the SURE (Semi-Markov Unreliability Range Evaluator) reliability analysis tool, and the results of the testing are discussed.

  11. Understanding the Elements of Operational Reliability: A Key for Achieving High Reliability

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    2010-01-01

    This viewgraph presentation reviews operational reliability and its role in achieving high reliability through design and process reliability. The topics include: 1) Reliability Engineering Major Areas and interfaces; 2) Design Reliability; 3) Process Reliability; and 4) Reliability Applications.

  12. Load Control System Reliability

    SciTech Connect

    Trudnowski, Daniel

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech in April 2006. Follow-on DOE awards and expansions to the project scope occurred in August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also included matching funds from the states of Montana and Wyoming. Project participants included Montana Tech, the University of Wyoming, Montana State University, NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies, and power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on the second area. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences, and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and the incubation of a new commercial-grade operation and control software tool. Work under this grant supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  13. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1990-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.
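    The discard rule in this record reduces to a short calculation. A sketch with hypothetical numbers (in practice the per-unit resistance would be calibrated from the two uniform-height test conductors):

```python
def inferred_length(total_resistance, resistance_per_unit):
    """Length implied by the end-to-end resistance of the serpentine assay
    conductor, using the per-unit resistance of the flat test conductors."""
    return total_resistance / resistance_per_unit

def discard_chip(design_length, total_resistance, resistance_per_unit, max_excess_pct):
    """Discard when the inferred length exceeds the design length by more than
    the allowed percentage (excess resistance concentrated at the steps)."""
    excess_pct = 100.0 * (inferred_length(total_resistance, resistance_per_unit)
                          - design_length) / design_length
    return excess_pct > max_excess_pct

# Hypothetical values: 10 mm design length, 2 ohm/mm per-unit resistance.
keep = not discard_chip(10.0, 20.4, 2.0, max_excess_pct=5.0)  # 2% excess: keep
fail = discard_chip(10.0, 22.4, 2.0, max_excess_pct=5.0)      # 12% excess: discard
```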

  14. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1988-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.

  15. The Reliability of Neurons

    PubMed Central

    Bullock, Theodore Holmes

    1970-01-01

    The prevalent probabilistic view is virtually untestable; it remains a plausible belief. The cases usually cited cannot be taken as evidence for it. Several grounds for this conclusion are developed. Three issues are distinguished in an attempt to clarify a murky debate: (a) the utility of probabilistic methods in data reduction, (b) the value of models that assume indeterminacy, and (c) the validity of the inference that the nervous system is largely indeterministic at the neuronal level. No exception is taken to the first two; the second is a private heuristic question. The third is the issue to which the assertion in the first two sentences is addressed. Of the two kinds of uncertainty, statistical-mechanical (= practical unpredictability), as in a gas, and Heisenbergian indeterminacy, the first certainly exists; the second is moot at the neuronal level. It would contribute to discussion to recognize that neurons perform with a degree of reliability. Although unreliability is difficult to establish, to say nothing of measure, evidence is increasing greatly that some neurons have a high degree of reliability in both connections and activity. An example is given from sternarchine electric fish. PMID:5462670

  16. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks for posing and answering these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made require neither the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop the efficient numerical tools necessary to employ this approach in practice, rendering the procedure ready to be used in current experiments. We demonstrate this by analyzing data from an experiment in which entangled two-photon states were generated and their entanglement verified with the use of an accessible nonlinear witness.

  17. VLSI design for reliability

    NASA Astrophysics Data System (ADS)

    Hajj, Ibrahim N.; Najm, Farid N.; Ping, Yang

    1990-05-01

    The results are reported of supplementary work related to the reliability analysis of application-specific very large scale integrated (ASIC VLSI) CMOS circuits. The electromigration susceptibility of VLSI circuits was determined. Electromigration is a major reliability problem caused by the transport of atoms in a metal line due to the electron flow. Under persistent current stress, electromigration can cause deformations of the metal lines which may result in shorts or open circuits. The failure rate due to electromigration depends on the current density in the metal lines and is usually expressed as a median time to failure (MTF). This work focuses on the electromigration problem in the power and ground busses. To estimate the bus MTF, an estimate of the current waveform in each branch of the bus is required. In general, the MTF depends on the shape of the current waveform, not simply on its time average. However, a very large number of such waveform shapes are possible, depending on what inputs are applied to the circuit. This is especially true for complementary metal-oxide-semiconductor circuits, which draw current only during switching.
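    The current-density dependence of MTF noted above is conventionally captured by Black's equation; the prefactor, current-density exponent, and activation energy below are illustrative assumptions, not values from this work:

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def black_mtf(current_density, temp_k, a=1.0e3, n=2.0, ea=0.7):
    """Black's equation: MTF = A * J**(-n) * exp(Ea / (k * T)).
    current_density in A/cm^2, temp_k in kelvin, ea in eV; A, n, ea are assumed."""
    return a * current_density ** (-n) * math.exp(ea / (BOLTZMANN_EV * temp_k))

mtf_lo = black_mtf(1.0e6, 373.0)  # lower current density
mtf_hi = black_mtf(2.0e6, 373.0)  # doubling J quarters the MTF when n = 2
```

    The J**(-n) nonlinearity is why the waveform shape, not just its time average, matters when estimating bus MTF: high-density intervals are penalized disproportionately.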

  18. Human reliability and confinement.

    PubMed

    Hauty, G T

    1964-01-01

    Problems inherent in the modifiability of circadian periodicity and in impoverished sensory environments were explored for the purpose of appraising attenuative effects upon human reliability. Accordingly, highly selected subjects were confined within a one-man altitude chamber for prolonged periods of time and under a variety of designed conditions. The findings relative to the modifiability of biological rhythm indicate that adjustment to a drastic revision of the 24-hour biological day was accomplished to a significant and practical extent by certain subjects; that the extent of adjustment was directly related to the maintenance of high initial levels of proficiency; and that, just as subjects differ greatly in their adjustment to revised biological time, they differ to an equal extent in the degree of synchronization manifested by the apparent periodicities of the different physiological systems. In the investigation of impoverished sensory environments, it was found that the joint effects of impoverished sensory conditions and continuous work at an operator system drastically degraded the reliability of certain subjects. Further, neither prior experience nor knowledge acted to mitigate the degree of aberrancy experienced, which in the case of one subject was so extreme as to necessitate his removal from the chamber prior to the termination of the confinement period. Finally, management of certain aberrant behavior, specifically hallucinatory experiences, could be successfully achieved by those subjects who continuously attempted to maintain a diversity of sensory input.

  19. The evolving energy budget of accretionary wedges

    NASA Astrophysics Data System (ADS)

    McBeck, Jessica; Cooke, Michele; Maillot, Bertrand; Souloumiac, Pauline

    2017-04-01

    The energy budget of evolving accretionary systems reveals how deformational processes partition energy as faults slip, topography uplifts, and layer-parallel shortening produces distributed off-fault deformation. The energy budget provides a quantitative framework for evaluating the energetic contribution or consumption of diverse deformation mechanisms. We investigate energy partitioning in evolving accretionary prisms by synthesizing data from physical sand accretion experiments and numerical accretion simulations. We incorporate incremental strain fields and cumulative force measurements from two suites of experiments to design numerical simulations that represent accretionary wedges with stronger and weaker detachment faults. One suite of the physical experiments includes a basal glass bead layer and the other does not. Two physical experiments within each suite implement different boundary conditions (stable base versus moving base configuration). Synthesizing observations from the differing base configurations reduces the influence of sidewall friction because the force vector produced by sidewall friction points in opposite directions depending on whether the base is fixed or moving. With the numerical simulations, we calculate the energy budget at two stages of accretion: at the maximum force preceding the development of the first thrust pair, and at the minimum force following the development of the pair. To identify the appropriate combination of material and fault properties to apply in the simulations, we systematically vary the Young's modulus and the fault static and dynamic friction coefficients in numerical accretion simulations, and identify the set of parameters that minimizes the misfit between the normal force measured on the physical backwall and the numerically simulated force. Following this derivation of the appropriate material and fault properties, we calculate the components of the work budget in the numerical simulations and in the
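    The parameter derivation described above amounts to minimizing a force misfit over candidate parameters. A toy grid-search sketch; the linear stand-in forward model and all values are assumptions for illustration, not the study's mechanical simulations:

```python
def simulated_force(youngs_modulus, mu_static, c0=5.0, c1=2.0e-9, c2=40.0):
    """Stand-in for the numerically simulated backwall normal force; a real
    evaluation would run the accretion simulation, not this linear formula."""
    return c0 + c1 * youngs_modulus + c2 * mu_static

def best_fit(measured_force, e_grid, mu_grid):
    """Grid search minimizing the squared misfit between the physically
    measured backwall force and the simulated force."""
    best = None
    for e in e_grid:
        for mu in mu_grid:
            misfit = (simulated_force(e, mu) - measured_force) ** 2
            if best is None or misfit < best[0]:
                best = (misfit, e, mu)
    return best

e_grid = [1.0e9, 2.0e9, 3.0e9]   # candidate Young's moduli (Pa)
mu_grid = [0.3, 0.4, 0.5, 0.6]   # candidate static friction coefficients
misfit, e_best, mu_best = best_fit(25.0, e_grid, mu_grid)
```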

  20. Evolving Technologies in DoD Acquisition

    DTIC Science & Technology

    2007-05-01

    Among the challenges for business are business intelligence and its relationship to information technology budgets, which tend to focus on well...the programs themselves. Business intelligence typically falls outside of operationally focused priorities. Support of strategic initiatives thus drops to the bottom of IT’s priority tree.

  1. Sustaining an International Partnership: An Evolving Collaboration

    ERIC Educational Resources Information Center

    Pierson, Melinda R.; Myck-Wayne, Janice; Stang, Kristin K.; Basinska, Anna

    2015-01-01

    Universities across the United States have an increasing interest in international education. Increasing global awareness through educational collaborations will promote greater cross-cultural understanding and build effective relationships with diverse communities. This paper documents one university's effort to build an effective international…

  2. Initial value sensitivity of the Chinese stock market and its relationship with the investment psychology

    NASA Astrophysics Data System (ADS)

    Ying, Shangjun; Li, Xiaojun; Zhong, Xiuqin

    2015-04-01

    This paper discusses the initial value sensitivity (IVS) of the Chinese stock market, including single stocks and the Chinese A-share market, with respect to real markets and evolving models. The aim is to explore the relationship between the IVS of the Chinese A-share stock market and investment psychology, based on an evolving model of genetic cellular automata (GCA). We find: (1) the Chinese stock market is sensitively dependent on initial conditions; (2) the GCA model provides considerable reliability in simulating this complexity (e.g., the IVS); (3) the IVS of the stock market is positively correlated with the imitation probability once the intensity of the imitation psychology reaches a certain threshold. The paper suggests that the government should seek to keep imitation psychology below a certain level; otherwise it may induce severe fluctuations in the market.
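    Initial value sensitivity itself can be illustrated on a toy chaotic map; the logistic map below is a stand-in chosen for exposition only, not the paper's GCA market model:

```python
def logistic_trajectory(x0, steps, r=4.0):
    """Iterate the chaotic logistic map x -> r * x * (1 - x) from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

eps = 1e-10                       # tiny perturbation of the initial value
a = logistic_trajectory(0.2, 50)
b = logistic_trajectory(0.2 + eps, 50)
divergence = max(abs(x - y) for x, y in zip(a, b))  # grows far beyond eps
```

    The analogous experiment in the paper's setting perturbs the market model's initial state and tracks how quickly simulated price paths separate as the imitation probability varies.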

  3. Fatigue Reliability of Gas Turbine Engine Structures

    NASA Technical Reports Server (NTRS)

    Cruse, Thomas A.; Mahadevan, Sankaran; Tryon, Robert G.

    1997-01-01

    The results of an investigation of fatigue reliability in engine structures are described. The description consists of two parts: Part 1 covers method development, and Part 2 is a specific case study. In Part 1, the essential concepts and practical approaches to damage tolerance design in the gas turbine industry are summarized. These have evolved over the years in response to flight safety certification requirements. The effect of non-destructive evaluation (NDE) methods on these approaches is also reviewed. Assessment methods based on probabilistic fracture mechanics, with regard to both crack initiation and crack growth, are outlined. Limit state modeling techniques from structural reliability theory are shown to be appropriate for application to this problem, for both individual failure modes and system-level assessment. In Part 2, the results of a case study for the high pressure turbine of a turboprop engine are described. The response surface approach is used to construct a fatigue performance function. This performance function is used with the First Order Reliability Method (FORM) to determine the probability of failure and the sensitivity of the fatigue life to the engine parameters for the first-stage disk rim of the two-stage turbine. A hybrid combination of regression and Monte Carlo simulation is used to incorporate time-dependent random variables. System reliability methods are used to determine the system probability of failure and the sensitivity of the system fatigue life to the engine parameters of the high pressure turbine. The variation in the primary hot gas and secondary cooling air, the uncertainty of the complex mission loading, and the scatter in the material data are considered.
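    For the simplest case of a linear limit state with independent normal variables, FORM reduces to a closed form; the capacity and load numbers below are illustrative assumptions, not engine data:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def form_linear(mu_r, sigma_r, mu_s, sigma_s):
    """FORM for the limit state g = R - S with independent normal R (capacity)
    and S (load): beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2),
    probability of failure pf = Phi(-beta)."""
    beta = (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)
    return beta, norm_cdf(-beta)

beta, pf = form_linear(mu_r=500.0, sigma_r=40.0, mu_s=350.0, sigma_s=30.0)
```

    For a nonlinear response-surface performance function like the one in the study, beta is instead found by searching for the most probable failure point, but the final step pf = Phi(-beta) is the same.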

  4. Heritability and evolvability of fitness and nonfitness traits: Lessons from livestock.

    PubMed

    Hoffmann, Ary A; Merilä, Juha; Kristensen, Torsten N

    2016-08-01

    Data from natural populations have suggested a disconnection between trait heritability (variance-standardized additive genetic variance, VA) and evolvability (mean-standardized VA) and emphasized the importance of environmental variation as a determinant of trait heritability but not evolvability. However, these inferences are based on heterogeneous and often small datasets across species from different environments. We surveyed the relationship between evolvability and heritability in >100 traits in farmed cattle, taking advantage of large sample sizes and consistent genetic approaches. Heritability and evolvability estimates were positively correlated (r = 0.37/0.54 on untransformed/log scales) reflecting a substantial impact of VA on both measures. Furthermore, heritabilities and residual variances were uncorrelated. The differences between this and previously described patterns may reflect lower environmental variation experienced in farmed systems, but also low and heterogeneous quality of data from natural populations. Similar to studies on wild populations, heritabilities for life-history and behavioral traits were lower than for other traits. Traits having extremely low heritabilities and evolvabilities (17% of the studied traits) were almost exclusively life-history or behavioral traits, suggesting that evolutionary constraints stemming from lack of genetic variability are likely to be most common for classical "fitness" (cf. life-history) rather than for "nonfitness" (cf. morphological) traits. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.

  5. Evolving paradigms in multifocal breast cancer.

    PubMed

    Salgado, Roberto; Aftimos, Philippe; Sotiriou, Christos; Desmedt, Christine

    2015-04-01

    The 7th edition of the TNM defines multifocal breast cancer as multiple simultaneous ipsilateral and synchronous breast cancer lesions, provided they are macroscopically distinct and measurable using current traditional pathological and clinical tools. According to the College of American Pathologists (CAP), the characterization of only the largest lesion is considered sufficient, unless the grade and/or histology are different between the lesions. Here, we review three potentially clinically relevant aspects of multifocal breast cancers: first, the importance of a different intrinsic breast cancer subtype of the various lesions; second, the emerging awareness of inter-lesion heterogeneity; and last but not least, the potential introduction of bias in clinical trials due to the unrecognized biological diversity of these cancers. Although the current strategy of assessing the lesion with the largest diameter clearly has advantages in terms of costs and feasibility, this recommendation may not be sustainable over time and might need to be adapted to comply with evolving paradigms in breast cancer.

  6. Evolving dynamic web pages using web mining

    NASA Astrophysics Data System (ADS)

    Menon, Kartik; Dagli, Cihan H.

    2003-08-01

    The heterogeneity and the lack of structure that permeate much of the ever expanding information sources on the WWW make it difficult for the user to properly and efficiently access different web pages. Different users have different needs from the same web page. It is necessary to train the system to understand the needs and demands of the users; in other words, there is a need for efficient and proper web mining. In this paper, issues and possible ways of training the system and providing a high level of organization for semi-structured data available on the web are discussed. Web pages can be evolved based on the history of query searches, browsing, links traversed, and observation of user behavior such as bookmarking and time spent viewing. Fuzzy clustering techniques help in grouping users into natural groups; neural networks, association rules, and web traversal patterns help in efficient sequential analysis based on the user's previous searches and queries. In this paper we analyze web server logs using the above mentioned techniques to learn more about user interactions. Analyzing these web server logs helps to closely understand the user's behavior and web access patterns.

  7. An evolving model of online bipartite networks

    NASA Astrophysics Data System (ADS)

    Zhang, Chu-Xu; Zhang, Zi-Ke; Liu, Chuang

    2013-12-01

    Understanding the structure and evolution of online bipartite networks is a significant task since they play a crucial role in various e-commerce services nowadays. Recently, various attempts have been tried to propose different models, resulting in either power-law or exponential degree distributions. However, many empirical results show that the user degree distribution actually follows a shifted power-law distribution, the so-called Mandelbrot’s law, which cannot be fully described by previous models. In this paper, we propose an evolving model, considering two different user behaviors: random and preferential attachment. Extensive empirical results on two real bipartite networks, Delicious and CiteULike, show that the theoretical model can well characterize the structure of real networks for both user and object degree distributions. In addition, we introduce a structural parameter p, to demonstrate that the hybrid user behavior leads to the shifted power-law degree distribution, and the region of power-law tail will increase with the increment of p. The proposed model might shed some lights in understanding the underlying laws governing the structure of real online bipartite networks.
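    The hybrid attachment rule described above (each new link chooses its user preferentially by degree with probability p, uniformly at random otherwise) can be sketched in a few lines. This is an illustrative toy version of such a growth model, not the authors' exact formulation; the parameter values are arbitrary.

    ```python
    import random

    def evolve_bipartite(steps=20_000, p=0.7, seed=1):
        """Grow a toy bipartite-style network: each step a new user arrives
        with one link, and one additional object link attaches to an existing
        user, chosen preferentially by degree with probability p and
        uniformly at random otherwise. Returns the user degree sequence."""
        rng = random.Random(seed)
        degrees = [1]      # start with a single user of degree 1
        edge_ends = [0]    # multiset of link endpoints -> preferential pick
        for _ in range(steps):
            degrees.append(1)                  # new user, one initial link
            edge_ends.append(len(degrees) - 1)
            if rng.random() < p:
                u = rng.choice(edge_ends)          # preferential attachment
            else:
                u = rng.randrange(len(degrees))    # random attachment
            degrees[u] += 1
            edge_ends.append(u)
        return degrees

    degrees = evolve_bipartite()
    ```

    Mixing the two mechanisms is what produces a shifted power law P(k) ∝ (k + k₀)^(-γ) rather than a pure power law: the random-attachment component supplies the constant offset k₀, and raising p stretches the power-law tail.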

  8. Generative Representations for Evolving Families of Designs

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2003-01-01

    Since typical evolutionary design systems encode only a single artifact with each individual, each time the objective changes a new set of individuals must be evolved. When this objective varies in a way that can be parameterized, a more general method is to use a representation in which a single individual encodes an entire class of artifacts. In addition to saving time by avoiding the need for multiple evolutionary runs, the evolution of parameter-controlled designs can create families of artifacts with the same style and a reuse of parts between members of the family. In this paper an evolutionary design system is described which uses a generative representation to encode families of designs. Because a generative representation is an algorithmic encoding of a design, its input parameters are a way to control aspects of the design it generates. By evaluating individuals multiple times with different input parameters, the evolutionary design system creates individuals in which the input parameters control specific aspects of a design. This system is demonstrated on two design substrates: neural networks that solve the 3/5/7-parity problem and three-dimensional tables of varying heights.

  9. The continually evolving Clostridium difficile species.

    PubMed

    Cairns, Michelle D; Stabler, Richard A; Shetty, Nandini; Wren, Brendan W

    2012-08-01

    Clostridium difficile is a spore-forming Gram-positive bacterium that causes chronic diarrhea and sometimes life-threatening disease, mainly in elderly and hospitalized patients. The reported incidence of C. difficile infection has changed dramatically over the last decade and has been related to the emergence of distinct clonal lineages that appear more transmissible and cause more severe infection. These include PCR ribotypes 027, 017 and more recently 078. Population biology studies using multilocus sequence typing and whole-genome comparisons have helped to define the C. difficile species into four clonal complexes that include PCR ribotypes 027, 017, 078 and 023, as well as a general grouping of most other PCR ribotypes. Further analysis of strains from diverse sources and geographical origins reveals significant microdiversity of clonal complexes and confirms that C. difficile is continuing to evolve. The study of C. difficile represents a real-time global evolutionary experiment where the pathogen is responding to a range of selective pressures created by human activity and practices in healthcare settings. The advent of whole-genome sequencing coupled with phylogeny (phylogeography and phylohistory) will provide unprecedented detail on the local and global emergence and disappearance of C. difficile clones, and facilitate more rational approaches to disease control. This review will highlight the emergence of virulent C. difficile clones and our current understanding of the molecular epidemiology of the species.

  10. Extreme insular dwarfism evolved in a mammoth

    PubMed Central

    Herridge, Victoria L.; Lister, Adrian M.

    2012-01-01

    The insular dwarfism seen in Pleistocene elephants has come to epitomize the island rule; yet our understanding of this phenomenon is hampered by poor taxonomy. For Mediterranean dwarf elephants, where the most extreme cases of insular dwarfism are observed, a key systematic question remains unresolved: are all taxa phyletic dwarfs of a single mainland species Palaeoloxodon antiquus (straight-tusked elephant), or are some referable to Mammuthus (mammoths)? Ancient DNA and geochronological evidence have been used to support a Mammuthus origin for the Cretan ‘Palaeoloxodon’ creticus, but these studies have been shown to be flawed. On the basis of existing collections and recent field discoveries, we present new, morphological evidence for the taxonomic status of ‘P’. creticus, and show that it is indeed a mammoth, most probably derived from Early Pleistocene Mammuthus meridionalis or possibly Late Pliocene Mammuthus rumanus. We also show that Mammuthus creticus is smaller than other known insular dwarf mammoths, and is similar in size to the smallest dwarf Palaeoloxodon species from Sicily and Malta, making it the smallest mammoth species known to have existed. These findings indicate that extreme insular dwarfism has evolved to a similar degree independently in two elephant lineages. PMID:22572206

  11. Tearing Mode Stability of Evolving Toroidal Equilibria

    NASA Astrophysics Data System (ADS)

    Pletzer, A.; McCune, D.; Manickam, J.; Jardin, S. C.

    2000-10-01

    There are a number of toroidal equilibrium codes (such as JSOLVER, ESC, EFIT, and VMEC) and transport codes (such as TRANSP, BALDUR, and TSC) in our community that utilize differing equilibrium representations. There are also many heating and current drive codes (LSC and TORRAY) and stability codes (PEST1-3, GATO, NOVA, MARS, DCON, M3D) that require this equilibrium information. In an effort to provide seamless compatibility between the codes that produce and need these equilibria, we have developed two Fortran 90 modules, MEQ and XPLASMA, that serve as common interfaces between these two classes of codes. XPLASMA provides a common equilibrium representation for the heating and current drive applications, while MEQ provides common equilibrium and associated metric information needed by MHD stability codes. We illustrate the utility of this approach by presenting results of PEST-3 tearing stability calculations of an NSTX discharge performed on profiles provided by the TRANSP code. Using the MEQ module, the TRANSP equilibrium data are stored in a Fortran 90 derived type and passed to PEST3 as a subroutine argument. All calculations are performed on the fly, as the profiles evolve.

  12. Evolving dark energy with w not = -1.

    PubMed

    Hall, Lawrence J; Nomura, Yasunori; Oliver, Steven J

    2005-09-30

    Theories of evolving quintessence are constructed that generically lead to deviations from the w = -1 prediction of nonevolving dark energy. The small mass scale that governs evolution, m_φ ≈ 10^-33 eV, is radiatively stable, and the "Why now?" problem is solved. These results rest on seesaw cosmology: fundamental physics and cosmology can be broadly understood from only two mass scales, the weak scale ν and the Planck scale M. Requiring a scale of dark energy ρ_DE^(1/4) governed by ν²/M and a radiatively stable evolution rate m_φ given by ν⁴/M³ leads to a distinctive form for the equation of state w(z). Dark energy resides in the potential of a hidden axion field that is generated by a new QCD-like force that gets strong at the scale Λ ≈ ν²/M ≈ ρ_DE^(1/4). The evolution rate is given by a second seesaw that leads to the axion mass m_φ ≈ Λ²/f, with f ≈ M.

  13. Origins of stereoselectivity in evolved ketoreductases.

    PubMed

    Noey, Elizabeth L; Tibrewal, Nidhi; Jiménez-Osés, Gonzalo; Osuna, Sílvia; Park, Jiyong; Bond, Carly M; Cascio, Duilio; Liang, Jack; Zhang, Xiyun; Huisman, Gjalt W; Tang, Yi; Houk, Kendall N

    2015-12-22

    Mutants of Lactobacillus kefir short-chain alcohol dehydrogenase, used here as ketoreductases (KREDs), enantioselectively reduce the pharmaceutically relevant substrates 3-thiacyclopentanone and 3-oxacyclopentanone. These substrates differ by only the heteroatom (S or O) in the ring, but the KRED mutants reduce them with different enantioselectivities. Kinetic studies show that these enzymes are more efficient with 3-thiacyclopentanone than with 3-oxacyclopentanone. X-ray crystal structures of apo- and NADP(+)-bound selected mutants show that the substrate-binding loop conformational preferences are modified by these mutations. Quantum mechanical calculations and molecular dynamics (MD) simulations are used to investigate the mechanism of reduction by the enzyme. We have developed an MD-based method for studying the diastereomeric transition state complexes and rationalize different enantiomeric ratios. This method, which probes the stability of the catalytic arrangement within the theozyme, shows a correlation between the relative fractions of catalytically competent poses for the enantiomeric reductions and the experimental enantiomeric ratio. Some mutations, such as A94F and Y190F, induce conformational changes in the active site that enlarge the small binding pocket, facilitating accommodation of the larger S atom in this region and enhancing S-selectivity with 3-thiacyclopentanone. In contrast, in the E145S mutant and the final variant evolved for large-scale production of the intermediate for the antibiotic sulopenem, R-selectivity is promoted by shrinking the small binding pocket, thereby destabilizing the pro-S orientation.

  14. Origins of stereoselectivity in evolved ketoreductases

    PubMed Central

    Noey, Elizabeth L.; Tibrewal, Nidhi; Jiménez-Osés, Gonzalo; Osuna, Sílvia; Park, Jiyong; Bond, Carly M.; Cascio, Duilio; Liang, Jack; Zhang, Xiyun; Huisman, Gjalt W.; Tang, Yi; Houk, Kendall N.

    2015-01-01

    Mutants of Lactobacillus kefir short-chain alcohol dehydrogenase, used here as ketoreductases (KREDs), enantioselectively reduce the pharmaceutically relevant substrates 3-thiacyclopentanone and 3-oxacyclopentanone. These substrates differ by only the heteroatom (S or O) in the ring, but the KRED mutants reduce them with different enantioselectivities. Kinetic studies show that these enzymes are more efficient with 3-thiacyclopentanone than with 3-oxacyclopentanone. X-ray crystal structures of apo- and NADP+-bound selected mutants show that the substrate-binding loop conformational preferences are modified by these mutations. Quantum mechanical calculations and molecular dynamics (MD) simulations are used to investigate the mechanism of reduction by the enzyme. We have developed an MD-based method for studying the diastereomeric transition state complexes and rationalize different enantiomeric ratios. This method, which probes the stability of the catalytic arrangement within the theozyme, shows a correlation between the relative fractions of catalytically competent poses for the enantiomeric reductions and the experimental enantiomeric ratio. Some mutations, such as A94F and Y190F, induce conformational changes in the active site that enlarge the small binding pocket, facilitating accommodation of the larger S atom in this region and enhancing S-selectivity with 3-thiacyclopentanone. In contrast, in the E145S mutant and the final variant evolved for large-scale production of the intermediate for the antibiotic sulopenem, R-selectivity is promoted by shrinking the small binding pocket, thereby destabilizing the pro-S orientation. PMID:26644568

  15. Fast evolving pair-instability supernovae

    DOE PAGES

    Kozyreva, Alexandra; Gilmer, Matthew; Hirschi, Raphael; ...

    2016-10-06

    With an increasing number of superluminous supernovae (SLSNe) discovered, the question of their origin remains open and causes heated debates in the supernova community. Currently, there are three proposed mechanisms for SLSNe: (1) pair-instability supernovae (PISN), (2) magnetar-driven supernovae, and (3) models in which the supernova ejecta interacts with circumstellar material ejected before the explosion. Based on current observations of SLSNe, the PISN origin has been disfavoured for a number of reasons. Many PISN models provide overly broad light curves and too-reddened spectra, because of massive ejecta and a high amount of nickel. In the current study we re-examine PISN properties using progenitor models computed with the GENEC code. We calculate supernova explosions with FLASH and light curve evolution with the radiation hydrodynamics code STELLA. We find that high-mass models (200 M⊙ and 250 M⊙) at relatively high metallicity (Z=0.001) do not retain hydrogen in the outer layers and produce relatively fast evolving PISNe Type I, and might be suitable to explain some SLSNe. We also investigate uncertainties in light curve modelling due to codes, opacities, the nickel-bubble effect, and progenitor structure and composition.

  16. Fast evolving pair-instability supernovae

    SciTech Connect

    Kozyreva, Alexandra; Gilmer, Matthew; Hirschi, Raphael; Frohlich, Carla; Blinnikov, Sergey; Wollaeger, Ryan Thomas; Noebauer, Ulrich M.; van Rossum, Daniel R.; Heger, Alexander; Even, Wesley Paul; Waldman, Roni; Tolstov, Alexey; Chatzopoulos, Emmanouil; Sorokina, Elena

    2016-10-06

    With an increasing number of superluminous supernovae (SLSNe) discovered, the question of their origin remains open and causes heated debates in the supernova community. Currently, there are three proposed mechanisms for SLSNe: (1) pair-instability supernovae (PISN), (2) magnetar-driven supernovae, and (3) models in which the supernova ejecta interacts with circumstellar material ejected before the explosion. Based on current observations of SLSNe, the PISN origin has been disfavoured for a number of reasons. Many PISN models provide overly broad light curves and too-reddened spectra, because of massive ejecta and a high amount of nickel. In the current study we re-examine PISN properties using progenitor models computed with the GENEC code. We calculate supernova explosions with FLASH and light curve evolution with the radiation hydrodynamics code STELLA. We find that high-mass models (200 M⊙ and 250 M⊙) at relatively high metallicity (Z=0.001) do not retain hydrogen in the outer layers and produce relatively fast evolving PISNe Type I, and might be suitable to explain some SLSNe. We also investigate uncertainties in light curve modelling due to codes, opacities, the nickel-bubble effect, and progenitor structure and composition.

  17. Consensus in evolving networks of mobile agents

    NASA Astrophysics Data System (ADS)

    Baronchelli, Andrea; Díaz-Guilera, Albert

    2012-02-01

    Populations of mobile and communicating agents describe a vast array of technological and natural systems, ranging from sensor networks to animal groups. Here, we investigate how a group-level agreement may emerge in the continuously evolving networks defined by the local interactions of the moving individuals. We adopt a general scheme of motion in two dimensions and we let the individuals interact through the minimal naming game, a prototypical scheme to investigate social consensus. We distinguish different regimes of convergence determined by the emission range of the agents and by their mobility, and we identify the corresponding scaling behaviors of the consensus time. In the same way, we also rationalize the behavior of the maximum memory used during the convergence process, which determines the minimum cognitive/storage capacity needed by the individuals. Overall, we believe that the simple and general model presented in this talk can represent a helpful reference for a better understanding of the behavior of populations of mobile agents.
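    The minimal naming game mentioned above follows a simple pairwise protocol: a speaker utters a word from its inventory (inventing one if the inventory is empty); on success both agents collapse their inventories to that word, on failure the hearer adds it. A minimal sketch on a complete graph (i.e. without the mobility component of the study) could look like this; all parameter values are illustrative.

    ```python
    import random

    def naming_game(n=200, steps=200_000, seed=2):
        """Minimal naming game on a complete graph. Returns True once all
        agents share exactly one common word (global consensus)."""
        rng = random.Random(seed)
        inventories = [set() for _ in range(n)]
        next_word = 0
        for step in range(steps):
            speaker, hearer = rng.sample(range(n), 2)
            if not inventories[speaker]:
                inventories[speaker].add(next_word)  # invent a new word
                next_word += 1
            word = rng.choice(tuple(inventories[speaker]))
            if word in inventories[hearer]:
                # Success: both agents keep only the agreed word.
                inventories[speaker] = {word}
                inventories[hearer] = {word}
            else:
                # Failure: the hearer learns the word.
                inventories[hearer].add(word)
            if step % 200 == 0:
                words = {w for inv in inventories for w in inv}
                if len(words) == 1 and all(inventories):
                    return True
        return False

    converged = naming_game()
    ```

    The quantities studied in the paper map directly onto this sketch: the consensus time is the number of games until the function returns True, and the maximum memory is the peak total inventory size reached along the way.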

  18. Evolving application of biomimetic nanostructured hydroxyapatite.

    PubMed

    Roveri, Norberto; Iafisco, Michele

    2010-11-09

    By mimicking Nature, we can design and synthesize inorganic smart materials that are reactive to biological tissues. These smart materials can be utilized to design innovative third-generation biomaterials, which are able to not only optimize their interaction with biological tissues and environment, but also mimic biogenic materials in their functionalities. The biomedical applications involve increasing the biomimetic levels from chemical composition, structural organization, morphology, mechanical behavior, nanostructure, and bulk and surface chemical-physical properties until the surface becomes bioreactive and stimulates cellular materials. The chemical-physical characteristics of biogenic hydroxyapatites from bone and tooth have been described, in order to point out the elective sides, which are important to reproduce the design of a new biomimetic synthetic hydroxyapatite. This review outlines the evolving applications of biomimetic synthetic calcium phosphates, details the main characteristics of bone and tooth, where the calcium phosphates are present, and discusses the chemical-physical characteristics of biomimetic calcium phosphates, methods of synthesizing them, and some of their biomedical applications.

  19. How does cognition evolve? Phylogenetic comparative psychology.

    PubMed

    MacLean, Evan L; Matthews, Luke J; Hare, Brian A; Nunn, Charles L; Anderson, Rindy C; Aureli, Filippo; Brannon, Elizabeth M; Call, Josep; Drea, Christine M; Emery, Nathan J; Haun, Daniel B M; Herrmann, Esther; Jacobs, Lucia F; Platt, Michael L; Rosati, Alexandra G; Sandel, Aaron A; Schroepfer, Kara K; Seed, Amanda M; Tan, Jingzhi; van Schaik, Carel P; Wobber, Victoria

    2012-03-01

    Now more than ever animal studies have the potential to test hypotheses regarding how cognition evolves. Comparative psychologists have developed new techniques to probe the cognitive mechanisms underlying animal behavior, and they have become increasingly skillful at adapting methodologies to test multiple species. Meanwhile, evolutionary biologists have generated quantitative approaches to investigate the phylogenetic distribution and function of phenotypic traits, including cognition. In particular, phylogenetic methods can quantitatively (1) test whether specific cognitive abilities are correlated with life history (e.g., lifespan), morphology (e.g., brain size), or socio-ecological variables (e.g., social system), (2) measure how strongly phylogenetic relatedness predicts the distribution of cognitive skills across species, and (3) estimate the ancestral state of a given cognitive trait using measures of cognitive performance from extant species. Phylogenetic methods can also be used to guide the selection of species comparisons that offer the strongest tests of a priori predictions of cognitive evolutionary hypotheses (i.e., phylogenetic targeting). Here, we explain how an integration of comparative psychology and evolutionary biology will answer a host of questions regarding the phylogenetic distribution and history of cognitive traits, as well as the evolutionary processes that drove their evolution.

  20. Epidemic spreading on evolving signed networks.

    PubMed

    Saeedian, M; Azimi-Tafreshi, N; Jafari, G R; Kertesz, J

    2017-02-01

    Most studies of disease spreading consider the underlying social network as given, independent of the contagion, though an epidemic influences people's willingness to contact others: a "friendly" contact may turn "unfriendly" to avoid infection. We study the susceptible-infected disease-spreading model on signed networks, in which each edge is associated with a positive or negative sign representing the friendly or unfriendly relation between its end nodes. In a signed network, according to Heider's theory, edge signs evolve such that finally a state of structural balance is achieved, corresponding to no frustration in physics terms. However, the danger of infection affects the evolution of the edge signs. To describe the coupled problem of sign evolution and disease spreading, we generalize the notion of structural balance by taking into account the state of the nodes. We introduce an energy function and carry out Monte Carlo simulations on complete networks to test the energy landscape, where we find local minima corresponding to the so-called jammed states. We study the effect of the ratio of initial friendly to unfriendly connections on the propagation of disease. The steady state can be balanced or a jammed state such that a coexistence occurs between susceptible and infected nodes in the system.
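    For reference, the classical (node-free) structural-balance energy that the paper generalizes can be computed directly: on a complete signed network it is the negative average of the sign product over all triads, so E = -1 means every triad is balanced. A minimal sketch (the paper's own energy additionally couples in the infection state of the nodes, which is omitted here):

    ```python
    from itertools import combinations

    def balance_energy(sign, n):
        """Classical structural-balance energy of a complete signed network:
        E = -(1/N_triads) * sum over triads i<j<k of s_ij * s_jk * s_ik.
        E = -1 means every triad is balanced (no frustration)."""
        triads = list(combinations(range(n), 3))
        total = sum(sign[(i, j)] * sign[(j, k)] * sign[(i, k)]
                    for i, j, k in triads)
        return -total / len(triads)

    # All-friendly network on 5 nodes: perfectly balanced.
    n = 5
    sign = {e: 1 for e in combinations(range(n), 2)}
    e_balanced = balance_energy(sign, n)

    # Flip one friendship to unfriendly: the 3 triads through that edge
    # become unbalanced, raising the energy.
    sign[(0, 1)] = -1
    e_frustrated = balance_energy(sign, n)
    ```

    A Monte Carlo dynamics that flips an edge sign only when the flip does not increase this energy can get stuck in local minima with E > -1, which is precisely the jammed-state phenomenon the abstract refers to.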

  1. Approximating centrality in evolving graphs: toward sublinearity

    NASA Astrophysics Data System (ADS)

    Priest, Benjamin W.; Cybenko, George

    2017-05-01

    The identification of important nodes is a ubiquitous problem in the analysis of social networks. Centrality indices (such as degree centrality, closeness centrality, betweenness centrality, PageRank, and others) are used across many domains to accomplish this task. However, the computation of such indices is expensive on large graphs. Moreover, evolving graphs are becoming increasingly important in many applications. It is therefore desirable to develop on-line algorithms that can approximate centrality measures using memory sublinear in the size of the graph. We discuss the challenges facing the semi-streaming computation of many centrality indices. In particular, we apply recent advances in the streaming and sketching literature to provide a preliminary streaming approximation algorithm for degree centrality utilizing CountSketch and a multi-pass semi-streaming approximation algorithm for closeness centrality leveraging a spanner obtained through iteratively sketching the vertex-edge adjacency matrix. We also discuss possible ways forward for approximating betweenness centrality, as well as spectral measures of centrality. We provide a preliminary result using sketched low-rank approximations to approximate the output of the HITS algorithm.
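    The degree-centrality part of the approach above relies on CountSketch: each node is hashed into a small table with a random ±1 sign per row, and its degree is recovered as the median of the signed counters across rows, using memory sublinear in the number of nodes. A minimal sketch (a generic CountSketch over an edge stream, not the authors' exact implementation; integer node ids and all sizes are illustrative):

    ```python
    import random

    class CountSketch:
        """Minimal CountSketch for estimating per-item counts in a stream."""
        def __init__(self, depth=5, width=256, seed=3):
            rng = random.Random(seed)
            self.depth, self.width = depth, width
            self.table = [[0] * width for _ in range(depth)]
            # Per-row salts for the bucket hash and the +/-1 sign hash.
            self.salts = [(rng.getrandbits(32), rng.getrandbits(32))
                          for _ in range(depth)]

        def _bucket(self, row, item):
            h_salt, s_salt = self.salts[row]
            bucket = hash((h_salt, item)) % self.width
            sign = 1 if hash((s_salt, item)) % 2 == 0 else -1
            return bucket, sign

        def add(self, item, count=1):
            for row in range(self.depth):
                bucket, sign = self._bucket(row, item)
                self.table[row][bucket] += sign * count

        def estimate(self, item):
            vals = []
            for row in range(self.depth):
                bucket, sign = self._bucket(row, item)
                vals.append(sign * self.table[row][bucket])
            vals.sort()
            return vals[len(vals) // 2]  # median across rows

    def stream_degrees(edge_stream, sketch):
        """One pass over the edges; each edge raises both endpoint degrees."""
        for u, v in edge_stream:
            sketch.add(u)
            sketch.add(v)

    # Star graph: integer node 0 is the hub with 1000 leaves.
    cs = CountSketch()
    stream_degrees(((0, leaf) for leaf in range(1, 1001)), cs)
    hub_estimate = cs.estimate(0)
    ```

    The random signs make colliding items cancel in expectation, so heavy nodes (like the hub here) are estimated accurately even though the table is far smaller than the node set.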

  2. Evolving role of MRI in Crohn's disease.

    PubMed

    Yacoub, Joseph H; Obara, Piotr; Oto, Aytekin

    2013-06-01

    MR enterography is playing an evolving role in the evaluation of small bowel Crohn's disease (CD). Standard MR enterography includes a combination of rapidly acquired T2 sequence, balanced steady-state acquisition, and contrast enhanced T1-weighted gradient echo sequence. The diagnostic performance of these sequences has been shown to be comparable, and in some respects superior, to other small bowel imaging modalities. The findings of CD on MR enterography have been well described in the literature. New and emerging techniques such as diffusion-weighted imaging (DWI), dynamic contrast enhanced MRI (DCE-MRI), cinematography, and magnetization transfer, may lead to improved accuracy in characterizing the disease. These advanced techniques can provide quantitative parameters that may prove to be useful in assessing disease activity, severity, and response to treatment. In the future, MR enterography may play an increasing role in management decisions for patients with small bowel CD; however, larger studies are needed to validate these emerging MRI parameters as imaging biomarkers.

  3. Extreme insular dwarfism evolved in a mammoth.

    PubMed

    Herridge, Victoria L; Lister, Adrian M

    2012-08-22

    The insular dwarfism seen in Pleistocene elephants has come to epitomize the island rule; yet our understanding of this phenomenon is hampered by poor taxonomy. For Mediterranean dwarf elephants, where the most extreme cases of insular dwarfism are observed, a key systematic question remains unresolved: are all taxa phyletic dwarfs of a single mainland species Palaeoloxodon antiquus (straight-tusked elephant), or are some referable to Mammuthus (mammoths)? Ancient DNA and geochronological evidence have been used to support a Mammuthus origin for the Cretan 'Palaeoloxodon' creticus, but these studies have been shown to be flawed. On the basis of existing collections and recent field discoveries, we present new, morphological evidence for the taxonomic status of 'P'. creticus, and show that it is indeed a mammoth, most probably derived from Early Pleistocene Mammuthus meridionalis or possibly Late Pliocene Mammuthus rumanus. We also show that Mammuthus creticus is smaller than other known insular dwarf mammoths, and is similar in size to the smallest dwarf Palaeoloxodon species from Sicily and Malta, making it the smallest mammoth species known to have existed. These findings indicate that extreme insular dwarfism has evolved to a similar degree independently in two elephant lineages.

  4. Women's oral health: the evolving science.

    PubMed

    Sinkford, Jeanne C; Valachovic, Richard W; Harrison, Sonja G

    2008-02-01

    The evidence base for women's oral health is emerging from legislative action, clinical research, and survey documentation. The Women's Health in the Dental School Curriculum study (1999) followed a similar study (1996) of medical school curricula. Both of these major efforts resulted from statutory mandates in the National Institutes of Health Revitalization Act of 1993 (updated October 2000). A major study of the Institute of Medicine (IOM) National Academy of Sciences in 2001 concluded that "the study of sex differences is evolving into a mature science." This IOM study documented the scientific basis for gender-related policy and research and challenged the dental research enterprise to conduct collaborative, cross-disciplinary research on gender-related issues in oral health, disease, and disparities. This report chronicles some of the factors that have and continue to influence concepts of women's oral health in dental education, research, and practice. Gender issues related to women's health are no longer restricted to reproductive issues but are being considered across the life span and include psychosocial factors that impact women's health and treatment outcomes.

  5. Fast evolving pair-instability supernovae

    SciTech Connect

    Kozyreva, Alexandra; Gilmer, Matthew; Hirschi, Raphael; Frohlich, Carla; Blinnikov, Sergey; Wollaeger, Ryan Thomas; Noebauer, Ulrich M.; van Rossum, Daniel R.; Heger, Alexander; Even, Wesley Paul; Waldman, Roni; Tolstov, Alexey; Chatzopoulos, Emmanouil; Sorokina, Elena

    2016-10-06

    With an increasing number of superluminous supernovae (SLSNe) discovered, the question of their origin remains open and causes heated debates in the supernova community. Currently, there are three proposed mechanisms for SLSNe: (1) pair-instability supernovae (PISN), (2) magnetar-driven supernovae, and (3) models in which the supernova ejecta interacts with circumstellar material ejected before the explosion. Based on current observations of SLSNe, the PISN origin has been disfavoured for a number of reasons. Many PISN models provide overly broad light curves and too-reddened spectra, because of massive ejecta and a high amount of nickel. In the current study we re-examine PISN properties using progenitor models computed with the GENEC code. We calculate supernova explosions with FLASH and light curve evolution with the radiation hydrodynamics code STELLA. We find that high-mass models (200 M⊙ and 250 M⊙) at relatively high metallicity (Z=0.001) do not retain hydrogen in the outer layers and produce relatively fast evolving PISNe Type I, and might be suitable to explain some SLSNe. We also investigate uncertainties in light curve modelling due to codes, opacities, the nickel-bubble effect, and progenitor structure and composition.

  6. Speciation genetics: current status and evolving approaches.

    PubMed

    Wolf, Jochen B W; Lindell, Johan; Backström, Niclas

    2010-06-12

    The view of species as entities subjected to natural selection and amenable to change put forth by Charles Darwin and Alfred Wallace laid the conceptual foundation for understanding speciation. Initially marred by a rudimental understanding of hereditary principles, evolutionists gained appreciation of the mechanistic underpinnings of speciation following the merger of Mendelian genetic principles with Darwinian evolution. Only recently have we entered an era where deciphering the molecular basis of speciation is within reach. Much focus has been devoted to the genetic basis of intrinsic postzygotic isolation in model organisms and several hybrid incompatibility genes have been successfully identified. However, concomitant with the recent technological advancements in genome analysis and a newfound interest in the role of ecology in the differentiation process, speciation genetic research is becoming increasingly open to non-model organisms. This development will expand speciation research beyond the traditional boundaries and unveil the genetic basis of speciation from manifold perspectives and at various stages of the splitting process. This review aims at providing an extensive overview of speciation genetics. Starting from key historical developments and core concepts of speciation genetics, we focus much of our attention on evolving approaches and introduce promising methodological approaches for future research venues.

  7. Evolving application of biomimetic nanostructured hydroxyapatite

    PubMed Central

    Roveri, Norberto; Iafisco, Michele

    2010-01-01

    By mimicking Nature, we can design and synthesize inorganic smart materials that are reactive to biological tissues. These smart materials can be utilized to design innovative third-generation biomaterials, which are able to not only optimize their interaction with biological tissues and environment, but also mimic biogenic materials in their functionalities. The biomedical applications involve increasing the biomimetic levels from chemical composition, structural organization, morphology, mechanical behavior, nanostructure, and bulk and surface chemical–physical properties until the surface becomes bioreactive and stimulates cellular materials. The chemical–physical characteristics of biogenic hydroxyapatites from bone and tooth have been described, in order to point out the elective sides, which are important to reproduce the design of a new biomimetic synthetic hydroxyapatite. This review outlines the evolving applications of biomimetic synthetic calcium phosphates, details the main characteristics of bone and tooth, where the calcium phosphates are present, and discusses the chemical–physical characteristics of biomimetic calcium phosphates, methods of synthesizing them, and some of their biomedical applications. PMID:24198477

  8. Evolving epidemiology of HIV-associated malignancies.

    PubMed

    Shiels, Meredith S; Engels, Eric A

    2017-01-01

    The purpose of this review is to describe the epidemiology of cancers that occur at an elevated rate among people with HIV infection in the current treatment era, including discussion of the cause of these cancers, as well as changes in cancer incidence and burden over time. Rates of Kaposi sarcoma, non-Hodgkin lymphoma and cervical cancer have declined sharply in developed countries during the highly active antiretroviral therapy era, but remain elevated 800-fold, 10-fold and four-fold, respectively, compared with the general population. Most studies have reported significant increases in liver cancer rates and decreases in lung cancer over time. Although some studies have reported significant increases in anal cancer rates and declines in Hodgkin lymphoma rates, others have shown stable incidence. Declining mortality among HIV-infected individuals has resulted in the growth and aging of the HIV-infected population, causing an increase in the number of non-AIDS-defining cancers diagnosed each year in HIV-infected people. The epidemiology of cancer among HIV-infected people has evolved since the beginning of the HIV epidemic with particularly marked changes since the introduction of modern treatment. Public health interventions aimed at prevention and early detection of cancer among HIV-infected people are needed.

  9. Metapopulation capacity of evolving fluvial landscapes

    NASA Astrophysics Data System (ADS)

    Bertuzzo, Enrico; Rodriguez-Iturbe, Ignacio; Rinaldo, Andrea

    2015-04-01

    The form of fluvial landscapes is known to attain stationary network configurations that settle in dynamically accessible minima of total energy dissipation by landscape-forming discharges. Recent studies have highlighted the role of the dendritic structure of river networks in controlling population dynamics of the species they host and large-scale biodiversity patterns. Here, we systematically investigate the relation between energy dissipation, the physical driver for the evolution of river networks, and the ecological dynamics of their embedded biota. To that end, we use the concept of metapopulation capacity, a measure to link landscape structures with the population dynamics they host. Technically, metapopulation capacity is the leading eigenvalue λM of an appropriate "landscape" matrix subsuming whether a given species is predicted to persist in the long run. λM can conveniently be used to rank different landscapes in terms of their capacity to support viable metapopulations. We study how λM changes in response to the evolving network configurations of spanning trees. Such sequence of configurations is theoretically known to relate network selection to general landscape evolution equations through imperfect searches for dynamically accessible states frustrated by the vagaries of Nature. Results show that the process shaping the metric and the topological properties of river networks, prescribed by physical constraints, leads to a progressive increase in the corresponding metapopulation capacity and therefore on the landscape capacity to support metapopulations—with implications on biodiversity in fluvial ecosystems.
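The ranking role of λM described above can be illustrated with a minimal sketch. This assumes the classic Hanski-Ovaskainen form of the landscape matrix (exponential dispersal kernel weighted by patch areas), which differs in detail from the river-network matrix used in the study; the coordinates, areas, and dispersal scale below are illustrative assumptions:

```python
import numpy as np

def metapopulation_capacity(coords, areas, alpha=1.0):
    """Leading eigenvalue lambda_M of a Hanski-Ovaskainen landscape matrix.

    M[i, j] = exp(-d_ij / alpha) * A_i * A_j for i != j, zero diagonal,
    where d_ij is the inter-patch distance and A_i the patch area.
    """
    coords = np.asarray(coords, dtype=float)
    areas = np.asarray(areas, dtype=float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    M = np.exp(-d / alpha) * np.outer(areas, areas)
    np.fill_diagonal(M, 0.0)
    return float(np.max(np.linalg.eigvalsh(M)))  # M is symmetric: real spectrum

# Rank two toy landscapes: clustered patches yield a larger lambda_M,
# i.e. a greater capacity to support a viable metapopulation.
clustered = metapopulation_capacity([[0, 0], [0, 1], [1, 0]], [1, 1, 1])
scattered = metapopulation_capacity([[0, 0], [0, 5], [5, 0]], [1, 1, 1])
print(clustered > scattered)
```

Because λM only enters persistence criteria through a threshold comparison, its ordinal use for ranking landscapes, as in the abstract, does not depend on the exact kernel chosen here.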

  10. Evolving role of pediatric nurse practitioners.

    PubMed

    Aruda, Mary M; Griffin, Valerie J; Schartz, Kathryn; Geist, Melissa

    2016-02-01

    To report and interpret findings from national pediatric nurse practitioner (PNP) job analysis surveys reflecting the changes in the knowledge and skills required for advanced practice. National role delineation studies (RDS) conducted by the American Nurses Credentialing Center (ANCC) in 2003, 2008, and 2011. Since the first nurse practitioner (NP) program was established in 1965 to train pediatric nurses for advanced practice, the role of the PNP has continued to develop. The RDS results demonstrate the increased autonomy of PNPs, with prescription of medication as the top work activity category identified, followed by the reporting of suspected abuse, exploitation, and/or neglect and immunizing based on current recommendations. Analysis of the changes in role or work activities, tied to the knowledge and skills required to perform those activities, can provide content for educators updating curricula, help clinicians remain current in their practice, and inform healthcare policy. The current PNP role has evolved to meet the workforce demands of providing primary care to a pediatric population with increasingly complex social and healthcare needs. Role analysis is important as NPs move forward to practice to the full extent of their education and training. ©2015 American Association of Nurse Practitioners.

  11. How does cognition evolve? Phylogenetic comparative psychology

    PubMed Central

    Matthews, Luke J.; Hare, Brian A.; Nunn, Charles L.; Anderson, Rindy C.; Aureli, Filippo; Brannon, Elizabeth M.; Call, Josep; Drea, Christine M.; Emery, Nathan J.; Haun, Daniel B. M.; Herrmann, Esther; Jacobs, Lucia F.; Platt, Michael L.; Rosati, Alexandra G.; Sandel, Aaron A.; Schroepfer, Kara K.; Seed, Amanda M.; Tan, Jingzhi; van Schaik, Carel P.; Wobber, Victoria

    2014-01-01

    Now more than ever animal studies have the potential to test hypotheses regarding how cognition evolves. Comparative psychologists have developed new techniques to probe the cognitive mechanisms underlying animal behavior, and they have become increasingly skillful at adapting methodologies to test multiple species. Meanwhile, evolutionary biologists have generated quantitative approaches to investigate the phylogenetic distribution and function of phenotypic traits, including cognition. In particular, phylogenetic methods can quantitatively (1) test whether specific cognitive abilities are correlated with life history (e.g., lifespan), morphology (e.g., brain size), or socio-ecological variables (e.g., social system), (2) measure how strongly phylogenetic relatedness predicts the distribution of cognitive skills across species, and (3) estimate the ancestral state of a given cognitive trait using measures of cognitive performance from extant species. Phylogenetic methods can also be used to guide the selection of species comparisons that offer the strongest tests of a priori predictions of cognitive evolutionary hypotheses (i.e., phylogenetic targeting). Here, we explain how an integration of comparative psychology and evolutionary biology will answer a host of questions regarding the phylogenetic distribution and history of cognitive traits, as well as the evolutionary processes that drove their evolution. PMID:21927850

  12. Epidemic spreading on evolving signed networks

    NASA Astrophysics Data System (ADS)

    Saeedian, M.; Azimi-Tafreshi, N.; Jafari, G. R.; Kertesz, J.

    2017-02-01

    Most studies of disease spreading consider the underlying social network as obtained without the contagion, though an epidemic influences people's willingness to contact others: A "friendly" contact may be turned "unfriendly" to avoid infection. We study the susceptible-infected disease-spreading model on signed networks, in which each edge is associated with a positive or negative sign representing the friendly or unfriendly relation between its end nodes. In a signed network, according to Heider's theory, edge signs evolve such that finally a state of structural balance is achieved, corresponding to no frustration in physics terms. However, the danger of infection affects the evolution of the edge signs. To describe the coupled problem of sign evolution and disease spreading, we generalize the notion of structural balance by taking into account the state of the nodes. We introduce an energy function and carry out Monte Carlo simulations on complete networks to explore the energy landscape, where we find local minima corresponding to so-called jammed states. We study the effect of the ratio of initial friendly to unfriendly connections on the propagation of disease. The steady state can be balanced or a jammed state such that a coexistence occurs between susceptible and infected nodes in the system.
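The energy-landscape idea can be illustrated for the classic structural-balance case, without the node-state generalization introduced in the paper. A minimal sketch, assuming the standard triad energy U = -⟨s_ij s_jk s_ik⟩ on a signed complete graph: locally greedy edge flips can terminate either in a balanced state (U = -1) or become stuck in a jammed local minimum (U > -1):

```python
import itertools
import random

def balance_energy(signs, n):
    """U = -<s_ij * s_jk * s_ik> averaged over all triads; U = -1 iff balanced."""
    triads = list(itertools.combinations(range(n), 3))
    total = sum(signs[i, j] * signs[j, k] * signs[i, k] for i, j, k in triads)
    return -total / len(triads)

def local_minimization(n, seed=0):
    """Flip an edge sign only if that lowers U; may halt in a jammed state."""
    rng = random.Random(seed)
    signs = {}
    for i, j in itertools.combinations(range(n), 2):
        signs[i, j] = signs[j, i] = rng.choice([-1, 1])
    edges = list(itertools.combinations(range(n), 2))
    improved = True
    while improved:                      # stop when no single flip lowers U
        improved = False
        rng.shuffle(edges)
        for i, j in edges:
            before = balance_energy(signs, n)
            signs[i, j] = signs[j, i] = -signs[i, j]
            if balance_energy(signs, n) >= before:
                signs[i, j] = signs[j, i] = -signs[i, j]   # no gain: revert
            else:
                improved = True
    return balance_energy(signs, n)

final_energy = local_minimization(n=6)
print(final_energy)   # -1.0 means balanced; anything above is a jammed state
```

Each accepted flip strictly lowers U, so the loop always terminates; whether it reaches -1 or a jammed minimum depends on the initial sign configuration, mirroring the local minima the abstract reports.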

  13. Speciation genetics: current status and evolving approaches

    PubMed Central

    Wolf, Jochen B. W.; Lindell, Johan; Backström, Niclas

    2010-01-01

    The view of species as entities subjected to natural selection and amenable to change put forth by Charles Darwin and Alfred Wallace laid the conceptual foundation for understanding speciation. Initially marred by a rudimental understanding of hereditary principles, evolutionists gained appreciation of the mechanistic underpinnings of speciation following the merger of Mendelian genetic principles with Darwinian evolution. Only recently have we entered an era where deciphering the molecular basis of speciation is within reach. Much focus has been devoted to the genetic basis of intrinsic postzygotic isolation in model organisms and several hybrid incompatibility genes have been successfully identified. However, concomitant with the recent technological advancements in genome analysis and a newfound interest in the role of ecology in the differentiation process, speciation genetic research is becoming increasingly open to non-model organisms. This development will expand speciation research beyond the traditional boundaries and unveil the genetic basis of speciation from manifold perspectives and at various stages of the splitting process. This review aims at providing an extensive overview of speciation genetics. Starting from key historical developments and core concepts of speciation genetics, we focus much of our attention on evolving approaches and introduce promising methodological approaches for future research venues. PMID:20439277

  14. Origins and evolvability of the PAX family.

    PubMed

    Paixão-Côrtes, Vanessa R; Salzano, Francisco M; Bortolini, Maria Cátira

    2015-08-01

    The paired box (PAX) family of transcription/developmental genes plays a key role in numerous stages of embryonic development, as well as in adult organogenesis. There is evidence linking the acquisition of a paired-like DNA binding domain (PD) to domestication of a Tc1/mariner transposon. Further duplication/deletion processes led to at least five paralogous metazoan protein groups, which can be classified into two supergroups, PAXB-like or PAXD-like, using ancestral defining structures: the PD plus an octapeptide motif (OP) and a paired-type homeobox DNA binding domain (PTHD) produce the PD-OP-PTHD structure characteristic of the PAXB-like group, whereas an additional domain, the paired-type homeodomain tail (PHT), is present in the PAXD-like group, producing a PD-OP-PTHD-PHT structure. We examined their patterns of distribution in various species, using both available data and new bioinformatic analyses, including vertebrate PAX genes and their shared and specific functions, as well as inter- and intraspecific variability of PAX in primates. These analyses revealed a relatively conserved PAX network, accompanied by specific changes that led to adaptive novelties. Therefore, both stability and evolvability shaped the molecular evolution of this key transcriptional network. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Evolving Resistance Among Gram-positive Pathogens

    PubMed Central

    Munita, Jose M.; Bayer, Arnold S.; Arias, Cesar A.

    2015-01-01

    Antimicrobial therapy is a key component of modern medical practice and a cornerstone for the development of complex clinical interventions in critically ill patients. Unfortunately, the increasing problem of antimicrobial resistance is now recognized as a major public health threat jeopardizing the care of thousands of patients worldwide. Gram-positive pathogens exhibit an immense genetic repertoire to adapt and develop resistance to virtually all antimicrobials clinically available. As more molecules become available to treat resistant gram-positive infections, resistance emerges as an evolutionary response. Thus, antimicrobial resistance has to be envisaged as an evolving phenomenon that demands constant surveillance and continuous efforts to identify emerging mechanisms of resistance to optimize the use of antibiotics and create strategies to circumvent this problem. Here, we will provide a broad perspective on the clinical aspects of antibiotic resistance in relevant gram-positive pathogens with emphasis on the mechanistic strategies used by these organisms to avoid being killed by commonly used antimicrobial agents. PMID:26316558

  16. Lower mass limit of an evolving interstellar cloud and chemistry in an evolving oscillatory cloud

    NASA Technical Reports Server (NTRS)

    Tarafdar, S. P.

    1986-01-01

    Simultaneous solution of the equation of motion, equation of state, and energy equation, including heating and cooling processes for the interstellar medium, gives for a collapsing cloud a lower mass limit which is significantly smaller than the Jeans mass for the same initial density. Clouds with mass above this limit collapse, whereas clouds below the critical mass pass through a maximum central density, giving apparently similar clouds (i.e., same Av, size, and central density) at two different phases of their evolution (i.e., with different lifetimes). Preliminary results of chemistry in such an evolving oscillatory cloud show significant differences in the abundances of some molecules in two physically similar clouds with different lifetimes. The problems of depletion and the short lifetimes of evolving clouds appear to be less severe in such an oscillatory cloud.

  17. Eosinophilic esophagitis: current understanding and evolving concepts

    PubMed Central

    Kweh, Barry; Thien, Francis

    2017-01-01

    Eosinophilic esophagitis (EoE) is now considered to represent a form of food allergy and this is demonstrated by a response to elimination diet in many patients. A critical additional factor may be an inherent impairment in epithelial barrier integrity, possibly worsened by reflux of gastric contents and improved with proton pump inhibitor (PPI) use. Key clinical challenges are posed by the absence of reliable allergy tests to guide elimination diet, and the subsequent need for invasive endoscopic assessment following empirical food challenge, meaning that corticosteroids will remain the mainstay of therapy for many. From a research standpoint, determining if impairments in barrier integrity are innate, and how PPIs address this deficit (which may be pH independent), are important questions that when answered may allow future therapeutic advancement. PMID:28154800

  18. Testing for PV Reliability (Presentation)

    SciTech Connect

    Kurtz, S.; Bansal, S.

    2014-09-01

    The DOE SUNSHOT workshop is seeking input from the community about PV reliability and how the DOE might address gaps in understanding. This presentation describes the types of testing that are needed for PV reliability and introduces a discussion to identify gaps in our understanding of PV reliability testing.

  19. Reliability of Fault Tolerant Control Systems. Part 1

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva

    2001-01-01

    This paper reports Part I of a two-part effort intended to delineate the relationship between reliability and fault tolerant control in a quantitative manner. Reliability analysis of fault-tolerant control systems is performed using Markov models. Reliability properties peculiar to fault-tolerant control systems are emphasized; in particular, coverage of failures through redundancy management can be severely limited. It is shown that in the early life of a system composed of highly reliable subsystems, the reliability of the overall system is affine with respect to coverage, and inadequate coverage induces dominant single point failures. The utility of some existing software tools for assessing the reliability of fault tolerant control systems is also discussed. Coverage modeling is attempted in Part II in a way that captures its dependence on the control performance and on the diagnostic resolution.
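The affine-in-coverage behaviour can be illustrated with the simplest case: a duplex of identical subsystems with constant failure rate λ and coverage c, for which the standard Markov model gives R(t) = e^(-2λt) + 2c(e^(-λt) - e^(-2λt)) ≈ 1 - 2(1 - c)λt early in life. A hedged sketch (the paper's models are more general); the numerical values are illustrative:

```python
import math

def duplex_reliability(t, lam, c):
    """Reliability of a duplex with constant failure rate lam and coverage c.

    A covered first failure (probability c) leaves a working simplex;
    an uncovered one (probability 1 - c) is a single-point system failure.
    """
    return math.exp(-2 * lam * t) + 2 * c * (math.exp(-lam * t) - math.exp(-2 * lam * t))

lam, t = 1e-4, 10.0                      # highly reliable subsystems, early life
exact = duplex_reliability(t, lam, c=0.99)
affine = 1 - 2 * (1 - 0.99) * lam * t    # first-order expansion in lam * t
print(exact, affine)
```

For small λt the exact and affine values agree to within O((λt)^2), and the unreliability 2(1 - c)λt is dominated by the uncovered (single-point) failure term, matching the abstract's claim.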

  20. Variant profiling of evolving prokaryotic populations

    PubMed Central

    Zojer, Markus; Schuster, Lisa N.; Schulz, Frederik; Pfundner, Alexander; Horn, Matthias

    2017-01-01

    Genomic heterogeneity of bacterial species is observed and studied in experimental evolution experiments and clinical diagnostics, and occurs as micro-diversity of natural habitats. The challenge for genome research is to accurately capture this heterogeneity with the currently used short sequencing reads. Recent advances in NGS technologies improved the speed and coverage and thus allowed for deep sequencing of bacterial populations. This facilitates the quantitative assessment of genomic heterogeneity, including low frequency alleles or haplotypes. However, false positive variant predictions due to sequencing errors and mapping artifacts of short reads need to be prevented. We therefore created VarCap, a workflow for the reliable prediction of different types of variants even at low frequencies. In order to predict SNPs, InDels and structural variations, we evaluated the sensitivity and accuracy of different software tools using synthetic read data. The results suggested that the best sensitivity could be reached by a union of different tools, however at the price of increased false positives. We identified possible reasons for false predictions and used this knowledge to improve the accuracy by post-filtering the predicted variants according to properties such as frequency, coverage, genomic environment/localization and co-localization with other variants. We observed that best precision was achieved by using an intersection of at least two tools per variant. This resulted in the reliable prediction of variants above a minimum relative abundance of 2%. VarCap is designed for being routinely used within experimental evolution experiments or for clinical diagnostics. The detected variants are reported as frequencies within a VCF file and as a graphical overview of the distribution of the different variant/allele/haplotype frequencies. The source code of VarCap is available at https://github.com/ma2o/VarCap. In order to provide this workflow to a broad community

  1. Variant profiling of evolving prokaryotic populations.

    PubMed

    Zojer, Markus; Schuster, Lisa N; Schulz, Frederik; Pfundner, Alexander; Horn, Matthias; Rattei, Thomas

    2017-01-01

    Genomic heterogeneity of bacterial species is observed and studied in experimental evolution experiments and clinical diagnostics, and occurs as micro-diversity of natural habitats. The challenge for genome research is to accurately capture this heterogeneity with the currently used short sequencing reads. Recent advances in NGS technologies improved the speed and coverage and thus allowed for deep sequencing of bacterial populations. This facilitates the quantitative assessment of genomic heterogeneity, including low frequency alleles or haplotypes. However, false positive variant predictions due to sequencing errors and mapping artifacts of short reads need to be prevented. We therefore created VarCap, a workflow for the reliable prediction of different types of variants even at low frequencies. In order to predict SNPs, InDels and structural variations, we evaluated the sensitivity and accuracy of different software tools using synthetic read data. The results suggested that the best sensitivity could be reached by a union of different tools, however at the price of increased false positives. We identified possible reasons for false predictions and used this knowledge to improve the accuracy by post-filtering the predicted variants according to properties such as frequency, coverage, genomic environment/localization and co-localization with other variants. We observed that best precision was achieved by using an intersection of at least two tools per variant. This resulted in the reliable prediction of variants above a minimum relative abundance of 2%. VarCap is designed for being routinely used within experimental evolution experiments or for clinical diagnostics. The detected variants are reported as frequencies within a VCF file and as a graphical overview of the distribution of the different variant/allele/haplotype frequencies. The source code of VarCap is available at https://github.com/ma2o/VarCap. In order to provide this workflow to a broad community
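The post-filtering strategy described above (requiring an intersection of at least two callers and a minimum relative abundance of 2%) can be sketched as follows. This is an illustrative reimplementation, not VarCap's actual code; the tool names, data structures, and the choice to average per-tool frequencies are all assumptions:

```python
def consensus_filter(calls_by_tool, min_tools=2, min_freq=0.02):
    """Keep variants reported by >= min_tools callers at frequency >= min_freq.

    calls_by_tool: mapping tool name -> {(chrom, pos, ref, alt): frequency}.
    Frequencies from different tools are averaged (an illustrative choice).
    """
    support = {}
    for calls in calls_by_tool.values():
        for variant, freq in calls.items():
            support.setdefault(variant, []).append(freq)
    return {
        variant: sum(freqs) / len(freqs)
        for variant, freqs in support.items()
        if len(freqs) >= min_tools and sum(freqs) / len(freqs) >= min_freq
    }

# Hypothetical calls from two callers on the same read data:
calls = {
    "toolA": {("chr1", 100, "A", "G"): 0.05, ("chr1", 200, "C", "T"): 0.01},
    "toolB": {("chr1", 100, "A", "G"): 0.06, ("chr1", 300, "G", "A"): 0.10},
}
kept = consensus_filter(calls)
print(kept)   # only chr1:100 passes: two tools agree and frequency >= 2%
```

The singleton calls at chr1:200 and chr1:300 are discarded, which is the mechanism by which intersecting callers trades a little sensitivity for the precision gain reported in the abstract.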

  2. Discrete Reliability Projection

    DTIC Science & Technology

    2014-12-01

    References: Department of Defense, Handbook MIL-HDBK-189C, 2011; Hall, J. B., Methodology for Evaluating Reliability Growth Programs of Discrete Systems, Ph.D. thesis, University… [The remainder of this excerpt, comprising equation (2.13) of Hall's model (where m is the number of observed failure modes and d*_i estimates d_i) and a table of failure-mode counts with fix effectiveness factors (FEFs), is garbled.]

  3. The Evolving Context for Science and Society

    NASA Astrophysics Data System (ADS)

    Leshner, Alan I.

    2012-01-01

    The relationship between science and the rest of society is critical both to the support it receives from the public and to the receptivity of the broader citizenry to science's explanations of the nature of the world and to its other outputs. Science's ultimate usefulness depends on a receptive public. For example, given that science and technology are embedded in virtually every issue of modern life, either as a cause or a cure, it is critical that the relationship be strong and that the role of science is well appreciated by society, or the impacts of scientific advances will fall short of their great potential. Unfortunately, a variety of problems have been undermining the science-society relationship for over a decade. Some problems emerge from within the scientific enterprise - like scientific misconduct or conflicts of interest - and tarnish or weaken its image and credibility. Other problems and stresses come from outside the enterprise. The most obvious external pressure is that the world economic situation is undermining the financial support of both the conduct and infrastructure of science. Other examples of external pressures include conflicts between what science is revealing and political or economic expediency - e.g., global climate change - or instances where scientific advances encroach upon core human values or beliefs - e.g., scientific understanding of the origins and evolution of the universe as compared to biblical accounts of creation. Significant efforts - some dramatically non-traditional for many in the scientific community - are needed to restore balance to the science-society relationship.

  4. Evolution of evolvability and phenotypic plasticity in virtual cells.

    PubMed

    Cuypers, Thomas D; Rutten, Jacob P; Hogeweg, Paulien

    2017-02-28

    Changing environmental conditions pose a challenge for the survival of species. To meet this challenge organisms adapt their phenotype by physiological regulation (phenotypic plasticity) or by evolving. Regulatory mechanisms that ensure a constant internal environment in the face of continuous external fluctuations (homeostasis) are ubiquitous and essential for survival. However, more drastic and enduring environmental change often requires lineages to adapt by mutating. In vitro evolutionary experiments with microbes show that adaptive, large phenotypic changes occur remarkably quickly, requiring only a few mutations. It has been proposed that the high evolvability demonstrated by these microbes is an evolved property. If both regulation (phenotypic plasticity) and evolvability can evolve as strategies to adapt to change, what are the conditions that favour the emergence of either of these strategies? Does evolution of one strategy hinder or facilitate evolution of the other strategy? Here we investigate this with computational evolutionary modelling in populations of Virtual Cells. During a preparatory evolutionary phase, Virtual Cells evolved homeostasis regulation for internal metabolite concentrations in a fluctuating environment. The resulting wild-type Virtual Cell strains (WT-VCS) were then exposed to periodic, drastic environmental changes, while maintaining selection on homeostasis regulation. In different sets of simulations the nature and frequencies of environmental change were varied. Pre-evolved WT-VCS were highly evolvable, showing rapid evolutionary adaptation after novel environmental change. Moreover, continued low frequency changes resulted in evolutionary restructuring of the genome that enables even faster adaptation with very few mutations. In contrast, when change frequency is high, lineages evolve phenotypic plasticity that allows them to be fit in different environments without mutations. Yet, evolving phenotypic plasticity is a

  5. A systems approach to creating reliable batteries for implantable medical applications

    NASA Astrophysics Data System (ADS)

    Clark, William D. K.; Syracuse, Kenneth C.; Visbisky, Mark

    Lithium batteries have been used to power implantable medical devices for over 25 years. During this period a system to ensure the reliability of battery performance has evolved, and continues to evolve, that embodies the use of quality systems, statistical sampling and testing of product, life testing and performance modeling. The development of a new lithium/carbon monofluoride battery product line will be used as an example of how these elements work together.

  6. BUBBLE DYNAMICS AT GAS-EVOLVING ELECTRODES

    SciTech Connect

    Sides, Paul J.

    1980-12-01

    Nucleation of bubbles, their growth by diffusion of dissolved gas to the bubble surface and by coalescence, and their detachment from the electrode are all very fast phenomena; furthermore, electrolytically generated bubbles range in size from ten to a few hundred microns; therefore, magnification and high speed cinematography are required to observe bubbles and the phenomena of their growth on the electrode surface. Viewing the action from the front side (the surface on which the bubbles form) is complicated because the most important events occur close to the surface and are obscured by other bubbles passing between the camera and the electrode; therefore, oxygen was evolved on a transparent tin oxide "window" electrode and the events were viewed from the backside. The movies showed that coalescence of bubbles is very important for determining the size of bubbles and in the chain of transport processes; growth by diffusion and by coalescence proceeds in series and parallel; coalescing bubbles cause significant fluid motion close to the electrode; bubbles can leave and reattach; and bubbles evolve in a cycle of growth by diffusion and different modes of coalescence. An analytical solution for the primary potential and current distribution around a spherical bubble in contact with a plane electrode is presented. Zero at the contact point, the current density reaches only one percent of its undisturbed value at 30 percent of the radius from that point and goes through a shallow maximum two radii away. The solution obtained for spherical bubbles is shown to apply for the small bubbles of electrolytic processes. The incremental resistance in ohms caused by sparse arrays of bubbles is given by ΔR = 1.352 af/kS where f is the void fraction of gas in the bubble layer, a is the bubble layer thickness, k is the conductivity of gas free electrolyte, and S is the electrode area. A densely populated gas bubble layer on an electrode was modeled as a hexagonal array of
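The sparse-array resistance formula quoted in the abstract can be evaluated directly. A minimal sketch; the numerical inputs and units below are illustrative assumptions, not values from the study:

```python
def bubble_layer_resistance(a, f, k, S):
    """Incremental resistance Delta_R = 1.352 * a * f / (k * S), in ohms.

    a: bubble-layer thickness (cm), f: gas void fraction in the layer,
    k: conductivity of gas-free electrolyte (S/cm), S: electrode area (cm^2).
    """
    return 1.352 * a * f / (k * S)

# Illustrative inputs (assumptions): a 0.5 mm layer, 10% void fraction,
# a 0.5 S/cm electrolyte, and a 10 cm^2 electrode.
dR = bubble_layer_resistance(a=0.05, f=0.1, k=0.5, S=10.0)
print(dR)
```

The formula is linear in both the void fraction and the layer thickness, so sparse bubble coverage adds resistance in direct proportion to the amount of gas held near the electrode.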

  7. Transit probabilities in secularly evolving planetary systems

    NASA Astrophysics Data System (ADS)

    Read, Matthew J.; Wyatt, Mark C.; Triaud, Amaury H. M. J.

    2017-07-01

    This paper considers whether the population of known transiting exoplanets provides evidence for additional outer planets on inclined orbits, due to the perturbing effect of such planets on the orbits of inner planets. As such, we develop a semi-analytical method for calculating the probability that two mutually inclined planets are observed to transit. We subsequently derive a simplified analytical form to describe how the mutual inclination between two planets evolves due to secular interactions with a wide orbit inclined planet and use this to determine the mean probability that the two inner planets are observed to transit. From application to Kepler-48 and HD-106315, we constrain the inclinations of the outer planets in these systems (known from radial velocity). We also apply this work to the so-called Kepler Dichotomy, which describes the excess of single transiting systems observed by Kepler. We find three different ways of explaining this dichotomy: Some systems could be inherently single, some multiplanet systems could have inherently large mutual inclinations, while some multiplanet systems could cyclically attain large mutual inclinations through interaction with an inclined outer planet. We show how the different mechanisms can be combined to fit the observed populations of Kepler systems with one and two transiting planets. We also show how the distribution of mutual inclinations of transiting two-planet systems constrains the fraction of two-planet systems that have perturbing outer planets, since such systems should be preferentially discovered by Kepler when the inner planets are coplanar due to an increased transit probability.

  8. Evolving Dynamics of the Supergranular Flow Field

    NASA Astrophysics Data System (ADS)

    De Rosa, M. L.; Lisle, J. P.; Toomre, J.

    2000-05-01

    We study several large (45-degree square) fields of supergranules for as long as they remain visible on the solar disk (about 6 days) to characterize the dynamics of the supergranular flow field and its interaction with surrounding photospheric magnetic field elements. These flow fields are determined by applying correlation tracking methods to time series of mesogranules seen in full-disk SOI-MDI velocity images. We have shown previously that mesogranules observed in this way are systematically advected by the larger scale supergranular flow field in which they are embedded. Applying correlation tracking methods to such time series yields the positions of the supergranular outflows quite well, even for locations close to disk center. These long-duration datasets contain several instances where individual supergranules are recognizable for time scales as long as 50 hours, though most cells persist for about 25 hours, the value often quoted as the supergranular lifetime. Many supergranule merging and splitting events are observed, as well as other evolving flow patterns such as lanes of converging and diverging fluid. By comparing the flow fields with the corresponding images of magnetic fields, we confirm the result that small-scale photospheric magnetic field elements are quickly advected to the intercellular lanes to form a network between the supergranular outflows. In addition, we characterize the influence of larger-scale regions of magnetic flux, such as active regions, on the flow fields. Furthermore, we have measured even larger-scale flows by following the motions of the supergranules, but these flow fields contain a high noise component and are somewhat difficult to interpret. This research was supported by NASA through grants NAG 5-8133 and NAG 5-7996, and by NSF through grant ATM-9731676.

  9. Emergent spacetime in stochastically evolving dimensions

    NASA Astrophysics Data System (ADS)

    Afshordi, Niayesh; Stojkovic, Dejan

    2014-12-01

    Changing the dimensionality of the space-time at the smallest and largest distances has manifold theoretical advantages. If the space is lower dimensional in the high energy regime, then there are no ultraviolet divergencies in field theories, it is possible to quantize gravity, and the theory of matter plus gravity is free of divergencies or renormalizable. If the space is higher dimensional at cosmological scales, then some cosmological problems (including the cosmological constant problem) can be attacked from a completely new perspective. In this paper, we construct an explicit model of "evolving dimensions" in which the dimensions open up as the temperature of the universe drops. We adopt the string theory framework in which the dimensions are fields that live on the string worldsheet, and add temperature dependent mass terms for them. At the Big Bang, all the dimensions are very heavy and are not excited. As the universe cools down, dimensions open up one by one. Thus, the dimensionality of the space we live in depends on the energy or temperature that we are probing. In particular, we provide a kinematic Brandenberger-Vafa argument for how a discrete causal set, and eventually a continuum (3 + 1)-dim spacetime along with Einstein gravity emerges in the Infrared from the worldsheet action. The (3 + 1)-dim Planck mass and the string scale become directly related, without any compactification. Amongst other predictions, we argue that LHC might be blind to new physics even if it comes at the TeV scale. In contrast, cosmic ray experiments, especially those that can register the very beginning of the shower, and collisions with high multiplicity and density of particles, might be sensitive to the dimensional cross-over.

  10. Evolutionary genomics of fast evolving tunicates.

    PubMed

    Berná, Luisa; Alvarez-Valin, Fernando

    2014-07-08

    Tunicates have been extensively studied because of their crucial phylogenetic location (the closest living relatives of vertebrates) and particular developmental plan. Recent genome efforts have disclosed that tunicates are also remarkable in their genome organization and molecular evolutionary patterns. Here, we review these latter aspects, comparing the similarities and specificities of two model species of the group: Oikopleura dioica and Ciona intestinalis. These species exhibit great genome plasticity and Oikopleura in particular has undergone a process of extreme genome reduction and compaction that can be explained in part by gene loss, but is mostly due to other mechanisms such as shortening of intergenic distances and introns, and scarcity of mobile elements. In Ciona, genome reorganization was less severe, remaining more similar to that of other chordates in several aspects. Rates and patterns of molecular evolution are also peculiar in tunicates, with Ciona being about 50% faster than vertebrates and Oikopleura three times faster. In fact, the latter species is considered the fastest evolving metazoan recorded so far. Two processes of increase in evolutionary rates have taken place in tunicates. One of them is more extreme, and basically restricted to genes encoding regulatory proteins (transcription regulators, chromatin remodeling proteins, and metabolic regulators), and the other one is less pronounced but affects the whole genome. Very likely adaptive evolution has played a very significant role in the first, whereas the functional and/or evolutionary causes of the second are less clear and the evidence is not conclusive. The evidence supporting the incidence of increased mutation and less efficient negative selection is presented and discussed.

  11. Evolutionary Genomics of Fast Evolving Tunicates

    PubMed Central

    Berná, Luisa; Alvarez-Valin, Fernando

    2014-01-01

    Tunicates have been extensively studied because of their crucial phylogenetic location (the closest living relatives of vertebrates) and particular developmental plan. Recent genome efforts have disclosed that tunicates are also remarkable in their genome organization and molecular evolutionary patterns. Here, we review these latter aspects, comparing the similarities and specificities of two model species of the group: Oikopleura dioica and Ciona intestinalis. These species exhibit great genome plasticity and Oikopleura in particular has undergone a process of extreme genome reduction and compaction that can be explained in part by gene loss, but is mostly due to other mechanisms such as shortening of intergenic distances and introns, and scarcity of mobile elements. In Ciona, genome reorganization was less severe, remaining more similar to that of other chordates in several aspects. Rates and patterns of molecular evolution are also peculiar in tunicates, with Ciona being about 50% faster than vertebrates and Oikopleura three times faster. In fact, the latter species is considered the fastest evolving metazoan recorded so far. Two processes of increase in evolutionary rates have taken place in tunicates. One of them is more extreme, and basically restricted to genes encoding regulatory proteins (transcription regulators, chromatin remodeling proteins, and metabolic regulators), and the other one is less pronounced but affects the whole genome. Very likely adaptive evolution has played a very significant role in the first, whereas the functional and/or evolutionary causes of the second are less clear and the evidence is not conclusive. The evidence supporting the incidence of increased mutation and less efficient negative selection is presented and discussed. PMID:25008364

  12. Evolving Recommendations on Prostate Cancer Screening.

    PubMed

    Brawley, Otis W; Thompson, Ian M; Grönberg, Henrik

    2016-01-01

    Results of a number of studies demonstrate that the serum prostate-specific antigen (PSA) in and of itself is an inadequate screening test. Today, one of the most pressing questions in prostate cancer medicine is how screening can be honed to identify those who have life-threatening disease and need aggressive treatment. A number of efforts are underway. One such effort is the assessment of men in the landmark Prostate Cancer Prevention Trial that has led to a prostate cancer risk calculator (PCPTRC), which is available online. PCPTRC version 2.0 predicts the probability of the diagnosis of no cancer, low-grade cancer, or high-grade cancer when variables such as PSA, age, race, family history, and physical findings are input. Modern biomarker development promises to provide tests with fewer false positives and improved ability to find high-grade cancers. Stockholm III (STHLM3) is a prospective, population-based, paired, screen-positive, prostate cancer diagnostic study assessing a combination of plasma protein biomarkers along with age, family history, previous biopsy, and prostate examination for prediction of prostate cancer. Multiparametric MRI incorporates anatomic and functional imaging to better characterize and predict future behavior of tumors within the prostate. After diagnosis of cancer, several genomic tests promise to better distinguish the cancers that need treatment versus those that need observation. Although the new technologies are promising, there is an urgent need for evaluation of these new tests in high-quality, large population-based studies. Until these technologies are proven, most professional organizations have evolved to a recommendation of informed or shared decision making in which there is a discussion between the doctor and patient.

  13. Quantum mechanics in an evolving Hilbert space

    NASA Astrophysics Data System (ADS)

    Artacho, Emilio; O'Regan, David D.

    2017-03-01

    Many basis sets for electronic structure calculations evolve with varying external parameters, such as moving atoms in dynamic simulations, giving rise to extra derivative terms in the dynamical equations. Here we revisit these derivatives in the context of differential geometry, thereby obtaining a more transparent formalization, and a geometrical perspective for better understanding the resulting equations. The effect of the evolution of the basis set within the spanned Hilbert space separates explicitly from the effect of the turning of the space itself when moving in parameter space, as the tangent space turns when moving in a curved space. New insights are obtained using familiar concepts in that context such as the Riemann curvature. The differential geometry is not strictly that for curved spaces as in general relativity, a more adequate mathematical framework being provided by fiber bundles. The language used here, however, will be restricted to tensors and basic quantum mechanics. The local gauge implied by a smoothly varying basis set readily connects with Berry's formalism for geometric phases. Generalized expressions for the Berry connection and curvature are obtained for a parameter-dependent occupied Hilbert space spanned by nonorthogonal Wannier functions. The formalism is applicable to basis sets made of atomic-like orbitals and also more adaptive moving basis functions (such as in methods using Wannier functions as intermediate or support bases), but should also apply to other situations in which nonorthogonal functions or related projectors arise. The formalism is applied to the time-dependent quantum evolution of electrons for moving atoms. The geometric insights provided here allow us to propose new finite-difference time integrators, and also better understand those already proposed.

  14. Synthetic Model of the Oxygen-Evolving Center: Photosystem II under the Spotlight.

    PubMed

    Yu, Yang; Hu, Cheng; Liu, Xiaohong; Wang, Jiangyun

    2015-09-21

    The oxygen-evolving center (OEC) in photosystem II catalyzes a water splitting reaction. Great efforts have already been made to artificially synthesize the OEC, in order to elucidate the structure-function relationship and the mechanism of the reaction. Now, a new synthetic model makes the best mimic yet of the OEC. This recent study opens up the possibility to study the mechanism of photosystem II and photosynthesis in general for applications in renewable energy and synthetic biology.

  15. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
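    The modularization idea described above, computing module reliabilities separately and combining them into a total system value, can be sketched with the standard series/parallel combination rules. This is a generic illustration, not RML or REST code; the module reliability values are hypothetical:

    ```python
    # Generic series/parallel reliability combination for modular models.
    # Module reliability values below are hypothetical.
    from math import prod

    def series(reliabilities):
        """Modules in series: the system works only if every module works."""
        return prod(reliabilities)

    def parallel(reliabilities):
        """Redundant modules in parallel: the system fails only if all modules fail."""
        return 1.0 - prod(1.0 - r for r in reliabilities)

    # Hypothetical system: two redundant sensors feeding a single controller
    r_system = series([parallel([0.95, 0.95]), 0.999])
    print(f"system reliability: {r_system:.6f}")
    ```

    Because each submodel exposes only a single reliability number, modules can be tested in isolation and then composed, which is the one-to-one mapping between system components and reliability modules that the abstract describes.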

  16. Sauropod dinosaurs evolved moderately sized genomes unrelated to body size.

    PubMed

    Organ, Chris L; Brusatte, Stephen L; Stein, Koen

    2009-12-22

    Sauropodomorph dinosaurs include the largest land animals to have ever lived, some reaching up to 10 times the mass of an African elephant. Despite their status defining the upper range for body size in land animals, it remains unknown whether sauropodomorphs evolved larger-sized genomes than non-avian theropods, their sister taxon, or whether a relationship exists between genome size and body size in dinosaurs, two questions critical for understanding broad patterns of genome evolution in dinosaurs. Here we report inferences of genome size for 10 sauropodomorph taxa. The estimates are derived from a Bayesian phylogenetic generalized least squares approach that generates posterior distributions of regression models relating genome size to osteocyte lacunae volume in extant tetrapods. We estimate that the average genome size of sauropodomorphs was 2.02 pg (range of species means: 1.77-2.21 pg), a value in the upper range of extant birds (mean = 1.42 pg, range: 0.97-2.16 pg) and near the average for extant non-avian reptiles (mean = 2.24 pg, range: 1.05-5.44 pg). The results suggest that the variation in size and architecture of genomes in extinct dinosaurs was lower than the variation found in mammals. A substantial difference in genome size separates the two major clades within dinosaurs, Ornithischia (large genomes) and Saurischia (moderate to small genomes). We find no relationship between body size and estimated genome size in extinct dinosaurs, which suggests that neutral forces did not dominate the evolution of genome size in this group.

  17. Sauropod dinosaurs evolved moderately sized genomes unrelated to body size

    PubMed Central

    Organ, Chris L.; Brusatte, Stephen L.; Stein, Koen

    2009-01-01

    Sauropodomorph dinosaurs include the largest land animals to have ever lived, some reaching up to 10 times the mass of an African elephant. Despite their status defining the upper range for body size in land animals, it remains unknown whether sauropodomorphs evolved larger-sized genomes than non-avian theropods, their sister taxon, or whether a relationship exists between genome size and body size in dinosaurs, two questions critical for understanding broad patterns of genome evolution in dinosaurs. Here we report inferences of genome size for 10 sauropodomorph taxa. The estimates are derived from a Bayesian phylogenetic generalized least squares approach that generates posterior distributions of regression models relating genome size to osteocyte lacunae volume in extant tetrapods. We estimate that the average genome size of sauropodomorphs was 2.02 pg (range of species means: 1.77–2.21 pg), a value in the upper range of extant birds (mean = 1.42 pg, range: 0.97–2.16 pg) and near the average for extant non-avian reptiles (mean = 2.24 pg, range: 1.05–5.44 pg). The results suggest that the variation in size and architecture of genomes in extinct dinosaurs was lower than the variation found in mammals. A substantial difference in genome size separates the two major clades within dinosaurs, Ornithischia (large genomes) and Saurischia (moderate to small genomes). We find no relationship between body size and estimated genome size in extinct dinosaurs, which suggests that neutral forces did not dominate the evolution of genome size in this group. PMID:19793755

  18. Fifty Years of Evolving Partnerships in Veterinary Medical Education.

    PubMed

    Kochevar, Deborah T

    2015-01-01

    The Association of American Veterinary Medical College's (AAVMC's) role in the progression of academic veterinary medical education has been about building successful partnerships in the US and internationally. Membership in the association has evolved over the past 50 years, as have traditions of collaboration that strengthen veterinary medical education and the association. The AAVMC has become a source of information and a place for debate on educational trends, innovative pedagogy, and the value of a diverse learning environment. The AAVMC's relationship with the American Veterinary Medical Association Council on Education (AVMA COE), the accreditor of veterinary medical education recognized by the United Sates Department of Education (DOE), is highlighted here because of the key role that AAVMC members have played in the evolution of veterinary accreditation. The AAVMC has also been a partner in the expansion of veterinary medical education to include global health and One Health and in the engagement of international partners around shared educational opportunities and challenges. Recently, the association has reinforced its desire to be a truly international organization rather than an American organization with international members. To that end, strategic AAVMC initiatives aim to expand and connect the global community of veterinary educators to the benefit of students and the profession around the world. Tables in this article are intended to provide historical context, chronology, and an accessible way to view highlights.

  19. Highly dynamically evolved intermediate-age open clusters

    NASA Astrophysics Data System (ADS)

    Piatti, Andrés E.; Dias, Wilton S.; Sampedro, Laura M.

    2017-04-01

    We present a comprehensive UBVRI and Washington CT1T2 photometric analysis of seven catalogued open clusters, namely: Ruprecht 3, 9, 37, 74, 150, ESO 324-15 and 436-2. The multiband photometric data sets in combination with 2MASS photometry and Gaia astrometry for the brighter stars were used to estimate their structural parameters and fundamental astrophysical properties. We found that Ruprecht 3 and ESO 436-2 do not show self-consistent evidence of being physical systems. The remaining objects are open clusters of intermediate age (9.0 ≤ log(t yr^-1) ≤ 9.6), of relatively small size (rcls ∼ 0.4-1.3 pc) and placed between 0.6 and 2.9 kpc from the Sun. We analysed the relationships between core, half-mass, tidal and Jacobi radii as well as half-mass relaxation times to conclude that the studied clusters are in an evolved dynamical stage. The total cluster masses obtained by summing those of the observed cluster stars turned out to be ∼10-15 per cent of the masses of open clusters of similar age located closer than 2 kpc from the Sun. We found that cluster stars occupy volumes as large as those for tidally filled clusters.

  20. Complex Formation History of Highly Evolved Basaltic Shergottite, Zagami

    NASA Technical Reports Server (NTRS)

    Niihara, T.; Misawa, K.; Mikouchi, T.; Nyquist, L. E.; Park, J.; Hirata, D.

    2012-01-01

    Zagami, a basaltic shergottite, contains several kinds of lithologies such as Normal Zagami consisting of Fine-grained (FG) and Coarse-grained (CG), Dark Mottled lithology (DML), and Olivine-rich late-stage melt pocket (DN). Treiman and Sutton concluded that Zagami (Normal Zagami) is a fractional crystallization product from a single magma. It has been suggested that there were two igneous stages (deep magma chamber and shallow magma chamber or surface lava flow) on the basis of chemical zoning features of pyroxenes which have homogeneous Mg-rich cores and FeO, CaO zoning at the rims. Nyquist et al. reported that FG has a different initial Sr isotopic ratio than CG and DML, and suggested the possibility of magma mixing on Mars. Here we report new results of petrology and mineralogy for DML and the Olivine-rich lithology (we do not use DN here), the most evolved lithology in this rock, to understand the relationship among lithologies and reveal Zagami's formation history.

  1. Predatory prokaryotes: predation and primary consumption evolved in bacteria

    NASA Technical Reports Server (NTRS)

    Guerrero, R.; Pedros-Alio, C.; Esteve, I.; Mas, J.; Chase, D.; Margulis, L.

    1986-01-01

    Two kinds of predatory bacteria have been observed and characterized by light and electron microscopy in samples from freshwater sulfurous lakes in northeastern Spain. The first bacterium, named Vampirococcus, is Gram-negative and ovoidal (0.6 micrometer wide). An anaerobic epibiont, it adheres to the surface of phototrophic bacteria (Chromatium spp.) by specific attachment structures and, as it grows and divides by fission, destroys its prey. An important in situ predatory role can be inferred for Vampirococcus from direct counts in natural samples. The second bacterium, named Daptobacter, is a Gram-negative, facultatively anaerobic straight rod (0.5 x 1.5 micrometers) with a single polar flagellum, which collides, penetrates, and grows inside the cytoplasm of its prey (several genera of Chromatiaceae). Considering also the well-known case of Bdellovibrio, a Gram-negative, aerobic curved rod that penetrates and divides in the periplasmic space of many chemotrophic Gram-negative bacteria, there are three types of predatory prokaryotes presently known (epibiotic, cytoplasmic, and periplasmic). Thus, we conclude that antagonistic relationships such as primary consumption, predation, and scavenging had already evolved in microbial ecosystems prior to the appearance of eukaryotes. Furthermore, because they represent methods by which prokaryotes can penetrate other prokaryotes in the absence of phagocytosis, these associations can be considered preadaptation for the origin of intracellular organelles.

  3. Evolving Improvements to TRMM Ground Validation Rainfall Estimates

    NASA Technical Reports Server (NTRS)

    Robinson, M.; Kulie, M. S.; Marks, D. A.; Wolff, D. B.; Ferrier, B. S.; Amitai, E.; Silberstein, D. S.; Fisher, B. L.; Wang, J.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The primary function of the TRMM Ground Validation (GV) Program is to create GV rainfall products that provide basic validation of satellite-derived precipitation measurements for select primary sites. Since the successful 1997 launch of the TRMM satellite, GV rainfall estimates have demonstrated systematic improvements directly related to improved radar and rain gauge data, modified science techniques, and software revisions. Improved rainfall estimates have resulted in higher quality GV rainfall products and subsequently, much improved evaluation products for the satellite-based precipitation estimates from TRMM. This presentation will demonstrate how TRMM GV rainfall products created in a semi-automated, operational environment have evolved and improved through successive generations. Monthly rainfall maps and rainfall accumulation statistics for each primary site will be presented for each stage of GV product development. Contributions from individual product modifications involving radar reflectivity (Ze)-rain rate (R) relationship refinements, improvements in rain gauge bulk-adjustment and data quality control processes, and improved radar and gauge data will be discussed. Finally, it will be demonstrated that as GV rainfall products have improved, rainfall estimation comparisons between GV and satellite have converged, lending confidence to the satellite-derived precipitation measurements from TRMM.
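    The reflectivity-rain rate (Ze-R) refinements mentioned above rest on a power-law relationship of the form Z = aR^b. A minimal sketch of inverting such a relationship, using illustrative convective coefficients (a = 300, b = 1.4) rather than the TRMM GV operational values:

    ```python
    # Inverting a power-law reflectivity-rain-rate (Z-R) relationship, Z = a * R**b.
    # The coefficients below (a=300, b=1.4) are common convective values used for
    # illustration; they are not the TRMM GV operational coefficients.

    def rain_rate_from_dbz(dbz, a=300.0, b=1.4):
        """Convert reflectivity in dBZ to rain rate R in mm/h via Z = a * R**b."""
        z_linear = 10.0 ** (dbz / 10.0)   # Z in mm^6 m^-3
        return (z_linear / a) ** (1.0 / b)

    print(f"40 dBZ -> {rain_rate_from_dbz(40.0):.1f} mm/h")
    ```

    Because rain rate depends exponentially on the dBZ value, small refinements to a and b or to the radar calibration propagate into substantial changes in monthly rainfall accumulations, which is why successive GV product generations improved as these were tuned.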

  4. Did the ctenophore nervous system evolve independently?

    PubMed

    Ryan, Joseph F

    2014-08-01

    Recent evidence supports the placement of ctenophores as the most distant relative to all other animals. This revised animal tree means that either the ancestor of all animals possessed neurons (and that sponges and placozoans apparently lost them) or that ctenophores developed them independently. Differentiating between these possibilities is important not only from a historical perspective, but also for the interpretation of a wide range of neurobiological results. In this short perspective paper, I review the evidence in support of each scenario and show that the relationship between the nervous system of ctenophores and other animals is an unsolved, yet tractable problem. Copyright © 2014 Elsevier GmbH. All rights reserved.

  5. Hemicrania continua evolving from episodic paroxysmal hemicrania.

    PubMed

    Castellanos-Pinedo, F; Zurdo, M; Martínez-Acebes, E

    2006-09-01

    A 45-year-old woman, who had been diagnosed in our unit with episodic paroxysmal hemicrania, was seen 2 years later for ipsilateral hemicrania continua in remitting form. Both types of headache had a complete response to indomethacin and did not occur simultaneously. The patient had a previous history of episodic moderate headaches that met criteria for probable migraine without aura and also had a family history of headache. The clinical course in this case suggests a pathogenic relationship between both types of primary headache.

  6. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
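    The importance-sampling idea can be illustrated in a reduced form: estimate a small failure probability for a standard-normal variable by sampling from a density shifted toward the failure region and reweighting by the likelihood ratio. This is a plain, non-adaptive sketch of importance sampling, not the AIS algorithm of the paper; the limit state and parameters are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def importance_sampling_pf(g, mu_q, sigma_q=1.0, n=100_000):
        """Estimate P(g(X) < 0) for standard-normal X by importance sampling.

        Samples are drawn from N(mu_q, sigma_q) placed near the failure region
        and reweighted by the likelihood ratio p(x)/q(x).
        """
        x = rng.normal(mu_q, sigma_q, size=n)
        log_p = -0.5 * x**2
        log_q = -0.5 * ((x - mu_q) / sigma_q) ** 2 - np.log(sigma_q)
        w = np.exp(log_p - log_q)           # likelihood ratio (normalizing constants cancel)
        return float(np.mean((g(x) < 0.0) * w))

    # Hypothetical limit state g(x) = 3 - x: failure when x > 3,
    # exact P_f = 1 - Phi(3), roughly 1.35e-3
    pf = importance_sampling_pf(lambda x: 3.0 - x, mu_q=3.0)
    ```

    Centering the sampling density near the failure boundary means most samples fall where failures occur, so far fewer samples are needed than with crude Monte Carlo; the adaptive scheme in the paper goes further by updating the sampling domain incrementally from the failure points found so far.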

  8. Star formation in evolving molecular clouds

    NASA Astrophysics Data System (ADS)

    Völschow, M.; Banerjee, R.; Körtgen, B.

    2017-09-01

    Molecular clouds are the principal stellar nurseries of our universe; they thus remain a focus of both observational and theoretical studies. From observations, some of the key properties of molecular clouds are well known but many questions regarding their evolution and star formation activity remain open. While numerical simulations feature a large number and complexity of involved physical processes, this plethora of effects may hide the fundamentals that determine the evolution of molecular clouds and enable the formation of stars. Purely analytical models, on the other hand, tend to suffer from rough approximations or a lack of completeness, limiting their predictive power. In this paper, we present a model that incorporates central concepts of astrophysics as well as reliable results from recent simulations of molecular clouds and their evolutionary paths. Based on that, we construct a self-consistent semi-analytical framework that describes the formation, evolution, and star formation activity of molecular clouds, including a number of feedback effects to account for the complex processes inside those objects. The final equation system is solved numerically but at much lower computational expense than, for example, hydrodynamical descriptions of comparable systems. The model presented in this paper agrees well with a broad range of observational results, showing that molecular cloud evolution can be understood as an interplay between accretion, global collapse, star formation, and stellar feedback.

  9. Shoulder arthroplasty: evolving techniques and indications.

    PubMed

    Walch, Gilles; Boileau, Pascal; Noël, Eric

    2010-12-01

    The development of modern shoulder replacement surgery started over half a century ago with the pioneering work done by CS Neer. Several designs for shoulder prostheses are now available, allowing surgeons to select the best design for each situation. When the rotator cuff is intact, unconstrained prostheses produce reliable and reproducible results, with prosthesis survival rates of 97% after 10 years and 84% after 20 years. In patients with three- or four-part fractures of the proximal humerus, the outcome of shoulder arthroplasty depends largely on healing of the greater tuberosity, which is therefore a major treatment objective. Factors crucial to greater tuberosity union include selection of the optimal prosthesis design, flawless fixation of the tuberosities, and appropriate postoperative immobilization. The reverse shoulder prosthesis developed by Grammont has been recognized since 1991 as a valid option for patients with glenohumeral osteoarthritis. Ten-year prosthesis survival rates are 91% overall (including trauma and revisions) and 94% for glenohumeral osteoarthritis with head migration. These good results are generating interest in the reverse shoulder prosthesis as a treatment option in situations where unconstrained prostheses are unsatisfactory (primary glenohumeral osteoarthritis with marked glenoid cavity erosion; comminuted fractures in patients older than 75 years; post-traumatic osteoarthritis with severe tuberosity malunion or nonunion; massive irreparable rotator cuff tears with pseudoparalysis; failed rotator cuff repair; and proximal humerus tumor requiring resection of the rotator cuff insertions).

  10. The Evolvement of Automobile Steering System Based on TRIZ

    NASA Astrophysics Data System (ADS)

    Zhao, Xinjun; Zhang, Shuang

    Like organisms in biological evolution, products and techniques pass through a life cycle of birth, growth, maturity, and decline before leaving the stage. The development of products and techniques conforms to certain evolution rules. If people know and apply these rules, they can design new kinds of products and forecast product development trends. Enterprises can thereby anticipate the future technical direction of their products and pursue product and technique innovation. In this paper, based on TRIZ theory, the mechanism evolution, function evolution, and appearance evolution of the automobile steering system are analyzed, and some new ideas about the future automobile steering system are put forward.

  11. The Development of a Computerized System for the Estimation of Reliability for Measurement Systems Employing Interval or Ratio Data.

    ERIC Educational Resources Information Center

    Porter, D. Thomas

    Critical to precise quantitative research is reliability estimation. Researchers have limited tools, however, to assess the reliability of evolving instruments. Consequently, cursory assessment is typical and in-depth evaluation is rare. This paper presents a rationale for and description of PIAS, a computerized instrument analysis system. PIAS…

  12. Mechanics of evolving thin film structures

    NASA Astrophysics Data System (ADS)

    Liang, Jim

    In the Stranski-Krastanov system, the lattice mismatch between the film and the substrate causes the film to break into islands. During annealing, both the surface energy and the elastic energy drive the islands to coarsen. Motivated by several related studies, we suggest that stable islands should form when a stiff ceiling is placed at a small gap above the film. We show that the role of elasticity is reversed: with the ceiling, the total elastic energy stored in the system increases as the islands coarsen laterally. Consequently, the islands select an equilibrium size to minimize the combined elastic energy and surface energy. In lithographically-induced self-assembly, when a two-phase fluid confined between parallel substrates is subjected to an electric field, one phase can self-assemble into a triangular lattice of islands in another phase. We describe a theory of the stability of the island lattice. The islands select the equilibrium diameter to minimize the combined interface energy and electrostatic energy. Furthermore, we study compressed SiGe thin film islands fabricated on a glass layer, which itself lies on a silicon wafer. Upon annealing, the glass flows, and the islands relax. A small island relaxes by in-plane expansion. A large island, however, wrinkles at the center before the in-plane relaxation arrives. The wrinkles may cause significant tensile stress in the island, leading to fracture. We model the island by the von Karman plate theory and the glass layer by the Reynolds lubrication theory. Numerical simulations evolve the in-plane expansion and the wrinkles simultaneously. We determine the critical island size, below which in-plane expansion prevails over wrinkling. Finally, in devices that integrate dissimilar materials in small dimensions, crack extension in one material often accompanies inelastic deformation in another. We analyze a channel crack advancing in an elastic film under tension, while an underlayer creeps. We use a two

  13. Is quantitative electromyography reliable?

    PubMed

    Cecere, F; Ruf, S; Pancherz, H

    1996-01-01

    The reliability of quantitative electromyography (EMG) of the masticatory muscles was investigated in 14 subjects without any signs or symptoms of temporomandibular disorders. Integrated EMG activity from the anterior temporalis and masseter muscles was recorded bilaterally by means of bipolar surface electrodes during chewing and biting activities. In the first experiment, the influence of electrode relocation was investigated. No influence of electrode relocation on the recorded EMG signal could be detected. In a second experiment, three sessions of EMG recordings during five different chewing and biting activities were performed in the morning (I); 1 hour later without intermediate removal of the electrodes (II); and in the afternoon, using new electrodes (III). The method errors for different time intervals (I-II and I-III errors) for each muscle and each function were calculated. Depending on the time interval between the EMG recordings, the muscles considered, and the function performed, the individual errors ranged from 5% to 63%. The method error increased significantly (P < .05 to P < .01) with the time interval between recordings. The error for the masseter (mean 27.2%) was higher than for the temporalis (mean 20.0%). The largest function error was found during maximal biting in intercuspal position (mean 23.1%). Based on the findings, quantitative electromyography of the masticatory muscles seems to have a limited value in diagnostics and in the evaluation of individual treatment results.
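
    The abstract does not state which method-error statistic was computed; Dahlberg's formula, ME = sqrt(Σd²/2n) over duplicate measurements, is a common choice in such replicate studies. A minimal sketch under that assumption, with hypothetical integrated-EMG values:

    ```python
    import math

    def dahlberg_method_error(first, second):
        # Dahlberg's method error between duplicate measurements:
        # ME = sqrt(sum(d_i^2) / (2 n))  (assumed statistic, not
        # confirmed by the abstract)
        assert len(first) == len(second)
        d2 = sum((a - b) ** 2 for a, b in zip(first, second))
        return math.sqrt(d2 / (2 * len(first)))

    # Hypothetical integrated-EMG values (arbitrary units), sessions I and II
    s1 = [10.2, 12.8, 9.5, 11.1, 13.0]
    s2 = [11.0, 12.1, 10.4, 10.5, 12.2]

    me = dahlberg_method_error(s1, s2)
    grand_mean = sum(s1 + s2) / (len(s1) + len(s2))
    relative_error_pct = 100 * me / grand_mean
    ```

    Expressing the method error as a percentage of the grand mean, as done here, is what allows errors from different muscles and functions to be compared on one scale, as in the 5% to 63% range the study reports.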

  14. The tortoise and the hare: slowly evolving T-cell responses take hastily evolving KIR

    PubMed Central

    van Bergen, Jeroen; Koning, Frits

    2010-01-01

    The killer cell immunoglobulin-like receptor (KIR) locus comprises a variable and rapidly evolving set of genes encoding multiple inhibitory and activating receptors. The activating receptors recently evolved from the inhibitory receptors and both bind HLA class I and probably also class I-like structures induced by viral infection. Although generally considered natural killer (NK) cell receptors, KIR are also expressed by a large fraction of effector memory T cells, which slowly accumulate during human life. These effector memory cells are functionally similar to NK cells, as they are immediate effector cells that are cytotoxic and produce IFN-γ. However, different rules apply to NK and T cells with respect to KIR expression and function. For example, KIR tend to modulate signals driven by the T-cell receptor (TCR) rather than to act independently, and use different signal transduction pathways to modulate only a subset of effector functions. The most important difference may lie in the rules governing tolerance: while NK cells with activating KIR binding self-HLA are hyporesponsive, the same is unlikely to apply to T cells. We argue that the expression of activating KIR on virus-specific T cells carrying TCR that weakly cross-react with autoantigens can unleash the autoreactive potential of these cells. This may be the case in rheumatoid arthritis, where cytomegalovirus-specific KIR2DS2+ T cells might cause vasculitis. Thus, the rapid evolution of activating KIR may have allowed for efficient NK-cell control of viruses, but may also have increased the risk that slowly evolving T-cell responses to persistent pathogens derail into autoimmunity. PMID:20722764

  15. The reliability of calculated laboratory results.

    PubMed

    Coskun, Abdurrahman

    2005-01-01

    In clinical laboratories, patient results can be obtained in two ways: (i) by direct determination of requested tests using various chemical methods, or (ii) by calculation of unknown test results from relationships between measured tests. The reliability of measured tests can be checked by various quality control rules; however, no check is performed on the reliability of calculated data. In this study we develop a method using Taylor series expansion, together with an alternative equation, to obtain the standard deviation of calculated laboratory tests, and we discuss the reliability of calculated data. To obtain reliable test results by calculation instead of measurement by chemical methods, the standard deviation of each measured component of the equation must be thoroughly analyzed, and then the standard deviation of the equation must be determined. We conclude that the analytical coefficient of variation of each measured component must be sufficiently low to obtain an acceptable analytical coefficient of variation for the calculated test. Otherwise, the concentration of the requested test should be measured by chemical methods instead of being calculated from an equation using the specified components.
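
    The first-order Taylor-series propagation the abstract describes (the delta method) can be sketched generically. The function, means, and standard deviations below are hypothetical illustrations, not values from the paper; independence of the measured components is assumed.

    ```python
    import math

    def propagated_sd(f, means, sds, eps=1e-6):
        # First-order Taylor (delta-method) SD of f(x1, ..., xn),
        # assuming the measured components are independent:
        # SD(f) ~= sqrt( sum_i (df/dx_i * sd_i)^2 )
        base = f(*means)
        var = 0.0
        for i, (m, s) in enumerate(zip(means, sds)):
            shifted = list(means)
            shifted[i] = m + eps
            dfdx = (f(*shifted) - base) / eps   # numeric partial derivative
            var += (dfdx * s) ** 2
        return math.sqrt(var)

    # Hypothetical example: anion gap = Na - Cl - HCO3 (mmol/L)
    gap_sd = propagated_sd(lambda na, cl, hco3: na - cl - hco3,
                           [140.0, 104.0, 24.0], [1.4, 1.0, 0.8])
    # CV of the calculated gap, in percent, at a gap of 12 mmol/L
    gap_cv = 100 * gap_sd / (140.0 - 104.0 - 24.0)
    ```

    For this linear combination the result reduces to sqrt(1.4² + 1.0² + 0.8²) ≈ 1.9 mmol/L, i.e. a CV near 16% for the calculated gap even though each component's CV is only 1–3%, which illustrates the paper's point that component CVs must be low for a calculated test to be acceptable.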

  16. Ethical Implications of Validity-vs.-Reliability Trade-Offs in Educational Research

    ERIC Educational Resources Information Center

    Fendler, Lynn

    2016-01-01

    In educational research that calls itself empirical, the relationship between validity and reliability is that of trade-off: the stronger the bases for validity, the weaker the bases for reliability (and vice versa). Validity and reliability are widely regarded as basic criteria for evaluating research; however, there are ethical implications of…

  18. Could life have evolved in cometary nuclei

    NASA Technical Reports Server (NTRS)

    Bar-Nun, A.; Lazcano-Araujo, A.; Oro, J.

    1981-01-01

    The suggestion by Hoyle and Wickramasinghe (1978) that life might have originated in cometary nuclei rather than directly on the earth is discussed. Factors in the cometary environment including the conditions at perihelion passage leading to the ablation of cometary ices, ice temperatures, the absence of an atmosphere and discrete liquid and solid surfaces, weak cometary structure incapable of supporting a liquid core, and radiation are presented as arguments against biopoesis in comets. It is concluded that although the contribution of cometary and meteoritic matter was significant in shaping the earth environment, the view that life on earth originally arose in comets is untenable, and the proposition that the process of interplanetary infection still occurs is unlikely in view of the high specificity of host-parasite relationships.

  20. Food Addiction: An Evolving Nonlinear Science

    PubMed Central

    Shriner, Richard; Gold, Mark

    2014-01-01

    The purpose of this review is to familiarize readers with the role that addiction plays in the formation and treatment of obesity, type 2 diabetes, and disorders of eating. We will outline several useful models that integrate metabolism, addiction, and human relationship adaptations to eating. A special effort will be made to demonstrate how simple and straightforward nonlinear models can be, and are being, used to improve our knowledge and treatment of patients suffering from nutritional pathology. Moving forward, the reader should be able to incorporate some of the findings in this review into their own practice, research, teaching efforts, or other interests in the fields of nutrition, diabetes, and/or bariatric (weight) management. PMID:25421535