Science.gov

Sample records for evolving reliable relationships

  1. Reliability of genetic networks is evolvable

    NASA Astrophysics Data System (ADS)

    Braunewell, Stefan; Bornholdt, Stefan

    2008-06-01

    Control of the living cell functions with remarkable reliability despite the stochastic nature of the underlying molecular networks—a property presumably optimized by biological evolution. We ask here to what extent the ability of a stochastic dynamical network to produce reliable dynamics is an evolvable trait. Using an evolutionary algorithm based on a deterministic selection criterion for the reliability of dynamical attractors, we evolve networks of noisy discrete threshold nodes. We find that, starting from any random network, reliability of the attractor landscape can often be achieved with only a few small changes to the network structure. Further, the evolvability of networks toward reliable dynamics while retaining their function is investigated and a high success rate is found.
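
    As a minimal sketch of the kind of procedure described above (not the authors' exact model), the following Python fragment hill-climbs a noisy Boolean threshold network toward a more reliable attractor; the network size, noise level, and acceptance rule are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(0)

      N = 8          # nodes (illustrative size)
      T = 30         # update steps per evaluation
      NOISE = 0.02   # per-node, per-step flip probability (assumed)
      TRIALS = 50    # noisy runs used to score reliability

      def step(state, W, noise=0.0):
          # Threshold update: a node switches on if its weighted input is positive.
          new = (W @ state > 0).astype(int)
          flips = rng.random(new.size) < noise
          new[flips] = 1 - new[flips]
          return new

      def reliability(W, s0):
          # Fraction of noisy runs that end in the same state as the noise-free run.
          ref = s0.copy()
          for _ in range(T):
              ref = step(ref, W)
          hits = 0
          for _ in range(TRIALS):
              s = s0.copy()
              for _ in range(T):
                  s = step(s, W, NOISE)
              hits += int(np.array_equal(s, ref))
          return hits / TRIALS

      # Start from a random threshold network and greedily accept single-link
      # mutations that do not decrease the estimated reliability.
      W = rng.integers(-1, 2, size=(N, N))
      s0 = rng.integers(0, 2, size=N)
      score = reliability(W, s0)
      for generation in range(200):
          W_mut = W.copy()
          i, j = rng.integers(0, N, size=2)
          W_mut[i, j] = rng.integers(-1, 2)   # mutate a single link
          new_score = reliability(W_mut, s0)
          if new_score >= score:
              W, score = W_mut, new_score
      print(f"final reliability estimate: {score:.2f}")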

  2. An Evolving Relationship.

    ERIC Educational Resources Information Center

    May, Therese M.

    1990-01-01

    Responds to five major articles by Duckworth, Goldman, Healy, Sampson, and Goodyear on issues pertaining to testing and assessment in counseling psychology. Suggests that the interactive, collaborative aspects of the assessment relationship between psychologist and client need more attention. (TE)

  3. Relatedness influences signal reliability in evolving robots.

    PubMed

    Mitri, Sara; Floreano, Dario; Keller, Laurent

    2011-02-01

    Communication is an indispensable component of animal societies, yet many open questions remain regarding the factors affecting the evolution and reliability of signalling systems. A potentially important factor is the level of genetic relatedness between signallers and receivers. To quantitatively explore the role of relatedness in the evolution of reliable signals, we conducted artificial evolution over 500 generations in a system of foraging robots that can emit and perceive light signals. By devising a quantitative measure of signal reliability, and comparing independently evolving populations differing in within-group relatedness, we show a strong positive correlation between relatedness and reliability. Unrelated robots produced unreliable signals, whereas highly related robots produced signals that reliably indicated the location of the food source and thereby increased performance. Comparisons across populations also revealed that the frequency of signal production (which is often used as a proxy of signal reliability in empirical studies on animal communication) is a poor predictor of signal reliability and, accordingly, is not consistently correlated with group performance. This has important implications for our understanding of signal evolution and the empirical tools that are used to investigate communication.


  4. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, Troy; Toon, Jamie; Conner, Angelo C.; Adams, Timothy C.; Miranda, David J.

    2016-01-01

    This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program’s subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. These new allocations are based on updated designs and maintainability characteristics of the components. It was found that trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.
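
    The abstracts do not spell out the GSDO allocation formulas; purely as a generic illustration of how a system-level reliability requirement can be apportioned to subsystems and then reallocated once predicted failure rates mature, here is a short Python sketch using equal apportionment and a failure-rate-weighted (ARINC-style) reallocation. All subsystem names and numbers are invented.

      import math

      # System-level reliability requirement over one mission (invented number)
      R_system = 0.99
      subsystems = ["power", "fluids", "comms", "structures"]

      # Equal apportionment: every subsystem gets the same allocated reliability.
      n = len(subsystems)
      R_equal = R_system ** (1.0 / n)
      print(f"equal apportionment: R_i = {R_equal:.5f} for each of {n} subsystems")

      # ARINC-style reallocation once predicted failure rates are available:
      # each subsystem's share of the allowed system failure rate is proportional
      # to its currently predicted failure rate (illustrative values, per hour).
      predicted_lambda = {"power": 2e-5, "fluids": 1e-5, "comms": 4e-5, "structures": 5e-6}
      mission_hours = 100.0
      lambda_system = -math.log(R_system) / mission_hours   # allowed system failure rate
      total = sum(predicted_lambda.values())
      for name, lam in predicted_lambda.items():
          lam_alloc = lambda_system * lam / total
          R_alloc = math.exp(-lam_alloc * mission_hours)
          print(f"{name:11s} allocated lambda = {lam_alloc:.2e}/h, R = {R_alloc:.5f}")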

  5. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, Jamie; Toon, Troy; Adams, Timothy C.; Miranda, David J.

    2016-01-01

    This paper describes the methodology that was developed to allocate reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program's subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. Allocating is an iterative process; as systems moved beyond their conceptual and preliminary design phases this provided an opportunity for the reliability engineering team to reevaluate allocations based on updated designs and maintainability characteristics of the components. Trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper will discuss the value of modifying reliability and maintainability allocations made for the GSDO subsystems as the program nears the end of its design phase.

  6. Towards resolving Lamiales relationships: insights from rapidly evolving chloroplast sequences

    PubMed Central

    2010-01-01

    Background: In the large angiosperm order Lamiales, a diverse array of highly specialized life strategies such as carnivory, parasitism, epiphytism, and desiccation tolerance occur, and some lineages possess drastically accelerated DNA substitutional rates or miniaturized genomes. However, understanding the evolution of these phenomena in the order, and clarifying borders of and relationships among lamialean families, has been hindered by largely unresolved trees in the past. Results: Our analysis of the rapidly evolving trnK/matK, trnL-F and rps16 chloroplast regions enabled us to infer more precise phylogenetic hypotheses for the Lamiales. Relationships among the nine first-branching families in the Lamiales tree are now resolved with very strong support. Subsequent to Plocospermataceae, a clade consisting of Carlemanniaceae plus Oleaceae branches, followed by Tetrachondraceae and a newly inferred clade composed of Gesneriaceae plus Calceolariaceae, which is also supported by morphological characters. Plantaginaceae (incl. Gratioleae) and Scrophulariaceae are well separated in the backbone grade; Lamiaceae and Verbenaceae appear in distant clades, while the recently described Linderniaceae are confirmed to be monophyletic and in an isolated position. Conclusions: Confidence about deep nodes of the Lamiales tree is an important step towards understanding the evolutionary diversification of a major clade of flowering plants. The degree of resolution obtained here now provides a first opportunity to discuss the evolution of morphological and biochemical traits in Lamiales. The multiple independent evolution of the carnivorous syndrome, once in Lentibulariaceae and a second time in Byblidaceae, is strongly supported by all analyses and topological tests. The evolution of selected morphological characters such as flower symmetry is discussed. The addition of further sequence data from introns and spacers holds promise to eventually obtain a fully resolved plastid tree of

  7. [Study of the relationship between human quality and reliability].

    PubMed

    Long, S; Wang, C; Wang, Li; Yuan, J; Liu, H; Jiao, X

    1997-02-01

    To clarify the relationship between human quality and reliability, 1925 experiments in 20 subjects were carried out to study the relationship between disposition character, digital memory, graphic memory, multiple reaction time, and education level on the one hand and simulated aircraft operation on the other. Meanwhile, effects of task difficulty and environmental factors on human reliability were also studied. The results showed that human quality can be predicted and evaluated through experimental methods. The better the human quality, the higher the human reliability. PMID:11539889

  8. Cats: their history and our evolving relationship with them.

    PubMed

    2016-07-01

    Cats have had a long relationship with people, and their history as a domesticated animal can be traced back as far as 2000 BC. Delegates at a recent conference titled 'People, cats and vets through history' delved a little deeper into the changing nature of this relationship. Georgina Mills reports. PMID:27389749

  9. Young People and Alcohol in Italy: An Evolving Relationship

    ERIC Educational Resources Information Center

    Beccaria, Franca; Prina, Franco

    2010-01-01

    In Italy, commonly held opinions and interpretations about the relationship between young people and alcohol are often expressed as generalizations and approximations. In order to further understanding of the relationship between young people and alcohol in contemporary Italy, we have gathered, compared and discussed all the available data, both…

  10. Risk and responsibility: a complex and evolving relationship.

    PubMed

    Kermisch, Céline

    2012-03-01

    This paper analyses the nature of the relationship between risk and responsibility. Since neither the concept of risk nor the concept of responsibility has an unequivocal definition, it is obvious that there is no single interpretation of their relationship. After introducing the different meanings of responsibility used in this paper, we analyse four conceptions of risk. This allows us to make their link with responsibility explicit and to determine if a shift in the connection between risk and responsibility can be outlined. (1) In the engineer's paradigm, the quantitative conception of risk does not include any concept of responsibility. Their relationship is indirect, the locus of responsibility being risk management. (2) In Mary Douglas' cultural theory, risks are constructed through the responsibilities they engage. (3) Rayner and (4) Wolff go further by integrating forms of responsibility in the definition of risk itself. Analysis of these four frameworks shows that the concepts of risk and responsibility are increasingly intertwined. This tendency is reinforced by increasing public awareness and a call for the integration of a moral dimension in risk management. Therefore, we suggest that a form of virtue-responsibility should also be integrated in the concept of risk. PMID:21103951

  11. Risk and responsibility: a complex and evolving relationship.

    PubMed

    Kermisch, Céline

    2012-03-01

    This paper analyses the nature of the relationship between risk and responsibility. Since neither the concept of risk nor the concept of responsibility has an unequivocal definition, it is obvious that there is no single interpretation of their relationship. After introducing the different meanings of responsibility used in this paper, we analyse four conceptions of risk. This allows us to make their link with responsibility explicit and to determine if a shift in the connection between risk and responsibility can be outlined. (1) In the engineer's paradigm, the quantitative conception of risk does not include any concept of responsibility. Their relationship is indirect, the locus of responsibility being risk management. (2) In Mary Douglas' cultural theory, risks are constructed through the responsibilities they engage. (3) Rayner and (4) Wolff go further by integrating forms of responsibility in the definition of risk itself. Analysis of these four frameworks shows that the concepts of risk and responsibility are increasingly intertwined. This tendency is reinforced by increasing public awareness and a call for the integration of a moral dimension in risk management. Therefore, we suggest that a form of virtue-responsibility should also be integrated in the concept of risk.

  12. [Creating a reliable therapeutic relationship with the patient].

    PubMed

    Matsuki, Kunihiro

    2012-01-01

    The factors necessary to create a reliable therapeutic relationship are presented in this paper. They include a demeanor and calmness of temperament as a psychiatric professional, a feeling of respect for the patient that is based on our common sense as human beings, an attitude of listening attentively to what the patient is revealing, maintaining an attitude of receptive neutrality, the ability to withstand the emotional burdens imposed on one by the patient, patience with any difficulty on one's own part to understand the patient, the ability to communicate clearly, including about the patient's negative aspects, and the ability to end psychiatric consultation sessions in a friendly and intimate manner. Creating a beneficial therapeutic relationship is about the building of a trusting relationship, in which the patient can constructively endure being questioned by us, or cope with the tough burdens we may place on them. However, a reliable relationship such as this contains paradoxes. Patients are able to talk to us about their suspicions, anxieties, dissatisfactions or anger only if the therapeutic relationship is good or based on trust. In other words, just like our patients, psychiatrists, too, must deal with what the patient brings and directs toward us. It is at this point that what we call a true therapeutic relationship starts.

  13. Considering context: reliable entity networks through contextual relationship extraction

    NASA Astrophysics Data System (ADS)

    David, Peter; Hawes, Timothy; Hansen, Nichole; Nolan, James J.

    2016-05-01

    Existing information extraction techniques can only partially address the problem of exploiting unreadably large amounts of text. When discussion of events and relationships is limited to simple, past-tense, factual descriptions of events, current NLP-based systems can identify events and relationships and extract a limited amount of additional information. But the simple subset of available information that existing tools can extract from text is only useful to a small set of users and problems. Automated systems need to find and separate information based on what is threatened or planned to occur, has occurred in the past, or could potentially occur. We address the problem of advanced event and relationship extraction with our event and relationship attribute recognition system, which labels generic, planned, recurring, and potential events. The approach is based on a combination of new machine learning methods, novel linguistic features, and crowd-sourced labeling. The attribute labeler closes the gap between structured event and relationship models and the complicated and nuanced language that people use to describe them. Our operational-quality event and relationship attribute labeler enables Warfighters and analysts to more thoroughly exploit information in unstructured text. This is made possible through 1) More precise event and relationship interpretation, 2) More detailed information about extracted events and relationships, and 3) More reliable and informative entity networks that acknowledge the different attributes of entity-entity relationships.
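
    The paper's own features, corpora, and models are not described here; purely as an illustration of what attribute labeling of event mentions (past, planned, recurring, potential) looks like, the sketch below trains a bag-of-n-grams classifier on a handful of invented snippets using scikit-learn.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      # Tiny invented training set: event mentions labeled with an attribute.
      texts = [
          "The factory was attacked last Tuesday.",          # past factual
          "Militants plan to attack the depot next week.",   # planned
          "Protests occur in the capital every Friday.",     # recurring
          "A strike could disrupt shipments if talks fail.",  # potential
          "The bridge was destroyed in 2014.",
          "The group intends to seize the port.",
          "Patrols are conducted along the border daily.",
          "Flooding may cut the supply route.",
      ]
      labels = ["past", "planned", "recurring", "potential",
                "past", "planned", "recurring", "potential"]

      # Lexical cues (tense, modality, frequency adverbs) are captured crudely by n-grams.
      model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                            LogisticRegression(max_iter=1000))
      model.fit(texts, labels)

      print(model.predict(["Rebels might target the convoy tomorrow."]))  # expect 'potential'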

  14. Electrical substation reliability evaluation with emphasis on evolving interdependence on communication infrastructure.

    SciTech Connect

    Azarm, M. A.; Bari, R. A.; Musicki, Z.

    2004-01-15

    The objective of this study is to develop a methodology for a probabilistic assessment of the reliability and security of electrical energy distribution networks. This includes consideration of the future grid system, which will rely heavily on the existing digitally based communication infrastructure for monitoring and protection. Another important objective of this study is to provide information and insights from this research to Consolidated Edison Company (Con Edison) that could be useful in the design of the new network segment to be installed in the area of the World Trade Center in lower Manhattan. Our method is microscopic in nature and relies heavily on the specific design of the portion of the grid being analyzed. It extensively models the types of faults that a grid could potentially experience, the response of the grid, and the specific design of the protection schemes. We demonstrate that the existing technology can be extended and applied to the electrical grid and to the supporting communication network. A small subsection of a hypothetical grid based on the existing New York City electrical grid system of Con Edison is used to demonstrate the methods. Sensitivity studies show that in the current design the frequency for the loss of the main station is sensitive to the communication network reliability. The reliability of the communication network could become a more important contributor to the electrical grid reliability as the utilization of the communication network significantly increases in the near future to support "smart" transmission and/or distributed generation. The identification of potential failure modes and their likelihood can support decisions on potential modifications to the network including hardware, monitoring instrumentation, and protection systems.
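
    As a toy illustration of the kind of sensitivity result described (not the Brookhaven/Con Edison methodology), the following Monte Carlo sketch estimates a loss-of-station frequency as a function of communication-network failure probability, assuming a station is lost only when local protection fails and backup protection cannot be reached over the communication link; all rates and probabilities are invented.

      import numpy as np

      rng = np.random.default_rng(1)

      def loss_of_station_freq(p_comm_fail, years=1_000_000):
          # Toy model: a feeder fault is cleared if local protection works, or if
          # backup protection can be reached over the communication network.
          fault_rate = 0.5           # feeder faults per station-year (invented)
          p_local_prot_fail = 0.02   # local relay fails to clear (invented)
          n_faults = rng.poisson(fault_rate * years)
          local_fail = rng.random(n_faults) < p_local_prot_fail
          comm_fail = rng.random(n_faults) < p_comm_fail
          losses = np.count_nonzero(local_fail & comm_fail)
          return losses / years

      for p_comm in (0.001, 0.01, 0.05, 0.1):
          freq = loss_of_station_freq(p_comm)
          print(f"comm failure prob {p_comm:>5}: loss-of-station freq ~ {freq:.2e} / yr")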

  15. Reliability assurance program and its relationship to other regulations

    SciTech Connect

    Polich, T.J.

    1994-12-31

    The need for a safety-oriented reliability effort for the nuclear industry was identified by the U.S. Nuclear Regulatory Commission (NRC) in the Three Mile Island Action Plan (NUREG-0660) Item II.C.4. In SECY-89-013, "Design Requirements Related to the Evolutionary ALWR," the staff stated that the reliability assurance program (RAP) would be required for design certification to ensure that the design reliability of safety-significant structures, systems, and components (SSCs) is maintained over the life of a plant. In November 1988, the staff informed the advanced light water reactor (ALWR) vendors and the Electric Power Research Institute (EPRI) that it was considering this matter. Since that time, the staff has had numerous interactions with industry regarding RAP. These include discussions and subsequent safety evaluation reports on the EPRI utilities requirements document and for both Evolutionary Designs. The RAP has also been discussed in SECY-93-087, "Policy, Technical, and Licensing Issues Pertaining to Evolutionary and Advanced Light-Water Reactor (ALWR) Designs" and SECY-94-084, "Policy and Technical Issues Associated With the Regulatory Treatment of Non-Safety Systems in Passive Plant Designs."

  16. Generalized storage-reliability-yield relationships for rainwater harvesting systems

    NASA Astrophysics Data System (ADS)

    Hanson, L. S.; Vogel, R. M.

    2014-07-01

    Sizing storage for rainwater harvesting (RWH) systems is often a difficult design consideration, as the system must be designed specifically for the local rainfall pattern. We introduce a generally applicable method for estimating the required storage by using regional regression equations to account for climatic differences in the behavior of RWH systems across the entire continental United States. A series of simulations for 231 locations with continuous daily precipitation records enable the development of storage-reliability-yield (SRY) relations at four useful reliabilities, 0.8, 0.9, 0.95, and 0.98. Multivariate, log-linear regression results in storage equations that include demand, collection area and local precipitation statistics. The continental regression equations demonstrated excellent goodness-of-fit (R² = 0.96-0.99) using only two precipitation parameters, and fits improved when three geographic regions with more homogeneous rainfall characteristics were considered. The SRY models can be used to obtain a preliminary estimate of how large to build a storage tank almost anywhere in the United States based on desired yield and reliability, collection area, and local rainfall statistics. Our methodology could be extended to other regions of the world, and the equations presented herein could be used to investigate how RWH systems would respond to changes in climatic variability. The resulting model may also prove useful in regional planning studies to evaluate the net benefits which result from the broad use of RWH to meet water supply requirements. We outline numerous other possible extensions to our work, which, when taken together, illustrate the value of our initial generalized SRY model for RWH systems.
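
    A minimal sketch of how such storage-reliability-yield relations can be derived: simulate a daily tank water balance for a given collection area and demand, find the smallest storage that meets a target reliability, and (over many climates) regress the result on demand, area, and precipitation statistics. The Python sketch below, with synthetic rainfall standing in for the 231-station data, covers only the simulation step.

      import numpy as np

      rng = np.random.default_rng(2)

      def required_storage(daily_rain_mm, area_m2, demand_L, reliability_target):
          # Smallest tank (litres) whose daily balance meets demand on at least
          # `reliability_target` of days (yield-after-spillage bookkeeping).
          inflow = daily_rain_mm * area_m2        # 1 mm on 1 m^2 = 1 L
          for capacity in np.arange(500, 100_000, 500):
              storage, met = 0.0, 0
              for q in inflow:
                  storage = min(storage + q, capacity)   # add rain, spill the excess
                  draw = min(demand_L, storage)
                  storage -= draw
                  met += draw >= demand_L
              if met / len(inflow) >= reliability_target:
                  return capacity
          return float("nan")

      # Synthetic 10-year daily rainfall (mm): mostly dry days, occasional storms.
      rain = rng.gamma(shape=0.3, scale=12.0, size=3650)
      for rel in (0.8, 0.9, 0.95, 0.98):
          S = required_storage(rain, area_m2=50.0, demand_L=120.0, reliability_target=rel)
          print(f"reliability {rel:.2f}: required storage ~ {S:,.0f} L")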

  17. Animals Used in Research and Education, 1966-2016: Evolving Attitudes, Policies, and Relationships.

    PubMed

    Lairmore, Michael D; Ilkiw, Jan

    2015-01-01

    Since the inception of the Association of American Veterinary Medical Colleges (AAVMC), the use of animals in research and education has been a central element of the programs of member institutions. As veterinary education and research programs have evolved over the past 50 years, so too have societal views and regulatory policies. AAVMC member institutions have continually responded to these events by exchanging best practices in training their students in the framework of comparative medicine and the needs of society. Animals provide students and faculty with the tools to learn the fundamental knowledge and skills of veterinary medicine and scientific discovery. The study of animal models has contributed extensively to medicine, veterinary medicine, and basic sciences as these disciplines seek to understand life processes. Changing societal views over the past 50 years have provided active examination and continued refinement of the use of animals in veterinary medical education and research. The future use of animals to educate and train veterinarians will likely continue to evolve as technological advances are applied to experimental design and educational systems. Natural animal models of both human and animal health will undoubtedly continue to serve a significant role in the education of veterinarians and in the development of new treatments of animal and human disease. As it looks to the future, the AAVMC as an organization will need to continue to support and promote best practices in the humane care and appropriate use of animals in both education and research.

  18. Animals Used in Research and Education, 1966-2016: Evolving Attitudes, Policies, and Relationships.

    PubMed

    Lairmore, Michael D; Ilkiw, Jan

    2015-01-01

    Since the inception of the Association of American Veterinary Medical Colleges (AAVMC), the use of animals in research and education has been a central element of the programs of member institutions. As veterinary education and research programs have evolved over the past 50 years, so too have societal views and regulatory policies. AAVMC member institutions have continually responded to these events by exchanging best practices in training their students in the framework of comparative medicine and the needs of society. Animals provide students and faculty with the tools to learn the fundamental knowledge and skills of veterinary medicine and scientific discovery. The study of animal models has contributed extensively to medicine, veterinary medicine, and basic sciences as these disciplines seek to understand life processes. Changing societal views over the past 50 years have provided active examination and continued refinement of the use of animals in veterinary medical education and research. The future use of animals to educate and train veterinarians will likely continue to evolve as technological advances are applied to experimental design and educational systems. Natural animal models of both human and animal health will undoubtedly continue to serve a significant role in the education of veterinarians and in the development of new treatments of animal and human disease. As it looks to the future, the AAVMC as an organization will need to continue to support and promote best practices in the humane care and appropriate use of animals in both education and research. PMID:26673210

  19. On the Relationships between Generative Encodings, Regularity, and Learning Abilities when Evolving Plastic Artificial Neural Networks

    PubMed Central

    Tonelli, Paul; Mouret, Jean-Baptiste

    2013-01-01

    A major goal of bio-inspired artificial intelligence is to design artificial neural networks with abilities that resemble those of animal nervous systems. It is commonly believed that two keys for evolving nature-like artificial neural networks are (1) the developmental process that links genes to nervous systems, which enables the evolution of large, regular neural networks, and (2) synaptic plasticity, which allows neural networks to change during their lifetime. So far, these two topics have been mainly studied separately. The present paper shows that they are actually deeply connected. Using a simple operant conditioning task and a classic evolutionary algorithm, we compare three ways to encode plastic neural networks: a direct encoding, a developmental encoding inspired by computational neuroscience models, and a developmental encoding inspired by morphogen gradients (similar to HyperNEAT). Our results suggest that using a developmental encoding could improve the learning abilities of evolved, plastic neural networks. Complementary experiments reveal that this result is likely the consequence of the bias of developmental encodings towards regular structures: (1) in our experimental setup, encodings that tend to produce more regular networks yield networks with better general learning abilities; (2) whatever the encoding is, networks that are the more regular are statistically those that have the best learning abilities. PMID:24236099
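
    For readers unfamiliar with what synaptic plasticity during the network's "lifetime" means operationally, the fragment below shows a single plastic unit whose weight changes during an operant-conditioning-like task via a reward-modulated, Hebbian-like three-factor rule; it is a generic illustration, not the encodings or tasks used in the paper.

      import numpy as np

      rng = np.random.default_rng(3)

      # A tiny plastic unit: 2 inputs -> 1 output; weights change during the
      # "lifetime" via a reward-modulated rule (illustrative assumption).
      w = rng.normal(0, 0.1, size=2)
      eta = 0.2

      def act(x):                       # sigmoid output unit
          return 1.0 / (1.0 + np.exp(-x))

      # Operant-conditioning-like task: stimulus 0 should be "pressed" (output high),
      # stimulus 1 should be ignored (output low); reward is +1/-1 accordingly.
      stimuli = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
      targets = [1.0, 0.0]

      for trial in range(300):
          i = rng.integers(0, 2)
          x = stimuli[i]
          y = act(w @ x)                        # firing probability of the output
          action = float(rng.random() < y)      # stochastic "lever press"
          reward = 1.0 if action == targets[i] else -1.0
          w += eta * reward * (action - y) * x  # pre * post-deviation * reward

      for x, t in zip(stimuli, targets):
          print(f"stimulus {x} -> output {act(w @ x):.2f} (target {t})")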

  20. On the relationships between generative encodings, regularity, and learning abilities when evolving plastic artificial neural networks.

    PubMed

    Tonelli, Paul; Mouret, Jean-Baptiste

    2013-01-01

    A major goal of bio-inspired artificial intelligence is to design artificial neural networks with abilities that resemble those of animal nervous systems. It is commonly believed that two keys for evolving nature-like artificial neural networks are (1) the developmental process that links genes to nervous systems, which enables the evolution of large, regular neural networks, and (2) synaptic plasticity, which allows neural networks to change during their lifetime. So far, these two topics have been mainly studied separately. The present paper shows that they are actually deeply connected. Using a simple operant conditioning task and a classic evolutionary algorithm, we compare three ways to encode plastic neural networks: a direct encoding, a developmental encoding inspired by computational neuroscience models, and a developmental encoding inspired by morphogen gradients (similar to HyperNEAT). Our results suggest that using a developmental encoding could improve the learning abilities of evolved, plastic neural networks. Complementary experiments reveal that this result is likely the consequence of the bias of developmental encodings towards regular structures: (1) in our experimental setup, encodings that tend to produce more regular networks yield networks with better general learning abilities; (2) whatever the encoding is, networks that are the more regular are statistically those that have the best learning abilities.

  1. The Relationship Quality Interview: evidence of reliability, convergent and divergent validity, and incremental utility.

    PubMed

    Lawrence, Erika; Barry, Robin A; Brock, Rebecca L; Bunde, Mali; Langer, Amie; Ro, Eunyoe; Fazio, Emily; Mulryan, Lorin; Hunt, Sara; Madsen, Lisa; Dzankovic, Sandra

    2011-03-01

    Relationship satisfaction and adjustment have been the target outcome variables for almost all couple research and therapies. In contrast, far less attention has been paid to the assessment of relationship quality. The present study introduces the Relationship Quality Interview (RQI), a semistructured, behaviorally anchored individual interview. The RQI was designed to provide a more objective assessment of relationship quality as a dynamic, dyadic construct across 5 dimensions: (a) quality of emotional intimacy in the relationship, (b) quality of the couple's sexual relationship, (c) quality of support transactions in the relationship, (d) quality of the couple's ability to share power in the relationship, and (e) quality of conflict/problem-solving interactions in the relationship. Psychometric properties of RQI ratings were examined through scores obtained from self-report questionnaires and behavioral observation data collected cross-sectionally from a sample of 91 dating participants and longitudinally from a sample of 101 married couples. RQI ratings demonstrated strong reliability (internal consistency, interrater agreement, interpartner agreement, and correlations among scales), convergent validity (correlations between RQI scale ratings and questionnaire scores assessing similar domains of relationship quality), and divergent validity (correlations between RQI scale ratings and (a) behavioral observation codes assessing related constructs, (b) global relationship satisfaction scores, and (c) scores on individual difference measures of related constructs). Clinical implications of the RQI for improving couple assessment and interventions are discussed.

  2. Evolving Relationship Structures in Multi-sourcing Arrangements: The Case of Mission Critical Outsourcing

    NASA Astrophysics Data System (ADS)

    Heitlager, Ilja; Helms, Remko; Brinkkemper, Sjaak

    Information Technology Outsourcing practice and research mainly considers the outsourcing phenomenon as a generic fulfilment of the IT function by external parties. Inspired by the logic of commodity, core competencies and economies of scale; assets, existing departments and IT functions are transferred to external parties. Although the generic approach might work for desktop outsourcing, where standardisation is the dominant factor, it does not work for the management of mission critical applications. Managing mission critical applications requires a different approach where building relationships is critical. The relationships involve inter and intra organisational parties in a multi-sourcing arrangement, called an IT service chain, consisting of multiple (specialist) parties that have to collaborate closely to deliver high quality services.

  3. The relationship between unstandardized and standardized alpha, true reliability, and the underlying measurement model.

    PubMed

    Falk, Carl F; Savalei, Victoria

    2011-01-01

    Popular computer programs print 2 versions of Cronbach's alpha: unstandardized alpha, α(Σ), based on the covariance matrix, and standardized alpha, α(R), based on the correlation matrix. Sources that accurately describe the theoretical distinction between the 2 coefficients are lacking, which can lead to the misconception that the differences between α(R) and α(Σ) are unimportant and to the temptation to report the larger coefficient. We explore the relationship between α(R) and α(Σ) and the reliability of the standardized and unstandardized composite under 3 popular measurement models; we clarify the theoretical meaning of each coefficient and conclude that researchers should choose an appropriate reliability coefficient based on theoretical considerations. We also illustrate that α(R) and α(Σ) estimate the reliability of different composite scores, and in most cases cannot be substituted for one another. PMID:21859284
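
    The two coefficients contrasted here can be computed directly from the item covariance and correlation matrices; the sketch below does so for simulated item scores with deliberately unequal item variances, which is exactly the situation in which α(Σ) and α(R) diverge. The data are invented.

      import numpy as np

      rng = np.random.default_rng(4)

      # Simulated scores for 5 items (one common factor plus noise, unequal scales).
      n, k = 300, 5
      factor = rng.normal(size=(n, 1))
      loadings = np.array([0.9, 0.8, 0.7, 0.6, 0.5])
      scales = np.array([1.0, 2.0, 0.5, 3.0, 1.5])
      X = scales * (factor * loadings + rng.normal(scale=0.6, size=(n, k)))

      def alpha(matrix):
          # Cronbach's alpha from a k x k covariance or correlation matrix.
          k = matrix.shape[0]
          return (k / (k - 1)) * (1 - np.trace(matrix) / matrix.sum())

      alpha_sigma = alpha(np.cov(X, rowvar=False))     # unstandardized: covariance matrix
      alpha_r = alpha(np.corrcoef(X, rowvar=False))    # standardized: correlation matrix
      print(f"alpha(unstandardized) = {alpha_sigma:.3f}, alpha(standardized) = {alpha_r:.3f}")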

  4. Assessing the Complex and Evolving Relationship between Charges and Payments in US Hospitals: 1996 – 2012

    PubMed Central

    Bulchis, Anne G.; Lomsadze, Liya; Joseph, Jonathan; Baral, Ranju; Bui, Anthony L.; Horst, Cody; Johnson, Elizabeth; Dieleman, Joseph L.

    2016-01-01

    Background: In 2013 the United States spent $2.9 trillion on health care, more than in any previous year. Much of the debate around slowing health care spending growth focuses on the complicated pricing system for services. Our investigation contributes to knowledge of health care spending by assessing the relationship between charges and payments in the inpatient hospital setting. In the US, charges and payments differ because of a complex set of incentives that connect health care providers and funders. Our methodology can also be applied to adjust charge data to reflect actual spending. Methods: We extracted cause of health care encounter (cause), primary payer (payer), charge, and payment information for 50,172 inpatient hospital stays from 1996 through 2012. We used linear regression to assess the relationship between charges and payments, stratified by payer, year, and cause. We applied our estimates to a large, nationally representative hospital charge sample to estimate payments. Results: The average amount paid per $1 charged varies significantly across three dimensions: payer, year, and cause. Among the 10 largest causes of health care spending, average payments range from 23 to 55 cents per dollar charged. Over time, the amount paid per dollar charged is decreasing for those with private or public insurance, signifying that inpatient charges are increasing faster than the amount insurers pay. Conversely, the amount paid by out-of-pocket payers per dollar charged is increasing over time for several causes. Applying our estimates to a nationally representative hospital charge sample generates payment estimates which align with the official US estimates of inpatient spending. Conclusions: The amount paid per $1 charged fluctuates significantly depending on the cause of a health care encounter and the primary payer. In addition, the amount paid per charge is changing over time. Transparent accounting of hospital spending requires a detailed assessment of the

  5. Visual perspective in autobiographical memories: reliability, consistency, and relationship to objective memory performance.

    PubMed

    Siedlecki, Karen L

    2015-01-01

    Visual perspective in autobiographical memories was examined in terms of reliability, consistency, and relationship to objective memory performance in a sample of 99 individuals. Autobiographical memories may be recalled from two visual perspectives--a field perspective in which individuals experience the memory through their own eyes, or an observer perspective in which individuals experience the memory from the viewpoint of an observer in which they can see themselves. Participants recalled nine word-cued memories that differed in emotional valence (positive, negative and neutral) and rated their memories on 18 scales. Results indicate that visual perspective was the most reliable memory characteristic overall and is consistently related to emotional intensity at the time of recall and amount of emotion experienced during the memory. Visual perspective is unrelated to memory for words, stories, abstract line drawings or faces.

  6. A novel method to detect proteins evolving at correlated rates: identifying new functional relationships between coevolving proteins.

    PubMed

    Clark, Nathaniel L; Aquadro, Charles F

    2010-05-01

    Interacting proteins evolve at correlated rates, possibly as the result of evolutionary pressures shared by functional groups and/or coevolution between interacting proteins. This evolutionary signature can be exploited to learn more about protein networks and to infer functional relationships between proteins on a genome-wide scale. Multiple methods have been introduced that detect correlated evolution using amino acid distances. One assumption made by these methods is that the neutral rate of nucleotide substitution is uniform over time; however, this is unlikely and such rate heterogeneity would adversely affect amino acid distance methods. We explored alternative methods that detect correlated rates using protein-coding nucleotide sequences in order to better estimate the rate of nonsynonymous substitution at each branch (d(N)) normalized by the underlying synonymous substitution rate (d(S)). Our novel likelihood method, which was robust to realistic simulation parameters, was tested on Drosophila nuclear pore proteins, which form a complex with well-documented physical interactions. The method revealed significantly correlated evolution between nuclear pore proteins, where members of a stable subcomplex showed stronger correlations compared with those proteins that interact transiently. Furthermore, our likelihood approach was better able to detect correlated evolution among closely related species than previous methods. Hence, these sequence-based methods are a complementary approach for detecting correlated evolution and could be applied genome-wide to provide candidate protein-protein interactions and functional group assignments using just coding sequences.
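
    The authors' likelihood method is not reproduced here; as a simpler baseline that illustrates the underlying idea, the sketch below correlates branch-specific dN/dS values of two proteins over a shared tree and attaches a permutation p-value. The branch values are invented.

      import numpy as np

      rng = np.random.default_rng(5)

      # Branch-specific dN/dS values for two proteins over the same species tree
      # (invented numbers; the paper estimates these from coding sequences).
      protein_a = np.array([0.12, 0.30, 0.08, 0.25, 0.40, 0.15, 0.22, 0.10])
      protein_b = np.array([0.10, 0.28, 0.09, 0.20, 0.35, 0.18, 0.25, 0.12])

      r_obs = np.corrcoef(protein_a, protein_b)[0, 1]

      # Permutation test: shuffle one protein's branch values for a null distribution.
      perm = np.array([np.corrcoef(protein_a, rng.permutation(protein_b))[0, 1]
                       for _ in range(10_000)])
      p_value = np.mean(perm >= r_obs)
      print(f"observed correlation r = {r_obs:.2f}, permutation p ~ {p_value:.4f}")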

  7. Structural and reliability analysis of quality of relationship index in cancer patients.

    PubMed

    Cousson-Gélie, Florence; de Chalvron, Stéphanie; Zozaya, Carole; Lafaye, Anaïs

    2013-01-01

    Among psychosocial factors affecting emotional adjustment and quality of life, social support is one of the most important and widely studied in cancer patients, but little is known about the perception of support in specific significant relationships in patients with cancer. This study examined the psychometric properties of the Quality of Relationship Inventory (QRI) by evaluating its factor structure and its convergent and discriminant validity in a sample of cancer patients. A total of 388 patients completed the QRI. Convergent validity was evaluated by testing the correlations between the QRI subscales and measures of general social support, anxiety and depression symptoms. Discriminant validity was examined by testing group comparison. The QRI's longitudinal invariance across time was also tested. Principal axis factor analysis with promax rotation identified three factors accounting for 42.99% of variance: perceived social support, depth, and interpersonal conflict. Estimates of reliability with McDonald's ω coefficient were satisfactory for all the QRI subscales (ω ranging from 0.75 - 0.85). Satisfaction from general social support was negatively correlated with the interpersonal conflict subscale and positively with the depth subscale. The interpersonal conflict and social support scales were correlated with depression and anxiety scores. We also found a relative stability of QRI subscales (measured 3 months after the first evaluation) and differences between partner status and gender groups. The Quality of Relationship Inventory is a valid tool for assessing the quality of social support in a particular relationship with cancer patients. PMID:23514252

  8. Palmar Creases: Classification, Reliability and Relationships to Fetal Alcohol Spectrum Disorders (FASD).

    PubMed

    Mattison, Siobhán M; Brunson, Emily K; Holman, Darryl J

    2015-09-01

    A normal human palm contains 3 major creases: the distal transverse crease; the proximal transverse crease; and the thenar crease. Because permanent crease patterns are thought to be laid down during the first trimester, researchers have speculated that deviations in crease patterns could be indicative of insults during fetal development. The purpose of this study was twofold: (1) to compare the efficacy and reliability of two coding methods, the first (M1) classifying both "simiana" and Sydney line variants and the second (M2) counting the total number of crease points of origin on the radial border of the hand; and (2) to ascertain the relationship between palmar crease patterns and fetal alcohol spectrum disorders (FASD). Bilateral palm prints were taken using the carbon paper and tape method from 237 individuals diagnosed with FASD and 190 unexposed controls. All prints were coded for crease variants under M1 and M2. Additionally, a random sample of 98 matched (right and left) prints was selected from the controls to determine the reliabilities of M1 and M2. For this analysis, each palm was read twice, at different times, by two readers. Intra-observer Kappa coefficients were similar under both methods, ranging from 0.804-0.910. Inter-observer Kappa coefficients ranged from 0.582-0.623 under M1 and from 0.647-0.757 under M2. Using data from the entire sample of 427 prints and controlling for sex and ethnicity (white v. non-white), no relationship was found between palmar crease variants and FASD. Our results suggest that palmar creases can be classified reliably, but palmar crease patterns may not be affected by fetal alcohol exposure.
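
    The intra- and inter-observer agreement figures quoted are Cohen's kappa coefficients; the short sketch below computes kappa for two readers' categorical crease codes (the ratings and category labels are invented for illustration).

      import numpy as np

      def cohens_kappa(rater1, rater2):
          # Cohen's kappa for two raters assigning categorical codes to the same prints.
          rater1, rater2 = np.asarray(rater1), np.asarray(rater2)
          categories = np.union1d(rater1, rater2)
          p_observed = np.mean(rater1 == rater2)
          p_chance = sum(np.mean(rater1 == c) * np.mean(rater2 == c) for c in categories)
          return (p_observed - p_chance) / (1 - p_chance)

      # Invented crease codes (e.g., 0 = normal, 1 = "simiana" variant, 2 = Sydney line).
      reader_1 = [0, 0, 1, 2, 0, 1, 0, 2, 0, 0, 1, 0, 2, 0, 0]
      reader_2 = [0, 0, 1, 2, 0, 0, 0, 2, 0, 1, 1, 0, 2, 0, 0]
      print(f"kappa = {cohens_kappa(reader_1, reader_2):.3f}")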

  9. Palmar Creases: Classification, Reliability and Relationships to Fetal Alcohol Spectrum Disorders (FASD).

    PubMed

    Mattison, Siobhán M; Brunson, Emily K; Holman, Darryl J

    2015-09-01

    A normal human palm contains 3 major creases: the distal transverse crease; the proximal transverse crease; and the thenar crease. Because permanent crease patterns are thought to be laid down during the first trimester, researchers have speculated that deviations in crease patterns could be indicative of insults during fetal development. The purpose of this study was twofold: (1) to compare the efficacy and reliability of two coding methods, the first (M1) classifying both "simiana" and Sydney line variants and the second (M2) counting the total number of crease points of origin on the radial border of the hand; and (2) to ascertain the relationship between palmar crease patterns and fetal alcohol spectrum disorders (FASD). Bilateral palm prints were taken using the carbon paper and tape method from 237 individuals diagnosed with FASD and 190 unexposed controls. All prints were coded for crease variants under M1 and M2. Additionally, a random sample of 98 matched (right and left) prints was selected from the controls to determine the reliabilities of M1 and M2. For this analysis, each palm was read twice, at different times, by two readers. Intra-observer Kappa coefficients were similar under both methods, ranging from 0.804-0.910. Inter-observer Kappa coefficients ranged from 0.582-0.623 under M1 and from 0.647-0.757 under M2. Using data from the entire sample of 427 prints and controlling for sex and ethnicity (white v. non-white), no relationship was found between palmar crease variants and FASD. Our results suggest that palmar creases can be classified reliably, but palmar crease patterns may not be affected by fetal alcohol exposure. PMID:26898079

  10. Impact of relationships between test and training animals and among training animals on reliability of genomic prediction.

    PubMed

    Wu, X; Lund, M S; Sun, D; Zhang, Q; Su, G

    2015-10-01

    One of the factors affecting the reliability of genomic prediction is the relationship among the animals of interest. This study investigated the reliability of genomic prediction in various scenarios with regard to the relationship between test and training animals, and among animals within the training data set. Different training data sets were generated from EuroGenomics data and a group of Nordic Holstein bulls (born in 2005 and afterwards) as a common test data set. Genomic breeding values were predicted using a genomic best linear unbiased prediction model and a Bayesian mixture model. The results showed that a closer relationship between test and training animals led to a higher reliability of genomic predictions for the test animals, while a closer relationship among training animals resulted in a lower reliability. In addition, the Bayesian mixture model in general led to a slightly higher reliability of genomic prediction, especially for the scenario of distant relationships between training and test animals. Therefore, to prevent a decrease in reliability, constant updates of the training population with animals from more recent generations are required. Moreover, a training population consisting of less-related animals is favourable for reliability of genomic prediction.
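
    A compressed sketch of the GBLUP part of such an analysis (not the study's exact models): build a VanRaden-style genomic relationship matrix from marker genotypes, predict test animals from training phenotypes, and summarize accuracy as the squared correlation with the simulated true breeding values. Marker data, heritability, and sample sizes below are invented and far smaller than in the study.

      import numpy as np

      rng = np.random.default_rng(6)

      n_train, n_test, n_markers = 200, 20, 500
      M = rng.binomial(2, 0.3, size=(n_train + n_test, n_markers)).astype(float)

      # VanRaden-style genomic relationship matrix from centered genotypes.
      p = M.mean(axis=0) / 2
      Z = M - 2 * p
      G = Z @ Z.T / (2 * np.sum(p * (1 - p)))
      G += np.eye(G.shape[0]) * 1e-3                # stabilize inversion

      # Simulate true breeding values and training phenotypes.
      beta = rng.normal(0, 0.05, size=n_markers)
      tbv = Z @ beta
      y = tbv[:n_train] + rng.normal(0, 1.0, size=n_train)

      # GBLUP: predict genomic breeding values of test animals from training records.
      h2 = np.var(tbv) / (np.var(tbv) + 1.0)        # heritability of the simulated trait
      lam = (1 - h2) / h2
      G_tt = G[:n_train, :n_train]
      G_nt = G[n_train:, :n_train]
      gebv_test = G_nt @ np.linalg.solve(G_tt + lam * np.eye(n_train), y - y.mean())

      # Accuracy summarized simply as squared correlation with the true values.
      r2 = np.corrcoef(gebv_test, tbv[n_train:])[0, 1] ** 2
      print(f"squared correlation between GEBV and true breeding values (test): {r2:.2f}")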

  11. Food Thought Suppression Inventory: Test-retest reliability and relationship to weight loss treatment outcomes.

    PubMed

    Barnes, Rachel D; Ivezaj, Valentina; Grilo, Carlos M

    2016-08-01

    This study examined the test-retest reliability of the Food Thought Suppression Inventory (FTSI) and its relationship with weight loss during weight loss treatment. Participants were 89 adults with and without binge eating disorder (BED) recruited through primary care for weight loss treatment who completed the FTSI twice prior to starting treatment. Intra-class correlations for the FTSI ranged from .74-.93. Participants with BED scored significantly higher on the FTSI than those without BED at baseline only. Percent weight loss from baseline to mid-treatment was significantly negatively correlated with the FTSI at baseline and at post-treatment. Participants reaching 5% loss of original body weight by post-treatment had significantly lower FTSI scores at post assessment when compared to those who did not reach this weight loss goal. While baseline binge-eating episodes were significantly positively correlated with baseline FTSI scores, change in binge-eating episodes during treatment were not significantly related to FTSI scores. The FTSI showed satisfactory one week test-retest reliability. Higher levels of food thought suppression may impair individuals' ability to lose weight while receiving weight loss treatment. PMID:27112114

  12. Food Thought Suppression Inventory: Test-retest reliability and relationship to weight loss treatment outcomes.

    PubMed

    Barnes, Rachel D; Ivezaj, Valentina; Grilo, Carlos M

    2016-08-01

    This study examined the test-retest reliability of the Food Thought Suppression Inventory (FTSI) and its relationship with weight loss during weight loss treatment. Participants were 89 adults with and without binge eating disorder (BED) recruited through primary care for weight loss treatment who completed the FTSI twice prior to starting treatment. Intra-class correlations for the FTSI ranged from .74-.93. Participants with BED scored significantly higher on the FTSI than those without BED at baseline only. Percent weight loss from baseline to mid-treatment was significantly negatively correlated with the FTSI at baseline and at post-treatment. Participants reaching 5% loss of original body weight by post-treatment had significantly lower FTSI scores at post assessment when compared to those who did not reach this weight loss goal. While baseline binge-eating episodes were significantly positively correlated with baseline FTSI scores, change in binge-eating episodes during treatment were not significantly related to FTSI scores. The FTSI showed satisfactory one week test-retest reliability. Higher levels of food thought suppression may impair individuals' ability to lose weight while receiving weight loss treatment.

  13. Establishing a Reliable Depth-Age Relationship for the Denali Ice Core

    NASA Astrophysics Data System (ADS)

    Wake, C. P.; Osterberg, E. C.; Winski, D.; Ferris, D.; Kreutz, K. J.; Introne, D.; Dalton, M.

    2015-12-01

    Reliable climate reconstruction from ice core records requires the development of a reliable depth-age relationship. We have established a sub-annual resolution depth-age relationship for the upper 198 meters of a 208 m ice core recovered in 2013 from Mt. Hunter (3,900 m asl), Denali National Park, central Alaska. The dating of the ice core was accomplished via annual layer counting of glaciochemical time-series combined with identification of reference horizons from volcanic eruptions and atmospheric nuclear weapons testing. Using the continuous ice core melter system at Dartmouth College, sub-seasonal samples have been collected and analyzed for major ions, liquid conductivity, particle size and concentration, and stable isotope ratios. Annual signals are apparent in several of the chemical species measured in the ice core samples. Calcium and magnesium peak in the spring, ammonium peaks in the summer, methanesulfonic acid (MSA) peaks in the autumn, and stable isotopes display a strong seasonal cycle with the most depleted values occurring during the winter. Thin ice layers representing infrequent summertime melt were also used to identify summer layers in the core. Analysis of approximately one meter sections of the core via nondestructive gamma spectrometry over depths from 84 to 124 m identified a strong radioactive cesium-137 peak at 89 m which corresponds to the 1963 layer deposited during extensive atmospheric nuclear weapons testing. Peaks in the sulfate and chloride record have been used for the preliminary identification of volcanic signals preserved in the ice core, including ten events since 1883. We are confident that the combination of robust annual layers combined with reference horizons provides a timescale for the 20th century that has an error of less than 0.5 years, making calibrations between ice core records and the instrumental climate data particularly robust. Initial annual layer counting through the entire 198 m suggests the Denali Ice

  14. Fold and fabric relationships in temporally and spatially evolving slump systems: A multi-cell flow model

    NASA Astrophysics Data System (ADS)

    Alsop, G. Ian; Marco, Shmuel

    2014-06-01

    Folds generated in ductile metamorphic terranes and within unlithified sediments affected by slumping are geometrically identical to one another, and distinguishing the origin of such folds in ancient lithified rocks is therefore challenging. Foliation is observed to lie broadly parallel to the axial planes of tectonic folds, whilst it is frequently regarded as absent in slump folds. The presence of foliation is therefore often considered as a reliable criterion for distinguishing tectonic folds from those created during slumping. To test this assertion, we have examined a series of well exposed slump folds within the late Pleistocene Lisan Formation of the Dead Sea Basin. These slumps contain a number of different foliation types, including an axial-planar grain-shape fabric and a crenulation cleavage formed via microfolding of bedding laminae. Folds also contain a spaced disjunctive foliation characterised by extensional displacements across shear fractures. This spaced foliation fans around recumbent fold hinges, with kinematics reversing across the axial plane indicating a flexural shear fold mechanism. Overall, the spaced foliation is penecontemporaneous with each individual slump where it occurs, although in detail it is pre, syn or post the local folds. The identification of foliations within undoubted slump folds indicates that the presence or absence of foliation is not in itself a robust criterion to distinguish tectonic from soft-sediment folds. Extensional shear fractures displaying a range of temporal relationships with slump folds suggests that traditional single-cell flow models, where extension is focussed at the head and contraction in the lower toe of the slump, are a gross simplification. We therefore propose a new multi-cell flow model involving coeval second-order flow cells that interact with neighbouring cells during translation of the slump.

  15. Study on Precipitation Anomalies of North of China in April and Its relationship to Sea Surface Temperature Evolvement

    NASA Astrophysics Data System (ADS)

    Song, Y.; Li, Z.; Guan, Y.

    2012-04-01

    Using monthly precipitation data for North China for 1960-2007, NCEP/NCAR monthly reanalysis data, NOAA SST (sea surface temperature) data, and the SST indices from the Climate System Monitoring Bulletin compiled by the National Climate Center, this paper studied the general circulation and large-scale weather-system anomalies, together with the SST anomaly (SSTA) evolution, associated with above-normal April rainfall in North China. The results showed that precipitation differences between the spring months in North China were pronounced, and the correlations between April precipitation and that in March and in May were not significant. The linear trend of April precipitation was out of phase with that of spring as a whole, so it is meaningful to study April precipitation on its own. The spatial pattern of the first leading EOF mode of April precipitation in North China indicated that rainfall varies synchronously across the region. Years with more April rainfall showed a negative phase of the EU pattern in the 500 hPa geopotential height field at high latitudes of the Northern Hemisphere, and North China lay where cold and warm air masses met, which favored strengthened southerly winds and ascending motion. Zonal circulation prevailed in the middle and high latitudes, and North China was controlled by a warm ridge and a zonally oriented large-scale frontal zone; in years of less rainfall, meridional circulation prevailed, the large-scale frontal zone lay further north with a meridional orientation, and North China was affected by cold air masses. At the same time, in wet years water vapor was transported strongly from the Pacific, the South China Sea and southwest China, reaching Northeast China; in years of less rainfall this water vapor transport was quite weak. The rainfall was closely related to sea surface temperature anomalies, especially in the Indian Ocean, the central and eastern Pacific, the central and southern Pacific and the northwest Pacific where there were

  16. An Examination of Coach and Player Relationships According to the Adapted LMX 7 Scale: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Caliskan, Gokhan

    2015-01-01

    The current study aims to test the reliability and validity of the Leader-Member Exchange (LMX 7) scale with regard to coach--player relationships in sports settings. A total of 330 professional soccer players from the Turkish Super League as well as from the First and Second Leagues participated in this study. Factor analyses were performed to…

  17. Merlino-Perkins Father-Daughter Relationship Inventory (MP-FDI): Construction, Reliability, Validity, and Implications for Counseling and Research

    ERIC Educational Resources Information Center

    Merlino Perkins, Rose J.

    2008-01-01

    The Merlino-Perkins Father-Daughter Relationship Inventory, a self-report instrument, assesses women's childhood interactions with supportive, doting, distant, controlling, tyrannical, physically abusive, absent, and seductive fathers. Item and scale development, psychometric findings drawn from factor analyses, reliability assessments, and…

  18. The Relationship Quality Interview: Evidence of Reliability, Convergent and Divergent Validity, and Incremental Utility

    ERIC Educational Resources Information Center

    Lawrence, Erika; Barry, Robin A.; Brock, Rebecca L.; Bunde, Mali; Langer, Amie; Ro, Eunyoe; Fazio, Emily; Mulryan, Lorin; Hunt, Sara; Madsen, Lisa; Dzankovic, Sandra

    2011-01-01

    Relationship satisfaction and adjustment have been the target outcome variables for almost all couple research and therapies. In contrast, far less attention has been paid to the assessment of relationship quality. The present study introduces the Relationship Quality Interview (RQI), a semistructured, behaviorally anchored individual interview.…

  19. Life has Evolved to Evolve

    NASA Astrophysics Data System (ADS)

    Deem, Michael

    2006-03-01

    Concomitant with the evolution of biological diversity must have been the evolution of mechanisms that facilitate evolution, due to the essentially infinite complexity of protein sequence space. We describe how evolvability can be an object of Darwinian selection, emphasizing the collective nature of the process. Rapid or dramatic environmental change leads to selection for greater evolvability. The selective pressure for large-scale genetic moves, such as DNA exchange, becomes increasingly strong as the environmental conditions become more uncertain. These results demonstrate that evolvability is a selectable trait and allow for the explanation of a large body of experimental results. Many observations within evolutionary biology, heretofore considered evolutionary happenstance or accidents, are explained by selection for evolvability. As specific examples, we discuss evolution within the immune system and the evolution of drug-resistant microorganisms.

  20. On the Relationship between Maximal Reliability and Maximal Validity of Linear Composites

    ERIC Educational Resources Information Center

    Penev, Spiridon; Raykov, Tenko

    2006-01-01

    A linear combination of a set of measures is often sought as an overall score summarizing subject performance. The weights in this composite can be selected to maximize its reliability or to maximize its validity, and the optimal choice of weights is in general not the same for these two optimality criteria. We explore several relationships…

  1. Reliable Attention Network Scores and Mutually Inhibited Inter-network Relationships Revealed by Mixed Design and Non-orthogonal Method.

    PubMed

    Wang, Yi-Feng; Jing, Xiu-Juan; Liu, Feng; Li, Mei-Ling; Long, Zhi-Liang; Yan, Jin H; Chen, Hua-Fu

    2015-01-01

    The attention system can be divided into alerting, orienting, and executive control networks. The efficiency and independence of attention networks have been widely tested with the attention network test (ANT) and its revised versions. However, many studies have failed to find effects of attention network scores (ANSs) and inter-network relationships (INRs). Moreover, the low reliability of ANSs cannot meet the demands of theoretical and empirical investigations. Two methodological factors (the inter-trial influence in the event-related design and the inter-network interference in orthogonal contrast) may be responsible for the unreliability of ANT. In this study, we combined the mixed design and non-orthogonal method to explore ANSs and directional INRs. With a small number of trials, we obtained reliable and independent ANSs (split-half reliability of alerting: 0.684; orienting: 0.588; and executive control: 0.616), suggesting an individually specific attention system. Furthermore, mutual inhibition was observed when two networks were operated simultaneously, indicating a differentiated but integrated attention system. Overall, the reliable, individually specific ANSs and mutually inhibited INRs provide novel insight into the understanding of the developmental, physiological and pathological mechanisms of attention networks, and can benefit future experimental and clinical investigations of attention using ANT.

  2. Reliable Attention Network Scores and Mutually Inhibited Inter-network Relationships Revealed by Mixed Design and Non-orthogonal Method

    PubMed Central

    Wang, Yi-Feng; Jing, Xiu-Juan; Liu, Feng; Li, Mei-Ling; Long, Zhi-Liang; Yan, Jin H.; Chen, Hua-Fu

    2015-01-01

    The attention system can be divided into alerting, orienting, and executive control networks. The efficiency and independence of attention networks have been widely tested with the attention network test (ANT) and its revised versions. However, many studies have failed to find effects of attention network scores (ANSs) and inter-network relationships (INRs). Moreover, the low reliability of ANSs cannot meet the demands of theoretical and empirical investigations. Two methodological factors (the inter-trial influence in the event-related design and the inter-network interference in orthogonal contrast) may be responsible for the unreliability of ANT. In this study, we combined the mixed design and non-orthogonal method to explore ANSs and directional INRs. With a small number of trials, we obtained reliable and independent ANSs (split-half reliability of alerting: 0.684; orienting: 0.588; and executive control: 0.616), suggesting an individually specific attention system. Furthermore, mutual inhibition was observed when two networks were operated simultaneously, indicating a differentiated but integrated attention system. Overall, the reliable, individually specific ANSs and mutually inhibited INRs provide novel insight into the understanding of the developmental, physiological and pathological mechanisms of attention networks, and can benefit future experimental and clinical investigations of attention using ANT. PMID:25997025
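
    The split-half reliabilities quoted in this record (0.684, 0.588, 0.616) are typically computed by correlating scores from odd and even trials and applying the Spearman-Brown correction. A minimal Python sketch on hypothetical trial-level data (the subject count, trial count and noise levels are invented, not the authors' values):

      import numpy as np

      def split_half_reliability(scores):
          """scores: (n_subjects, n_trials) array of a per-trial network effect."""
          odd = scores[:, 0::2].mean(axis=1)    # mean over odd-numbered trials
          even = scores[:, 1::2].mean(axis=1)   # mean over even-numbered trials
          r = np.corrcoef(odd, even)[0, 1]      # correlation between the two halves
          return 2 * r / (1 + r)                # Spearman-Brown correction

      # Hypothetical data: 40 subjects x 48 trials of an alerting-effect score (ms).
      rng = np.random.default_rng(1)
      stable_effect = rng.normal(30, 10, size=(40, 1))             # per-subject true effect
      trials = stable_effect + rng.normal(0, 25, size=(40, 48))    # trial-level noise
      print(f"split-half reliability: {split_half_reliability(trials):.3f}")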

  3. An Adaptation, Validity and Reliability of the Lifespan Sibling Relationship Scale to the Turkish Adolescents

    ERIC Educational Resources Information Center

    Öz, F. Selda

    2015-01-01

    The purpose of this study is to adapt the Lifespan Sibling Relationship Scale (LSRS), developed by Riggio (2000), to Turkish. The original English form of the scale consists of 48 items in total. The original scale was translated into Turkish by three instructors who are proficient in both the field and the language. Later, the original and…

  4. Relationship between evolving epileptiform activity and delayed loss of mitochondrial activity after asphyxia measured by near-infrared spectroscopy in preterm fetal sheep

    PubMed Central

    Bennet, L; Roelfsema, V; Pathipati, P; Quaedackers, J S; Gunn, A J

    2006-01-01

    Early onset cerebral hypoperfusion after birth is highly correlated with neurological injury in premature infants, but the relationship with the evolution of injury remains unclear. We studied changes in cerebral oxygenation and cytochrome oxidase (CytOx) using near-infrared spectroscopy in preterm fetal sheep (103–104 days of gestation, term is 147 days) during recovery from a profound asphyxial insult (n = 7) that we have shown produces severe subcortical injury, or sham asphyxia (n = 7). From 1 h after asphyxia there was a significant secondary fall in carotid blood flow (P < 0.001), and total cerebral blood volume, as reflected by total haemoglobin (P < 0.005), which only partially recovered after 72 h. Intracerebral oxygenation (difference between oxygenated and deoxygenated haemoglobin concentrations) fell transiently at 3 and 4 h after asphyxia (P < 0.01), followed by a substantial increase to well over sham control levels (P < 0.001). CytOx levels were normal in the first hour after occlusion, were greater than sham control values at 2–3 h (P < 0.05), but then progressively fell and became significantly suppressed from 10 h onward (P < 0.01). In the early hours after reperfusion the fetal EEG was highly suppressed, with a superimposed mixture of fast and slow epileptiform transients; overt seizures developed from 8 ± 0.5 h. These data strongly indicate that severe asphyxia leads to delayed, evolving loss of mitochondrial oxidative metabolism, accompanied by late seizures and relative luxury perfusion. In contrast, the combination of relative cerebral deoxygenation with evolving epileptiform transients in the early recovery phase raises the possibility that these early events accelerate or worsen the subsequent mitochondrial failure. PMID:16484298

  5. Relationship between evolving epileptiform activity and delayed loss of mitochondrial activity after asphyxia measured by near-infrared spectroscopy in preterm fetal sheep.

    PubMed

    Bennet, L; Roelfsema, V; Pathipati, P; Quaedackers, J S; Gunn, A J

    2006-04-01

    Early onset cerebral hypoperfusion after birth is highly correlated with neurological injury in premature infants, but the relationship with the evolution of injury remains unclear. We studied changes in cerebral oxygenation and cytochrome oxidase (CytOx) using near-infrared spectroscopy in preterm fetal sheep (103-104 days of gestation, term is 147 days) during recovery from a profound asphyxial insult (n= 7) that we have shown produces severe subcortical injury, or sham asphyxia (n= 7). From 1 h after asphyxia there was a significant secondary fall in carotid blood flow (P < 0.001), and total cerebral blood volume, as reflected by total haemoglobin (P < 0.005), which only partially recovered after 72 h. Intracerebral oxygenation (difference between oxygenated and deoxygenated haemoglobin concentrations) fell transiently at 3 and 4 h after asphyxia (P < 0.01), followed by a substantial increase to well over sham control levels (P < 0.001). CytOx levels were normal in the first hour after occlusion, were greater than sham control values at 2-3 h (P < 0.05), but then progressively fell and became significantly suppressed from 10 h onward (P < 0.01). In the early hours after reperfusion the fetal EEG was highly suppressed, with a superimposed mixture of fast and slow epileptiform transients; overt seizures developed from 8 +/- 0.5 h. These data strongly indicate that severe asphyxia leads to delayed, evolving loss of mitochondrial oxidative metabolism, accompanied by late seizures and relative luxury perfusion. In contrast, the combination of relative cerebral deoxygenation with evolving epileptiform transients in the early recovery phase raises the possibility that these early events accelerate or worsen the subsequent mitochondrial failure.

  6. Blood pressure and circulatory relationships with physical activity level in young normotensive individuals: IPAQ validity and reliability considerations.

    PubMed

    Alomari, Mahmoud A; Keewan, Esraa F; Qhatan, Redha; Amer, Ahmed; Khabour, Omar F; Maayah, Mikhled F; Hurtig-Wennlöf, Anita

    2011-01-01

    Physical activity (PA) reduces risk of cardiovascular diseases, including hypertension. However, the international physical activity questionnaire (IPAQ) relationships with blood pressure (BP) and flow (BF) and vascular resistance (VR) in healthy young individuals have not been studied. Therefore, BP, BF, and VR relationships with the IPAQ were evaluated in normotensive college students (18-23 yrs). Additionally, the IPAQ relationships with body fat (%BF), muscle mass (MM), body mass index (BMI), waist/hip (W/H) ratio, maximum walking distance in 6 min (6MWD), and handgrip strength (MHG) were examined to evaluate the questionnaire validity against fitness. Subsequently, the IPAQ was administered three times to examine its reliability. Walking, moderate, and total PAs correlated negatively with systolic blood pressure (SBP), diastolic blood pressure (DBP), and mean arterial pressure (MAP) (range: r = -0.3 to -0.5, p < 0.05). Additionally, all BP measures were greater in the least physically active individuals. In a subgroup of 42 students, IPAQ sitting time correlated with BF (r = -0.3) and VR (r = 0.4). The intraclass correlation coefficients (ICC) for walking, moderate, vigorous, and total PAs and sitting time/week were 0.97, 0.96, 0.97, 0.97, and 0.96, respectively. The males scored greater vigorous PA (p = 0.001) than the females, while moderate, walking, and total PAs were the same (p > 0.05). Additionally, vigorous PA correlated with %BF (r = -0.2), MM (r = 0.3), MHG (r = 0.3), and 6MWD (r = 0.3), and total PA correlated with MM (r = 0.2), MHG (r = 0.2), and 6MWD (r = 0.3). The IPAQ association with the circulatory measures demonstrates PA importance for controlling BP and adds clinical value to the IPAQ. Additionally, the IPAQ is reliable, can discriminate between populations, and is reasonably valid against health-related fitness.

  7. The Inventory of Teacher-Student Relationships: Factor Structure, Reliability, and Validity among African American Youth in Low-Income Urban Schools

    ERIC Educational Resources Information Center

    Murray, Christopher; Zvoch, Keith

    2011-01-01

    This study investigates the factor structure, reliability, and validity of the Inventory of Teacher-Student Relationships (IT-SR), a measure that was developed by adapting the widely used Inventory of Parent and Peer Attachments (Armsden & Greenberg, 1987) for use in the context of teacher-student relationships. The instrument was field tested…

  8. [Relationship between hope and subjective well-being: reliability and validity of the dispositional Hope Scale, Japanese version].

    PubMed

    Kato, Tsukasa; Snyder, C R

    2005-08-01

    We conducted three studies to translate the Snyder Hope Scale into Japanese, examine the reliability and validity of the Japanese version, and investigate the relationship between the tendency to be hopeful and subjective well-being. In Study 1, confirmatory factor analysis was performed on the Japanese version of the Hope Scale, which comprises two factors: agency and pathways. Its test-retest reliability coefficients for the data from 113 undergraduates ranged from .81 to .84. In Study 2, the concurrent validity of the Japanese version of the Hope Scale was examined with data from 550 respondents by looking at the correlations between hope and optimism, self-esteem, and self-efficacy. Results suggested that the Japanese version had high validity. In addition, the tendency to be hopeful had negative correlations with stress response, hopelessness, depressive tendency, and trait anxiety, and a positive correlation with feelings of happiness. In Study 3, 175 undergraduates completed the Hope Scale and the State-Trait Anxiety Inventory (STAI) immediately prior to final examinations. Results of regression analysis suggested that the tendency to be hopeful moderated examination anxiety. Taken together, the results of the studies supported the hypothesis that hope had positive effects on subjective well-being.

  9. Relative and absolute reliability of a modified agility T-test and its relationship with vertical jump and straight sprint.

    PubMed

    Sassi, Radhouane Haj; Dardouri, Wajdi; Yahmed, Mohamed Haj; Gmada, Nabil; Mahfoudhi, Mohamed Elhedi; Gharbi, Zied

    2009-09-01

    The aims of this study were to evaluate the reliability of a modified agility T-test (MAT) and to examine its relationship to the free countermovement jump (FCMJ) and the 10-m straight sprint (10mSS). In this new version, we preserved the same nature of displacement of the T-test but reduced the total distance covered. A total of 86 subjects (34 women: age = 22.6 +/- 1.4 years; weight = 63.7 +/- 10.2 kg; height = 1.65 +/- 0.05 m; body mass index = 23.3 +/- 3.3 kg x m(-2) and 52 men: age = 22.4 +/- 1.5 years; weight = 68.7 +/- 8.0 kg; height = 1.77 +/- 0.06 m; body mass index = 22.0 +/- 2.0 kg x m(-2)) performed MAT, T-test, FCMJ, and 10mSS. Our results showed no difference between test-retest MAT scores. Intraclass reliability of the MAT was greater than 0.90 across the trials (0.92 and 0.95 for women and men, respectively). The mean difference (bias) +/- the 95% limits of agreement was 0.03 +/- 0.37 seconds for women and 0.03 +/- 0.33 seconds for men. MAT was correlated with the T-test (r = 0.79, p < 0.001 and r = 0.75, p < 0.001 for women and men, respectively). Significant correlations were found between both MAT and FCMJ, and MAT and 10mSS for women (r = -0.47, p < 0.01 and r = 0.34, p < 0.05, respectively). No significant correlations were found between MAT and any of the other tests for men. These results indicate that MAT is a reliable test to assess agility. The weak relationship between MAT and strength and straight speed suggests that agility requires other determinants of performance, such as coordination. Considering that field sports generally include sprints with changes of direction over short distances, MAT seems to be more specific than the T-test when assessing agility.
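
    The reliability statistics reported above (an intraclass correlation above 0.90 and a bias of 0.03 s with 95% limits of agreement of roughly +/- 0.35 s) follow standard test-retest formulas. The Python sketch below shows the usual bias and limits-of-agreement calculation on hypothetical paired MAT times; the sample size and timing values are invented, not the study's data:

      import numpy as np

      def bias_and_limits(trial1, trial2):
          """Mean difference and Bland-Altman 95% limits of agreement."""
          diff = trial2 - trial1
          return diff.mean(), 1.96 * diff.std(ddof=1)

      # Hypothetical MAT times (seconds) for 30 athletes on two occasions.
      rng = np.random.default_rng(2)
      t1 = rng.normal(6.0, 0.4, 30)
      t2 = t1 + rng.normal(0.03, 0.17, 30)   # small systematic and random differences

      bias, half_width = bias_and_limits(t1, t2)
      r = np.corrcoef(t1, t2)[0, 1]          # simple test-retest correlation
      print(f"bias = {bias:.2f} s, limits of agreement = +/- {half_width:.2f} s, r = {r:.2f}")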

  10. How reliable are randomised controlled trials for studying the relationship between diet and disease? A narrative review.

    PubMed

    Temple, Norman J

    2016-08-01

    Large numbers of randomised controlled trials (RCT) have been carried out in order to investigate diet-disease relationships. This article examines eight sets of studies and compares the findings with those from epidemiological studies (cohort studies in seven of the cases). The studies cover the role of dietary factors in blood pressure, body weight, cancer and heart disease. In some cases, the findings from the two types of study are consistent, whereas in other cases the findings appear to be in conflict. A critical evaluation of this evidence suggests factors that may account for conflicting findings. Very often RCT recruit subjects with a history of the disease under study (or at high risk of it) and have a follow-up of only a few weeks or months. Cohort studies, in contrast, typically recruit healthy subjects and have a follow-up of 5-15 years. Owing to these differences, findings from RCT are not necessarily more reliable than those from well-designed prospective cohort studies. We cannot assume that the results of RCT can be freely applied beyond the specific features of the studies. PMID:27267302

  11. Changing and Evolving Relationships between Two- and Four-Year Colleges and Universities: They're Not Your Parents' Community Colleges Anymore

    ERIC Educational Resources Information Center

    Labov, Jay B.

    2012-01-01

    This paper describes a summit on Community Colleges in the Evolving STEM Education Landscape organized by a committee of the National Research Council (NRC) and the National Academy of Engineering (NAE) and held at the Carnegie Institution for Science on December 15, 2011. This summit followed a similar event organized by Dr. Jill Biden, spouse of…

  12. The Assessment of Positivity and Negativity in Social Networks: The Reliability and Validity of the Social Relationships Index

    ERIC Educational Resources Information Center

    Campo, Rebecca A.; Uchino, Bert N.; Holt-Lunstad, Julianne; Vaughn, Allison; Reblin, Maija; Smith, Timothy W.

    2009-01-01

    The Social Relationships Index (SRI) was designed to examine positivity and negativity in social relationships. Unique features of this scale include its brevity and the ability to examine relationship positivity and negativity at the level of the specific individual and social network. The SRI's psychometric properties were examined in three…

  13. Evolvable synthetic neural system

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A. (Inventor)

    2009-01-01

    An evolvable synthetic neural system includes an evolvable neural interface operably coupled to at least one neural basis function. Each neural basis function includes an evolvable neural interface operably coupled to a heuristic neural system to perform high-level functions and an autonomic neural system to perform low-level functions. In some embodiments, the evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy.

  14. [The reliability of reliability].

    PubMed

    Blancas Espejo, A

    1991-01-01

    The author critically analyzes an article by Rodolfo Corona Vazquez that questions the reliability of the preliminary results of the Eleventh Census of Population and Housing, conducted in Mexico in March 1990. The need to define what constitutes "reliability" for preliminary results is stressed. PMID:12317739

  15. Making Reliability Arguments in Classrooms

    ERIC Educational Resources Information Center

    Parkes, Jay; Giron, Tilia

    2006-01-01

    Reliability methodology needs to evolve as validity has done into an argument supported by theory and empirical evidence. Nowhere is the inadequacy of current methods more visible than in classroom assessment. Reliability arguments would also permit additional methodologies for evidencing reliability in classrooms. It would liberalize methodology…

  16. Reliability, Validity, and Associations with Sexual Behavior among Ghanaian Teenagers of Scales Measuring Four Dimensions of Relationships with Parents and Other Adults

    PubMed Central

    Bingenheimer, Jeffrey B.; Asante, Elizabeth; Ahiadeke, Clement

    2013-01-01

    Little research has been done on the social contexts of adolescent sexual behaviors in sub-Saharan Africa. As part of a longitudinal cohort study (N=1275) of teenage girls and boys in two Ghanaian towns, interviewers administered a 26-item questionnaire module intended to assess four dimensions of youth-adult relationships: monitoring, conflict, emotional support, and financial support. Confirmatory factor and traditional psychometric analyses showed the four scales to be reliable. Known-groups comparisons provided evidence of their validity. All four scales had strong bivariate associations with self-reported sexual behavior (odds ratios = 1.66, 0.74, 0.47, and 0.60 for conflict, emotional support, monitoring, and financial support). The instrument is practical for use in sub-Saharan African settings and produces measures that are reliable, valid, and predictive of sexual behavior in youth. PMID:25821286

  17. Revised dyadic adjustment scale as a reliable tool for assessment of quality of marital relationship in patients on long-term hemodialysis.

    PubMed

    Assari, Shervin; Moghani Lankarani, Maryam; Tavallaii, Seyed Abbas

    2009-10-01

    Although the revised dyadic adjustment scale (RDAS) has been widely used as an indicator of the quality of marital relationship, no report is available on the reliability of this measure in patients on hemodialysis. We examined the internal consistency of the RDAS in a group of Iranian patients undergoing maintenance hemodialysis. A translated Persian version of the RDAS was self-administered to 135 patients. The internal consistency of the RDAS was tested using the Cronbach alpha coefficient, which was 0.898, 0.683, 0.779, 0.827, and 0.836 for the RDAS total score and the dyadic consensus, affective expression, dyadic satisfaction, and dyadic cohesion subdomains, respectively. All of the Cronbach alpha scores were higher in patients with higher income and education level. Using the RDAS to examine marital relationship quality in patients on hemodialysis, the total score and almost all subscores except for dyadic consensus had adequate internal consistency.
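
    Cronbach alpha coefficients like those reported for the RDAS are a direct function of the item variances and the variance of the scale total. A minimal Python sketch on hypothetical item-level responses (the 135 x 14 response matrix below is simulated for illustration, not the study's data):

      import numpy as np

      def cronbach_alpha(items):
          """items: (n_respondents, n_items) matrix of item scores."""
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()    # sum of individual item variances
          total_var = items.sum(axis=1).var(ddof=1)     # variance of the scale totals
          return (k / (k - 1)) * (1 - item_var / total_var)

      # Hypothetical 14-item RDAS-like responses from 135 patients on a 0-5 scale.
      rng = np.random.default_rng(3)
      trait = rng.normal(3, 1, size=(135, 1))
      items = np.clip(np.round(trait + rng.normal(0, 0.8, size=(135, 14))), 0, 5)
      print(f"Cronbach alpha: {cronbach_alpha(items):.3f}")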

  18. Evolving Sensitivity Balances Boolean Networks

    PubMed Central

    Luo, Jamie X.; Turner, Matthew S.

    2012-01-01

    We investigate the sensitivity of Boolean Networks (BNs) to mutations. We are interested in Boolean Networks as a model of Gene Regulatory Networks (GRNs). We adopt Ribeiro and Kauffman’s Ergodic Set and use it to study the long-term dynamics of a BN. We define the sensitivity of a BN to be the mean change in its Ergodic Set structure under all possible loss-of-interaction mutations. In silico experiments were used to selectively evolve BNs for sensitivity to losing interactions. We find that maximum sensitivity was often achievable and resulted in the BNs becoming topologically balanced, i.e. they evolve towards network structures in which they have a similar number of inhibitory and excitatory interactions. In terms of the dynamics, the dominant sensitivity strategy that evolved was to build BNs with Ergodic Sets dominated by a single long limit cycle which is easily destabilised by mutations. We discuss the relevance of our findings in the context of Stem Cell Differentiation and propose a relationship between pluripotent stem cells and our evolved sensitive networks. PMID:22586459

  19. Prokaryote and eukaryote evolvability.

    PubMed

    Poole, Anthony M; Phillips, Matthew J; Penny, David

    2003-05-01

    The concept of evolvability covers a broad spectrum of, often contradictory, ideas. At one end of the spectrum it is equivalent to the statement that evolution is possible; at the other end are untestable post hoc explanations, such as the suggestion that current evolutionary theory cannot explain the evolution of evolvability. We examine similarities and differences in eukaryote and prokaryote evolvability, and look for explanations that are compatible with a wide range of observations. Differences in genome organisation between eukaryotes and prokaryotes meet this criterion. The single origin of replication in prokaryote chromosomes (versus multiple origins in eukaryotes) accounts for many differences because the time to replicate a prokaryote genome limits its size (and the accumulation of junk DNA). Both prokaryotes and eukaryotes appear to switch from genetic stability to genetic change in response to stress. We examine a range of stress responses, and discuss how these impact on evolvability, particularly in unicellular organisms versus complex multicellular ones. Evolvability is also limited by environmental interactions (including competition) and we describe a model that places limits on potential evolvability. Examples are given of its application to predator competition and limits to lateral gene transfer. We suggest that unicellular organisms evolve largely through a process of metabolic change, resulting in biochemical diversity. Multicellular organisms evolve largely through morphological changes, not through extensive changes to cellular biochemistry. PMID:12689728

  20. REFLECTIONS ON EVOLVING CHANGE.

    PubMed

    Angood, Peter B

    2016-01-01

    Physician leadership is increasingly recognized as pivotal for improved change in health care. Multi-professional care teams, education and leadership are evolving trends that are important for health care's future. PMID:27295737

  1. On Component Reliability and System Reliability for Space Missions

    NASA Technical Reports Server (NTRS)

    Chen, Yuan; Gillespie, Amanda M.; Monaghan, Mark W.; Sampson, Michael J.; Hodson, Robert F.

    2012-01-01

    This paper addresses the basics and limitations of, and the relationship between, component reliability and system reliability through a study of flight computing architectures and related avionics components for future NASA missions. Component reliability analysis and system reliability analysis need to be evaluated at the same time, and the limitations of each analysis and the relationship between the two analyses need to be understood.
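
    For the simplest architectures the relationship between the two levels of analysis is just the algebra of independent components: a series string multiplies reliabilities, while redundancy changes the combination rule. The short Python illustration below uses hypothetical component values, not figures from the paper:

      # Hypothetical component reliabilities for one mission duration.
      components = {"processor": 0.995, "memory": 0.998, "power": 0.999, "bus": 0.997}

      # Series system: every component must survive, so reliabilities multiply.
      r_series = 1.0
      for r in components.values():
          r_series *= r

      # The same processor duplicated as a parallel (redundant) pair.
      r_processor_pair = 1 - (1 - components["processor"]) ** 2

      print(f"series system reliability: {r_series:.4f}")
      print(f"redundant processor pair:  {r_processor_pair:.6f}")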

  2. Aquaporins and fluid transport: an evolving relationship.

    PubMed

    Fischbarg, J

    2006-01-01

    How epithelia transport fluid is controversial and remains undetermined. Two routes are possible: (1) via cell membranes and their aquaporins, or (2) paracellular. Our laboratory has recently developed experimental evidence and theoretical insights for fluid transport across corneal endothelium, a leaky epithelium. Aquaporin 1 (AQP1) is the only AQP present in these cells, and its deletion in AQP1 null mice significantly affects cell osmotic permeability but not fluid transport, which militates against sizable water movements across the cell. In contrast, cells from AQP1 null mice have reduced regulatory volume decrease (only 60% of control), which suggests an AQP1 role in either the function or the expression of volume-sensitive membrane channels/transporters. Fluid movements can be produced by electrical currents, and the direction of the movement can be reversed by current reversal or by changing junctional electrical charges with polylysine. A mathematical model of corneal endothelium predicts experimental observations only when based on paracellular electro-osmosis. Our novel paradigm for this preparation includes: (1) paracellular fluid flow; (2) a crucial role for the junctions as a site for electro-osmosis; (3) hypotonicity of the primary secretion; (4) an AQP role in regulation and not as a significant water pathway. PMID:17543218

  3. ILZRO-sponsored field data collection and analysis to determine relationships between service conditions and reliability of VRLA batteries in stationary applications

    SciTech Connect

    Taylor, P.A.; Moseley, P.T.; Butler, P.C.

    1998-09-01

    Although valve-regulated lead-acid (VRLA) batteries have served in stationary applications for more than a decade, proprietary concerns of battery manufacturers and users and varying approaches to record-keeping have made the data available on performance and life relatively sparse and inconsistent. Such incomplete data are particularly detrimental to understanding the cause or causes of premature capacity loss (PCL) reported in VRLA batteries after as little as two years of service. The International Lead Zinc Research Organization (ILZRO), in cooperation with Sandia National Laboratories, has initiated a multi-phase project to characterize relationships between batteries, service conditions, and failure modes; establish the degree of correlation between specific operating procedures and PCL; identify operating procedures that mitigate PCL; identify best-fits between the operating requirements of specific applications and the capabilities of specific VRLA technologies; and recommend combinations of battery design, manufacturing processes, and operating conditions that enhance VRLA performance and reliability. This paper, prepared before preliminary conclusions were possible, presents the surveys distributed to manufacturers and end-users; discusses the analytic approach; presents an overview of the responses to the surveys and trends that emerge in the early analysis of the data; and previews the functionality of the database being constructed. The presentation of this paper will include preliminary results and information regarding the follow-on workshop for the study.

  4. Evolvable Neural Software System

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.

  5. Evolving Robust Gene Regulatory Networks

    PubMed Central

    Noman, Nasimul; Monjo, Taku; Moscato, Pablo; Iba, Hitoshi

    2015-01-01

    Design and implementation of robust network modules is essential for construction of complex biological systems through hierarchical assembly of ‘parts’ and ‘devices’. The robustness of gene regulatory networks (GRNs) is ascribed chiefly to the underlying topology. The ability to automatically design GRN topologies that exhibit robust behavior can dramatically change the current practice in synthetic biology. A recent study shows that Darwinian evolution can gradually develop higher topological robustness. Subsequently, this work presents an evolutionary algorithm that simulates natural evolution in silico, for identifying network topologies that are robust to perturbations. We present a Monte Carlo-based method for quantifying topological robustness and designed a fitness approximation approach for efficient calculation of topological robustness, which is otherwise computationally very intensive. The proposed framework was verified using two classic GRN behaviors: oscillation and bistability, although the framework is generalized for evolving other types of responses. The algorithm identified robust GRN architectures which were verified using different analyses and comparisons. Analysis of the results also shed light on the relationship among robustness, cooperativity and complexity. This study also shows that nature has already evolved very robust architectures for its crucial systems; hence simulation of this natural process can be very valuable for designing robust biological systems. PMID:25616055
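
    A Monte Carlo robustness estimate of the kind described here can be sketched by repeatedly perturbing a network and recording how often it returns to its unperturbed attractor. The three-gene Boolean network, update rules and perturbation scheme in the following Python sketch are hypothetical stand-ins, not the GRN models evolved in the paper:

      import random

      # A tiny hypothetical 3-gene Boolean network: A and B mutually repress
      # each other, and C is on only when both A and B are on.
      def step(state):
          a, b, c = state
          return (not b, not a, a and b)

      def attractor(state, max_steps=64):
          """Iterate synchronous updates until a state repeats; return the cycle."""
          seen = []
          while state not in seen and len(seen) < max_steps:
              seen.append(state)
              state = step(state)
          return frozenset(seen[seen.index(state):]) if state in seen else frozenset()

      def robustness(n_trials=10_000, seed=4):
          """Fraction of single-gene flips after which the original attractor is recovered."""
          rng = random.Random(seed)
          target = attractor((True, False, False))
          hits = 0
          for _ in range(n_trials):
              s = list(rng.choice(sorted(target)))   # pick a state on the attractor
              i = rng.randrange(3)
              s[i] = not s[i]                        # flip one gene
              hits += attractor(tuple(s)) == target
          return hits / n_trials

      print(f"estimated robustness: {robustness():.3f}")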

  6. Evolvability is inevitable: increasing evolvability without the pressure to adapt.

    PubMed

    Lehman, Joel; Stanley, Kenneth O

    2013-01-01

    Why evolvability appears to have increased over evolutionary time is an important unresolved biological question. Unlike most candidate explanations, this paper proposes that increasing evolvability can result without any pressure to adapt. The insight is that if evolvability is heritable, then an unbiased drifting process across genotypes can still create a distribution of phenotypes biased towards evolvability, because evolvable organisms diffuse more quickly through the space of possible phenotypes. Furthermore, because phenotypic divergence often correlates with founding niches, niche founders may on average be more evolvable, which through population growth provides a genotypic bias towards evolvability. Interestingly, the combination of these two mechanisms can lead to increasing evolvability without any pressure to out-compete other organisms, as demonstrated through experiments with a series of simulated models. Thus rather than from pressure to adapt, evolvability may inevitably result from any drift through genotypic space combined with evolution's passive tendency to accumulate niches.

  7. Regolith Evolved Gas Analyzer

    NASA Technical Reports Server (NTRS)

    Hoffman, John H.; Hedgecock, Jud; Nienaber, Terry; Cooper, Bonnie; Allen, Carlton; Ming, Doug

    2000-01-01

    The Regolith Evolved Gas Analyzer (REGA) is a high-temperature furnace and mass spectrometer instrument for determining the mineralogical composition and reactivity of soil samples. REGA provides key mineralogical and reactivity data that is needed to understand the soil chemistry of an asteroid, which then aids in determining in-situ which materials should be selected for return to earth. REGA is capable of conducting a number of direct soil measurements that are unique to this instrument. These experimental measurements include: (1) Mass spectrum analysis of evolved gases from soil samples as they are heated from ambient temperature to 900 C; and (2) Identification of liberated chemicals, e.g., water, oxygen, sulfur, chlorine, and fluorine. REGA would be placed on the surface of a near earth asteroid. It is an autonomous instrument that is controlled from earth but does the analysis of regolith materials automatically. The REGA instrument consists of four primary components: (1) a flight-proven mass spectrometer, (2) a high-temperature furnace, (3) a soil handling system, and (4) a microcontroller. An external arm containing a scoop or drill gathers regolith samples. A sample is placed in the inlet orifice where the finest-grained particles are sifted into a metering volume and subsequently moved into a crucible. A movable arm then places the crucible in the furnace. The furnace is closed, thereby sealing the inner volume to collect the evolved gases for analysis. Owing to the very low g forces on an asteroid compared to Mars or the moon, the sample must be moved from inlet to crucible by mechanical means rather than by gravity. As the soil sample is heated through a programmed pattern, the gases evolved at each temperature are passed through a transfer tube to the mass spectrometer for analysis and identification. Return data from the instrument will lead to new insights and discoveries including: (1) Identification of the molecular masses of all of the gases

  8. Relationships

    ERIC Educational Resources Information Center

    Circle, David

    2006-01-01

    The author of this brief article asserts that one of the keys to being successful--whether one is a music teacher, a college professor, a business owner, a doctor, a lawyer, or in any other career--is his or her relationship with people. Music educators are in the people business. They do not make a tangible product. Instead, they provide a…

  9. Reliability training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  10. Our evolving universe

    NASA Astrophysics Data System (ADS)

    Longair, Malcolm S.

    Our Evolving Universe is a lucid, non-technical and infectiously enthusiastic introduction to current astronomy and cosmology. Highly illustrated throughout with the latest colour images from the world's most advanced telescopes, it also provides a colourful view of our Universe. Malcolm Longair takes us on a breathtaking tour of the most dramatic recent results astronomers have on the birth of stars, the hunt for black holes and dark matter, on gravitational lensing and the latest tests of the Big Bang. He leads the reader right up to understand the key questions that future research in astronomy and cosmology must answer. A clear and comprehensive glossary of technical terms is also provided. For the general reader, student or professional wishing to understand the key questions today's astronomers and cosmologists are trying to answer, this is an invaluable and inspiring read.

  11. Evolving synergetic interactions.

    PubMed

    Wu, Bin; Arranz, Jordi; Du, Jinming; Zhou, Da; Traulsen, Arne

    2016-07-01

    Cooperators forgo their own interests to benefit others. This reduces their fitness and thus cooperators are not likely to spread based on natural selection. Nonetheless, cooperation is widespread on every level of biological organization ranging from bacterial communities to human society. Mathematical models can help to explain under which circumstances cooperation evolves. Evolutionary game theory is a powerful mathematical tool to depict the interactions between cooperators and defectors. Classical models typically involve either pairwise interactions between individuals or a linear superposition of these interactions. For interactions within groups, however, synergetic effects may arise: their outcome is not just the sum of its parts. This is because the payoffs via a single group interaction can be different from the sum of any collection of two-player interactions. Assuming that all interactions start from pairs, how can such synergetic multiplayer games emerge from simpler pairwise interactions? Here, we present a mathematical model that captures the transition from pairwise interactions to synergetic multiplayer ones. We assume that different social groups have different breaking rates. We show that non-uniform breaking rates do foster the emergence of synergy, even though individuals always interact in pairs. Our work sheds new light on the mechanisms underlying such synergetic interactions. PMID:27466437

  12. Evolving endoscopic surgery.

    PubMed

    Sakai, Paulo; Faintuch, Joel

    2014-06-01

    Since the days of Albukasim in medieval Spain, natural orifices have been regarded not only as a rather repugnant source of bodily odors, fluids and excreta, but also as a convenient invitation to explore and treat the inner passages of the organism. However, surgical ingenuity needed to be matched by appropriate tools and devices. Lack of technologically advanced instrumentation was a strong deterrent during almost a millennium until recent decades when a quantum jump materialized. Endoscopic surgery is currently a vibrant and growing subspecialty, which successfully handles millions of patients every year. Additional opportunities lie ahead which might benefit millions more, however, requiring even more sophisticated apparatuses, particularly in the field of robotics, artificial intelligence, and tissue repair (surgical suturing). This is a particularly exciting and worthwhile challenge, namely of larger and safer endoscopic interventions, followed by seamless and scarless recovery. In synthesis, the future is widely open for those who use together intelligence and creativity to develop new prototypes, new accessories and new techniques. Yet there are many challenges in the path of endoscopic surgery. In this new era of robotic endoscopy, one will likely need a virtual simulator to train and assess the performance of younger doctors. More evidence will be essential in multiple evolving fields, particularly to elucidate whether more ambitious and complex pathways, such as intrathoracic and intraperitoneal surgery via natural orifice transluminal endoscopic surgery (NOTES), are superior or not to conventional techniques.

  13. Communicability across evolving networks

    NASA Astrophysics Data System (ADS)

    Grindrod, Peter; Parsons, Mark C.; Higham, Desmond J.; Estrada, Ernesto

    2011-04-01

    Many natural and technological applications generate time-ordered sequences of networks, defined over a fixed set of nodes; for example, time-stamped information about “who phoned who” or “who came into contact with who” arise naturally in studies of communication and the spread of disease. Concepts and algorithms for static networks do not immediately carry through to this dynamic setting. For example, suppose A and B interact in the morning, and then B and C interact in the afternoon. Information, or disease, may then pass from A to C, but not vice versa. This subtlety is lost if we simply summarize using the daily aggregate network given by the chain A-B-C. However, using a natural definition of a walk on an evolving network, we show that classic centrality measures from the static setting can be extended in a computationally convenient manner. In particular, communicability indices can be computed to summarize the ability of each node to broadcast and receive information. The computations involve basic operations in linear algebra, and the asymmetry caused by time’s arrow is captured naturally through the noncommutativity of matrix-matrix multiplication. Illustrative examples are given for both synthetic and real-world communication data sets. We also discuss the use of the new centrality measures for real-time monitoring and prediction.
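
    One natural way to implement the walk counting described in this abstract is an ordered product of Katz-style resolvents, one per time step, so that matrix non-commutativity encodes time's arrow. The Python sketch below uses the abstract's A-B morning / B-C afternoon example; the damping parameter a = 0.2 is an illustrative choice (it must be smaller than the reciprocal of the largest spectral radius):

      import numpy as np

      # Time-ordered adjacency matrices over the node set {A, B, C}.
      morning   = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]])   # A and B interact
      afternoon = np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0]])   # B and C interact

      a = 0.2        # damping parameter (illustrative; must be < 1 / max spectral radius)
      I = np.eye(3)

      # Dynamic communicability as an ordered product of Katz-style resolvents;
      # the ordering of the factors is what captures time's arrow.
      Q = np.linalg.inv(I - a * morning) @ np.linalg.inv(I - a * afternoon)

      broadcast = Q.sum(axis=1)   # ability of each node to send information forward in time
      receive = Q.sum(axis=0)     # ability of each node to receive it

      print("Q[A -> C] =", round(Q[0, 2], 4))   # nonzero: A can reach C via B
      print("Q[C -> A] =", round(Q[2, 0], 4))   # zero: C cannot reach A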

  14. Evolving Concepts of Asthma.

    PubMed

    Gauthier, Marc; Ray, Anuradha; Wenzel, Sally E

    2015-09-15

    Our understanding of asthma has evolved over time from a singular disease to a complex of various phenotypes, with varied natural histories, physiologies, and responses to treatment. Early therapies treated most patients with asthma similarly, with bronchodilators and corticosteroids, but these therapies had varying degrees of success. Similarly, despite initial studies that identified an underlying type 2 inflammation in the airways of patients with asthma, biologic therapies targeted toward these type 2 pathways were unsuccessful in all patients. These observations led to increased interest in phenotyping asthma. Clinical approaches, both biased and later unbiased/statistical approaches to large asthma patient cohorts, identified a variety of patient characteristics, but they also consistently identified the importance of age of onset of disease and the presence of eosinophils in determining clinically relevant phenotypes. These paralleled molecular approaches to phenotyping that developed an understanding that not all patients share a type 2 inflammatory pattern. Using biomarkers to select patients with type 2 inflammation, repeated trials of biologics directed toward type 2 cytokine pathways saw newfound success, confirming the importance of phenotyping in asthma. Further research is needed to clarify additional clinical and molecular phenotypes, validate predictive biomarkers, and identify new areas for possible interventions.

  15. Stochastically evolving networks

    NASA Astrophysics Data System (ADS)

    Chan, Derek Y.; Hughes, Barry D.; Leong, Alex S.; Reed, William J.

    2003-12-01

    We discuss a class of models for the evolution of networks in which new nodes are recruited into the network at random times, and links between existing nodes that are not yet directly connected may also form at random times. The class contains both models that produce “small-world” networks and less tightly linked models. We produce both trees, appropriate in certain biological applications, and networks in which closed loops can appear, which model communication networks and networks of human sexual interactions. One of our models is closely related to random recursive trees, and some exact results known in that context can be exploited. The other models are more subtle and difficult to analyze. Our analysis includes a number of exact results for moments, correlations, and distributions of coordination number and network size. We report simulations and also discuss some mean-field approximations. If the system has evolved for a long time and the state of a random node (which thus has a random age) is observed, power-law distributions for properties of the system arise in some of these models.
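
    The random recursive tree mentioned above as the analytically tractable member of this class is easy to simulate: each newly recruited node links to a uniformly chosen existing node. The Python sketch below (network size and seed are arbitrary choices) grows such a tree and reports the resulting coordination-number distribution:

      import random
      from collections import Counter

      def random_recursive_tree(n, seed=5):
          """Grow a tree on n nodes; node k attaches to a uniformly chosen earlier node."""
          rng = random.Random(seed)
          degree = Counter({0: 0})
          for k in range(1, n):
              parent = rng.randrange(k)   # uniform over existing nodes 0..k-1
              degree[parent] += 1
              degree[k] += 1
          return degree

      deg = random_recursive_tree(10_000)
      hist = Counter(deg.values())
      for d in sorted(hist)[:6]:
          print(f"degree {d}: {hist[d] / len(deg):.3f} of nodes")
      # The coordination numbers fall off roughly geometrically (about 2**-d of the
      # nodes have degree d), unlike the power laws of preferential-attachment models.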

  16. Evolving endoscopic surgery.

    PubMed

    Sakai, Paulo; Faintuch, Joel

    2014-06-01

    Since the days of Albukasim in medieval Spain, natural orifices have been regarded not only as a rather repugnant source of bodily odors, fluids and excreta, but also as a convenient invitation to explore and treat the inner passages of the organism. However, surgical ingenuity needed to be matched by appropriate tools and devices. Lack of technologically advanced instrumentation was a strong deterrent during almost a millennium until recent decades when a quantum jump materialized. Endoscopic surgery is currently a vibrant and growing subspecialty, which successfully handles millions of patients every year. Additional opportunities lie ahead which might benefit millions more, however, requiring even more sophisticated apparatuses, particularly in the field of robotics, artificial intelligence, and tissue repair (surgical suturing). This is a particularly exciting and worthwhile challenge, namely of larger and safer endoscopic interventions, followed by seamless and scarless recovery. In synthesis, the future is widely open for those who use together intelligence and creativity to develop new prototypes, new accessories and new techniques. Yet there are many challenges in the path of endoscopic surgery. In this new era of robotic endoscopy, one will likely need a virtual simulator to train and assess the performance of younger doctors. More evidence will be essential in multiple evolving fields, particularly to elucidate whether more ambitious and complex pathways, such as intrathoracic and intraperitoneal surgery via natural orifice transluminal endoscopic surgery (NOTES), are superior or not to conventional techniques. PMID:24628672

  17. Evolving paradigms in pharmacovigilance.

    PubMed

    Brewster, Wendy; Gibbs, Trevor; Lacroix, Karol; Murray, Alison; Tydeman, Michael; Almenoff, June

    2006-05-01

    All medicines have adverse effects as well as benefits. The aim of pharmacovigilance is to protect public health by monitoring medicines to identify and evaluate issues and ensure that the overall benefits outweigh the potential risks. The tools and processes used in pharmacovigilance are continually evolving. Increasingly sophisticated tools are being designed to evaluate safety data from clinical trials to enhance the likelihood of detecting safety signals ahead of product registration. Methods include integration of safety data throughout development, meta-analytical techniques, quantitative and qualitative methods for evaluation of adverse event data and graphical tools to explore laboratory and biometric data. Electronic data capture facilitates monitoring of ongoing studies so that it is possible to promptly identify potential issues and manage patient safety. In addition, GSK employs a number of proactive methods for post-marketing signal detection and knowledge management using state-of-the-art statistical and analytical tools. Using these tools, together with safety data collected through pharmacoepidemiologic studies, literature and spontaneous reporting, potential adverse drug reactions can be better identified in marketed products. In summary, the information outlined in this paper provides a valuable benchmark for risk management and pharmacovigilance in pharmaceutical development.

  18. Evolving synergetic interactions

    PubMed Central

    Wu, Bin; Arranz, Jordi; Du, Jinming; Zhou, Da; Traulsen, Arne

    2016-01-01

    Cooperators forgo their own interests to benefit others. This reduces their fitness and thus cooperators are not likely to spread based on natural selection. Nonetheless, cooperation is widespread on every level of biological organization ranging from bacterial communities to human society. Mathematical models can help to explain under which circumstances cooperation evolves. Evolutionary game theory is a powerful mathematical tool to depict the interactions between cooperators and defectors. Classical models typically involve either pairwise interactions between individuals or a linear superposition of these interactions. For interactions within groups, however, synergetic effects may arise: their outcome is not just the sum of its parts. This is because the payoffs via a single group interaction can be different from the sum of any collection of two-player interactions. Assuming that all interactions start from pairs, how can such synergetic multiplayer games emerge from simpler pairwise interactions? Here, we present a mathematical model that captures the transition from pairwise interactions to synergetic multiplayer ones. We assume that different social groups have different breaking rates. We show that non-uniform breaking rates do foster the emergence of synergy, even though individuals always interact in pairs. Our work sheds new light on the mechanisms underlying such synergetic interactions. PMID:27466437

  19. Photovoltaic performance and reliability workshop

    SciTech Connect

    Mrig, L.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986-1993. PV performance and PV reliability are at least as important as PV cost, if not more so. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange technical knowledge and field experience related to current information in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop, held in September 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  20. A Note on the Relationship between the Number of Indicators and Their Reliability in Detecting Regression Coefficients in Latent Regression Analysis

    ERIC Educational Resources Information Center

    Dolan, Conor V.; Wicherts, Jelte M.; Molenaar, Peter C. M.

    2004-01-01

    We consider the question of how variation in the number and reliability of indicators affects the power to reject the hypothesis that the regression coefficients are zero in latent linear regression analysis. We show that power remains constant as long as the coefficient of determination remains unchanged. Any increase in the number of indicators…

  1. The relationship between the reliability of transistors with 2D AlGaN/GaN channel and organization type of nanomaterial

    NASA Astrophysics Data System (ADS)

    Emtsev, V. V.; Zavarin, E. E.; Oganesyan, G. A.; Petrov, V. N.; Sakharov, A. V.; Shmidt, N. M.; V'yuginov, V. N.; Zybin, A. A.; Parnes, Ya. M.; Vidyakin, S. I.; Gudkov, A. G.; Chernyakov, A. E.

    2016-07-01

    The first experimental results are presented which demonstrate that the carrier mobility in the 2D AlGaN/GaN channel of transistor structures (AlGaN/GaN-HEMT) is correlated both with the manner in which the nanomaterial is organized and with the reliability of the transistor parameters in operation. It is shown that improving the organization of the nanomaterial in AlGaN/GaN-HEMT structures, as evaluated by a multifractal parameter characterizing the extent to which the nanomaterial is disordered (local symmetry breaking), is accompanied by a significant, several-fold increase in the electron mobility in the 2D channel and in the reliability of the parameters of transistors fabricated from these structures.

  2. Hyper massive black holes in evolved galaxies

    NASA Astrophysics Data System (ADS)

    Romero-Cruz, Fernando J.

    2015-09-01

    From the SDSS DR7 we took a sample of 16733 galaxies that do not show all of the emission lines required to classify their activity according to the classical BPT diagram (Baldwin et al. 1981 PASP). Since they do not show these emission lines, they are thought to be evolved enough to host hyper massive black holes. We compared their statistical properties with those of other galaxies from the SDSS DR7 that do show emission lines and confirmed that their M-sigma relationship corresponds to HMBHs (Gutelkin et al. 2009 ApJ) and also that their SFH confirms evolution. We also analyzed them with a new diagnostic diagram in the IR (Coziol et al. 2015 AJ) and found that their position in the IR color space (W3W4 vs W2W3) corresponds to AGN activity with currently low SF, another confirmation of an evolved galaxy. The position of our final sample in the IR diagram falls in the same region as Holm 15A, the galaxy considered to host the most massive BH in the nearby universe (Lopez-Cruz et al. 2014 ApJL). The morphology of these galaxies (all of them are classified as elliptical) confirms that they are very evolved. We claim that hyper massive BHs lie in galaxies that are very evolved, with very low SF and without clear AGN activity in the BPT diagram.

  3. Disgust: Evolved Function and Structure

    ERIC Educational Resources Information Center

    Tybur, Joshua M.; Lieberman, Debra; Kurzban, Robert; DeScioli, Peter

    2013-01-01

    Interest in and research on disgust has surged over the past few decades. The field, however, still lacks a coherent theoretical framework for understanding the evolved function or functions of disgust. Here we present such a framework, emphasizing 2 levels of analysis: that of evolved function and that of information processing. Although there is…

  4. Evolving virtual creatures and catapults.

    PubMed

    Chaumont, Nicolas; Egli, Richard; Adami, Christoph

    2007-01-01

    We present a system that can evolve the morphology and the controller of virtual walking and block-throwing creatures (catapults) using a genetic algorithm. The system is based on Sims' work, implemented as a flexible platform with an off-the-shelf dynamics engine. Experiments aimed at evolving Sims-type walkers resulted in the emergence of various realistic gaits while using fairly simple objective functions. Due to the flexibility of the system, drastically different morphologies and functions evolved with only minor modifications to the system and objective function. For example, various throwing techniques evolved when selecting for catapults that propel a block as far as possible. Among the strategies and morphologies evolved, we find the drop-kick strategy, as well as the systematic invention of the principle behind the wheel, when allowing mutations to the projectile. PMID:17355189

  5. Evolving virtual creatures and catapults.

    PubMed

    Chaumont, Nicolas; Egli, Richard; Adami, Christoph

    2007-01-01

    We present a system that can evolve the morphology and the controller of virtual walking and block-throwing creatures (catapults) using a genetic algorithm. The system is based on Sims' work, implemented as a flexible platform with an off-the-shelf dynamics engine. Experiments aimed at evolving Sims-type walkers resulted in the emergence of various realistic gaits while using fairly simple objective functions. Due to the flexibility of the system, drastically different morphologies and functions evolved with only minor modifications to the system and objective function. For example, various throwing techniques evolved when selecting for catapults that propel a block as far as possible. Among the strategies and morphologies evolved, we find the drop-kick strategy, as well as the systematic invention of the principle behind the wheel, when allowing mutations to the projectile.

  6. Reliability Prediction

    NASA Technical Reports Server (NTRS)

    1993-01-01

    RELAV, a NASA-developed computer program, enables Systems Control Technology, Inc. (SCT) to predict the performance of aircraft subsystems. RELAV provides a system-level evaluation of a technology. Systems, the mechanism of a landing gear for example, are first described as a set of components performing a specific function. RELAV analyzes the total system and the individual subsystem probabilities to predict success probability and reliability. This information is then translated into operational support and maintenance requirements. SCT provides research and development services in support of government contracts.
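
    As a minimal illustration of combining component success probabilities into a system-level figure (in the spirit of the analysis described above, not RELAV's actual algorithm), a hedged Python sketch:

      from math import prod

      def series(reliabilities):
          """All components must succeed: R = product of component reliabilities."""
          return prod(reliabilities)

      def parallel(reliabilities):
          """Any one component suffices: R = 1 - product of failure probabilities."""
          return 1.0 - prod(1.0 - r for r in reliabilities)

      # Hypothetical landing-gear-like chain: redundant actuators in series
      # with a sensor and a controller. Numbers are invented.
      actuators = parallel([0.95, 0.95])
      system = series([actuators, 0.99, 0.998])
      print(round(system, 4))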

  7. Proposed reliability cost model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1973-01-01

    The research investigations which were involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on one hand, and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reenforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach is dependent upon the use of a series of subsystem-oriented CER's and sometimes possible CTR's, in devising a suitable cost-effective policy.

  8. The Skin Cancer and Sun Knowledge (SCSK) Scale: Validity, Reliability, and Relationship to Sun-Related Behaviors among Young Western Adults

    ERIC Educational Resources Information Center

    Day, Ashley K.; Wilson, Carlene; Roberts, Rachel M.; Hutchinson, Amanda D.

    2014-01-01

    Increasing public knowledge remains one of the key aims of skin cancer awareness campaigns, yet diagnosis rates continue to rise. It is essential we measure skin cancer knowledge adequately so as to determine the nature of its relationship to sun-related behaviors. This study investigated the psychometric properties of a new measure of skin cancer…

  9. Spacetimes containing slowly evolving horizons

    SciTech Connect

    Kavanagh, William; Booth, Ivan

    2006-08-15

    Slowly evolving horizons are trapping horizons that are ''almost'' isolated horizons. This paper reviews their definition and discusses several spacetimes containing such structures. These include certain Vaidya and Tolman-Bondi solutions as well as (perturbatively) tidally distorted black holes. Taking into account the mass scales and orders of magnitude that arise in these calculations, we conjecture that slowly evolving horizons are the norm rather than the exception in astrophysical processes that involve stellar-scale black holes.

  10. Natural selection promotes antigenic evolvability.

    PubMed

    Graves, Christopher J; Ros, Vera I D; Stevenson, Brian; Sniegowski, Paul D; Brisson, Dustin

    2013-01-01

    The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide an experimentally tractable system to test whether natural selection has favored mechanisms that increase evolvability. Many antigenic variation systems consist of paralogous unexpressed 'cassettes' that recombine into an expression site to rapidly alter the expressed protein. Importantly, the magnitude of antigenic change is a function of the genetic diversity among the unexpressed cassettes. Thus, evidence that selection favors among-cassette diversity is direct evidence that natural selection promotes antigenic evolvability. We used the Lyme disease bacterium, Borrelia burgdorferi, as a model to test the prediction that natural selection favors amino acid diversity among unexpressed vls cassettes and thereby promotes evolvability in a primary surface antigen, VlsE. The hypothesis that diversity among vls cassettes is favored by natural selection was supported in each B. burgdorferi strain analyzed using both classical (dN/dS ratios) and Bayesian population genetic analyses of genetic sequence data. This hypothesis was also supported by the conservation of highly mutable tandem-repeat structures across B. burgdorferi strains despite a near complete absence of sequence conservation. Diversification among vls cassettes due to natural selection and mutable repeat structures promotes long-term antigenic evolvability of VlsE. These findings provide a direct demonstration that molecular mechanisms that enhance evolvability of surface antigens are an evolutionary adaptation. The molecular evolutionary processes identified here can serve as a model for the evolution of antigenic evolvability in many pathogens which utilize similar strategies to establish chronic infections.

  11. Robustness to Faults Promotes Evolvability: Insights from Evolving Digital Circuits

    PubMed Central

    Nolfi, Stefano

    2016-01-01

    We demonstrate how the need to cope with operational faults enables evolving circuits to find more fit solutions. The analysis of the results obtained in different experimental conditions indicates that, in absence of faults, evolution tends to select circuits that are small and have low phenotypic variability and evolvability. The need to face operation faults, instead, drives evolution toward the selection of larger circuits that are truly robust with respect to genetic variations and that have a greater level of phenotypic variability and evolvability. Overall our results indicate that the need to cope with operation faults leads to the selection of circuits that have a greater probability to generate better circuits as a result of genetic variation with respect to a control condition in which circuits are not subjected to faults. PMID:27409589

  12. Evolving MEMS Resonator Designs for Fabrication

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.; Kraus, William F.; Lohn, Jason D.

    2008-01-01

    Because of their small size and high reliability, microelectromechanical (MEMS) devices have the potential to revolutionize many areas of engineering. As with conventionally-sized engineering design, there is likely to be a demand for the automated design of MEMS devices. This paper describes our current status as we progress toward our ultimate goal of using an evolutionary algorithm and a generative representation to produce designs of a MEMS device and successfully demonstrate its transfer to an actual chip. To produce designs that are likely to transfer to reality, we present two ways to modify the evaluation of designs. The first is to add location noise, differences between the actual dimensions of the design and the design blueprint, which is a technique we have used in our work on evolving antennas and robots. The second method is to add prestress to model the warping that occurs during the extreme heat of fabrication. In the future we expect to fabricate and test some MEMS resonators that are evolved in this way.
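
    A minimal sketch of the "location noise" idea described above, assuming a placeholder fitness function rather than an actual MEMS resonator model: each evaluation perturbs the blueprint dimensions with random offsets and averages the resulting fitness.

      import random

      def fitness(dimensions):
          # Placeholder objective: reward dimensions near a hypothetical 10 um target.
          return -sum((d - 10.0) ** 2 for d in dimensions)

      def noisy_fitness(blueprint, sigma=0.1, trials=20):
          """Average fitness over random perturbations of the blueprint dimensions."""
          total = 0.0
          for _ in range(trials):
              perturbed = [d + random.gauss(0.0, sigma) for d in blueprint]
              total += fitness(perturbed)
          return total / trials

      print(noisy_fitness([10.2, 9.8, 10.1]))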

  13. Signing Apes and Evolving Linguistics.

    ERIC Educational Resources Information Center

    Stokoe, William C.

    Linguistics retains from its antecedents, philology and the study of sacred writings, some of their apologetic and theological bias. Thus it has not been able to face squarely the question how linguistic function may have evolved from animal communication. Chimpanzees' use of signs from American Sign Language forces re-examination of language…

  14. Thermal and evolved gas analyzer

    NASA Technical Reports Server (NTRS)

    Williams, M. S.; Boynton, W. V.; James, R. L.; Verts, W. T.; Bailey, S. H.; Hamara, D. K.

    1998-01-01

    The Thermal and Evolved Gas Analyzer (TEGA) instrument will perform calorimetry and evolved gas analysis on soil samples collected from the Martian surface. TEGA is one of three instruments, along with a robotic arm, that form the Mars Volatile and Climate Survey (MVACS) payload. The other instruments are a stereo surface imager, built by Peter Smith of the University of Arizona, and a meteorological station, built by JPL. The MVACS lander will investigate a Martian landing site at approximately 70 deg south latitude. Launch will take place from Kennedy Space Center in January 1999. The TEGA project started in February 1996. In the intervening 24 months, a flight instrument concept has been designed, prototyped, built as an engineering model and flight model, and tested. The instrument performs laboratory-quality differential scanning calorimetry (DSC) over the temperature range of Mars ambient to 1400 K. Low-temperature volatiles (water and carbon dioxide ices) and the carbonates will be analyzed in this temperature range. Carbonates melt and evolve carbon dioxide at temperatures above 600 C. Evolved oxygen (down to a concentration of 1 ppm) is detected, and CO2 and water vapor, together with their isotopic variations, are detected and their concentrations measured. The isotopic composition provides important tests of the theory of solar system formation.

  15. Slippery Texts and Evolving Literacies

    ERIC Educational Resources Information Center

    Mackey, Margaret

    2007-01-01

    The idea of "slippery texts" provides a useful descriptor for materials that mutate and evolve across different media. Eight adult gamers, encountering the slippery text "American McGee's Alice," demonstrate a variety of ways in which players attempt to manage their attention as they encounter a new text with many resonances. The range of their…

  16. Evolvable Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Lohn, Jason; Crawford, James; Globus, Al; Hornby, Gregory; Kraus, William; Larchev, Gregory; Pryor, Anna; Srivastava, Deepak

    2003-01-01

    This article surveys the research of the Evolvable Systems Group at NASA Ames Research Center. Over the past few years, our group has developed the ability to use evolutionary algorithms in a variety of NASA applications ranging from spacecraft antenna design, fault tolerance for programmable logic chips, atomic force field parameter fitting, analog circuit design, and earth observing satellite scheduling. In some of these applications, evolutionary algorithms match or improve on human performance.

  17. Evolvable Hardware for Space Applications

    NASA Technical Reports Server (NTRS)

    Lohn, Jason; Globus, Al; Hornby, Gregory; Larchev, Gregory; Kraus, William

    2004-01-01

    This article surveys the research of the Evolvable Systems Group at NASA Ames Research Center. Over the past few years, our group has developed the ability to use evolutionary algorithms in a variety of NASA applications ranging from spacecraft antenna design, fault tolerance for programmable logic chips, atomic force field parameter fitting, analog circuit design, and earth observing satellite scheduling. In some of these applications, evolutionary algorithms match or improve on human performance.

  18. When did oxygenic photosynthesis evolve?

    PubMed

    Buick, Roger

    2008-08-27

    The atmosphere has apparently been oxygenated since the 'Great Oxidation Event' ca 2.4 Ga ago, but when the photosynthetic oxygen production began is debatable. However, geological and geochemical evidence from older sedimentary rocks indicates that oxygenic photosynthesis evolved well before this oxygenation event. Fluid-inclusion oils in ca 2.45 Ga sandstones contain hydrocarbon biomarkers evidently sourced from similarly ancient kerogen, preserved without subsequent contamination, and derived from organisms producing and requiring molecular oxygen. Mo and Re abundances and sulphur isotope systematics of slightly older (2.5 Ga) kerogenous shales record a transient pulse of atmospheric oxygen. As early as ca 2.7 Ga, stromatolites and biomarkers from evaporative lake sediments deficient in exogenous reducing power strongly imply that oxygen-producing cyanobacteria had already evolved. Even at ca 3.2 Ga, thick and widespread kerogenous shales are consistent with aerobic photoautotrophic marine plankton, and U-Pb data from ca 3.8 Ga metasediments suggest that this metabolism could have arisen by the start of the geological record. Hence, the hypothesis that oxygenic photosynthesis evolved well before the atmosphere became permanently oxygenated seems well supported. PMID:18468984

  19. When did oxygenic photosynthesis evolve?

    PubMed

    Buick, Roger

    2008-08-27

    The atmosphere has apparently been oxygenated since the 'Great Oxidation Event' ca 2.4 Ga ago, but when the photosynthetic oxygen production began is debatable. However, geological and geochemical evidence from older sedimentary rocks indicates that oxygenic photosynthesis evolved well before this oxygenation event. Fluid-inclusion oils in ca 2.45 Ga sandstones contain hydrocarbon biomarkers evidently sourced from similarly ancient kerogen, preserved without subsequent contamination, and derived from organisms producing and requiring molecular oxygen. Mo and Re abundances and sulphur isotope systematics of slightly older (2.5 Ga) kerogenous shales record a transient pulse of atmospheric oxygen. As early as ca 2.7 Ga, stromatolites and biomarkers from evaporative lake sediments deficient in exogenous reducing power strongly imply that oxygen-producing cyanobacteria had already evolved. Even at ca 3.2 Ga, thick and widespread kerogenous shales are consistent with aerobic photoautotrophic marine plankton, and U-Pb data from ca 3.8 Ga metasediments suggest that this metabolism could have arisen by the start of the geological record. Hence, the hypothesis that oxygenic photosynthesis evolved well before the atmosphere became permanently oxygenated seems well supported.

  20. Evolving Systems and Adaptive Key Component Control

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Balas, Mark J.

    2009-01-01

    We propose a new framework called Evolving Systems to describe the self-assembly, or autonomous assembly, of actively controlled dynamical subsystems into an Evolved System with a higher purpose. An introduction to Evolving Systems and exploration of the essential topics of the control and stability properties of Evolving Systems is provided. This chapter defines a framework for Evolving Systems, develops theory and control solutions for fundamental characteristics of Evolving Systems, and provides illustrative examples of Evolving Systems and their control with adaptive key component controllers.

  1. You 3.0: The Most Important Evolving Technology

    ERIC Educational Resources Information Center

    Tamarkin, Molly; Bantz, David A.; Childs, Melody; diFilipo, Stephen; Landry, Stephen G.; LoPresti, Frances; McDonald, Robert H.; McGuthry, John W.; Meier, Tina; Rodrigo, Rochelle; Sparrow, Jennifer; Diggs, D. Teddy; Yang, Catherine W.

    2010-01-01

    That technology evolves is a given. Not as well understood is the impact of technological evolution on each individual--on oneself, one's skill development, one's career, and one's relationship with the work community. The authors believe that everyone in higher education will become an IT worker and that IT workers will be managing a growing…

  2. The emotion system promotes diversity and evolvability

    PubMed Central

    Giske, Jarl; Eliassen, Sigrunn; Fiksen, Øyvind; Jakobsen, Per J.; Aksnes, Dag L.; Mangel, Marc; Jørgensen, Christian

    2014-01-01

    Studies on the relationship between the optimal phenotype and its environment have had limited focus on genotype-to-phenotype pathways and their evolutionary consequences. Here, we study how multi-layered trait architecture and its associated constraints prescribe diversity. Using an idealized model of the emotion system in fish, we find that trait architecture yields genetic and phenotypic diversity even in absence of frequency-dependent selection or environmental variation. That is, for a given environment, phenotype frequency distributions are predictable while gene pools are not. The conservation of phenotypic traits among these genetically different populations is due to the multi-layered trait architecture, in which one adaptation at a higher architectural level can be achieved by several different adaptations at a lower level. Our results emphasize the role of convergent evolution and the organismal level of selection. While trait architecture makes individuals more constrained than what has been assumed in optimization theory, the resulting populations are genetically more diverse and adaptable. The emotion system in animals may thus have evolved by natural selection because it simultaneously enhances three important functions, the behavioural robustness of individuals, the evolvability of gene pools and the rate of evolutionary innovation at several architectural levels. PMID:25100697

  3. Bolometric Flux Estimation for Cool Evolved Stars

    NASA Astrophysics Data System (ADS)

    van Belle, Gerard T.; Creech-Eakman, Michelle J.; Ruiz-Velasco, Alma E.

    2016-07-01

    Estimation of bolometric fluxes (F_BOL) is an essential component of stellar effective temperature determination with optical and near-infrared interferometry. Reliable estimation of F_BOL simply from broadband K-band photometry is a useful tool in those cases where contemporaneous and/or wide-range photometry is unavailable for a detailed spectral energy distribution (SED) fit, as was demonstrated in Dyck et al. Recalibrating the intrinsic F_BOL versus observed F_2.2μm relationship of that study with modern SED fitting routines, which incorporate the significantly non-blackbody, empirical spectral templates of the INGS spectral library (an update of the library in Pickles) and estimation of reddening, serves to greatly improve the accuracy and observational utility of this relationship. We find that the predicted F_BOL values are roughly 11% lower than the corresponding values predicted in Dyck et al., indicating the effects of SED absorption features across bolometric flux curves.
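
    As an illustration of recalibrating a bolometric-flux scaling (with entirely synthetic numbers, not the Dyck et al. or INGS-based calibration), one could fit a simple power law between K-band flux and SED-derived F_BOL:

      import numpy as np

      # Synthetic calibration sample: K-band fluxes and SED-derived bolometric fluxes.
      f_k = np.array([1.0, 2.5, 4.0, 7.5, 12.0])       # arbitrary flux units
      f_bol = np.array([8.2, 19.5, 30.1, 55.0, 86.0])  # arbitrary flux units

      # Least-squares fit in log space: log10(F_BOL) = log10(a) + b * log10(F_K).
      b, log_a = np.polyfit(np.log10(f_k), np.log10(f_bol), 1)

      def predict_fbol(f_k_new):
          return 10.0 ** (log_a + b * np.log10(f_k_new))

      print(predict_fbol(5.0))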

  4. A slowly evolving host moves first in symbiotic interactions

    NASA Astrophysics Data System (ADS)

    Damore, James; Gore, Jeff

    2011-03-01

    Symbiotic relationships, both parasitic and mutualistic, are ubiquitous in nature. Understanding how these symbioses evolve, from bacteria and their phages to humans and our gut microflora, is crucial in understanding how life operates. Often, symbioses consist of a slowly evolving host species with each host only interacting with its own sub-population of symbionts. The Red Queen hypothesis describes coevolutionary relationships as constant arms races with each species rushing to evolve an advantage over the other, suggesting that faster evolution is favored. Here, we use a simple game theoretic model of host- symbiont coevolution that includes population structure to show that if the symbionts evolve much faster than the host, the equilibrium distribution is the same as it would be if it were a sequential game where the host moves first against its symbionts. For the slowly evolving host, this will prove to be advantageous in mutualisms and a handicap in antagonisms. The model allows for symbiont adaptation to its host, a result that is robust to changes in the parameters and generalizes to continuous and multiplayer games. Our findings provide insight into a wide range of symbiotic phenomena and help to unify the field of coevolutionary theory.

  5. International Lead Zinc Research Organization-sponsored field-data collection and analysis to determine relationships between service conditions and reliability of valve-regulated lead-acid batteries in stationary applications

    NASA Astrophysics Data System (ADS)

    Taylor, P. A.; Moseley, P. T.; Butler, P. C.

    The International Lead Zinc Research Organization (ILZRO), in cooperation with Sandia National Laboratories, has initiated a multi-phase project with the following aims: to characterize relationships between valve-regulated lead-acid (VRLA) batteries, service conditions, and failure modes; to establish the degree of correlation between specific operating procedures and premature capacity loss (PCL); to identify operating procedures that mitigate PCL; to identify best fits between the operating requirements of specific applications and the capabilities of specific VRLA technologies; and to recommend combinations of battery design, manufacturing processes, and operating conditions that enhance VRLA performance and reliability. In the first phase of this project, ILZRO has contracted with Energetics to identify and survey manufacturers and users of VRLA batteries for stationary applications (including electric utilities, telecommunications companies, and government facilities). The confidential survey is collecting the service conditions of specific applications and performance records for specific VRLA technologies. From the data collected, Energetics is constructing a database of the service histories and analyzing the data to determine trends in performance for particular technologies in specific service conditions. ILZRO plans to make the final report of the analysis and a version of the database (that contains no proprietary information) available to ILZRO members, participants in the survey, and participants in a follow-on workshop for stakeholders in VRLA reliability. This paper presents the surveys distributed to manufacturers and end-users, discusses the analytic approach, presents an overview of the responses to the surveys and trends that have emerged in the early analysis of the data, and previews the functionality of the database being constructed.

  6. Sequential detection of temporal communities in evolving networks by estrangement confinement

    NASA Astrophysics Data System (ADS)

    Sreenivasan, Sameet; Kawadia, Vikas

    2013-03-01

    Temporal communities are the result of a consistent partitioning of nodes across multiple snapshots of an evolving network, and they provide insights into how dense clusters in a network emerge, combine, split and decay over time. Reliable detection of temporal communities requires finding a good community partition in a given snapshot while simultaneously ensuring that it bears some similarity to the partition(s) found in the previous snapshot(s). This is a particularly difficult task given the extreme sensitivity of community structure yielded by current methods to changes in the network structure. Motivated by the inertia of inter-node relationships, we present a new measure of partition distance called estrangement, and show that constraining estrangement enables the detection of meaningful temporal communities at various degrees of temporal smoothness in diverse real-world datasets. Estrangement confinement consequently provides a principled approach to uncovering temporal communities in evolving networks. (V. Kawadia and S. Sreenivasan, http://arxiv.org/abs/1203.5126) Supported in part by ARL NS-CTA
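
    A hedged sketch of an estrangement-like partition distance, under the assumption that it measures the fraction of previously intra-community edges that the current partition splits across communities (the published definition may differ in detail):

      def estrangement(prev_edges, curr_edges, prev_part, curr_part):
          """Fraction of shared, previously intra-community edges now split apart."""
          shared = set(prev_edges) & set(curr_edges)
          prev_intra = [(u, v) for (u, v) in shared if prev_part[u] == prev_part[v]]
          if not prev_intra:
              return 0.0
          broken = sum(1 for (u, v) in prev_intra if curr_part[u] != curr_part[v])
          return broken / len(prev_intra)

      edges = [(1, 2), (2, 3), (3, 4)]
      prev_part = {1: "a", 2: "a", 3: "b", 4: "b"}
      curr_part = {1: "a", 2: "b", 3: "b", 4: "b"}
      print(estrangement(edges, edges, prev_part, curr_part))  # 0.5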

  7. Chromospheric activity of evolved late-type stars - Chromospheric activity in evolved stars

    NASA Astrophysics Data System (ADS)

    Pasquini, L.; Brocato, E.; Pallavicini, R.

    1990-08-01

    Ca II K emission in a homogeneous sample of late-type giants and supergiants is analyzed. The Wilson-Bappu relationship and color-temperature scales are used to construct an H-R diagram which is compared with theoretical evolutionary tracks. It is shown that in spite of the errors involved in the determination of the fundamental stellar parameters, a clear relationship between chromospheric surface activity and stellar mass is present. 5-10 solar mass stars in He burning phase show the highest levels of activity; on the other hand, less massive stars ascending along the Red Giant Branch are extremely quiet. A correlation between surface activity and rotation is found, and it is shown that a knowledge of the stellar evolutionary history is essential for understanding chromospheric emission from evolved stars.

  8. Reliability model generator

    NASA Technical Reports Server (NTRS)

    McMann, Catherine M. (Inventor); Cohen, Gerald C. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
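
    The aggregation idea can be illustrated with a short Python sketch that combines named low-level component reliabilities according to a nested series/parallel architecture description; the data structures here are assumptions for illustration, not the actual RMG formats:

      from math import prod

      def aggregate(node, reliabilities):
          """Evaluate a nested ('series'|'parallel', [children]) architecture tree."""
          if isinstance(node, str):                       # leaf: a named component
              return reliabilities[node]
          kind, children = node
          values = [aggregate(child, reliabilities) for child in children]
          if kind == "series":
              return prod(values)
          return 1.0 - prod(1.0 - v for v in values)      # parallel redundancy

      components = {"sensor": 0.99, "cpu_a": 0.97, "cpu_b": 0.97, "actuator": 0.995}
      architecture = ("series",
                      ["sensor", ("parallel", ["cpu_a", "cpu_b"]), "actuator"])
      print(round(aggregate(architecture, components), 5))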

  9. canEvolve: A Web Portal for Integrative Oncogenomics

    PubMed Central

    Yan, Zhenyu; Wang, Xujun; Cao, Qingyi; Munshi, Nikhil C.; Li, Cheng

    2013-01-01

    Background & Objective Genome-wide profiles of tumors obtained using functional genomics platforms are being deposited to the public repositories at an astronomical scale, as a result of focused efforts by individual laboratories and large projects such as the Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium. Consequently, there is an urgent need for reliable tools that integrate and interpret these data in light of current knowledge and disseminate results to biomedical researchers in a user-friendly manner. We have built the canEvolve web portal to meet this need. Results canEvolve query functionalities are designed to fulfill most frequent analysis needs of cancer researchers with a view to generate novel hypotheses. canEvolve stores gene, microRNA (miRNA) and protein expression profiles, copy number alterations for multiple cancer types, and protein-protein interaction information. canEvolve allows querying of results of primary analysis, integrative analysis and network analysis of oncogenomics data. The querying for primary analysis includes differential gene and miRNA expression as well as changes in gene copy number measured with SNP microarrays. canEvolve provides results of integrative analysis of gene expression profiles with copy number alterations and with miRNA profiles as well as generalized integrative analysis using gene set enrichment analysis. The network analysis capability includes storage and visualization of gene co-expression, inferred gene regulatory networks and protein-protein interaction information. Finally, canEvolve provides correlations between gene expression and clinical outcomes in terms of univariate survival analysis. Conclusion At present canEvolve provides different types of information extracted from 90 cancer genomics studies comprising of more than 10,000 patients. The presence of multiple data types, novel integrative analysis for identifying regulators of oncogenesis, network analysis and ability

  10. Primordial evolvability: Impasses and challenges.

    PubMed

    Vasas, Vera; Fernando, Chrisantha; Szilágyi, András; Zachár, István; Santos, Mauro; Szathmáry, Eörs

    2015-09-21

    While it is generally agreed that some kind of replicating non-living compounds were the precursors of life, there is much debate over their possible chemical nature. Metabolism-first approaches propose that mutually catalytic sets of simple organic molecules could be capable of self-replication and rudimentary chemical evolution. In particular, the graded autocatalysis replication domain (GARD) model, depicting assemblies of amphiphilic molecules, has received considerable interest. The system propagates compositional information across generations and is suggested to be a target of natural selection. However, evolutionary simulations indicate that the system lacks selectability (i.e. selection has negligible effect on the equilibrium concentrations). We elaborate on the lessons learnt from the example of the GARD model and, more widely, on the issue of evolvability, and discuss the implications for similar metabolism-first scenarios. We found that simple incorporation-type chemistry based on non-covalent bonds, as assumed in GARD, is unlikely to result in alternative autocatalytic cycles when catalytic interactions are randomly distributed. An even more serious problem stems from the lognormal distribution of catalytic factors, causing inherent kinetic instability of such loops, due to the dominance of efficiently catalyzed components that fail to return catalytic aid. Accordingly, the dynamics of the GARD model is dominated by strongly catalytic, but not auto-catalytic, molecules. Without effective autocatalysis, stable hereditary propagation is not possible. Many repetitions and different scaling of the model come to no rescue. Despite all attempts to show the contrary, the GARD model is not evolvable, in contrast to reflexively autocatalytic networks, complemented by rare uncatalyzed reactions and compartmentation. The latter networks, resting on the creation and breakage of chemical bonds, can generate novel ('mutant') autocatalytic loops from a given set of

  11. Isotopic Analysis and Evolved Gases

    NASA Technical Reports Server (NTRS)

    Swindle, Timothy D.; Boynton, William V.; Chutjian, Ara; Hoffman, John H.; Jordan, Jim L.; Kargel, Jeffrey S.; McEntire, Richard W.; Nyquist, Larry

    1996-01-01

    Precise measurements of the chemical, elemental, and isotopic composition of planetary surface material and gases, and observed variations in these compositions, can contribute significantly to our knowledge of the source(s), ages, and evolution of solar system materials. The analyses discussed in this paper are mostly made by mass spectrometers or some other type of mass analyzer, and address three broad areas of interest: (1) atmospheric composition - isotopic, elemental, and molecular, (2) gases evolved from solids, and (3) solids. Current isotopic data on nine elements, mostly from in situ analysis, but also from meteorites and telescopic observations are summarized. Potential instruments for isotopic analysis of lunar, Martian, Venusian, Mercury, and Pluto surfaces, along with asteroid, cometary and icy satellites, surfaces are discussed.

  12. Speech processing: An evolving technology

    SciTech Connect

    Crochiere, R.E.; Flanagan, J.L.

    1986-09-01

    As we enter the information age, speech processing is emerging as an important technology for making machines easier and more convenient for humans to use. It is both an old and a new technology - dating back to the invention of the telephone and forward, at least in aspirations, to the capabilities of HAL in 2001. Explosive advances in microelectronics now make it possible to implement economical real-time hardware for sophisticated speech processing - processing that formerly could be demonstrated only in simulations on main-frame computers. As a result, fundamentally new product concepts - as well as new features and functions in existing products - are becoming possible and are being explored in the marketplace. As the introductory piece to this issue, the authors draw a brief perspective on the evolving field of speech processing and assess the technology in the the three constituent sectors: speech coding, synthesis, and recognition.

  13. Planets in Evolved Binary Systems

    NASA Astrophysics Data System (ADS)

    Perets, Hagai B.

    2011-03-01

    Exo-planets are typically thought to form in protoplanetary disks left over from the protostellar disk of their newly formed host star. However, additional planetary formation and evolution routes may exist in old evolved binary systems. Here we discuss the implications of binary stellar evolution on planetary systems in such environments. In these binary systems stellar evolution could lead to the formation of symbiotic stars, where mass is lost from one star and could be transferred to its binary companion, and may form an accretion disk around it. This raises the possibility that such a disk could provide the necessary environment for the formation of a new, second generation of planets in either circumstellar or circumbinary configurations. Pre-existing first generation planets surviving the post-MS evolution of such systems would be dynamically affected by the mass loss in the systems and may also interact with the newly formed disk. Such planets and/or planetesimals may also serve as seeds for the formation of the second generation planets, and/or interact with them, possibly forming atypical planetary systems. Second generation planetary systems should be typically found in white dwarf binary systems, and may show various observational signatures. Most notably, second generation planets could form in environments which are inaccessible, or less favorable, to first generation planets. The orbital phase space available for the second generation planets could be forbidden (in terms of the system stability) to first generation planets in the pre-evolved progenitor binaries. In addition, planets could form in metal-poor environments such as globular clusters and/or in double compact object binaries. Observations of exo-planets in such forbidden or unfavorable regions could possibly serve to uniquely identify their second generation character. Finally, we point out a few observed candidate second generation planetary systems, including Gl 86, HD 27442 and all of the

  14. Reliability Generalization: "Lapsus Linguae"

    ERIC Educational Resources Information Center

    Smith, Julie M.

    2011-01-01

    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  15. Can There Be Reliability without "Reliability?"

    ERIC Educational Resources Information Center

    Mislevy, Robert J.

    2004-01-01

    An "Educational Researcher" article by Pamela Moss (1994) asks the title question, "Can there be validity without reliability?" Yes, she answers, if by reliability one means "consistency among independent observations intended as interchangeable" (Moss, 1994, p. 7), quantified by internal consistency indices such as KR-20 coefficients and…

  16. HELIOS Critical Design Review: Reliability

    NASA Technical Reports Server (NTRS)

    Benoehr, H. C.; Herholz, J.; Prem, H.; Mann, D.; Reichert, L.; Rupp, W.; Campbell, D.; Boettger, H.; Zerwes, G.; Kurvin, C.

    1972-01-01

    This paper presents the reliability portion of the Helios Critical Design Review, held October 16-20, 1972. The topics include: 1) Reliability Requirement; 2) Reliability Apportionment; 3) Failure Rates; 4) Reliability Assessment; 5) Reliability Block Diagram; and 6) Reliability Information Sheet.

  17. Carl Thoresen: The Evolving Pioneer

    ERIC Educational Resources Information Center

    Harris, Alex H. S.

    2009-01-01

    This interview with Carl E. Thoresen highlights the experiences, relationships, and ideas that have influenced this pioneering psychologist throughout the past half century. His scholarly work, professional service, teaching, and mentorship have motivated many counseling psychologists to radically expand their areas of inquiry. He was among the…

  18. Evolving toward Laughter in Learning

    ERIC Educational Resources Information Center

    Strean, William B.

    2008-01-01

    Lowman (1995) described the relationship between teacher and student and student engagement as the two most important ingredients in learning in higher education. Humour builds teacher-student connection (Berk, 1998) and engages students in the learning process. The bond between student and teacher is essential for learning, satisfaction, and…

  19. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  20. Apply reliability centered maintenance to sealless pumps

    SciTech Connect

    Pradhan, S. )

    1993-01-01

    This paper reports on reliability centered maintenance (RCM) which is considered a crucial part of future reliability engineering. RCM determines the maintenance requirements of plants and equipment in their operating context. The RCM method has been applied to the management of critical sealless pumps in fire/toxic risk services, typical of the petrochemical industry. The method provides advantages from a detailed study of any critical engineering system. RCM is a team exercise and fosters team spirit in the plant environment. The maintenance strategy that evolves is based on team decisions and relies on maximizing the inherent reliability built into the equipment. RCM recommends design upgrades where this inherent reliability is being questioned. Sealless pumps of canned motor design are used as main reactor charge pumps in PVC plants. These pumps handle fresh vinyl chloride monomer (VCM), which is both carcinogenic and flammable.

  1. Magnetic fields around evolved stars

    NASA Astrophysics Data System (ADS)

    Leal-Ferreira, M.; Vlemmings, W.; Kemball, A.; Amiri, N.; Maercker, M.; Ramstedt, S.; Olofsson, G.

    2014-04-01

    A number of mechanisms, such as magnetic fields, (binary) companions and circumstellar disks, have been suggested to be the cause of non-spherical PNe and in particular collimated outflows. This work investigates one of these mechanisms: the magnetic fields. While MHD simulations show that the fields can indeed be important, few observations of magnetic fields have been made so far. We used the VLBA to observe five evolved stars, with the goal of detecting the magnetic field by means of water maser polarization. The sample consists of four AGB stars (IK Tau, RT Vir, IRC+60370 and AP Lyn) and one pPN (OH231.8+4.2). In four of the five sources, several strong maser features were detected allowing us to measure the linear and/or circular polarization. Based on the circular polarization detections, we infer the strength of the component of the field along the line of sight to be between ~30 mG and ~330 mG in the water maser regions of these four sources. When extrapolated to the surface of the stars, the magnetic field strength would be between a few hundred mG and a few Gauss when assuming a toroidal field geometry and higher when assuming more complex magnetic fields. We conclude that the magnetic energy we derived in the water maser regions is higher than the thermal and kinetic energy, leading to the conclusion that, indeed, magnetic fields probably play an important role in shaping Planetary Nebulae.

  2. How do drumlin patterns evolve?

    NASA Astrophysics Data System (ADS)

    Ely, Jeremy; Clark, Chris; Spagnolo, Matteo; Hughes, Anna

    2016-04-01

    The flow of a geomorphic agent over a sediment bed creates patterns in the substrate composed of bedforms. Ice is no exception to this, organising soft sedimentary substrates into subglacial bedforms. As we are yet to fully observe their initiation and evolution beneath a contemporary ice mass, little is known about how patterns in subglacial bedforms develop. Here we study 36,222 drumlins, divided into 72 flowsets, left behind by the former British-Irish Ice sheet. These flowsets provide us with 'snapshots' of drumlin pattern development. The probability distribution functions of the size and shape metrics of drumlins within these flowsets were analysed to determine whether behaviour that is common of other patterned phenomena has occurred. Specifically, we ask whether drumlins i) are printed at a specific scale; ii) grow or shrink after they initiate; iii) stabilise at a specific size and shape; and iv) migrate. Our results indicate that drumlins initiate at a minimum size and spacing. After initiation, the log-normal distribution of drumlin size and shape metrics suggests that drumlins grow, or possibly shrink, as they develop. We find no evidence for stabilisation in drumlin length, supporting the idea of a subglacial bedform continuum. Drumlin migration is difficult to determine from the palaeo-record. However, there are some indications that a mixture of static and mobile drumlins occurs, which could potentially lead to collisions, cannibalisation and coarsening. Further images of modern drumlin fields evolving beneath ice are required to capture stages of drumlin pattern evolution.

  3. Recommendation in evolving online networks

    NASA Astrophysics Data System (ADS)

    Hu, Xiao; Zeng, An; Shang, Ming-Sheng

    2016-02-01

    Recommender system is an effective tool to find the most relevant information for online users. By analyzing the historical selection records of users, recommender system predicts the most likely future links in the user-item network and accordingly constructs a personalized recommendation list for each user. So far, the recommendation process is mostly investigated in static user-item networks. In this paper, we propose a model which allows us to examine the performance of the state-of-the-art recommendation algorithms in evolving networks. We find that the recommendation accuracy in general decreases with time if the evolution of the online network fully depends on the recommendation. Interestingly, some randomness in users' choice can significantly improve the long-term accuracy of the recommendation algorithm. When a hybrid recommendation algorithm is applied, we find that the optimal parameter gradually shifts towards the diversity-favoring recommendation algorithm, indicating that recommendation diversity is essential to keep a high long-term recommendation accuracy. Finally, we confirm our conclusions by studying the recommendation on networks with the real evolution data.
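
    A toy simulation of this feedback loop (a simplified stand-in for the authors' model, with a popularity-based recommender and a tunable exploration probability) might look like:

      import random
      from collections import Counter

      def simulate(n_users=50, n_items=30, steps=2000, top_l=5, eps=0.2, seed=1):
          rng = random.Random(seed)
          owned = {u: set() for u in range(n_users)}
          popularity = Counter({i: 1 for i in range(n_items)})   # uniform start
          for _ in range(steps):
              u = rng.randrange(n_users)
              if rng.random() < eps:
                  choice = rng.randrange(n_items)                # exploratory pick
              else:
                  ranked = [i for i, _ in popularity.most_common() if i not in owned[u]]
                  if not ranked:
                      continue
                  choice = rng.choice(ranked[:top_l])            # accept a recommendation
              if choice not in owned[u]:
                  owned[u].add(choice)
                  popularity[choice] += 1
          return popularity

      print(simulate(eps=0.0).most_common(3))   # pure recommendation feedback
      print(simulate(eps=0.2).most_common(3))   # some randomness in users' choice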

  4. Multiscale modelling of evolving foams

    NASA Astrophysics Data System (ADS)

    Saye, R. I.; Sethian, J. A.

    2016-06-01

    We present a set of multi-scale interlinked algorithms to model the dynamics of evolving foams. These algorithms couple the key effects of macroscopic bubble rearrangement, thin film drainage, and membrane rupture. For each of the mechanisms, we construct consistent and accurate algorithms, and couple them together to work across the wide range of space and time scales that occur in foam dynamics. These algorithms include second order finite difference projection methods for computing incompressible fluid flow on the macroscale, second order finite element methods to solve thin film drainage equations in the lamellae and Plateau borders, multiphase Voronoi Implicit Interface Methods to track interconnected membrane boundaries and capture topological changes, and Lagrangian particle methods for conservative liquid redistribution during rearrangement and rupture. We derive a full set of numerical approximations that are coupled via interface jump conditions and flux boundary conditions, and show convergence for the individual mechanisms. We demonstrate our approach by computing a variety of foam dynamics, including coupled evolution of three-dimensional bubble clusters attached to an anchored membrane and collapse of a foam cluster.

  5. Assuring reliability program effectiveness.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1973-01-01

    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.

  6. Compound estimation procedures in reliability

    NASA Technical Reports Server (NTRS)

    Barnes, Ron

    1990-01-01

    At NASA, components and subsystems of components in the Space Shuttle and Space Station generally go through a number of redesign stages. While data on failures for various design stages are sometimes available, the classical procedures for evaluating reliability only utilize the failure data on the present design stage of the component or subsystem. Often, few or no failures have been recorded on the present design stage. Previously, Bayesian estimators for the reliability of a single component, conditioned on the failure data for the present design, were developed. These new estimators permit NASA to evaluate the reliability, even when few or no failures have been recorded. Point estimates for the latter evaluation were not possible with the classical procedures. Since different design stages of a component (or subsystem) generally have a good deal in common, the development of new statistical procedures for evaluating the reliability, which consider the entire failure record for all design stages, has great intuitive appeal. A typical subsystem consists of a number of different components and each component has evolved through a number of redesign stages. The present investigations considered compound estimation procedures and related models. Such models permit the statistical consideration of all design stages of each component and thus incorporate all the available failure data to obtain estimates for the reliability of the present version of the component (or subsystem). A number of models were considered to estimate the reliability of a component conditioned on its total failure history from two design stages. It was determined that reliability estimators for the present design stage, conditioned on the complete failure history for two design stages have lower risk than the corresponding estimators conditioned only on the most recent design failure data. Several models were explored and preliminary models involving bivariate Poisson distribution and the
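
    As a minimal sketch of the compound idea (using a gamma-Poisson conjugate update, which is one common choice and not necessarily the models developed here), failure data from an earlier design stage can serve as a prior that is updated with the present stage's record:

      from math import exp

      def posterior_failure_rate(prior_failures, prior_hours, new_failures, new_hours):
          """Posterior mean of a gamma-Poisson model: (a0 + k) / (b0 + T)."""
          return (prior_failures + new_failures) / (prior_hours + new_hours)

      # Earlier design stage: 3 failures in 10,000 h; present stage: 0 failures in 4,000 h.
      lam = posterior_failure_rate(3, 10_000.0, 0, 4_000.0)
      print(lam)                    # point estimate despite zero new failures
      print(exp(-lam * 1_000.0))    # implied 1,000 h reliability, exp(-lambda * t)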

  7. Submillimeter observations of evolved stars

    SciTech Connect

    Sopka, R.J.; Hildebrand, R.; Jaffe, D.T.; Gatley, I.; Roellig, T.; Werner, M.; Jura, M.; Zuckerman, B.

    1985-07-01

    Broad-band submillimeter observations of the thermal emission from evolved stars have been obtained with the United Kingdom Infrared Telescope on Mauna Kea, Hawaii. These observations, at an effective wavelength of 400 μm, provide the most direct method for estimating the mass loss rate in dust from these stars and also help to define the long-wavelength thermal spectrum of the dust envelopes. The mass loss rates in dust that we derive range from 10^-9 to 10^-6 M_sun yr^-1 and are compared with mass loss rates derived from molecular line observations to estimate gas-to-dust ratios in outflowing envelopes. These values are found to be generally compatible with the interstellar gas-to-dust ratio of approximately 100 if submillimeter emissivities appropriate to amorphous grain structures are assumed. Our analysis of the spectrum of IRC+10216 confirms previous suggestions that the grain emissivity varies as λ^-1.2 rather than as λ^-2 for 10

  8. Voyages Through Time: Everything Evolves

    NASA Astrophysics Data System (ADS)

    Pendleton, Y. J.; Tarter, J. C.; DeVore, E. K.; O'Sullivan, K. A.; Taylor, S. M.

    2001-12-01

    Evolutionary change is a powerful framework for studying our world and our place therein. It is a recurring theme in every realm of science: over time, the universe, the planet Earth, life, and human technologies all change, albeit on vastly different scales. Evolution offers scientific explanations for the age-old question, "Where did we come from?" In addition, historical perspectives of science show how our understanding has evolved over time. The complexities of all of these systems will never reveal a "finished" story. But it is a story of epic size, capable of inspiring awe and of expanding our sense of time and place, and eminently worthy of investigating. This story is the basis of Voyages Through Time. Voyages Through Time (VTT) provides teachers with not only background science content and pedagogy, but also with materials and resources for the teaching of evolution. The six modules, Cosmic Evolution, Planetary Evolution, Origin of Life, Evolution of Life, Hominid Evolution, and Evolution of Technology, emphasize student inquiry, and promote the nature of science, as recommended in the NSES and BSL. The modules are unified by the overarching theme of evolution and the meta questions: "What is changing?" "What is the rate of change?" and "What is the mechanism of change?" Determination of student outcomes for the project required effective collaboration of scientists, teachers, students and media specialists. The broadest curricular student outcomes are 1) an enjoyment of science, 2) an understanding of the nature of science, especially the understanding of evidence and re-evaluation, and 3) key science content. The curriculum is being developed by the SETI Institute, NASA Ames Research Center, California Academy of Sciences, and San Francisco State University, and is funded by the NSF (IMD 9730693), with support from Hewlett-Packard Company, The Foundation for Microbiology, Combined Federated Charities, NASA Astrobiology Institute, and NASA Fundamental

  9. Human Reliability Program Overview

    SciTech Connect

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  10. Power electronics reliability analysis.

    SciTech Connect

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
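
    The first approach, computing reliability metrics per failure cause directly from field maintenance data, can be sketched as follows (record format and numbers are assumed for illustration):

      from collections import defaultdict

      # (failure cause, downtime in hours) per maintenance event; values are invented.
      records = [("capacitor", 6.0), ("IGBT", 12.5), ("capacitor", 4.0),
                 ("control_board", 8.0), ("IGBT", 10.0)]
      fleet_operating_hours = 200_000.0   # assumed observation window for the fleet

      stats = defaultdict(lambda: {"failures": 0, "downtime": 0.0})
      for cause, downtime in records:
          stats[cause]["failures"] += 1
          stats[cause]["downtime"] += downtime

      for cause, s in sorted(stats.items(), key=lambda kv: -kv[1]["failures"]):
          mtbf = fleet_operating_hours / s["failures"]
          print(f"{cause}: failures={s['failures']}, MTBF={mtbf:.0f} h, "
                f"downtime={s['downtime']:.1f} h")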

  11. Exploring Evolving Media Discourse Through Event Cueing.

    PubMed

    Lu, Yafeng; Steptoe, Michael; Burke, Sarah; Wang, Hong; Tsai, Jiun-Yi; Davulcu, Hasan; Montgomery, Douglas; Corman, Steven R; Maciejewski, Ross

    2016-01-01

    Online news, microblogs and other media documents all contain valuable insight regarding events and responses to events. Underlying these documents is the concept of framing, a process in which communicators act (consciously or unconsciously) to construct a point of view that encourages facts to be interpreted by others in a particular manner. As media discourse evolves, how topics and documents are framed can undergo change, shifting the discussion to different viewpoints or rhetoric. What causes these shifts can be difficult to determine directly; however, by linking secondary datasets and enabling visual exploration, we can enhance the hypothesis generation process. In this paper, we present a visual analytics framework for event cueing using media data. As discourse develops over time, our framework applies a time series intervention model which tests to see if the level of framing is different before or after a given date. If the model indicates that the times before and after are statistically significantly different, this cues an analyst to explore related datasets to help enhance their understanding of what (if any) events may have triggered these changes in discourse. Our framework consists of entity extraction and sentiment analysis as lenses for data exploration and uses two different models for intervention analysis. To demonstrate the usage of our framework, we present a case study on exploring potential relationships between climate change framing and conflicts in Africa.
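
    A minimal stand-in for the intervention test (a simple two-sample comparison of framing scores before and after a candidate date, on synthetic data; the paper's models may also handle trend and autocorrelation):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      before = rng.normal(loc=0.30, scale=0.05, size=60)   # daily framing scores, pre-event
      after = rng.normal(loc=0.42, scale=0.05, size=60)    # daily framing scores, post-event

      t_stat, p_value = stats.ttest_ind(before, after, equal_var=False)
      if p_value < 0.05:
          print(f"Level shift detected (p = {p_value:.3g}): cue the analyst to inspect "
                "related event data around this date.")
      else:
          print("No significant shift detected at this date.")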

  12. Inherent randomness of evolving populations.

    PubMed

    Harper, Marc

    2014-03-01

    The entropy rates of the Wright-Fisher process, the Moran process, and generalizations are computed and used to compare these processes and their dependence on standard evolutionary parameters. Entropy rates are measures of the variation dependent on both short-run and long-run behaviors and allow the relationships between mutation, selection, and population size to be examined. Bounds for the entropy rate are given for the Moran process (independent of population size) and for the Wright-Fisher process (bounded for fixed population size). A generational Moran process is also presented for comparison to the Wright-Fisher Process. Results include analytic results and computational extensions.
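
    A hedged sketch of the computation for a two-type Moran process with mutation, using an illustrative parameterization that may differ from the paper's exact formulations: build the birth-death transition matrix, find its stationary distribution, and evaluate H = -sum_i pi_i sum_j P_ij log P_ij.

      import numpy as np

      def moran_matrix(N, fitness_a=1.1, fitness_b=1.0, mu=0.01):
          """Transition matrix over states i = number of type-A individuals."""
          P = np.zeros((N + 1, N + 1))
          for i in range(N + 1):
              fa, fb = i * fitness_a, (N - i) * fitness_b
              birth_a = (fa * (1 - mu) + fb * mu) / (fa + fb)   # offspring is type A
              if i < N:
                  P[i, i + 1] = birth_a * (N - i) / N           # A offspring replaces a B
              if i > 0:
                  P[i, i - 1] = (1 - birth_a) * i / N           # B offspring replaces an A
              P[i, i] = 1.0 - P[i].sum()
          return P

      def entropy_rate(P):
          vals, vecs = np.linalg.eig(P.T)                       # left eigenvectors of P
          pi = np.abs(np.real(vecs[:, np.argmin(np.abs(vals - 1.0))]))
          pi /= pi.sum()
          logP = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
          return float(-np.sum(pi[:, None] * P * logP))

      print(entropy_rate(moran_matrix(N=20)))   # entropy rate in nats per step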

  13. Reliable Design Versus Trust

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests FPGA internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  14. Reliability in aposematic signaling

    PubMed Central

    2010-01-01

    In light of recent work, we will expand on the role and variability of aposematic signals. The focus of this review will be the concepts of reliability and honesty in aposematic signaling. We claim that reliable signaling can solve the problem of aposematic evolution, and that variability in reliability can shed light on the complexity of aposematic systems. PMID:20539774

  15. Viking Lander reliability program

    NASA Technical Reports Server (NTRS)

    Pilny, M. J.

    1978-01-01

    The Viking Lander reliability program is reviewed with attention given to the development of the reliability program requirements, reliability program management, documents evaluation, failure modes evaluation, production variation control, failure reporting and correction, and the parts program. Lander hardware failures which have occurred during the mission are listed.

  16. Reliability as Argument

    ERIC Educational Resources Information Center

    Parkes, Jay

    2007-01-01

    Reliability consists of both important social and scientific values and methods for evidencing those values, though in practice methods are often conflated with the values. With the two distinctly understood, a reliability argument can be made that articulates the particular reliability values most relevant to the particular measurement situation…

  17. Reliability model generator specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Mccann, Catherine

    1990-01-01

    The Reliability Model Generator (RMG), a program that produces reliability models from block diagrams for ASSIST, the interface for the reliability evaluation tool SURE, is described. An account is given of the motivation for RMG, and the implemented algorithms are discussed. The appendices contain the algorithms and two detailed traces of examples.

  18. What Technology? Reflections on Evolving Services

    ERIC Educational Resources Information Center

    Collins, Sharon

    2009-01-01

    Each year, the members of the EDUCAUSE Evolving Technologies Committee identify and research the evolving technologies that are having--or are predicted to have--the most direct impact on higher education institutions. The committee members choose the relevant topics, write white papers, and present their findings at the EDUCAUSE annual…

  19. Evolvable Cryogenics (ECRYO) Pressure Transducer Calibration Test

    NASA Technical Reports Server (NTRS)

    Diaz, Carlos E., Jr.

    2015-01-01

    This paper provides a summary of the findings of recent activities conducted by Marshall Space Flight Center's (MSFC) In-Space Propulsion Branch and MSFC's Metrology and Calibration Lab to assess the performance of current "state of the art" pressure transducers for use in long duration storage and transfer of cryogenic propellants. A brief historical narrative in this paper describes the Evolvable Cryogenics program and the relevance of these activities to the program. This paper also provides a review of three separate test activities performed throughout this effort, including: (1) the calibration of several pressure transducer designs in a liquid nitrogen cryogenic environmental chamber, (2) the calibration of a pressure transducer in a liquid helium Dewar, and (3) the calibration of several pressure transducers at temperatures ranging from 20 to 70 kelvin (K) using a "cryostat" environmental chamber. These three separate test activities allowed for study of the sensors over a temperature range from 4 to 300 K. The combined data show that both the slope and intercept of the sensor's calibration curve vary as a function of temperature. This behavior is contrary to the linearly decreasing relationship assumed at the start of this investigation. Consequently, the data demonstrate the need for lookup tables to change the slope and intercept used by any data acquisition system. This ultimately would allow for more accurate pressure measurements over the desired temperature range. This paper concludes with a review of a request for information (RFI) survey conducted amongst different suppliers to determine the availability of current "state of the art" flight-qualified pressure transducers. The survey identifies requirements that are most difficult for the suppliers to meet, most notably the capability to validate the sensor's performance at temperatures below 70 K.
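
    A minimal sketch of the lookup-table correction argued for above: the slope and intercept of a transducer's linear calibration are tabulated against temperature and interpolated at the measured sensor temperature before converting the raw output to pressure. The calibration values below are hypothetical placeholders, not data from the MSFC tests.

    ```python
    import numpy as np

    # Hypothetical calibration table: temperature (K) versus the slope (psi/mV)
    # and intercept (psi) of the transducer's linear response at that temperature.
    cal_temps_K    = np.array([ 20.0,  70.0, 150.0, 300.0])
    cal_slopes     = np.array([0.492, 0.505, 0.512, 0.520])
    cal_intercepts = np.array([ -1.8,  -1.1,  -0.6,   0.0])

    def pressure_from_raw(raw_mV, sensor_temp_K):
        """Convert a raw reading to pressure using temperature-dependent
        calibration coefficients, linearly interpolated from the table."""
        slope = np.interp(sensor_temp_K, cal_temps_K, cal_slopes)
        intercept = np.interp(sensor_temp_K, cal_temps_K, cal_intercepts)
        return slope * raw_mV + intercept

    print(f"{pressure_from_raw(raw_mV=100.0, sensor_temp_K=35.0):.1f} psi")
    ```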

  20. Human reliability analysis

    SciTech Connect

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach.

  1. Reliability of fluid systems

    NASA Astrophysics Data System (ADS)

    Kopáček, Jaroslav; Fojtášek, Kamil; Dvořák, Lukáš

    2016-03-01

    This paper focuses on the importance of assessing reliability, especially in complex fluid systems for demanding production technology. The initial criterion for assessing reliability is the failure of an object (element), which is treated as a random variable whose data (values) can be processed using the mathematical methods of probability theory and statistics. The basic indicators of reliability are defined, together with their application to calculations for serial, parallel, and backed-up systems. For illustration, calculation examples of reliability indicators are given for various elements of the system and for a selected pneumatic circuit.
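
    A minimal sketch of the standard indicator calculations mentioned above, assuming exponentially distributed failures: element reliability R(t) = exp(-λt), series and active-parallel combinations, and a cold-standby (backed-up) arrangement of identical units. The failure rates and mission time are illustrative, not taken from the paper.

    ```python
    import math

    def element_reliability(lam, t):
        """R(t) = exp(-lambda * t) for an element with constant failure rate lambda."""
        return math.exp(-lam * t)

    def series(reliabilities):
        """A series system works only if every element works: R = prod(R_i)."""
        r = 1.0
        for ri in reliabilities:
            r *= ri
        return r

    def parallel(reliabilities):
        """An active-parallel system fails only if all elements fail."""
        q = 1.0
        for ri in reliabilities:
            q *= 1.0 - ri
        return 1.0 - q

    def cold_standby_identical(lam, t, spares=1):
        """Cold standby with perfect switching and identical exponential units:
        R(t) = exp(-lam*t) * sum_{k=0..spares} (lam*t)**k / k!"""
        return math.exp(-lam * t) * sum((lam * t) ** k / math.factorial(k)
                                        for k in range(spares + 1))

    # Illustrative pneumatic-circuit elements: valve, cylinder, compressor.
    t = 2000.0  # operating hours
    R = [element_reliability(lam, t) for lam in (2e-5, 5e-5, 1e-4)]
    print("series  :", round(series(R), 4))
    print("parallel:", round(parallel(R), 4))
    print("standby :", round(cold_standby_identical(1e-4, t, spares=1), 4))
    ```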

  2. Operational safety reliability research

    SciTech Connect

    Hall, R.E.; Boccio, J.L.

    1986-01-01

    Operating reactor events such as the TMI accident and the Salem automatic-trip failures raised the concern that during a plant's operating lifetime the reliability of systems could degrade from the design level that was considered in the licensing process. To address this concern, NRC is sponsoring the Operational Safety Reliability Research project. The objectives of this project are to identify the essential tasks of a reliability program and to evaluate the effectiveness and attributes of such a reliability program applicable to maintaining an acceptable level of safety during the operating lifetime of the plant.

  3. Observational studies of highly evolved cataclysmic variables

    NASA Astrophysics Data System (ADS)

    Uthas, Helena

    2011-05-01

    According to standard evolutionary theory for cataclysmic variables (CVs), angular momentum loss drives CVs to initially evolve from longer to shorter orbital periods until a minimum period is reached (approx 80 min). At roughly this stage, the donors become degenerate, expand in size, and the systems move towards longer Porb. Theory predicts that 70% of all CVs should have passed their minimum period and have sub-stellar donors, but until recently, no such systems were known. I present one CV showing evidence of harbouring a sub-stellar donor, SDSS J1507+52. Due to the system's unusually short Porb of about 65 min, and very high space velocity, two origins for SDSS J1507+52 have been proposed; either the system was formed from a young WD/brown-dwarf binary, or the system is a halo CV. In order to distinguish between these two theories, I present UV spectroscopy and find a metallicity consistent with halo origin. Systems close to Pmin are expected to have low accretion rates. Some of these CVs show absorption in their spectra, implying that the underlying WD is exposed. This yields a rare opportunity to study the WD in a CV. I introduce two new systems showing WD signatures in their light curves and spectra, SDSS J1457+51 and BW Scl. Despite the fact that CVs close to Pmin should be faint, we find systems that are much too bright for their Porb. Such a system is T Pyx - a recurrent nova with an unusually high accretion rate and a photometrically determined Porb < 2 hr. T Pyx is about 2 times brighter than any other CV at its period. However, to confirm its evolutionary status, a more reliable period determination is needed. Here, I present a spectroscopic study, confirming T Pyx as a short-period CV. In this thesis, I discuss what implications these systems may have on the current understanding of CV evolution, and the importance of studying individual systems in general.

  4. Properties of artificial networks evolved to contend with natural spectra.

    PubMed

    Morgenstern, Yaniv; Rostami, Mohammad; Purves, Dale

    2014-07-22

    Understanding why spectra that are physically the same appear different in different contexts (color contrast), whereas spectra that are physically different appear similar (color constancy) presents a major challenge in vision research. Here, we show that the responses of biologically inspired neural networks evolved on the basis of accumulated experience with spectral stimuli automatically generate contrast and constancy. The results imply that these phenomena are signatures of a strategy that biological vision uses to circumvent the inverse optics problem as it pertains to light spectra, and that double-opponent neurons in early-level vision evolve to serve this purpose. This strategy provides a way of understanding the peculiar relationship between the objective world and subjective color experience, as well as rationalizing the relevant visual circuitry without invoking feature detection or image representation.

  5. Power Quality and Reliability Project

    NASA Technical Reports Server (NTRS)

    Attia, John O.

    2001-01-01

    One area where universities and industry can link is in the area of power systems reliability and quality - key concepts in the commercial, industrial and public sector engineering environments. Prairie View A&M University (PVAMU) has established a collaborative relationship with the University of Texas at Arlington (UTA), NASA/Johnson Space Center (JSC), and EP&C Engineering and Technology Group (EP&C), a small disadvantaged business that specializes in power quality and engineering services. The primary goal of this collaboration is to facilitate the development and implementation of a Strategic Integrated Power/Systems Reliability and Curriculum Enhancement Program. The objectives of the first phase of this work are: (a) to develop a course in power quality and reliability, (b) to use the campus of Prairie View A&M University as a laboratory for the study of systems reliability and quality issues, (c) to provide students with NASA/EP&C shadowing and internship experience. In this work, a course titled "Reliability Analysis of Electrical Facilities" was developed and taught for two semesters. About thirty-seven students have benefited directly from this course. A laboratory accompanying the course was also developed. Four facilities at Prairie View A&M University were surveyed. Some tests that were performed are (i) earth-ground testing, (ii) voltage, amperage and harmonics of various panels in the buildings, (iii) checking the wire sizes to see if they were the right size for the load that they were carrying, (iv) vibration tests to assess the status of the engines or chillers and water pumps, and (v) infrared testing to test for arcing or misfiring of electrical or mechanical systems.

  6. Hawaii electric system reliability.

    SciTech Connect

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability "worth" and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
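
    A toy sketch, not drawn from the report's data, of the cost-integration idea described above: total cost equals the annualised capacity cost of reserves plus expected unserved energy valued at the customers' value of lost load (VOLL), and the optimal reserve level minimises that sum. All numbers, including the assumed unserved-energy curve, are hypothetical.

    ```python
    import numpy as np

    capacity_cost = 120_000.0   # hypothetical $ per MW-year of reserve capacity
    voll = 15_000.0             # hypothetical $ per MWh of unserved energy
    reserve_mw = np.arange(0, 401, 10)

    def expected_unserved_energy(reserve):
        """Hypothetical EUE curve (MWh/year): outage exposure falls off roughly
        exponentially as reserve capacity is added."""
        return 4000.0 * np.exp(-reserve / 80.0)

    total = capacity_cost * reserve_mw + voll * expected_unserved_energy(reserve_mw)
    print(f"cost-minimising reserve ≈ {reserve_mw[np.argmin(total)]} MW")
    ```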

  7. Bayesian methods in reliability

    NASA Astrophysics Data System (ADS)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
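
    A minimal example of the conjugate Bayesian update that underlies several of the course topics (for instance, estimating a leak or failure rate): with a Gamma(a, b) prior on a constant failure rate and n failures observed over exposure time T, the posterior is Gamma(a + n, b + T). The prior and the data values are invented.

    ```python
    from scipy import stats

    # Prior belief about the failure rate (per hour): Gamma(shape=a, rate=b).
    a_prior, b_prior = 2.0, 4000.0

    # Observed data: n failures over T component-hours of exposure.
    n_failures, exposure_hours = 3, 10_000.0

    # Conjugate update for Poisson/exponential failure data.
    a_post = a_prior + n_failures
    b_post = b_prior + exposure_hours

    posterior = stats.gamma(a=a_post, scale=1.0 / b_post)
    low, high = posterior.interval(0.95)
    print(f"posterior mean rate   = {posterior.mean():.2e} per hour")
    print(f"95% credible interval = ({low:.2e}, {high:.2e}) per hour")
    ```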

  8. Interactions between planets and evolved stars

    NASA Astrophysics Data System (ADS)

    Shengbang, Qian; Zhongtao, Han; Fernández Lajús, E.; liying, Zhu; Wenping, Liao; Miloslav, Zejda; Linjia, Li; Voloshina, Irina; Liang, Liu; Jiajia., He

    2016-07-01

    Searching for planetary companions to evolved stars (e.g., white dwarfs (WD) and Cataclysmic Variables (CV)) can provide insight into the interaction between planets and evolved stars as well as on the ultimate fate of planets. We have been monitoring CVs and their progenitors, including some detached WD binaries, since 2006 to search for planets orbiting these systems. In the present paper, we will show some observational results of circumbinary planets in orbits around CVs and their progenitors. Some of our findings include planets with the shortest distance to the central evolved binaries and a few multiple planetary systems orbiting binary stars. Finally, by comparing the observational properties of planetary companions to single WDs and WD binaries, the interaction between planets and evolved stars and the ultimate fate of planets are discussed.

  9. Neural mechanisms underlying the evolvability of behaviour

    PubMed Central

    Katz, Paul S.

    2011-01-01

    The complexity of nervous systems alters the evolvability of behaviour. Complex nervous systems are phylogenetically constrained; nevertheless particular species-specific behaviours have repeatedly evolved, suggesting a predisposition towards those behaviours. Independently evolved behaviours in animals that share a common neural architecture are generally produced by homologous neural structures, homologous neural pathways and even in the case of some invertebrates, homologous identified neurons. Such parallel evolution has been documented in the chromatic sensitivity of visual systems, motor behaviours and complex social behaviours such as pair-bonding. The appearance of homoplasious behaviours produced by homologous neural substrates suggests that there might be features of these nervous systems that favoured the repeated evolution of particular behaviours. Neuromodulation may be one such feature because it allows anatomically defined neural circuitry to be re-purposed. The developmental, genetic and physiological mechanisms that contribute to nervous system complexity may also bias the evolution of behaviour, thereby affecting the evolvability of species-specific behaviour. PMID:21690127

  10. Evolving communicative complexity: insights from rodents and beyond.

    PubMed

    Pollard, Kimberly A; Blumstein, Daniel T

    2012-07-01

    Social living goes hand in hand with communication, but the details of this relationship are rarely simple. Complex communication may be described by attributes as diverse as a species' entire repertoire, signallers' individualistic signatures, or complex acoustic phenomena within single calls. Similarly, attributes of social complexity are diverse and may include group size, social role diversity, or networks of interactions and relationships. How these different attributes of social and communicative complexity co-evolve is an active question in behavioural ecology. Sciurid rodents (ground squirrels, prairie dogs and marmots) provide an excellent model system for studying these questions. Sciurid studies have found that demographic role complexity predicts alarm call repertoire size, while social group size predicts alarm call individuality. Along with other taxa, sciurids reveal an important insight: different attributes of sociality are linked to different attributes of communication. By breaking social and communicative complexity down to different attributes, focused studies can better untangle the underlying evolutionary relationships and move us closer to a comprehensive theory of how sociality and communication evolve. PMID:22641825

  11. The transcriptomics of an experimentally evolved plant-virus interaction

    PubMed Central

    Hillung, Julia; García-García, Francisco; Dopazo, Joaquín; Cuevas, José M.; Elena, Santiago F.

    2016-01-01

    Models of plant-virus interaction assume that the ability of a virus to infect a host genotype depends on the matching between virulence and resistance genes. Recently, we evolved tobacco etch potyvirus (TEV) lineages on different ecotypes of Arabidopsis thaliana, and found that some ecotypes selected for specialist viruses whereas others selected for generalists. Here we sought to evaluate the transcriptomic basis of such relationships. We have characterized the transcriptomic responses of five ecotypes infected with the ancestral and evolved viruses. Genes and functional categories differentially expressed by plants infected with local TEV isolates were identified, showing heterogeneous responses among ecotypes, although significant parallelism existed among lineages evolved in the same ecotype. Although genes involved in immune responses were altered upon infection, other functional groups were also pervasively over-represented, suggesting that plant resistance genes were not the only drivers of viral adaptation. Finally, the transcriptomic consequences of infection with the generalist and specialist lineages were compared. Whilst the generalist induced very similar perturbations in the transcriptomes of the different ecotypes, the perturbations induced by the specialist were divergent. Plant defense mechanisms were activated when the infecting virus was a specialist but were down-regulated when the infecting virus was a generalist. PMID:27113435

  12. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
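
    A crude Monte Carlo sketch of the multidisciplinary series-system idea (NESSUS itself uses fast probability integration rather than brute-force sampling): sample the shared random inputs, evaluate one limit-state margin per discipline, and count the system as failed if any margin is violated. The limit states and input distributions below are made up for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000

    # Shared random inputs (hypothetical): flow rate, wall thickness, inlet temperature.
    flow  = rng.normal(1.0, 0.08, n)     # kg/s
    thick = rng.normal(3.0, 0.15, n)     # mm
    t_in  = rng.normal(350.0, 10.0, n)   # K

    # Limit-state margins, g >= 0 means "safe" (all expressions are illustrative).
    g_struct  = thick * 45.0 - 120.0 * flow    # structural stress margin
    g_thermal = 420.0 - (t_in + 40.0 * flow)   # tube-wall temperature margin
    g_fluid   = 0.35 - 0.22 * flow**2          # pressure-drop margin

    system_fail = (g_struct < 0) | (g_thermal < 0) | (g_fluid < 0)
    print(f"estimated system failure probability ≈ {system_fail.mean():.4f}")
    ```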

  13. Zygomorphy evolved from disymmetry in Fumarioideae (Papaveraceae, Ranunculales): new evidence from an expanded molecular phylogenetic framework

    PubMed Central

    Sauquet, Hervé; Carrive, Laetitia; Poullain, Noëlie; Sannier, Julie; Damerval, Catherine; Nadot, Sophie

    2015-01-01

    Background and Aims Fumarioideae (20 genera, 593 species) is a clade of Papaveraceae (Ranunculales) characterized by flowers that are either disymmetric (i.e. two perpendicular planes of bilateral symmetry) or zygomorphic (i.e. one plane of bilateral symmetry). In contrast, the other subfamily of Papaveraceae, Papaveroideae (23 genera, 230 species), has actinomorphic flowers (i.e. more than two planes of symmetry). Understanding of the evolution of floral symmetry in this clade has so far been limited by the lack of a reliable phylogenetic framework. Pteridophyllum (one species) shares similarities with Fumarioideae but has actinomorphic flowers, and the relationships among Pteridophyllum, Papaveroideae and Fumarioideae have remained unclear. This study reassesses the evolution of floral symmetry in Papaveraceae based on new molecular phylogenetic analyses of the family. Methods Maximum likelihood, Bayesian and maximum parsimony phylogenetic analyses of Papaveraceae were conducted using six plastid markers and one nuclear marker, sampling Pteridophyllum, 18 (90 %) genera and 73 species of Fumarioideae, 11 (48 %) genera and 11 species of Papaveroideae, and a wide selection of outgroup taxa. Floral characters recorded from the literature were then optimized onto phylogenetic trees to reconstruct ancestral states using parsimony, maximum likelihood and reversible-jump Bayesian approaches. Key Results Pteridophyllum is not nested in Fumarioideae. Fumarioideae are monophyletic and Hypecoum (18 species) is the sister group of the remaining genera. Relationships within the core Fumarioideae are well resolved and supported. Dactylicapnos and all zygomorphic genera form a well-supported clade nested among disymmetric taxa. Conclusions Disymmetry of the corolla is a synapomorphy of Fumarioideae and is strongly correlated with changes in the androecium and differentiation of middle and inner tepal shape (basal spurs on middle tepals). Zygomorphy subsequently evolved from

  14. Quantifying evolvability in small biological networks

    SciTech Connect

    Nemenman, Ilya; Mugler, Andrew; Ziv, Etay; Wiggins, Chris H

    2008-01-01

    The authors introduce a quantitative measure of the capacity of a small biological network to evolve. The measure is applied to a stochastic description of the experimental setup of Guet et al. (Science 2002, 296, pp. 1466), treating chemical inducers as functional inputs to biochemical networks and the expression of a reporter gene as the functional output. The authors take an information-theoretic approach, allowing the system to set parameters that optimise signal processing ability, thus enumerating each network's highest-fidelity functions. All networks studied are highly evolvable by the measure, meaning that change in function has little dependence on change in parameters. Moreover, each network's functions are connected by paths in the parameter space along which information is not significantly lowered, meaning a network may continuously change its functionality without completely losing it along the way. This property further underscores the evolvability of the networks.

  15. Metanetworks of artificially evolved regulatory networks

    NASA Astrophysics Data System (ADS)

    Danacı, Burçin; Erzan, Ayşe

    2016-04-01

    We study metanetworks arising in genotype and phenotype spaces, in the context of a model population of Boolean graphs evolved under selection for short dynamical attractors. We define the adjacency matrix of a graph as its genotype, which gets mutated in the course of evolution, while its phenotype is its set of dynamical attractors. Metanetworks in the genotype and phenotype spaces are formed, respectively, by genetic proximity and by phenotypic similarity, the latter weighted by the sizes of the basins of attraction of the shared attractors. We find that evolved populations of Boolean graphs form tree-like giant clusters in genotype space, while random populations of Boolean graphs are typically so far removed from each other genetically that they cannot form a metanetwork. In phenotype space, the metanetworks of evolved populations are super robust both under the elimination of weak connections and random removal of nodes.
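
    A small sketch of the phenotype computation implied above: iterate a synchronous Boolean threshold network from an initial state until a state recurs; the recurring cycle is a dynamical attractor and its length is the quantity selected on. The coupling rule and the random example network are generic stand-ins, not one of the paper's evolved graphs.

    ```python
    import numpy as np

    def attractor(adj, state):
        """Synchronous threshold dynamics s_i(t+1) = [sum_j adj[i, j] * s_j(t) > 0].
        Returns (attractor_length, transient_length) for the given initial state."""
        adj = np.asarray(adj)
        seen = {}
        s = tuple(int(x) for x in state)
        t = 0
        while s not in seen:
            seen[s] = t
            s = tuple((adj @ np.array(s) > 0).astype(int))
            t += 1
        return t - seen[s], seen[s]

    rng = np.random.default_rng(3)
    n = 12
    couplings = rng.choice([-1, 0, 1], size=(n, n), p=[0.15, 0.7, 0.15])  # signed links
    length, transient = attractor(couplings, rng.integers(0, 2, n))
    print(f"attractor length = {length}, reached after {transient} steps")
    ```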

  16. Mission Reliability Estimation for Repairable Robot Teams

    NASA Technical Reports Server (NTRS)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the
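
    A toy comparison, not the paper's model, of the two strategies discussed: a team that carries a whole spare robot versus robots that each carry a spare copy of their least reliable module. Robots are treated as series systems of modules with fixed per-mission survival probabilities; the module list and the numbers are invented.

    ```python
    from math import comb

    # Hypothetical per-mission survival probability of each module in one robot.
    modules = {"chassis": 0.995, "arm": 0.98, "computer": 0.99, "radio": 0.97}

    def robot_reliability(mods):
        """Series system: the robot works only if every module works."""
        r = 1.0
        for p in mods.values():
            r *= p
        return r

    def k_of_n(p, k, n):
        """Probability that at least k of n independent units (reliability p) survive."""
        return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

    R = robot_reliability(modules)

    # Strategy A: the mission needs 2 working robots and the team launches 3.
    spare_robot = k_of_n(R, k=2, n=3)

    # Strategy B: 2 robots, each carrying one spare radio (the weakest module);
    # a robot survives if its other modules work and at least 1 of 2 radios works.
    rest = R / modules["radio"]
    spare_parts = (rest * k_of_n(modules["radio"], k=1, n=2)) ** 2

    print(f"spare robot      : {spare_robot:.4f}")
    print(f"spare radio each : {spare_parts:.4f}")
    ```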

  17. Scaled CMOS Technology Reliability Users Guide

    NASA Technical Reports Server (NTRS)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect for the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is

  18. JavaGenes: Evolving Graphs with Crossover

    NASA Technical Reports Server (NTRS)

    Globus, Al; Atsatt, Sean; Lawton, John; Wipke, Todd

    2000-01-01

    Genetic algorithms usually use string or tree representations. We have developed a novel crossover operator for a directed and undirected graph representation, and used this operator to evolve molecules and circuits. Unlike strings or trees, a single point in the representation cannot divide every possible graph into two parts, because graphs may contain cycles. Thus, the crossover operator is non-trivial. A steady-state, tournament selection genetic algorithm code (JavaGenes) was written to implement and test the graph crossover operator. All runs were executed by cycle-scavenging on networked workstations using the Condor batch processing system. The JavaGenes code has evolved pharmaceutical drug molecules and simple digital circuits. Results to date suggest that JavaGenes can evolve moderate sized drug molecules and very small circuits in reasonable time. The algorithm has greater difficulty with somewhat larger circuits, suggesting that directed graphs (circuits) are more difficult to evolve than undirected graphs (molecules), although necessary differences in the crossover operator may also explain the results. In principle, JavaGenes should be able to evolve other graph-representable systems, such as transportation networks, metabolic pathways, and computer networks. However, large graphs evolve significantly slower than smaller graphs, presumably because the space-of-all-graphs explodes combinatorially with graph size. Since the representation strongly affects genetic algorithm performance, adding graphs to the evolutionary programmer's bag-of-tricks should be beneficial. Also, since graph evolution operates directly on the phenotype, the genotype-phenotype translation step, common in genetic algorithm work, is eliminated.
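
    A bare-bones illustration (not the JavaGenes operator itself) of why graph crossover is non-trivial: each parent's node set is cut into two fragments, one fragment from each parent is combined into a child, edges internal to a fragment are kept, and the edges severed by the cut leave dangling ends that must be repaired, here by randomly pairing them up.

    ```python
    import random

    def graph_crossover(parent_a, parent_b, rng=None):
        """Parents are undirected graphs given as dicts: node -> set of neighbours.
        Node labels of the two parents are assumed to be disjoint."""
        rng = rng or random.Random(7)
        frag_a = set(rng.sample(sorted(parent_a), len(parent_a) // 2))
        frag_b = set(rng.sample(sorted(parent_b), len(parent_b) // 2))

        child = {node: set() for node in frag_a | frag_b}
        stubs = []  # edge ends severed by the cut, waiting to be re-wired

        for frag, graph in ((frag_a, parent_a), (frag_b, parent_b)):
            for u in frag:
                for v in graph[u]:
                    if v in frag:
                        child[u].add(v)   # edge survives inside the fragment
                    else:
                        stubs.append(u)   # dangling half-edge to repair later

        rng.shuffle(stubs)
        while len(stubs) >= 2:            # repair step: pair up dangling ends
            u, v = stubs.pop(), stubs.pop()
            if u != v:
                child[u].add(v)
                child[v].add(u)
        return child

    a = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
    b = {10: {11}, 11: {10, 12}, 12: {11, 13}, 13: {12}}
    print(graph_crossover(a, b))
    ```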

  19. NASA's new breakup model of evolve 4.0

    NASA Astrophysics Data System (ADS)

    Johnson, N. L.; Krisko, P. H.; Liou, J.-C.; Anz-Meador, P. D.

    2001-01-01

    Analyses of the fragmentation (due to explosions and collisions) of spacecraft and rocket bodies in low Earth orbit (LEO) have been performed this year at NASA/JSC. The overall goals of this study have been to achieve a better understanding of the results of fragmentations on the orbital debris environment and then to implement this understanding into the breakup model of EVOLVE 4.0. The previous breakup model implemented in EVOLVE 3.0 and other long-term orbital debris environment models was known to be inadequate in two major areas. First, it treated all fragmentational debris as spheres of a density which varied as a function of fragment diameter, where diameter was directly related to mass. Second, it underestimated the generation of fragments smaller than 10-cm in the majority of explosions. Without reliable data from both ground tests and on-orbit breakups, these inadequacies were unavoidable. Recent years, however, have brought additional data and related analyses: results of three ground tests, better on-orbit size and mass estimation techniques, more regular orbital tracking and reporting, additional radar resources dedicated to the observation of small objects, and simply a longer time period with which to observe the debris and their decay. Together these studies and data are applied to the reanalysis of the breakup model. In this paper we compare the new breakup model to the old breakup model in detail, including the size distributions for explosions and collisions, the area-to-mass and impact velocity assignments and distributions, and the delta-velocity distributions. These comparisons demonstrate a significantly better understanding of the fragmentation process as compared to previous versions of EVOLVE.

  20. A guide for the design of evolve and resequencing studies.

    PubMed

    Kofler, Robert; Schlötterer, Christian

    2014-02-01

    Standing genetic variation provides a rich reservoir of potentially useful mutations facilitating the adaptation to novel environments. Experimental evolution studies have demonstrated that rapid and strong phenotypic responses to selection can also be obtained in the laboratory. When combined with next-generation sequencing technology, these experiments promise to identify the individual loci contributing to adaptation. Nevertheless, until now, very little has been known about the design of such evolve & resequencing (E&R) studies. Here, we use forward simulations of entire genomes to evaluate different experimental designs that aim to maximize the power to detect selected variants. We show that low linkage disequilibrium in the starting population, population size, duration of the experiment, and the number of replicates are the key factors in determining the power and accuracy of E&R studies. Furthermore, replication of E&R is more important for detecting the targets of selection than increasing the population size. Using an optimized design, beneficial loci with a selective advantage as low as s = 0.005 can be identified at the nucleotide level. Even when a large number of loci are selected simultaneously, up to 56% can be reliably detected without incurring large numbers of false positives. Our computer simulations suggest that, with an adequate experimental design, E&R studies are a powerful tool to identify adaptive mutations from standing genetic variation and thereby provide an excellent means to analyze the trajectories of selected alleles in evolving populations.
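
    A miniature forward simulation in the spirit of the design questions above (not the authors' whole-genome simulator): a beneficial allele at starting frequency p0 is propagated for a number of generations under selection s and binomial drift in several replicate populations, so the spread of end-point frequencies across replicates can be inspected. Parameters are arbitrary.

    ```python
    import numpy as np

    def simulate_replicates(N=1000, p0=0.05, s=0.05, generations=60, replicates=10, seed=2):
        """Wright-Fisher forward simulation of a single selected locus; returns the
        final allele frequency in each replicate population."""
        rng = np.random.default_rng(seed)
        p = np.full(replicates, p0)
        for _ in range(generations):
            p_sel = p * (1 + s) / (1 + s * p)         # deterministic change from selection
            p = rng.binomial(2 * N, p_sel) / (2 * N)  # binomial drift (2N gametes sampled)
        return p

    print("end-point allele frequencies:", np.round(simulate_replicates(), 3))
    ```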

  1. Structural Analysis of an Evolved Transketolase Reveals Divergent Binding Modes

    PubMed Central

    Affaticati, Pierre E.; Dai, Shao-Bo; Payongsri, Panwajee; Hailes, Helen C.; Tittmann, Kai; Dalby, Paul A.

    2016-01-01

    The S385Y/D469T/R520Q variant of E. coli transketolase was evolved previously with three successive smart libraries, each guided by different structural, bioinformatical or computational methods. Substrate-walking progressively shifted the target acceptor substrate from phosphorylated aldehydes, towards a non-phosphorylated polar aldehyde, a non-polar aliphatic aldehyde, and finally a non-polar aromatic aldehyde. Kinetic evaluations on three benzaldehyde derivatives, suggested that their active-site binding was differentially sensitive to the S385Y mutation. Docking into mutants generated in silico from the wild-type crystal structure was not wholly satisfactory, as errors accumulated with successive mutations, and hampered further smart-library designs. Here we report the crystal structure of the S385Y/D469T/R520Q variant, and molecular docking of three substrates. This now supports our original hypothesis that directed-evolution had generated an evolutionary intermediate with divergent binding modes for the three aromatic aldehydes tested. The new active site contained two binding pockets supporting π-π stacking interactions, sterically separated by the D469T mutation. While 3-formylbenzoic acid (3-FBA) preferred one pocket, and 4-FBA the other, the less well-accepted substrate 3-hydroxybenzaldehyde (3-HBA) was caught in limbo with equal preference for the two pockets. This work highlights the value of obtaining crystal structures of evolved enzyme variants, for continued and reliable use of smart library strategies. PMID:27767080

  2. An Evolvable Multi-Agent Approach to Space Operations Engineering

    NASA Technical Reports Server (NTRS)

    Mandutianu, Sanda; Stoica, Adrian

    1999-01-01

    A complex system of spacecraft and ground tracking stations, as well as a constellation of satellites or spacecraft, has to be able to reliably withstand sudden environment changes, resource fluctuations, dynamic resource configuration, limited communication bandwidth, etc., while maintaining the consistency of the system as a whole. It is not known in advance when a change in the environment might occur or when a particular exchange will happen. A higher degree of sophistication for the communication mechanisms between different parts of the system is required. The actual behavior has to be determined while the system is performing and the course of action can be decided at the individual level. Under such circumstances, the solution will highly benefit from increased on-board and on the ground adaptability and autonomy. An evolvable architecture based on intelligent agents that communicate and cooperate with each other can offer advantages in this direction. This paper presents an architecture of an evolvable agent-based system (software and software/hardware hybrids) as well as some plans for further implementation.

  3. A Stefan problem on an evolving surface

    PubMed Central

    Alphonse, Amal; Elliott, Charles M.

    2015-01-01

    We formulate a Stefan problem on an evolving hypersurface and study the well posedness of weak solutions given L1 data. To do this, we first develop function spaces and results to handle equations on evolving surfaces in order to give a natural treatment of the problem. Then, we consider the existence of solutions for data; this is done by regularization of the nonlinearity. The regularized problem is solved by a fixed point theorem and then uniform estimates are obtained in order to pass to the limit. By using a duality method, we show continuous dependence, which allows us to extend the results to L1 data. PMID:26261364

  4. How the first biopolymers could have evolved.

    PubMed Central

    Abkevich, V I; Gutin, A M; Shakhnovich, E I

    1996-01-01

    In this work, we discuss a possible origin of the first biopolymers with stable unique structures. We suggest that at the prebiotic stage of evolution, long organic polymers had to be compact to avoid hydrolysis and had to be soluble and thus must not be exceedingly hydrophobic. We present an algorithm that generates such sequences for model proteins. The evolved sequences turn out to have a stable unique structure, into which they quickly fold. This result illustrates the idea that the unique three-dimensional native structures of first biopolymers could have evolved as a side effect of nonspecific physicochemical factors acting at the prebiotic stage of evolution. PMID:8570645

  5. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  6. Orbiter Autoland reliability analysis

    NASA Technical Reports Server (NTRS)

    Welch, D. Phillip

    1993-01-01

    The Space Shuttle Orbiter is the only space reentry vehicle in which the crew is seated upright. This position presents some physiological effects requiring countermeasures to prevent a crewmember from becoming incapacitated. This also introduces a potential need for automated vehicle landing capability. Autoland is a primary procedure that was identified as a requirement for landing following an extended duration orbiter mission. This report documents the results of the reliability analysis performed on the hardware required for an automated landing. A reliability block diagram was used to evaluate system reliability. The analysis considers the manual and automated landing modes currently available on the Orbiter. (Autoland is presently a backup system only.) Results of this study indicate a +/- 36 percent probability of successfully extending a nominal mission to 30 days. Enough variations were evaluated to verify that the reliability could be altered with mission planning and procedures. If the crew is modeled as being fully capable after 30 days, the probability of a successful manual landing is comparable to that of Autoland because much of the hardware is used for both manual and automated landing modes. The analysis indicates that the reliability for the manual mode is limited by the hardware and depends greatly on crew capability. Crew capability for a successful landing after 30 days has not been determined yet.

  7. Photovoltaic module reliability workshop

    SciTech Connect

    Mrig, L.

    1990-01-01

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of Workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986--1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if the PV technology is to make a major impact in the power generation market and compete with conventional electricity producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years as evidenced by warranties available on commercial modules of as long as 12 years. However, there is still a need for substantial research and testing to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the US, PV manufacturers, DOE laboratories, electric utilities and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  8. Surveying The Digital Landscape: Evolving Technologies 2004. The EDUCAUSE Evolving Technologies Committee

    ERIC Educational Resources Information Center

    EDUCAUSE Review, 2004

    2004-01-01

    Each year, the members of the EDUCAUSE Evolving Technologies Committee identify and research the evolving technologies that are having the most direct impact on higher education institutions. The committee members choose the relevant topics, write white papers, and present their findings at the EDUCAUSE annual conference. This year, under the…

  9. Software reliability perspectives

    NASA Technical Reports Server (NTRS)

    Wilson, Larry; Shen, Wenhui

    1987-01-01

    Software which is used in life critical functions must be known to be highly reliable before installation. This requires a strong testing program to estimate the reliability, since neither formal methods, software engineering nor fault tolerant methods can guarantee perfection. Prior to final testing, software goes through a debugging period, and many models have been developed to try to estimate reliability from the debugging data. However, the existing models are poorly validated and often give poor performance. This paper emphasizes the fact that part of their failures can be attributed to the random nature of the debugging data given to these models as input, and it poses the problem of correcting this defect as an area of future research.

  10. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  11. Project Evolve User-Adopter Manual.

    ERIC Educational Resources Information Center

    Joiner, Lee M.

    An adult basic education (ABE) program for mentally retarded young adults between the ages of 14 and 26 years, Project Evolve can provide education agencies for educationally handicapped children with detailed information concerning an innovative program. The manual format was developed through interviews with professional educators concerning the…

  12. The Evolving Leadership Path of Visual Analytics

    SciTech Connect

    Kluse, Michael; Peurrung, Anthony J.; Gracio, Deborah K.

    2012-01-02

    This is a requested book chapter for an internationally authored book on visual analytics and related fields, coordinated by a UK university and to be published by Springer in 2012. This chapter is an overview of the leadership strategies that PNNL's Jim Thomas and other stakeholders used to establish visual analytics as a field, and how those strategies may evolve in the future.

  13. A Course Evolves-Physical Anthropology.

    ERIC Educational Resources Information Center

    O'Neil, Dennis

    2001-01-01

    Describes the development of an online physical anthropology course at Palomar College (California) that evolved from online tutorials. Discusses the ability to update materials on the Web more quickly than in traditional textbooks; creating Web pages that are readable by most Web browsers; test security issues; and clarifying ownership of online…

  14. Evolving dimensions in medical case reporting

    PubMed Central

    2011-01-01

    Medical case reports (MCRs) have been undervalued in the literature to date. It seems that while case series emphasize what is probable, case reports describe what is possible and what can go wrong. MCRs transfer medical knowledge and act as educational tools. We outline evolving aspects of the MCR in current practice. PMID:21524284

  15. Antibody therapeutics - the evolving patent landscape.

    PubMed

    Petering, Jenny; McManamny, Patrick; Honeyman, Jane

    2011-09-01

    The antibody patent landscape has evolved dramatically over the past 30 years, particularly in areas of technology relating to antibody modification to reduce immunogenicity in humans or improve antibody function. In some cases antibody techniques that were developed in the 1980s are still the subject of patent protection in the United States or Canada.

  16. The Evolving Office of the Registrar

    ERIC Educational Resources Information Center

    Pace, Harold L.

    2011-01-01

    A healthy registrar's office will continue to evolve as it considers student, faculty, and institutional needs; staff talents and expectations; technological opportunities; economic realities; space issues; work environments; and where the strategic plan is taking the institution in support of the mission. Several recognized leaders in the field…

  17. Apollo 16 Evolved Lithology Sodic Ferrogabbro

    NASA Technical Reports Server (NTRS)

    Zeigler, Ryan; Jolliff, B. L.; Korotev, R. L.

    2014-01-01

    Evolved lunar igneous lithologies, often referred to as the alkali suite, are a minor but important component of the lunar crust. These evolved samples are rich in incompatible elements and are, not surprisingly, most common at the Apollo sites in (or near) the incompatible-element-rich region of the Moon known as the Procellarum KREEP Terrane (PKT). The most commonly occurring lithologies are granites (A12, A14, A15, A17), monzogabbro (A14, A15), alkali anorthosites (A12, A14), and KREEP basalts (A15, A17). The Feldspathic Highlands Terrane is not entirely devoid of evolved lithologies, and rare clasts of alkali gabbronorite and sodic ferrogabbro (SFG) have been identified in Apollo 16 station 11 breccias 67915 and 67016. Curiously, nearly all pristine evolved lithologies have been found as small clasts or soil particles, exceptions being KREEP basalts 15382/6 and granitic sample 12013 (which is itself a breccia). Here we reexamine the petrography and geochemistry of two SFG-like particles found in a survey of Apollo 16 2-4 mm particles from the Cayley Plains, 62283,7-15 and 62243,10-3 (hereafter 7-15 and 10-3, respectively). We will compare these to previously reported SFG samples, including recent analyses of the type specimen of SFG from lunar breccia 67915.

  18. Gearbox Reliability Collaborative Update (Presentation)

    SciTech Connect

    Sheng, S.

    2013-10-01

    This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.

  19. Materials reliability issues in microelectronics

    SciTech Connect

    Lloyd, J.R. ); Yost, F.G. ); Ho, P.S. )

    1991-01-01

    This book covers the proceedings of a MRS symposium on materials reliability in microelectronics. Topics include: electromigration; stress effects on reliability; stress and packaging; metallization; device, oxide and dielectric reliability; new investigative techniques; and corrosion.

  20. Reliability Degradation Due to Stockpile Aging

    SciTech Connect

    Robinson, David G.

    1999-04-01

    The objective of this research is the investigation of alternative methods for characterizing the reliability of systems with time dependent failure modes associated with stockpile aging. Reference to 'reliability degradation' has, unfortunately, come to be associated with all types of aging analyses: both deterministic and stochastic. In this research, in keeping with the true theoretical definition, reliability is defined as a probabilistic description of system performance as a function of time. Traditional reliability methods used to characterize stockpile reliability depend on the collection of a large number of samples or observations. Clearly, after the experiments have been performed and the data has been collected, critical performance problems can be identified. A major goal of this research is to identify existing methods and/or develop new mathematical techniques and computer analysis tools to anticipate stockpile problems before they become critical issues. One of the most popular methods for characterizing the reliability of components, particularly electronic components, assumes that failures occur in a completely random fashion, i.e. uniformly across time. This method is based primarily on the use of constant failure rates for the various elements that constitute the weapon system, i.e. the systems do not degrade while in storage. Experience has shown that predictions based upon this approach should be regarded with great skepticism since the relationship between the life predicted and the observed life has been difficult to validate. In addition to this fundamental problem, the approach does not recognize that there are time dependent material properties and variations associated with the manufacturing process and the operational environment. To appreciate the uncertainties in predicting system reliability a number of alternative methods are explored in this report. All of the methods are very different from those currently used to assess stockpile
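
    A short numeric contrast, with invented parameters, of the two views discussed above: the constant-failure-rate (exponential) model predicts the same reliability decay regardless of age, while a Weibull model with shape greater than 1 captures aging-driven degradation whose hazard increases over time.

    ```python
    import numpy as np

    def exponential_reliability(t, lam):
        """Constant failure rate: R(t) = exp(-lam * t)."""
        return np.exp(-lam * t)

    def weibull_reliability(t, eta, beta):
        """Weibull: R(t) = exp(-(t/eta)**beta); beta > 1 means an increasing hazard."""
        return np.exp(-((t / eta) ** beta))

    lam = 1.0 / 40.0         # hypothetical constant rate (per year)
    eta, beta = 35.0, 2.5    # hypothetical Weibull scale (years) and shape

    for t in (5.0, 10.0, 20.0, 30.0):
        print(f"t = {t:4.0f} y   exponential R = {exponential_reliability(t, lam):.3f}   "
              f"Weibull R = {weibull_reliability(t, eta, beta):.3f}")
    ```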

  1. Quantifying Human Performance Reliability.

    ERIC Educational Resources Information Center

    Askren, William B.; Regulinski, Thaddeus L.

    Human performance reliability for tasks in the time-space continuous domain is defined and a general mathematical model presented. The human performance measurement terms time-to-error and time-to-error-correction are defined. The model and measurement terms are tested using laboratory vigilance and manual control tasks. Error and error-correction…

  2. Grid reliability management tools

    SciTech Connect

    Eto, J.; Martinez, C.; Dyer, J.; Budhraja, V.

    2000-10-01

    To summarize, the Consortium for Electric Reliability Technology Solutions (CERTS) is engaged in a multi-year program of public interest R&D to develop and prototype software tools that will enhance system reliability during the transition to competitive markets. The core philosophy embedded in the design of these tools is the recognition that in the future reliability will be provided through market operations, not the decisions of central planners. Embracing this philosophy calls for tools that: (1) Recognize that the game has moved from machine modeling and engineering analysis to simulating markets to understand the impacts on reliability (and vice versa); (2) Provide real-time data and support information transparency toward enhancing the ability of operators and market participants to quickly grasp, analyze, and act effectively on information; (3) Allow operators, in particular, to measure, monitor, assess, and predict both system performance as well as the performance of market participants; and (4) Allow rapid incorporation of the latest sensing, data communication, computing, visualization, and algorithmic techniques and technologies.

  3. Parametric Mass Reliability Study

    NASA Technical Reports Server (NTRS)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon boards of PCBs, typically are the most reliable. Meanwhile, components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs to reliability, as well as the mass of ORU subcomponents to reliability.
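
    A minimal sketch of one way such a parametric relation could be fitted, assuming, purely hypothetically, a power-law trend between subcomponent mass and failure rate; the data points are invented placeholders, not ISS ORU data.

    ```python
    import numpy as np

    # Hypothetical (mass in kg, failure rate per million hours) pairs.
    mass = np.array([0.05, 0.2, 1.0, 5.0, 20.0, 80.0])
    rate = np.array([12.0, 9.0, 4.0, 2.5, 1.2, 0.6])

    # Fit rate = a * mass**b by linear regression in log-log space.
    b, log_a = np.polyfit(np.log(mass), np.log(rate), 1)
    a = np.exp(log_a)
    print(f"fitted model: rate ≈ {a:.2f} * mass^{b:.2f}")

    # Predicted failure rate of a hypothetical 10 kg subcomponent.
    print(f"predicted rate at 10 kg ≈ {a * 10.0 ** b:.2f} per 1e6 h")
    ```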

  4. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world, thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost effective manner. The context of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data which is then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens which are in AIR-LAB to measure the performance of reliability models.
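
    A compact sketch of the kind of replication experiment described: inter-failure times are simulated repeatedly from a Jelinski-Moranda-style model (the assumption behind the Basic model that the failure rate drops by a fixed amount with each fault removed), so the spread of the simulated datasets across replications can be examined. Parameters are arbitrary and the Log-Poisson case is not shown.

    ```python
    import numpy as np

    def simulate_debugging(n_faults=30, phi=0.02, rng=None):
        """Jelinski-Moranda-style data: the i-th inter-failure time is exponential
        with rate phi * (number of faults still present in the code)."""
        if rng is None:
            rng = np.random.default_rng()
        remaining = np.arange(n_faults, 0, -1)              # N, N-1, ..., 1 faults left
        return rng.exponential(scale=1.0 / (phi * remaining))

    rng = np.random.default_rng(5)
    replicates = [simulate_debugging(rng=rng) for _ in range(20)]
    totals = [r.sum() for r in replicates]
    print(f"total debugging time over 20 replications: "
          f"mean = {np.mean(totals):.0f} h, std = {np.std(totals):.0f} h")
    ```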

  5. Reliable solar cookers

    SciTech Connect

    Magney, G.K.

    1992-12-31

    The author describes the activities of SERVE, a Christian relief and development agency, to introduce solar ovens to the Afghan refugees in Pakistan. It has provided 5,000 solar cookers since 1984. The experience has demonstrated the potential of the technology and the need for a durable and reliable product. Common complaints about the cookers are discussed and the ideal cooker is described.

  6. Bioharness™ Multivariable Monitoring Device: Part. II: Reliability

    PubMed Central

    Johnstone, James A.; Ford, Paul A.; Hughes, Gerwyn; Watson, Tim; Garrett, Andrew T.

    2012-01-01

    The Bioharness™ monitoring system may provide physiological information on human performance, but the reliability of these data is fundamental for confidence in the equipment being used. The objective of this study was to assess the reliability of each of the 5 Bioharness™ variables using a treadmill-based protocol. 10 healthy males participated. A between- and within-subject design to assess the reliability of heart rate (HR), breathing frequency (BF), accelerometry (ACC) and infra-red skin temperature (ST) was completed via a repeated, discontinuous, incremental treadmill protocol. Posture (P) was assessed by a tilt table moved through 160°. Between-subject data reported a low coefficient of variation (CV) and strong correlations (r) for ACC and P (CV < 7.6; r = 0.99, p < 0.01). In contrast, HR and BF (CV ~19.4; r ~0.70, p < 0.01) and ST (CV 3.7; r = 0.61, p < 0.01) presented more variable data. Intra- and inter-device data presented strong relationships (r > 0.89, p < 0.01) and low CV (<10.1) for HR, ACC, P and ST. BF produced weaker relationships (r < 0.72) and higher CV (<17.4). In comparison to the other variables, BF consistently presented less reliability. Overall, the results suggest that the Bioharness™ is a reliable multivariable monitoring device during laboratory testing within the limits presented. Key points: (1) heart rate and breathing frequency data increased in variance at higher velocities (i.e., ≥ 10 km.h-1); (2) in comparison to the between-subject testing, the intra- and inter-device reliability was good, suggesting that the placement or position of the device relative to the performer could be important for data collection; (3) understanding a device's variability in measurement is important before it can be used within an exercise testing or monitoring setting. PMID:24149347
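
    The two reliability statistics quoted above, coefficient of variation and Pearson's r, can be reproduced for a test-retest pair with a few lines of code; the numbers below are synthetic placeholders, not the study's data.

```python
import math, statistics

# Illustrative sketch with synthetic trial/retrial values (e.g. heart rate
# across incremental treadmill stages), not the Bioharness study's data.
trial1 = [62, 75, 88, 101, 115, 128, 140, 152, 166, 178]
trial2 = [60, 78, 85, 104, 112, 131, 138, 155, 162, 181]

def cv_percent(values):
    """Coefficient of variation as a percentage of the mean."""
    return 100.0 * statistics.pstdev(values) / statistics.mean(values)

def pearson_r(x, y):
    """Pearson product-moment correlation between paired measurements."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

print(f"CV trial 1: {cv_percent(trial1):.1f}%   CV trial 2: {cv_percent(trial2):.1f}%")
print(f"between-trial Pearson r: {pearson_r(trial1, trial2):.3f}")
```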

  7. The Ames Philosophical Belief Inventory: Reliability and Validity

    ERIC Educational Resources Information Center

    Sawyer, R. N.

    1971-01-01

    This study investigated the reliability and validity of the Philosophical Belief Inventory (PBI). With the exception of the relationship between idealism and pragmatism and realism and existentialism, the PBI scales appear to be assessing independent facets of belief. (Author)

  8. Conceptualizing Essay Tests' Reliability and Validity: From Research to Theory

    ERIC Educational Resources Information Center

    Badjadi, Nour El Imane

    2013-01-01

    The current paper on writing assessment surveys the literature on the reliability and validity of essay tests. The paper aims to examine the two concepts in relationship with essay testing as well as to provide a snapshot of the current understandings of the reliability and validity of essay tests as drawn in recent research studies. Bearing in…

  9. An Evolved Orthogonal Enzyme/Cofactor Pair.

    PubMed

    Reynolds, Evan W; McHenry, Matthew W; Cannac, Fabien; Gober, Joshua G; Snow, Christopher D; Brustad, Eric M

    2016-09-28

    We introduce a strategy that expands the functionality of hemoproteins through orthogonal enzyme/heme pairs. By exploiting the ability of a natural heme transport protein, ChuA, to promiscuously import heme derivatives, we have evolved a cytochrome P450 (P450BM3) that selectively incorporates a nonproteinogenic cofactor, iron deuteroporphyrin IX (Fe-DPIX), even in the presence of endogenous heme. Crystal structures show that selectivity gains are due to mutations that introduce steric clash with the heme vinyl groups while providing a complementary binding surface for the smaller Fe-DPIX cofactor. Furthermore, the evolved orthogonal enzyme/cofactor pair is active in non-natural carbenoid-mediated olefin cyclopropanation. This methodology for the generation of orthogonal enzyme/cofactor pairs promises to expand cofactor diversity in artificial metalloenzymes.

  10. Evolving neural networks through augmenting topologies.

    PubMed

    Stanley, Kenneth O; Miikkulainen, Risto

    2002-01-01

    An important question in neuroevolution is how to gain an advantage from evolving neural network topologies along with weights. We present a method, NeuroEvolution of Augmenting Topologies (NEAT), which outperforms the best fixed-topology method on a challenging benchmark reinforcement learning task. We claim that the increased efficiency is due to (1) employing a principled method of crossover of different topologies, (2) protecting structural innovation using speciation, and (3) incrementally growing from minimal structure. We test this claim through a series of ablation studies that demonstrate that each component is necessary to the system as a whole and to each other. What results is significantly faster learning. NEAT is also an important contribution to GAs because it shows how it is possible for evolution to both optimize and complexify solutions simultaneously, offering the possibility of evolving increasingly complex solutions over generations, and strengthening the analogy with biological evolution. PMID:12180173
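
    The first NEAT ingredient listed above, principled crossover of different topologies, rests on aligning connection genes by historical innovation number. The toy sketch below illustrates that alignment only; the gene fields and the rule of inheriting unmatched genes from the fitter parent follow the usual description of NEAT, but this is not the authors' implementation.

```python
import random

# Minimal illustration of innovation-number alignment during NEAT crossover:
# matching genes are picked at random from either parent, while disjoint and
# excess genes are inherited from the fitter parent.
random.seed(0)

def make_gene(innov, src, dst, weight, enabled=True):
    return {"innov": innov, "src": src, "dst": dst, "w": weight, "on": enabled}

def crossover(fitter, weaker):
    """Offspring genome built by aligning genes on their innovation numbers."""
    weaker_by_innov = {g["innov"]: g for g in weaker}
    child = []
    for gene in fitter:
        match = weaker_by_innov.get(gene["innov"])
        pick = random.choice([gene, match]) if match else gene
        child.append(dict(pick))
    return child

parent_a = [make_gene(1, 0, 2, 0.5), make_gene(2, 1, 2, -0.3), make_gene(4, 0, 3, 0.9)]
parent_b = [make_gene(1, 0, 2, 0.7), make_gene(3, 1, 3, 0.2)]
child = crossover(parent_a, parent_b)          # parent_a assumed fitter
print([(g["innov"], round(g["w"], 2)) for g in child])
```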

  11. Evolving neural models of path integration.

    PubMed

    Vickerstaff, R J; Di Paolo, E A

    2005-09-01

    We use a genetic algorithm to evolve neural models of path integration, with particular emphasis on reproducing the homing behaviour of Cataglyphis fortis ants. This is done within the context of a complete model system, including an explicit representation of the animal's movements within its environment. We show that it is possible to produce a neural network without imposing a priori any particular system for the internal representation of the animal's home vector. The best evolved network obtained is analysed in detail and is found to resemble the bicomponent model of Mittelstaedt. Because of the presence of leaky integration, the model can reproduce the systematic navigation errors found in desert ants. The model also naturally mimics the searching behaviour that ants perform once they have reached their estimate of the nest location. The results support possible roles for leaky integration and cosine-shaped compass response functions in path integration.
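
    A leaky home-vector integrator with cosine/sine compass components, the two features highlighted in the analysis, can be sketched as follows; the leak factor, step sizes, and outbound path are illustrative assumptions, not the evolved network itself.

```python
import math

# Illustrative leaky path integrator: the stored home vector (hx, hy) decays
# slightly each step while accumulating the ant's displacement, so the
# estimate systematically under-shoots the true home vector.
LEAK = 0.995          # assumed per-step retention factor (< 1 => leaky)

def update_home_vector(hx, hy, heading, step_len):
    # cosine/sine compass components of the current step
    dx, dy = step_len * math.cos(heading), step_len * math.sin(heading)
    return LEAK * hx + dx, LEAK * hy + dy

# Outbound path: 100 unit steps east, then 100 unit steps north.
hx = hy = 0.0
for _ in range(100):
    hx, hy = update_home_vector(hx, hy, heading=0.0, step_len=1.0)
for _ in range(100):
    hx, hy = update_home_vector(hx, hy, heading=math.pi / 2, step_len=1.0)

# The leak makes the stored vector shorter than the true (100, 100)
# displacement, mimicking the systematic errors seen in desert ants.
print("stored home vector:", round(hx, 1), round(hy, 1), " true: 100.0 100.0")
```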

  12. Evolved gas analysis of secondary organic aerosols

    SciTech Connect

    Grosjean, D.; Williams, E.L. II; Grosjean, E.; Novakov, T.

    1994-11-01

    Secondary organic aerosols have been characterized by evolved gas analysis (EGA). Hydrocarbons selected as aerosol precursors were representative of anthropogenic emissions (cyclohexene, cyclopentene, 1-decene and 1-dodecene, n-dodecane, o-xylene, and 1,3,5-trimethylbenzene) and of biogenic emissions (the terpenes α-pinene, β-pinene and d-limonene and the sesquiterpene trans-caryophyllene). Also analyzed by EGA were samples of secondary, primary (highway tunnel), and ambient (urban) aerosols before and after exposure to ozone and other photochemical oxidants. The major features of the EGA thermograms (amount of CO2 evolved as a function of temperature) are described. The usefulness and limitations of EGA data for source apportionment of atmospheric particulate carbon are briefly discussed. 28 refs., 7 figs., 4 tabs.

  13. The evolving definition of systemic arterial hypertension.

    PubMed

    Ram, C Venkata S; Giles, Thomas D

    2010-05-01

    Systemic hypertension is an important risk factor for premature cardiovascular disease. Hypertension also contributes to excessive morbidity and mortality. Whereas excellent therapeutic options are available to treat hypertension, there is an unsettled issue about the very definition of hypertension. At what level of blood pressure should we treat hypertension? Does the definition of hypertension change in the presence of co-morbid conditions? This article covers in detail the evolving concepts in the diagnosis and management of hypertension.

  14. Quantum games on evolving random networks

    NASA Astrophysics Data System (ADS)

    Pawela, Łukasz

    2016-09-01

    We study the advantages of quantum strategies in evolutionary social dilemmas on evolving random networks. We focus our study on the two-player games: prisoner's dilemma, snowdrift and stag-hunt games. The obtained results show the benefits of quantum strategies for the prisoner's dilemma game. For the other two games, we obtain regions of parameters where the quantum strategies dominate, as well as regions where the classical strategies coexist.

  15. Evolving networks-Using past structure to predict the future

    NASA Astrophysics Data System (ADS)

    Shang, Ke-ke; Yan, Wei-sheng; Small, Michael

    2016-08-01

    Many previous studies on link prediction have focused on using common neighbors to predict the existence of links between pairs of nodes. More broadly, research into the structural properties of evolving temporal networks and temporal link prediction methods has recently attracted increasing attention. In this study, for the first time, we examine the use of links between a pair of nodes to predict their common neighbors and analyze the relationship between weight and structure in static networks, evolving networks, and the corresponding randomized networks. We propose new unweighted and weighted prediction methods and use six kinds of real networks to test our algorithms. In unweighted networks, we find that if a pair of nodes are connected in the current network, they have a higher probability of connecting to common nodes in both the current and the future networks, and this probability decreases as the number of neighbors increases. Furthermore, we find that the original networks have particular structural and statistical characteristics that benefit link prediction. In weighted networks, the prediction performance for networks dominated by human factors decreases as weight decreases and is in general better in static networks. Furthermore, we find that geographical position and link weight both have a significant influence on the transport network. Moreover, the evolving financial network has the lowest predictability. In addition, we find that the structure of non-social networks is more robust than that of social networks, and the structure of engineering networks has both the best predictability and the best robustness.
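
    The core quantity the study works with, the number of common neighbours of a node pair, and its contrast between linked and unlinked pairs can be illustrated on a toy graph as below; the graph is synthetic and not one of the six real networks used in the paper.

```python
from itertools import combinations
from statistics import mean

# Toy undirected graph: a small clique with a sparse tail attached.
edges = {(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3), (3, 4), (4, 5)}
nodes = {n for e in edges for n in e}
adj = {n: set() for n in nodes}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

def common_neighbours(a, b):
    return len(adj[a] & adj[b])

# Compare pairs that are already linked with pairs that are not.
linked, unlinked = [], []
for a, b in combinations(sorted(nodes), 2):
    bucket = linked if (a, b) in edges or (b, a) in edges else unlinked
    bucket.append(common_neighbours(a, b))

print("mean common neighbours, linked pairs:  ", round(mean(linked), 2))
print("mean common neighbours, unlinked pairs:", round(mean(unlinked), 2))
```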

  16. Emergence of complexity in evolving niche-model food webs.

    PubMed

    Guill, Christian; Drossel, Barbara

    2008-03-01

    We have analysed mechanisms that promote the emergence of complex structures in evolving model food webs. The niche model is used to determine predator-prey relationships. Complexity is measured by species richness as well as trophic level structure and link density. Adaptive dynamics that allow predators to concentrate on the prey species they are best adapted to lead to a strong increase in species number but have only a small effect on the number and relative occupancy of trophic levels. The density of active links also remains small but a high number of potential links allows the network to adjust to changes in the species composition (emergence and extinction of species). Incorporating effects of body size on individual metabolism leads to a more complex trophic level structure: both the maximum and the average trophic level increase. So does the density of active links. Taking body size effects into consideration does not have a measurable influence on species richness. If species are allowed to adjust their foraging behaviour, the complexity of the evolving networks can also be influenced by the size of the external resources. The larger the resources, the larger and more complex is the food web it can sustain. Body size effects and increasing resources do not change size and the simple structure of the evolving networks if adaptive foraging is prohibited. This leads to the conclusion that in the framework of the niche model adaptive foraging is a necessary but not sufficient condition for the emergence of complex networks. It is found that despite the stabilising effect of foraging adaptation the system displays elements of self-organised critical behaviour.
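
    The predator-prey links in this framework come from the niche model; a sketch of the standard niche-model construction (niche value, feeding range, and range centre per species) is given below for a small community. The species count and target connectance are arbitrary, and the evolutionary and adaptive-foraging dynamics of the paper are not reproduced.

```python
import random

# Standard niche-model construction (Williams & Martinez style): each species
# gets a niche value, a beta-distributed feeding range, and a range centre;
# species i eats every species whose niche value falls inside its range.
random.seed(2)
S, C = 20, 0.15                      # species count and target connectance
beta = 1.0 / (2.0 * C) - 1.0         # makes the expected connectance equal C

niche = sorted(random.random() for _ in range(S))            # niche values n_i
links = set()
for i, n_i in enumerate(niche):
    r_i = n_i * random.betavariate(1.0, beta)                # feeding range
    c_i = random.uniform(r_i / 2.0, n_i)                     # range centre
    for j, n_j in enumerate(niche):
        if c_i - r_i / 2.0 <= n_j <= c_i + r_i / 2.0:
            links.add((i, j))                                # i eats j

print(f"species: {S}, links: {len(links)}, connectance: {len(links) / S**2:.3f}")
```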

  17. Reliability of photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1986-01-01

    In order to assess the reliability of photovoltaic modules, four categories of known array failure and degradation mechanisms are discussed, and target reliability allocations have been developed within each category based on the available technology and the life-cycle-cost requirements of future large-scale terrestrial applications. Cell-level failure mechanisms associated with open-circuiting or short-circuiting of individual solar cells generally arise from cell cracking or the fatigue of cell-to-cell interconnects. Power degradation mechanisms considered include gradual power loss in cells, light-induced effects, and module optical degradation. Module-level failure mechanisms and life-limiting wear-out mechanisms are also explored.

  18. Reliability and durability problems

    NASA Astrophysics Data System (ADS)

    Bojtsov, B. V.; Kondrashov, V. Z.

    The papers presented in this volume focus on methods for determining the stress-strain state of structures and machines and evaluating their reliability and service life. Specific topics discussed include a method for estimating the service life of thin-sheet automotive structures, stressed state at the tip of small cracks in anisotropic plates under biaxial tension, evaluation of the elastic-dissipative characteristics of joints by vibrational diagnostics methods, and calculation of the reliability of ceramic structures for arbitrary long-term loading programs. Papers are also presented on the effect of prior plastic deformation on fatigue damage kinetics, axisymmetric and local deformation of cylindrical parts during finishing-hardening treatments, and adhesion of polymers to diffusion coatings on steels.

  19. Human Reliability Program Workshop

    SciTech Connect

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  20. Reliable broadcast protocols

    NASA Technical Reports Server (NTRS)

    Joseph, T. A.; Birman, Kenneth P.

    1989-01-01

    A number of broadcast protocols that are reliable subject to a variety of ordering and delivery guarantees are considered. Developing applications that are distributed over a number of sites and/or must tolerate the failures of some of them becomes a considerably simpler task when such protocols are available for communication. Without such protocols the kinds of distributed applications that can reasonably be built will have a very limited scope. As the trend towards distribution and decentralization continues, it will not be surprising if reliable broadcast protocols have the same role in distributed operating systems of the future that message passing mechanisms have in the operating systems of today. On the other hand, the problems of engineering such a system remain large. For example, deciding which protocol is the most appropriate to use in a certain situation or how to balance the latency-communication-storage costs is not an easy question.

  1. Space Shuttle Propulsion System Reliability

    NASA Technical Reports Server (NTRS)

    Welzyn, Ken; VanHooser, Katherine; Moore, Dennis; Wood, David

    2011-01-01

    This session includes the following sessions: (1) External Tank (ET) System Reliability and Lessons, (2) Space Shuttle Main Engine (SSME), Reliability Validated by a Million Seconds of Testing, (3) Reusable Solid Rocket Motor (RSRM) Reliability via Process Control, and (4) Solid Rocket Booster (SRB) Reliability via Acceptance and Testing.

  2. Continuous evaluation of evolving behavioral intervention technologies.

    PubMed

    Mohr, David C; Cheung, Ken; Schueller, Stephen M; Hendricks Brown, C; Duan, Naihua

    2013-10-01

    Behavioral intervention technologies (BITs) are web-based and mobile interventions intended to support patients and consumers in changing behaviors related to health, mental health, and well-being. BITs are provided to patients and consumers in clinical care settings and commercial marketplaces, frequently with little or no evaluation. Current evaluation methods, including RCTs and implementation studies, can require years to validate an intervention. This timeline is fundamentally incompatible with the BIT environment, where technology advancement and changes in consumer expectations occur quickly, necessitating rapidly evolving interventions. However, BITs can routinely and iteratively collect data in a planned and strategic manner and generate evidence through systematic prospective analyses, thereby creating a system that can "learn." A methodologic framework, Continuous Evaluation of Evolving Behavioral Intervention Technologies (CEEBIT), is proposed that can support the evaluation of multiple BITs or evolving versions, eliminating those that demonstrate poorer outcomes, while allowing new BITs to be entered at any time. CEEBIT could be used to ensure the effectiveness of BITs provided through deployment platforms in clinical care organizations or BIT marketplaces. The features of CEEBIT are described, including criteria for the determination of inferiority, determination of BIT inclusion, methods of assigning consumers to BITs, definition of outcomes, and evaluation of the usefulness of the system. CEEBIT offers the potential to collapse initial evaluation and postmarketing surveillance, providing ongoing assurance of safety and efficacy to patients and consumers, payers, and policymakers. PMID:24050429

  3. Transistor Level Circuit Experiments using Evolvable Hardware

    NASA Technical Reports Server (NTRS)

    Stoica, A.; Zebulum, R. S.; Keymeulen, D.; Ferguson, M. I.; Daud, Taher; Thakoor, A.

    2005-01-01

    The Jet Propulsion Laboratory (JPL) performs research in fault tolerant, long life, and space survivable electronics for the National Aeronautics and Space Administration (NASA). With that focus, JPL has been involved in Evolvable Hardware (EHW) technology research for the past several years. We have advanced the technology not only by simulation and evolution experiments, but also by designing, fabricating, and evolving a variety of transistor-based analog and digital circuits at the chip level. EHW refers to self-configuration of electronic hardware by evolutionary/genetic search mechanisms, thereby maintaining existing functionality in the presence of degradations due to aging, temperature, and radiation. In addition, EHW has the capability to reconfigure itself for new functionality when required for mission changes or encountered opportunities. Evolution experiments are performed using a genetic algorithm running on a DSP as the reconfiguration mechanism, controlling the evolvable hardware mounted on a self-contained circuit board. Rapid reconfiguration allows convergence to circuit solutions in times on the order of seconds. The paper illustrates hardware evolution results for electronic circuits and their ability to perform at a temperature of 230 C as well as under radiation doses of up to 250 kRad.
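
    The reconfiguration loop described above can be caricatured in software as a genetic algorithm over switch-configuration bitstrings, as in the sketch below; the fitness function is a hypothetical stand-in, since in the JPL experiments fitness comes from measuring the response of the physical chip.

```python
import random

# Highly simplified, software-only caricature of evolving a reconfigurable
# circuit: a GA searches over switch-configuration bitstrings. The "target"
# configuration and fitness measure are hypothetical placeholders.
random.seed(3)
N_SWITCHES, POP, GENS, MUT = 32, 40, 60, 0.02
TARGET = [random.randint(0, 1) for _ in range(N_SWITCHES)]   # stand-in "desired circuit"

def fitness(cfg):
    """Hypothetical fitness: closeness of the configuration to the target."""
    return sum(a == b for a, b in zip(cfg, TARGET))

def mutate(cfg):
    """Flip each switch with small probability."""
    return [1 - bit if random.random() < MUT else bit for bit in cfg]

pop = [[random.randint(0, 1) for _ in range(N_SWITCHES)] for _ in range(POP)]
for gen in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: POP // 2]                       # truncation selection
    pop = parents + [mutate(random.choice(parents)) for _ in range(POP - len(parents))]

best = max(pop, key=fitness)
print("best fitness:", fitness(best), "of", N_SWITCHES)
```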

  4. Continuous Evaluation of Evolving Behavioral Intervention Technologies

    PubMed Central

    Mohr, David C.; Cheung, Ken; Schueller, Stephen M.; Brown, C. Hendricks; Duan, Naihua

    2013-01-01

    Behavioral intervention technologies (BITs) are web-based and mobile interventions intended to support patients and consumers in changing behaviors related to health, mental health, and well-being. BITs are provided to patients and consumers in clinical care settings and commercial marketplaces, frequently with little or no evaluation. Current evaluation methods, including RCTs and implementation studies, can require years to validate an intervention. This timeline is fundamentally incompatible with the BIT environment, where technology advancement and changes in consumer expectations occur quickly, necessitating rapidly evolving interventions. However, BITs can routinely and iteratively collect data in a planned and strategic manner and generate evidence through systematic prospective analyses, thereby creating a system that can “learn.” A methodologic framework, Continuous Evaluation of Evolving Behavioral Intervention Technologies (CEEBIT), is proposed that can support the evaluation of multiple BITs or evolving versions, eliminating those that demonstrate poorer outcomes, while allowing new BITs to be entered at any time. CEEBIT could be used to ensure the effectiveness of BITs provided through deployment platforms in clinical care organizations or BIT marketplaces. The features of CEEBIT are described, including criteria for the determination of inferiority, determination of BIT inclusion, methods of assigning consumers to BITs, definition of outcomes, and evaluation of the usefulness of the system. CEEBIT offers the potential to collapse initial evaluation and postmarketing surveillance, providing ongoing assurance of safety and efficacy to patients and consumers, payers, and policymakers. PMID:24050429

  5. Evolving specialization of the arthropod nervous system

    PubMed Central

    Jarvis, Erin; Bruce, Heather S.; Patel, Nipam H.

    2012-01-01

    The diverse array of body plans possessed by arthropods is created by generating variations upon a design of repeated segments formed during development, using a relatively small “toolbox” of conserved patterning genes. These attributes make the arthropod body plan a valuable model for elucidating how changes in development create diversity of form. As increasingly specialized segments and appendages evolved in arthropods, the nervous systems of these animals also evolved to control the function of these structures. Although there is a remarkable degree of conservation in neural development both between individual segments in any given species and between the nervous systems of different arthropod groups, the differences that do exist are informative for inferring general principles about the holistic evolution of body plans. This review describes developmental processes controlling neural segmentation and regionalization, highlighting segmentation mechanisms that create both ectodermal and neural segments, as well as recent studies of the role of Hox genes in generating regional specification within the central nervous system. We argue that this system generates a modular design that allows the nervous system to evolve in concert with the body segments and their associated appendages. This information will be useful in future studies of macroevolutionary changes in arthropod body plans, especially in understanding how these transformations can be made in a way that retains the function of appendages during evolutionary transitions in morphology. PMID:22723369

  6. Spacecraft transmitter reliability

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A workshop on spacecraft transmitter reliability was held at the NASA Lewis Research Center on September 25 and 26, 1979, to discuss present knowledge and to plan future research areas. Since formal papers were not submitted, this synopsis was derived from audio tapes of the workshop. The following subjects were covered: users' experience with space transmitters; cathodes; power supplies and interfaces; and specifications and quality assurance. A panel discussion ended the workshop.

  7. ATLAS reliability analysis

    SciTech Connect

    Bartsch, R.R.

    1995-09-01

    Key elements of the 36 MJ ATLAS capacitor bank have been evaluated for individual probabilities of failure. These have been combined to estimate system reliability which is to be greater than 95% on each experimental shot. This analysis utilizes Weibull or Weibull-like distributions with increasing probability of failure with the number of shots. For transmission line insulation, a minimum thickness is obtained and for the railgaps, a method for obtaining a maintenance interval from forthcoming life tests is suggested.
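
    A simplified version of the roll-up described above, per-component Weibull survival probabilities combined into a series-system estimate as shots accumulate, is sketched below; the component list and Weibull parameters are illustrative, not the values from the ATLAS analysis.

```python
import math

# Illustrative numbers only: per-component Weibull survival combined into a
# series-system reliability that degrades as shots accumulate.
components = {                       # name: (scale eta in shots, shape beta)
    "capacitor units":   (4000.0, 1.3),
    "railgap switches":  (1500.0, 2.0),
    "transmission line": (6000.0, 1.1),
}

def weibull_reliability(n_shots, eta, beta):
    """Probability that a component survives n_shots."""
    return math.exp(-((n_shots / eta) ** beta))

def system_reliability(n_shots):
    """Series system: every component must survive."""
    r = 1.0
    for eta, beta in components.values():
        r *= weibull_reliability(n_shots, eta, beta)
    return r

for shots in (10, 100, 300):
    print(f"after {shots:4d} shots, system reliability = {system_reliability(shots):.3f}")
```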

  8. Compact, Reliable EEPROM Controller

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Kleyner, Igor

    2010-01-01

    A compact, reliable controller for an electrically erasable, programmable read-only memory (EEPROM) has been developed specifically for a space-flight application. The design may be adaptable to other applications in which there are requirements for reliability in general and, in particular, for prevention of inadvertent writing of data in EEPROM cells. Inadvertent writes pose risks of loss of reliability in the original space-flight application and could pose such risks in other applications. Prior EEPROM controllers are large and complex and do not provide all reasonable protections (in many cases, few or no protections) against inadvertent writes. In contrast, the present controller provides several layers of protection against inadvertent writes. The controller also incorporates a write-time monitor, enabling determination of trends in the performance of an EEPROM through all phases of testing. The controller has been designed as an integral subsystem of a system that includes not only the controller and the controlled EEPROM aboard a spacecraft but also computers in a ground control station, relatively simple onboard support circuitry, and an onboard communication subsystem that utilizes the MIL-STD-1553B protocol. (MIL-STD-1553B is a military standard that encompasses a method of communication and electrical-interface requirements for digital electronic subsystems connected to a data bus. MIL-STD- 1553B is commonly used in defense and space applications.) The intent was to both maximize reliability while minimizing the size and complexity of onboard circuitry. In operation, control of the EEPROM is effected via the ground computers, the MIL-STD-1553B communication subsystem, and the onboard support circuitry, all of which, in combination, provide the multiple layers of protection against inadvertent writes. There is no controller software, unlike in many prior EEPROM controllers; software can be a major contributor to unreliability, particularly in fault

  9. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  10. Where the Wild Things Are: The Evolving Iconography of Rural Fauna

    ERIC Educational Resources Information Center

    Buller, Henry

    2004-01-01

    This paper explores the changing relationship between "nature" and rurality through an examination of the shifting iconography of animals, and particularly "wild" animals, in a rural setting. Drawing upon a set of examples, the paper argues that the faunistic icons of rural areas are evolving as alternative conceptions of the countryside, of…

  11. Reliability-based design optimization under stationary stochastic process loads

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Du, Xiaoping

    2016-08-01

    Time-dependent reliability-based design ensures the satisfaction of reliability requirements for a given period of time, but with a high computational cost. This work improves the computational efficiency by extending the sequential optimization and reliability analysis (SORA) method to time-dependent problems with both stationary stochastic process loads and random variables. The challenge of the extension is the identification of the most probable point (MPP) associated with time-dependent reliability targets. Since a direct relationship between the MPP and reliability target does not exist, this work defines the concept of equivalent MPP, which is identified by the extreme value analysis and the inverse saddlepoint approximation. With the equivalent MPP, the time-dependent reliability-based design optimization is decomposed into two decoupled loops: deterministic design optimization and reliability analysis, and both are performed sequentially. Two numerical examples are used to show the efficiency of the proposed method.
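
    As a point of reference for what SORA-type decoupling avoids, the sketch below estimates a time-dependent probability of failure by brute-force Monte Carlo, with a random capacity and an AR(1) process standing in for the stationary stochastic load; all distributions and numbers are illustrative assumptions, not the paper's examples.

```python
import math, random

# Brute-force Monte Carlo baseline: failure occurs the first time the
# stochastic load exceeds the (random) capacity within the design interval.
random.seed(4)
T_STEPS, N_SIM = 200, 10000
PHI, SIGMA = 0.9, 1.0                      # AR(1) correlation and innovation s.d.
LOAD_MEAN = 5.0
CAP_MEAN, CAP_SD = 12.0, 1.0               # random capacity ~ Normal

def one_history_fails():
    capacity = random.gauss(CAP_MEAN, CAP_SD)
    x = random.gauss(0.0, SIGMA / math.sqrt(1 - PHI**2))   # stationary start
    for _ in range(T_STEPS):
        x = PHI * x + random.gauss(0.0, SIGMA)
        if LOAD_MEAN + x > capacity:        # first-passage failure
            return True
    return False

pf = sum(one_history_fails() for _ in range(N_SIM)) / N_SIM
print(f"estimated time-dependent probability of failure: {pf:.4f}")
print(f"time-dependent reliability over the interval:    {1 - pf:.4f}")
```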

  12. Salt tolerance evolves more frequently in C4 grass lineages.

    PubMed

    Bromham, L; Bennett, T H

    2014-03-01

    Salt tolerance has evolved many times in the grass family, and yet few cereal crops are salt tolerant. Why has it been so difficult to develop crops tolerant of saline soils when salt tolerance has evolved so frequently in nature? One possible explanation is that some grass lineages have traits that predispose them to developing salt tolerance and that without these background traits, salt tolerance is harder to achieve. One candidate background trait is photosynthetic pathway, which has also been remarkably labile in grasses. At least 22 independent origins of the C4 photosynthetic pathway have been suggested to occur within the grass family. It is possible that the evolution of C4 photosynthesis aids exploitation of saline environments, because it reduces transpiration, increases water-use efficiency and limits the uptake of toxic ions. But the observed link between the evolution of C4 photosynthesis and salt tolerance could simply be due to biases in phylogenetic distribution of halophytes or C4 species. Here, we use a phylogenetic analysis to investigate the association between photosynthetic pathway and salt tolerance in the grass family Poaceae. We find that salt tolerance is significantly more likely to occur in lineages with C4 photosynthesis than in C3 lineages. We discuss the possible links between C4 photosynthesis and salt tolerance and consider the limitations of inferring the direction of causality of this relationship.

  13. General Aviation Aircraft Reliability Study

    NASA Technical Reports Server (NTRS)

    Pettit, Duane; Turnbull, Andrew; Roelant, Henk A. (Technical Monitor)

    2001-01-01

    This reliability study was performed in order to provide the aviation community with an estimate of Complex General Aviation (GA) Aircraft System reliability. To successfully improve the safety and reliability for the next generation of GA aircraft, a study of current GA aircraft attributes was prudent. This was accomplished by benchmarking the reliability of operational Complex GA Aircraft Systems. Specifically, Complex GA Aircraft System reliability was estimated using data obtained from the logbooks of a random sample of the Complex GA Aircraft population.

  14. Wild Origins: The Evolving Nature of Animal Behavior

    NASA Astrophysics Data System (ADS)

    Flores, Ifigenia

    For billions of years, evolution has been the driving force behind the incredible range of biodiversity on our planet. Wild Origins is a concept plan for an exhibition at the National Zoo that uses case studies of animal behavior to explain the theory of evolution. Behaviors evolve, just as physical forms do. Understanding natural selection can help us interpret animal behavior and vice-versa. A living collection, digital media, interactives, fossils, and photographs will relay stories of social behavior, sex, navigation and migration, foraging, domestication, and relationships between different species. The informal learning opportunities visitors are offered at the zoo will create a connection with the exhibition's teaching points. Visitors will leave with an understanding and sense of wonder at the evolutionary view of life.

  15. The organization and control of an evolving interdependent population

    PubMed Central

    Vural, Dervis C.; Isakov, Alexander; Mahadevan, L.

    2015-01-01

    Starting with Darwin, biologists have asked how populations evolve from a low fitness state that is evolutionarily stable to a high fitness state that is not. Specifically of interest is the emergence of cooperation and multicellularity where the fitness of individuals often appears in conflict with that of the population. Theories of social evolution and evolutionary game theory have produced a number of fruitful results employing two-state two-body frameworks. In this study, we depart from this tradition and instead consider a multi-player, multi-state evolutionary game, in which the fitness of an agent is determined by its relationship to an arbitrary number of other agents. We show that populations organize themselves in one of four distinct phases of interdependence depending on one parameter, selection strength. Some of these phases involve the formation of specialized large-scale structures. We then describe how the evolution of independence can be manipulated through various external perturbations. PMID:26040593

  16. The organization and control of an evolving interdependent population.

    PubMed

    Vural, Dervis C; Isakov, Alexander; Mahadevan, L

    2015-07-01

    Starting with Darwin, biologists have asked how populations evolve from a low fitness state that is evolutionarily stable to a high fitness state that is not. Specifically of interest is the emergence of cooperation and multicellularity where the fitness of individuals often appears in conflict with that of the population. Theories of social evolution and evolutionary game theory have produced a number of fruitful results employing two-state two-body frameworks. In this study, we depart from this tradition and instead consider a multi-player, multi-state evolutionary game, in which the fitness of an agent is determined by its relationship to an arbitrary number of other agents. We show that populations organize themselves in one of four distinct phases of interdependence depending on one parameter, selection strength. Some of these phases involve the formation of specialized large-scale structures. We then describe how the evolution of independence can be manipulated through various external perturbations.

  17. Administration to innovation: the evolving management challenge in primary care.

    PubMed

    Laing, A; Marnoch, G; McKee, L; Joshi, R; Reid, J

    1997-01-01

    The concept of the primary health-care team involving an increasingly diverse range of health care professionals is widely recognized as central to the pursuit of a primary care-led health service in the UK. Although GPs are formally recognized as the team leaders, there is little by way of policy prescription as to how team roles and relationships should be developed, or evidence as to how their roles have in fact evolved. Thus the notion of the primary health-care team, while commonly employed, is in reality lacking definition, with the current contribution of practice managers to the operation of this team being poorly understood. Focusing on the career backgrounds of practice managers, their range of responsibilities, and their involvement in innovation in general practice, the paper presents a preliminary account of a chief scientist office-funded project examining the role being played by practice managers in primary health-care innovation. More specifically, utilizing data gained from the ongoing study, it contextualizes the role played by practice managers in the primary health-care team. By exploring the business environment surrounding the NHS general practice, the research seeks to understand the evolving world of the practice manager. Drawing on questionnaire data, reinforced by qualitative data from the current interview phase, the paper describes the role played by practice managers in differing practice contexts. This facilitates a discussion of a set of ideal-type general practice organizational and managerial structures. Finally, the paper discusses the relationships and skills required by practice managers in each of these organizational types, with reference to data gathered to date in the research.

  18. Interrelation Between Safety Factors and Reliability

    NASA Technical Reports Server (NTRS)

    Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    An evaluation was performed to establish the relationship between safety factors and reliability. Results obtained show that the use of safety factors is not contradictory to the employment of probabilistic methods. In many cases the safety factors can be directly expressed by the required reliability levels. However, there is a major difference that must be emphasized: whereas safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. The establishment of the interrelation between the concepts opens an avenue to specify safety factors based on reliability. In cases where there are several forms of failure, the allocation of safety factors should be based on having the same reliability associated with each failure mode. This immediately suggests that by the probabilistic methods the existing over-design or under-design can be eliminated. The report includes three parts: Part 1 - Random Actual Stress and Deterministic Yield Stress; Part 2 - Deterministic Actual Stress and Random Yield Stress; Part 3 - Both Actual Stress and Yield Stress Are Random.
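
    A worked illustration of expressing a safety factor through a required reliability, for the case where both the actual stress and the yield stress are normally distributed (the report's Part 3 situation), is given below; the stress statistics and the target reliability are hypothetical.

```python
from math import sqrt
from statistics import NormalDist

# Normal stress / normal strength illustration: choose the mean yield stress
# (and hence the central safety factor) so that the reliability index beta
# matches a required reliability. All numbers are hypothetical.
mu_s, sd_s = 200.0, 20.0        # actual stress: mean and std dev (MPa)
cov_y = 0.08                    # coefficient of variation of yield stress
target_R = 0.999                # required reliability

beta = NormalDist().inv_cdf(target_R)   # reliability index for target R

# Solve (mu_y - mu_s) / sqrt(sd_y^2 + sd_s^2) = beta with sd_y = cov_y * mu_y,
# rearranged into a quadratic in mu_y; the larger root gives mu_y > mu_s.
a = 1.0 - (beta * cov_y) ** 2
b = -2.0 * mu_s
c = mu_s**2 - (beta * sd_s) ** 2
mu_y = (-b + sqrt(b * b - 4 * a * c)) / (2 * a)

print(f"required mean yield stress: {mu_y:.1f} MPa")
print(f"central safety factor mu_y/mu_s: {mu_y / mu_s:.2f} for R = {target_R}")
```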

  19. Ultimately Reliable Pyrotechnic Systems

    NASA Technical Reports Server (NTRS)

    Scott, John H.; Hinkel, Todd

    2015-01-01

    This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high reliability operation in extreme environments and illustrates the potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the more successful in the history of Human Spaceflight. No pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably in temperatures as low as -420 F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, with never a failure. The recent landing on Mars of the Opportunity rover fired 174 of NASA's pyrotechnic initiators to complete the famous '7 minutes of terror.' Even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world. Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing

  20. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection.

    PubMed

    Janković, Srdja; Ćirković, Milan M

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.

  1. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection

    NASA Astrophysics Data System (ADS)

    Janković, Srdja; Ćirković, Milan M.

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.

  2. Crystalline-silicon reliability lessons for thin-film modules

    NASA Astrophysics Data System (ADS)

    Ross, R. G., Jr.

    1985-10-01

    The reliability of crystalline silicon modules has been brought to a high level with lifetimes approaching 20 years, and excellent industry credibility and user satisfaction. The transition from crystalline modules to thin film modules is comparable to the transition from discrete transistors to integrated circuits. New cell materials and monolithic structures will require new device processing techniques, but the package function and design will evolve to a lesser extent. Although there will be new encapsulants optimized to take advantage of the mechanical flexibility and low temperature processing features of thin films, the reliability and life degradation stresses and mechanisms will remain mostly unchanged. Key reliability technologies in common between crystalline and thin film modules include hot spot heating, galvanic and electrochemical corrosion, hail impact stresses, glass breakage, mechanical fatigue, photothermal degradation of encapsulants, operating temperature, moisture sorption, circuit design strategies, product safety issues, and the process required to achieve a reliable product from a laboratory prototype.

  3. Crystalline-silicon reliability lessons for thin-film modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1985-01-01

    The reliability of crystalline silicon modules has been brought to a high level with lifetimes approaching 20 years, and excellent industry credibility and user satisfaction. The transition from crystalline modules to thin film modules is comparable to the transition from discrete transistors to integrated circuits. New cell materials and monolithic structures will require new device processing techniques, but the package function and design will evolve to a lesser extent. Although there will be new encapsulants optimized to take advantage of the mechanical flexibility and low temperature processing features of thin films, the reliability and life degradation stresses and mechanisms will remain mostly unchanged. Key reliability technologies in common between crystalline and thin film modules include hot spot heating, galvanic and electrochemical corrosion, hail impact stresses, glass breakage, mechanical fatigue, photothermal degradation of encapsulants, operating temperature, moisture sorption, circuit design strategies, product safety issues, and the process required to achieve a reliable product from a laboratory prototype.

  4. Blade reliability collaborative

    SciTech Connect

    Ashwill, Thomas D.; Ogilvie, Alistair B.; Paquette, Joshua A.

    2013-04-01

    The Blade Reliability Collaborative (BRC) was started by the Wind Energy Technologies Department of Sandia National Laboratories and DOE in 2010 with the goal of gaining insight into planned and unplanned O&M issues associated with wind turbine blades. A significant part of BRC is the Blade Defect, Damage and Repair Survey task, which will gather data from blade manufacturers, service companies, operators and prior studies to determine details about the largest sources of blade unreliability. This report summarizes the initial findings from this work.

  5. Survivability Is More Fundamental Than Evolvability

    PubMed Central

    Palmer, Michael E.; Feldman, Marcus W.

    2012-01-01

    For a lineage to survive over long time periods, it must sometimes change. This has given rise to the term evolvability, meaning the tendency to produce adaptive variation. One lineage may be superior to another in terms of its current standing variation, or it may tend to produce more adaptive variation. However, evolutionary outcomes depend on more than standing variation and produced adaptive variation: deleterious variation also matters. Evolvability, as most commonly interpreted, is not predictive of evolutionary outcomes. Here, we define a predictive measure of the evolutionary success of a lineage that we call the k-survivability, defined as the probability that the lineage avoids extinction for k generations. We estimate the k-survivability using multiple experimental replicates. Because we measure evolutionary outcomes, the initial standing variation, the full spectrum of generated variation, and the heritability of that variation are all incorporated. Survivability also accounts for the decreased joint likelihood of extinction of sub-lineages when they 1) disperse in space, or 2) diversify in lifestyle. We illustrate measurement of survivability with in silico models, and suggest that it may also be measured in vivo using multiple longitudinal replicates. The k-survivability is a metric that enables the quantitative study of, for example, the evolution of 1) mutation rates, 2) dispersal mechanisms, 3) the genotype-phenotype map, and 4) sexual reproduction, in temporally and spatially fluctuating environments. Although these disparate phenomena evolve by well-understood microevolutionary rules, they are also subject to the macroevolutionary constraint of long-term survivability. PMID:22723844
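
    Estimating k-survivability from replicates can be sketched with a toy branching process: run many independent lineages and record the fraction still alive at generation k. The offspring rule, population cap, and parameters below are placeholders, not the authors' in-silico models.

```python
import random

# Toy estimate of k-survivability: the fraction of replicate lineages that
# avoid extinction for K generations under a simple branching process.
random.seed(5)
K, REPLICATES, START_N, CAP = 30, 500, 5, 200

def offspring():
    # mean slightly above 1, so extinction is possible but not certain
    return min(random.randint(0, 2) + (1 if random.random() < 0.1 else 0), 3)

def survives_k_generations(k=K):
    n = START_N
    for _ in range(k):
        n = min(sum(offspring() for _ in range(n)), CAP)   # cap keeps runtime bounded
        if n == 0:
            return False
    return True

k_survivability = sum(survives_k_generations() for _ in range(REPLICATES)) / REPLICATES
print(f"estimated {K}-survivability: {k_survivability:.3f}")
```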

  6. Evolvable circuit with transistor-level reconfigurability

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian (Inventor); Salazar-Lazaro, Carlos Harold (Inventor)

    2004-01-01

    An evolvable circuit includes a plurality of reconfigurable switches, a plurality of transistors within a region of the circuit, the plurality of transistors having terminals, the plurality of transistors being coupled between a power source terminal and a power sink terminal so as to be capable of admitting power between the power source terminal and the power sink terminal, the plurality of transistors being coupled so that every transistor terminal to transistor terminal coupling within the region of the circuit comprises a reconfigurable switch.

  7. Present weather and climate: evolving conditions

    USGS Publications Warehouse

    Hoerling, Martin P; Dettinger, Michael; Wolter, Klaus; Lukas, Jeff; Eischeid, Jon K.; Nemani, Rama; Liebmann, Brant; Kunkel, Kenneth E.

    2013-01-01

    This chapter assesses weather and climate variability and trends in the Southwest, using observed climate and paleoclimate records. It analyzes the last 100 years of climate variability in comparison to the last 1,000 years, and links the important features of evolving climate conditions to river flow variability in four of the region’s major drainage basins. The chapter closes with an assessment of the monitoring and scientific research needed to increase confidence in understanding when climate episodes, events, and phenomena are attributable to human-caused climate change.

  8. The Evolving Theory of Evolutionary Radiations.

    PubMed

    Simões, M; Breitkreuz, L; Alvarado, M; Baca, S; Cooper, J C; Heins, L; Herzog, K; Lieberman, B S

    2016-01-01

    Evolutionary radiations have intrigued biologists for more than 100 years, and our understanding of the patterns and processes associated with these radiations continues to grow and evolve. Recently it has been recognized that there are many different types of evolutionary radiation beyond the well-studied adaptive radiations. We focus here on multifarious types of evolutionary radiations, paying special attention to the abiotic factors that might trigger diversification in clades. We integrate concepts such as exaptation, species selection, coevolution, and the turnover-pulse hypothesis (TPH) into the theoretical framework of evolutionary radiations. We also discuss other phenomena that are related to, but distinct from, evolutionary radiations that have relevance for evolutionary biology.

  9. Evolving Black Holes with Wavy Initial Data

    NASA Astrophysics Data System (ADS)

    Kelly, Bernard; Tichy, Wolfgang; Zlochower, Yosef; Campanelli, Manuela; Whiting, Bernard

    2009-05-01

    In Kelly et al. [Phys. Rev. D v. 76, 024008 (2007)], we presented new binary black-hole initial data adapted to puncture evolutions in numerical relativity. This data satisfies the constraint equations to 2.5 post-Newtonian order, and contains a transverse-traceless "wavy" metric contribution, violating the standard assumption of conformal flatness. We report on progress in evolving this data with a modern moving-puncture implementation of the BSSN equations in several numerical codes. We will discuss the effect of the new metric terms on junk radiation and continuity of physical radiation extracted.

  10. Mobile computing acceptance grows as applications evolve.

    PubMed

    Porn, Louis M; Patrick, Kelly

    2002-01-01

    Handheld devices are becoming more cost-effective to own, and their use in healthcare environments is increasing. Handheld devices currently are being used for e-prescribing, charge capture, and accessing daily schedules and reference tools. Future applications may include education on medications, dictation, order entry, and test-results reporting. Selecting the right handheld device requires careful analysis of current and future applications, as well as vendor expertise. It is important to recognize the technology will continue to evolve over the next three years. PMID:11806321

  11. Mobile computing acceptance grows as applications evolve.

    PubMed

    Porn, Louis M; Patrick, Kelly

    2002-01-01

    Handheld devices are becoming more cost-effective to own, and their use in healthcare environments is increasing. Handheld devices currently are being used for e-prescribing, charge capture, and accessing daily schedules and reference tools. Future applications may include education on medications, dictation, order entry, and test-results reporting. Selecting the right handheld device requires careful analysis of current and future applications, as well as vendor expertise. It is important to recognize the technology will continue to evolve over the next three years.

  12. Earth As an Evolving Planetary System

    NASA Astrophysics Data System (ADS)

    Meert, Joseph G.

    2005-05-01

    "System" is an overused buzzword in textbooks covering geological sciences. Describing the Earth as a system of component parts is a reasonable concept, but providing a comprehensive framework for detailing the system is a more formidable task. Kent Condie lays out the systems approach in an easy-to-read introductory chapter in Earth as an Evolving Planetary System. In the book, Condie makes a valiant attempt at taking the mélange of diverse subjects in the solid Earth sciences and weaving them into a coherent tapestry.

  13. Fault Tree Reliability Analysis and Design-for-reliability

    1998-05-05

    WinR provides a fault tree analysis capability for performing systems reliability and design-for-reliability analyses. The package includes capabilities for sensitivity and uncertainty analysis, field failure data analysis, and optimization.
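
    As a rough illustration of the kind of computation a fault-tree package such as WinR automates, the sketch below evaluates the top-event probability of a small fault tree built from independent basic events and AND/OR gates. The gate structure, event names, and probabilities are hypothetical and are not taken from WinR.

        # Minimal fault-tree evaluation sketch (assumes independent basic events).
        # Gate structure and probabilities are illustrative only.

        def or_gate(probs):
            """P(at least one input event occurs) for independent inputs."""
            p = 1.0
            for q in probs:
                p *= (1.0 - q)
            return 1.0 - p

        def and_gate(probs):
            """P(all input events occur) for independent inputs."""
            p = 1.0
            for q in probs:
                p *= q
            return p

        # Hypothetical tree: TOP = (pump_fails AND backup_fails) OR control_fault
        pump_fails, backup_fails, control_fault = 1e-3, 5e-3, 2e-4
        top = or_gate([and_gate([pump_fails, backup_fails]), control_fault])
        print(f"Top-event probability: {top:.2e}")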

  14. Improve relief valve reliability

    SciTech Connect

    Nelson, W.E.

    1993-01-01

    This paper reports on the careful evaluation of safety relief valves and their service conditions, which can improve reliability and permit more time between tests. Factors that aid in getting long-run results include: use of valves suitable for the service, attention to the design of the relieving system (including the use of block valves), and close attention to repair procedures. These practices should be applied to each installation using good engineering judgment. The Clean Air Act of 1990 and other legislation limiting allowable fugitive emissions in a hydrocarbon processing plant will greatly impact safety relief valve installations. The normal leakage rate from a relief valve will require that it be connected to a closed vent system tied to a recovery or control device. Tying the outlet of an existing valve into a header system can cause accelerated corrosion and operating difficulties. The reliability of many existing safety relief valves may be compromised when they are connected to an outlet header without following good engineering practices. The law has been enacted, but not all of its rules have been promulgated.

  15. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1990-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.
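
    The pass/fail logic described above can be expressed compactly: infer a measured length from the serpentine conductor's end-to-end resistance and a per-unit-length resistance calibrated on the uniform test conductors, then compare it with the design length. The sketch below is a hypothetical illustration; the numbers and the rejection threshold are invented, not taken from the patent.

        # Sketch of the serpentine-conductor length check described above.
        # Values and the rejection threshold are hypothetical.

        def measured_length(serpentine_resistance_ohm, resistance_per_um_ohm):
            """Infer conductor length from resistance, assuming a uniform cross-section."""
            return serpentine_resistance_ohm / resistance_per_um_ohm

        def accept_chip(design_length_um, serpentine_resistance_ohm,
                        resistance_per_um_ohm, max_excess_fraction=0.05):
            """Reject if the inferred length exceeds the design length by more than the
            allowed fraction (extra resistance at steps inflates the estimate)."""
            length = measured_length(serpentine_resistance_ohm, resistance_per_um_ohm)
            excess = (length - design_length_um) / design_length_um
            return excess <= max_excess_fraction

        # Calibrate per-unit resistance from a flat test conductor, then check a
        # serpentine conductor that crosses many steps.
        r_per_um = 0.02   # ohm per micrometre, from the uniform test conductor
        print(accept_chip(design_length_um=10_000,
                          serpentine_resistance_ohm=205.0,
                          resistance_per_um_ohm=r_per_um))  # 205/0.02 = 10250 um -> 2.5% excess -> True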

  16. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1988-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.

  17. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made require neither the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this fact by analyzing the data of an experiment where entangled two-photon states were generated and whose entanglement was verified with the use of an accessible nonlinear witness.

  18. Load Control System Reliability

    SciTech Connect

    Trudnowski, Daniel

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech in April 2006. Follow-on DOE awards and expansions of the project scope occurred in August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also included matching funds from the states of Montana and Wyoming. Project participants included Montana Tech, the University of Wyoming, Montana State University, NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies, and power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  19. Understanding the Elements of Operational Reliability: A Key for Achieving High Reliability

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    2010-01-01

    This viewgraph presentation reviews operational reliability and its role in achieving high reliability through design and process reliability. The topics include: 1) Reliability Engineering Major Areas and interfaces; 2) Design Reliability; 3) Process Reliability; and 4) Reliability Applications.

  20. Have plants evolved to self-immolate?

    PubMed

    Bowman, David M J S; French, Ben J; Prior, Lynda D

    2014-01-01

    By definition fire prone ecosystems have highly combustible plants, leading to the hypothesis, first formally stated by Mutch in 1970, that community flammability is the product of natural selection of flammable traits. However, proving the "Mutch hypothesis" has presented an enormous challenge for fire ecologists given the difficulty in establishing cause and effect between landscape fire and flammable plant traits. Individual plant traits (such as leaf moisture content, retention of dead branches and foliage, oil rich foliage) are known to affect the flammability of plants but there is no evidence these characters evolved specifically to self-immolate, although some of these traits may have been secondarily modified to increase the propensity to burn. Demonstrating individual benefits from self-immolation is extraordinarily difficult, given the intersection of the physical environmental factors that control landscape fire (fuel production, dryness and ignitions) with community flammability properties that emerge from numerous traits of multiple species (canopy cover and litter bed bulk density). It is more parsimonious to conclude plants have evolved mechanisms to tolerate, but not promote, landscape fire. PMID:25414710

  1. Evolving resistance to obesity in an insect.

    PubMed

    Warbrick-Smith, James; Behmer, Spencer T; Lee, Kwang Pum; Raubenheimer, David; Simpson, Stephen J

    2006-09-19

    Failure to adapt to a changing nutritional environment comes at a cost, as evidenced by the modern human obesity crisis. Consumption of energy-rich diets can lead to obesity and is associated with deleterious consequences not only in humans but also in many other animals, including insects. The question thus arises whether animals restricted over multiple generations to high-energy diets can evolve mechanisms to limit the deposition of adverse levels of body fat. We show that Plutella xylostella caterpillars reared for multiple generations on carbohydrate-rich foods (either a chemically defined artificial diet or a high-starch Arabidopsis mutant) progressively developed the ability to eat excess carbohydrate without laying it down as fat, providing strong evidence that excess fat storage has a fitness cost. In contrast, caterpillars reared in carbohydrate-scarce environments (a chemically defined artificial diet or a low-starch Arabidopsis mutant) had a greater propensity to store ingested carbohydrate as fat. Additionally, insects reared on the low-starch Arabidopsis mutant evolved a preference for laying their eggs on this plant, whereas those selected on the high-starch Arabidopsis mutant showed no preference. Our results provide an experimental example of metabolic adaptation in the face of changes in the nutritional environment and suggest that changes in plant macronutrient profiles may promote host-associated population divergence.

  2. Collapse of cooperation in evolving games

    PubMed Central

    Stewart, Alexander J.; Plotkin, Joshua B.

    2014-01-01

    Game theory provides a quantitative framework for analyzing the behavior of rational agents. The Iterated Prisoner’s Dilemma in particular has become a standard model for studying cooperation and cheating, with cooperation often emerging as a robust outcome in evolving populations. Here we extend evolutionary game theory by allowing players’ payoffs as well as their strategies to evolve in response to selection on heritable mutations. In nature, many organisms engage in mutually beneficial interactions and individuals may seek to change the ratio of risk to reward for cooperation by altering the resources they commit to cooperative interactions. To study this, we construct a general framework for the coevolution of strategies and payoffs in arbitrary iterated games. We show that, when there is a tradeoff between the benefits and costs of cooperation, coevolution often leads to a dramatic loss of cooperation in the Iterated Prisoner’s Dilemma. The collapse of cooperation is so extreme that the average payoff in a population can decline even as the potential reward for mutual cooperation increases. Depending upon the form of tradeoffs, evolution may even move away from the Iterated Prisoner’s Dilemma game altogether. Our work offers a new perspective on the Prisoner’s Dilemma and its predictions for cooperation in natural populations; and it provides a general framework to understand the coevolution of strategies and payoffs in iterated interactions. PMID:25422421
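
    For readers unfamiliar with the baseline model the authors extend, the sketch below simulates a plain Iterated Prisoner's Dilemma between two memory-one strategies. The payoff values and strategies are illustrative only and do not implement the coevolving-payoff framework of the paper.

        import random

        # Minimal Iterated Prisoner's Dilemma sketch with memory-one strategies.
        # Payoffs and strategies are illustrative; this is not the paper's
        # coevolving-payoff model, only the baseline game it builds on.

        PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
                  ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

        def memory_one(p_cc, p_cd, p_dc, p_dd):
            """Return a strategy: probability of cooperating given last round's outcome."""
            table = {("C", "C"): p_cc, ("C", "D"): p_cd,
                     ("D", "C"): p_dc, ("D", "D"): p_dd}
            def act(my_last, their_last):
                if my_last is None:          # first round: cooperate
                    return "C"
                return "C" if random.random() < table[(my_last, their_last)] else "D"
            return act

        tit_for_tat = memory_one(1.0, 0.0, 1.0, 0.0)
        always_defect = memory_one(0.0, 0.0, 0.0, 0.0)

        def play(strat_a, strat_b, rounds=1000):
            a_last = b_last = None
            score_a = score_b = 0
            for _ in range(rounds):
                a = strat_a(a_last, b_last)
                b = strat_b(b_last, a_last)
                pa, pb = PAYOFF[(a, b)]
                score_a, score_b = score_a + pa, score_b + pb
                a_last, b_last = a, b
            return score_a / rounds, score_b / rounds

        print(play(tit_for_tat, always_defect))   # roughly (1, 1) after the first round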

  3. Caterpillars evolved from onychophorans by hybridogenesis.

    PubMed

    Williamson, Donald I

    2009-11-24

    I reject the Darwinian assumption that larvae and their adults evolved from a single common ancestor. Rather I posit that, in animals that metamorphose, the basic types of larvae originated as adults of different lineages, i.e., larvae were transferred when, through hybridization, their genomes were acquired by distantly related animals. "Caterpillars," the name for eruciforms with thoracic and abdominal legs, are larvae of lepidopterans, hymenopterans, and mecopterans (scorpionflies). Grubs and maggots, including the larvae of beetles, bees, and flies, evolved from caterpillars by loss of legs. Caterpillar larval organs are dismantled and reconstructed in the pupal phase. Such indirect developmental patterns (metamorphoses) did not originate solely by accumulation of random mutations followed by natural selection; rather they are fully consistent with my concept of evolution by hybridogenesis. Members of the phylum Onychophora (velvet worms) are proposed as the evolutionary source of caterpillars and their grub or maggot descendants. I present a molecular biological research proposal to test my thesis. By my hypothesis 2 recognizable sets of genes are detectable in the genomes of all insects with caterpillar grub- or maggot-like larvae: (i) onychophoran genes that code for proteins determining larval morphology/physiology and (ii) sequentially expressed insect genes that code for adult proteins. The genomes of insects and other animals that, by contrast, entirely lack larvae comprise recognizable sets of genes from single animal common ancestors.

  4. Collapse of cooperation in evolving games.

    PubMed

    Stewart, Alexander J; Plotkin, Joshua B

    2014-12-01

    Game theory provides a quantitative framework for analyzing the behavior of rational agents. The Iterated Prisoner's Dilemma in particular has become a standard model for studying cooperation and cheating, with cooperation often emerging as a robust outcome in evolving populations. Here we extend evolutionary game theory by allowing players' payoffs as well as their strategies to evolve in response to selection on heritable mutations. In nature, many organisms engage in mutually beneficial interactions and individuals may seek to change the ratio of risk to reward for cooperation by altering the resources they commit to cooperative interactions. To study this, we construct a general framework for the coevolution of strategies and payoffs in arbitrary iterated games. We show that, when there is a tradeoff between the benefits and costs of cooperation, coevolution often leads to a dramatic loss of cooperation in the Iterated Prisoner's Dilemma. The collapse of cooperation is so extreme that the average payoff in a population can decline even as the potential reward for mutual cooperation increases. Depending upon the form of tradeoffs, evolution may even move away from the Iterated Prisoner's Dilemma game altogether. Our work offers a new perspective on the Prisoner's Dilemma and its predictions for cooperation in natural populations; and it provides a general framework to understand the coevolution of strategies and payoffs in iterated interactions.

  6. Evolving resistance to obesity in an insect

    PubMed Central

    Warbrick-Smith, James; Behmer, Spencer T.; Lee, Kwang Pum; Raubenheimer, David; Simpson, Stephen J.

    2006-01-01

    Failure to adapt to a changing nutritional environment comes at a cost, as evidenced by the modern human obesity crisis. Consumption of energy-rich diets can lead to obesity and is associated with deleterious consequences not only in humans but also in many other animals, including insects. The question thus arises whether animals restricted over multiple generations to high-energy diets can evolve mechanisms to limit the deposition of adverse levels of body fat. We show that Plutella xylostella caterpillars reared for multiple generations on carbohydrate-rich foods (either a chemically defined artificial diet or a high-starch Arabidopsis mutant) progressively developed the ability to eat excess carbohydrate without laying it down as fat, providing strong evidence that excess fat storage has a fitness cost. In contrast, caterpillars reared in carbohydrate-scarce environments (a chemically defined artificial diet or a low-starch Arabidopsis mutant) had a greater propensity to store ingested carbohydrate as fat. Additionally, insects reared on the low-starch Arabidopsis mutant evolved a preference for laying their eggs on this plant, whereas those selected on the high-starch Arabidopsis mutant showed no preference. Our results provide an experimental example of metabolic adaptation in the face of changes in the nutritional environment and suggest that changes in plant macronutrient profiles may promote host-associated population divergence. PMID:16968774

  7. Shaping the outflows of evolved stars

    NASA Astrophysics Data System (ADS)

    Mohamed, Shazrene

    2015-08-01

    Both hot and cool evolved stars, e.g., red (super)giants and Wolf-Rayet stars, lose copious amounts of mass, momentum and mechanical energy through powerful, dense stellar winds. The interaction of these outflows with their surroundings results in highly structured and complex circumstellar environments, often featuring knots, arcs, shells and spirals. Recent improvements in computational power and techniques have led to the development of detailed, multi-dimensional simulations that have given new insight into the origin of these structures, and better understanding of the physical mechanisms driving their formation. In this talk, I will discuss three of the main mechanisms that shape the outflows of evolved stars: (1) interaction with the interstellar medium (ISM), i.e., wind-ISM interactions; (2) interaction with a stellar wind, either from a previous phase of evolution or the wind from a companion star, i.e., wind-wind interactions; and (3) interaction with a companion star that has a weak or insignificant outflow (e.g., a compact companion such as a neutron star or black hole), i.e., wind-companion interactions. I will also highlight the broader implications and impact of these stellar wind interactions for other phenomena, e.g., for symbiotic and X-ray binaries, supernovae and gamma-ray bursts.

  8. Early formation of evolved asteroidal crust.

    PubMed

    Day, James M D; Ash, Richard D; Liu, Yang; Bellucci, Jeremy J; Rumble, Douglas; McDonough, William F; Walker, Richard J; Taylor, Lawrence A

    2009-01-01

    Mechanisms for the formation of crust on planetary bodies remain poorly understood. It is generally accepted that Earth's andesitic continental crust is the product of plate tectonics, whereas the Moon acquired its feldspar-rich crust by way of plagioclase flotation in a magma ocean. Basaltic meteorites provide evidence that, like the terrestrial planets, some asteroids generated crust and underwent large-scale differentiation processes. Until now, however, no evolved felsic asteroidal crust has been sampled or observed. Here we report age and compositional data for the newly discovered, paired and differentiated meteorites Graves Nunatak (GRA) 06128 and GRA 06129. These meteorites are feldspar-rich, with andesite bulk compositions. Their age of 4.52 +/- 0.06 Gyr demonstrates formation early in Solar System history. The isotopic and elemental compositions, degree of metamorphic re-equilibration and sulphide-rich nature of the meteorites are most consistent with an origin as partial melts from a volatile-rich, oxidized asteroid. GRA 06128 and 06129 are the result of a newly recognized style of evolved crust formation, bearing witness to incomplete differentiation of their parent asteroid and to previously unrecognized diversity of early-formed materials in the Solar System. PMID:19129845

  9. Early formation of evolved asteroidal crust

    NASA Astrophysics Data System (ADS)

    Day, James M. D.; Ash, Richard D.; Liu, Yang; Bellucci, Jeremy J.; Rumble, Douglas, III; McDonough, William F.; Walker, Richard J.; Taylor, Lawrence A.

    2009-01-01

    Mechanisms for the formation of crust on planetary bodies remain poorly understood. It is generally accepted that Earth's andesitic continental crust is the product of plate tectonics, whereas the Moon acquired its feldspar-rich crust by way of plagioclase flotation in a magma ocean. Basaltic meteorites provide evidence that, like the terrestrial planets, some asteroids generated crust and underwent large-scale differentiation processes. Until now, however, no evolved felsic asteroidal crust has been sampled or observed. Here we report age and compositional data for the newly discovered, paired and differentiated meteorites Graves Nunatak (GRA) 06128 and GRA 06129. These meteorites are feldspar-rich, with andesite bulk compositions. Their age of 4.52 +/- 0.06 Gyr demonstrates formation early in Solar System history. The isotopic and elemental compositions, degree of metamorphic re-equilibration and sulphide-rich nature of the meteorites are most consistent with an origin as partial melts from a volatile-rich, oxidized asteroid. GRA 06128 and 06129 are the result of a newly recognized style of evolved crust formation, bearing witness to incomplete differentiation of their parent asteroid and to previously unrecognized diversity of early-formed materials in the Solar System.

  11. Netgram: Visualizing Communities in Evolving Networks.

    PubMed

    Mall, Raghvendra; Langone, Rocco; Suykens, Johan A K

    2015-01-01

    Real-world complex networks are dynamic in nature and change over time. The change is usually observed in the interactions within the network over time. Complex networks exhibit community-like structures. A key feature of the dynamics of complex networks is the evolution of communities over time. Several methods have been proposed to detect and track the evolution of these groups over time. However, there is no generic tool which visualizes all the aspects of group evolution in dynamic networks, including birth, death, splitting, merging, expansion, shrinkage and continuation of groups. In this paper, we propose Netgram: a tool for visualizing evolution of communities in time-evolving graphs. Netgram maintains the evolution of communities over two consecutive time stamps in tables, which are used to create a query database using the SQL outer-join operation. It uses a line-based visualization technique which adheres to certain design principles and aesthetic guidelines. Netgram uses a greedy solution to order the initial community information provided by the evolutionary clustering technique such that we have fewer line cross-overs in the visualization. This makes it easier to track the progress of individual communities in time-evolving graphs. Netgram is a generic toolkit which can be used with any evolutionary community detection algorithm as illustrated in our experiments. We use Netgram for visualization of topic evolution in the NIPS conference over a period of 11 years and observe the emergence and merging of several disciplines in the field of information processing systems. PMID:26356538
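
    A rough sense of the bookkeeping such a tool performs can be had from the sketch below, which matches communities between two consecutive time stamps by Jaccard overlap and labels continuations versus newly born groups. The threshold and labels are hypothetical illustrations, not Netgram's actual rules.

        # Illustrative matching of communities across two consecutive time stamps,
        # in the spirit of the tables Netgram maintains. The threshold is hypothetical.

        def jaccard(a, b):
            a, b = set(a), set(b)
            return len(a & b) / len(a | b) if a | b else 0.0

        def match_communities(prev, curr, threshold=0.3):
            """prev, curr: dicts mapping community id -> set of member nodes.
            Returns a list of (prev_id or None, curr_id, overlap) records."""
            records = []
            for cid, members in curr.items():
                overlaps = [(jaccard(p_members, members), pid)
                            for pid, p_members in prev.items()]
                best_overlap, best_pid = max(overlaps, default=(0.0, None))
                if best_overlap >= threshold:
                    records.append((best_pid, cid, round(best_overlap, 2)))  # continuation/merge
                else:
                    records.append((None, cid, 0.0))                         # birth of a new community
            return records

        t1 = {"A": {1, 2, 3, 4}, "B": {5, 6, 7}}
        t2 = {"X": {1, 2, 3, 4, 5}, "Y": {8, 9}}
        print(match_communities(t1, t2))   # X continues A; Y is new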

  12. Environmental stress and evolvability in microbial systems.

    PubMed

    Baquero, F

    2009-01-01

    The sustainability of life on the planet depends on the preservation of the existing microbial systems, which constitute our major "biological atmosphere". The detection of variations in microbial systems as a result of anthropogenic or natural changes is critical both to detect and assess risks and to programme specific interventions. Changes in microbial systems provoke stress, probably altering the local evolutionary time by changing evolvability (the ability of microbes to evolve). Methods should be refined to properly assess diversity in microbial systems. We propose that such diversity estimations should be done on a multi-hierarchical scale, encompassing not only organisms, but sub-cellular entities (e.g. chromosomal domains, plasmids, transposons, integrons, genes, gene modules) and supra-cellular organizations (e.g. clones, populations, communities, ecosystems), applying Hamiltonian criteria of inclusive fitness for the different ensembles. In any of these entities, we can generally identify, in a fractal manner, constant and variable parts. Variation in these entities and ensembles is probably both reduced and increased by environmental stress. Because of that, variation in microbial systems might serve as a mirror or symptom of the health of the planet. PMID:19220344

  13. Evolvability of an Optimal Recombination Rate.

    PubMed

    Lobkovsky, Alexander E; Wolf, Yuri I; Koonin, Eugene V

    2015-12-10

    Evolution and maintenance of genetic recombination and its relation to the mutational process is a long-standing, fundamental problem in evolutionary biology that is linked to the general problem of evolution of evolvability. We explored a stochastic model of the evolution of recombination using additive fitness and infinite allele assumptions but no assumptions on the sign or magnitude of the epistasis and the distribution of mutation effects. In this model, fluctuating negative epistasis and predominantly deleterious mutations arise naturally as a consequence of the additive fitness and a reservoir from which new alleles arrive with a fixed distribution of fitness effects. Analysis of the model revealed a nonmonotonic effect of recombination intensity on fitness, with an optimal recombination rate value which maximized fitness in steady state. The optimal recombination rate depended on the mutation rate and was evolvable, that is, subject to selection. The predictions of the model were compatible with the observations on the dependence between genome rearrangement rate and gene flux in microbial genomes.

  15. Have plants evolved to self-immolate?

    PubMed Central

    Bowman, David M. J. S.; French, Ben J.; Prior, Lynda D.

    2014-01-01

    By definition fire prone ecosystems have highly combustible plants, leading to the hypothesis, first formally stated by Mutch in 1970, that community flammability is the product of natural selection of flammable traits. However, proving the “Mutch hypothesis” has presented an enormous challenge for fire ecologists given the difficulty in establishing cause and effect between landscape fire and flammable plant traits. Individual plant traits (such as leaf moisture content, retention of dead branches and foliage, oil rich foliage) are known to affect the flammability of plants but there is no evidence these characters evolved specifically to self-immolate, although some of these traits may have been secondarily modified to increase the propensity to burn. Demonstrating individual benefits from self-immolation is extraordinarily difficult, given the intersection of the physical environmental factors that control landscape fire (fuel production, dryness and ignitions) with community flammability properties that emerge from numerous traits of multiple species (canopy cover and litter bed bulk density). It is more parsimonious to conclude plants have evolved mechanisms to tolerate, but not promote, landscape fire. PMID:25414710

  16. Viedma ripening: a reliable crystallisation method to reach single chirality.

    PubMed

    Sögütoglu, Leyla-Cann; Steendam, René R E; Meekes, Hugo; Vlieg, Elias; Rutjes, Floris P J T

    2015-10-01

    Crystallisation processes have evolved into practical methods that allow isolation of an enantiopure product in high yield. Viedma ripening in particular enables access to enantiopure products in a reliable way, simply through grinding of crystals in a solution. This tutorial review covers the basic principles behind asymmetric crystallisation processes, with an emphasis on Viedma ripening, and shows that to date many novel organic molecules can be obtained in enantiopure solid form. PMID:26165858

  17. Fatigue Reliability of Gas Turbine Engine Structures

    NASA Technical Reports Server (NTRS)

    Cruse, Thomas A.; Mahadevan, Sankaran; Tryon, Robert G.

    1997-01-01

    The results of an investigation are described for fatigue reliability in engine structures. The description consists of two parts. Part 1 is for method development. Part 2 is a specific case study. In Part 1, the essential concepts and practical approaches to damage tolerance design in the gas turbine industry are summarized. These have evolved over the years in response to flight safety certification requirements. The effect of Non-Destructive Evaluation (NDE) methods on these approaches is also reviewed. Assessment methods based on probabilistic fracture mechanics, with regard to both crack initiation and crack growth, are outlined. Limit state modeling techniques from structural reliability theory are shown to be appropriate for application to this problem, for both individual failure mode and system-level assessment. In Part 2, the results of a case study for the high pressure turbine of a turboprop engine are described. The response surface approach is used to construct a fatigue performance function. This performance function is used with the First Order Reliability Method (FORM) to determine the probability of failure and the sensitivity of the fatigue life to the engine parameters for the first stage disk rim of the two stage turbine. A hybrid combination of regression and Monte Carlo simulation is used to incorporate time-dependent random variables. System reliability is used to determine the system probability of failure and the sensitivity of the system fatigue life to the engine parameters of the high pressure turbine. The variation in the primary hot gas and secondary cooling air, the uncertainty of the complex mission loading, and the scatter in the material data are considered.
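
    The probabilistic ingredients mentioned above (a performance function, random engine parameters, a probability of failure) can be illustrated with a crude Monte Carlo sketch. The limit state, distributions, and numbers below are invented for illustration and are not the response-surface/FORM analysis of the study.

        import random, math

        # Crude Monte Carlo reliability sketch: probability that fatigue life falls
        # short of a required life. Limit state and distributions are invented.

        def fatigue_life(stress_amplitude_mpa, material_coeff, material_exp):
            """Toy Basquin-style life model: N = C * S**(-m)."""
            return material_coeff * stress_amplitude_mpa ** (-material_exp)

        def probability_of_failure(required_cycles=1e7, samples=200_000):
            failures = 0
            for _ in range(samples):
                stress = random.gauss(300.0, 30.0)          # MPa, illustrative scatter
                coeff = math.exp(random.gauss(34.0, 0.5))   # lognormal material coefficient
                life = fatigue_life(max(stress, 1.0), coeff, material_exp=3.0)
                if life < required_cycles:
                    failures += 1
            return failures / samples

        print(f"Estimated P(failure) ~ {probability_of_failure():.3%}")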

  18. Initial value sensitivity of the Chinese stock market and its relationship with the investment psychology

    NASA Astrophysics Data System (ADS)

    Ying, Shangjun; Li, Xiaojun; Zhong, Xiuqin

    2015-04-01

    This paper discusses the initial value sensitivity (IVS) of the Chinese stock market, including the single stock market and the Chinese A-share stock market, with respect to real markets and evolving models. The aim is to explore the relationship between the IVS of the Chinese A-share stock market and investment psychology, based on the evolving genetic cellular automaton (GCA) model. We find: (1) The Chinese stock market is sensitively dependent on the initial conditions. (2) The GCA model provides considerable reliability in simulating complexity (e.g., the IVS). (3) The IVS of the stock market is positively correlated with the imitation probability when the intensity of the imitation psychology reaches a certain threshold. The paper suggests that the government should seek to keep the imitation psychology below a certain level; otherwise it may induce severe fluctuations in the market.
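
    The notion of initial value sensitivity used above can be illustrated on a toy system: evolve a simple cellular automaton from two initial states that differ in a single cell and track how the trajectories diverge. The rule and the metric below are generic illustrations, not the GCA market model of the paper.

        import random

        # Toy illustration of initial-value sensitivity: evolve a 1-D cellular
        # automaton from two initial states differing in one cell and measure the
        # Hamming distance between the runs. The rule is generic, not the paper's GCA.

        def step(cells, rule=110):
            """One update of an elementary cellular automaton with periodic boundaries."""
            n = len(cells)
            out = []
            for i in range(n):
                idx = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
                out.append((rule >> idx) & 1)
            return out

        def divergence(n_cells=200, steps=50, seed=1):
            random.seed(seed)
            a = [random.randint(0, 1) for _ in range(n_cells)]
            b = list(a)
            b[n_cells // 2] ^= 1                      # single-cell perturbation
            distances = []
            for _ in range(steps):
                a, b = step(a), step(b)
                distances.append(sum(x != y for x, y in zip(a, b)))
            return distances

        print(divergence()[::10])   # Hamming distance sampled every 10 steps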

  19. Business of reliability

    NASA Astrophysics Data System (ADS)

    Engel, Pierre

    1999-12-01

    The presentation is organized around three themes: (1) The decrease in reception equipment costs allows non-remote-sensing organizations to access a technology until recently reserved for a scientific elite. What this means is the rise of 'operational' executive agencies considering space-based technology and operations as a viable input to their daily tasks. This is possible thanks to totally dedicated ground receiving entities focusing on one application for themselves, rather than serving a vast community of users. (2) The multiplication of earth observation platforms will form the base for reliable technical and financial solutions. One obstacle to the growth of the earth observation industry is the variety of policies (commercial versus non-commercial) ruling the distribution of the data and value-added products. In particular, the high volume of data sales required for the return on investment conflicts with traditional low-volume data use for most applications. Constant access to data sources presupposes monitoring needs as well as technical proficiency. (3) Large-volume use of data coupled with low-cost equipment is only possible when the technology has proven reliable, in terms of application results, financial risks and data supply. Each of these factors is reviewed. The expectation is that international cooperation between agencies and private ventures will pave the way for future business models. As an illustration, the presentation proposes to use some recent non-traditional monitoring applications that may lead to significant use of earth observation data, value-added products and services: flood monitoring, ship detection, marine oil pollution deterrent systems and rice acreage monitoring.

  20. Testing for PV Reliability (Presentation)

    SciTech Connect

    Kurtz, S.; Bansal, S.

    2014-09-01

    The DOE SUNSHOT workshop is seeking input from the community about PV reliability and how the DOE might address gaps in understanding. This presentation describes the types of testing that are needed for PV reliability and introduces a discussion to identify gaps in our understanding of PV reliability testing.

  1. Factor reliability into load management

    SciTech Connect

    Feight, G.R.

    1983-07-01

    Hardware reliability is a major factor to consider when selecting a direct-load-control system. The author outlines a method of estimating present-value costs associated with system reliability. He points out that small differences in receiver reliability make a significant difference in owning cost. 4 figures.
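
    A back-of-envelope version of the present-value comparison the author describes might look like the sketch below; the failure-rate, repair-cost, and discount-rate figures are hypothetical and serve only to show how a small reliability difference shifts owning cost.

        # Hypothetical present-value owning-cost comparison for load-control receivers
        # with different reliabilities. All figures are illustrative.

        def present_value_cost(unit_price, annual_failure_rate, repair_cost,
                               units=10_000, years=15, discount_rate=0.08):
            cost = unit_price * units                      # up-front purchase
            for year in range(1, years + 1):
                expected_repairs = annual_failure_rate * units * repair_cost
                cost += expected_repairs / (1 + discount_rate) ** year
            return cost

        cheap_but_flaky = present_value_cost(unit_price=80, annual_failure_rate=0.05, repair_cost=120)
        pricier_reliable = present_value_cost(unit_price=95, annual_failure_rate=0.01, repair_cost=120)
        print(f"{cheap_but_flaky:,.0f} vs {pricier_reliable:,.0f}")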

  2. Optimum Reliability of Gain Scores.

    ERIC Educational Resources Information Center

    Sharma, K. K.; Gupta, J. K.

    1986-01-01

    This paper gives a mathematical treatment to findings of Zimmerman and Williams and establishes a minimum reliability for gain scores when the pretest and posttest have equal reliabilities and equal standard deviations. It discusses the behavior of the reliability of gain scores in terms of variations in other test parameters. (Author/LMO)
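
    The classical quantity at issue here is the reliability of a difference (gain) score. The sketch below implements the standard textbook formula and shows the special case of equal reliabilities and equal standard deviations mentioned in the abstract; it illustrates the conventional formula, not the authors' specific derivation.

        # Classical reliability of a gain (difference) score D = posttest - pretest.
        # Standard textbook formula, shown only to illustrate the quantities the
        # abstract discusses; the numbers are illustrative.

        def gain_score_reliability(r_xx, r_yy, r_xy, sd_x, sd_y):
            """r_DD = (sd_x^2*r_xx + sd_y^2*r_yy - 2*r_xy*sd_x*sd_y)
                     / (sd_x^2 + sd_y^2 - 2*r_xy*sd_x*sd_y)"""
            num = sd_x**2 * r_xx + sd_y**2 * r_yy - 2 * r_xy * sd_x * sd_y
            den = sd_x**2 + sd_y**2 - 2 * r_xy * sd_x * sd_y
            return num / den

        # With equal reliabilities and equal SDs this reduces to (r - r_xy) / (1 - r_xy).
        print(gain_score_reliability(r_xx=0.9, r_yy=0.9, r_xy=0.5, sd_x=10, sd_y=10))  # 0.8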

  3. Reliability prediction of corroding pipelines

    SciTech Connect

    Strutt, J.E.; Allsopp, K.; Newman, D.; Trille, C.

    1996-12-01

    Recent data collection studies relating to loss of containment of pipelines and risers indicate that corrosion is now the dominant failure mode for steel pipelines in the North Sea area. As the North Sea pipeline infrastructure ages, it is expected that the proportion of pipelines failing by corrosion will increase further, and this raises the question of the relationship between the probability of pipeline corrosion failure and the reliability of the corrosion control and monitoring systems used by operators to prevent corrosion failures. This paper describes a methodology for predicting the probability of corrosion failure of a specific submarine pipeline or riser system. The paper illustrates how the model can be used to predict the safe life of a pipeline, given knowledge of the underlying corrosion behavior and corrosion control system, and how the time to failure can be updated in the light of inspection and monitoring results, enabling inspection policy to be evaluated for its impact on risk. The paper also shows how different assumptions concerning the underlying cause of failure influence the estimation of the probability of failure.
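
    One common way to cast this kind of problem is as a limit-state Monte Carlo: the pipe is deemed failed once corrosion depth exceeds a critical fraction of the wall thickness. The sketch below estimates a probability of failure as a function of time; the corrosion-rate distribution and wall data are invented for illustration and are not the authors' model.

        import random

        # Illustrative limit-state Monte Carlo for corrosion failure of a pipeline.
        # Distributions and wall data are invented; this is not the paper's model.

        def probability_of_failure(years, wall_mm=12.0, critical_fraction=0.8,
                                   samples=100_000, seed=42):
            random.seed(seed)
            failures = 0
            for _ in range(samples):
                rate = random.lognormvariate(mu=-1.6, sigma=0.5)   # mm/year, illustrative
                depth = rate * years
                if depth > critical_fraction * wall_mm:
                    failures += 1
            return failures / samples

        for t in (10, 20, 30, 40):
            print(t, "years:", f"{probability_of_failure(t):.3%}")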

  4. Evolving spiking networks with variable resistive memories.

    PubMed

    Howard, Gerard; Bull, Larry; de Lacy Costello, Ben; Gale, Ella; Adamatzky, Andrew

    2014-01-01

    Neuromorphic computing is a brainlike information processing paradigm that requires adaptive learning mechanisms. A spiking neuro-evolutionary system is used for this purpose; plastic resistive memories are implemented as synapses in spiking neural networks. The evolutionary design process exploits parameter self-adaptation and allows the topology and synaptic weights to be evolved for each network in an autonomous manner. Variable resistive memories are the focus of this research; each synapse has its own conductance profile which modifies the plastic behaviour of the device and may be altered during evolution. These variable resistive networks are evaluated on a noisy robotic dynamic-reward scenario against two static resistive memories and a system containing standard connections only. The results indicate that the extra behavioural degrees of freedom available to the networks incorporating variable resistive memories enable them to outperform the comparative synapse types. PMID:23614774

  5. Renal cell carcinoma: Evolving and emerging subtypes

    PubMed Central

    Crumley, Suzanne M; Divatia, Mukul; Truong, Luan; Shen, Steven; Ayala, Alberto G; Ro, Jae Y

    2013-01-01

    Our knowledge of renal cell carcinoma (RCC) is rapidly expanding. For those who diagnose and treat RCC, it is important to understand the new developments. In recent years, many new renal tumors have been described and defined, and our understanding of the biology and clinical correlates of these tumors is changing. Evolving concepts in Xp11 translocation carcinoma, mucinous tubular and spindle cell carcinoma, multilocular cystic clear cell RCC, and carcinoma associated with neuroblastoma are addressed within this review. Tubulocystic carcinoma, thyroid-like follicular carcinoma of kidney, acquired cystic disease-associated RCC, and clear cell papillary RCC are also described. Finally, candidate entities, including RCC with t(6;11) translocation, hybrid oncocytoma/chromophobe RCC, hereditary leiomyomatosis and RCC syndrome, and renal angiomyoadenomatous tumor are reviewed. Knowledge of these new entities is important for diagnosis, treatment and subsequent prognosis. This review provides a targeted summary of new developments in RCC. PMID:24364021

  6. Evolving olfactory systems on the fly.

    PubMed

    Ramdya, Pavan; Benton, Richard

    2010-07-01

    The detection of odour stimuli in the environment is universally important for primal behaviours such as feeding, mating, kin interactions and escape responses. Given the ubiquity of many airborne chemical signals and the similar organisation of animal olfactory circuits, a fundamental question in our understanding of the sense of smell is how species-specific behavioural responses to odorants can evolve. Recent comparative genomic, developmental and physiological studies are shedding light on this problem by providing insights into the genetic mechanisms that underlie anatomical and functional evolution of the olfactory system. Here we synthesise these data, with a particular focus on insect olfaction, to address how new olfactory receptors and circuits might arise and diverge, offering glimpses into how odour-evoked behaviours could adapt to an ever-changing chemosensory world.

  7. Evolving unipolar memristor spiking neural networks

    NASA Astrophysics Data System (ADS)

    Howard, David; Bull, Larry; De Lacy Costello, Ben

    2015-10-01

    Neuromorphic computing - brain-like computing in hardware - typically requires myriad complementary metal oxide semiconductor spiking neurons interconnected by a dense mesh of nanoscale plastic synapses. Memristors are frequently cited as strong synapse candidates due to their statefulness and potential for low-power implementations. To date, plentiful research has focused on the bipolar memristor synapse, which is capable of incremental weight alterations and can provide adaptive self-organisation under a Hebbian learning scheme. In this paper, we consider the unipolar memristor synapse - a device capable of non-Hebbian switching between only two states (conductive and resistive) through application of a suitable input voltage - and discuss its suitability for neuromorphic systems. A self-adaptive evolutionary process is used to autonomously find highly fit network configurations. Experimentation on two robotics tasks shows that unipolar memristor networks evolve task-solving controllers faster than both bipolar memristor networks and networks containing constant non-plastic connections whilst performing at least comparably.

  8. Synchronization in evolving snowdrift game model

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Wu, L.; Zhu, S. Q.

    2009-06-01

    The interaction between the evolution of the game and the underlying network structure is investigated with an evolving snowdrift game model. The constructed network follows a power-law degree distribution, showing a typical scale-free feature. The topological features of average path length, clustering coefficient, degree-degree correlations and the dynamical feature of synchronizability are studied. The synchronizability of the constructed networks changes through the interaction. It will converge to a certain value when sufficient new nodes are added. It is found that the initial payoffs of nodes greatly affect the synchronizability. When initial payoffs for players are equal, low common initial payoffs may lead to more heterogeneity of the network and good synchronizability. When initial payoffs follow certain distributions, better synchronizability is obtained compared to equal initial payoffs. The result also holds for phase synchronization of nonidentical oscillators.

  9. Hyperhidrosis: evolving concepts and a comprehensive review.

    PubMed

    Vorkamp, Tobias; Foo, Fung Joon; Khan, Sidra; Schmitto, Jan D; Wilson, Paul

    2010-10-01

    Hyperhidrosis (primary or secondary) describes a disorder of excessive sweating. It has a significant negative impact on quality of life and affects nearly 1% of the population living in the United Kingdom (UK). Axillary involvement is the most common, affecting 80% of cases. A common link in these disorders is an extreme non-thermoregulatory sympathetic stimulus of exocrine sweat glands, mostly due to emotional stimuli. Non-surgical treatment involves topical medication, iontophoresis and systemic anti-cholinergics. More recently, the use of intradermal botulinum toxin has gained popularity. Surgical treatment, reserved for severe cases not responding to conservative management, involves local excision, curettage and thoracoscopic sympathectomy. Evolving concepts for treatment, and their risks and benefits, are discussed herein. PMID:20709287

  10. A local-world evolving hypernetwork model

    NASA Astrophysics Data System (ADS)

    Yang, Guang-Yong; Liu, Jian-Guo

    2014-01-01

    Complex hypernetworks are ubiquitous in real systems, and it is very important to investigate their evolution mechanisms. In this paper, we present a local-world evolving hypernetwork model by taking into account the hyperedge growth and local-world hyperedge preferential attachment mechanisms. At each time step, a newly added hyperedge encircles a new coming node and a number of nodes from a randomly selected local world. The number of nodes selected from the local world obeys the uniform distribution and its mean value is m. The analytical and simulation results show that the hyperdegree approximately obeys the power-law form and the exponent of the hyperdegree distribution is γ = 2 + 1/m. Furthermore, we numerically investigate the node degree, hyperedge degree, clustering coefficient, as well as the average distance, and find that the hypernetwork model shares the scale-free and small-world properties, which sheds some light on a deeper understanding of the evolution mechanisms of real systems.

  11. The Evolving Theory of Evolutionary Radiations.

    PubMed

    Simões, M; Breitkreuz, L; Alvarado, M; Baca, S; Cooper, J C; Heins, L; Herzog, K; Lieberman, B S

    2016-01-01

    Evolutionary radiations have intrigued biologists for more than 100 years, and our understanding of the patterns and processes associated with these radiations continues to grow and evolve. Recently it has been recognized that there are many different types of evolutionary radiation beyond the well-studied adaptive radiations. We focus here on multifarious types of evolutionary radiations, paying special attention to the abiotic factors that might trigger diversification in clades. We integrate concepts such as exaptation, species selection, coevolution, and the turnover-pulse hypothesis (TPH) into the theoretical framework of evolutionary radiations. We also discuss other phenomena that are related to, but distinct from, evolutionary radiations that have relevance for evolutionary biology. PMID:26632984

  12. Evolving resistance among Gram-positive pathogens.

    PubMed

    Munita, Jose M; Bayer, Arnold S; Arias, Cesar A

    2015-09-15

    Antimicrobial therapy is a key component of modern medical practice and a cornerstone for the development of complex clinical interventions in critically ill patients. Unfortunately, the increasing problem of antimicrobial resistance is now recognized as a major public health threat jeopardizing the care of thousands of patients worldwide. Gram-positive pathogens exhibit an immense genetic repertoire to adapt and develop resistance to virtually all antimicrobials clinically available. As more molecules become available to treat resistant gram-positive infections, resistance emerges as an evolutionary response. Thus, antimicrobial resistance has to be envisaged as an evolving phenomenon that demands constant surveillance and continuous efforts to identify emerging mechanisms of resistance to optimize the use of antibiotics and create strategies to circumvent this problem. Here, we will provide a broad perspective on the clinical aspects of antibiotic resistance in relevant gram-positive pathogens with emphasis on the mechanistic strategies used by these organisms to avoid being killed by commonly used antimicrobial agents.

  13. Life cycle planning: An evolving concept

    SciTech Connect

    Moore, P.J.R.; Gorman, I.G.

    1994-12-31

    Life-cycle planning is an evolving concept in the management of oil and gas projects. BHP Petroleum now interprets this idea to include all development planning from discovery and field appraisal to final abandonment, and it encompasses safety, environmental, technical, plant, regulatory, and staffing issues. This article describes, in the context of the Timor Sea, how, despite initial successes and continuing facilities upgrades, BHPP came to perceive that current operations could be the victim of early development successes, particularly in the areas of corrosion and maintenance. The search for analogies elsewhere led to the UK North Sea, including the experiences of Britoil and BP, both of which performed detailed Life of Field studies in the late eighties. These materials have been used to construct a format and content for total life-cycle plans in general and to identify the social changes required to ensure their successful application in Timor Sea operations and deployment throughout Australia.

  14. Resiliently evolving supply-demand networks

    NASA Astrophysics Data System (ADS)

    Rubido, Nicolás; Grebogi, Celso; Baptista, Murilo S.

    2014-01-01

    The ability to design a transport network such that commodities are brought from suppliers to consumers in a steady, optimal, and stable way is of great importance for distribution systems nowadays. In this work, by using the circuit laws of Kirchhoff and Ohm, we provide the exact capacities of the edges that an optimal supply-demand network should have to operate stably under perturbations, i.e., without overloading. The perturbations we consider are the evolution of the connecting topology, the decentralization of hub sources or sinks, and the intermittence of supplier and consumer characteristics. We analyze these conditions and the impact of our results, both on the current United Kingdom power-grid structure and on numerically generated evolving archetypal network topologies.
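
    The Kirchhoff/Ohm viewpoint in the abstract translates directly into a linear solve: given net injections at the nodes, potentials follow from the graph Laplacian and edge flows from the potential differences. The small sketch below computes steady flows on a toy network; the topology, conductances, and injections are illustrative, not the UK power-grid data of the paper.

        import numpy as np

        # Steady commodity flows on a toy network via the Kirchhoff/Ohm analogy:
        # solve L @ potentials = injections, then flow on edge (i, j) is
        # conductance * (potential_i - potential_j). Numbers are illustrative.

        edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 0.5), (2, 3, 2.0)]   # (i, j, conductance)
        n = 4
        injections = np.array([1.0, 0.0, 0.0, -1.0])   # node 0 supplies, node 3 consumes

        L = np.zeros((n, n))
        for i, j, c in edges:
            L[i, i] += c
            L[j, j] += c
            L[i, j] -= c
            L[j, i] -= c

        # The Laplacian is singular (potentials are defined up to a constant);
        # pin node 3 to zero and solve the reduced system.
        phi = np.zeros(n)
        phi[:-1] = np.linalg.solve(L[:-1, :-1], injections[:-1])

        for i, j, c in edges:
            print(f"flow {i}->{j}: {c * (phi[i] - phi[j]):+.3f}")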

  15. Language as an evolving word web.

    PubMed

    Dorogovtsev, S N; Mendes, J F

    2001-12-22

    Human language may be described as a complex network of linked words. In such a treatment, each distinct word in language is a vertex of this web, and interacting words in sentences are connected by edges. The empirical distribution of the number of connections of words in this network is of a peculiar form that includes two pronounced power-law regions. Here we propose a theory of the evolution of language, which treats language as a self-organizing network of interacting words. In the framework of this concept, we completely describe the observed word web structure without any fitting. We show that the two regimes in the distribution naturally emerge from the evolutionary dynamics of the word web. It follows from our theory that the size of the core part of language, the 'kernel lexicon', does not vary as language evolves.

  16. Microbial communities evolve faster in extreme environments

    PubMed Central

    Li, Sheng-Jin; Hua, Zheng-Shuang; Huang, Li-Nan; Li, Jie; Shi, Su-Hua; Chen, Lin-Xing; Kuang, Jia-Liang; Liu, Jun; Hu, Min; Shu, Wen-Sheng

    2014-01-01

    Evolutionary analysis of microbes at the community level represents a new research avenue linking ecological patterns to evolutionary processes, but remains insufficiently studied. Here we report a relative evolutionary rates (rERs) analysis of microbial communities from six diverse natural environments based on 40 metagenomic samples. We show that the rERs of microbial communities are mainly shaped by environmental conditions, and the microbes inhabiting extreme habitats (acid mine drainage, saline lake and hot spring) evolve faster than those populating benign environments (surface ocean, fresh water and soil). These findings were supported by the observation of more relaxed purifying selection and potentially frequent horizontal gene transfers in communities from extreme habitats. The mechanism of high rERs was proposed as high mutation rates imposed by stressful conditions during the evolutionary processes. This study brings us one stage closer to an understanding of the evolutionary mechanisms underlying the adaptation of microbes to extreme environments. PMID:25158668

  17. Evolving circuits in seconds: experiments with a stand-alone board-level evolvable system

    NASA Technical Reports Server (NTRS)

    Stoica, A.; Zebulum, R. S.; Ferguson, M. I.; Keymeulen, D.; Duong, V.; Guo, X.

    2002-01-01

    The purpose of this paper is twofold: first, to illustrate a stand-alone board-level evolvable system (SABLES) and its performance, and second to illustrate some problems that occur during evolution with real hardware in the loop, or when the intention of the user is not completely reflected in the fitness function.

  18. How evolved psychological mechanisms empower cultural group selection.

    PubMed

    Henrich, Joseph; Boyd, Robert

    2016-01-01

    Driven by intergroup competition, social norms, beliefs, and practices can evolve in ways that more effectively tap into a wide variety of evolved psychological mechanisms to foster group-beneficial behavior. The more powerful such evolved mechanisms are, the more effectively culture can potentially harness and manipulate them to generate greater phenotypic variation across groups, thereby fueling cultural group selection. PMID:27561383

  19. Evolved mechanisms in depression: the role and interaction of attachment and social rank in depression.

    PubMed

    Sloman, L; Gilbert, P; Hasey, G

    2003-04-01

    Evolved mechanisms underpinning attachment and social rank behavior may be the basis for some forms of major depression, especially those associated with chronic stress. We note the heterogeneity of depression, but suggest that some of its core symptoms, such as behavioral withdrawal, low self-esteem and anhedonia, may have evolved in order to regulate behavior and mood and convey sensitivity to threats and safety. Focusing on the evolved mental mechanisms for attachment and social rank helps to make sense of (1) depression's common early vulnerability factors (e.g., attachment disruptions, neglect and abuse), (2) the triggering events (e.g., loss of close relationships, being defeated and/or trapped in low socially rewarding or hostile environments), and (3) the psychological preoccupations of depressed people (e.g., sense of unlovableness, self as inferior and a failure). This focus offers clues as to how these two systems interact and on how to intervene.

  20. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
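
    RML itself is not reproduced in the record above, but the modular idea it describes, computing module reliabilities separately and then combining them into a total system figure, can be sketched as follows. The series/parallel structure, the exponential failure model, and the failure rates are invented for illustration and are not drawn from REST or RML.

    ```python
    # Sketch of modular reliability combination (not RML): each module exposes a
    # reliability R(t); the system combines modules in series (all must work) or
    # parallel (at least one must work).
    import math

    def module_reliability(failure_rate, t):
        """Constant-failure-rate (exponential) model: R(t) = exp(-lambda * t)."""
        return math.exp(-failure_rate * t)

    def series(*rs):
        out = 1.0
        for r in rs:
            out *= r
        return out

    def parallel(*rs):
        out = 1.0
        for r in rs:
            out *= (1.0 - r)
        return 1.0 - out

    t = 1000.0                                        # hypothetical mission time, hours
    cpu = module_reliability(1e-5, t)
    bus = module_reliability(2e-6, t)
    sensors = parallel(module_reliability(5e-5, t),
                       module_reliability(5e-5, t))   # redundant sensor pair
    print("system reliability:", series(cpu, bus, sensors))
    ```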

  1. Sustaining an International Partnership: An Evolving Collaboration

    ERIC Educational Resources Information Center

    Pierson, Melinda R.; Myck-Wayne, Janice; Stang, Kristin K.; Basinska, Anna

    2015-01-01

    Universities across the United States have an increasing interest in international education. Increasing global awareness through educational collaborations will promote greater cross-cultural understanding and build effective relationships with diverse communities. This paper documents one university's effort to build an effective international…

  2. Mercury-T: Tidally evolving multi-planet systems code

    NASA Astrophysics Data System (ADS)

    Bolmont, Emeline; Raymond, Sean N.; Leconte, Jeremy; Hersant, Franck; Correia, Alexandre C. M.

    2015-11-01

    Mercury-T calculates the evolution of semi-major axis, eccentricity, inclination, rotation period and obliquity of the planets as well as the rotation period evolution of the host body; it is based on the N-body code Mercury (Chambers 1999, ascl:1201.008). It is flexible, allowing computation of the tidal evolution of systems orbiting any non-evolving object (if its mass, radius, dissipation factor and rotation period are known), but also evolving brown dwarfs (BDs) of mass between 0.01 and 0.08 M⊙, an evolving M-dwarf of 0.1 M⊙, an evolving Sun-like star, and an evolving Jupiter.

  3. Heritability and evolvability of fitness and nonfitness traits: Lessons from livestock.

    PubMed

    Hoffmann, Ary A; Merilä, Juha; Kristensen, Torsten N

    2016-08-01

    Data from natural populations have suggested a disconnection between trait heritability (variance-standardized additive genetic variance, VA) and evolvability (mean-standardized VA) and emphasized the importance of environmental variation as a determinant of trait heritability but not evolvability. However, these inferences are based on heterogeneous and often small datasets across species from different environments. We surveyed the relationship between evolvability and heritability in >100 traits in farmed cattle, taking advantage of large sample sizes and consistent genetic approaches. Heritability and evolvability estimates were positively correlated (r = 0.37/0.54 on untransformed/log scales) reflecting a substantial impact of VA on both measures. Furthermore, heritabilities and residual variances were uncorrelated. The differences between this and previously described patterns may reflect lower environmental variation experienced in farmed systems, but also low and heterogeneous quality of data from natural populations. Similar to studies on wild populations, heritabilities for life-history and behavioral traits were lower than for other traits. Traits having extremely low heritabilities and evolvabilities (17% of the studied traits) were almost exclusively life-history or behavioral traits, suggesting that evolutionary constraints stemming from lack of genetic variability are likely to be most common for classical "fitness" (cf. life-history) rather than for "nonfitness" (cf. morphological) traits.
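
    A minimal sketch of the two standardizations contrasted above, heritability h2 = VA/VP and evolvability IA = VA/mean^2, applied to toy numbers (the trait values below are hypothetical, not the cattle data):

    ```python
    # Heritability vs. evolvability for a few hypothetical traits:
    # h2 = VA / VP (variance-standardized), IA = VA / mean**2 (mean-standardized).
    traits = [
        # (name, additive genetic variance VA, phenotypic variance VP, trait mean)
        ("milk_yield", 4.0e5, 1.6e6, 8000.0),
        ("calving_interval", 40.0, 900.0, 400.0),
        ("body_weight", 900.0, 2500.0, 600.0),
    ]

    h2 = [va / vp for _, va, vp, _ in traits]
    ia = [va / mean ** 2 for _, va, _, mean in traits]

    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / (sxx * syy) ** 0.5

    print("heritabilities:", h2)
    print("evolvabilities:", ia)
    print("correlation:", pearson(h2, ia))   # the paper asks whether this is positive
    ```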

  4. Molecular line emission in asymmetric envelopes of evolved stars

    NASA Astrophysics Data System (ADS)

    Sanchez, Andres Felipe Perez

    2014-06-01

    Stars with initial masses in the range 0.8 M⊙ < M < 9 M⊙ eject most of their mass while evolving along the asymptotic giant branch (AGB) phase. The ejected material eventually cools down and condenses, forming dust grains and molecular gas around the star and creating an extended circumstellar envelope (CSE). The mechanism responsible for the expansion of the dusty, dense CSEs is not completely understood. It has been suggested that stellar radiation pressure on the dust particles can accelerate them outwards; by collisional exchange of momentum, the dust particles then drag along the molecular gas. However, this scenario cannot explain the onset of the asymmetries observed in the CSEs of more evolved sources such as post-AGB stars and planetary nebulae. Part of the research in this thesis focuses on the role that the stellar magnetic field plays in the formation of the collimated, high-velocity outflows observed towards post-AGB sources. Polarized maser emission towards (post-)AGB stars has become a useful tool to determine the properties of the stellar magnetic fields permeating their CSEs. However, the polarization fraction detected can be affected by non-Zeeman effects. Here I present the results of our analysis of the polarization properties of SiO, H2O and HCN maser emission in the (sub-)millimetre wavelength range. The goal of this analysis is to determine whether polarized maser emission from these molecular species can be used as a reliable tracer of the magnetic field in observations at (sub-)millimetre wavelengths. I also present the results of radio interferometric observations of both continuum and polarized maser emission towards post-AGB stars. The sources observed are characterized by H2O maser emission arising from their collimated, high-velocity outflows. The observations were carried out with the Australia Telescope Compact Array, aiming to detect both polarized maser emission and non-thermal radio continuum emission

  5. Ethical Implications of Validity-vs.-Reliability Trade-Offs in Educational Research

    ERIC Educational Resources Information Center

    Fendler, Lynn

    2016-01-01

    In educational research that calls itself empirical, the relationship between validity and reliability is that of trade-off: the stronger the bases for validity, the weaker the bases for reliability (and vice versa). Validity and reliability are widely regarded as basic criteria for evaluating research; however, there are ethical implications of…

  6. Nuclear weapon reliability evaluation methodology

    SciTech Connect

    Wright, D.L.

    1993-06-01

    This document provides an overview of those activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon's stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of review opportunities that occur throughout the entire process, which assure a consistent approach and appropriate use of the data for reliability evaluation purposes.

  7. Speciation genetics: current status and evolving approaches

    PubMed Central

    Wolf, Jochen B. W.; Lindell, Johan; Backström, Niclas

    2010-01-01

    The view of species as entities subjected to natural selection and amenable to change put forth by Charles Darwin and Alfred Wallace laid the conceptual foundation for understanding speciation. Initially marred by a rudimental understanding of hereditary principles, evolutionists gained appreciation of the mechanistic underpinnings of speciation following the merger of Mendelian genetic principles with Darwinian evolution. Only recently have we entered an era where deciphering the molecular basis of speciation is within reach. Much focus has been devoted to the genetic basis of intrinsic postzygotic isolation in model organisms and several hybrid incompatibility genes have been successfully identified. However, concomitant with the recent technological advancements in genome analysis and a newfound interest in the role of ecology in the differentiation process, speciation genetic research is becoming increasingly open to non-model organisms. This development will expand speciation research beyond the traditional boundaries and unveil the genetic basis of speciation from manifold perspectives and at various stages of the splitting process. This review aims at providing an extensive overview of speciation genetics. Starting from key historical developments and core concepts of speciation genetics, we focus much of our attention on evolving approaches and introduce promising methodological approaches for future research venues. PMID:20439277

  8. The Evolving Role of Midwives as Laborists.

    PubMed

    DeJoy, Susan A; Sankey, Heather Z; Dickerson, Anissa E; Psaltis, Audrey; Galli, Amy; Burkman, Ronald T

    2015-01-01

    This article examines the history and present state of the midwife as laborist. The role of the midwife and obstetrician laborist/hospitalist is rapidly evolving, driven by the need to improve patient safety, the need to provide direct care in light of reduced resident work hours, the practice demands experienced by community providers, and other factors. Models under development are customized to meet the needs of different communities and hospitals. Midwives are playing a prominent role in many laborist/hospitalist practices as the first-line hospital provider or as part of a team with physicians. Some models incorporate certified nurse-midwives/certified midwives as faculty to residents and medical students. The midwifery laborist/hospitalist practices at Baystate Medical Center in Springfield, Massachusetts, are presented as an example of how midwives are functioning as laborists. Essential components of a successful midwife laborist program include interdisciplinary planning, delineation of problems the model should solve, establishment of program metrics, clear practice guidelines and role definitions, and a plan for sustained funding. This article is part of a special series of articles that address midwifery innovations in clinical practice, education, interprofessional collaboration, health policy, and global health. PMID:26619374

  9. Are Electronic Cardiac Devices Still Evolving?

    PubMed Central

    Mabo, P.

    2014-01-01

    Summary Objectives The goal of this paper is to review some important issues in implantable cardiac devices over the past year. Methods The first cardiac implantable device was proposed to maintain an adequate heart rate, either because the heart's natural pacemaker is not fast enough or because there is a block in the heart's electrical conduction system. During the last forty years, pacemakers have evolved considerably, becoming programmable and allowing configuration of patient-specific optimal pacing modes. Various technological aspects (electrodes, connectors, diagnostic algorithms, therapies, …) have progressed, and cardiac implants now address several clinical applications: management of arrhythmias, cardioversion/defibrillation, and cardiac resynchronization therapy. Results Observed progress includes miniaturization of devices and increased longevity, coupled with efficient pacing functions, multisite pacing modes, leadless pacing, and better recognition of supraventricular and ventricular tachycardias in order to deliver appropriate therapy. Subcutaneous implants, new modes of stimulation (leadless implants or ultrasound leads), quadripolar leads, and new sensors and algorithms for hemodynamic management are introduced and briefly described. In each case, the main results of the past two years are highlighted and placed in historical context, and remaining limitations are addressed. Conclusion Some important technological improvements are described. New trends for the future, such as remote patient follow-up and the treatment of heart failure by neuromodulation, are also considered in a dedicated section. PMID:25123732

  10. Women and Ischemic Heart Disease: Evolving Knowledge

    PubMed Central

    Shaw, Leslee J.; Bugiardini, Raffaelle; Merz, C. Noel Bairey

    2009-01-01

    Evolving knowledge regarding sex differences in coronary heart disease (CHD) is emerging. Given the lower burden of obstructive coronary artery disease (CAD) and preserved systolic function in women, contrasted by higher rates of myocardial ischemia and near-term mortality compared to men, we propose the term ischemic heart disease (IHD) as appropriate for this discussion specific to women, rather than CAD or CHD. This paradoxical difference, where women have lower rates of anatomical CAD but more symptoms, ischemia, and adverse outcomes, appears linked to coronary reactivity, which includes microvascular dysfunction. Novel risk factors, including inflammatory markers and reproductive hormones, as well as noninvasive imaging and functional capacity measurements, can improve the Framingham risk score. Risk for women with obstructive CAD is elevated compared to men, yet women are less likely to receive guideline-indicated therapies. In the setting of non-ST elevation acute myocardial infarction, interventional strategies are equally effective in biomarker-positive women and men, while conservative management is indicated for biomarker-negative women. For women with evidence of ischemia but no obstructive CAD, anti-anginal and anti-ischemic therapies can improve symptoms, endothelial function, and quality of life; however, trials evaluating adverse outcomes are needed. We hypothesize that women experience more adverse outcomes compared to men because obstructive CAD remains the current focus of therapeutic strategies. Continued research is indicated to devise therapeutic regimens to improve symptom burden and reduce risk in women with IHD. PMID:19833255

  11. Extreme insular dwarfism evolved in a mammoth.

    PubMed

    Herridge, Victoria L; Lister, Adrian M

    2012-08-22

    The insular dwarfism seen in Pleistocene elephants has come to epitomize the island rule; yet our understanding of this phenomenon is hampered by poor taxonomy. For Mediterranean dwarf elephants, where the most extreme cases of insular dwarfism are observed, a key systematic question remains unresolved: are all taxa phyletic dwarfs of a single mainland species Palaeoloxodon antiquus (straight-tusked elephant), or are some referable to Mammuthus (mammoths)? Ancient DNA and geochronological evidence have been used to support a Mammuthus origin for the Cretan 'Palaeoloxodon' creticus, but these studies have been shown to be flawed. On the basis of existing collections and recent field discoveries, we present new, morphological evidence for the taxonomic status of 'P'. creticus, and show that it is indeed a mammoth, most probably derived from Early Pleistocene Mammuthus meridionalis or possibly Late Pliocene Mammuthus rumanus. We also show that Mammuthus creticus is smaller than other known insular dwarf mammoths, and is similar in size to the smallest dwarf Palaeoloxodon species from Sicily and Malta, making it the smallest mammoth species known to have existed. These findings indicate that extreme insular dwarfism has evolved to a similar degree independently in two elephant lineages.

  12. Evolving Earth Models and Nuclear Explosion Monitoring

    NASA Astrophysics Data System (ADS)

    Wallace, T. C.

    2003-12-01

    One of the most important problems in nuclear explosion monitoring is accurate seismic event location. Traditional location methods rely on estimating travel times in a known Earth model and accounting for heterogeneity through various empirical corrections. The history of location accuracy and precision is closely coupled to evolving theories of the nature of the Earth's interior. LONG SHOT was an 80 kt explosion conducted on Amchitka Island on October 29, 1965. The travel times recorded from LONG SHOT deviated strongly from those of a radially symmetric Earth, and in fact showed a pattern consistent with a tabular body of relatively high seismic velocity (the subducting North American Plate), validating certain concepts of the then-new theory of plate tectonics. Each subsequent advance in conceptual models for the dynamics of the Earth's interior has impacted explosion monitoring. Many of the advances in the theory of the Earth's interior have been spurred by the ideas and work of Don Anderson. These include anelasticity, anisotropy, tomography, the Lehmann discontinuity, and mantle plumes (or the lack thereof). The present state-of-the-art monitoring paradigm incorporates a dynamic Earth model, and the synergy between verification research and basic research on the Earth's interior is quite important.

  13. Evolving Resistance Among Gram-positive Pathogens

    PubMed Central

    Munita, Jose M.; Bayer, Arnold S.; Arias, Cesar A.

    2015-01-01

    Antimicrobial therapy is a key component of modern medical practice and a cornerstone for the development of complex clinical interventions in critically ill patients. Unfortunately, the increasing problem of antimicrobial resistance is now recognized as a major public health threat jeopardizing the care of thousands of patients worldwide. Gram-positive pathogens exhibit an immense genetic repertoire to adapt and develop resistance to virtually all antimicrobials clinically available. As more molecules become available to treat resistant gram-positive infections, resistance emerges as an evolutionary response. Thus, antimicrobial resistance has to be envisaged as an evolving phenomenon that demands constant surveillance and continuous efforts to identify emerging mechanisms of resistance to optimize the use of antibiotics and create strategies to circumvent this problem. Here, we will provide a broad perspective on the clinical aspects of antibiotic resistance in relevant gram-positive pathogens with emphasis on the mechanistic strategies used by these organisms to avoid being killed by commonly used antimicrobial agents. PMID:26316558

  14. Extreme insular dwarfism evolved in a mammoth.

    PubMed

    Herridge, Victoria L; Lister, Adrian M

    2012-08-22

    The insular dwarfism seen in Pleistocene elephants has come to epitomize the island rule; yet our understanding of this phenomenon is hampered by poor taxonomy. For Mediterranean dwarf elephants, where the most extreme cases of insular dwarfism are observed, a key systematic question remains unresolved: are all taxa phyletic dwarfs of a single mainland species Palaeoloxodon antiquus (straight-tusked elephant), or are some referable to Mammuthus (mammoths)? Ancient DNA and geochronological evidence have been used to support a Mammuthus origin for the Cretan 'Palaeoloxodon' creticus, but these studies have been shown to be flawed. On the basis of existing collections and recent field discoveries, we present new, morphological evidence for the taxonomic status of 'P'. creticus, and show that it is indeed a mammoth, most probably derived from Early Pleistocene Mammuthus meridionalis or possibly Late Pliocene Mammuthus rumanus. We also show that Mammuthus creticus is smaller than other known insular dwarf mammoths, and is similar in size to the smallest dwarf Palaeoloxodon species from Sicily and Malta, making it the smallest mammoth species known to have existed. These findings indicate that extreme insular dwarfism has evolved to a similar degree independently in two elephant lineages. PMID:22572206

  15. Tearing Mode Stability of Evolving Toroidal Equilibria

    NASA Astrophysics Data System (ADS)

    Pletzer, A.; McCune, D.; Manickam, J.; Jardin, S. C.

    2000-10-01

    There are a number of toroidal equilibrium codes (such as JSOLVER, ESC, EFIT, and VMEC) and transport codes (such as TRANSP, BALDUR, and TSC) in our community that utilize differing equilibrium representations. There are also many heating and current drive codes (LSC and TORRAY) and stability codes (PEST1-3, GATO, NOVA, MARS, DCON, M3D) that require this equilibrium information. In an effort to provide seamless compatibility between the codes that produce these equilibria and the codes that need them, we have developed two Fortran 90 modules, MEQ and XPLASMA, that serve as common interfaces between the two classes of codes. XPLASMA provides a common equilibrium representation for the heating and current drive applications, while MEQ provides the common equilibrium and associated metric information needed by MHD stability codes. We illustrate the utility of this approach by presenting results of PEST-3 tearing stability calculations of an NSTX discharge performed on profiles provided by the TRANSP code. Using the MEQ module, the TRANSP equilibrium data are stored in a Fortran 90 derived type and passed to PEST-3 as a subroutine argument. All calculations are performed on the fly, as the profiles evolve.

  16. Origins of stereoselectivity in evolved ketoreductases

    PubMed Central

    Noey, Elizabeth L.; Tibrewal, Nidhi; Jiménez-Osés, Gonzalo; Osuna, Sílvia; Park, Jiyong; Bond, Carly M.; Cascio, Duilio; Liang, Jack; Zhang, Xiyun; Huisman, Gjalt W.; Tang, Yi; Houk, Kendall N.

    2015-01-01

    Mutants of Lactobacillus kefir short-chain alcohol dehydrogenase, used here as ketoreductases (KREDs), enantioselectively reduce the pharmaceutically relevant substrates 3-thiacyclopentanone and 3-oxacyclopentanone. These substrates differ by only the heteroatom (S or O) in the ring, but the KRED mutants reduce them with different enantioselectivities. Kinetic studies show that these enzymes are more efficient with 3-thiacyclopentanone than with 3-oxacyclopentanone. X-ray crystal structures of apo- and NADP+-bound selected mutants show that the substrate-binding loop conformational preferences are modified by these mutations. Quantum mechanical calculations and molecular dynamics (MD) simulations are used to investigate the mechanism of reduction by the enzyme. We have developed an MD-based method for studying the diastereomeric transition state complexes and rationalize different enantiomeric ratios. This method, which probes the stability of the catalytic arrangement within the theozyme, shows a correlation between the relative fractions of catalytically competent poses for the enantiomeric reductions and the experimental enantiomeric ratio. Some mutations, such as A94F and Y190F, induce conformational changes in the active site that enlarge the small binding pocket, facilitating accommodation of the larger S atom in this region and enhancing S-selectivity with 3-thiacyclopentanone. In contrast, in the E145S mutant and the final variant evolved for large-scale production of the intermediate for the antibiotic sulopenem, R-selectivity is promoted by shrinking the small binding pocket, thereby destabilizing the pro-S orientation. PMID:26644568

  17. An evolving model of online bipartite networks

    NASA Astrophysics Data System (ADS)

    Zhang, Chu-Xu; Zhang, Zi-Ke; Liu, Chuang

    2013-12-01

    Understanding the structure and evolution of online bipartite networks is a significant task, since they play a crucial role in various e-commerce services nowadays. Recently, various models have been proposed, resulting in either power-law or exponential degree distributions. However, many empirical results show that the user degree distribution actually follows a shifted power law, the so-called Mandelbrot law, which cannot be fully described by previous models. In this paper, we propose an evolving model that considers two different user behaviors: random and preferential attachment. Extensive empirical results on two real bipartite networks, Delicious and CiteULike, show that the theoretical model can well characterize the structure of real networks for both user and object degree distributions. In addition, we introduce a structural parameter p to demonstrate that the hybrid user behavior leads to the shifted power-law degree distribution, and that the extent of the power-law tail increases with p. The proposed model might shed some light on the underlying laws governing the structure of real online bipartite networks.
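
    The update rules are not spelled out in the record above, so the sketch below is only a generic illustration of how mixing random and preferential attachment produces a shifted (Mandelbrot-like) power law rather than a pure one; for simplicity it tracks object degrees, and all parameters are assumptions.

    ```python
    # Each new user links to m objects, choosing each one preferentially with
    # probability p and uniformly at random with probability 1 - p.
    import random
    from collections import Counter

    def grow_bipartite(n_users=20000, n_objects=2000, m=3, p=0.7, seed=1):
        rng = random.Random(seed)
        obj_degree = [1] * n_objects            # give every object an initial weight
        obj_targets = list(range(n_objects))    # multiset used for preferential sampling
        for _ in range(n_users):
            links = set()
            while len(links) < m:
                if rng.random() < p:
                    links.add(rng.choice(obj_targets))    # preferential choice
                else:
                    links.add(rng.randrange(n_objects))   # uniform random choice
            for o in links:
                obj_degree[o] += 1
                obj_targets.append(o)
        return obj_degree

    hist = Counter(grow_bipartite())
    for k in sorted(hist)[:10]:
        print(k, hist[k])   # degrees follow a shifted power law, P(k) ~ (k + c)^(-gamma)
    ```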

  18. Regulatory mechanisms link phenotypic plasticity to evolvability

    PubMed Central

    van Gestel, Jordi; Weissing, Franz J.

    2016-01-01

    Organisms have a remarkable capacity to respond to environmental change. They can either respond directly, by means of phenotypic plasticity, or they can slowly adapt through evolution. Yet, how phenotypic plasticity links to evolutionary adaptability is largely unknown. Current studies of plasticity tend to adopt a phenomenological reaction norm (RN) approach, which neglects the mechanisms underlying plasticity. Focusing on a concrete question – the optimal timing of bacterial sporulation – we here also consider a mechanistic approach, the evolution of a gene regulatory network (GRN) underlying plasticity. Using individual-based simulations, we compare the RN and GRN approach and find a number of striking differences. Most importantly, the GRN model results in a much higher diversity of responsive strategies than the RN model. We show that each of the evolved strategies is pre-adapted to a unique set of unseen environmental conditions. The regulatory mechanisms that control plasticity therefore critically link phenotypic plasticity to the adaptive potential of biological populations. PMID:27087393

  19. Gastric cancer: current and evolving treatment landscape.

    PubMed

    Sun, Weijing; Yan, Li

    2016-01-01

    Gastric (including gastroesophageal junction) cancer is the third leading cause of cancer-related death in the world. In China, an estimated 420,000 patients were diagnosed with gastric cancer in 2011, ranking this malignancy the second most prevalent cancer type and resulting in near 300,000 deaths. The treatment landscape of gastric cancer has evolved in recent years. Although systemic chemotherapy is still the mainstay treatment of metastatic disease, the introduction of agents targeting human epidermal growth factor receptor 2 and vascular endothelial growth factor/vascular endothelia growth factor receptor has brought this disease into the molecular and personalized medicine era. The preliminary yet encouraging clinical efficacy observed with immune checkpoint inhibitors, e.g., anti-programmed cell death protein 1/programmed death-ligand 1, will further shape the treatment landscape for gastric cancer. Molecular characterization of patients will play a critical role in developing new agents, as well as in implementing new treatment options for this disease. PMID:27581465

  20. Metapopulation capacity of evolving fluvial landscapes

    NASA Astrophysics Data System (ADS)

    Bertuzzo, Enrico; Rodriguez-Iturbe, Ignacio; Rinaldo, Andrea

    2015-04-01

    The form of fluvial landscapes is known to attain stationary network configurations that settle in dynamically accessible minima of total energy dissipation by landscape-forming discharges. Recent studies have highlighted the role of the dendritic structure of river networks in controlling population dynamics of the species they host and large-scale biodiversity patterns. Here, we systematically investigate the relation between energy dissipation, the physical driver for the evolution of river networks, and the ecological dynamics of their embedded biota. To that end, we use the concept of metapopulation capacity, a measure linking landscape structures with the population dynamics they host. Technically, metapopulation capacity is the leading eigenvalue λM of an appropriate "landscape" matrix and determines whether a given species is predicted to persist in the long run. λM can conveniently be used to rank different landscapes in terms of their capacity to support viable metapopulations. We study how λM changes in response to the evolving network configurations of spanning trees. Such sequences of configurations are theoretically known to relate network selection to general landscape evolution equations through imperfect searches for dynamically accessible states, frustrated by the vagaries of Nature. Results show that the process shaping the metric and topological properties of river networks, prescribed by physical constraints, leads to a progressive increase in the corresponding metapopulation capacity and therefore in the landscape's capacity to support metapopulations, with implications for biodiversity in fluvial ecosystems.
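
    The eigenvalue computation itself is straightforward; a minimal sketch is given below using the classic Hanski-Ovaskainen form of the landscape matrix (M_ij = A_i A_j exp(-alpha d_ij) with a zero diagonal). The paper builds its matrix from river-network structure instead, so this form and the patch data are assumptions made only for illustration.

    ```python
    # Metapopulation capacity lambda_M as the leading eigenvalue of a landscape matrix.
    import numpy as np

    def metapopulation_capacity(coords, areas, alpha=1.0):
        coords = np.asarray(coords, dtype=float)
        areas = np.asarray(areas, dtype=float)
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)  # pairwise distances
        M = np.outer(areas, areas) * np.exp(-alpha * d)
        np.fill_diagonal(M, 0.0)
        return float(np.max(np.linalg.eigvalsh(M)))   # M is symmetric, so eigvalsh applies

    coords = [(0, 0), (1, 0), (0, 2), (3, 3)]   # hypothetical patch locations
    areas = [1.0, 0.5, 2.0, 0.8]                # hypothetical patch areas
    print(metapopulation_capacity(coords, areas))   # larger lambda_M = more supportive landscape
    ```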

  1. Generative Representations for Evolving Families of Designs

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2003-01-01

    Since typical evolutionary design systems encode only a single artifact with each individual, each time the objective changes a new set of individuals must be evolved. When this objective varies in a way that can be parameterized, a more general method is to use a representation in which a single individual encodes an entire class of artifacts. In addition to saving time by preventing the need for multiple evolutionary runs, the evolution of parameter-controlled designs can create families of artifacts with the same style and a reuse of parts between members of the family. In this paper an evolutionary design system is described which uses a generative representation to encode families of designs. Because a generative representation is an algorithmic encoding of a design, its input parameters are a way to control aspects of the design it generates. By evaluating individuals multiple times with different input parameters the evolutionary design system creates individuals in which the input parameter controls specific aspects of a design. This system is demonstrated on two design substrates: neural-networks which solve the 3/5/7-parity problem and three-dimensional tables of varying heights.
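
    The key mechanism described above, evaluating one generative individual at several input-parameter settings so that evolution favours genomes whose parameter genuinely controls the design, can be sketched with a deliberately simple substrate. The parameterized "design" and all constants below are toy assumptions, not the paper's table or parity-network substrates.

    ```python
    # Each genome maps an input parameter p to a design; fitness averages performance
    # over several values of p, so only genomes that respond correctly to p score well.
    import random

    def evaluate(genome, params=(0.25, 0.5, 1.0)):
        scores = []
        for p in params:
            design_height = genome["base"] + genome["gain"] * p   # the "generated" design
            target = 10.0 * p                                     # desired height scales with p
            scores.append(-abs(design_height - target))
        return sum(scores) / len(scores)

    rng = random.Random(3)
    population = [{"base": rng.uniform(0, 10), "gain": rng.uniform(0, 20)} for _ in range(50)]
    for _ in range(100):
        population.sort(key=evaluate, reverse=True)
        survivors = population[:10]
        population = survivors + [
            {k: v + rng.gauss(0, 0.3) for k, v in rng.choice(survivors).items()}
            for _ in range(40)
        ]
    print(max(population, key=evaluate))   # converges towards base ~ 0, gain ~ 10
    ```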

  2. How does cognition evolve? Phylogenetic comparative psychology

    PubMed Central

    Matthews, Luke J.; Hare, Brian A.; Nunn, Charles L.; Anderson, Rindy C.; Aureli, Filippo; Brannon, Elizabeth M.; Call, Josep; Drea, Christine M.; Emery, Nathan J.; Haun, Daniel B. M.; Herrmann, Esther; Jacobs, Lucia F.; Platt, Michael L.; Rosati, Alexandra G.; Sandel, Aaron A.; Schroepfer, Kara K.; Seed, Amanda M.; Tan, Jingzhi; van Schaik, Carel P.; Wobber, Victoria

    2014-01-01

    Now more than ever animal studies have the potential to test hypotheses regarding how cognition evolves. Comparative psychologists have developed new techniques to probe the cognitive mechanisms underlying animal behavior, and they have become increasingly skillful at adapting methodologies to test multiple species. Meanwhile, evolutionary biologists have generated quantitative approaches to investigate the phylogenetic distribution and function of phenotypic traits, including cognition. In particular, phylogenetic methods can quantitatively (1) test whether specific cognitive abilities are correlated with life history (e.g., lifespan), morphology (e.g., brain size), or socio-ecological variables (e.g., social system), (2) measure how strongly phylogenetic relatedness predicts the distribution of cognitive skills across species, and (3) estimate the ancestral state of a given cognitive trait using measures of cognitive performance from extant species. Phylogenetic methods can also be used to guide the selection of species comparisons that offer the strongest tests of a priori predictions of cognitive evolutionary hypotheses (i.e., phylogenetic targeting). Here, we explain how an integration of comparative psychology and evolutionary biology will answer a host of questions regarding the phylogenetic distribution and history of cognitive traits, as well as the evolutionary processes that drove their evolution. PMID:21927850

  3. Evolving application of biomimetic nanostructured hydroxyapatite

    PubMed Central

    Roveri, Norberto; Iafisco, Michele

    2010-01-01

    By mimicking Nature, we can design and synthesize inorganic smart materials that are reactive to biological tissues. These smart materials can be utilized to design innovative third-generation biomaterials, which are able to not only optimize their interaction with biological tissues and environment, but also mimic biogenic materials in their functionalities. The biomedical applications involve increasing the biomimetic levels from chemical composition, structural organization, morphology, mechanical behavior, nanostructure, and bulk and surface chemical–physical properties until the surface becomes bioreactive and stimulates cellular materials. The chemical–physical characteristics of biogenic hydroxyapatites from bone and tooth have been described, in order to point out the elective sides, which are important to reproduce the design of a new biomimetic synthetic hydroxyapatite. This review outlines the evolving applications of biomimetic synthetic calcium phosphates, details the main characteristics of bone and tooth, where the calcium phosphates are present, and discusses the chemical–physical characteristics of biomimetic calcium phosphates, methods of synthesizing them, and some of their biomedical applications. PMID:24198477

  4. Speciation genetics: current status and evolving approaches.

    PubMed

    Wolf, Jochen B W; Lindell, Johan; Backström, Niclas

    2010-06-12

    The view of species as entities subjected to natural selection and amenable to change put forth by Charles Darwin and Alfred Wallace laid the conceptual foundation for understanding speciation. Initially marred by a rudimental understanding of hereditary principles, evolutionists gained appreciation of the mechanistic underpinnings of speciation following the merger of Mendelian genetic principles with Darwinian evolution. Only recently have we entered an era where deciphering the molecular basis of speciation is within reach. Much focus has been devoted to the genetic basis of intrinsic postzygotic isolation in model organisms and several hybrid incompatibility genes have been successfully identified. However, concomitant with the recent technological advancements in genome analysis and a newfound interest in the role of ecology in the differentiation process, speciation genetic research is becoming increasingly open to non-model organisms. This development will expand speciation research beyond the traditional boundaries and unveil the genetic basis of speciation from manifold perspectives and at various stages of the splitting process. This review aims at providing an extensive overview of speciation genetics. Starting from key historical developments and core concepts of speciation genetics, we focus much of our attention on evolving approaches and introduce promising methodological approaches for future research venues.

  5. Evolving paradigms in multifocal breast cancer.

    PubMed

    Salgado, Roberto; Aftimos, Philippe; Sotiriou, Christos; Desmedt, Christine

    2015-04-01

    The 7th edition of the TNM defines multifocal breast cancer as multiple simultaneous ipsilateral and synchronous breast cancer lesions, provided they are macroscopically distinct and measurable using current traditional pathological and clinical tools. According to the College of American Pathologists (CAP), the characterization of only the largest lesion is considered sufficient, unless the grade and/or histology are different between the lesions. Here, we review three potentially clinically relevant aspects of multifocal breast cancers: first, the importance of a different intrinsic breast cancer subtype of the various lesions; second, the emerging awareness of inter-lesion heterogeneity; and last but not least, the potential introduction of bias in clinical trials due to the unrecognized biological diversity of these cancers. Although the current strategy to assess the lesion with the largest diameter has clearly its advantages in terms of costs and feasibility, this recommendation may not be sustainable in time and might need to be adapted to be compliant with new evolving paradigms in breast cancer.

  6. Origins and evolvability of the PAX family.

    PubMed

    Paixão-Côrtes, Vanessa R; Salzano, Francisco M; Bortolini, Maria Cátira

    2015-08-01

    The paired box (PAX) family of transcription/developmental genes plays a key role in numerous stages of embryonic development, as well as in adult organogenesis. There is evidence linking the acquisition of a paired-like DNA binding domain (PD) to domestication of a Tc1/mariner transposon. Further duplication/deletion processes led to at least five paralogous metazoan protein groups, which can be classified into two supergroups, PAXB-like or PAXD-like, using ancestral defining structures; the PD plus an octapeptide motif (OP) and a paired-type homeobox DNA binding domain (PTHD), producing the PD-OP-PTHD structure characteristic of the PAXB-like group, whereas an additional domain, the paired-type homeodomain tail (PHT), is present in the PAXD-like group, producing a PD-OP-PTHD-PHT structure. We examined their patterns of distribution in various species, using both available data and new bioinformatic analyses, including vertebrate PAX genes and their shared and specific functions, as well as inter- and intraspecific variability of PAX in primates. These analyses revealed a relatively conserved PAX network, accompanied by specific changes that led to adaptive novelties. Therefore, both stability and evolvability shaped the molecular evolution of this key transcriptional network. PMID:26321496

  7. Evolving role of MRI in Crohn's disease.

    PubMed

    Yacoub, Joseph H; Obara, Piotr; Oto, Aytekin

    2013-06-01

    MR enterography is playing an evolving role in the evaluation of small bowel Crohn's disease (CD). Standard MR enterography includes a combination of rapidly acquired T2 sequence, balanced steady-state acquisition, and contrast enhanced T1-weighted gradient echo sequence. The diagnostic performance of these sequences has been shown to be comparable, and in some respects superior, to other small bowel imaging modalities. The findings of CD on MR enterography have been well described in the literature. New and emerging techniques such as diffusion-weighted imaging (DWI), dynamic contrast enhanced MRI (DCE-MRI), cinematography, and magnetization transfer, may lead to improved accuracy in characterizing the disease. These advanced techniques can provide quantitative parameters that may prove to be useful in assessing disease activity, severity, and response to treatment. In the future, MR enterography may play an increasing role in management decisions for patients with small bowel CD; however, larger studies are needed to validate these emerging MRI parameters as imaging biomarkers. PMID:23712842

  8. Lower mass limit of an evolving interstellar cloud and chemistry in an evolving oscillatory cloud

    NASA Technical Reports Server (NTRS)

    Tarafdar, S. P.

    1986-01-01

    Simultaneous solution of the equation of motion, the equation of state, and the energy equation, including heating and cooling processes for the interstellar medium, gives a lower mass limit for a collapsing cloud that is significantly smaller than the Jeans mass for the same initial density. Clouds with masses above this limit collapse, whereas clouds below the critical mass pass through a maximum central density, giving apparently similar clouds (i.e., the same Av, size and central density) at two different phases of their evolution (i.e., with different lifetimes). Preliminary results of chemistry in such an evolving oscillatory cloud show significant differences in the abundances of some molecules between two physically similar clouds with different lifetimes. The problems of depletion and the short lifetimes of evolving clouds appear to be less severe in such an oscillatory cloud.

  9. Lithium battery safety and reliability

    NASA Astrophysics Data System (ADS)

    Levy, Samuel C.

    Lithium batteries have been used in a variety of applications for a number of years. As their use continues to grow, particularly in the consumer market, a greater emphasis needs to be placed on safety and reliability. One useful technique, fault tree analysis, can help in designing cells and batteries with a greater degree of safety and higher reliability; it can also be useful in determining the cause of unsafe behavior and poor reliability in existing designs.
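
    A minimal sketch of the fault-tree idea: independent basic events combined through AND/OR gates up to a top event. The event names and probabilities below are hypothetical and are not taken from the paper.

    ```python
    # Fault-tree evaluation with independent basic events.
    def gate_and(*probabilities):
        out = 1.0
        for p in probabilities:
            out *= p
        return out

    def gate_or(*probabilities):
        out = 1.0
        for p in probabilities:
            out *= (1.0 - p)
        return 1.0 - out

    # Hypothetical top event: "cell vents with flame"
    internal_short = 1e-5
    overcharge = gate_and(2e-3, 5e-2)     # charger fault AND protection circuit fails
    external_heating = 1e-4
    top_event = gate_or(internal_short, overcharge, external_heating)
    print(f"P(top event) = {top_event:.2e}")
    ```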

  10. The Evolving Context for Science and Society

    NASA Astrophysics Data System (ADS)

    Leshner, Alan I.

    2012-01-01

    The relationship between science and the rest of society is critical both to the support it receives from the public and to the receptivity of the broader citizenry to science's explanations of the nature of the world and to its other outputs. Science's ultimate usefulness depends on a receptive public. For example, given that science and technology are imbedded in virtually every issue of modern life, either as a cause or a cure, it is critical that the relationship be strong and that the role of science is well appreciated by society, or the impacts of scientific advances will fall short of their great potential. Unfortunately, a variety of problems have been undermining the science-society relationship for over a decade. Some problems emerge from within the scientific enterprise - like scientific misconduct or conflicts of interest - and tarnish or weaken its image and credibility. Other problems and stresses come from outside the enterprise. The most obvious external pressure is that the world economic situation is undermining the financial support of both the conduct and infrastructure of science. Other examples of external pressures include conflicts between what science is revealing and political or economic expediency - e.g., global climate change - or instances where scientific advances encroach upon core human values or beliefs - e.g., scientific understanding of the origins and evolution of the universe as compared to biblical accounts of creation. Significant efforts - some dramatically non-traditional for many in the scientific community - are needed to restore balance to the science-society relationship.

  11. Optimization of reliability allocation strategies through use of genetic algorithms

    SciTech Connect

    Campbell, J.E.; Painton, L.A.

    1996-08-01

    This paper examines a novel optimization technique called genetic algorithms and its application to the optimization of reliability allocation strategies. Reliability allocation should occur in the initial stages of design, when the objective is to determine an optimal breakdown or allocation of reliability to certain components or subassemblies in order to meet system specifications. The reliability allocation optimization is applied to the design of a cluster tool, a highly complex piece of equipment used in semiconductor manufacturing. The problem formulation is presented, including decision variables, performance measures and constraints, and genetic algorithm parameters. Piecewise "effort curves" specifying the amount of effort required to achieve a certain level of reliability for each component or subassembly are defined. The genetic algorithm evolves or picks those combinations of "effort" or reliability levels for each component which optimize the objective of maximizing Mean Time Between Failures while staying within a budget. The results show that the genetic algorithm is very efficient at finding a set of robust solutions. A time history of the optimization is presented, along with histograms of the solution-space fitness, MTBF, and cost for comparative purposes.
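
    A compact sketch of the approach described above, a genetic algorithm that picks per-component reliability levels so as to maximize MTBF while staying within a budget. The effort-curve values, the series-system MTBF model, and all GA parameters are assumptions made for illustration, not those of the cluster-tool study.

    ```python
    # Chromosome = one reliability-level index per component; fitness = MTBF if the
    # total "effort" cost fits the budget, else zero.
    import math
    import random

    LEVELS = [0.90, 0.95, 0.99, 0.995, 0.999]   # selectable reliability levels (at MISSION_T)
    COST   = [1.0, 2.0, 4.5, 7.0, 12.0]         # piecewise effort needed to reach each level
    N_COMP, BUDGET, MISSION_T = 6, 30.0, 100.0

    def mtbf(chrom):
        # series system of exponential components; recover lambda_i from R_i(MISSION_T)
        lam = sum(-math.log(LEVELS[g]) / MISSION_T for g in chrom)
        return 1.0 / lam

    def fitness(chrom):
        cost = sum(COST[g] for g in chrom)
        return mtbf(chrom) if cost <= BUDGET else 0.0

    def evolve(pop_size=60, gens=200, seed=2):
        rng = random.Random(seed)
        pop = [[rng.randrange(len(LEVELS)) for _ in range(N_COMP)] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(parents):
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, N_COMP)
                child = a[:cut] + b[cut:]                        # one-point crossover
                if rng.random() < 0.2:                           # mutation
                    child[rng.randrange(N_COMP)] = rng.randrange(len(LEVELS))
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    best = evolve()
    print("levels:", [LEVELS[g] for g in best], "MTBF:", round(fitness(best), 1))
    ```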

  12. A fourth generation reliability predictor

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Martensen, Anna L.

    1988-01-01

    A reliability/availability predictor computer program has been developed and is currently being beta-tested by over 30 US companies. The computer program is called the Hybrid Automated Reliability Predictor (HARP). HARP was developed to fill an important gap in reliability assessment capabilities. This gap was manifested through the use of its third-generation cousin, the Computer-Aided Reliability Estimation (CARE III) program, over a six-year development period and an additional three-year period during which CARE III has been in the public domain. The accumulated experience of the over 30 establishments now using CARE III was used in the development of the HARP program.

  13. Early-branching or fast-evolving eukaryotes? An answer based on slowly evolving positions.

    PubMed

    Philippe, H; Lopez, P; Brinkmann, H; Budin, K; Germot, A; Laurent, J; Moreira, D; Müller, M; Le Guyader, H

    2000-06-22

    The current paradigm of eukaryotic evolution is based primarily on comparative analysis of ribosomal RNA sequences. It shows several early-emerging lineages, mostly amitochondriate, which might be living relics of a progressive assembly of the eukaryotic cell. However, the analysis of slow-evolving positions, carried out with the newly developed slow-fast method, reveals that these lineages are, in terms of nucleotide substitution, fast-evolving ones, misplaced at the base of the tree by a long branch attraction artefact. Since the fast-evolving groups are not always the same, depending on which macromolecule is used as a marker, this explains most of the observed incongruent phylogenies. The current paradigm of eukaryotic evolution thus has to be seriously re-examined as the eukaryotic phylogeny is presently best summarized by a multifurcation. This is consistent with the Big Bang hypothesis that all extant eukaryotic lineages are the result of multiple cladogeneses within a relatively brief period, although insufficiency of data is also a possible explanation for the lack of resolution. For further resolution, rare evolutionary events such as shared insertions and/or deletions or gene fusions might be helpful.

  14. Early-branching or fast-evolving eukaryotes? An answer based on slowly evolving positions.

    PubMed Central

    Philippe, H; Lopez, P; Brinkmann, H; Budin, K; Germot, A; Laurent, J; Moreira, D; Müller, M; Le Guyader, H

    2000-01-01

    The current paradigm of eukaryotic evolution is based primarily on comparative analysis of ribosomal RNA sequences. It shows several early-emerging lineages, mostly amitochondriate, which might be living relics of a progressive assembly of the eukaryotic cell. However, the analysis of slow-evolving positions, carried out with the newly developed slow-fast method, reveals that these lineages are, in terms of nucleotide substitution, fast-evolving ones, misplaced at the base of the tree by a long branch attraction artefact. Since the fast-evolving groups are not always the same, depending on which macromolecule is used as a marker, this explains most of the observed incongruent phylogenies. The current paradigm of eukaryotic evolution thus has to be seriously re-examined as the eukaryotic phylogeny is presently best summarized by a multifurcation. This is consistent with the Big Bang hypothesis that all extant eukaryotic lineages are the result of multiple cladogeneses within a relatively brief period, although insufficiency of data is also a possible explanation for the lack of resolution. For further resolution, rare evolutionary events such as shared insertions and/or deletions or gene fusions might be helpful. PMID:10902687

  15. Synthetic Model of the Oxygen-Evolving Center: Photosystem II under the Spotlight.

    PubMed

    Yu, Yang; Hu, Cheng; Liu, Xiaohong; Wang, Jiangyun

    2015-09-21

    The oxygen-evolving center (OEC) in photosystem II catalyzes a water splitting reaction. Great efforts have already been made to artificially synthesize the OEC, in order to elucidate the structure-function relationship and the mechanism of the reaction. Now, a new synthetic model makes the best mimic yet of the OEC. This recent study opens up the possibility to study the mechanism of photosystem II and photosynthesis in general for applications in renewable energy and synthetic biology.

  16. Robustness and Evolvability of the Human Signaling Network

    PubMed Central

    Kim, Jeong-Rae; Munoz, Amaya Garcia; Kolch, Walter; Cho, Kwang-Hyun

    2014-01-01

    Biological systems are known to be both robust and evolvable to internal and external perturbations, but what causes these apparently contradictory properties? We used Boolean network modeling and attractor landscape analysis to investigate the evolvability and robustness of the human signaling network. Our results show that the human signaling network can be divided into an evolvable core where perturbations change the attractor landscape in state space, and a robust neighbor where perturbations have no effect on the attractor landscape. Using chemical inhibition and overexpression of nodes, we validated that perturbations affect the evolvable core more strongly than the robust neighbor. We also found that the evolvable core has a distinct network structure, which is enriched in feedback loops, and features a higher degree of scale-freeness and longer path lengths connecting the nodes. In addition, the genes with high evolvability scores are associated with evolvability-related properties such as rapid evolvability, low species broadness, and immunity whereas the genes with high robustness scores are associated with robustness-related properties such as slow evolvability, high species broadness, and oncogenes. Intriguingly, US Food and Drug Administration-approved drug targets have high evolvability scores whereas experimental drug targets have high robustness scores. PMID:25077791
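
    The kind of attractor-landscape computation described above can be sketched for a tiny synchronous Boolean network: enumerate all states, iterate the update rule, and record each attractor together with its basin size. The three-node network and its rules below are hypothetical.

    ```python
    # Attractors and basin sizes of a 3-node synchronous Boolean network.
    from itertools import product

    def update(state):
        a, b, c = state
        return (b and not c,   # A activated by B, inhibited by C
                a or c,        # B activated by A or C
                not a)         # C inhibited by A

    def attractor_of(state):
        seen = []
        while state not in seen:
            seen.append(state)
            state = update(state)
        return tuple(seen[seen.index(state):])      # the cycle (fixed point if length 1)

    basins = {}
    for state in product([False, True], repeat=3):
        att = attractor_of(state)
        key = min(att[i:] + att[:i] for i in range(len(att)))   # canonical rotation
        basins[key] = basins.get(key, 0) + 1

    for att, size in basins.items():
        print("attractor:", att, "basin size:", size)
    ```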

  17. Fault tolerant highly reliable inertial navigation system

    NASA Astrophysics Data System (ADS)

    Jeerage, Mahesh; Boettcher, Kevin

    This paper describes a development of failure detection and isolation (FDI) strategies for highly reliable inertial navigation systems. FDI strategies are developed based on the generalized likelihood ratio test (GLRT). A relationship between detection threshold and false alarm rate is developed in terms of the sensor parameters. A new method for correct isolation of failed sensors is presented. Evaluation of FDI performance parameters, such as false alarm rate, wrong isolation probability, and correct isolation probability, are presented. Finally a fault recovery scheme capable of correcting false isolation of good sensors is presented.
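
    The threshold/false-alarm-rate relationship mentioned above can be illustrated under the standard assumption that the GLRT decision statistic is chi-square distributed when no fault is present (Gaussian residuals); the degrees of freedom and false-alarm rates below are illustrative, not the paper's sensor parameters.

    ```python
    # Detection threshold as a function of the allowed false-alarm rate for a
    # chi-square distributed no-fault statistic.
    from scipy.stats import chi2

    dof = 2   # e.g., dimension of a parity/residual vector from a redundant sensor set
    for pfa in (1e-2, 1e-3, 1e-4, 1e-6):
        threshold = chi2.ppf(1.0 - pfa, dof)
        print(f"P_FA = {pfa:.0e}  ->  declare a fault when the statistic exceeds {threshold:.2f}")
    ```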

  18. BUBBLE DYNAMICS AT GAS-EVOLVING ELECTRODES

    SciTech Connect

    Sides, Paul J.

    1980-12-01

    Nucleation of bubbles, their growth by diffusion of dissolved gas to the bubble surface and by coalescence, and their detachment from the electrode are all very fast phenomena; furthermore, electrolytically generated bubbles range in size from ten to a few hundred microns; therefore, magnification and high speed cinematography are required to observe bubbles and the phenomena of their growth on the electrode surface. Viewing the action from the front side (the surface on which the bubbles form) is complicated because the most important events occur close to the surface and are obscured by other bubbles passing between the camera and the electrode; therefore, oxygen was evolved on a transparent tin oxide "window" electrode and the events were viewed from the backside. The movies showed that coalescence of bubbles is very important for determining the size of bubbles and in the chain of transport processes; growth by diffusion and by coalescence proceeds in series and parallel; coalescing bubbles cause significant fluid motion close to the electrode; bubbles can leave and reattach; and bubbles evolve in a cycle of growth by diffusion and different modes of coalescence. An analytical solution for the primary potential and current distribution around a spherical bubble in contact with a plane electrode is presented. Zero at the contact point, the current density reaches only one percent of its undisturbed value at 30 percent of the radius from that point and goes through a shallow maximum two radii away. The solution obtained for spherical bubbles is shown to apply for the small bubbles of electrolytic processes. The incremental resistance in ohms caused by sparse arrays of bubbles is given by ΔR = 1.352 af/(kS), where f is the void fraction of gas in the bubble layer, a is the bubble layer thickness, k is the conductivity of gas free electrolyte, and S is the electrode area. A densely populated gas bubble layer on an electrode was modeled as a hexagonal array of
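
    A direct numerical evaluation of the stated sparse-bubble-layer result, ΔR = 1.352 af/(kS), with hypothetical values chosen only to show a typical magnitude:

    ```python
    # Incremental resistance of a sparse bubble layer on an electrode.
    a = 5.0e-4     # bubble-layer thickness, m (0.5 mm)
    f = 0.10       # gas void fraction in the layer
    k = 50.0       # conductivity of gas-free electrolyte, S/m
    S = 1.0e-4     # electrode area, m^2 (1 cm^2)
    delta_R = 1.352 * a * f / (k * S)
    print(f"incremental resistance = {delta_R:.4f} ohm")
    ```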

  19. Did DNA replication evolve twice independently?

    PubMed

    Leipe, D D; Aravind, L; Koonin, E V

    1999-09-01

    DNA replication is central to all extant cellular organisms. There are substantial functional similarities between the bacterial and the archaeal/eukaryotic replication machineries, including but not limited to defined origins, replication bidirectionality, RNA primers and leading and lagging strand synthesis. However, several core components of the bacterial replication machinery are unrelated or only distantly related to the functionally equivalent components of the archaeal/eukaryotic replication apparatus. This is in sharp contrast to the principal proteins involved in transcription and translation, which are highly conserved in all divisions of life. We performed detailed sequence comparisons of the proteins that fulfill indispensable functions in DNA replication and classified them into four main categories with respect to the conservation in bacteria and archaea/eukaryotes: (i) non-homologous, such as replicative polymerases and primases; (ii) containing homologous domains but apparently non-orthologous and conceivably independently recruited to function in replication, such as the principal replicative helicases or proofreading exonucleases; (iii) apparently orthologous but poorly conserved, such as the sliding clamp proteins or DNA ligases; (iv) orthologous and highly conserved, such as clamp-loader ATPases or 5'-->3' exonucleases (FLAP nucleases). The universal conservation of some components of the DNA replication machinery and enzymes for DNA precursor biosynthesis but not the principal DNA polymerases suggests that the last common ancestor (LCA) of all modern cellular life forms possessed DNA but did not replicate it the way extant cells do. We propose that the LCA had a genetic system that contained both RNA and DNA, with the latter being produced by reverse transcription. Consequently, the modern-type system for double-stranded DNA replication likely evolved independently in the bacterial and archaeal/eukaryotic lineages.

  20. A possible molecular metric for biological evolvability.

    PubMed

    Mittal, Aditya; Jayaram, B

    2012-07-01

    Proteins manifest themselves as phenotypic traits, retained or lost in living systems via evolutionary pressures. Simply put, survival is essentially the ability of a living system to synthesize a functional protein that allows for a response to environmental perturbations (adaptation). Loss of functional proteins leads to extinction. Currently there are no universally applicable quantitative metrics at the molecular level for either measuring 'evolvability' of life or for assessing the conditions under which a living system would go extinct and why. In this work, we show emergence of the first such metric by utilizing the recently discovered stoichiometric margin of life for all known naturally occurring (and functional) proteins. The constraint of having well-defined stoichiometries of the 20 amino acids in naturally occurring protein sequences requires utilization of the full scope of degeneracy in the genetic code, i.e. usage of all codons coding for an amino acid, by only 11 of the 20 amino acids. This shows that the non-availability of individual codons for these 11 amino acids would disturb the fine stoichiometric balance resulting in non-functional proteins and hence extinction. Remarkably, these amino acids are found in close proximity of any given amino acid in the backbones of thousands of known crystal structures of folded proteins. On the other hand, stoichiometry of the remaining 9 amino acids, found to be farther/distal from any given amino acid in backbones of folded proteins, is maintained independent of the number of codons available to synthesize them, thereby providing some robustness and hence survivability.

  1. Evolving Recommendations on Prostate Cancer Screening.

    PubMed

    Brawley, Otis W; Thompson, Ian M; Grönberg, Henrik

    2016-01-01

    Results of a number of studies demonstrate that the serum prostate-specific antigen (PSA) in and of itself is an inadequate screening test. Today, one of the most pressing questions in prostate cancer medicine is how screening can be honed to identify those who have life-threatening disease and need aggressive treatment. A number of efforts are underway. One such effort is the assessment of men in the landmark Prostate Cancer Prevention Trial that has led to a prostate cancer risk calculator (PCPTRC), which is available online. PCPTRC version 2.0 predicts the probability of the diagnosis of no cancer, low-grade cancer, or high-grade cancer when variables such as PSA, age, race, family history, and physical findings are input. Modern biomarker development promises to provide tests with fewer false positives and improved ability to find high-grade cancers. Stockholm III (STHLM3) is a prospective, population-based, paired, screen-positive, prostate cancer diagnostic study assessing a combination of plasma protein biomarkers along with age, family history, previous biopsy, and prostate examination for prediction of prostate cancer. Multiparametric MRI incorporates anatomic and functional imaging to better characterize and predict future behavior of tumors within the prostate. After diagnosis of cancer, several genomic tests promise to better distinguish the cancers that need treatment versus those that need observation. Although the new technologies are promising, there is an urgent need for evaluation of these new tests in high-quality, large population-based studies. Until these technologies are proven, most professional organizations have evolved to a recommendation of informed or shared decision making in which there is a discussion between the doctor and patient. PMID:27249774

  2. Degeneracy: a link between evolvability, robustness and complexity in biological systems.

    PubMed

    Whitacre, James M

    2010-01-01

    A full accounting of biological robustness remains elusive, both in terms of the mechanisms by which robustness is achieved and the forces that have caused robustness to grow over evolutionary time. Although its importance to topics such as ecosystem services and resilience is well recognized, the broader relationship between robustness and evolution is only starting to be fully appreciated. A renewed interest in this relationship has been prompted by evidence that mutational robustness can play a positive role in the discovery of adaptive innovations (evolvability) and evidence of an intimate relationship between robustness and complexity in biology. This paper offers a new perspective on the mechanics of evolution and the origins of complexity, robustness, and evolvability. Here we explore the hypothesis that degeneracy, a partial overlap in the functioning of multi-functional components, plays a central role in the evolution and robustness of complex forms. In support of this hypothesis, we present evidence that degeneracy is a fundamental source of robustness, it is intimately tied to multi-scaled complexity, and it establishes conditions that are necessary for system evolvability. PMID:20167097

  3. Sauropod dinosaurs evolved moderately sized genomes unrelated to body size.

    PubMed

    Organ, Chris L; Brusatte, Stephen L; Stein, Koen

    2009-12-22

    Sauropodomorph dinosaurs include the largest land animals to have ever lived, some reaching up to 10 times the mass of an African elephant. Despite their status defining the upper range for body size in land animals, it remains unknown whether sauropodomorphs evolved larger-sized genomes than non-avian theropods, their sister taxon, or whether a relationship exists between genome size and body size in dinosaurs, two questions critical for understanding broad patterns of genome evolution in dinosaurs. Here we report inferences of genome size for 10 sauropodomorph taxa. The estimates are derived from a Bayesian phylogenetic generalized least squares approach that generates posterior distributions of regression models relating genome size to osteocyte lacunae volume in extant tetrapods. We estimate that the average genome size of sauropodomorphs was 2.02 pg (range of species means: 1.77-2.21 pg), a value in the upper range of extant birds (mean = 1.42 pg, range: 0.97-2.16 pg) and near the average for extant non-avian reptiles (mean = 2.24 pg, range: 1.05-5.44 pg). The results suggest that the variation in size and architecture of genomes in extinct dinosaurs was lower than the variation found in mammals. A substantial difference in genome size separates the two major clades within dinosaurs, Ornithischia (large genomes) and Saurischia (moderate to small genomes). We find no relationship between body size and estimated genome size in extinct dinosaurs, which suggests that neutral forces did not dominate the evolution of genome size in this group.

  4. Predatory prokaryotes: predation and primary consumption evolved in bacteria

    NASA Technical Reports Server (NTRS)

    Guerrero, R.; Pedros-Alio, C.; Esteve, I.; Mas, J.; Chase, D.; Margulis, L.

    1986-01-01

    Two kinds of predatory bacteria have been observed and characterized by light and electron microscopy in samples from freshwater sulfurous lakes in northeastern Spain. The first bacterium, named Vampirococcus, is Gram-negative and ovoidal (0.6 micrometer wide). An anaerobic epibiont, it adheres to the surface of phototrophic bacteria (Chromatium spp.) by specific attachment structures and, as it grows and divides by fission, destroys its prey. An important in situ predatory role can be inferred for Vampirococcus from direct counts in natural samples. The second bacterium, named Daptobacter, is a Gram-negative, facultatively anaerobic straight rod (0.5 x 1.5 micrometers) with a single polar flagellum, which collides, penetrates, and grows inside the cytoplasm of its prey (several genera of Chromatiaceae). Considering also the well-known case of Bdellovibrio, a Gram-negative, aerobic curved rod that penetrates and divides in the periplasmic space of many chemotrophic Gram-negative bacteria, there are three types of predatory prokaryotes presently known (epibiotic, cytoplasmic, and periplasmic). Thus, we conclude that antagonistic relationships such as primary consumption, predation, and scavenging had already evolved in microbial ecosystems prior to the appearance of eukaryotes. Furthermore, because they represent methods by which prokaryotes can penetrate other prokaryotes in the absence of phagocytosis, these associations can be considered preadaptation for the origin of intracellular organelles.

  5. Predatory prokaryotes: Predation and primary consumption evolved in bacteria

    PubMed Central

    Guerrero, Ricardo; Pedrós-Alió, Carlos; Esteve, Isabel; Mas, Jordi; Chase, David; Margulis, Lynn

    1986-01-01

    Two kinds of predatory bacteria have been observed and characterized by light and electron microscopy in samples from freshwater sulfurous lakes in northeastern Spain. The first bacterium, named Vampirococcus, is Gram-negative and ovoidal (0.6 μm wide). An anaerobic epibiont, it adheres to the surface of phototrophic bacteria (Chromatium spp.) by specific attachment structures and, as it grows and divides by fission, destroys its prey. An important in situ predatory role can be inferred for Vampirococcus from direct counts in natural samples. The second bacterium, named Daptobacter, is a Gram-negative, facultatively anaerobic straight rod (0.5 × 1.5 μm) with a single polar flagellum, which collides, penetrates, and grows inside the cytoplasm of its prey (several genera of Chromatiaceae). Considering also the well-known case of Bdellovibrio, a Gram-negative, aerobic curved rod that penetrates and divides in the periplasmic space of many chemotrophic Gram-negative bacteria, there are three types of predatory prokaryotes presently known (epibiotic, cytoplasmic, and periplasmic). Thus, we conclude that antagonistic relationships such as primary consumption, predation, and scavenging had already evolved in microbial ecosystems prior to the appearance of eukaryotes. Furthermore, because they represent methods by which prokaryotes can penetrate other prokaryotes in the absence of phagocytosis, these associations can be considered preadaptations for the origin of intracellular organelles. PMID:11542073

  6. Complex Formation History of Highly Evolved Basaltic Shergottite, Zagami

    NASA Technical Reports Server (NTRS)

    Niihara, T.; Misawa, K.; Mikouchi, T.; Nyquist, L. E.; Park, J.; Hirata, D.

    2012-01-01

    Zagami, a basaltic shergottite, contains several kinds of lithologies such as Normal Zagami consisting of Fine-grained (FG) and Coarse-grained (CG), Dark Mottled lithology (DML), and Olivine-rich late-stage melt pocket (DN). Treiman and Sutton concluded that Zagami (Normal Zagami) is a fractional crystallization product from a single magma. It has been suggested that there were two igneous stages (deep magma chamber and shallow magma chamber or surface lava flow) on the basis of chemical zoning features of pyroxenes which have homogeneous Mg-rich cores and FeO, CaO zoning at the rims. Nyquist et al. reported that FG has a different initial Sr isotopic ratio than CG and DML, and suggested the possibility of magma mixing on Mars. Here we report new results of petrology and mineralogy for DML and the Olivine-rich lithology (we do not use DN here), the most evolved lithology in this rock, to understand the relationship among lithologies and reveal Zagami's formation history.

  7. Fifty Years of Evolving Partnerships in Veterinary Medical Education.

    PubMed

    Kochevar, Deborah T

    2015-01-01

    The Association of American Veterinary Medical Colleges' (AAVMC's) role in the progression of academic veterinary medical education has been about building successful partnerships in the US and internationally. Membership in the association has evolved over the past 50 years, as have traditions of collaboration that strengthen veterinary medical education and the association. The AAVMC has become a source of information and a place for debate on educational trends, innovative pedagogy, and the value of a diverse learning environment. The AAVMC's relationship with the American Veterinary Medical Association Council on Education (AVMA COE), the accreditor of veterinary medical education recognized by the United States Department of Education (DOE), is highlighted here because of the key role that AAVMC members have played in the evolution of veterinary accreditation. The AAVMC has also been a partner in the expansion of veterinary medical education to include global health and One Health and in the engagement of international partners around shared educational opportunities and challenges. Recently, the association has reinforced its desire to be a truly international organization rather than an American organization with international members. To that end, strategic AAVMC initiatives aim to expand and connect the global community of veterinary educators to the benefit of students and the profession around the world. Tables in this article are intended to provide historical context, chronology, and an accessible way to view highlights. PMID:26673208

  8. Evolving microbes and re-emerging streptococcal disease.

    PubMed

    Krause, Richard M

    2002-12-01

    Microbes will evolve and the epidemics they cause will continue to occur in the future as they have in the past. Microbes emerge from the evolutionary stream as a result of genetic events and selective pressures that favor new over old. It is nature's way. Microbes and vectors swim in the evolutionary stream, and they swim much faster than humans. Bacteria reproduce every 30 minutes and, for them, a millennium is compressed into a fortnight. They are "fleet afoot," and the pace of research must keep up with them or they will overtake. Microbes were here on Earth 2 billion years before humans arrived, learning every trick of the trade for survival, and they are likely to be here 2 billion years after we depart. Current research on the rise and decline of epidemics is broadly based and includes evolutionary and population genetics of host-microbe relationships. Within this context, the 19th century pandemic of scarlet fever has been described. The possibility is raised that the GAS, which currently cause STSS, possess some of the virulence factors that caused pandemic scarlet fever. Furthermore, the GAS isolated during the recent outbreaks of ARF in certain locales in the United States have the virulence properties of the GAS frequently isolated in the first half of the 20th century. Finally, it is suggested that the strategy to confront emerging infectious diseases should be the study of infectious diseases from all points of view. They remain the greatest threats to our society.

  9. Mass Loss from Evolved Stars in LMC Clusters

    NASA Astrophysics Data System (ADS)

    Points, Sean; Olsen, K.; Blum, R.; Whitney, B.; Meade, M.; Babler, B.; Indebetouw, R.; Hora, J.; Gordon, K.; Engelbracht, C.; For, B.; Block, M.; Misselt, K.; Meixner, M.; Vijh, U.; Leitherer, C.; Srinivasan, S.

    2006-12-01

    We present preliminary results of our investigation into the mass-loss from evolved stars in rich, well-studied clusters in the Large Magellanic Cloud (LMC) using data obtained with the Spitzer Space Telescope SAGE (Surveying the Agents of a Galaxy's Evolution) survey. We have obtained the 8 and 24 micron magnitudes of point sources toward 30 clusters in the LMC with a range of ages from 10^6 to 10^10 years, a spread in metallicity ([Fe/H]) from -2.0 to 0.0, and masses from 10^3 to 10^5 solar masses. Using the 8 and 24 micron fluxes as proxies for stellar mass loss, we calculate the normalized mass loss rates for the clusters in our sample. We use these data to explore the relationships between mass-loss, age, and metallicity, with the aim of developing a cluster mass-loss history for the LMC. Our further goal is to use these results in conjunction with knowledge of the star formation history of the LMC to investigate the LMC's chemical enrichment history.

  11. The investigation of supply chain's reliability measure: a case study

    NASA Astrophysics Data System (ADS)

    Taghizadeh, Houshang; Hafezi, Ehsan

    2012-10-01

    In this paper, using the supply chain operational reference model, the reliability evaluation of available relationships in the supply chain is investigated. For this purpose, in the first step, the chain under investigation is divided into several stages including first and second suppliers, initial and final customers, and the producing company. Based on the formed relationships between these stages, the supply chain system is then broken down into different subsystem parts. The formed relationships between the stages are based on the transportation of the orders between stages. Paying attention to the system elements' location, which can be in one of five forms, namely series, parallel, series/parallel, parallel/series, or their combinations, we determine the structure of relationships in the divided subsystems. According to reliability evaluation scales on the three levels of supply chain, the reliability of each chain is then calculated. Finally, using the formulas of calculating the reliability in combined systems, the reliability of each system and ultimately the whole system is investigated.
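
    The "formulas of calculating the reliability in combined systems" referred to above reduce, for independent elements, to the standard series and parallel combination rules. A minimal sketch follows; the subsystem structure and reliability values are hypothetical, not taken from the case study:

        # Standard combination rules for independent elements:
        # series: every element must work; parallel: at least one element must work.
        from math import prod

        def series(reliabilities):
            """R_series = product of R_i."""
            return prod(reliabilities)

        def parallel(reliabilities):
            """R_parallel = 1 - product of (1 - R_i)."""
            return 1.0 - prod(1.0 - r for r in reliabilities)

        if __name__ == "__main__":
            # Hypothetical chain: two suppliers in parallel, feeding a producer and a distributor in series.
            suppliers = parallel([0.95, 0.90])
            chain = series([suppliers, 0.98, 0.97])
            print(f"Supply chain reliability: {chain:.4f}")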

  12. Did the ctenophore nervous system evolve independently?

    PubMed

    Ryan, Joseph F

    2014-08-01

    Recent evidence supports the placement of ctenophores as the most distant relative to all other animals. This revised animal tree means that either the ancestor of all animals possessed neurons (and that sponges and placozoans apparently lost them) or that ctenophores developed them independently. Differentiating between these possibilities is important not only from a historical perspective, but also for the interpretation of a wide range of neurobiological results. In this short perspective paper, I review the evidence in support of each scenario and show that the relationship between the nervous system of ctenophores and other animals is an unsolved, yet tractable problem. PMID:24986234

  14. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and reduced inventory of radioactive material. Structural fasteners are responsible for maintaining structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. Design of fasteners involves variables related to the fabrication, manufacturing, behavior of fasteners and joining parts material, structural geometry of the joining components, size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify the reliability, and provides results of the analysis in terms of quantified reliability and sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joining components. Based on the results, the paper also describes guidelines to improve the reliability and verification testing.
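
    The kind of uncertainty propagation described above can be sketched generically with a stress-strength (load-capacity) Monte Carlo estimate. The distributions and numbers below are assumed for illustration only and are not the authors' model:

        # Generic stress-strength interference sketch (illustrative, not the paper's method):
        # estimate P(failure) = P(load > capacity) by Monte Carlo over assumed
        # normal distributions for joint capacity and applied mission load.
        import random

        def monte_carlo_reliability(n=100_000, seed=1):
            random.seed(seed)
            failures = 0
            for _ in range(n):
                capacity = random.gauss(mu=12.0, sigma=1.0)   # assumed joint capacity, kN
                load = random.gauss(mu=8.0, sigma=1.2)        # assumed applied load, kN
                if load > capacity:
                    failures += 1
            return 1.0 - failures / n

        if __name__ == "__main__":
            print(f"Estimated fastener reliability: {monte_carlo_reliability():.5f}")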

  15. Issues in Modeling System Reliability

    NASA Astrophysics Data System (ADS)

    Cruse, Thomas A.; Annis, Chuck; Booker, Jane; Robinson, David; Sues, Rob

    2002-10-01

    This paper discusses various issues in modeling system reliability. The topics include: 1) Statistical formalisms versus pragmatic numerics; 2) Language; 3) Statistical methods versus reliability-based design methods; 4) Professional bias; and 5) Real issues that need to be identified and resolved prior to certifying designs. This paper is in viewgraph form.

  16. Avionics design for reliability bibliography

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A bibliography with abstracts was presented in support of AGARD lecture series No. 81. The following areas were covered: (1) program management, (2) design for high reliability, (3) selection of components and parts, (4) environment consideration, (5) reliable packaging, (6) life cycle cost, and (7) case histories.

  17. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  18. Impact of Device Scaling on Deep Sub-micron Transistor Reliability: A Study of Reliability Trends using SRAM

    NASA Technical Reports Server (NTRS)

    White, Mark; Huang, Bing; Qin, Jin; Gur, Zvi; Talmor, Michael; Chen, Yuan; Heidecker, Jason; Nguyen, Duc; Bernstein, Joseph

    2005-01-01

    As microelectronics are scaled into the deep sub-micron regime, users of advanced technology CMOS, particularly in high-reliability applications, should reassess how scaling effects impact long-term reliability. An experiment-based reliability study of industrial-grade SRAMs, consisting of three different technology nodes, is proposed to substantiate current acceleration models for temperature and voltage life-stress relationships. This reliability study utilizes step-stress techniques to evaluate memory technologies (0.25 μm, 0.15 μm, and 0.13 μm) embedded in many of today's high-reliability space/aerospace applications. Two acceleration modeling approaches are presented to relate experimental FIT calculations to Mfr's qualification data.
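
    One common textbook form for the temperature part of such life-stress acceleration models is the Arrhenius acceleration factor. The sketch below evaluates it for an assumed activation energy and assumed use/stress temperatures; it is not necessarily the exact model used in the study:

        # Textbook Arrhenius acceleration factor for temperature life-stress
        # (a common choice; values below are assumed).
        import math

        K_BOLTZMANN_EV = 8.617e-5  # eV/K

        def arrhenius_af(ea_ev, t_use_c, t_stress_c):
            """Acceleration factor between use and stress temperatures (in degrees C)."""
            t_use = t_use_c + 273.15
            t_stress = t_stress_c + 273.15
            return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

        if __name__ == "__main__":
            # Assumed example: Ea = 0.7 eV, 55 C use condition, 125 C stress condition.
            print(f"Acceleration factor ~ {arrhenius_af(0.7, 55.0, 125.0):.1f}")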

  19. Cancer and aging. An evolving panorama.

    PubMed

    Balducci, L; Extermann, M

    2000-02-01

    This article illustrates how the nosology of cancer evolves with the patient's age. If the current trends are maintained, 70% of all neoplasms will occur in persons aged 65 years and over by the year 2020, leading to increased cancer-related morbidity among older persons. Cancer control in the older person involves chemoprevention, early diagnosis, and timely and effective treatment that entails both antineoplastic therapy and symptom management. These interventions must be individualized based on a multidimensional assessment that can predict life expectancy and treatment complications and that may evaluate the quality of life of the older person. This article suggests a number of interventions that may improve cancer control in the aged. Public education is needed to illustrate the benefits of health maintenance and early detection of cancer even among older individuals, to create realistic expectations, and to heighten awareness of early symptoms and signs of cancer. Professional education is needed to train students and practitioners in the evaluation and management of the older person. Of special interest is the current initiative of the Hartford Foundation offering combined fellowships in oncology and geriatrics and incorporating principles of geriatric medicine in medical specialty training. Prudent pharmacologic principles must be followed in managing older persons with cytotoxic chemotherapy. These principles include adjusting the dose according to the patient's renal function, using epoietin to maintain hemoglobin levels of 12 g/dL or more, and using hemopoietic growth factors in persons aged 70 years and older receiving cytotoxic chemotherapy of moderate toxicity (e.g., CHOP). To assure uniformity of data, a cooperative oncology group should formulate a geriatric package outlining a common plan for evaluating function and comorbidity. This article also suggests several important areas of research items: Molecular interactions of age and cancer Host

  20. The Evolvement of Automobile Steering System Based on TRIZ

    NASA Astrophysics Data System (ADS)

    Zhao, Xinjun; Zhang, Shuang

    Products and techniques pass through a process of birth, growth, maturity, and death, and exit the stage, much like a biological evolution process. The development of products and techniques conforms to certain evolvement rules. If people know and grasp these rules, they can design new kinds of products and forecast the development trends of those products. Thereby, enterprises can anticipate the future technical directions of their products and pursue product and technique innovation. Below, based on TRIZ theory, the mechanism evolvement, function evolvement, and appearance evolvement of the automobile steering system are analyzed, and some new ideas about the future automobile steering system are put forward.

  1. Food Addiction: An Evolving Nonlinear Science

    PubMed Central

    Shriner, Richard; Gold, Mark

    2014-01-01

    The purpose of this review is to familiarize readers with the role that addiction plays in the formation and treatment of obesity, type 2 diabetes and disorders of eating. We will outline several useful models that integrate metabolism, addiction, and human relationship adaptations to eating. A special effort will be made to demonstrate how the use of simple and straightforward nonlinear models can and are being used to improve our knowledge and treatment of patients suffering from nutritional pathology. Moving forward, the reader should be able to incorporate some of the findings in this review into their own practice, research, teaching efforts or other interests in the fields of nutrition, diabetes, and/or bariatric (weight) management. PMID:25421535

  2. Could life have evolved in cometary nuclei

    NASA Technical Reports Server (NTRS)

    Bar-Nun, A.; Lazcano-Araujo, A.; Oro, J.

    1981-01-01

    The suggestion by Hoyle and Wickramasinghe (1978) that life might have originated in cometary nuclei rather than directly on the earth is discussed. Factors in the cometary environment including the conditions at perihelion passage leading to the ablation of cometary ices, ice temperatures, the absence of an atmosphere and discrete liquid and solid surfaces, weak cometary structure incapable of supporting a liquid core, and radiation are presented as arguments against biopoesis in comets. It is concluded that although the contribution of cometary and meteoritic matter was significant in shaping the earth environment, the view that life on earth originally arose in comets is untenable, and the proposition that the process of interplanetary infection still occurs is unlikely in view of the high specificity of host-parasite relationships.

  3. Fundamental mechanisms of micromachine reliability

    SciTech Connect

    De Boer, Maarten P.; Sniegowski, Jeffry J.; Knapp, James A.; Redmond, James M.; Michalske, Terry A.; Mayer, Thomas K.

    2000-01-01

    Due to extreme surface to volume ratios, adhesion and friction are critical properties for reliability of Microelectromechanical Systems (MEMS), but are not well understood. In this LDRD the authors established test structures, metrology and numerical modeling to conduct studies on adhesion and friction in MEMS. They then concentrated on measuring the effect of environment on MEMS adhesion. Polycrystalline silicon (polysilicon) is the primary material of interest in MEMS because of its integrated circuit process compatibility, low stress, high strength and conformal deposition nature. A plethora of useful micromachined device concepts have been demonstrated using Sandia National Laboratories' sophisticated in-house capabilities. One drawback to polysilicon is that in air the surface oxidizes, is high energy and is hydrophilic (i.e., it wets easily). This can lead to catastrophic failure because surface forces can cause MEMS parts that are brought into contact to adhere rather than perform their intended function. A fundamental concern is how environmental constituents such as water will affect adhesion energies in MEMS. The authors first demonstrated an accurate method to measure adhesion as reported in Chapter 1. In Chapter 2 through 5, they then studied the effect of water on adhesion depending on the surface condition (hydrophilic or hydrophobic). As described in Chapter 2, they find that adhesion energy of hydrophilic MEMS surfaces is high and increases exponentially with relative humidity (RH). Surface roughness is the controlling mechanism for this relationship. Adhesion can be reduced by several orders of magnitude by silane coupling agents applied via solution processing. They decrease the surface energy and render the surface hydrophobic (i.e. does not wet easily). However, only a molecular monolayer coats the surface. In Chapters 3-5 the authors map out the extent to which the monolayer reduces adhesion versus RH. They find that adhesion is independent of

  4. Impact of staffing parameters on operational reliability

    SciTech Connect

    Hahn, H.A.; Houghton, F.K.

    1993-01-01

    This paper reports on a project related to human resource management of the Department of Energy's (DOE's) High-Level Waste (HLW) Tank program. Safety and reliability of waste tank operations are impacted by several issues, including not only the design of the tanks themselves, but also how operations and operational personnel are managed. As demonstrated by management assessments performed by the Tiger Teams, DOE believes that the effective use of human resources impacts environment, safety, and health concerns. For the purposes of the current paper, human resource management activities are identified as "Staffing" and include the process of developing the functional responsibilities and qualifications of technical and administrative personnel. This paper discusses the importance of staff plans and management in the overall view of safety and reliability. The work activities and procedures associated with the project are described, along with a review of the results of these activities, including a summary of the literature and a preliminary analysis of the data. We conclude that although identification of staffing issues and the development of staffing plans contributes to the overall reliability and safety of the HLW tanks, the relationship is not well understood and is in need of further development.

  5. Impact of staffing parameters on operational reliability

    SciTech Connect

    Hahn, H.A.; Houghton, F.K.

    1993-02-01

    This paper reports on a project related to human resource management of the Department of Energy's (DOE's) High-Level Waste (HLW) Tank program. Safety and reliability of waste tank operations are impacted by several issues, including not only the design of the tanks themselves, but also how operations and operational personnel are managed. As demonstrated by management assessments performed by the Tiger Teams, DOE believes that the effective use of human resources impacts environment, safety, and health concerns. For the purposes of the current paper, human resource management activities are identified as "Staffing" and include the process of developing the functional responsibilities and qualifications of technical and administrative personnel. This paper discusses the importance of staff plans and management in the overall view of safety and reliability. The work activities and procedures associated with the project are described, along with a review of the results of these activities, including a summary of the literature and a preliminary analysis of the data. We conclude that although identification of staffing issues and the development of staffing plans contributes to the overall reliability and safety of the HLW tanks, the relationship is not well understood and is in need of further development.

  6. Calculating system reliability with SRFYDO

    SciTech Connect

    Morzinski, Jerome; Anderson-Cook, Christine M; Klamann, Richard M

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.

  7. PV Reliability Development Lessons from JPL's Flat Plate Solar Array Project

    NASA Technical Reports Server (NTRS)

    Ross, Ronald G., Jr.

    2013-01-01

    Key reliability and engineering lessons learned from the 20-year history of the Jet Propulsion Laboratory's Flat-Plate Solar Array Project and thin-film module reliability research activities are presented and analyzed. Particular emphasis is placed on lessons applicable to evolving new module technologies and the organizations involved with these technologies. The user-specific demand for reliability is a strong function of the application, its location, and its expected duration. Lessons relative to effective means of specifying reliability are described, and commonly used test requirements are assessed from the standpoint of which are the most troublesome to pass, and which correlate best with field experience. Module design lessons are also summarized, including the significance of the most frequently encountered failure mechanisms and the role of encapsulant and cell reliability in determining module reliability. Lessons pertaining to research, design, and test approaches include the historical role and usefulness of qualification tests and field tests.

  8. Aerospace reliability applied to biomedicine.

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.; Vargo, D. J.

    1972-01-01

    An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.

  9. Reliability analysis of interdependent lattices

    NASA Astrophysics Data System (ADS)

    Limiao, Zhang; Daqing, Li; Pengju, Qin; Bowen, Fu; Yinan, Jiang; Zio, Enrico; Rui, Kang

    2016-06-01

    Network reliability analysis has drawn much attention recently due to the risks of catastrophic damage in networked infrastructures. These infrastructures are dependent on each other as a result of various interactions. However, most of the reliability analyses of these interdependent networks do not consider spatial constraints, which are found important for robustness of infrastructures including power grid and transport systems. Here we study the reliability properties of interdependent lattices with different ranges of spatial constraints. Our study shows that interdependent lattices with strong spatial constraints are more resilient than interdependent Erdös-Rényi networks. There exists an intermediate range of spatial constraints, at which the interdependent lattices have minimal resilience.
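
    The robustness notion used above can be illustrated, in a deliberately simplified single-lattice form, by measuring the relative size of the largest connected component after a random fraction of nodes fails (a common resilience proxy). The sketch below uses networkx and does not model the interdependent coupling or the range of spatial constraints studied in the paper; all parameter values are arbitrary:

        # Much-simplified robustness sketch for a single square lattice (no interdependence):
        # remove a random fraction of nodes and report the relative size of the
        # largest connected component.
        import random
        import networkx as nx

        def giant_component_fraction(side=30, fail_fraction=0.3, seed=0):
            random.seed(seed)
            g = nx.grid_2d_graph(side, side)
            n_total = g.number_of_nodes()
            to_remove = random.sample(list(g.nodes()), int(fail_fraction * n_total))
            g.remove_nodes_from(to_remove)
            if g.number_of_nodes() == 0:
                return 0.0
            largest = max(nx.connected_components(g), key=len)
            return len(largest) / n_total

        if __name__ == "__main__":
            for f in (0.1, 0.3, 0.5):
                print(f"fail fraction {f:.1f}: giant component {giant_component_fraction(fail_fraction=f):.2f}")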

  10. Integrating reliability analysis and design

    SciTech Connect

    Rasmuson, D. M.

    1980-10-01

    This report describes the Interactive Reliability Analysis Project and demonstrates the advantages of using computer-aided design systems (CADS) in reliability analysis. Common cause failure problems require presentations of systems, analysis of fault trees, and evaluation of solutions to these. Results have to be communicated between the reliability analyst and the system designer. Using a computer-aided design system saves time and money in the analysis of design. Computer-aided design systems lend themselves to cable routing, valve and switch lists, pipe routing, and other component studies. At EG&G Idaho, Inc., the Applicon CADS is being applied to the study of water reactor safety systems.

  11. (Centralized Reliability Data Organization (CREDO))

    SciTech Connect

    Haire, M J

    1987-04-21

    One of the primary goals of the Centralized Reliability Data Organization (CREDO) is to be an international focal point for the collection, analysis, and dissemination of liquid metal reactor (LMR) component reliability, availability, and maintainability (RAM) data. During FY-1985, the Department of Energy (DOE) entered into a Specific Memorandum of Agreement (SMA) with Japan's Power Reactor and Nuclear Fuel Development Corporation (PNC) regarding cooperative data exchange efforts. This agreement was CREDO's first step toward internationalization and represented an initial realization of the previously mentioned goal. DOE's interest in further internationalization of the CREDO system was the primary motivation for the traveler's attendance at the Reliability '87 conference.

  12. Reliability Analysis and Modeling of ZigBee Networks

    NASA Astrophysics Data System (ADS)

    Lin, Cheng-Min

    The architecture of ZigBee networks focuses on developing low-cost, low-speed ubiquitous communication between devices. The ZigBee technique is based on IEEE 802.15.4, which specifies the physical layer and medium access control (MAC) for a low-rate wireless personal area network (LR-WPAN). Currently, numerous wireless sensor networks have adopted the ZigBee open standard to develop various services that promote improved communication quality in our daily lives. The problem of system and network reliability in providing stable services has become more important because these services will be stopped if the system and network reliability is unstable. The ZigBee standard has three kinds of networks: star, tree, and mesh. The paper models the ZigBee protocol stack from the physical layer to the application layer and analyzes the reliability and mean time to failure (MTTF) of each layer. Channel resource usage, device role, network topology, and application objects are used to evaluate reliability in the physical, medium access control, network, and application layers, respectively. In star or tree networks, a series system and the reliability block diagram (RBD) technique can be used to solve the reliability problem. For mesh networks, however, a division technique is applied to overcome the problem because their network complexity is higher than that of the others. A mesh network treated with the division technique is classified into several non-reducible series systems and edge-parallel systems. Hence, the reliability of mesh networks is easily solved using series-parallel systems through our proposed scheme. The numerical results demonstrate that the reliability increases for mesh networks when the number of edges in parallel systems increases, while the reliability drops quickly for all three networks when the number of edges and the number of nodes increase. Greater use of resources is another factor that decreases reliability. However, lower network reliability will occur due to
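
    As a minimal illustration of the series/RBD reasoning described above, the sketch below treats the protocol layers as a series system with constant (exponential) failure rates; the layer rates are assumed placeholders, not values from the paper:

        # Minimal series-system sketch: if each protocol layer fails independently with a
        # constant rate lambda_i, stack reliability is exp(-sum(lambda_i) * t) and
        # MTTF = 1 / sum(lambda_i). The rates below are assumed placeholders.
        import math

        LAYER_FAILURE_RATES = {          # failures per hour (assumed)
            "physical": 2e-6,
            "mac": 1e-6,
            "network": 3e-6,
            "application": 5e-6,
        }

        def stack_reliability(t_hours):
            lam = sum(LAYER_FAILURE_RATES.values())
            return math.exp(-lam * t_hours)

        def stack_mttf():
            return 1.0 / sum(LAYER_FAILURE_RATES.values())

        if __name__ == "__main__":
            print(f"Reliability at 1 year: {stack_reliability(8760):.4f}")
            print(f"MTTF: {stack_mttf():,.0f} hours")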

  13. Enhancing the Principal-School Counselor Relationship: Toolkit

    ERIC Educational Resources Information Center

    College Board Advocacy & Policy Center, 2011

    2011-01-01

    The College Board, NASSP and ASCA believe that the principal-counselor relationship is a dynamic and organic relationship that evolves over time in response to the ever-changing needs of a school. The goal of an effective principal-counselor relationship is to use the strength of the relationship to collaboratively lead school reform efforts to…

  14. Evolving Expert Knowledge Bases: Applications of Crowdsourcing and Serious Gaming to Advance Knowledge Development for Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Floryan, Mark

    2013-01-01

    This dissertation presents a novel effort to develop ITS technologies that adapt by observing student behavior. In particular, we define an evolving expert knowledge base (EEKB) that structures a domain's information as a set of nodes and the relationships that exist between those nodes. The structure of this model is not the particularly novel…

  15. Health-Literate Youth: Evolving Challenges for Health Educators

    ERIC Educational Resources Information Center

    Fetro, Joyce V.

    2010-01-01

    This article presents the author's AAHE Scholar presentation at the 2010 AAHE annual meeting in Indianapolis, Indiana. In her discussion, the author addresses what she sees to be some evolving challenges for health educators working with youth as well as some possible strategies for addressing them. These evolving challenges are: (1) understanding…

  16. Adaptation of Escherichia coli to glucose promotes evolvability in lactose.

    PubMed

    Phillips, Kelly N; Castillo, Gerardo; Wünsche, Andrea; Cooper, Tim F

    2016-02-01

    The selective history of a population can influence its subsequent evolution, an effect known as historical contingency. We previously observed that five of six replicate populations that were evolved in a glucose-limited environment for 2000 generations, then switched to lactose for 1000 generations, had higher fitness increases in lactose than populations started directly from the ancestor. To test if selection in glucose systematically increased lactose evolvability, we started 12 replay populations--six from a population subsample and six from a single randomly selected clone--from each of the six glucose-evolved founder populations. These replay populations and 18 ancestral populations were evolved for 1000 generations in a lactose-limited environment. We found that replay populations were initially slightly less fit in lactose than the ancestor, but were more evolvable, in that they increased in fitness at a faster rate and to higher levels. This result indicates that evolution in the glucose environment resulted in genetic changes that increased the potential of genotypes to adapt to lactose. Genome sequencing identified four genes--iclR, nadR, spoT, and rbs--that were mutated in most glucose-evolved clones and are candidates for mediating increased evolvability. Our results demonstrate that short-term selective costs during selection in one environment can lead to changes in evolvability that confer longer term benefits. PMID:26748670

  17. Loops and autonomy promote evolvability of ecosystem networks.

    PubMed

    Luo, Jianxi

    2014-01-01

    The structure of ecological networks, in particular food webs, determines their ability to evolve further, i.e. evolvability. The knowledge about how food web evolvability is determined by the structures of diverse ecological networks can guide human interventions purposefully to either promote or limit evolvability of ecosystems. However, the focus of prior food web studies was on stability and robustness; little is known regarding the impact of ecological network structures on their evolvability. To correlate ecosystem structure and evolvability, we adopt the NK model originally from evolutionary biology to generate and assess the ruggedness of fitness landscapes of a wide spectrum of model food webs with gradual variation in the amount of feeding loops and link density. The variation in network structures is controlled by linkage rewiring. Our results show that more feeding loops and lower trophic link density, i.e. higher autonomy of species, of food webs increase the potential for the ecosystem to generate heritable variations with improved fitness. Our findings allow the prediction of the evolvability of actual food webs according to their network structures, and provide guidance to enhancing or controlling the evolvability of specific ecosystems.
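
    A toy illustration of the NK-landscape machinery mentioned above: the sketch generates a random NK fitness landscape over binary genotypes and proxies ruggedness by counting local optima under single-bit flips. This is a generic NK-model sketch with assumed parameters, not the authors' food-web instantiation:

        # Toy NK fitness landscape: N binary loci, each locus' contribution depends on
        # itself and K other loci. Ruggedness is proxied by the number of local optima.
        import itertools
        import random

        def make_nk(n=8, k=2, seed=0):
            rng = random.Random(seed)
            neighbors = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
            tables = [{} for _ in range(n)]

            def fitness(genotype):
                total = 0.0
                for i in range(n):
                    key = (genotype[i],) + tuple(genotype[j] for j in neighbors[i])
                    if key not in tables[i]:
                        tables[i][key] = rng.random()   # lazily drawn random contribution
                    total += tables[i][key]
                return total / n
            return fitness

        def count_local_optima(fitness, n=8):
            optima = 0
            for genotype in itertools.product((0, 1), repeat=n):
                f = fitness(genotype)
                flips = (fitness(genotype[:i] + (1 - genotype[i],) + genotype[i + 1:]) for i in range(n))
                if all(f >= g for g in flips):
                    optima += 1
            return optima

        if __name__ == "__main__":
            for k in (0, 2, 4):  # more interactions (larger K) typically means a more rugged landscape
                print(f"K={k}: local optima = {count_local_optima(make_nk(k=k), n=8)}")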

  18. Reliability and Maintainability (RAM) Training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Packard, Michael H. (Editor)

    2000-01-01

    The theme of this manual is failure physics: the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost reliable products. In a broader sense the manual should do more. It should underscore the urgent need for mature attitudes toward reliability. Five of the chapters were originally presented as a classroom course to over 1000 Martin Marietta engineers and technicians. Another four chapters and three appendixes have been added. We begin with a view of reliability from the years 1940 to 2000. Chapter 2 starts the training material with a review of mathematics and a description of what elements contribute to product failures. The remaining chapters elucidate basic reliability theory and the disciplines that allow us to control and eliminate failures.

  19. Failure Analysis for Improved Reliability

    NASA Technical Reports Server (NTRS)

    Sood, Bhanu

    2016-01-01

    Outline: Section 1 - What is reliability and root cause? Section 2 - Overview of failure mechanisms. Section 3 - Failure analysis techniques (1. Non-destructive analysis techniques, 2. Destructive Analysis, 3. Materials Characterization). Section 4 - Summary and Closure

  20. An experiment in software reliability

    NASA Technical Reports Server (NTRS)

    Dunham, J. R.; Pierce, J. L.

    1986-01-01

    The results of a software reliability experiment conducted in a controlled laboratory setting are reported. The experiment was undertaken to gather data on software failures and is one in a series of experiments being pursued by the Fault Tolerant Systems Branch of NASA Langley Research Center to find a means of credibly performing reliability evaluations of flight control software. The experiment tests a small sample of implementations of radar tracking software having ultra-reliability requirements and uses n-version programming for error detection, and repetitive run modeling for failure and fault rate estimation. The experiment results agree with those of Nagel and Skrivan in that the program error rates suggest an approximate log-linear pattern and the individual faults occurred with significantly different error rates. Additional analysis of the experimental data raises new questions concerning the phenomenon of interacting faults. This phenomenon may provide one explanation for software reliability decay.

  1. Evolving minds: Helping students with cognitive dissonance

    NASA Astrophysics Data System (ADS)

    Bramschreiber, Terry L.

    Even 150 years after Charles Darwin published On the Origin of Species, public school teachers still find themselves dealing with student resistance to learning about biological evolution. Some teachers deal with this pressure by undermining, deemphasizing, or even omitting the topic in their science curriculum. Others face the challenge and deliver solid scientific instruction of evolutionary theory despite the conflicts that may arise. The latter were the topic of this study. I interviewed five teachers that had experience dealing with resistance to learning evolution in their school community. Through these in-depth interviews, I examined strategies these teachers use when facing resistance and how they help students deal with the cognitive dissonance that may be experienced when learning about evolution. I selected the qualitative method of educational criticism and connoisseurship to organize and categorize my data. From the interviews, the following findings emerged. Experienced teachers increased their confidence in teaching evolution by pursuing outside professional development. They not only learned more about evolutionary theory, but about creationist arguments against evolution. These teachers front-load their curriculum to integrate the nature of science into their lessons to address misunderstandings about how science works. They also highlight the importance of learning evolutionary theory but ensure students they do not have an agenda to indoctrinate students. Finally these experienced teachers work hard to create an intellectually safe learning environment to build trusting and respectful relationships with their students.

  2. Accelerator Availability and Reliability Issues

    SciTech Connect

    Steve Suhring

    2003-05-01

    Maintaining reliable machine operations for existing machines as well as planning for future machines' operability present significant challenges to those responsible for system performance and improvement. Changes to machine requirements and beam specifications often reduce overall machine availability in an effort to meet user needs. Accelerator reliability issues from around the world will be presented, followed by a discussion of the major factors influencing machine availability.

  3. Laplacian Estrada and normalized Laplacian Estrada indices of evolving graphs.

    PubMed

    Shang, Yilun

    2015-01-01

    Large-scale time-evolving networks have been generated by many natural and technological applications, posing challenges for computation and modeling. Thus, it is of theoretical and practical significance to probe mathematical tools tailored for evolving networks. In this paper, on top of the dynamic Estrada index, we study the dynamic Laplacian Estrada index and the dynamic normalized Laplacian Estrada index of evolving graphs. Using linear algebra techniques, we established general upper and lower bounds for these graph-spectrum-based invariants through a couple of intuitive graph-theoretic measures, including the number of vertices or edges. Synthetic random evolving small-world networks are employed to show the relevance of the proposed dynamic Estrada indices. It is found that neither the static snapshot graphs nor the aggregated graph can approximate the evolving graph itself, indicating the fundamental difference between the static and dynamic Estrada indices. PMID:25822506
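
    For reference, the static Laplacian Estrada index underlying the dynamic quantities discussed above is commonly defined as LEE(G) = Σ_i exp(μ_i), where μ_i are the eigenvalues of the graph Laplacian. A minimal sketch using networkx and numpy on an arbitrary small-world example graph (not data from the paper):

        # Static Laplacian Estrada index LEE(G) = sum_i exp(mu_i), with mu_i the
        # eigenvalues of the graph Laplacian (one common definition; the paper's
        # dynamic indices extend this to evolving graphs).
        import networkx as nx
        import numpy as np

        def laplacian_estrada_index(g):
            mu = np.linalg.eigvalsh(nx.laplacian_matrix(g).toarray().astype(float))
            return float(np.sum(np.exp(mu)))

        if __name__ == "__main__":
            g = nx.watts_strogatz_graph(n=50, k=4, p=0.1, seed=42)  # arbitrary small-world example
            print(f"LEE = {laplacian_estrada_index(g):.2f}")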

  4. Gene Essentiality Is a Quantitative Property Linked to Cellular Evolvability.

    PubMed

    Liu, Gaowen; Yong, Mei Yun Jacy; Yurieva, Marina; Srinivasan, Kandhadayar Gopalan; Liu, Jaron; Lim, John Soon Yew; Poidinger, Michael; Wright, Graham Daniel; Zolezzi, Francesca; Choi, Hyungwon; Pavelka, Norman; Rancati, Giulia

    2015-12-01

    Gene essentiality is typically determined by assessing the viability of the corresponding mutant cells, but this definition fails to account for the ability of cells to adaptively evolve to genetic perturbations. Here, we performed a stringent screen to assess the degree to which Saccharomyces cerevisiae cells can survive the deletion of ~1,000 individual "essential" genes and found that ~9% of these genetic perturbations could in fact be overcome by adaptive evolution. Our analyses uncovered a genome-wide gradient of gene essentiality, with certain essential cellular functions being more "evolvable" than others. Ploidy changes were prevalent among the evolved mutant strains, and aneuploidy of a specific chromosome was adaptive for a class of evolvable nucleoporin mutants. These data justify a quantitative redefinition of gene essentiality that incorporates both viability and evolvability of the corresponding mutant cells and will enable selection of therapeutic targets associated with lower risk of emergence of drug resistance. PMID:26627736

  6. Laplacian Estrada and normalized Laplacian Estrada indices of evolving graphs.

    PubMed

    Shang, Yilun

    2015-01-01

    Large-scale time-evolving networks have been generated by many natural and technological applications, posing challenges for computation and modeling. Thus, it is of theoretical and practical significance to probe mathematical tools tailored for evolving networks. In this paper, on top of the dynamic Estrada index, we study the dynamic Laplacian Estrada index and the dynamic normalized Laplacian Estrada index of evolving graphs. Using linear algebra techniques, we establish general upper and lower bounds for these graph-spectrum-based invariants through a couple of intuitive graph-theoretic measures, including the number of vertices or edges. Synthetic random evolving small-world networks are employed to show the relevance of the proposed dynamic Estrada indices. It is found that neither the static snapshot graphs nor the aggregated graph can approximate the evolving graph itself, indicating the fundamental difference between the static and dynamic Estrada indices.
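
    As a rough companion to this record, the sketch below computes the three static indices it names (Estrada, Laplacian Estrada, normalized Laplacian Estrada) for each snapshot of a toy small-world sequence and for the aggregated union graph. The paper's dynamic indices over the whole graph sequence are not reproduced here, and the Watts-Strogatz parameters are arbitrary choices.

```python
# Static Estrada-type indices of snapshots vs. the aggregated graph (a sketch,
# not the paper's dynamic construction).
import numpy as np
import networkx as nx

def estrada_indices(G):
    """Return (Estrada, Laplacian Estrada, normalized Laplacian Estrada) of G."""
    A = nx.to_numpy_array(G)
    deg = A.sum(axis=1)
    L = np.diag(deg) - A                          # combinatorial Laplacian
    with np.errstate(divide="ignore"):
        d = np.where(deg > 0, 1.0 / np.sqrt(deg), 0.0)
    Ln = np.eye(A.shape[0]) - d[:, None] * A * d[None, :]   # normalized Laplacian
    return tuple(np.exp(np.linalg.eigvalsh(M)).sum() for M in (A, L, Ln))

# Toy evolving small-world sequence: T snapshots, each a rewired Watts-Strogatz graph.
T, n = 5, 50
snapshots = [nx.watts_strogatz_graph(n, k=4, p=0.2, seed=t) for t in range(T)]
aggregated = nx.Graph()
aggregated.add_nodes_from(range(n))
for H in snapshots:
    aggregated.add_edges_from(H.edges())

for t, H in enumerate(snapshots):
    print(f"snapshot {t}: EE, LEE, NLEE =",
          ", ".join(f"{x:.1f}" for x in estrada_indices(H)))
print("aggregated : EE, LEE, NLEE =",
      ", ".join(f"{x:.1f}" for x in estrada_indices(aggregated)))
```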

  7. MEMS reliability: coming of age

    NASA Astrophysics Data System (ADS)

    Douglass, Michael R.

    2008-02-01

    In today's high-volume semiconductor world, one could easily take reliability for granted. As the MOEMS/MEMS industry continues to establish itself as a viable alternative to conventional manufacturing in the macro world, reliability can be of high concern. Currently, there are several emerging market opportunities in which MOEMS/MEMS is gaining a foothold. Markets such as mobile media, consumer electronics, biomedical devices, and homeland security are all showing great interest in microfabricated products. At the same time, these markets are among the most demanding when it comes to reliability assurance. To be successful, each company developing a MOEMS/MEMS device must consider reliability on an equal footing with cost, performance and manufacturability. What can this maturing industry learn from the successful development of DLP technology, air bag accelerometers and inkjet printheads? This paper discusses some basic reliability principles which any MOEMS/MEMS device development must use. Examples from the commercially successful and highly reliable Digital Micromirror Device complement the discussion.

  8. Reliability Quantification of Advanced Stirling Convertor (ASC) Components

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward

    2010-01-01

    The Advanced Stirling Convertor (ASC) is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17-year mission, the ASC must provide power with the desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints. Reliability demonstration must involve the application of analysis, system and component level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test data verification is a viable approach to assess the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationship among the design variables based on physics, mechanics, material behavior models, and the interaction of different components and their respective disciplines such as structures, materials, fluid, thermal, mechanical, electrical, etc. In addition, these models are based on the available test data, which can be updated, and the analysis refined, as more data and information become available. The failure mechanisms and causes of failure are included in the analysis, especially in light of new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on the fundamental physical behavior of components and their relationship with other components has demonstrated itself to be a superior technique to conventional reliability approaches based on failure rates derived from similar equipment or simply expert judgment.
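
    As a hedged illustration of the physics-of-failure simulation idea described above (not the ASC analysis itself), the sketch below estimates the reliability of a single assumed failure mode by stress-strength interference Monte Carlo; the distributions and parameters are placeholders that would in practice come from material models and test data.

```python
# Stress-strength interference Monte Carlo for one assumed failure mode.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

stress = rng.normal(loc=120.0, scale=10.0, size=n)     # operating stress, MPa (assumed)
strength = 200.0 * rng.weibull(a=8.0, size=n)           # end-of-life strength, MPa (assumed)

p_fail = np.mean(stress >= strength)
print(f"failure probability = {p_fail:.2e}, reliability = {1.0 - p_fail:.6f}")
# As test data accumulate, the strength model would be re-fitted (e.g. by
# Bayesian updating) and the simulation repeated, as described above.
```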

  9. By-product information can stabilize the reliability of communication.

    PubMed

    Schaefer, H Martin; Ruxton, G D

    2012-12-01

    Although communication underpins many biological processes, its function and basic definition remain contentious. In particular, researchers have debated whether information should be an integral part of a definition of communication and how it remains reliable. So far the handicap principle, assuming signal costs to stabilize reliable communication, has been the predominant paradigm in the study of animal communication. The role of by-product information produced by mechanisms other than the communicative interaction has been neglected in the debate on signal reliability. We argue that by-product information is common and that it provides the starting point for ritualization as the process of the evolution of communication. Second, by-product information remains unchanged during ritualization and enforces reliable communication by restricting the options for manipulation and cheating. Third, this perspective changes the focus of research on communication from studying signal costs to studying the costs of cheating. It can thus explain the reliability of signalling in many communication systems that do not rely on handicaps. We emphasize that communication can often be informative but that the evolution of communication does not cause the evolution of information because by-product information often predates and stimulates the evolution of communication. Communication is thus a consequence but not a cause of reliability. Communication is the interplay of inadvertent, informative traits and evolved traits that increase the stimulation and perception of perceivers. Viewing communication as a complex of inadvertent and derived traits facilitates understanding of the selective pressures shaping communication and those shaping information and its reliability. This viewpoint further contributes to resolving the current controversy on the role of information in communication.

  10. Reliability analysis of common hazardous waste treatment processes

    SciTech Connect

    Waters, R.D.

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
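
    The sketch below illustrates, under assumed lognormal load and capacity distributions, the kind of safety-factor-versus-reliability relationship the study quantifies; it is a generic load/capacity model, not any of the five process models analyzed.

```python
# Monte Carlo sweep of a design safety factor against achieved reliability.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
influent_load = rng.lognormal(mean=np.log(100.0), sigma=0.35, size=n)   # assumed
base_capacity = rng.lognormal(mean=np.log(100.0), sigma=0.20, size=n)   # assumed

for sf in (1.0, 1.5, 2.0, 3.0):
    reliability = np.mean(sf * base_capacity >= influent_load)   # P(capacity >= load)
    print(f"design safety factor {sf:.1f} -> reliability {reliability:.4f}")
```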

  11. The Reliability of Content Analysis of Computer Conference Communication

    ERIC Educational Resources Information Center

    Rattleff, Pernille

    2007-01-01

    The focus of this article is the reliability of content analysis of students' computer conference communication. Content analysis is often used when researching the relationship between learning and the use of information and communications technology in educational settings. A number of studies where content analysis is used and classification…

  12. Approximation of reliability of direct genomic breeding values

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Two methods to efficiently approximate theoretical genomic reliabilities are presented. The first method is based on the direct inverse of the left hand side (LHS) of mixed model equations. It uses the genomic relationship matrix for a small subset of individuals with the highest genomic relationshi...
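
    Because this record is truncated, the sketch below only shows the standard GBLUP quantity such approximations target: reliability computed from the inverse of the mixed-model-equation left-hand side, with a simulated genomic relationship matrix, assumed variance components, and one record per animal.

```python
# "Exact" genomic reliabilities from the inverse LHS of a tiny GBLUP model
# (illustrative assumptions, not the record's approximation methods).
import numpy as np

rng = np.random.default_rng(2)
n_animals, n_snp = 20, 200
M = rng.binomial(2, 0.3, size=(n_animals, n_snp)).astype(float)   # 0/1/2 genotypes
p = M.mean(axis=0) / 2.0
Zc = M - 2.0 * p
G = Zc @ Zc.T / (2.0 * np.sum(p * (1.0 - p)))                      # VanRaden-style G
G += 0.01 * np.eye(n_animals)                                      # ensure invertibility

s2u, s2e = 0.4, 0.6                       # assumed variance components
lam = s2e / s2u
LHS = np.eye(n_animals) + lam * np.linalg.inv(G)   # Z'Z = I: one record per animal
PEV = np.diag(np.linalg.inv(LHS)) * s2e            # prediction error variances
reliability = 1.0 - PEV / (np.diag(G) * s2u)
print(np.round(reliability, 3))
```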

  13. NASA's Space Launch System: An Evolving Capability for Exploration

    NASA Technical Reports Server (NTRS)

    Creech, Stephen D.; Crumbly, Christopher M.; Robinson, Kimerly F.

    2016-01-01

    A foundational capability for international human deep-space exploration, NASA's Space Launch System (SLS) vehicle represents a new spaceflight infrastructure asset, creating opportunities for mission profiles and space systems that cannot currently be executed. While the primary purpose of SLS, which is making rapid progress towards initial launch readiness in two years, will be to support NASA's Journey to Mars, discussions are already well underway regarding other potential utilization of the vehicle's unique capabilities. In its initial Block 1 configuration, capable of launching 70 metric tons (t) to low Earth orbit (LEO), SLS can propel the Orion crew vehicle to cislunar space while also delivering small CubeSat-class spacecraft to deep-space destinations. With the addition of a more powerful upper stage, the Block 1B configuration of SLS will be able to deliver 105 t to LEO and enable more ambitious human missions into the proving ground of space. This configuration offers opportunities for launching co-manifested payloads with the Orion crew vehicle, and a class of secondary payloads larger than today's CubeSats. Further upgrades to the vehicle, including advanced boosters, will evolve its performance to 130 t in its Block 2 configuration. Both Block 1B and Block 2 also offer the capability to carry 8.4- or 10-m payload fairings, larger than any contemporary launch vehicle. With unmatched mass-lift capability, payload volume, and C3, SLS not only enables spacecraft or mission designs currently impossible with contemporary EELVs, it also offers enhancing benefits, such as reduced risk, lower operational costs and/or complexity, shorter transit times to destination, or the ability to launch large systems either monolithically or in fewer components. This paper will discuss both the performance and capabilities of the Space Launch System as it evolves, and the current state of SLS utilization planning.

  14. CSP Manufacturing Challenges and Assembly Reliability

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    2000-01-01

    Although the expression CSP is widely used by industry from suppliers to users, its implied definition has evolved as the technology has matured. There is an "expert definition" (a package no larger than 1.5 times the die) and an "interim definition". CSPs are miniature new packages that industry is starting to implement, and there are many unresolved technical issues associated with their implementation. For example, in early 1997, packages with 1 mm pitch and lower were the dominant CSPs, whereas in early 1998 packages with 0.8 mm pitch and lower became the norm. Other changes included the use of flip chip die rather than wire bond in CSPs. Nonetheless, the emerging CSPs are competing with bare die assemblies and are becoming the package of choice for size-reduction applications. These packages provide the small size and performance benefits of the bare die or flip chip, with the advantage of standard die packages. The JPL-led MicrotypeBGA Consortium of enterprises representing government agencies and private companies has joined together to pool in-kind resources for developing the quality and reliability of chip scale packages (CSPs) for a variety of projects. This talk will specifically cover the experience of our consortium with technology implementation challenges, including design and build of both standard and microvia boards, assembly of two types of test vehicles, and the most current environmental thermal cycling test results.

  15. Could life have evolved in cometary nuclei?

    NASA Astrophysics Data System (ADS)

    Bar-Nun, A.; Lazcano-Araujo, A.; Oró, J.

    1981-12-01

    … 6. Concerning viruses, the high specificity of host-parasite relationships and their coevolutionary lines of descent rule out a cometary origin for them. In summary, the view that life originated in comets is untenable in the light of all the available evidence.

  16. Evolved pesticide tolerance in amphibians: Predicting mechanisms based on pesticide novelty and mode of action.

    PubMed

    Hua, Jessica; Jones, Devin K; Mattes, Brian M; Cothran, Rickey D; Relyea, Rick A; Hoverman, Jason T

    2015-11-01

    We examined 10 wood frog populations distributed along an agricultural gradient for their tolerance to six pesticides (carbaryl, malathion, cypermethrin, permethrin, imidacloprid, and thiamethoxam) that differed in date of first registration (pesticide novelty) and mode-of-action (MOA). Our goals were to assess whether: 1) tolerance was correlated with distance to agriculture for each pesticide, 2) pesticide novelty predicted the likelihood of evolved tolerance, and 3) populations display cross-tolerance between pesticides that share and differ in MOA. Wood frog populations located close to agriculture were more tolerant to carbaryl and malathion than populations far from agriculture. Moreover, the strength of the relationship between distance to agriculture and tolerance was stronger for older pesticides compared to newer pesticides. Finally, we found evidence for cross-tolerance between carbaryl and malathion (two pesticides that share MOA). This study provides one of the most comprehensive approaches for understanding patterns of evolved tolerance in non-pest species.

  17. EVOLVE historical and projected orbital debris test environments

    NASA Astrophysics Data System (ADS)

    Krisko, P. H.

    2004-01-01

    The NASA/JSC orbital debris research effort within the Earth's low-altitude regime continues with the upgrade of the debris environment simulation model EVOLVE. Two main contributions to this new version will include a more streamlined structure (transparent to the analyst) and an updated, expanded set of launch/orbital insertion files. The new database includes such improvements as high fidelity launch times and orbital elements, data-derived area-to-mass ratios, and individual object dry mass and physical description. As an additional test of the new code, a version of the Anz-Meador [Adv. Space Res. (2004)] explosive fragmentation model is implemented, and the resulting historical and projected LEO environments are compared to those of the US Space Surveillance Network (SSN) Catalog, ORDEM2000, and EVOLVE 4.1 (the current production version of the code) over the historical and projection periods. EVOLVE test historical environments compare well with the catalog and ORDEM2000 environments over the 1-mm and larger size range. However, it should be noted that SRM slag is not included in EVOLVE at this time. Differences between the EVOLVE test and the EVOLVE 4.1 long-term projection environments are traced directly to the modified launch cycle and the chosen form of the Anz-Meador [Adv. Space Res. (2004)] breakup model of this EVOLVE test.

  18. Temporal firing reliability in response to periodic synaptic inputs

    NASA Astrophysics Data System (ADS)

    Hunter, John D.; Milton, John G.

    1998-03-01

    Reliable spike timing in the presence of noise is a prerequisite for a spike timing code. Previously we demonstrated that there is an intimate relationship between phase-locked firing patterns and spike timing reliability in the presence of noise: stable 1:m phase-locking generates reliable firing times; n:m phase-locked solutions where n ≠ 1 generate significantly less reliable spike times, where n is the number of spikes in m cycles of the stimulus. Here we compare spike timing reliability in an Aplysia motoneuron to that in a leaky integrate-and-fire neuron receiving either realistic periodic excitatory (EPSC) or inhibitory (IPSC) post-synaptic currents. For the same frequency and for identical synaptic time courses, EPSCs and IPSCs have opposite effects on spike timing reliability. This effect is shown to be a direct consequence of changes in the DC component of the input. Thus spike-time reliability is sensitively controlled by the interplay between the frequency and the DC component of input to the neuron.
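
    A minimal leaky integrate-and-fire sketch in the spirit of the comparison above, with assumed parameters rather than the study's: the same periodic drive plus trial-specific noise is applied on repeated trials, and spike-time reliability is summarized by the spread of spike phases over the stimulus period.

```python
# Leaky integrate-and-fire neuron with a noisy periodic drive; spike-phase
# spread across trials is used as a crude reliability measure.
import numpy as np

rng = np.random.default_rng(3)
dt, T_trial = 1e-4, 2.0                  # 0.1 ms step, 2 s per trial
tau, v_th, v_reset = 20e-3, 1.0, 0.0     # membrane time constant, threshold, reset
f_drive = 10.0                           # 10 Hz periodic input (assumed)
t = np.arange(0.0, T_trial, dt)
drive = 1.05 + 0.30 * np.sin(2.0 * np.pi * f_drive * t)   # suprathreshold drive

def run_trial(noise_sd=0.2):
    v, spikes = 0.0, []
    for i in range(t.size):
        v += dt / tau * (drive[i] - v) + noise_sd * np.sqrt(dt / tau) * rng.standard_normal()
        if v >= v_th:
            spikes.append(t[i])
            v = v_reset
    return np.asarray(spikes)

trials = [run_trial() for _ in range(30)]
phases = np.concatenate([s % (1.0 / f_drive) for s in trials])
print(f"mean spikes per trial   = {np.mean([s.size for s in trials]):.1f}")
print(f"spike-phase spread (ms) = {1e3 * phases.std():.2f}")
```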

  19. Petrography, Geochemistry, and Pairing Relationships of Basaltic Lunar Meteorite Miller Range 13317

    NASA Astrophysics Data System (ADS)

    Zeigler, R. A.; Korotev, R. L.

    2016-08-01

    A petrographic and geochemical description of "new" lunar meteorite MIL 13317, an evolved lunar basaltic regolith breccia. The pairing relationships with previously described lunar meteorites are also explored.

  20. Measurement, estimation, and prediction of software reliability

    NASA Technical Reports Server (NTRS)

    Hecht, H.

    1977-01-01

    Quantitative indices of software reliability are defined, and application of three important indices is indicated: (1) reliability measurement, (2) reliability estimation, and (3) reliability prediction. State of the art techniques for each of these procedures are presented together with considerations of data acquisition. Failure classifications and other documentation for comprehensive software reliability evaluation are described.
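
    As a hedged example of the estimation and prediction indices discussed above (not the paper's own techniques), the sketch below fits a Goel-Okumoto reliability-growth curve to synthetic cumulative failure counts and reads off the current failure intensity and the expected residual defects.

```python
# Goel-Okumoto software reliability growth fit, mu(t) = a * (1 - exp(-b*t)).
import numpy as np
from scipy.optimize import curve_fit

weeks = np.arange(1, 13)
cum_failures = np.array([5, 9, 13, 16, 18, 20, 21, 23, 24, 24, 25, 25], float)  # synthetic

def goel_okumoto(t, a, b):
    return a * (1.0 - np.exp(-b * t))

(a, b), _ = curve_fit(goel_okumoto, weeks, cum_failures, p0=(30.0, 0.2))
intensity_now = a * b * np.exp(-b * weeks[-1])          # failures per week
remaining = a - goel_okumoto(weeks[-1], a, b)
print(f"estimated total defects a = {a:.1f}, detection rate b = {b:.3f}")
print(f"current failure intensity = {intensity_now:.2f} per week")
print(f"expected residual defects = {remaining:.1f}")
```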

  1. Evolving role of pharmaceutical physicians in the industry: Indian perspective

    PubMed Central

    Patil, Anant; Rajadhyaksha, Viraj

    2012-01-01

    The Indian pharmaceutical industry, like any other industry, has undergone significant change in the last decade. The role of a Medical advisor has always been of paramount importance in the pharmaceutical companies in India. On account of the evolving medical science and the competitive environment, the medical advisor's role is also increasingly becoming critical. In India, with changes in regulatory rules, safety surveillance, and concept of medical liaisons, the role of the medical advisor is evolving continuously and is further likely to evolve in the coming years in important areas like health economics, public private partnerships, and strategic planning. PMID:22347701

  2. Heterogeneous edge weights promote epidemic diffusion in weighted evolving networks

    NASA Astrophysics Data System (ADS)

    Duan, Wei; Song, Zhichao; Qiu, Xiaogang

    2016-08-01

    The impact that the heterogeneities of links’ weights have on epidemic diffusion in weighted networks has received much attention. Investigating how heterogeneous edge weights affect epidemic spread is helpful for disease control. In this paper, we study a Reed-Frost epidemic model in weighted evolving networks. Our results indicate that a higher heterogeneity of edge weights leads to higher epidemic prevalence and epidemic incidence at earlier stage of epidemic diffusion in weighted evolving networks. In addition, weighted evolving scale-free networks come with a higher epidemic prevalence and epidemic incidence than unweighted scale-free networks.
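
    A hedged sketch of a Reed-Frost-type epidemic on a weighted graph: the transmission function, the Pareto edge-weight distribution, and the static Barabasi-Albert topology are illustrative assumptions (the study itself uses weighted evolving scale-free networks).

```python
# Reed-Frost chain-binomial epidemic with weight-dependent transmission.
import numpy as np
import networkx as nx

rng = np.random.default_rng(4)
G = nx.barabasi_albert_graph(2000, m=3, seed=4)
for u, v in G.edges():
    G[u][v]["w"] = rng.pareto(2.5) + 1.0      # heterogeneous edge weights (assumed)

beta = 0.05
def p_transmit(w):                            # weight-dependent infection probability
    return 1.0 - np.exp(-beta * w)

state = {node: "S" for node in G}             # S, I (infectious one generation), R
for seed_node in rng.choice(G.number_of_nodes(), size=5, replace=False):
    state[int(seed_node)] = "I"

incidence = []
while any(s == "I" for s in state.values()):
    new_inf = set()
    for u in [node for node, s in state.items() if s == "I"]:
        for v in G.neighbors(u):
            if state[v] == "S" and rng.random() < p_transmit(G[u][v]["w"]):
                new_inf.add(v)
    for node, s in list(state.items()):
        if s == "I":
            state[node] = "R"                 # infectious for exactly one generation
    for v in new_inf:
        state[v] = "I"
    incidence.append(len(new_inf))

print("generations:", len(incidence))
print("final epidemic size:", sum(1 for s in state.values() if s == "R"))
```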

  3. Electronics reliability and measurement technology

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S. (Editor)

    1987-01-01

    A summary is presented of the Electronics Reliability and Measurement Technology Workshop. The meeting examined the U.S. electronics industry with particular focus on reliability and state-of-the-art technology. A general consensus of the approximately 75 attendees was that "the U.S. electronics industries are facing a crisis that may threaten their existence". The workshop had specific objectives to discuss mechanisms to improve areas such as reliability, yield, and performance while reducing failure rates, delivery times, and cost. The findings of the workshop addressed various aspects of the industry from wafers to parts to assemblies. Key problem areas that were singled out for attention are identified, and action items necessary to accomplish their resolution are recommended.

  4. Reliability model for planetary gear

    NASA Technical Reports Server (NTRS)

    Savage, M.; Paridon, C. A.; Coy, J. J.

    1982-01-01

    A reliability model is presented for planetary gear trains in which the ring gear is fixed, the Sun gear is the input, and the planet arm is the output. The input and output shafts are coaxial and the input and output torques are assumed to be coaxial with these shafts. Thrust and side loading are neglected. This type of gear train is commonly used in main rotor transmissions for helicopters and in other applications which require high reductions in speed. The reliability model is based on the Weibull distribution of the individual reliabilities of the transmission components. The transmission's basic dynamic capacity is defined as the input torque which may be applied for one million input rotations of the Sun gear. Load and life are related by a power law. The load life exponent and basic dynamic capacity are developed as functions of the component capacities.
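
    The Weibull-plus-power-law structure described above can be sketched as follows; the component capacities, Weibull slopes, and load-life exponents are assumed numbers, not those derived in the report.

```python
# System reliability of a planetary stage as the product of Weibull component
# reliabilities, with load and life related by a power law.
import numpy as np

# (basic dynamic capacity C [N*m at 1e6 sun rotations, 90% reliability],
#  Weibull slope e, load-life exponent p) -- all values assumed.
components = {
    "sun gear":    (900.0, 2.5, 4.3),
    "planet gear": (1100.0, 2.5, 4.3),
    "ring gear":   (1500.0, 2.5, 4.3),
    "planet brg":  (700.0, 1.2, 3.0),
}

def component_reliability(torque, revolutions, C, e, p):
    l10 = 1.0e6 * (C / torque) ** p            # life (revs) at 90% reliability
    return np.exp(-np.log(1.0 / 0.9) * (revolutions / l10) ** e)

torque = 500.0                                 # applied input torque, N*m (assumed)
for n_rev in (1e5, 1e6, 1e7, 1e8):             # sun-gear input rotations
    r_sys = np.prod([component_reliability(torque, n_rev, *c)
                     for c in components.values()])
    print(f"{n_rev:9.0e} rev: system reliability = {r_sys:.4f}")
```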

  5. A Review of Score Reliability: Contemporary Thinking on Reliability Issues

    ERIC Educational Resources Information Center

    Rosen, Gerald A.

    2004-01-01

    Bruce Thompson's edited volume begins with a basic principle, one might call it a basic truth: "reliability is a property that applies to scores, and not immutably across all conceivable uses everywhere of a given measure" (p. 3). The author claims that this principle is little known and/or little understood. While that is an arguable point, the…

  6. Designing magnetic systems for reliability

    SciTech Connect

    Heitzenroeder, P.J.

    1991-01-01

    Designing magnetic systems is an iterative process in which the requirements are set, a design is developed, materials and manufacturing processes are defined, interrelationships with the various elements of the system are established, engineering analyses are performed, and fault modes and effects are studied. Reliability requires that all elements of the design process, from the seemingly most straightforward, such as utilities connection design and implementation, to the most sophisticated, such as advanced finite element analyses, receive a balanced and appropriate level of attention. D.B. Montgomery's study of magnet failures has shown that the predominance of magnet failures tends not to be in the most intensively engineered areas, but is associated with insulation, leads, and unanticipated conditions. TFTR, JET, JT-60, and PBX are all major tokamaks which have suffered loss of reliability due to water leaks. Similarly, the majority of causes of loss of magnet reliability at PPPL have not been in the sophisticated areas of the design but are due to difficulties associated with coolant connections, bus connections, and external structural connections. Looking towards the future, the major next devices such as BPX and ITER are more costly and complex than any of their predecessors and are pressing the bounds of operating levels, materials, and fabrication. Emphasis on reliability is a must as the fusion program enters a phase where there are fewer, but very costly, devices with the goal of reaching a reactor prototype stage in the next two or three decades. This paper reviews some of the magnet reliability issues which PPPL has faced over the years, the lessons learned from them, and magnet design and fabrication practices which have been found to contribute to magnet reliability.

  7. Mechanically reliable scales and coatings

    SciTech Connect

    Tortorelli, P.F.; Alexander, K.B.

    1995-07-01

    As the first stage in examining the mechanical reliability of protective surface oxides, the behavior of alumina scales formed on iron-aluminum alloys during high-temperature cyclic oxidation was characterized in terms of damage and spallation tendencies. Scales were thermally grown on specimens of three iron-aluminum composition using a series of exposures to air at 1000{degrees}C. Gravimetric data and microscopy revealed substantially better integrity and adhesion of the scales grown on an alloy containing zirconium. The use of polished (rather than just ground) specimens resulted in scales that were more suitable for subsequent characterization of mechanical reliability.

  8. Metrological Reliability of Medical Devices

    NASA Astrophysics Data System (ADS)

    Costa Monteiro, E.; Leon, L. F.

    2015-02-01

    The prominent development of health technologies of the 20th century triggered demands for metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of the biomeasurements results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with the analysis of their contributions to guarantee the innovative health technologies compliance with the main ethical pillars of Bioethics.

  9. Reliability in the design phase

    SciTech Connect

    Siahpush, A.S.; Hills, S.W.; Pham, H. ); Majumdar, D. )

    1991-12-01

    A study was performed to determine the common methods and tools that are available to calculate or predict a system's reliability. A literature review and software survey are included. The desired product of this developmental work is a tool for the system designer to use in the early design phase so that the final design will achieve the desired system reliability without lengthy testing and rework. Three computer programs were written which provide the first attempt at fulfilling this need. The programs are described and a case study is presented for each one. This is a continuing effort which will be furthered in FY-1992. 10 refs.

  11. Photovoltaic power system reliability considerations

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.

    1980-01-01

    An example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems is presented. This particular application is for a solar cell power system demonstration project designed to provide electric power requirements for remote villages. The techniques utilized involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of fail-safe and planned spare parts engineering philosophy.

  12. System reliability and risk assessment task goals and status

    NASA Astrophysics Data System (ADS)

    Cruse, T. A.; Mahadevan, S.

    1991-05-01

    The major focus for continued development of the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) codes is in support of system testing and certification of advanced propulsion systems. Propulsion system testing has evolved over the years from tests designed to show success to tests designed to reveal reliability issues before service use. Such test conditions as performance envelope corners, high rotor imbalance, power dwells, and overspeed tests are designed to shake out problems that can be associated with low and high cycle fatigue, creep and stress rupture, bearing durability, and the like. Subsystem testing supports system certification by standing as an early evaluation of the same durability and reliability concerns as for the entire system. The NESSUS software system is being further developed to support rigorous subsystem and system test definition and reliability certification. The principal technical issues related to system reliability are outlined, including key technology issues such as failure mode synergism, sequential failure mechanisms, and fault tree definition.

  13. Polarization and studies of evolved star mass loss

    NASA Astrophysics Data System (ADS)

    Sargent, Benjamin; Srinivasan, Sundar; Riebel, David; Meixner, Margaret

    2012-05-01

    Polarization studies of astronomical dust have proven very useful in constraining its properties. Such studies are used to constrain the spatial arrangement, shape, composition, and optical properties of astronomical dust grains. Here we explore possible connections between astronomical polarization observations and our studies of mass loss from evolved stars. We are studying evolved star mass loss in the Large Magellanic Cloud (LMC) by using photometry from the Surveying the Agents of a Galaxy's Evolution (SAGE; PI: M. Meixner) Spitzer Space Telescope Legacy program. We use the radiative transfer program 2Dust to create our Grid of Red supergiant and Asymptotic giant branch ModelS (GRAMS), in order to model this mass loss. To model emission of polarized light from evolved stars, however, we appeal to other radiative transfer codes. We probe how polarization observations might be used to constrain the dust shell and dust grain properties of the samples of evolved stars we are studying.

  14. Evolutionary genetics: you are what you evolve to eat.

    PubMed

    Dworkin, Ian; Jones, Corbin D

    2015-04-20

    The evolution of host specialization can potentially limit future evolutionary opportunities. A new study now shows how Drosophila sechellia, specialized on the toxic Morinda fruit, has evolved new nutritional needs influencing its reproduction.

  15. Evolved Stars: Interferometer Baby Food or Staple Diet?

    NASA Astrophysics Data System (ADS)

    Tuthill, Peter

    With their extreme red and infrared luminosities and large apparent diameters, evolved stars have nurtured generations of interferometers (beginning with Michelson's work on Betelgeuse) with unique science programs at attainable resolutions. Furthermore, the inflated photosphere and circumstellar material associated with dying stars presents complex targets with asymmetric structure on many scales encoding a wealth of poorly-understood astrophysics. A brief review the major past milestones and future prospects for interferometry's contribution to studies of circumstellar matter in evolved stars is presented.

  16. Wanted: A Solid, Reliable PC

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2004-01-01

    This article discusses PC reliability, one of the most pressing issues regarding computers. Nearly a quarter century after the introduction of the first IBM PC and the outset of the personal computer revolution, PCs have largely become commodities, with little differentiating one brand from another in terms of capability and performance. Most of…

  17. Web Awards: Are They Reliable?

    ERIC Educational Resources Information Center

    Everhart, Nancy; McKnight, Kathleen

    1997-01-01

    School library media specialists recommend quality Web sites to children based on evaluations and Web awards. This article examines three types of Web awards and who grants them, suggests ways to determine their reliability, and discusses specific award sites. Includes a bibliography of Web sites. (PEN)

  18. Becoming a high reliability organization

    PubMed Central

    2011-01-01

    Aircraft carriers, electrical power grids, and wildland firefighting, though seemingly different, are exemplars of high reliability organizations (HROs) - organizations that have the potential for catastrophic failure yet engage in nearly error-free performance. HROs commit to safety at the highest level and adopt a special approach to its pursuit. High reliability organizing has been studied and discussed for some time in other industries and is receiving increasing attention in health care, particularly in high-risk settings like the intensive care unit (ICU). The essence of high reliability organizing is a set of principles that enable organizations to focus attention on emergent problems and to deploy the right set of resources to address those problems. HROs behave in ways that sometimes seem counterintuitive - they do not try to hide failures but rather celebrate them as windows into the health of the system, they seek out problems, they avoid focusing on just one aspect of work and are able to see how all the parts of work fit together, they expect unexpected events and develop the capability to manage them, and they defer decision making to local frontline experts who are empowered to solve problems. Given the complexity of patient care in the ICU, the potential for medical error, and the particular sensitivity of critically ill patients to harm, high reliability organizing principles hold promise for improving ICU patient care. PMID:22188677

  19. The Reliability of College Grades

    ERIC Educational Resources Information Center

    Beatty, Adam S.; Walmsley, Philip T.; Sackett, Paul R.; Kuncel, Nathan R.; Koch, Amanda J.

    2015-01-01

    Little is known about the reliability of college grades relative to how prominently they are used in educational research, and the results to date tend to be based on small sample studies or are decades old. This study uses two large databases (N > 800,000) from over 200 educational institutions spanning 13 years and finds that both first-year…

  20. Averaging Internal Consistency Reliability Coefficients

    ERIC Educational Resources Information Center

    Feldt, Leonard S.; Charter, Richard A.

    2006-01-01

    Seven approaches to averaging reliability coefficients are presented. Each approach starts with a unique definition of the concept of "average," and no approach is more correct than the others. Six of the approaches are applicable to internal consistency coefficients. The seventh approach is specific to alternate-forms coefficients. Although the…

  1. Photovoltaic performance and reliability workshop

    SciTech Connect

    Kroposki, B

    1996-10-01

    This proceedings is the compilation of papers presented at the ninth PV Performance and Reliability Workshop held at the Sheraton Denver West Hotel on September 4--6, 1996. This year's workshop included presentations from 25 speakers and had over 100 attendees. All of the presentations that were given are included in this proceedings. Topics of the papers included: defining service lifetime and developing models for PV module lifetime; examining and determining failure and degradation mechanisms in PV modules; combining IEEE/IEC/UL testing procedures; AC module performance and reliability testing; inverter reliability/qualification testing; standardization of utility interconnect requirements for PV systems; the need for activities to separate variables by testing individual components of PV systems (e.g., cells, modules, batteries, inverters, charge controllers) for individual reliability and then testing them in actual system configurations; further results reported from field experience on modules, inverters, batteries, and charge controllers from field-deployed PV systems; and system certification and standardized testing for stand-alone and grid-tied systems.

  2. Reliability Analysis of Money Habitudes

    ERIC Educational Resources Information Center

    Delgadillo, Lucy M.; Bushman, Brittani S.

    2015-01-01

    Use of the Money Habitudes exercise has gained popularity among various financial professionals. This article reports on the reliability of this resource. A survey administered to young adults at a western state university was conducted, and each Habitude or "domain" was analyzed using Cronbach's alpha procedures. Results showed all six…

  3. Becoming a high reliability organization.

    PubMed

    Christianson, Marlys K; Sutcliffe, Kathleen M; Miller, Melissa A; Iwashyna, Theodore J

    2011-01-01

    Aircraft carriers, electrical power grids, and wildland firefighting, though seemingly different, are exemplars of high reliability organizations (HROs)--organizations that have the potential for catastrophic failure yet engage in nearly error-free performance. HROs commit to safety at the highest level and adopt a special approach to its pursuit. High reliability organizing has been studied and discussed for some time in other industries and is receiving increasing attention in health care, particularly in high-risk settings like the intensive care unit (ICU). The essence of high reliability organizing is a set of principles that enable organizations to focus attention on emergent problems and to deploy the right set of resources to address those problems. HROs behave in ways that sometimes seem counterintuitive--they do not try to hide failures but rather celebrate them as windows into the health of the system, they seek out problems, they avoid focusing on just one aspect of work and are able to see how all the parts of work fit together, they expect unexpected events and develop the capability to manage them, and they defer decision making to local frontline experts who are empowered to solve problems. Given the complexity of patient care in the ICU, the potential for medical error, and the particular sensitivity of critically ill patients to harm, high reliability organizing principles hold promise for improving ICU patient care.

  4. Wind turbine reliability database update.

    SciTech Connect

    Peters, Valerie A.; Hill, Roger Ray; Stinebaugh, Jennifer A.; Veers, Paul S.

    2009-03-01

    This report documents the status of the Sandia National Laboratories' Wind Plant Reliability Database. Included in this report are updates on the form and contents of the Database, which stems from a five-step process of data partnerships, data definition and transfer, data formatting and normalization, analysis, and reporting. Selected observations are also reported.

  5. Genetically evolved receptor models: a computational approach to construction of receptor models.

    PubMed

    Walters, D E; Hinds, R M

    1994-08-01

    Given the three-dimensional structure of a receptor site, there are several methods available for designing ligands to occupy the site; frequently, the three-dimensional structure of interesting receptors is not known, however. The GERM program uses a genetic algorithm to produce atomic-level models of receptor sites, based on a small set of known structure-activity relationships. The evolved models show a high correlation between calculated intermolecular energies and bioactivities; they also give reasonable predictions of bioactivity for compounds which were not included in model generation. Such models may serve as starting points for computational or human ligand design efforts. PMID:8057298

  6. To evolve an ear: Epistemological implications of Gordon Pask's electrochemical devices

    SciTech Connect

    Cariani, P.

    1993-12-31

    In the late 1950s Gordon Pask constructed several electrochemical devices having emergent sensory capabilities. These control systems possessed the ability to adaptively construct their own sensors, thereby choosing the relationship between their internal states and the world at large. Devices were built that evolved de novo sensitivity to sound or magnetic fields. Pask's devices have far-reaching implications for artificial intelligence, self-constructing devices, theories of observers and epistemically-autonomous agents, theories of functional emergence, machine creativity, and the limits of contemporary machine learning paradigms. 44 refs.

  8. Bioharness(™) Multivariable Monitoring Device: Part. II: Reliability.

    PubMed

    Johnstone, James A; Ford, Paul A; Hughes, Gerwyn; Watson, Tim; Garrett, Andrew T

    2012-01-01

    The Bioharness(™) monitoring system may provide physiological information on human performance, but the reliability of these data is fundamental for confidence in the equipment being used. The objective of this study was to assess the reliability of each of the 5 Bioharness(™) variables using a treadmill-based protocol. 10 healthy males participated. A between- and within-subject design to assess the reliability of heart rate (HR), breathing frequency (BF), accelerometry (ACC) and infra-red skin temperature (ST) was completed via a repeated, discontinuous, incremental treadmill protocol. Posture (P) was assessed by a tilt table moved through 160°. Between-subject data reported low coefficients of variation (CV) and strong correlations (r) for ACC and P (CV < 7.6; r = 0.99, p < 0.01). In contrast, HR and BF (CV ~19.4; r ~0.70, p < 0.01) and ST (CV 3.7; r = 0.61, p < 0.01) presented more variable data. Intra- and inter-device data presented strong relationships (r > 0.89, p < 0.01) and low CV (<10.1) for HR, ACC, P and ST. BF produced weaker relationships (r < 0.72) and higher CV (<17.4). In comparison to the other variables, BF consistently presented less reliability. Global results suggest that the Bioharness(™) is a reliable multivariable monitoring device during laboratory testing within the limits presented. Key points: heart rate and breathing frequency data increased in variance at higher velocities (i.e., ≥ 10 km.h(-1)); in comparison to the between-subject testing, the intra- and inter-device comparisons presented good reliability, suggesting that placement or position of the device relative to the performer could be important for data collection; understanding a device's variability in measurement is important before it can be used within an exercise testing or monitoring setting. PMID:24149347
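
    A hedged sketch of the two reliability statistics reported above, within-subject coefficient of variation and Pearson's r, applied to invented trial/retrial heart-rate readings rather than the study's data.

```python
# Within-subject CV and trial-retrial correlation for a synthetic device test.
import numpy as np

rng = np.random.default_rng(5)
true_hr = rng.normal(150.0, 12.0, size=10)              # 10 participants (assumed)
trial1 = true_hr + rng.normal(0.0, 4.0, size=10)        # device reading, trial 1
trial2 = true_hr + rng.normal(0.0, 4.0, size=10)        # device reading, trial 2

pairs = np.stack([trial1, trial2])                       # shape (2, subjects)
within_cv = 100.0 * pairs.std(axis=0, ddof=1) / pairs.mean(axis=0)
r = np.corrcoef(trial1, trial2)[0, 1]
print(f"mean within-subject CV  = {within_cv.mean():.1f}%")
print(f"trial-retrial Pearson r = {r:.2f}")
```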

  9. Preliminary study of the reliability of imaging charge coupled devices

    NASA Technical Reports Server (NTRS)

    Beall, J. R.; Borenstein, M. D.; Homan, R. A.; Johnson, D. L.; Wilson, D. D.; Young, V. F.

    1978-01-01

    Imaging CCDs are capable of low light level response and high signal-to-noise ratios. In space applications they offer the user the ability to achieve extremely high resolution imaging with minimum circuitry in the photo sensor array. This work relates the CCD121H Fairchild device to the fundamentals of CCDs and the representative technologies. Several failure modes are described, construction is analyzed and test results are reported. In addition, the relationship of the device reliability to packaging principles is analyzed and test data presented. Finally, a test program is defined for more general reliability evaluation of CCDs.

  10. Climate in Context - How partnerships evolve in regions

    NASA Astrophysics Data System (ADS)

    Parris, A. S.

    2014-12-01

    In 2015, NOAA's RISA program will celebrate its 20th year of exploration in the development of usable climate information. In the mid-1990s, a vision emerged to develop interdisciplinary research efforts at the regional scale for several important reasons. Recognizable climate patterns, such as the El Nino Southern Oscillation (ENSO), emerge at the regional level where our understanding of observations and models coalesce. Critical resources for society are managed in a context of regional systems, such as water supply and human populations. Multiple scales of governance (local, state, and federal) with complex institutional relationships can be examined across a region. Climate information (i.e. data, science, research etc) developed within these contexts has greater potential for use. All of this work rests on a foundation of iterative engagement between scientists and decision makers. Throughout these interactions, RISAs have navigated diverse politics, extreme events and disasters, socio-economic and ecological disruptions, and advances in both science and technology. Our understanding of information needs is evolving into a richer understanding of complex institutional, legal, political, and cultural contexts within which people can use science to make informed decisions. The outcome of RISA work includes both cases where climate information was used in decisions and cases where capacity for using climate information and making climate resilient decisions has increased over time. In addition to balancing supply and demand of scientific information, RISAs are engaged in a social process of reconciling climate information use with important drivers of society. Because partnerships are critical for sustained engagement, and because engagement is critically important to the use of science, the rapid development of new capacity in regionally-based science programs focused on providing climate decision support is both needed and challenging. New actors can bolster

  11. Phenotypic effect of mutations in evolving populations of RNA molecules

    PubMed Central

    2010-01-01

    Background: The secondary structure of folded RNA sequences is a good model to map phenotype onto genotype, as represented by the RNA sequence. Computational studies of the evolution of ensembles of RNA molecules towards target secondary structures yield valuable clues to the mechanisms behind adaptation of complex populations. The relationship between the space of sequences and structures, the organization of RNA ensembles at mutation-selection equilibrium, the time of adaptation as a function of the population parameters, the presence of collective effects in quasispecies, or the optimal mutation rates to promote adaptation all are issues that can be explored within this framework. Results: We investigate the effect of microscopic mutations on the phenotype of RNA molecules during their in silico evolution and adaptation. We calculate the distribution of the effects of mutations on fitness, the relative fractions of beneficial and deleterious mutations and the corresponding selection coefficients for populations evolving under different mutation rates. Three different situations are explored: the mutation-selection equilibrium (optimized population) in three different fitness landscapes, the dynamics during adaptation towards a goal structure (adapting population), and the behavior under periodic population bottlenecks (perturbed population). Conclusions: The ratio between the number of beneficial and deleterious mutations experienced by a population of RNA sequences increases with the value of the mutation rate μ at which evolution proceeds. In contrast, the selective value of mutations remains almost constant, independent of μ, indicating that adaptation occurs through an increase in the amount of beneficial mutations, with little variations in the average effect they have on fitness. Statistical analyses of the distribution of fitness effects reveal that small effects, either beneficial or deleterious, are well described by a Pareto distribution. These results
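
    As a hedged illustration of the Pareto analysis mentioned in the conclusions, the sketch below draws a synthetic sample of mutational fitness effects and recovers the Pareto exponent by maximum likelihood; the sample, its beneficial fraction, and the cut-off are invented.

```python
# Pareto (power-law) fit to a synthetic distribution of fitness effects.
import numpy as np

rng = np.random.default_rng(6)
n = 5000
signs = np.where(rng.random(n) < 0.15, 1.0, -1.0)          # minority beneficial (assumed)
magnitudes = 1e-3 * (rng.pareto(a=1.8, size=n) + 1.0)       # Pareto(alpha=1.8, xmin=1e-3)
effects = signs * magnitudes

n_ben = np.sum(effects > 0)
n_del = np.sum(effects < 0)
print(f"beneficial/deleterious ratio = {n_ben / n_del:.2f}")

x_min = 1e-3                                                # fitted lower cut-off
s = np.abs(effects)[np.abs(effects) >= x_min]
alpha_hat = s.size / np.sum(np.log(s / x_min))              # Pareto shape MLE
print(f"estimated Pareto exponent = {alpha_hat:.2f} (true 1.8)")
```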

  12. Crossroads and Connections: An Evolving Relationship between NASA and the Navajo Nation

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Carron, A.

    2010-08-01

    Is working with Native Americans business as usual? We live in a project-based world that operates on three-to-five-year grants. A long term commitment can be next to impossible to keep, even if you have the best of intentions. Are there things one "must know" before approaching an indigenous population? How is it best to evaluate projects and programs involving Native Americans? In the NASA and the Navajo Nation project, which will turn five in January, 2010, we have compiled some key lessons learned that we hope will inform and encourage future partnerships between the space science education and Native American communities.

  13. The evolving placenta: convergent evolution of variations in the endotheliochorial relationship.

    PubMed

    Enders, A C; Carter, A M

    2012-05-01

    Endotheliochorial placentas occur in orders from all four major clades of eutherian mammal. Species with this type of placenta include one of the smallest (pygmy shrew) and largest (African elephant) land mammals. The endotheliochorial placenta as a definitive form has an interhemal area consisting of maternal endothelium, interstitial lamina, trophoblast, individual or conjoint basal laminas, and fetal endothelium. We commonly think of such placentas as having hypertrophied maternal endothelium with abundant rough endoplasmic reticulum (rER), and as having hemophagous regions. Considering them as a whole, the trophoblast may be syncytial or cellular, fenestrated or nonfenestrated, and there may or may not be hemophagous regions. Variations also appear in the extent of hypertrophy of the maternal endothelium and in the abundance of rER in these cells. This combination of traits and a few other features produces many morphological variants. In addition to endotheliochorial as a definitive condition, a transitory endotheliochorial condition may appear in the course of forming a hemochorial placenta. In some emballonurid bats the early endotheliochorial placenta has two layers of trophoblast, but the definitive placenta lacks an outer syncytial trophoblast layer. In molossid bats a well developed endotheliochorial placenta is present for a short time even after a definitive hemochorial placenta has developed in a different region. It is concluded that the endotheliochorial placenta is more widespread and diversified than originally thought, with the variant with cellular trophoblast in particular appearing in several species studied recently.

  14. The States and Higher Education: An Evolving Relationship at a Pivotal Moment

    ERIC Educational Resources Information Center

    Meotti, Michael P.

    2016-01-01

    The "proud-parent" attitude of states towards higher education between 1945 and 1970--due to the baby boom, the technological contributions that research universities had made to the war effort, and the GI Bill--began to cool in the late 1960s, when inflation and increasing demands from other state services such as Medicaid, prisons,…

  15. TEMPI: probabilistic modeling time-evolving differential PPI networks with multiPle information

    PubMed Central

    Kim, Yongsoo; Jang, Jin-Hyeok; Choi, Seungjin; Hwang, Daehee

    2014-01-01

    Motivation: Time-evolving differential protein–protein interaction (PPI) networks are essential to understand serial activation of differentially regulated (up- or downregulated) cellular processes (DRPs) and their interplays over time. Despite developments in the network inference, current methods are still limited in identifying temporal transition of structures of PPI networks, DRPs associated with the structural transition and the interplays among the DRPs over time. Results: Here, we present a probabilistic model for estimating Time-Evolving differential PPI networks with MultiPle Information (TEMPI). This model describes probabilistic relationships among network structures, time-course gene expression data and Gene Ontology biological processes (GOBPs). By maximizing the likelihood of the probabilistic model, TEMPI estimates jointly the time-evolving differential PPI networks (TDNs) describing temporal transition of PPI network structures together with serial activation of DRPs associated with transiting networks. This joint estimation enables us to interpret the TDNs in terms of temporal transition of the DRPs. To demonstrate the utility of TEMPI, we applied it to two time-course datasets. TEMPI identified the TDNs that correctly delineated temporal transition of DRPs and time-dependent associations between the DRPs. These TDNs provide hypotheses for mechanisms underlying serial activation of key DRPs and their temporal associations. Availability and implementation: Source code and sample data files are available at http://sbm.postech.ac.kr/tempi/sources.zip. Contact: seungjin@postech.ac.kr or dhwang@dgist.ac.kr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25161233

  16. Variability in Reliability Coefficients and the Standard Error of Measurement from School District to District.

    ERIC Educational Resources Information Center

    Feldt, Leonard S.; Qualls, Audrey L.

    1999-01-01

    Examined the stability of the standard error of measurement and the relationship between the reliability coefficient and the variance of both true scores and error scores for 170 school districts in a state. As expected, reliability coefficients varied as a function of group variability, but the variation in split-half coefficients from school to…
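
    The SEM-reliability relationship underlying this study can be sketched with simulated districts: with a constant error standard deviation, the reliability coefficient shifts with group variability while SEM = SD*sqrt(1 - r) stays nearly constant. All numbers below are invented.

```python
# Reliability vs. SEM across groups that differ only in true-score spread.
import numpy as np

rng = np.random.default_rng(7)
error_sd = 4.0                                   # same test, same error SD
for true_sd in (6.0, 9.0, 12.0):                 # districts differ in spread
    true_scores = rng.normal(250.0, true_sd, size=2000)
    x1 = true_scores + rng.normal(0.0, error_sd, size=2000)
    x2 = true_scores + rng.normal(0.0, error_sd, size=2000)   # parallel form
    r = np.corrcoef(x1, x2)[0, 1]                # reliability estimate
    sem = x1.std(ddof=1) * np.sqrt(1.0 - r)
    print(f"true SD {true_sd:4.1f}: reliability {r:.2f}, SEM {sem:.2f}")
```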

  17. A Model for Estimating the Reliability and Validity of Criterion-Referenced Measures.

    ERIC Educational Resources Information Center

    Edmonston, Leon P.; Randall, Robert S.

    A decision model designed to determine the reliability and validity of criterion referenced measures (CRMs) is presented. General procedures which pertain to the model are discussed as to: Measures of relationship, Reliability, Validity (content, criterion-oriented, and construct validation), and Item Analysis. The decision model is presented in…

  18. Differences in Reliability of Reproductive History Recall among Women in North Africa

    ERIC Educational Resources Information Center

    Soliman, Amr; Allen, Katharine; Lo, An-Chi; Banerjee, Mousumi; Hablas, Ahmed; Benider, Abdellatif; Benchekroun, Nadya; Samir, Salwa; Omar, Hoda G.; Merajver, Sofia; Mullan, Patricia

    2009-01-01

    Breast cancer is the most common cancer among women in North Africa. Women in this region have unique reproductive profiles. It is essential to obtain reliable information on reproductive histories to help better understand the relationship between reproductive health and breast cancer. We tested the reliability of a reproductive history-based…

  19. On-orbit spacecraft reliability

    NASA Technical Reports Server (NTRS)

    Bloomquist, C.; Demars, D.; Graham, W.; Henmi, P.

    1978-01-01

    Operational and historic data for 350 spacecraft from 52 U.S. space programs were analyzed for on-orbit reliability. Failure rate estimates are made for on-orbit operation of spacecraft subsystems, components, and piece parts, as well as estimates of failure probability for the same elements during launch. Confidence intervals for both parameters are also given. The results indicate that: (1) the success of spacecraft operation is only slightly affected by most reported incidents of anomalous behavior; (2) the occurrence of the majority of anomalous incidents could have been prevented prior to launch; (3) no detrimental effect of spacecraft dormancy is evident; (4) cycled components in general are not demonstrably less reliable than uncycled components; and (5) application of product assurance elements is conducive to spacecraft success.
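
    A hedged sketch of the kind of estimate reported above: a constant failure rate with a two-sided chi-square confidence interval from pooled operating hours, assuming exponential times to failure; the counts are invented, not drawn from the 350-spacecraft data set.

```python
# Failure rate point estimate and chi-square confidence interval.
import numpy as np
from scipy.stats import chi2

failures = 7                      # observed subsystem failures (invented)
unit_hours = 1.2e6                # pooled on-orbit operating hours (invented)
alpha = 0.10                      # 90% two-sided interval

lam_hat = failures / unit_hours
lo = chi2.ppf(alpha / 2.0, 2 * failures) / (2.0 * unit_hours)
hi = chi2.ppf(1.0 - alpha / 2.0, 2 * failures + 2) / (2.0 * unit_hours)
print(f"failure rate = {lam_hat:.2e}/h  (90% CI {lo:.2e} .. {hi:.2e})")

# One-year mission reliability implied by the point estimate:
print(f"R(1 yr) = {np.exp(-lam_hat * 8760.0):.4f}")
```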

  20. Radioactive Reliability of Programmable Memories

    NASA Astrophysics Data System (ADS)

    Loncar, Boris; Osmokrovic, Predrag; Stojanovic, Marko; Stankovic, Srboljub

    2001-02-01

    In this study, we examine the reliability of erasable programmable read only memory (EPROM) and electrically erasable programmable read only memory (EEPROM) components under the influence of gamma radiation. This problem has significance in military industry and space technology. Total dose results are presented for the JL 27C512D EPROM and 28C64C EEPROM components. There is evidence that EPROM components have better radioactive reliability than EEPROM components. Also, the changes to the EPROM are reversible, and after erasing and reprogramming all EPROM components are functional. On the other hand, changes to the EEPROM are irreversible, and under the influence of gamma radiation, all EEPROM components became permanently nonfunctional. The obtained results are analyzed and explained via the interaction of gamma radiation with oxide layers.

  1. What makes a family reliable?

    NASA Technical Reports Server (NTRS)

    Williams, James G.

    1992-01-01

    Asteroid families are clusters of asteroids in proper element space which are thought to be fragments from former collisions. Studies of families promise to improve understanding of large collision events, and a large event can open up the interior of a former parent body to view. While a variety of searches for families have found the same heavily populated families, and some searches have found the same families of lower population, there is much apparent disagreement among the lower-population families proposed by different investigations. Indicators of reliability, factors compromising reliability, an illustration of the influence of different data samples, and a discussion of how several investigations perceived families in the same region of proper element space are given.

  2. Three approaches to reliability analysis

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.

    1989-01-01

    It is noted that current reliability analysis tools differ not only in their solution techniques, but also in their approach to model abstraction. The analyst must be satisfied with the constraints that are intrinsic to any combination of solution technique and model abstraction. To get a better idea of the nature of these constraints, three reliability analysis tools (HARP, ASSIST/SURE, and CAME) were used to model portions of the Integrated Airframe/Propulsion Control System architecture. When presented with the example problem, all three tools failed to produce correct results. In all cases, either the tool or the model had to be modified. It is suggested that most of the difficulty is rooted in the large model size and long computational times which are characteristic of Markov model solutions.

  3. Gearbox Reliability Collaborative Bearing Calibration

    SciTech Connect

    van Dam, J.

    2011-10-01

    NREL has initiated the Gearbox Reliability Collaborative (GRC) to investigate the root causes of low wind turbine gearbox reliability. The GRC follows a multi-pronged approach based on a collaborative of manufacturers, owners, researchers and consultants. The project combines analysis, field testing, dynamometer testing, condition monitoring, and the development and population of a gearbox failure database. At the core of the project are two 750 kW gearboxes that have been redesigned and rebuilt so that they are representative of the multi-megawatt gearbox topology currently used in the industry. These gearboxes are heavily instrumented and are tested in the field and on the dynamometer. This report discusses the bearing calibrations of the gearboxes.

  4. Assessment of NDE Reliability Data

    NASA Technical Reports Server (NTRS)

    Yee, B. G. W.; Chang, F. H.; Covchman, J. C.; Lemon, G. H.; Packman, P. F.

    1976-01-01

    Twenty sets of relevant Nondestructive Evaluation (NDE) reliability data have been identified, collected, compiled, and categorized. A criterion for the selection of data for statistical analysis considerations has been formulated. A model to grade the quality and validity of the data sets has been developed. Data input formats, which record the pertinent parameters of the defect/specimen and inspection procedures, have been formulated for each NDE method. A comprehensive computer program has been written to calculate the probability of flaw detection at several confidence levels by the binomial distribution. This program also selects the desired data sets for pooling and tests the statistical pooling criteria before calculating the composite detection reliability. Probability of detection curves at 95 and 50 percent confidence levels have been plotted for individual sets of relevant data as well as for several sets of merged data with common sets of NDE parameters.
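
    The probability-of-detection (POD) calculation described above can be illustrated with a short sketch. The code below is a stand-in, not the report's program: it computes the binomial point estimate and a one-sided lower confidence bound (Clopper-Pearson form) for a single flaw-size class with invented hit/trial counts, and it ignores the report's pooling criteria.

      # One-sided lower confidence bound on probability of detection (POD) for a
      # binomial hits-out-of-trials record, via the Clopper-Pearson (beta) relation.
      from scipy.stats import beta

      def pod_lower_bound(hits: int, trials: int, confidence: float = 0.95) -> float:
          """Lower confidence bound on POD at the given one-sided confidence level."""
          if hits == 0:
              return 0.0
          return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

      hits, trials = 28, 30                          # invented inspection record
      print(f"POD point estimate: {hits / trials:.3f}")
      print(f"95% lower bound:    {pod_lower_bound(hits, trials, 0.95):.3f}")
      print(f"50% lower bound:    {pod_lower_bound(hits, trials, 0.50):.3f}")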

  5. Some Reliability Estimates for Computerized Adaptive Tests.

    ERIC Educational Resources Information Center

    Nicewander, W. Alan; Thomasson, Gary L.

    1999-01-01

    Derives three reliability estimates for the Bayes modal estimate (BME) and the maximum-likelihood estimate (MLE) of theta in computerized adaptive tests (CATs). Computes the three reliability estimates and the true reliabilities of both BME and MLE for seven simulated CATs. Results show the true reliabilities for BME and MLE to be nearly identical…
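
    The abstract does not spell out the three estimators, so the sketch below should be read only as a hedged illustration of two common conventions for summarizing CAT reliability in a simulation study: the "true" reliability available when simulated abilities are known, and an empirical estimate built from the reported standard errors. All inputs are synthetic.

      # Two common reliability summaries for simulated CAT ability estimates.
      # theta_true, the standard errors, and the point estimates are all synthetic.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 2000
      theta_true = rng.normal(0.0, 1.0, n)           # simulees' true abilities
      se = np.full(n, 0.35)                          # per-person SEs reported by the CAT
      theta_hat = theta_true + rng.normal(0.0, se)   # MLE/BME-style point estimates

      # "True" reliability, available only in simulation: squared correlation with truth.
      true_rel = np.corrcoef(theta_true, theta_hat)[0, 1] ** 2

      # Model-based estimate usable with real data: empirical reliability from the SEs.
      emp_rel = 1.0 - np.mean(se ** 2) / np.var(theta_hat, ddof=1)

      print(f"true reliability ~ {true_rel:.3f}, empirical reliability ~ {emp_rel:.3f}")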

  6. JMP Applications in Photovoltaic Reliability (Presentation)

    SciTech Connect

    Jordan, D.; Gotwalt, C.

    2011-09-01

    The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted into power and how this conversion efficiency develops over time. Accurate knowledge of power decline over time, also known as the degradation rate, is essential to all stakeholders: utility companies, integrators, investors, and scientists alike. Outdoor testing plays a vital part in quantifying degradation rates of different technologies in various climates. Due to seasonal changes, however, several complete annual cycles (typically 3-5 years) have traditionally been needed to obtain reasonably accurate degradation rates. In a rapidly evolving industry such a time span is often unacceptable, and the need exists to determine degradation rates more accurately in a shorter period of time. Advanced time series modeling such as ARIMA (Autoregressive Integrated Moving Average) modeling can be utilized to decrease the required time span, and is compared with some non-linear modeling. In addition, it is demonstrated how the JMP 9 map feature was used to reveal important technological trends by climate.
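
    A minimal sketch of the trend-extraction idea is given below. It is not the authors' JMP/ARIMA workflow: it fits a statsmodels ARIMA model with AR(1) errors and a linear time regressor to a synthetic monthly performance-ratio series (the seasonality, noise, and -0.8 %/yr decline are all invented) and reads the regressor coefficient as the degradation rate.

      # Estimating a PV degradation rate (%/yr) from a synthetic monthly performance-ratio
      # series with an ARIMA model plus a linear time regressor (statsmodels assumed).
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(0)
      months = pd.date_range("2015-01-01", periods=48, freq="MS")
      years = np.asarray((months - months[0]).days) / 365.25
      seasonal = 0.02 * np.sin(2.0 * np.pi * np.asarray(months.month) / 12.0)
      noise = rng.normal(0.0, 0.005, len(months))
      pr = 0.85 * (1.0 - 0.008 * years) + seasonal + noise       # assumed -0.8 %/yr decline

      endog = pd.Series(pr, index=months, name="performance_ratio")
      exog = pd.DataFrame({"years": years}, index=months)

      # AR(1) errors absorb short-term correlation; the exog coefficient is the trend.
      res = ARIMA(endog, exog=exog, order=(1, 0, 0), trend="c").fit()
      slope = res.params["years"]                                # PR units per year
      print(f"estimated degradation rate: {100 * slope / endog.iloc[:12].mean():.2f} %/yr")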

  7. Vulcain engine tests prove reliability

    NASA Astrophysics Data System (ADS)

    Covault, Craig

    1994-04-01

    The development of the oxygen/hydrogen Vulcain first-stage engine for the Ariane 5 involves more than 30 European companies and $1.19 billion. These companies are using existing technology to produce a low-cost system with high thrust and reliability. This article describes ground tests of the engine and provides a comparison of the Vulcain's capabilities with those of other systems. A list of key Vulcain team members is also given.

  8. Measuring and comparing evolvability and constraint in multivariate characters.

    PubMed

    Hansen, T F; Houle, D

    2008-09-01

    The Lande equation forms the basis for our understanding of the short-term evolution of quantitative traits in a multivariate context. It predicts the response to selection as the product of an additive genetic variance matrix and a selection gradient. The selection gradient approximates the force and direction of selection, and the genetic variance matrix quantifies the role of the genetic system in evolution. Attempts to understand the evolutionary significance of the genetic variance matrix are hampered by the fact that the majority of the methods used to characterize and compare variance matrices have not been derived in an explicit theoretical context. We use the Lande equation to derive new measures of the ability of a variance matrix to allow or constrain evolution in any direction in phenotype space. Evolvability captures the ability of a population to evolve in the direction of selection when stabilizing selection is absent. Conditional evolvability captures the ability of a population to respond to directional selection in the presence of stabilizing selection on other trait combinations. We then derive measures of character autonomy and integration from these evolvabilities. We study the properties of these measures and show how they can be used to interpret and compare variance matrices. As an illustration, we show that divergence of wing shape in the dipteran family Drosophilidae has proceeded in directions that have relatively high evolvabilities.
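
    A hedged numeric sketch of these measures follows. The G matrix and selection gradient are invented; only the formulas (response dz = Gb from the Lande equation, evolvability e = b'Gb, conditional evolvability c = 1/(b'G^-1 b) for a unit-length b, and autonomy a = c/e) follow the definitions discussed above.

      # Evolvability measures for an invented 3-trait additive genetic variance matrix G
      # and a unit-length selection gradient b (illustration only).
      import numpy as np

      G = np.array([[1.0, 0.6, 0.2],
                    [0.6, 1.5, 0.4],
                    [0.2, 0.4, 0.8]])        # hypothetical, symmetric, positive definite

      b = np.array([1.0, 1.0, 0.0])
      b = b / np.linalg.norm(b)              # direction of selection, unit length

      dz = G @ b                             # predicted response (Lande equation)
      e = b @ G @ b                          # evolvability: variance along b
      c = 1.0 / (b @ np.linalg.inv(G) @ b)   # conditional evolvability under stabilizing
                                             # selection on other trait combinations
      a = c / e                              # autonomy: fraction of e retained

      print(f"response dz = {np.round(dz, 3)}")
      print(f"evolvability e = {e:.3f}, conditional c = {c:.3f}, autonomy a = {a:.3f}")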

  9. The fitness and functionality of culturally evolved communication systems.

    PubMed

    Fay, Nicolas; Garrod, Simon; Roberts, Leo

    2008-11-12

    This paper assesses whether human communication systems undergo the same progressive adaptation seen in animal communication systems and concrete artefacts. Four experiments compared the fitness of ad hoc sign systems created under different conditions when participants play a graphical communication task. Experiment 1 demonstrated that when participants are organized into interacting communities, a series of signs evolve that enhance individual learning and promote efficient decoding. No such benefits are found for signs that result from the local interactions of isolated pairs of interlocutors. Experiments 2 and 3 showed that the decoding benefits associated with community evolved signs cannot be attributed to superior sign encoding or detection. Experiment 4 revealed that naive overseers were better able to identify the meaning of community evolved signs when compared with isolated pair developed signs. Hence, the decoding benefits for community evolved signs arise from their greater residual iconicity. We argue that community evolved sign systems undergo a process of communicative selection and adaptation that promotes optimized sign systems. This results from the interplay between sign diversity and a global alignment constraint; pairwise interaction introduces a range of competing signs and the need to globally align on a single sign-meaning mapping for each referent applies selection pressure.

  10. The fitness and functionality of culturally evolved communication systems.

    PubMed

    Fay, Nicolas; Garrod, Simon; Roberts, Leo

    2008-11-12

    This paper assesses whether human communication systems undergo the same progressive adaptation seen in animal communication systems and concrete artefacts. Four experiments compared the fitness of ad hoc sign systems created under different conditions when participants play a graphical communication task. Experiment 1 demonstrated that when participants are organized into interacting communities, a series of signs evolve that enhance individual learning and promote efficient decoding. No such benefits are found for signs that result from the local interactions of isolated pairs of interlocutors. Experiments 2 and 3 showed that the decoding benefits associated with community evolved signs cannot be attributed to superior sign encoding or detection. Experiment 4 revealed that naive overseers were better able to identify the meaning of community evolved signs when compared with isolated pair developed signs. Hence, the decoding benefits for community evolved signs arise from their greater residual iconicity. We argue that community evolved sign systems undergo a process of communicative selection and adaptation that promotes optimized sign systems. This results from the interplay between sign diversity and a global alignment constraint; pairwise interaction introduces a range of competing signs and the need to globally align on a single sign-meaning mapping for each referent applies selection pressure. PMID:18799421

  11. How Hierarchical Topics Evolve in Large Text Corpora.

    PubMed

    Cui, Weiwei; Liu, Shixia; Wu, Zhuofeng; Wei, Hao

    2014-12-01

    Using a sequence of topic trees to organize documents is a popular way to represent hierarchical and evolving topics in text corpora. However, following evolving topics in the context of topic trees remains difficult for users. To address this issue, we present an interactive visual text analysis approach to allow users to progressively explore and analyze the complex evolutionary patterns of hierarchical topics. The key idea behind our approach is to exploit a tree cut to approximate each tree and allow users to interactively modify the tree cuts based on their interests. In particular, we propose an incremental evolutionary tree cut algorithm with the goal of balancing 1) the fitness of each tree cut and the smoothness between adjacent tree cuts; 2) the historical and new information related to user interests. A time-based visualization is designed to illustrate the evolving topics over time. To preserve the mental map, we develop a stable layout algorithm. As a result, our approach can quickly guide users to progressively gain profound insights into evolving hierarchical topics. We evaluate the effectiveness of the proposed method on Amazon's Mechanical Turk and real-world news data. The results show that users are able to successfully analyze evolving topics in text data.

  12. The value of monitoring to control evolving populations

    PubMed Central

    Fischer, Andrej; Mustonen, Ville

    2015-01-01

    Populations can evolve to adapt to external changes. The capacity to evolve and adapt makes successful treatment of infectious diseases and cancer difficult. Indeed, therapy resistance has become a key challenge for global health. Therefore, ideas of how to control evolving populations to overcome this threat are valuable. Here we use the mathematical concepts of stochastic optimal control to study what is needed to control evolving populations. Following established routes to calculate control strategies, we first study how a polymorphism can be maintained in a finite population by adaptively tuning selection. We then introduce a minimal model of drug resistance in a stochastically evolving cancer cell population and compute adaptive therapies. When decisions are in this manner based on monitoring the response of the tumor, this can outperform established therapy paradigms. For both case studies, we demonstrate the importance of high-resolution monitoring of the target population to achieve a given control objective, thus quantifying the intuition that to control, one must monitor. PMID:25587136
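
    The sketch below is not the paper's stochastic optimal control; it is a deliberately crude stand-in in which a Wright-Fisher population is monitored every generation and a proportional feedback rule tunes the selection coefficient to hold an allele near 50% frequency. All parameters are invented; the point is only to illustrate the intuition that to control, one must monitor.

      # Toy feedback control of a Wright-Fisher polymorphism (monitor, then tune selection).
      import numpy as np

      rng = np.random.default_rng(42)
      N, target, gain = 500, 0.5, 2.0        # population size, target frequency, feedback gain
      x = 0.2                                # initial allele frequency

      for generation in range(200):
          s = gain * (target - x)            # monitoring step: set selection from observed x
          p = x * (1.0 + s) / (x * (1.0 + s) + (1.0 - x))   # deterministic selection update
          x = rng.binomial(N, p) / N         # genetic drift: binomial sampling of N offspring

      print(f"allele frequency after 200 generations: {x:.2f} (target {target})")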

  13. How Hierarchical Topics Evolve in Large Text Corpora.

    PubMed

    Cui, Weiwei; Liu, Shixia; Wu, Zhuofeng; Wei, Hao

    2014-12-01

    Using a sequence of topic trees to organize documents is a popular way to represent hierarchical and evolving topics in text corpora. However, following evolving topics in the context of topic trees remains difficult for users. To address this issue, we present an interactive visual text analysis approach to allow users to progressively explore and analyze the complex evolutionary patterns of hierarchical topics. The key idea behind our approach is to exploit a tree cut to approximate each tree and allow users to interactively modify the tree cuts based on their interests. In particular, we propose an incremental evolutionary tree cut algorithm with the goal of balancing 1) the fitness of each tree cut and the smoothness between adjacent tree cuts; 2) the historical and new information related to user interests. A time-based visualization is designed to illustrate the evolving topics over time. To preserve the mental map, we develop a stable layout algorithm. As a result, our approach can quickly guide users to progressively gain profound insights into evolving hierarchical topics. We evaluate the effectiveness of the proposed method on Amazon's Mechanical Turk and real-world news data. The results show that users are able to successfully analyze evolving topics in text data. PMID:26356942

  14. Defining Requirements for Improved Photovoltaic System Reliability

    SciTech Connect

    Maish, A.B.

    1998-12-21

    Reliable systems are an essential ingredient of any technology progressing toward commercial maturity and large-scale deployment. This paper defines reliability as meeting system functional requirements, and then develops a framework to understand and quantify photovoltaic system reliability based on initial and ongoing costs and system value. The core elements necessary to achieve reliable PV systems are reviewed. These include appropriate system design, satisfactory component reliability, and proper installation and servicing. Reliability status, key issues, and present needs in system reliability are summarized for four application sectors.

  15. Reliability Models and Attributable Risk

    NASA Technical Reports Server (NTRS)

    Jarvinen, Richard D.

    1999-01-01

    The intention of this report is to bring a developing and extremely useful statistical methodology, attributable risk, to greater attention within the Safety, Reliability, and Quality Assurance Office of the NASA Johnson Space Center. Recently that office has supported efforts to bring methods of medical research statistics dealing with human survival to bear on aerospace problems concerning the reliability of component hardware used in the NASA space program; this report, which describes several study designs in which attributable risk is used, is in concert with those goals. The report identifies areas of active research in attributable risk while briefly describing much of what has been developed in its theory. Largely a report on a report, it attempts to recast the medical setting and language commonly found in descriptions of attributable risk into the setting and language of the space program and its component hardware.
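
    As a hedged illustration of the core quantity (not the report's study designs or data), the snippet below computes Levin's population attributable risk, recast in hardware terms: given the fraction of components exposed to some stress condition and the relative risk of failure for exposed components, it returns the fraction of all failures attributable to that exposure. The exposure prevalence and relative risk are invented.

      # Levin's population attributable risk, recast from epidemiology into hardware terms.
      def population_attributable_risk(p_exposed: float, relative_risk: float) -> float:
          """Fraction of all failures attributable to the exposure (Levin's formula)."""
          excess = p_exposed * (relative_risk - 1.0)
          return excess / (1.0 + excess)

      # Example: 30% of components see a stress condition that triples their failure risk.
      par = population_attributable_risk(0.30, 3.0)
      print(f"{par:.1%} of failures attributable to the stress condition")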

  16. Reliability in individual monitoring service.

    PubMed

    Mod Ali, N

    2011-03-01

    As a laboratory certified to ISO 9001:2008 and accredited to ISO/IEC 17025, the Secondary Standard Dosimetry Laboratory (SSDL)-Nuclear Malaysia has incorporated a comprehensive system for technical and quality management in promoting a reliable individual monitoring service (IMS). Faster identification and resolution of issues regarding dosemeter preparation and issuing of reports, personnel enhancement, improved customer satisfaction and overall efficiency of laboratory activities are all results of the implementation of an effective quality system. Review of these measures and responses to observed trends provide continuous improvement of the system. By having these mechanisms, the reliability of the IMS can be assured, promoting safe behaviour at all levels of the workforce utilising ionising radiation facilities. Upgrading the reporting program through a web-based e-SSDL marks a major improvement in the overall reliability of Nuclear Malaysia's IMS. The system is a vital step in providing a user-friendly and effective occupational exposure evaluation program in the country. It provides a higher level of confidence in the results generated for occupational dose monitoring, thus enhancing the status of the country's radiation protection framework.

  17. Reliability in individual monitoring service.

    PubMed

    Mod Ali, N

    2011-03-01

    As a laboratory certified to ISO 9001:2008 and accredited to ISO/IEC 17025, the Secondary Standard Dosimetry Laboratory (SSDL)-Nuclear Malaysia has incorporated a comprehensive system for technical and quality management in promoting a reliable individual monitoring service (IMS). Faster identification and resolution of issues regarding dosemeter preparation and issuing of reports, personnel enhancement, improved customer satisfaction and overall efficiency of laboratory activities are all results of the implementation of an effective quality system. Review of these measures and responses to observed trends provide continuous improvement of the system. By having these mechanisms, the reliability of the IMS can be assured, promoting safe behaviour at all levels of the workforce utilising ionising radiation facilities. Upgrading the reporting program through a web-based e-SSDL marks a major improvement in the overall reliability of Nuclear Malaysia's IMS. The system is a vital step in providing a user-friendly and effective occupational exposure evaluation program in the country. It provides a higher level of confidence in the results generated for occupational dose monitoring, thus enhancing the status of the country's radiation protection framework. PMID:21147789

  18. Field-evolved insect resistance to Bt crops: definition, theory, and data.

    PubMed

    Tabashnik, Bruce E; Van Rensburg, J B J; Carrière, Yves

    2009-12-01

    Transgenic crops producing Bacillus thuringiensis (Bt) toxins for insect pest control have been successful, but their efficacy is reduced when pests evolve resistance. Here we review the definition of field-evolved resistance, the relationship between resistance and field control problems, the theory underlying strategies for delaying resistance, and resistance monitoring methods. We also analyze resistance monitoring data from five continents reported in 41 studies that evaluate responses of field populations of 11 lepidopteran pests to four Bt toxins produced by Bt corn and cotton. After more than a decade since initial commercialization of Bt crops, most target pest populations remain susceptible, whereas field-evolved resistance has been documented in some populations of three noctuid moth species: Spodoptera frugiperda (J. E. Smith) to Cry1F in Bt corn in Puerto Rico, Busseola fusca (Fuller) to Cry1Ab in Bt corn in South Africa, and Helicoverpa zea (Boddie) to Cry1Ac and Cry2Ab in Bt cotton in the southeastern United States. Field outcomes are consistent with predictions from theory, suggesting that factors delaying resistance include recessive inheritance of resistance, abundant refuges of non-Bt host plants, and two-toxin Bt crops deployed separately from one-toxin Bt crops. The insights gained from systematic analyses of resistance monitoring data may help to enhance the durability of transgenic insecticidal crops. We recommend continued use of the longstanding definition of resistance cited here and encourage discussions about which regulatory actions, if any, should be triggered by specific data on the magnitude, distribution, and impact of field-evolved resistance.

  19. High density lipoprotein cholesterol: an evolving target of therapy in the management of cardiovascular disease

    PubMed Central

    Kapur, Navin K; Ashen, Dominique; Blumenthal, Roger S

    2008-01-01

    Since the pioneering work of John Gofman in the 1950s, our understanding of high density lipoprotein cholesterol (HDL-C) and its relationship to coronary heart disease (CHD) has grown substantially. Numerous clinical trials since the Framingham Study in 1977 have demonstrated an inverse relationship between HDL-C and one’s risk of developing CHD. Over the past two decades, preclinical research has gained further insight into the nature of HDL-C metabolism, specifically regarding the ability of HDL-C to promote reverse cholesterol transport (RCT). Recent attempts to harness HDL’s ability to enhance RCT have revealed the complexity of HDL-C metabolism. This review provides a detailed update on HDL-C as an evolving therapeutic target in the management of cardiovascular disease. PMID:18629371

  20. Autonomous Agent-Based Systems and Their Applications in Fluid Dynamics, Particle Separation, and Co-evolving Networks

    NASA Astrophysics Data System (ADS)

    Graeser, Oliver

    This thesis comprises three parts, reporting research results in Fluid Dynamics (Part I), Particle Separation (Part II) and Co-evolving Networks (Part III). Part I deals with the simulation of fluid dynamics using the lattice-Boltzmann method. Microfluidic devices often feature two-dimensional, repetitive arrays. Flows through such devices are pressure-driven and confined by solid walls. We have defined new adaptive generalised periodic boundary conditions to represent the effects of outer solid walls, and are thus able to exploit the periodicity of the array by simulating the flow through one unit cell in lieu of the entire device. The fully developed flow calculated in this way describes the flow through the entire array accurately, but with computational requirements that are reduced according to the dimensions of the array. Part II discusses the problem of separating macromolecules like proteins or DNA coils. The reliable separation of such molecules is a crucial task in molecular biology. The use of Brownian ratchets as mechanisms for the separation of such particles has been proposed and discussed during the last decade. Pressure-driven flows have so far been dismissed as possible driving forces for Brownian ratchets, as they do not generate ratchet asymmetry. We propose a microfluidic design that uses pressure-driven flows to create asymmetry and hence allows particle separation. The dependence of the asymmetry on various factors of the microfluidic geometry is discussed. We further exemplify the feasibility of our approach using Brownian dynamics simulations of particles of different sizes in such a device. The results show that ratchet-based particle separation using flows as the driving force is possible. Simulation results and ratchet theory predictions are in excellent agreement. Part III deals with the co-evolution of networks and dynamic models. A group of agents occupies the nodes of a network, which defines the relationship between these agents. The

  1. A nonspecific defensive compound evolves into a competition avoidance cue and a female sex pheromone

    PubMed Central

    Weiss, Ingmar; Rössler, Thomas; Hofferberth, John; Brummer, Michael; Ruther, Joachim; Stökl, Johannes

    2013-01-01

    The evolution of chemical communication and the origin of pheromones are among the most challenging issues in chemical ecology. Current theory predicts that chemical communication can arise from compounds primarily evolved for non-communicative purposes but experimental evidence showing a gradual evolution of non-informative compounds into cues and true signals is scarce. Here we report that females of the parasitic wasp Leptopilina heterotoma use the defensive compound (−)-iridomyrmecin as a semiochemical cue to avoid interference with con- and heterospecific competitors and as the main component of a species-specific sex pheromone. Although competition avoidance is mediated by (−)-iridomyrmecin alone, several structurally related minor compounds are necessary for reliable mate attraction and recognition. Our findings provide insights into the evolution of insect pheromones by demonstrating that the increasing specificity of chemical information is accompanied by an increasing complexity of the chemical messengers involved and the evolution of the chemosensory adaptations for their exploitation. PMID:24231727

  2. How is cyber threat evolving and what do organisations need to consider?

    PubMed

    Borrett, Martin; Carter, Roger; Wespi, Andreas

    Organisations and members of the public are becoming accustomed to the increasing velocity, frequency and variety of cyber-attacks that they have been facing over the last few years. In response to this challenge, it is important to explore what can be done to offer commercial and private users a reliable and functioning environment. This paper discusses how cyber threats might evolve in the future and seeks to explore these threats more fully. Attention is paid to the changing nature of cyber-attackers and their motivations and what this means for organisations. Finally, useful and actionable steps are provided, which practitioners can use to understand how they can start to address the future challenges of cyber security. PMID:24457327

  3. How is cyber threat evolving and what do organisations need to consider?

    PubMed

    Borrett, Martin; Carter, Roger; Wespi, Andreas

    Organisations and members of the public are becoming accustomed to the increasing velocity, frequency and variety of cyber-attacks that they have been facing over the last few years. In response to this challenge, it is important to explore what can be done to offer commercial and private users a reliable and functioning environment. This paper discusses how cyber threats might evolve in the future and seeks to explore these threats more fully. Attention is paid to the changing nature of cyber-attackers and their motivations and what this means for organisations. Finally, useful and actionable steps are provided, which practitioners can use to understand how they can start to address the future challenges of cyber security.

  4. A nonspecific defensive compound evolves into a competition avoidance cue and a female sex pheromone.

    PubMed

    Weiss, Ingmar; Rössler, Thomas; Hofferberth, John; Brummer, Michael; Ruther, Joachim; Stökl, Johannes

    2013-01-01

    The evolution of chemical communication and the origin of pheromones are among the most challenging issues in chemical ecology. Current theory predicts that chemical communication can arise from compounds primarily evolved for non-communicative purposes but experimental evidence showing a gradual evolution of non-informative compounds into cues and true signals is scarce. Here we report that females of the parasitic wasp Leptopilina heterotoma use the defensive compound (-)-iridomyrmecin as a semiochemical cue to avoid interference with con- and heterospecific competitors and as the main component of a species-specific sex pheromone. Although competition avoidance is mediated by (-)-iridomyrmecin alone, several structurally related minor compounds are necessary for reliable mate attraction and recognition. Our findings provide insights into the evolution of insect pheromones by demonstrating that the increasing specificity of chemical information is accompanied by an increasing complexity of the chemical messengers involved and the evolution of the chemosensory adaptations for their exploitation. PMID:24231727

  5. Evolving mobile robots able to display collective behaviors.

    PubMed

    Baldassarre, Gianluca; Nolfi, Stefano; Parisi, Domenico

    2003-01-01

    We present a set of experiments in which simulated robots are evolved for the ability to aggregate and move together toward a light target. By developing and using quantitative indexes that capture the structural properties of the emerged formations, we show that evolved individuals display interesting behavioral patterns in which groups of robots act as a single unit. Moreover, evolved groups of robots with identical controllers display primitive forms of situated specialization and play different behavioral functions within the group according to the circumstances. Overall, the results presented in the article demonstrate that evolutionary techniques, by exploiting the self-organizing behavioral properties that emerge from the interactions between the robots and between the robots and the environment, are a powerful method for synthesizing collective behavior. PMID:14556687

  6. Synthesis of Evolving Cells for Reconfigurable Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Padayachee, J.; Bright, G.

    2014-07-01

    The concept of Reconfigurable Manufacturing Systems (RMSs) was formulated due to the global necessity for production systems that are able to economically evolve according to changes in markets and products. Technologies and design methods are under development to enable RMSs to exhibit transformable system layouts, reconfigurable processes, cells and machines. Existing factory design methods and software have not yet advanced to include reconfigurable manufacturing concepts. This paper presents the underlying group technology framework for the design of manufacturing cells that are able to evolve according to a changing product mix by mechanisms of reconfiguration. The framework is based on a Norton-Bass forecast and time-variant BOM models. An adaptation of legacy group technology methods is presented for the synthesis of evolving cells, and two optimization problems are presented within this context.

  7. Evolving Lorentzian wormholes supported by phantom matter and cosmological constant

    SciTech Connect

    Cataldo, Mauricio; Campo, Sergio del; Minning, Paul; Salgado, Patricio

    2009-01-15

    In this paper we study the possibility of sustaining an evolving wormhole via exotic matter made of phantom energy in the presence of a cosmological constant. We derive analytical evolving wormhole geometries by supposing that the radial tension of the phantom matter, which is the negative of the radial pressure, and the pressure measured in the tangential directions have barotropic equations of state with constant state parameters. In this case the presence of a cosmological constant ensures accelerated expansion of the wormhole configurations. More specifically, for positive cosmological constant we have wormholes which expand forever and, for negative cosmological constant, we have wormholes which expand to a maximum value and then recollapse. At spatial infinity the energy density and the pressures of the anisotropic phantom matter threading the wormholes vanish; thus these evolving wormholes are asymptotically vacuum Λ-Friedmann models with either open or closed or flat topologies.

  8. Evolving Systems: An Outcome of Fondest Hopes and Wildest Dreams

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Balas, Mark J.

    2012-01-01

    New theory is presented for evolving systems, which are autonomously controlled subsystems that self-assemble into a new evolved system with a higher purpose. Evolving systems of aerospace structures often require additional control when assembling to maintain stability during the entire evolution process. This is the concept of Adaptive Key Component Control that operates through one specific component to maintain stability during the evolution. In addition, this control must often overcome persistent disturbances that occur while the evolution is in progress. Theoretical results will be presented for Adaptive Key Component control for persistent disturbance rejection. An illustrative example will demonstrate the Adaptive Key Component controller on a system composed of rigid body and flexible body modes.

  9. Topology of Coronal Fields from Evolving Magnetofrictional Models

    NASA Astrophysics Data System (ADS)

    DeRosa, Marc L.; Cheung, M.

    2012-05-01

    The evolving magnetofrictional (MF) scheme enables the construction of time-dependent models of the active region coronal magnetic field in response to photospheric driving. When advancing such models, only the magnetic induction equation is solved, with the velocity at each point assumed to be oriented parallel to the Lorentz force. This leads the field to evolve toward a force-free state. We present results from an evolving MF model of NOAA AR11158 using driving from time sequences of SDO/HMI data. Utilizing this simulation, we investigate changes in magnetic configurations and topology, including the number of null points, the evolution of quasi-separatrix layers, and the time history of total and free magnetic energies as well as relative helicity. This work seeks to elucidate the relation(s) between topological and energetic properties of the AR.

  10. Hybridization Reveals the Evolving Genomic Architecture of Speciation

    PubMed Central

    Kronforst, Marcus R.; Hansen, Matthew E.B.; Crawford, Nicholas G.; Gallant, Jason R.; Zhang, Wei; Kulathinal, Rob J.; Kapan, Durrell D.; Mullen, Sean P.

    2014-01-01

    The rate at which genomes diverge during speciation is unknown, as are the physical dynamics of the process. Here, we compare full genome sequences of 32 butterflies, representing five species from a hybridizing Heliconius butterfly community, to examine genome-wide patterns of introgression and infer how divergence evolves during the speciation process. Our analyses reveal that initial divergence is restricted to a small fraction of the genome, largely clustered around known wing-patterning genes. Over time, divergence evolves rapidly, due primarily to the origin of new divergent regions. Furthermore, divergent genomic regions display signatures of both selection and adaptive introgression, demonstrating the link between microevolutionary processes acting within species and the origin of species across macroevolutionary timescales. Our results provide a uniquely comprehensive portrait of the evolving species boundary due to the role that hybridization plays in reducing the background accumulation of divergence at neutral sites. PMID:24183670

  11. The evolved basis and adaptive functions of cognitive distortions.

    PubMed

    Gilbert, P

    1998-12-01

    This paper explores common cognitive distortions from the perspective of evolutionary psychology. It is suggested that cognitive distortions are natural consequences of using fast track defensive algorithms that are sensitive to threat. In various contexts, especially those of threat, humans evolved to think adaptively rather than logically. Hence cognitive distortions are not strictly errors in brain functioning and it can be useful to inform patients that 'negative thinking' may be dysfunctional but is a reflection of basic brain design and not personal irrationality. The evolved nature of cognitive distortions has been implicit in cognitive therapy from its early days (Beck, 1963; Ellis, 1962) but has not been fully articulated in what is now known about evolved mental processes. Many forms of cognitive distortion can be seen to use the (previously) adaptive heuristic of better safe than sorry.

  12. Reliability Testing Procedure for MEMS IMUs Applied to Vibrating Environments

    PubMed Central

    De Pasquale, Giorgio; Somà, Aurelio

    2010-01-01

    The diffusion of micro electro-mechanical systems (MEMS) technology applied to navigation systems is rapidly increasing, but currently, there is a lack of knowledge about the reliability of this typology of devices, representing a serious limitation to their use in aerospace vehicles and other fields with medium and high requirements. In this paper, a reliability testing procedure for inertial sensors and inertial measurement units (IMU) based on MEMS for applications in vibrating environments is presented. The sensing performances were evaluated in terms of signal accuracy, systematic errors, and accidental errors; the actual working conditions were simulated by means of an accelerated dynamic excitation. A commercial MEMS-based IMU was analyzed to validate the proposed procedure. The main weaknesses of the system have been localized by providing important information about the relationship between the reliability levels of the system and individual components. PMID:22315550

  13. Methods to Improve Reliability of Video Recorded Behavioral Data

    PubMed Central

    Haidet, Kim Kopenhaver; Tate, Judith; Divirgilio-Thomas, Dana; Kolanowski, Ann; Happ, Mary Beth

    2009-01-01

    Behavioral observation is a fundamental component of nursing practice and a primary source of clinical research data. The use of video technology in behavioral research offers important advantages to nurse scientists in assessing complex behaviors and relationships between behaviors. The appeal of using this method should be balanced, however, by an informed approach to reliability issues. In this paper, we focus on factors that influence reliability, such as the use of sensitizing sessions to minimize participant reactivity and the importance of training protocols for video coders. In addition, we discuss data quality, the selection and use of observational tools, calculating reliability coefficients, and coding considerations for special populations based on our collective experiences across three different populations and settings. PMID:19434651

  14. Mechanical system reliability for long life space systems

    NASA Technical Reports Server (NTRS)

    Kowal, Michael T.

    1994-01-01

    The creation of a compendium of mechanical limit states was undertaken in order to provide a reference base for the application of first-order reliability methods to mechanical systems in the context of the development of a system level design methodology. The compendium was conceived as a reference source specific to the problem of developing the noted design methodology, and not an exhaustive or exclusive compilation of mechanical limit states. The compendium is not intended to be a handbook of mechanical limit states for general use. The compendium provides a diverse set of limit-state relationships for use in demonstrating the application of probabilistic reliability methods to mechanical systems. The compendium is to be used in the reliability analysis of moderately complex mechanical systems.

  15. Active Printed Materials for Complex Self-Evolving Deformations

    NASA Astrophysics Data System (ADS)

    Raviv, Dan; Zhao, Wei; McKnelly, Carrie; Papadopoulou, Athina; Kadambi, Achuta; Shi, Boxin; Hirsch, Shai; Dikovsky, Daniel; Zyracki, Michael; Olguin, Carlos; Raskar, Ramesh; Tibbits, Skylar

    2014-12-01

    We propose a new design of complex self-evolving structures that vary over time due to environmental interaction. In conventional 3D printing systems, materials are meant to be stable rather than active and fabricated models are designed and printed as static objects. Here, we introduce a novel approach for simulating and fabricating self-evolving structures that transform into a predetermined shape, changing property and function after fabrication. The new locally coordinated bending primitives combine into a single system, allowing for a global deformation which can stretch, fold and bend given environmental stimulus.

  16. Active printed materials for complex self-evolving deformations.

    PubMed

    Raviv, Dan; Zhao, Wei; McKnelly, Carrie; Papadopoulou, Athina; Kadambi, Achuta; Shi, Boxin; Hirsch, Shai; Dikovsky, Daniel; Zyracki, Michael; Olguin, Carlos; Raskar, Ramesh; Tibbits, Skylar

    2014-12-18

    We propose a new design of complex self-evolving structures that vary over time due to environmental interaction. In conventional 3D printing systems, materials are meant to be stable rather than active and fabricated models are designed and printed as static objects. Here, we introduce a novel approach for simulating and fabricating self-evolving structures that transform into a predetermined shape, changing property and function after fabrication. The new locally coordinated bending primitives combine into a single system, allowing for a global deformation which can stretch, fold and bend given environmental stimulus.

  17. The cartography of pain: the evolving contribution of pain maps.

    PubMed

    Schott, Geoffrey D

    2010-09-01

    Pain maps are nowadays widely used in clinical practice. This article aims to critically review the fundamental principles that underlie the mapping of pain, to analyse the evolving iconography of pain maps and their sometimes straightforward and sometimes contentious nature when used in the clinic, and to draw attention to some more recent developments in mapping pain. It is concluded that these maps are intriguing and evolving cartographic tools which can be used for depicting not only the spatial features but also the interpretative or perceptual components and accompaniments of pain.

  18. Interrater Reliability of Risk Matrix 2000/s.

    PubMed

    Wakeling, Helen C; Mann, Ruth E; Milner, Rebecca J

    2011-01-01

    Actuarial risk assessment instruments for sexual offenders are often used in high-stakes decision making and therefore should be subject to stringent reliability and validity testing. Furthermore, those involved in the risk assessment of sexual offenders should be aware of the factors that may affect the reliability of these instruments. The present study examined the interrater reliability of the Risk Matrix 2000/s between one field rater and one independent rater with a sample of more than 100 sexual offenders. The results indicated good interrater reliability of the tool, although reliability varies from item to item. A number of factors were identified that seem to reduce the reliability of scoring. The present findings are strengthened by examining interrater reliability of the tool in the usual practitioner context and by calculating a range of reliability statistics. Strategies are suggested to increase reliability in the use of actuarial tools in routine practice. PMID:21216783
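
    The study reports "a range of reliability statistics" without listing them in the abstract, so the sketch below is only a hedged example of two statistics commonly used for a two-rater design: raw percent agreement and Cohen's kappa (unweighted and quadratically weighted). The item scores for the field rater and the independent rater are invented.

      # Percent agreement and Cohen's kappa for two raters' ordinal item scores (invented data).
      import numpy as np
      from sklearn.metrics import cohen_kappa_score

      field_rater = np.array([0, 1, 2, 1, 0, 3, 2, 2, 1, 0, 1, 2])
      indep_rater = np.array([0, 1, 2, 2, 0, 3, 2, 1, 1, 0, 1, 2])

      agreement = np.mean(field_rater == indep_rater)
      kappa = cohen_kappa_score(field_rater, indep_rater)
      w_kappa = cohen_kappa_score(field_rater, indep_rater, weights="quadratic")

      print(f"agreement {agreement:.0%}, kappa {kappa:.2f}, weighted kappa {w_kappa:.2f}")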

  19. Interrater reliability of Risk Matrix 2000/s.

    PubMed

    Wakeling, Helen C; Mann, Ruth E; Milner, Rebecca J

    2011-12-01

    Actuarial risk assessment instruments for sexual offenders are often used in high-stakes decision making and therefore should be subject to stringent reliability and validity testing. Furthermore, those involved in the risk assessment of sexual offenders should be aware of the factors that may affect the reliability of these instruments. The present study examined the interrater reliability of the Risk Matrix 2000/s between one field rater and one independent rater with a sample of more than 100 sexual offenders. The results indicated good interrater reliability of the tool, although reliability varies from item to item. A number of factors were identified that seem to reduce the reliability of scoring. The present findings are strengthened by examining interrater reliability of the tool in the usual practitioner context and by calculating a range of reliability statistics. Strategies are suggested to increase reliability in the use of actuarial tools in routine practice. PMID:22114173

  20. Apogee motor rocketry reliability improvements

    NASA Technical Reports Server (NTRS)

    Behm, J.; Dowler, W.; Gin, W.

    1974-01-01

    Since 1963, solid propellant apogee motors have been placing satellites into geosynchronous orbits. Major technological breakthroughs are not required to satisfy future mission requirements; however, there is a need to improve reliability to enhance cost effectiveness. Several management test options are discussed. A summary of results and conclusions derived from review of missions, where failure of a solid motor was inferred, and correlation of system factors with failures are reported. Highlights of a solid motor diagnostic instrumentation study are presented. Finally, recommendations are provided for areas of future apogee motor upgrade, which will increase project cost effectiveness by reducing the potential for future flight failures.

  1. Reliability-based casing design

    SciTech Connect

    Maes, M.A.; Gulati, K.C.; Johnson, R.C.; McKenna, D.L.; Brand, P.R.; Lewis, D.B.

    1995-06-01

    The present paper describes the development of reliability-based design criteria for oil and/or gas well casing/tubing. The approach is based on the fundamental principles of limit state design. Limit states for tubulars are discussed and specific techniques for the stochastic modeling of loading and resistance variables are described. Zonation methods and calibration techniques are developed which are geared specifically to the characteristic tubular design for both hydrocarbon drilling and production applications. The application of quantitative risk analysis to the development of risk-consistent design criteria is shown to be a major and necessary step forward in achieving more economic tubular design.
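
    A toy limit-state calculation in the spirit of the approach described above is sketched below. None of the paper's load/resistance models, distributions, or calibration data are used: burst resistance and internal-pressure load are simply treated as lognormal variables with invented parameters, and the probability of failure P(R < L) is estimated by Monte Carlo.

      # Monte Carlo estimate of the failure probability for a simple burst limit state g = R - L.
      import numpy as np

      rng = np.random.default_rng(7)
      n = 1_000_000

      # Invented lognormal models (medians in psi, log-standard deviations):
      R = rng.lognormal(mean=np.log(9000.0), sigma=0.08, size=n)   # burst resistance
      L = rng.lognormal(mean=np.log(6500.0), sigma=0.12, size=n)   # internal pressure load

      g = R - L                              # limit-state function: failure when g < 0
      pf = np.mean(g < 0.0)
      print(f"estimated probability of failure: {pf:.2e}")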

  2. [A score in transference. BIP--experiencing the relationship in psychoanalysis].

    PubMed

    Herold, R

    1998-08-01

    The (emotional) experience of the relationship in psychoanalyses, presented here, was developed by the author as a continuation of a method devised by Gill and Hoffman (1982) in Chicago, which they termed "The Patient's Experience of the Relationship with the Therapist", to describe the course of analytical work on patient resistance against transference. The "experience of the relationship in psychoanalyses" is a rating method whose two manuals link quantitative methods (empirical in the conventional sense) with qualitative, clinical-hermeneutical research approaches. After reliable coding of categorical data from a tape-recorded transcription, a condensed description of the course and a clinical comment are obtained, the graphic representation of which reads like a score of the transference. PMID:9745320

  3. Research at the Crossroads: How Intellectual Initiatives across Disciplines Evolve

    ERIC Educational Resources Information Center

    Frost, Susan H.; Jean, Paul M.; Teodorescu, Daniel; Brown, Amy B.

    2004-01-01

    How do intellectual initiatives across disciplines evolve? This qualitative case study of 11 interdisciplinary research initiatives at Emory University identifies key factors in their development: the passionate commitments of scholarly leaders, the presence of strong collegial networks, access to timely and multiple resources, flexible practices,…

  4. The Evolving Status of Photojournalism Education. ERIC Digest.

    ERIC Educational Resources Information Center

    Cookman, Claude

    Noting that new technologies are resulting in extensive changes in the field of photojournalism, both as it is practiced and taught, this Digest reviews this rapidly evolving field of education and professional practice. It discusses what digital photography is; the history of digital photography; how digital photography has changed…

  5. Coevolution Drives the Emergence of Complex Traits and Promotes Evolvability

    PubMed Central

    Zaman, Luis; Meyer, Justin R.; Devangam, Suhas; Bryson, David M.; Lenski, Richard E.; Ofria, Charles

    2014-01-01

    The evolution of complex organismal traits is obvious as a historical fact, but the underlying causes—including the role of natural selection—are contested. Gould argued that a random walk from a necessarily simple beginning would produce the appearance of increasing complexity over time. Others contend that selection, including coevolutionary arms races, can systematically push organisms toward more complex traits. Methodological challenges have largely precluded experimental tests of these hypotheses. Using the Avida platform for digital evolution, we show that coevolution of hosts and parasites greatly increases organismal complexity relative to that otherwise achieved. As parasites evolve to counter the rise of resistant hosts, parasite populations retain a genetic record of past coevolutionary states. As a consequence, hosts differentially escape by performing progressively more complex functions. We show that coevolution's unique feedback between host and parasite frequencies is a key process in the evolution of complexity. Strikingly, the hosts evolve genomes that are also more phenotypically evolvable, similar to the phenomenon of contingency loci observed in bacterial pathogens. Because coevolution is ubiquitous in nature, our results support a general model whereby antagonistic interactions and natural selection together favor both increased complexity and evolvability. PMID:25514332

  6. Evolving Nature of Sexual Orientation and Gender Identity

    ERIC Educational Resources Information Center

    Jourian, T. J.

    2015-01-01

    This chapter discusses the historical and evolving terminology, constructs, and ideologies that inform the language used by those who are lesbian, gay, bisexual, and same-gender loving, who may identify as queer, as well as those who are members of trans* communities from multiple and intersectional perspectives.

  7. Optimists' Creed: Brave New Cyberlearning, Evolving Utopias (Circa 2041)

    ERIC Educational Resources Information Center

    Burleson, Winslow; Lewis, Armanda

    2016-01-01

    This essay imagines the role that artificial intelligence innovations play in the integrated living, learning and research environments of 2041. Here, in 2041, in the context of increasingly complex wicked challenges, whose solutions by their very nature continue to evade even the most capable experts, society and technology have co-evolved to…

  8. Does evolving the future preclude learning from it?

    PubMed

    Dowrick, Peter W

    2014-08-01

    Despite its considerable length, this article proposes a theory of human behavioral science that eschews half the evidence. There is irony in the title "Evolving the Future" when the featured examples of intentional change represent procedures that build slowly on the past. Has an opportunity been missed, or is an evolutionary perspective simply incompatible with learning from the future? PMID:25162866

  9. Hip Hop Is Now: An Evolving Youth Culture

    ERIC Educational Resources Information Center

    Taylor, Carl; Taylor, Virgil

    2007-01-01

    Emerging from Rap music, Hip Hop has become a lifestyle to many modern youth around the world. Embodying both creativity and controversy, Hip Hop mirrors the values, violence, and hypocrisy of modern culture. The authors dispel some of the simplistic views that surround this evolving youth movement embraced by millions of young people who are…

  10. The Evolving Military Learner Population: A Review of the Literature

    ERIC Educational Resources Information Center

    Ford, Kate; Vignare, Karen

    2015-01-01

    This literature review examines the evolving online military learner population with emphasis on current generation military learners, who are most frequently Post-9/11 veterans. The review synthesizes recent scholarly and grey literature on military learner demographics and attributes, college experiences, and academic outcomes against a backdrop…

  11. Towards Evolving Electronic Circuits for Autonomous Space Applications

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Haith, Gary L.; Colombano, Silvano P.; Stassinopoulos, Dimitris

    2000-01-01

    The relatively new field of Evolvable Hardware studies how simulated evolution can reconfigure, adapt, and design hardware structures in an automated manner. Space applications, especially those requiring autonomy, are potential beneficiaries of evolvable hardware. For example, robotic drilling from a mobile platform requires high-bandwidth controller circuits that are difficult to design. In this paper, we present automated design techniques based on evolutionary search that could potentially be used in such applications. First, we present a method of automatically generating analog circuit designs using evolutionary search and a circuit construction language. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. Using a parallel genetic algorithm, we present experimental results for five design tasks. Second, we investigate the use of coevolution in automated circuit design. We examine fitness evaluation by comparing the effectiveness of four fitness schedules. The results indicate that solution quality is highest with static and co-evolving fitness schedules as compared to the other two dynamic schedules. We discuss these results and offer two possible explanations for the observed behavior: retention of useful information, and alignment of problem difficulty with circuit proficiency.

  12. Do Infants Possess an Evolved Spider-Detection Mechanism?

    ERIC Educational Resources Information Center

    Rakison, David H.; Derringer, Jaime

    2008-01-01

    Previous studies with various non-human animals have revealed that they possess an evolved predator recognition mechanism that specifies the appearance of recurring threats. We used the preferential looking and habituation paradigms in three experiments to investigate whether 5-month-old human infants have a perceptual template for spiders that…

  13. A Conceptual Framework for Evolving, Recommender Online Learning Systems

    ERIC Educational Resources Information Center

    Peiris, K. Dharini Amitha; Gallupe, R. Brent

    2012-01-01

    A comprehensive conceptual framework is developed and described for evolving recommender-driven online learning systems (ROLS). This framework describes how such systems can support students, course authors, course instructors, systems administrators, and policy makers in developing and using these ROLS. The design science information systems…

  14. Today's control systems evolved from early pioneers' dreams

    SciTech Connect

    Smith, D.J.

    1996-04-01

    In the last 100 years, power plant controls have evolved from manual operation and simple instruments to automatic state-of-the-art computerized control systems using smart instruments. This article traces the evolution of controls. The topics of the article include early control systems, developments in the early 20th century, Bailey controls, and developments in the late 20th century.

  15. The Evolving Understanding of the Construct of Intellectual Disability

    ERIC Educational Resources Information Center

    Schalock, Robert L.

    2011-01-01

    This article addresses two major areas concerned with the evolving understanding of the construct of intellectual disability. The first part of the article discusses current answers to five critical questions that have revolved around the general question, "What is Intellectual Disability?" These five are what to call the phenomenon, how to…

  16. Evolving Strategies for Cancer and Autoimmunity: Back to the Future

    PubMed Central

    Lane, Peter J. L.; McConnell, Fiona M.; Anderson, Graham; Nawaf, Maher G.; Gaspal, Fabrina M.; Withers, David R.

    2014-01-01

    Although current thinking has focused on genetic variation between individuals and environmental influences as underpinning susceptibility to both autoimmunity and cancer, an alternative view is that human susceptibility to these diseases is a consequence of the way the immune system evolved. It is important to remember that the immunological genes that we inherit and the systems that they control were shaped by the drive for reproductive success rather than for individual survival. It is our view that human susceptibility to autoimmunity and cancer is the evolutionarily acceptable side effect of the immune adaptations that evolved in early placental mammals to accommodate a fundamental change in reproductive strategy. Studies of immune function in mammals show that high affinity antibodies and CD4 memory, along with its regulation, co-evolved with placentation. By dissection of the immunologically active genes and proteins that evolved to regulate this step change in the mammalian immune system, clues have emerged that may reveal ways of de-tuning both effector and regulatory arms of the immune system to abrogate autoimmune responses whilst preserving protection against infection. Paradoxically, it appears that such a detuned and deregulated immune system is much better equipped to mount anti-tumor immune responses against cancers. PMID:24782861

  17. Institutional Change of Universities as a Problem of Evolving Boundaries

    ERIC Educational Resources Information Center

    Vakkuri, Jarmo

    2004-01-01

    This paper examines institutional change in universities from the perspective of the notion of "boundaries". The paper asks: how can some of the most important regulatory and administrative changes in universities be understood in the framework of evolving boundaries? Two areas are studied empirically. First, the third mission of universities is…

  18. Tensions inherent in the evolving role of the infection preventionist

    PubMed Central

    Conway, Laurie J.; Raveis, Victoria H.; Pogorzelska-Maziarz, Monika; Uchida, May; Stone, Patricia W.; Larson, Elaine L.

    2014-01-01

    Background The role of infection preventionists (IPs) is expanding in response to demands for quality and transparency in health care. Practice analyses and survey research have demonstrated that IPs spend a majority of their time on surveillance and are increasingly responsible for prevention activities and management; however, deeper qualitative aspects of the IP role have rarely been explored. Methods We conducted a qualitative content analysis of in-depth interviews with 19 IPs at hospitals throughout the United States to describe the current IP role, specifically the ways that IPs effect improvements and the facilitators and barriers they face. Results The narratives document that the IP role is evolving in response to recent changes in the health care landscape and reveal that this progression is associated with friction and uncertainty. Tensions inherent in the evolving role of the IP emerged from the interviews as 4 broad themes: (1) expanding responsibilities outstrip resources, (2) shifting role boundaries create uncertainty, (3) evolving mechanisms of influence involve trade-offs, and (4) the stress of constant change is compounded by chronic recurring challenges. Conclusion Advances in implementation science, data standardization, and training in leadership skills are needed to support IPs in their evolving role. PMID:23880116

  19. An Evolved Wavelet Library Based on Genetic Algorithm

    PubMed Central

    Vaithiyanathan, D.; Seshasayanan, R.; Kunaraj, K.; Keerthiga, J.

    2014-01-01

    As the size of captured images increases, there is a need for a robust image compression algorithm that satisfies the bandwidth limitations of the transmission channels and preserves the image resolution without considerable loss in image quality. Many conventional image compression algorithms use the wavelet transform, which can significantly reduce the number of bits needed to represent a pixel; quantization and thresholding further increase the compression. In this paper the authors evolve two sets of wavelet filter coefficients using a genetic algorithm (GA), one for the whole image except the edge areas and the other for the portions near the edges (i.e., global and local filters). Images are initially separated into several groups based on their frequency content, edges, and textures, and the wavelet filter coefficients are evolved separately for each group. As there is a possibility of the GA settling in a local maximum, we introduce a new shuffling operator to prevent this effect. The GA used to evolve the filter coefficients primarily focuses on maximizing the peak signal-to-noise ratio (PSNR). The filter coefficients evolved by the proposed method outperform existing methods by 0.31 dB in average PSNR and 0.39 dB in maximum PSNR. PMID:25405225
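
    The sketch below illustrates only the PSNR fitness measure that such a GA would maximize; the evolved filter banks, image grouping, and shuffling operator from the paper are not reproduced, and the test image here is random data used purely for illustration.

      # PSNR between two same-sized images; a GA would call this as the fitness
      # of each candidate coefficient set, e.g. psnr(img, codec(img, coeffs)).
      import numpy as np

      def psnr(original, reconstructed, max_val=255.0):
          """Peak signal-to-noise ratio in dB."""
          mse = np.mean((original.astype(np.float64)
                         - reconstructed.astype(np.float64)) ** 2)
          if mse == 0:
              return float("inf")
          return 10.0 * np.log10(max_val ** 2 / mse)

      rng = np.random.default_rng(0)
      img = rng.integers(0, 256, size=(64, 64))
      noisy = np.clip(img + rng.normal(0, 5, size=img.shape), 0, 255)
      print(round(psnr(img, noisy), 2), "dB")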

  20. Coevolution drives the emergence of complex traits and promotes evolvability.

    PubMed

    Zaman, Luis; Meyer, Justin R; Devangam, Suhas; Bryson, David M; Lenski, Richard E; Ofria, Charles

    2014-12-01

    The evolution of complex organismal traits is obvious as a historical fact, but the underlying causes--including the role of natural selection--are contested. Gould argued that a random walk from a necessarily simple beginning would produce the appearance of increasing complexity over time. Others contend that selection, including coevolutionary arms races, can systematically push organisms toward more complex traits. Methodological challenges have largely precluded experimental tests of these hypotheses. Using the Avida platform for digital evolution, we show that coevolution of hosts and parasites greatly increases organismal complexity relative to that otherwise achieved. As parasites evolve to counter the rise of resistant hosts, parasite populations retain a genetic record of past coevolutionary states. As a consequence, hosts differentially escape by performing progressively more complex functions. We show that coevolution's unique feedback between host and parasite frequencies is a key process in the evolution of complexity. Strikingly, the hosts evolve genomes that are also more phenotypically evolvable, similar to the phenomenon of contingency loci observed in bacterial pathogens. Because coevolution is ubiquitous in nature, our results support a general model whereby antagonistic interactions and natural selection together favor both increased complexity and evolvability.

  1. Evolvability suppression to stabilize far-sighted adaptations.

    PubMed

    Altenberg, Lee

    2005-01-01

    The opportunistic character of adaptation through natural selection can lead to evolutionary pathologies--situations in which traits evolve that promote the extinction of the population. Such pathologies include imprudent predation and other forms of habitat overexploitation, or the tragedy of the commons, adaptation to temporally unreliable resources, cheating and other antisocial behavior, infectious pathogen carrier states, parthenogenesis, and cancer, an intraorganismal evolutionary pathology. It is known that hierarchical population dynamics can protect a population from invasion by pathological genes. Can it also alter the genotype so as to prevent the generation of such genes in the first place, that is, suppress the evolvability of evolutionary pathologies? A model is constructed in which one locus controls the expression of the pathological trait, and a series of modifier loci exist that can prevent the expression of this trait. It is found that multiple evolvability checkpoint genes can evolve to prevent the generation of variants that cause evolutionary pathologies. The consequences of this finding are discussed.
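
    As a rough illustration of the checkpoint idea (not Altenberg's population-genetic model), the toy sketch below expresses the pathological trait only when the pathology allele is present and every modifier locus has lost its suppressing function, so each added checkpoint locus multiplicatively lowers the rate at which expressed pathological variants arise; the mutation rate and locus count are arbitrary.

      # Toy sketch of stacked "checkpoint" loci suppressing a pathological trait.
      import random

      N_CHECKPOINTS = 3      # number of modifier (checkpoint) loci; arbitrary
      P_RISKY = 0.1          # per-locus probability of the non-suppressing state

      def random_genotype():
          # index 0: pathology allele present?  indices 1..N: checkpoint disabled?
          return [random.random() < P_RISKY for _ in range(N_CHECKPOINTS + 1)]

      def expresses_pathology(genotype):
          # expression requires the allele AND every checkpoint being disabled
          return genotype[0] and all(genotype[1:])

      population = [random_genotype() for _ in range(100_000)]
      observed = sum(map(expresses_pathology, population)) / len(population)
      analytic = P_RISKY ** (N_CHECKPOINTS + 1)
      print(f"observed fraction: {observed:.1e}, analytic: {analytic:.1e}")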

  2. The Evolving Theme of Teaching Multicultural Art Education. Monograph Series.

    ERIC Educational Resources Information Center

    La Pierre, Sharon Greenleaf, Ed.; Ballengee-Morris, Christine, Ed.

    This publication, sponsored by the U.S. Society of Education through Art (USSEA) as a forum of past presidents involving audience participation, aims to stimulate dialogue on the evolving theme of teaching multicultural issues and what affects student learning. Session participants were past presidents of the USSEA who prepared written statements…

  3. The Evolving Significance of Race: Living, Learning, and Teaching

    ERIC Educational Resources Information Center

    Hughes, Sherick A., Ed.; Berry, Theodorea Regina, Ed.

    2012-01-01

    Individuals are living, learning, and teaching by questioning how to address race in a society that consistently prefers to see itself as colorblind, a society claiming to seek a "post-racial" existence. This edited volume offers evidence of the evolving significance of race from a diverse group of male and female contributors self-identifying as…

  4. Multi-Hop Routing Mechanism for Reliable Sensor Computing

    PubMed Central

    Chen, Jiann-Liang; Ma, Yi-Wei; Lai, Chia-Ping; Hu, Chia-Cheng; Huang, Yueh-Min

    2009-01-01

    Current research on routing in wireless sensor computing concentrates on increasing service lifetime, enabling scalability for a large number of sensors, and supporting fault tolerance for battery exhaustion and broken nodes. A sensor node is naturally exposed to unreliable communication channels and node failures; sensor nodes have many failure modes, and each failure degrades network performance. This work develops a novel mechanism, called the Reliable Routing Mechanism (RRM), based on a hybrid cluster-based routing protocol that selects the most reliable routing path for sensor computing. Table-driven intra-cluster routing and on-demand inter-cluster routing are combined by changing the relationships between clusters. Applying a reliable routing mechanism in sensor computing can improve routing reliability, maintain low packet loss, minimize management overhead, and reduce energy consumption. Simulation results indicate that the reliability of the proposed RRM mechanism is around 25% higher than that of the Dynamic Source Routing (DSR) and Ad hoc On-demand Distance Vector (AODV) routing mechanisms. PMID:22303165
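
    The RRM protocol itself is not reproduced here; as a hedged illustration of reliability-aware path selection, the sketch below picks the multi-hop route that maximizes the product of per-link delivery probabilities (equivalently, a shortest path over -log reliability weights) on a made-up link table.

      # Most-reliable path via Dijkstra over -log(link delivery probability).
      import heapq
      import math

      links = {  # hypothetical link delivery probabilities between sensor nodes
          ("A", "B"): 0.95, ("B", "C"): 0.90, ("A", "C"): 0.70,
          ("C", "SINK"): 0.92, ("B", "SINK"): 0.60,
      }

      graph = {}
      for (u, v), p in links.items():
          graph.setdefault(u, []).append((v, -math.log(p)))
          graph.setdefault(v, []).append((u, -math.log(p)))

      def most_reliable_path(src, dst):
          dist, prev = {src: 0.0}, {}
          heap = [(0.0, src)]
          while heap:
              d, u = heapq.heappop(heap)
              if u == dst:
                  break
              if d > dist.get(u, math.inf):
                  continue                       # stale queue entry
              for v, w in graph.get(u, []):
                  if d + w < dist.get(v, math.inf):
                      dist[v], prev[v] = d + w, u
                      heapq.heappush(heap, (d + w, v))
          path, node = [dst], dst
          while node != src:
              node = prev[node]
              path.append(node)
          return path[::-1], math.exp(-dist[dst])

      # Prints the path and its end-to-end delivery probability.
      print(most_reliable_path("A", "SINK"))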

  5. Reliability of steam generator tubing

    SciTech Connect

    Kadokami, E.

    1997-02-01

    The author presents results of studies on the reliability of steam generator (SG) tubing. The basis for this work is that in Japan defects in SG tubing are addressed by the approach that any detected defect should be repaired, either by plugging or sleeving the tube. However, this leaves open the questions of the detection limit in practice and the effect of nondetectable cracks on tubing performance. These studies were commissioned to examine the safety issues involved in degraded SG tubing. The program has looked at a number of different issues. The first was an assessment of the penetration and opening behavior of tube flaws due to internal pressure in the tubing, covering the penetration behavior of tube flaws, primary water leakage from through-wall flaws, and the opening behavior of through-wall flaws. The program has also examined the reliability of tubing with flaws during normal plant operation and the consequences of tube rupture accidents for the integrity of neighboring tubes.

  6. Reliable vision-guided grasping

    NASA Technical Reports Server (NTRS)

    Nicewarner, Keith E.; Kelley, Robert B.

    1992-01-01

    Automated assembly of truss structures in space requires vision-guided servoing for grasping a strut when its position and orientation are uncertain. This paper presents a methodology for efficient and robust vision-guided robot grasping alignment. The vision-guided grasping problem is related to vision-guided 'docking' problems. It differs from other hand-in-eye visual servoing problems, such as tracking, in that the distance from the target is a relevant servo parameter. The methodology described in this paper is a hierarchy of levels in which the vision/robot interface is decreasingly 'intelligent,' and increasingly fast. Speed is achieved primarily by information reduction. This reduction exploits the use of region-of-interest windows in the image plane and feature motion prediction. These reductions invariably require stringent assumptions about the image. Therefore, at a higher level, these assumptions are verified using slower, more reliable methods. This hierarchy provides for robust error recovery in that when a lower-level routine fails, the next-higher routine will be called and so on. A working system is described which visually aligns a robot to grasp a cylindrical strut. The system uses a single camera mounted on the end effector of a robot and requires only crude calibration parameters. The grasping procedure is fast and reliable, with a multi-level error recovery system.
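
    A hedged sketch of the escalation idea follows: fast routines that rely on strong image assumptions are tried first, and when one fails, control falls back to a slower but more reliable level. The routines here are hypothetical stubs with simulated success rates, not the paper's vision algorithms.

      # Escalating error recovery: try fast, assumption-heavy routines first,
      # then fall back to slower, more reliable ones. All stubs are hypothetical.
      import random

      def fast_roi_track():
          """Fast: searches only a small region-of-interest window."""
          return random.random() > 0.3          # succeeds ~70% of the time (simulated)

      def full_frame_search():
          """Slower: relaxes the ROI assumption and scans the whole image."""
          return random.random() > 0.05

      def reinitialize_from_crude_calibration():
          """Slowest: restart alignment using only crude calibration parameters."""
          return True

      RECOVERY_LADDER = [fast_roi_track, full_frame_search,
                         reinitialize_from_crude_calibration]

      def align_step():
          for routine in RECOVERY_LADDER:       # escalate only when a level fails
              if routine():
                  return routine.__name__
          raise RuntimeError("all recovery levels failed")

      print([align_step() for _ in range(5)])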

  8. Discourse analysis procedures: reliability issues.

    PubMed

    Hux, K; Sanger, D; Reid, R; Maschka, A

    1997-01-01

    Performing discourse analyses to supplement assessment procedures and facilitate intervention planning is only valuable if the observations are reliable. The purpose of the present study was to evaluate and compare four methods of assessing reliability on one discourse analysis procedure--a modified version of Damico's Clinical Discourse Analysis (1985a, 1985b, 1992). The selected methods were: (a) Pearson product-moment correlations, (b) interobserver agreement, (c) Cohen's kappa, and (d) generalizability coefficients. Results showed high correlation coefficients and high percentages of interobserver agreement when error type was not taken into account. However, interobserver agreement percentages obtained solely for target behavior occurrences and Cohen's kappa revealed that much of the agreement between raters was due to chance and the high frequency of target behavior non-occurrence. Generalizability coefficients revealed that the procedure was fair to good for discriminating among persons with differing levels of language competency for some aspects of communication performance but was less than desirable for others; the aggregate score was below recommended standards for differentiating among people for diagnostic purposes.
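
    The toy computation below (hypothetical ratings, not the study's data) shows why raw percent agreement can look excellent when the target behavior rarely occurs, while Cohen's kappa, which corrects for chance agreement, is much more modest.

      # Percent agreement vs. Cohen's kappa on made-up binary ratings where the
      # target behavior is rare, so most agreement is agreement on non-occurrence.
      from collections import Counter

      rater_a = [0]*45 + [1]*3 + [0]*2   # 1 = target behavior observed
      rater_b = [0]*45 + [1]*1 + [0]*4

      n = len(rater_a)
      observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

      pa, pb = Counter(rater_a), Counter(rater_b)
      expected = sum((pa[k] / n) * (pb[k] / n) for k in set(pa) | set(pb))
      kappa = (observed - expected) / (1 - expected)

      print(f"percent agreement = {observed:.2f}, Cohen's kappa = {kappa:.2f}")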

  9. Evolvable mathematical models: A new artificial intelligence paradigm

    NASA Astrophysics Data System (ADS)

    Grouchy, Paul

    We develop a novel Artificial Intelligence paradigm that autonomously generates artificial agents as mathematical models of behaviour. Agent/environment inputs are mapped to agent outputs via equation trees, which are evolved in a manner similar to Symbolic Regression in Genetic Programming. Equations are composed of only the four basic mathematical operators (addition, subtraction, multiplication, and division), as well as input and output variables and constants. From these operations, equations can be constructed that approximate any analytic function. These Evolvable Mathematical Models (EMMs) are tested and compared to their Artificial Neural Network (ANN) counterparts on two benchmarking tasks: the double-pole balancing without velocity information benchmark and the challenging discrete Double-T Maze experiments with homing. The results from these experiments show that EMMs are capable of solving tasks typically solved by ANNs, and that they have the ability to produce agents that demonstrate learning behaviours. To further explore the capabilities of EMMs, as well as to investigate the evolutionary origins of communication, we develop NoiseWorld, an Artificial Life simulation in which interagent communication emerges and evolves from initially noncommunicating EMM-based agents. Agents develop the capability to transmit their x and y position information over a one-dimensional channel via a complex, dialogue-based communication scheme. These evolved communication schemes are analyzed and their evolutionary trajectories examined, yielding significant insight into the emergence and subsequent evolution of cooperative communication. Evolved agents from NoiseWorld are successfully transferred onto physical robots, demonstrating the transferability of EMM-based AIs from simulation into physical reality.
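
    As a rough sketch of the representation being evolved (not the author's EMM system or NoiseWorld), the code below evaluates an equation tree built from the four basic operators with a protected division; the example tree and inputs are arbitrary.

      # Equation tree over +, -, *, / (protected), evaluated on named inputs.
      import operator

      OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul,
             "/": lambda a, b: a / b if abs(b) > 1e-9 else 1.0}  # protected division

      # A tree is a nested tuple (op, left, right); leaves are input names or constants.
      tree = ("+", ("*", "x", 0.5), ("/", "y", ("-", "x", 2.0)))

      def evaluate(node, inputs):
          if isinstance(node, (int, float)):
              return float(node)
          if isinstance(node, str):
              return inputs[node]
          op, left, right = node
          return OPS[op](evaluate(left, inputs), evaluate(right, inputs))

      # Hypothetical agent step: map sensor inputs to an output command.
      print(evaluate(tree, {"x": 1.0, "y": 3.0}))   # 0.5 + 3.0 / (1.0 - 2.0) = -2.5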

  10. Developing Architectures and Technologies for an Evolvable NASA Space Communication Infrastructure

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Hayden, Jeffrey

    2004-01-01

    Space communications architecture concepts play a key role in the development and deployment of NASA's future exploration and science missions. Once a mission is deployed, the communication link to the user needs to provide maximum information delivery and flexibility to handle the expected large and complex data sets and to enable direct interaction with the spacecraft and experiments. In human and robotic missions, communication systems need to offer maximum reliability with robust two-way links for software uploads and virtual interactions. Identifying the capabilities needed to cost-effectively meet the demanding space communication needs of 21st-century missions, properly formulating the requirements for these missions, and identifying the early technology developments that will be needed can only be resolved through architecture design. This paper will describe the development of evolvable space communication architecture models and the technologies needed to support Earth sensor web and collaborative observation formation missions; robotic scientific missions for detailed investigation of planets, moons, and small bodies in the solar system; human missions for exploration of the Moon, Mars, Ganymede, Callisto, and asteroids; human settlements in space, on the Moon, and on Mars; and great in-space observatories for observing other star systems and the universe. The resulting architectures will enable the reliable, multipoint, high data rate capabilities needed on demand to provide continuous, maximum coverage of areas of concentrated activities, such as in the vicinity of outposts in space, on the Moon, or on Mars.

  11. 76 FR 42534 - Mandatory Reliability Standards for Interconnection Reliability Operating Limits; System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-19

    ... Reliability Operating Limits; System Restoration Reliability Standards AGENCY: Federal Energy Regulatory... restoration from Blackstart Resources and require reliability coordinators to establish plans and prepare personnel to enable effective coordination of the system restoration process. The Commission also...

  12. A diameter-sensitive flow entropy method for reliability consideration in water distribution system design

    NASA Astrophysics Data System (ADS)

    Liu, Haixing; Savić, Dragan; Kapelan, Zoran; Zhao, Ming; Yuan, Yixing; Zhao, Hongbin

    2014-07-01

    Flow entropy is a measure of the uniformity of pipe flows in water distribution systems. By maximizing flow entropy one can identify reliable layouts or connectivity in networks. To overcome the disadvantage that the common definition of flow entropy does not consider the impact of pipe diameter on reliability, an extended definition of flow entropy, termed diameter-sensitive flow entropy, is proposed. This new methodology is then assessed using other reliability methods, including Monte Carlo simulation, a pipe failure probability model, and a surrogate measure (resilience index) integrated with water demand and pipe failure uncertainty. The reliability assessment is based on a sample of WDS designs derived from an optimization process for each of two benchmark networks. Correlation analysis is used to quantitatively evaluate the relationship between entropy and reliability, and a comparative analysis between the simple flow entropy and the new method is conducted. The results demonstrate that the diameter-sensitive flow entropy shows a consistently much stronger correlation with the three reliability measures than simple flow entropy. Therefore, the new flow entropy method can be taken as a better surrogate measure for reliability and could potentially be integrated into the optimal design problem of WDSs. Sensitivity analysis results show that the velocity parameters used in the new flow entropy have no significant impact on the relationship between diameter-sensitive flow entropy and reliability.
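
    For orientation, the common (diameter-insensitive) flow entropy has a Shannon-like form over the flow fractions; the sketch below computes it for made-up flows. The paper's diameter-sensitive weighting is not reproduced here.

      # Shannon-style flow entropy over the flows leaving a node/source.
      import math

      def flow_entropy(flows):
          """flows: list of pipe flow rates (same units) leaving a node/source."""
          total = sum(flows)
          fractions = [q / total for q in flows if q > 0]
          return -sum(f * math.log(f) for f in fractions)

      # Uniform flows maximize entropy (most 'even' layout); skewed flows reduce it.
      print(round(flow_entropy([10, 10, 10]), 3))   # ~1.099
      print(round(flow_entropy([27, 2, 1]), 3))     # much lower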

  13. Preliminary Nuclear Electric Propulsion (NEP) reliability study

    NASA Technical Reports Server (NTRS)

    Hsieh, T. M.; Nakashima, A. M.; Mondt, J. F.

    1973-01-01

    A preliminary failure mode, failure effect, and criticality analysis of the major subsystems of nuclear electric propulsion is presented. Simplified reliability block diagrams are also given. A computer program was used to calculate the reliability of the heat rejection subsystem.
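
    The series/parallel arithmetic behind such block diagrams is simple to sketch; the component reliabilities below are hypothetical and the layout is only loosely inspired by a heat-rejection loop.

      # Series/parallel block-diagram arithmetic with hypothetical reliabilities.
      from math import prod

      def series(*r):
          """All blocks must work: R = product of block reliabilities."""
          return prod(r)

      def parallel(*r):
          """Redundant blocks: fails only if every block fails."""
          return 1.0 - prod(1.0 - x for x in r)

      # e.g. two redundant pump strings (pump + controller) in series with a
      # radiator segment; all numbers are placeholders.
      pump_string = series(0.98, 0.97)
      subsystem = series(parallel(pump_string, pump_string), 0.995)
      print(round(subsystem, 4))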

  14. 77 FR 26686 - Transmission Planning Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-07

    ... Energy Regulatory Commission 18 CFR Part 40 Transmission Planning Reliability Standards AGENCY: Federal... Act, the Federal Energy Regulatory Commission remands proposed Transmission Planning (TPL) Reliability... section 215(d) of the Federal Power Act, the Commission remands proposed Transmission Planning...

  15. FDA Warns Ovarian Cancer Tests Not Reliable

    MedlinePlus

    May delay preventive therapies for ... Sept. 9, 2016 (HealthDay News) -- Screening tests for ovarian cancer are not reliable and should not be used, ...

  16. Occasions and the Reliability of Classroom Observations: Alternative Conceptualizations and Methods of Analysis

    ERIC Educational Resources Information Center

    Meyer, J. Patrick; Cash, Anne H.; Mashburn, Andrew

    2011-01-01

    Student-teacher interactions are dynamic relationships that change and evolve over the course of a school year. Measuring classroom quality through observations that focus on these interactions presents challenges when observations are conducted throughout the school year. Variability in observed scores could reflect true changes in the quality of…

  17. Significant lexical relationships

    SciTech Connect

    Pedersen, T.; Kayaalp, M.; Bruce, R.

    1996-12-31

    Statistical NLP inevitably deals with a large number of rare events. As a consequence, NLP data often violates the assumptions implicit in traditional statistical procedures such as significance testing. We describe a significance test, an exact conditional test, that is appropriate for NLP data and can be performed using freely available software. We apply this test to the study of lexical relationships and demonstrate that the results obtained using this test are both theoretically more reliable and different from the results obtained using previously applied tests.
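
    As an illustration of an exact conditional test on lexical data, the sketch below applies Fisher's exact test (via scipy) to a made-up 2x2 co-occurrence table for a candidate bigram; the counts are hypothetical and the paper's specific experiments are not reproduced.

      # Fisher's exact test on a hypothetical 2x2 bigram contingency table,
      # the kind of sparse-count setting where asymptotic tests misbehave.
      from scipy.stats import fisher_exact

      #                 second word = "tea"   second word != "tea"
      table = [[12,                     150],    # first word = "strong"
               [45,                  98_000]]    # first word != "strong"

      odds_ratio, p_value = fisher_exact(table, alternative="greater")
      print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.3g}")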

  18. Reliability and Validity for Neuroscience Nurses.

    PubMed

    Buelow, Janice M; Hinkle, Janice L; McNett, Molly

    2016-10-01

    The concepts of reliability and validity are important for neuroscience nurses to understand, particularly because they evaluate existing literature and integrate common scales or tools into their practice. Nurses must ensure that instruments measuring specified concepts are both reliable and valid. This article will review the types of reliability and validity of an instrument, sometimes referred to collectively as psychometric testing. Relevant examples in neuroscience are included to illustrate the importance of reliability and validity to neuroscience nurses. PMID:27579956

  19. Advanced reliability methods - A review

    NASA Astrophysics Data System (ADS)

    Forsyth, David S.

    2016-02-01

    There are a number of challenges to the current practices for Probability of Detection (POD) assessment. Some Nondestructive Testing (NDT) methods, especially those that are image-based, may not provide a simple relationship between a scalar NDT response and a damage size. Some damage types are not easily characterized by a single scalar metric. Other sensing paradigms, such as structural health monitoring, could theoretically replace NDT but require a POD estimate. And the cost of performing large empirical studies to estimate POD can be prohibitive. The response of the research community has been to develop new methods that can be used to generate the same information, POD, in a form that can be used by engineering designers. This paper will highlight approaches to image-based data and complex defects, Model Assisted POD estimation, and Bayesian methods for combining information. This paper will also review the relationship of the POD estimate, confidence bounds, tolerance bounds, and risk assessment.
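
    One common way (not the only approach surveyed here) to turn binary hit/miss inspection data into a POD-versus-size curve is a logistic model in flaw size; the sketch below fits one to synthetic data and omits the confidence and tolerance bounds discussed in the paper.

      # Logistic hit/miss POD curve fitted to synthetic inspection data.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      size = rng.uniform(0.5, 5.0, 200)                    # flaw size, e.g. mm
      true_pod = 1.0 / (1.0 + np.exp(-(size - 2.0) * 2.0)) # hidden "true" curve
      hit = rng.random(200) < true_pod                     # simulated hit/miss calls

      model = LogisticRegression().fit(size.reshape(-1, 1), hit.astype(int))
      pod_curve = model.predict_proba(np.array([[1.0], [2.0], [3.0]]))[:, 1]
      print(np.round(pod_curve, 2))   # estimated POD at 1, 2 and 3 mm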

  20. Space transportation architecture: Reliability sensitivities

    NASA Technical Reports Server (NTRS)

    Williams, A. M.

    1992-01-01

    A sensitivity analysis is given of the benefits and drawbacks associated with a proposed Earth-to-orbit vehicle architecture. The architecture represents a fleet of six vehicles (two existing, four proposed) that would be responsible for performing various missions as mandated by NASA and the U.S. Air Force. Each vehicle has a prescribed flight rate per year for a period of 31 years. By exposing this fleet of vehicles to a probabilistic environment in which the fleet experiences failures, downtimes, setbacks, etc., the analysis determines the resiliency and costs associated with the fleet for specific vehicle/subsystem reliabilities. The resources required were actual observed data on the failures and downtimes of existing vehicles, data based on engineering judgement for proposed vehicles, and the development of a sensitivity analysis program.
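
    A minimal Monte Carlo sketch of this kind of fleet analysis follows; the flight rates, per-flight reliabilities, and stand-down times are hypothetical placeholders, not the study's observed or judgement-based inputs.

      # Monte Carlo sketch: per-flight reliability and repair downtime translated
      # into expected failures and cumulative stand-down over a long manifest.
      import random

      FLIGHTS_PER_YEAR = {"vehicle_A": 8, "vehicle_B": 4}
      RELIABILITY = {"vehicle_A": 0.98, "vehicle_B": 0.95}   # per-flight success prob.
      STANDDOWN_MONTHS = {"vehicle_A": 6, "vehicle_B": 12}   # downtime after a failure
      YEARS, TRIALS = 31, 10_000

      def simulate_fleet():
          failures, downtime = 0, 0
          for vehicle, rate in FLIGHTS_PER_YEAR.items():
              for _ in range(rate * YEARS):
                  if random.random() > RELIABILITY[vehicle]:
                      failures += 1
                      downtime += STANDDOWN_MONTHS[vehicle]
          return failures, downtime

      results = [simulate_fleet() for _ in range(TRIALS)]
      mean_failures = sum(f for f, _ in results) / TRIALS
      mean_downtime = sum(d for _, d in results) / TRIALS
      print(f"mean failures over {YEARS} yr: {mean_failures:.1f}, "
            f"mean cumulative stand-down: {mean_downtime:.0f} months")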