Science.gov

Sample records for evolving reliable relationships

  1. An Evolving Relationship.

    ERIC Educational Resources Information Center

    May, Therese M.

    1990-01-01

    Responds to five major articles by Duckworth, Goldman, Healy, Sampson, and Goodyear on issues pertaining to testing and assessment in counseling psychology. Suggests that the interactive, collaborative aspects of the assessment relationship between psychologist and client need more attention. (TE)

  2. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, T.; Toon, J.; Conner, A.; Adams, T.; Miranda, D.

    2016-01-01

    This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program's subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. These new allocations are based on updated designs and maintainability characteristics of the components. It was found that trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.

  3. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, Troy; Toon, Jamie; Conner, Angelo C.; Adams, Timothy C.; Miranda, David J.

    2016-01-01

    This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program’s subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. These new allocations are based on updated designs and maintainability characteristics of the components. It was found that trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.

  4. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, Jamie; Toon, Troy; Adams, Timothy C.; Miranda, David J.

    2016-01-01

    This paper describes the methodology that was developed to allocate reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program's subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. Allocation is an iterative process; as systems moved beyond their conceptual and preliminary design phases, the reliability engineering team had the opportunity to reevaluate allocations based on updated designs and maintainability characteristics of the components. Trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper will discuss the value of modifying reliability and maintainability allocations made for the GSDO subsystems as the program nears the end of its design phase.
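
    None of these abstracts reproduces the allocation arithmetic itself. As a minimal, hedged illustration only, the Python sketch below uses equal apportionment for a series system (system reliability equals the product of subsystem reliabilities); the target value and subsystem count are assumed for the example and are not GSDO figures.

      # Minimal sketch: equal apportionment of a series-system reliability target.
      # The target and subsystem count below are assumed values, not GSDO data.

      def equal_apportionment(system_target: float, n_subsystems: int) -> float:
          """Per-subsystem reliability needed so that the product of n identical
          subsystem reliabilities meets the system-level target."""
          return system_target ** (1.0 / n_subsystems)

      if __name__ == "__main__":
          target = 0.98        # assumed system-level reliability requirement
          subsystems = 5       # assumed number of subsystems in series
          r_alloc = equal_apportionment(target, subsystems)
          print(f"Each subsystem must achieve R >= {r_alloc:.4f}")
          print(f"Check: {r_alloc ** subsystems:.4f} ~= {target}")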

  5. Evolvability

    PubMed Central

    Kirschner, Marc; Gerhart, John

    1998-01-01

    Evolvability is an organism’s capacity to generate heritable phenotypic variation. Metazoan evolution is marked by great morphological and physiological diversification, although the core genetic, cell biological, and developmental processes are largely conserved. Metazoan diversification has entailed the evolution of various regulatory processes controlling the time, place, and conditions of use of the conserved core processes. These regulatory processes, and certain of the core processes, have special properties relevant to evolutionary change. The properties of versatile protein elements, weak linkage, compartmentation, redundancy, and exploratory behavior reduce the interdependence of components and confer robustness and flexibility on processes during embryonic development and in adult physiology. They also confer evolvability on the organism by reducing constraints on change and allowing the accumulation of nonlethal variation. Evolvability may have been generally selected in the course of selection for robust, flexible processes suitable for complex development and physiology and specifically selected in lineages undergoing repeated radiations. PMID:9671692

  6. Host-parasite relationship in cystic echinococcosis: an evolving story.

    PubMed

    Siracusano, Alessandra; Delunardo, Federica; Teggi, Antonella; Ortona, Elena

    2012-01-01

    The larval stage of Echinococcus granulosus causes cystic echinococcosis, a neglected infectious disease that constitutes a major public health problem in developing countries. Despite being under constant barrage by the immune system, E. granulosus modulates antiparasite immune responses and persists in its human hosts with detectable humoral and cellular responses against the parasite. In vitro and in vivo immunological approaches, together with molecular biology and immunoproteomic technologies, have provided exciting insights into the mechanisms involved in the initiation of E. granulosus infection and the consequent induction and regulation of the immune response. Although the last decade has clarified many aspects of the host-parasite relationship in human cystic echinococcosis, establishing the full mechanisms that cause the disease requires more studies. Here, we review some of the recent developments and discuss new avenues in this evolving story of E. granulosus infection in man. PMID:22110535

  7. Towards resolving Lamiales relationships: insights from rapidly evolving chloroplast sequences

    PubMed Central

    2010-01-01

    Background: In the large angiosperm order Lamiales, a diverse array of highly specialized life strategies such as carnivory, parasitism, epiphytism, and desiccation tolerance occur, and some lineages possess drastically accelerated DNA substitutional rates or miniaturized genomes. However, understanding the evolution of these phenomena in the order, and clarifying borders of and relationships among lamialean families, has been hindered by largely unresolved trees in the past. Results: Our analysis of the rapidly evolving trnK/matK, trnL-F and rps16 chloroplast regions enabled us to infer more precise phylogenetic hypotheses for the Lamiales. Relationships among the nine first-branching families in the Lamiales tree are now resolved with very strong support. Subsequent to Plocospermataceae, a clade consisting of Carlemanniaceae plus Oleaceae branches, followed by Tetrachondraceae and a newly inferred clade composed of Gesneriaceae plus Calceolariaceae, which is also supported by morphological characters. Plantaginaceae (incl. Gratioleae) and Scrophulariaceae are well separated in the backbone grade; Lamiaceae and Verbenaceae appear in distant clades, while the recently described Linderniaceae are confirmed to be monophyletic and in an isolated position. Conclusions: Confidence about deep nodes of the Lamiales tree is an important step towards understanding the evolutionary diversification of a major clade of flowering plants. The degree of resolution obtained here now provides a first opportunity to discuss the evolution of morphological and biochemical traits in Lamiales. The multiple independent evolution of the carnivorous syndrome, once in Lentibulariaceae and a second time in Byblidaceae, is strongly supported by all analyses and topological tests. The evolution of selected morphological characters such as flower symmetry is discussed. The addition of further sequence data from introns and spacers holds promise to eventually obtain a fully resolved plastid tree of

  8. Young People and Alcohol in Italy: An Evolving Relationship

    ERIC Educational Resources Information Center

    Beccaria, Franca; Prina, Franco

    2010-01-01

    In Italy, commonly held opinions and interpretations about the relationship between young people and alcohol are often expressed as generalizations and approximations. In order to further understanding of the relationship between young people and alcohol in contemporary Italy, we have gathered, compared and discussed all the available data, both…

  9. Cats: their history and our evolving relationship with them.

    PubMed

    2016-07-01

    Cats have had a long relationship with people, and their history as a domesticated animal can be traced back as far as 2000 BC. Delegates at a recent conference titled 'People, cats and vets through history' delved a little deeper into the changing nature of this relationship. Georgina Mills reports. PMID:27389749

  10. Models of Shared Leadership: Evolving Structures and Relationships.

    ERIC Educational Resources Information Center

    Hallinger, Philip; Richardson, Don

    1988-01-01

    Explores potential changes in the power relationships among teachers and principals. Describes and analyzes the following models of teacher decision-making: (1) Instructional Leadership Teams; (2) Principals' Advisory Councils; (3) School Improvement Teams; and (4) Lead Teacher Committees. (FMW)

  11. Risk and responsibility: a complex and evolving relationship.

    PubMed

    Kermisch, Céline

    2012-03-01

    This paper analyses the nature of the relationship between risk and responsibility. Since neither the concept of risk nor the concept of responsibility has an unequivocal definition, it is obvious that there is no single interpretation of their relationship. After introducing the different meanings of responsibility used in this paper, we analyse four conceptions of risk. This allows us to make their link with responsibility explicit and to determine if a shift in the connection between risk and responsibility can be outlined. (1) In the engineer's paradigm, the quantitative conception of risk does not include any concept of responsibility. Their relationship is indirect, the locus of responsibility being risk management. (2) In Mary Douglas' cultural theory, risks are constructed through the responsibilities they engage. (3) Rayner and (4) Wolff go further by integrating forms of responsibility in the definition of risk itself. Analysis of these four frameworks shows that the concepts of risk and responsibility are increasingly intertwined. This tendency is reinforced by increasing public awareness and a call for the integration of a moral dimension in risk management. Therefore, we suggest that a form of virtue-responsibility should also be integrated in the concept of risk. PMID:21103951

  12. Evolving Cross-Group Relationships: The Story of Miller High, 1950-2000

    ERIC Educational Resources Information Center

    Eick, Caroline

    2011-01-01

    This paper examines students' evolving cross-group relationships in a comprehensive high school in Baltimore County, Maryland, USA, between 1950 and 2000. The findings of this research, situated at the intersections of two lenses of inquiry: oral historical analysis and critical studies, uncover both the power of students accustomed to integrated…

  13. [Creating a reliable therapeutic relationship with the patient].

    PubMed

    Matsuki, Kunihiro

    2012-01-01

    The factors necessary to create a reliable therapeutic relationship are presented in this paper. They include a demeanor and calmness of temperament as a psychiatric professional, a feeling of respect for the patient that is based on our common sense as human beings, an attitude of listening attentively to what the patient is revealing, maintaining an attitude of receptive neutrality, the ability to withstand the emotional burdens imposed on one by the patient, patience with any difficulty on one's own part to understand the patient, the ability to communicate clearly, including about the patient's negative aspects, and the ability to end psychiatric consultation sessions in a friendly and intimate manner. Creating a beneficial therapeutic relationship is about the building of a trusting relationship, in which the patient can constructively endure being questioned by us, or cope with the tough burdens we may place on them. However, a reliable relationship such as this contains paradoxes. Patients are able to talk to us about their suspicions, anxieties, dissatisfactions or anger only if the therapeutic relationship is good or based on trust. In other words, just like our patients, psychiatrists, too, must deal with what the patient brings and directs toward us. It is at this point that what we call a true therapeutic relationship starts. PMID:23367840

  14. ELECTRICAL SUBSTATION RELIABILITY EVALUATION WITH EMPHASIS ON EVOLVING INTERDEPENDENCE ON COMMUNICATION INFRASTRUCTURE.

    SciTech Connect

    AZARM, M.A.; BARI, R.; YUE, M.; MUSICKI, Z.

    2004-09-12

    This study developed a probabilistic methodology for assessment of the reliability and security of electrical energy distribution networks. This included consideration of the future grid system, which will rely heavily on the existing digitally based communication infrastructure for monitoring and protection. Event tree and fault tree methods were utilized. The approach extensively modeled the types of faults that a grid could potentially experience, the response of the grid, and the specific design of the protection schemes. We demonstrated the methods by applying them to a small sub-section of a hypothetical grid based on an existing electrical grid system of a metropolitan area. The results showed that for a typical design that relies on the communication network for protection, the communication network reliability could contribute significantly to the frequency of loss of electrical power. The reliability of the communication network could become a more important contributor to the electrical grid reliability as the utilization of the communication network significantly increases in the near future to support "smart" transmission and/or distributed generation.
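
    The record names event-tree and fault-tree methods without reproducing the arithmetic. As a hedged illustration only, the toy Python fault tree below composes a loss-of-power probability from independent basic events through AND/OR gates; the events and probabilities are invented and do not come from the study.

      # Toy fault-tree evaluation with independent basic events (illustrative only).

      def and_gate(*probs):
          """All inputs must fail: product of probabilities."""
          p = 1.0
          for q in probs:
              p *= q
          return p

      def or_gate(*probs):
          """Any input failing causes failure: 1 - product of survivals."""
          p = 1.0
          for q in probs:
              p *= (1.0 - q)
          return 1.0 - p

      # Hypothetical basic-event probabilities (per demand).
      p_breaker_fails  = 1e-3
      p_relay_fails    = 5e-4
      p_comm_link_down = 2e-2   # communication network unavailability
      p_backup_fails   = 1e-2

      # Protection fails if the relay or its communication path fails,
      # and the backup scheme also fails.
      p_protection_fails = and_gate(or_gate(p_relay_fails, p_comm_link_down),
                                    p_backup_fails)
      p_loss_of_power = or_gate(p_breaker_fails, p_protection_fails)
      print(f"P(loss of power | fault) ~= {p_loss_of_power:.2e}")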

  15. Relationship Skills in a Clinical Performance Examination: Reliability and Validity of the Relationship Instrument.

    ERIC Educational Resources Information Center

    Bolton, Cynthia; And Others

    Among the repertoire of clinical skills necessary for the professional development of medical students is the ability to create a positive doctor-patient relationship through effective communication skills. The purpose of this study was to create an instrument that reliably measures the relationship between physician and patient. The Relationship…

  16. ELECTRICAL SUBSTATION RELIABILITY EVALUATION WITH EMPHASIS ON EVOLVING INTERDEPENDENCE ON COMMUNICATION INFRASTRUCTURE.

    SciTech Connect

    AZARM, M.A.; BARI, R.A.; MUSICKI, Z.

    2004-01-15

    The objective of this study is to develop a methodology for a probabilistic assessment of the reliability and security of electrical energy distribution networks. This includes consideration of the future grid system, which will rely heavily on the existing digitally based communication infrastructure for monitoring and protection. Another important objective of this study is to provide information and insights from this research to Consolidated Edison Company (Con Edison) that could be useful in the design of the new network segment to be installed in the area of the World Trade Center in lower Manhattan. Our method is microscopic in nature and relies heavily on the specific design of the portion of the grid being analyzed. It extensively models the types of faults that a grid could potentially experience, the response of the grid, and the specific design of the protection schemes. We demonstrate that the existing technology can be extended and applied to the electrical grid and to the supporting communication network. A small subsection of a hypothetical grid based on the existing New York City electrical grid system of Con Edison is used to demonstrate the methods. Sensitivity studies show that in the current design the frequency for the loss of the main station is sensitive to the communication network reliability. The reliability of the communication network could become a more important contributor to the electrical grid reliability as the utilization of the communication network significantly increases in the near future to support "smart" transmission and/or distributed generation. The identification of potential failure modes and their likelihood can support decisions on potential modifications to the network including hardware, monitoring instrumentation, and protection systems.

  17. Craniosacral rhythm: reliability and relationships with cardiac and respiratory rates.

    PubMed

    Hanten, W P; Dawson, D D; Iwata, M; Seiden, M; Whitten, F G; Zink, T

    1998-03-01

    Craniosacral rhythm (CSR) has long been the subject of debate, both over its existence and its use as a therapeutic tool in evaluation and treatment. Origins of this rhythm are unknown, and palpatory findings lack scientific support. The purpose of this study was to determine the intra- and inter-examiner reliabilities of the palpation of the rate of the CSR and the relationship between the rate of the CSR and the heart or respiratory rates of subjects and examiners. The rates of the CSR of 40 healthy adults were palpated twice by each of two examiners. The heart and respiratory rates of the examiners and the subjects were recorded while the rates of the subjects' CSR were palpated by the examiners. Intraclass correlation coefficients were calculated to determine the intra- and inter-examiner reliabilities of the palpation. Two multiple regression analyses, one for each examiner, were conducted to analyze the relationships between the rate of the CSR and the heart and respiratory rates of the subjects and the examiners. The intraexaminer reliability coefficients were 0.78 for examiner A and 0.83 for examiner B, and the interexaminer reliability coefficient was 0.22. The result of the multiple regression analysis for examiner A was R = 0.46 and adjusted R2 = 0.12 (p = 0.078) and for examiner B was R = 0.63 and adjusted R2 = 0.32 (p = 0.001). The highest bivariate correlation was found between the CSR and the subject's heart rate (r = 0.30) for examiner A and between the CSR and the examiner's heart rate (r = 0.42) for examiner B. The results indicated that a single examiner may be able to palpate the rate of the CSR consistently, if that is what we truly measured. It is possible that the perception of CSR is illusory. The rate of the CSR palpated by two examiners is not consistent. The results of the regression analysis of one examiner offered no validation to those of the other. It appears that a subject's CSR is not related to the heart or respiratory rates of the
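
    The intraclass correlation coefficients reported above are not accompanied by their computation. Below is a minimal numpy sketch of the two-way random-effects, absolute-agreement, single-measure ICC (Shrout-Fleiss ICC(2,1)); whether this exact ICC form was used in the study is an assumption, and the ratings matrix is invented.

      import numpy as np

      def icc2_1(x: np.ndarray) -> float:
          """Two-way random-effects, absolute-agreement, single-measure ICC(2,1).
          x has shape (n_subjects, k_raters)."""
          n, k = x.shape
          grand = x.mean()
          subj_means = x.mean(axis=1)
          rater_means = x.mean(axis=0)
          msr = k * np.sum((subj_means - grand) ** 2) / (n - 1)       # subjects
          msc = n * np.sum((rater_means - grand) ** 2) / (k - 1)      # raters
          resid = x - subj_means[:, None] - rater_means[None, :] + grand
          mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))              # residual
          return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

      # Hypothetical ratings: 6 subjects palpated by 2 examiners (cycles/min).
      ratings = np.array([[10.2, 9.8], [11.5, 11.9], [9.0, 9.4],
                          [12.1, 11.7], [10.8, 10.5], [9.6, 10.0]])
      print(f"ICC(2,1) = {icc2_1(ratings):.3f}")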

  18. How Mentoring Relationships Evolve: A Longitudinal Study of Academic Pediatricians in a Physician Educator Faculty Development Program

    ERIC Educational Resources Information Center

    Balmer, Dorene; D'Alessandro, Donna; Risko, Wanessa; Gusic, Maryellen E.

    2011-01-01

    Introduction: Mentoring is increasingly recognized as central to career development. Less attention has been paid, however, to how mentoring relationships evolve over time. To provide a more complete picture of these complex relationships, the authors explored mentoring from a mentee's perspective within the context of a three-year faculty…

  19. The Unidimensional Relationship Closeness Scale (URCS): Reliability and Validity Evidence for a New Measure of Relationship Closeness

    ERIC Educational Resources Information Center

    Dibble, Jayson L.; Levine, Timothy R.; Park, Hee Sun

    2012-01-01

    A fundamental dimension along which all social and personal relationships vary is closeness. The Unidimensional Relationship Closeness Scale (URCS) is a 12-item self-report scale measuring the closeness of social and personal relationships. The reliability and validity of the URCS were assessed with college dating couples (N = 192), female friends…

  20. Reliability assurance program and its relationship to other regulations

    SciTech Connect

    Polich, T.J.

    1994-12-31

    The need for a safety-oriented reliability effort for the nuclear industry was identified by the U.S. Nuclear Regulatory Commission (NRC) in the Three Mile Island Action Plan (NUREG-0660) Item II.C.4. In SECY-89-013, "Design Requirements Related to the Evolutionary ALWR," the staff stated that the reliability assurance program (RAP) would be required for design certification to ensure that the design reliability of safety-significant structures, systems, and components (SSCs) is maintained over the life of a plant. In November 1988, the staff informed the advanced light water reactor (ALWR) vendors and the Electric Power Research Institute (EPRI) that it was considering this matter. Since that time, the staff has had numerous interactions with industry regarding RAP. These include discussions and subsequent safety evaluation reports on the EPRI utilities requirements document and for both Evolutionary Designs. The RAP has also been discussed in SECY-93-087, "Policy, Technical, and Licensing Issues Pertaining to Evolutionary and Advanced Light-Water Reactor (ALWR) Designs" and SECY-94-084, "Policy and Technical Issues Associated With the Regulatory Treatment of Non-Safety Systems in Passive Plant Designs."

  1. The relationship between reliability and bonding techniques in hybrid microcircuits

    NASA Technical Reports Server (NTRS)

    Caruso, S. V.; Kinser, D. L.; Graff, S. M.; Allen, R. V.

    1975-01-01

    Differential thermal expansion was shown to be responsible for many observed failures in ceramic chip capacitors mounted on alumina substrates. It is shown that the mounting techniques used in bonding the capacitors have a marked effect upon the thermally induced mechanical stress and thus the failure rate. A mathematical analysis was conducted of a composite model of the capacitor-substrate system to predict the magnitude of thermally induced stresses. It was experimentally observed that the stresses in more compliant bonding systems such as soft lead tin and indium solders are significantly lower than those in hard solder and epoxy systems. The marked dependence upon heating and cooling rate was proven to be a determining factor in the prediction of failure in solder systems. It was found that the harder or higher melting solders are less susceptible to thermal cycling effects but that they are more likely to fail during initial processing operations. Strain gage techniques were used to determine thermally induced expansion stresses of the capacitors and the alumina substrates. The compliance of the different bonding media was determined. From the data obtained, several recommendations are made concerning the optimum bonding system for the achievement of maximum reliability.
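
    The paper's composite stress model is not reproduced in this record. As a first-order, hedged illustration only, thermally induced stress from an expansion mismatch in a fully constrained joint is often estimated as sigma ~ E * delta_alpha * delta_T; the material values in the Python sketch below are invented and this is not the paper's model.

      # First-order thermal-mismatch stress estimate (illustrative values only).

      def mismatch_stress(e_modulus_pa: float, d_alpha_per_k: float, d_t_k: float) -> float:
          """sigma ~= E * delta_alpha * delta_T for a fully constrained bond line."""
          return e_modulus_pa * d_alpha_per_k * d_t_k

      # Hypothetical values: ceramic chip capacitor on an alumina substrate.
      E_chip  = 120e9   # Pa, assumed effective modulus
      d_alpha = 3e-6    # 1/K, assumed CTE mismatch between chip and substrate
      d_T     = 80.0    # K, assumed temperature excursion

      sigma = mismatch_stress(E_chip, d_alpha, d_T)
      print(f"Estimated thermally induced stress: {sigma / 1e6:.1f} MPa")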

  2. Animals Used in Research and Education, 1966-2016: Evolving Attitudes, Policies, and Relationships.

    PubMed

    Lairmore, Michael D; Ilkiw, Jan

    2015-01-01

    Since the inception of the Association of American Veterinary Medical Colleges (AAVMC), the use of animals in research and education has been a central element of the programs of member institutions. As veterinary education and research programs have evolved over the past 50 years, so too have societal views and regulatory policies. AAVMC member institutions have continually responded to these events by exchanging best practices in training their students in the framework of comparative medicine and the needs of society. Animals provide students and faculty with the tools to learn the fundamental knowledge and skills of veterinary medicine and scientific discovery. The study of animal models has contributed extensively to medicine, veterinary medicine, and basic sciences as these disciplines seek to understand life processes. Changing societal views over the past 50 years have provided active examination and continued refinement of the use of animals in veterinary medical education and research. The future use of animals to educate and train veterinarians will likely continue to evolve as technological advances are applied to experimental design and educational systems. Natural animal models of both human and animal health will undoubtedly continue to serve a significant role in the education of veterinarians and in the development of new treatments of animal and human disease. As it looks to the future, the AAVMC as an organization will need to continue to support and promote best practices in the humane care and appropriate use of animals in both education and research. PMID:26673210

  3. On the Relationships between Generative Encodings, Regularity, and Learning Abilities when Evolving Plastic Artificial Neural Networks

    PubMed Central

    Tonelli, Paul; Mouret, Jean-Baptiste

    2013-01-01

    A major goal of bio-inspired artificial intelligence is to design artificial neural networks with abilities that resemble those of animal nervous systems. It is commonly believed that two keys for evolving nature-like artificial neural networks are (1) the developmental process that links genes to nervous systems, which enables the evolution of large, regular neural networks, and (2) synaptic plasticity, which allows neural networks to change during their lifetime. So far, these two topics have been mainly studied separately. The present paper shows that they are actually deeply connected. Using a simple operant conditioning task and a classic evolutionary algorithm, we compare three ways to encode plastic neural networks: a direct encoding, a developmental encoding inspired by computational neuroscience models, and a developmental encoding inspired by morphogen gradients (similar to HyperNEAT). Our results suggest that using a developmental encoding could improve the learning abilities of evolved, plastic neural networks. Complementary experiments reveal that this result is likely the consequence of the bias of developmental encodings towards regular structures: (1) in our experimental setup, encodings that tend to produce more regular networks yield networks with better general learning abilities; (2) whatever the encoding is, networks that are the more regular are statistically those that have the best learning abilities. PMID:24236099
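
    The encodings compared in the paper are not reproduced here. As a loose, hedged sketch of the shared ingredient only, synaptic weights that change during the network's lifetime under a Hebbian rule (w += eta * pre * post), the toy Python example below shows a response strengthening with repeated presentation; it is not the paper's experimental setup or any of its encodings.

      import numpy as np

      rng = np.random.default_rng(0)

      # A tiny plastic network: one input layer, one output unit.
      # The weights are the "genes"; the learning rate could itself be evolved.
      weights = rng.normal(0.0, 0.1, size=3)
      eta = 0.05  # plasticity (Hebbian learning rate), assumed fixed here

      def step(x: np.ndarray) -> float:
          """Forward pass followed by a Hebbian weight update (w += eta * pre * post)."""
          global weights
          y = np.tanh(weights @ x)
          weights = weights + eta * x * y   # lifetime plasticity
          return y

      # Hypothetical "lifetime": repeatedly present a stimulus; the response to it
      # strengthens, the kind of learning ability such encodings are scored on.
      stimulus = np.array([1.0, 0.5, -0.5])
      responses = [step(stimulus) for _ in range(20)]
      print(f"first response {responses[0]:+.3f}, last response {responses[-1]:+.3f}")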

  4. The Evolving Symbiotic Relationship of Arts Education and U.S. Business.

    ERIC Educational Resources Information Center

    Sterling, Carol

    1995-01-01

    Proposes an extension of the arts education/business community relationship moving beyond issues of patronage and support. Maintains that the complexity of the 21st-century economy and society will be well served by the flexibility and creativity manifested in arts education. Recommends national goals and standards for arts education. (MJP)

  5. Suprafamilial relationships among Rodentia and the phylogenetic effect of removing fast-evolving nucleotides in mitochondrial, exon and intron fragments

    PubMed Central

    2008-01-01

    Background: The number of rodent clades identified above the family level is contentious, and to date, no consensus has been reached on the basal evolutionary relationships among all rodent families. Rodent suprafamilial phylogenetic relationships are investigated in the present study using ~7600 nucleotide characters derived from two mitochondrial genes (Cytochrome b and 12S rRNA), two nuclear exons (IRBP and vWF) and four nuclear introns (MGF, PRKC, SPTBN, THY). Because increasing the number of nucleotides does not necessarily increase phylogenetic signal (especially if the data is saturated), we assess the potential impact of saturation for each dataset by removing the fastest-evolving positions that have been recognized as sources of inconsistencies in phylogenetics. Results: Taxonomic sampling included multiple representatives of all five rodent suborders described. Fast-evolving positions for each dataset were identified individually using a discrete gamma rate category and sites belonging to the most rapidly evolving eighth gamma category were removed. Phylogenetic tree reconstructions were performed on individual and combined datasets using Parsimony, Bayesian, and partitioned Maximum Likelihood criteria. Removal of fast-evolving positions enhanced the phylogenetic signal to noise ratio but the improvement in resolution was not consistent across different data types. The results suggested that elimination of fastest sites only improved the support for nodes moderately affected by homoplasy (the deepest nodes for introns and more recent nodes for exons and mitochondrial genes). Conclusion: The present study based on eight DNA fragments supports a fully resolved higher level rodent phylogeny with moderate to significant nodal support. Two inter-suprafamilial associations emerged. The first comprised a monophyletic assemblage containing the Anomaluromorpha (Anomaluridae + Pedetidae) + Myomorpha (Muridae + Dipodidae) as sister clade to the Castorimorpha
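
    The site-removal step described above (dropping alignment columns assigned to the fastest discrete-gamma rate category) can be sketched as follows; the per-site categories are assumed to have been estimated beforehand with a rate model, and the toy alignment is invented, not the study's data.

      # Remove alignment columns assigned to the fastest-evolving rate category.
      # Site categories (1 = slowest ... 8 = fastest) are assumed to come from a
      # prior discrete-gamma model fit; the values here are invented.

      alignment = {
          "taxon_A": "ACGTACGTACGT",
          "taxon_B": "ACGTTCGTACGA",
          "taxon_C": "ACGAACGTTCGT",
      }
      site_category = [1, 3, 8, 2, 5, 8, 1, 4, 2, 8, 3, 6]  # one entry per column
      FASTEST = 8

      keep = [i for i, cat in enumerate(site_category) if cat != FASTEST]
      filtered = {name: "".join(seq[i] for i in keep)
                  for name, seq in alignment.items()}

      for name, seq in filtered.items():
          print(name, seq)   # columns 3, 6 and 10 (1-based) have been removed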

  6. Evolving Relationship Structures in Multi-sourcing Arrangements: The Case of Mission Critical Outsourcing

    NASA Astrophysics Data System (ADS)

    Heitlager, Ilja; Helms, Remko; Brinkkemper, Sjaak

    Information Technology Outsourcing practice and research mainly considers the outsourcing phenomenon as a generic fulfilment of the IT function by external parties. Inspired by the logic of commodity, core competencies and economies of scale; assets, existing departments and IT functions are transferred to external parties. Although the generic approach might work for desktop outsourcing, where standardisation is the dominant factor, it does not work for the management of mission critical applications. Managing mission critical applications requires a different approach where building relationships is critical. The relationships involve inter and intra organisational parties in a multi-sourcing arrangement, called an IT service chain, consisting of multiple (specialist) parties that have to collaborate closely to deliver high quality services.

  7. A Proposed New "What If Reliability" Analysis for Assessing the Statistical Significance of Bivariate Relationships.

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Daniel, Larry G.; Roberts, J. Kyle

    The purpose of this paper is to illustrate how displaying disattenuated correlation coefficients along with their unadjusted counterparts will allow the reader to assess the impact of unreliability on each bivariate relationship. The paper also demonstrates how a proposed new "what if reliability" analysis can complement the conventional null…

  8. A Proposed New "What if Reliability" Analysis for Assessing the Statistical Significance of Bivariate Relationships

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Roberts, J. Kyle; Daniel, Larry G.

    2005-01-01

    In this article, the authors (a) illustrate how displaying disattenuated correlation coefficients alongside their unadjusted counterparts will allow researchers to assess the impact of unreliability on bivariate relationships and (b) demonstrate how a proposed new "what if reliability" analysis can complement null hypothesis significance tests of…
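
    The "what if reliability" idea builds on the classical correction for attenuation, r_true = r_xy / sqrt(r_xx * r_yy). The small Python sketch below applies that formula to made-up values; the numbers are not from the article.

      from math import sqrt

      def disattenuated(r_xy: float, r_xx: float, r_yy: float) -> float:
          """Classical correction for attenuation: the correlation the two measures
          would show if both were perfectly reliable."""
          return r_xy / sqrt(r_xx * r_yy)

      # Hypothetical observed correlation and score reliabilities.
      r_obs = 0.35
      print(f"observed r = {r_obs:.2f}")
      for rel_x, rel_y in [(0.9, 0.9), (0.8, 0.7), (0.6, 0.6)]:
          print(f"  if r_xx={rel_x}, r_yy={rel_y}: disattenuated r = "
                f"{disattenuated(r_obs, rel_x, rel_y):.2f}")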

  9. Assessing the Complex and Evolving Relationship between Charges and Payments in US Hospitals: 1996 – 2012

    PubMed Central

    Bulchis, Anne G.; Lomsadze, Liya; Joseph, Jonathan; Baral, Ranju; Bui, Anthony L.; Horst, Cody; Johnson, Elizabeth; Dieleman, Joseph L.

    2016-01-01

    Background: In 2013 the United States spent $2.9 trillion on health care, more than in any previous year. Much of the debate around slowing health care spending growth focuses on the complicated pricing system for services. Our investigation contributes to knowledge of health care spending by assessing the relationship between charges and payments in the inpatient hospital setting. In the US, charges and payments differ because of a complex set of incentives that connect health care providers and funders. Our methodology can also be applied to adjust charge data to reflect actual spending. Methods: We extracted cause of health care encounter (cause), primary payer (payer), charge, and payment information for 50,172 inpatient hospital stays from 1996 through 2012. We used linear regression to assess the relationship between charges and payments, stratified by payer, year, and cause. We applied our estimates to a large, nationally representative hospital charge sample to estimate payments. Results: The average amount paid per $1 charged varies significantly across three dimensions: payer, year, and cause. Among the 10 largest causes of health care spending, average payments range from 23 to 55 cents per dollar charged. Over time, the amount paid per dollar charged is decreasing for those with private or public insurance, signifying that inpatient charges are increasing faster than the amount insurers pay. Conversely, the amount paid by out-of-pocket payers per dollar charged is increasing over time for several causes. Applying our estimates to a nationally representative hospital charge sample generates payment estimates which align with the official US estimates of inpatient spending. Conclusions: The amount paid per $1 charged fluctuates significantly depending on the cause of a health care encounter and the primary payer. In addition, the amount paid per charge is changing over time. Transparent accounting of hospital spending requires a detailed assessment of the
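
    The central quantity above, the amount paid per dollar charged, can be summarized within a payer/year/cause stratum by regressing payments on charges. The Python sketch below uses a through-the-origin least-squares slope on fabricated records; the paper's exact regression specification and data are not reproduced here.

      import numpy as np

      # Fabricated inpatient records: (charge, payment) pairs for one stratum.
      charges  = np.array([12000.0, 8500.0, 23000.0, 5400.0, 31000.0, 15800.0])
      payments = np.array([ 4300.0, 3100.0,  7900.0, 2100.0, 10400.0,  5600.0])

      # Slope of a through-the-origin regression = payment per dollar charged.
      slope = np.sum(charges * payments) / np.sum(charges ** 2)
      print(f"Estimated payment per $1 charged in this stratum: ${slope:.2f}")

      # Applying the estimate to adjust a charge-only sample into payments.
      new_charges = np.array([9900.0, 27500.0])
      print("Predicted payments:", np.round(slope * new_charges, 2))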

  10. The relationship between unstandardized and standardized alpha, true reliability, and the underlying measurement model.

    PubMed

    Falk, Carl F; Savalei, Victoria

    2011-01-01

    Popular computer programs print 2 versions of Cronbach's alpha: unstandardized alpha, α(Σ), based on the covariance matrix, and standardized alpha, α(R), based on the correlation matrix. Sources that accurately describe the theoretical distinction between the 2 coefficients are lacking, which can lead to the misconception that the differences between α(R) and α(Σ) are unimportant and to the temptation to report the larger coefficient. We explore the relationship between α(R) and α(Σ) and the reliability of the standardized and unstandardized composite under 3 popular measurement models; we clarify the theoretical meaning of each coefficient and conclude that researchers should choose an appropriate reliability coefficient based on theoretical considerations. We also illustrate that α(R) and α(Σ) estimate the reliability of different composite scores, and in most cases cannot be substituted for one another. PMID:21859284
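
    Both coefficients discussed above can be computed directly: unstandardized alpha uses the item covariance matrix (alpha = k/(k-1) * (1 - sum of item variances / sum of all covariance elements)) and standardized alpha uses the correlation matrix in its place. The numpy sketch below uses simulated item scores with unequal scales so the two values diverge; the data are invented, not from the article.

      import numpy as np

      def alpha_unstandardized(items: np.ndarray) -> float:
          """Cronbach's alpha from the item covariance matrix (alpha_Sigma)."""
          k = items.shape[1]
          cov = np.cov(items, rowvar=False)
          return k / (k - 1) * (1 - np.trace(cov) / cov.sum())

      def alpha_standardized(items: np.ndarray) -> float:
          """Cronbach's alpha from the item correlation matrix (alpha_R)."""
          k = items.shape[1]
          corr = np.corrcoef(items, rowvar=False)
          return k / (k - 1) * (1 - k / corr.sum())

      rng = np.random.default_rng(1)
      true_score = rng.normal(size=500)
      # Items with unequal scales, so the two coefficients diverge.
      items = np.column_stack([true_score * s + rng.normal(scale=s, size=500)
                               for s in (1.0, 2.0, 4.0)])
      print(f"alpha(Sigma) = {alpha_unstandardized(items):.3f}")
      print(f"alpha(R)     = {alpha_standardized(items):.3f}")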

  11. Performance and reliability of empirical mobility relationships for the prediction of Debris Flow inundated areas

    NASA Astrophysics Data System (ADS)

    Simoni, Alessandro; Berti, Matteo; Mammoliti, Maria

    2010-05-01

    Empirical mobility relationships can be used for preliminary debris flow (DF) hazard assessment. An adaptation of the original relationships has been proposed for alpine debris flows (DFLOWZ model; Berti and Simoni, 2007). Once a reference debris flow volume is chosen, the code DFLOWZ allows estimation of the area potentially affected by the event based on the mutual relationships between channel cross-sectional area, planimetric area of the deposit and overall volume. We back-analyzed 25 DF events that occurred in the Bolzano province (Italy), ranging in volume from 3,000 to 300,000 m3, and evaluated the performance of the automated method through an objective reliability index. Our aims are to (i) evaluate the effects of uncertainty associated with the empirical mobility relationships, and (ii) assess other possible sources of error or violations of the assumptions that underlie the model. Results indicate that a high-resolution DEM (≤ 2.5 m) is essential to get a reliable inundation prediction over a fan. The code itself performs well, in a wide range of situations, demonstrating the conceptual correctness of underlying assumptions. The most relevant source of error remains the uncertainty associated with the empirical mobility relationships, due mainly to errors in volume measurements of DF deposits. Their improvement can be achieved through the collection of high quality field data of DF events.
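
    The mobility relationships behind DFLOWZ are power laws linking inundated cross-sectional area A and planimetric area B to event volume V, commonly written A = c1 * V^(2/3) and B = c2 * V^(2/3). The Python sketch below evaluates them for the volume range back-analyzed above; the coefficients are placeholders for illustration, not the calibrated alpine values from Berti and Simoni (2007).

      # Empirical debris-flow mobility relationships (semi-empirical power laws).
      # Coefficients below are placeholders, not the values used in DFLOWZ.

      def mobility(volume_m3: float, c_cross: float = 0.1, c_plan: float = 20.0):
          """Return (cross-sectional area, planimetric area) in m^2 for a volume in m^3."""
          a_cross = c_cross * volume_m3 ** (2.0 / 3.0)
          b_plan = c_plan * volume_m3 ** (2.0 / 3.0)
          return a_cross, b_plan

      for v in (3_000.0, 30_000.0, 300_000.0):
          a, b = mobility(v)
          print(f"V = {v:>9,.0f} m^3 -> A ~ {a:>7,.0f} m^2, B ~ {b:>9,.0f} m^2")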

  12. Structural and reliability analysis of quality of relationship index in cancer patients.

    PubMed

    Cousson-Gélie, Florence; de Chalvron, Stéphanie; Zozaya, Carole; Lafaye, Anaïs

    2013-01-01

    Among psychosocial factors affecting emotional adjustment and quality of life, social support is one of the most important and widely studied in cancer patients, but little is known about the perception of support in specific significant relationships in patients with cancer. This study examined the psychometric properties of the Quality of Relationship Inventory (QRI) by evaluating its factor structure and its convergent and discriminant validity in a sample of cancer patients. A total of 388 patients completed the QRI. Convergent validity was evaluated by testing the correlations between the QRI subscales and measures of general social support, anxiety and depression symptoms. Discriminant validity was examined by testing group comparison. The QRI's longitudinal invariance across time was also tested. Principal axis factor analysis with promax rotation identified three factors accounting for 42.99% of variance: perceived social support, depth, and interpersonal conflict. Estimates of reliability with McDonald's ω coefficient were satisfactory for all the QRI subscales (ω ranging from 0.75 - 0.85). Satisfaction from general social support was negatively correlated with the interpersonal conflict subscale and positively with the depth subscale. The interpersonal conflict and social support scales were correlated with depression and anxiety scores. We also found a relative stability of QRI subscales (measured 3 months after the first evaluation) and differences between partner status and gender groups. The Quality of Relationship Inventory is a valid tool for assessing the quality of social support in a particular relationship with cancer patients. PMID:23514252

  13. Impact of relationships between test and training animals and among training animals on reliability of genomic prediction.

    PubMed

    Wu, X; Lund, M S; Sun, D; Zhang, Q; Su, G

    2015-10-01

    One of the factors affecting the reliability of genomic prediction is the relationship among the animals of interest. This study investigated the reliability of genomic prediction in various scenarios with regard to the relationship between test and training animals, and among animals within the training data set. Different training data sets were generated from EuroGenomics data and a group of Nordic Holstein bulls (born in 2005 and afterwards) as a common test data set. Genomic breeding values were predicted using a genomic best linear unbiased prediction model and a Bayesian mixture model. The results showed that a closer relationship between test and training animals led to a higher reliability of genomic predictions for the test animals, while a closer relationship among training animals resulted in a lower reliability. In addition, the Bayesian mixture model in general led to a slightly higher reliability of genomic prediction, especially for the scenario of distant relationships between training and test animals. Therefore, to prevent a decrease in reliability, constant updates of the training population with animals from more recent generations are required. Moreover, a training population consisting of less-related animals is favourable for reliability of genomic prediction. PMID:26010512
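
    The genomic BLUP model referred to above is, in matrix form, y = Xb + Zg + e with g ~ N(0, G * sigma_g^2), solved through the mixed-model equations. The numpy sketch below is a generic GBLUP on simulated genotypes (VanRaden G matrix, assumed variance ratio); it is not the authors' EuroGenomics implementation, nor the Bayesian mixture model they also used.

      import numpy as np

      rng = np.random.default_rng(2)

      # Simulated data: n animals, m SNP markers coded 0/1/2.
      n, m = 60, 300
      geno = rng.binomial(2, 0.3, size=(n, m)).astype(float)
      p = geno.mean(axis=0) / 2.0

      # VanRaden genomic relationship matrix.
      Z = geno - 2.0 * p
      G = Z @ Z.T / (2.0 * np.sum(p * (1.0 - p)))
      G += np.eye(n) * 1e-3                     # small ridge for invertibility

      # Simulate phenotypes from a few causal markers plus noise.
      beta = np.zeros(m)
      beta[rng.choice(m, 20, replace=False)] = rng.normal(size=20)
      y = geno @ beta + rng.normal(scale=2.0, size=n)

      # GBLUP via mixed-model equations with an assumed variance ratio lambda.
      lam = 1.0                                 # sigma_e^2 / sigma_g^2, assumed
      X = np.ones((n, 1))
      lhs = np.block([[X.T @ X, X.T],
                      [X,       np.eye(n) + lam * np.linalg.inv(G)]])
      rhs = np.concatenate([X.T @ y, y])
      sol = np.linalg.solve(lhs, rhs)
      gebv = sol[1:]                            # genomic estimated breeding values
      print("correlation(GEBV, true genetic value):",
            round(float(np.corrcoef(gebv, geno @ beta)[0, 1]), 3))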

  14. Palmar Creases: Classification, Reliability and Relationships to Fetal Alcohol Spectrum Disorders (FASD).

    PubMed

    Mattison, Siobhán M; Brunson, Emily K; Holman, Darryl J

    2015-09-01

    A normal human palm contains 3 major creases: the distal transverse crease; the proximal transverse crease; and the thenar crease. Because permanent crease patterns are thought to be laid down during the first trimester, researchers have speculated that deviations in crease patterns could be indicative of insults during fetal development. The purpose of this study was twofold: (1) to compare the efficacy and reliability of two coding methods, the first (M1) classifying both "simiana" and Sydney line variants and the second (M2) counting the total number of crease points of origin on the radial border of the hand; and (2) to ascertain the relationship between palmar crease patterns and fetal alcohol spectrum disorders (FASD). Bilateral palm prints were taken using the carbon paper and tape method from 237 individuals diagnosed with FASD and 190 unexposed controls. All prints were coded for crease variants under M1 and M2. Additionally, a random sample of 98 matched (right and left) prints was selected from the controls to determine the reliabilities of M1 and M2. For this analysis, each palm was read twice, at different times, by two readers. Intra-observer Kappa coefficients were similar under both methods, ranging from 0.804-0.910. Inter-observer Kappa coefficients ranged from 0.582-0.623 under M1 and from 0.647-0.757 under M2. Using data from the entire sample of 427 prints and controlling for sex and ethnicity (white v. non-white), no relationship was found between palmar crease variants and FASD. Our results suggest that palmar creases can be classified reliably, but palmar crease patterns may not be affected by fetal alcohol exposure. PMID:26898079
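
    The intra- and inter-observer agreement statistics quoted above are Cohen's kappa coefficients. A minimal Python implementation on invented codings (not the study's print readings) is shown below; treating the M2 counts as categorical codes is an assumption made for the example.

      from collections import Counter

      def cohens_kappa(rater1, rater2) -> float:
          """Cohen's kappa for two raters assigning categorical codes to the same items."""
          assert len(rater1) == len(rater2)
          n = len(rater1)
          observed = sum(a == b for a, b in zip(rater1, rater2)) / n
          c1, c2 = Counter(rater1), Counter(rater2)
          expected = sum(c1[cat] * c2[cat] for cat in set(c1) | set(c2)) / (n * n)
          return (observed - expected) / (1.0 - expected)

      # Hypothetical crease codings under method M2 (points of origin, radial border).
      reader_a = [2, 3, 2, 2, 3, 2, 4, 2, 3, 2]
      reader_b = [2, 3, 2, 3, 3, 2, 4, 2, 2, 2]
      print(f"kappa = {cohens_kappa(reader_a, reader_b):.3f}")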

  15. Fold and fabric relationships in temporally and spatially evolving slump systems: A multi-cell flow model

    NASA Astrophysics Data System (ADS)

    Alsop, G. Ian; Marco, Shmuel

    2014-06-01

    Folds generated in ductile metamorphic terranes and within unlithified sediments affected by slumping are geometrically identical to one another, and distinguishing the origin of such folds in ancient lithified rocks is therefore challenging. Foliation is observed to lie broadly parallel to the axial planes of tectonic folds, whilst it is frequently regarded as absent in slump folds. The presence of foliation is therefore often considered as a reliable criterion for distinguishing tectonic folds from those created during slumping. To test this assertion, we have examined a series of well exposed slump folds within the late Pleistocene Lisan Formation of the Dead Sea Basin. These slumps contain a number of different foliation types, including an axial-planar grain-shape fabric and a crenulation cleavage formed via microfolding of bedding laminae. Folds also contain a spaced disjunctive foliation characterised by extensional displacements across shear fractures. This spaced foliation fans around recumbent fold hinges, with kinematics reversing across the axial plane indicating a flexural shear fold mechanism. Overall, the spaced foliation is penecontemporaneous with each individual slump where it occurs, although in detail it is pre, syn or post the local folds. The identification of foliations within undoubted slump folds indicates that the presence or absence of foliation is not in itself a robust criterion to distinguish tectonic from soft-sediment folds. Extensional shear fractures displaying a range of temporal relationships with slump folds suggests that traditional single-cell flow models, where extension is focussed at the head and contraction in the lower toe of the slump, are a gross simplification. We therefore propose a new multi-cell flow model involving coeval second-order flow cells that interact with neighbouring cells during translation of the slump.

  16. Food Thought Suppression Inventory: Test-retest reliability and relationship to weight loss treatment outcomes.

    PubMed

    Barnes, Rachel D; Ivezaj, Valentina; Grilo, Carlos M

    2016-08-01

    This study examined the test-retest reliability of the Food Thought Suppression Inventory (FTSI) and its relationship with weight loss during weight loss treatment. Participants were 89 adults with and without binge eating disorder (BED) recruited through primary care for weight loss treatment who completed the FTSI twice prior to starting treatment. Intra-class correlations for the FTSI ranged from .74-.93. Participants with BED scored significantly higher on the FTSI than those without BED at baseline only. Percent weight loss from baseline to mid-treatment was significantly negatively correlated with the FTSI at baseline and at post-treatment. Participants reaching 5% loss of original body weight by post-treatment had significantly lower FTSI scores at post assessment when compared to those who did not reach this weight loss goal. While baseline binge-eating episodes were significantly positively correlated with baseline FTSI scores, change in binge-eating episodes during treatment were not significantly related to FTSI scores. The FTSI showed satisfactory one week test-retest reliability. Higher levels of food thought suppression may impair individuals' ability to lose weight while receiving weight loss treatment. PMID:27112114

  17. Establishing a Reliable Depth-Age Relationship for the Denali Ice Core

    NASA Astrophysics Data System (ADS)

    Wake, C. P.; Osterberg, E. C.; Winski, D.; Ferris, D.; Kreutz, K. J.; Introne, D.; Dalton, M.

    2015-12-01

    Reliable climate reconstruction from ice core records requires the development of a reliable depth-age relationship. We have established a sub-annual resolution depth-age relationship for the upper 198 meters of a 208 m ice core recovered in 2013 from Mt. Hunter (3,900 m asl), Denali National Park, central Alaska. The dating of the ice core was accomplished via annual layer counting of glaciochemical time-series combined with identification of reference horizons from volcanic eruptions and atmospheric nuclear weapons testing. Using the continuous ice core melter system at Dartmouth College, sub-seasonal samples have been collected and analyzed for major ions, liquid conductivity, particle size and concentration, and stable isotope ratios. Annual signals are apparent in several of the chemical species measured in the ice core samples. Calcium and magnesium peak in the spring, ammonium peaks in the summer, methanesulfonic acid (MSA) peaks in the autumn, and stable isotopes display a strong seasonal cycle with the most depleted values occurring during the winter. Thin ice layers representing infrequent summertime melt were also used to identify summer layers in the core. Analysis of approximately one meter sections of the core via nondestructive gamma spectrometry over depths from 84 to 124 m identified a strong radioactive cesium-137 peak at 89 m which corresponds to the 1963 layer deposited during extensive atmospheric nuclear weapons testing. Peaks in the sulfate and chloride record have been used for the preliminary identification of volcanic signals preserved in the ice core, including ten events since 1883. We are confident that the combination of robust annual layers combined with reference horizons provides a timescale for the 20th century that has an error of less than 0.5 years, making calibrations between ice core records and the instrumental climate data particularly robust. Initial annual layer counting through the entire 198 m suggests the Denali Ice

  18. Merlino-Perkins Father-Daughter Relationship Inventory (MP-FDI): Construction, Reliability, Validity, and Implications for Counseling and Research

    ERIC Educational Resources Information Center

    Merlino Perkins, Rose J.

    2008-01-01

    The Merlino-Perkins Father-Daughter Relationship Inventory, a self-report instrument, assesses women's childhood interactions with supportive, doting, distant, controlling, tyrannical, physically abusive, absent, and seductive fathers. Item and scale development, psychometric findings drawn from factor analyses, reliability assessments, and…

  19. An Examination of Coach and Player Relationships According to the Adapted LMX 7 Scale: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Caliskan, Gokhan

    2015-01-01

    The current study aims to test the reliability and validity of the Leader-Member Exchange (LMX 7) scale with regard to coach-player relationships in sports settings. A total of 330 professional soccer players from the Turkish Super League as well as from the First and Second Leagues participated in this study. Factor analyses were performed to…

  20. Quality of Relationships between Youth and Community Service Providers: Reliability and Validity of the Trusting Relationship Questionnaire

    ERIC Educational Resources Information Center

    Mustillo, Sarah A.; Dorsey, Shannon; Farmer, Elizabeth M. Z.

    2005-01-01

    We examined the factor structure and psychometric properties of the Trusting Relationship Questionnaire, a brief measure of relationship quality between youth and community-based service providers involved in their care. Data on youth residing in Therapeutic Foster Care and in Group Homes (N = 296) were collected. We identified a one-factor…

  1. The Relationship Quality Interview: Evidence of Reliability, Convergent and Divergent Validity, and Incremental Utility

    ERIC Educational Resources Information Center

    Lawrence, Erika; Barry, Robin A.; Brock, Rebecca L.; Bunde, Mali; Langer, Amie; Ro, Eunyoe; Fazio, Emily; Mulryan, Lorin; Hunt, Sara; Madsen, Lisa; Dzankovic, Sandra

    2011-01-01

    Relationship satisfaction and adjustment have been the target outcome variables for almost all couple research and therapies. In contrast, far less attention has been paid to the assessment of relationship quality. The present study introduces the Relationship Quality Interview (RQI), a semistructured, behaviorally anchored individual interview.…

  2. On the Relationship between Maximal Reliability and Maximal Validity of Linear Composites

    ERIC Educational Resources Information Center

    Penev, Spiridon; Raykov, Tenko

    2006-01-01

    A linear combination of a set of measures is often sought as an overall score summarizing subject performance. The weights in this composite can be selected to maximize its reliability or to maximize its validity, and the optimal choice of weights is in general not the same for these two optimality criteria. We explore several relationships…

  3. Reliable Attention Network Scores and Mutually Inhibited Inter-network Relationships Revealed by Mixed Design and Non-orthogonal Method

    PubMed Central

    Wang, Yi-Feng; Jing, Xiu-Juan; Liu, Feng; Li, Mei-Ling; Long, Zhi-Liang; Yan, Jin H.; Chen, Hua-Fu

    2015-01-01

    The attention system can be divided into alerting, orienting, and executive control networks. The efficiency and independence of attention networks have been widely tested with the attention network test (ANT) and its revised versions. However, many studies have failed to find effects of attention network scores (ANSs) and inter-network relationships (INRs). Moreover, the low reliability of ANSs cannot meet the demands of theoretical and empirical investigations. Two methodological factors (the inter-trial influence in the event-related design and the inter-network interference in orthogonal contrast) may be responsible for the unreliability of ANT. In this study, we combined the mixed design and non-orthogonal method to explore ANSs and directional INRs. With a small number of trials, we obtained reliable and independent ANSs (split-half reliability of alerting: 0.684; orienting: 0.588; and executive control: 0.616), suggesting an individual and specific attention system. Furthermore, mutual inhibition was observed when two networks were operated simultaneously, indicating a differentiated but integrated attention system. Overall, the reliable and individual specific ANSs and mutually inhibited INRs provide novel insight into the understanding of the developmental, physiological and pathological mechanisms of attention networks, and can benefit future experimental and clinical investigations of attention using ANT. PMID:25997025
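
    The split-half reliabilities quoted above come from correlating scores estimated on two halves of the trials, usually stepped up with the Spearman-Brown formula r_full = 2r/(1+r). The Python sketch below applies this to simulated per-subject network effects; the data and the exact splitting scheme are assumptions, not the study's.

      import numpy as np

      def split_half_reliability(odd_scores, even_scores) -> float:
          """Spearman-Brown-corrected correlation between odd- and even-trial scores."""
          r = np.corrcoef(odd_scores, even_scores)[0, 1]
          return 2.0 * r / (1.0 + r)

      rng = np.random.default_rng(3)
      true_effect = rng.normal(40.0, 15.0, size=50)         # per-subject alerting effect, ms
      odd = true_effect + rng.normal(scale=20.0, size=50)    # estimate from odd trials
      even = true_effect + rng.normal(scale=20.0, size=50)   # estimate from even trials
      print(f"split-half reliability = {split_half_reliability(odd, even):.3f}")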

  4. An Adaptation, Validity and Reliability of the Lifespan Sibling Relationship Scale to the Turkish Adolescents

    ERIC Educational Resources Information Center

    Öz, F. Selda

    2015-01-01

    The purpose of this study is to adapt the Lifespan Sibling Relationship Scale (LSRS) developed by Riggio (2000) to Turkish. The scale, in its original English form, consists of 48 items in total. The original scale was translated into Turkish by three instructors who are proficient both in the field and the language. Later, the original and…

  5. Relationship between evolving epileptiform activity and delayed loss of mitochondrial activity after asphyxia measured by near-infrared spectroscopy in preterm fetal sheep

    PubMed Central

    Bennet, L; Roelfsema, V; Pathipati, P; Quaedackers, J S; Gunn, A J

    2006-01-01

    Early onset cerebral hypoperfusion after birth is highly correlated with neurological injury in premature infants, but the relationship with the evolution of injury remains unclear. We studied changes in cerebral oxygenation, and cytochrome oxidase (CytOx) using near-infrared spectroscopy in preterm fetal sheep (103–104 days of gestation, term is 147 days) during recovery from a profound asphyxial insult (n = 7) that we have shown produces severe subcortical injury, or sham asphyxia (n = 7). From 1 h after asphyxia there was a significant secondary fall in carotid blood flow (P < 0.001), and total cerebral blood volume, as reflected by total haemoglobin (P < 0.005), which only partially recovered after 72 h. Intracerebral oxygenation (difference between oxygenated and deoxygenated haemoglobin concentrations) fell transiently at 3 and 4 h after asphyxia (P < 0.01), followed by a substantial increase to well over sham control levels (P < 0.001). CytOx levels were normal in the first hour after occlusion, was greater than sham control values at 2–3 h (P < 0.05), but then progressively fell, and became significantly suppressed from 10 h onward (P < 0.01). In the early hours after reperfusion the fetal EEG was highly suppressed, with a superimposed mixture of fast and slow epileptiform transients; overt seizures developed from 8 ± 0.5 h. These data strongly indicate that severe asphyxia leads to delayed, evolving loss of mitochondrial oxidative metabolism, accompanied by late seizures and relative luxury perfusion. In contrast, the combination of relative cerebral deoxygenation with evolving epileptiform transients in the early recovery phase raises the possibility that these early events accelerate or worsen the subsequent mitochondrial failure. PMID:16484298

  6. The Inventory of Teacher-Student Relationships: Factor Structure, Reliability, and Validity among African American Youth in Low-Income Urban Schools

    ERIC Educational Resources Information Center

    Murray, Christopher; Zvoch, Keith

    2011-01-01

    This study investigates the factor structure, reliability, and validity of the Inventory of Teacher-Student Relationships (IT-SR), a measure that was developed by adapting the widely used Inventory of Parent and Peer Attachments (Armsden & Greenberg, 1987) for use in the context of teacher-student relationships. The instrument was field tested…

  7. The Bindex(®) ultrasound device: reliability of cortical bone thickness measures and their relationship to regional bone mineral density.

    PubMed

    Behrens, Martin; Felser, Sabine; Mau-Moeller, Anett; Weippert, Matthias; Pollex, Johannes; Skripitz, Ralf; Herlyn, Philipp K E; Fischer, Dagmar-C; Bruhn, Sven; Schober, Hans-Christof; Zschorlich, Volker; Mittlmeier, Thomas

    2016-09-01

    The Bindex(®) quantitative ultrasound (QUS) device is currently available and this study analyzed (I) its relative and absolute intra- and inter-session reliability and (II) the relationship between the data provided by Bindex(®)-QUS and the bone mineral density (BMD) measured by dual-energy x-ray absorptiometry at corresponding skeletal sites in young and healthy subjects (age: 25.0  ±  3.6 years). Bindex(®)-QUS calculates a density index on the basis of the thickness of cortical bone measured at the distal radius and the distal plus proximal tibia. The data show a very good relative and absolute intra- (ICC  =  0.977, CV  =  1.5%) and inter-session reliability (ICC  =  0.978, CV  =  1.4%) for the density index. The highest positive correlations were found between cortical thickness and BMD for the distal radius and distal tibia (r  ⩾  0.71, p  <  0.001). The data indicate that the Bindex(®)-QUS parameters are repeatable within and between measurement sessions. Furthermore, the measurements reflect the BMD at specific skeletal sites. Bindex(®)-QUS might be a useful tool for the measurement of skeletal adaptations. PMID:27511629
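
    For orientation only: the ICC and CV figures quoted in this record are standard repeated-measures statistics. A minimal Python sketch of how such values are commonly computed from a subjects x trials matrix follows; this is not the authors' analysis code, and the ICC(2,1) formulation and function names are assumptions.

    ```python
    import numpy as np

    def icc_2_1(x):
        """Two-way random-effects, absolute-agreement ICC(2,1) for a subjects x trials matrix."""
        x = np.asarray(x, dtype=float)
        n, k = x.shape
        grand = x.mean()
        ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between-subject variation
        ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between-trial (session) variation
        ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
        ms_rows, ms_cols = ss_rows / (n - 1), ss_cols / (k - 1)
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    def cv_percent(x):
        """Within-subject coefficient of variation: RMS of per-subject SDs over the grand mean, in %."""
        x = np.asarray(x, dtype=float)
        within_sd = np.sqrt((x.std(axis=1, ddof=1) ** 2).mean())
        return 100.0 * within_sd / x.mean()
    ```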

  8. [Relationship between hope and subjective well-being: reliability and validity of the dispositional Hope Scale, Japanese version].

    PubMed

    Kato, Tsukasa; Snyder, C R

    2005-08-01

    We conducted three studies to translate the Snyder Hope Scale into Japanese, examine the reliability and validity of the Japanese version, and investigate the relationship between the tendency to be hopeful and subjective well-being. In Study 1, confirmatory factor analysis supported the two factors of the Japanese version of the Hope Scale: agency and pathways. Its test-retest reliability coefficients for the data from 113 undergraduates ranged from .81 to .84. In Study 2, concurrent validity of the Japanese version of the Hope Scale was examined with data from 550 respondents, looking at the correlations between hope and optimism, self-esteem, and self-efficacy. Results suggested that the Japanese version had high validity. In addition, the tendency to be hopeful had negative correlations with stress response, hopelessness, depressive tendency, and trait anxiety, and a positive correlation with feelings of happiness. In Study 3, 175 undergraduates completed the Hope Scale and State-Trait Anxiety Inventory (STAI) immediately prior to final examinations. Results of regression analysis suggested that the tendency to be hopeful moderated examination anxiety. Taken together, results of the studies supported the hypothesis that hope had positive effects on subjective well-being. PMID:16200877

  9. Self Evolving Modular Network

    NASA Astrophysics Data System (ADS)

    Tokunaga, Kazuhiro; Kawabata, Nobuyuki; Furukawa, Tetsuo

    We propose a novel modular network called the Self-Evolving Modular Network (SEEM). The SEEM has a modular network architecture with a graph structure and the following advantages: (1) new modules are added incrementally, allowing the network to adapt in a self-organizing manner, and (2) graph paths are formed based on the relationships between the models represented by the modules. The SEEM is expected to be applicable to evolving the functions of an autonomous robot in a self-organizing manner through interaction with the robot's environment and to categorizing large-scale information. This paper presents the architecture and an algorithm for the SEEM. Moreover, the performance characteristics and effectiveness of the network are shown by simulations using cubic functions and a set of 3D objects.

  10. How reliable are randomised controlled trials for studying the relationship between diet and disease? A narrative review.

    PubMed

    Temple, Norman J

    2016-08-01

    Large numbers of randomised controlled trials (RCT) have been carried out in order to investigate diet-disease relationships. This article examines eight sets of studies and compares the findings with those from epidemiological studies (cohort studies in seven of the cases). The studies cover the role of dietary factors in blood pressure, body weight, cancer and heart disease. In some cases, the findings from the two types of study are consistent, whereas in other cases the findings appear to be in conflict. A critical evaluation of this evidence suggests factors that may account for conflicting findings. Very often RCT recruit subjects with a history of the disease under study (or at high risk of it) and have a follow-up of only a few weeks or months. Cohort studies, in contrast, typically recruit healthy subjects and have a follow-up of 5-15 years. Owing to these differences, findings from RCT are not necessarily more reliable than those from well-designed prospective cohort studies. We cannot assume that the results of RCT can be freely applied beyond the specific features of the studies. PMID:27267302

  11. An Investigation of the Relationship between Reliability, Power, and the Type I Error Rate of the Mantel-Haenszel and Simultaneous Item Bias Detection Procedures.

    ERIC Educational Resources Information Center

    Ackerman, Terry A.; Evans, John A.

    The relationship between levels of reliability and the power of two bias and differential item functioning (DIF) detection methods is examined. Both methods, the Mantel-Haenszel (MH) procedure of P. W. Holland and D. T. Thayer (1988) and the Simultaneous Item Bias (SIB) procedure of R. Shealy and W. Stout (1991), use examinees' raw scores as a…

  12. Changing and Evolving Relationships between Two- and Four-Year Colleges and Universities: They're Not Your Parents' Community Colleges Anymore

    ERIC Educational Resources Information Center

    Labov, Jay B.

    2012-01-01

    This paper describes a summit on Community Colleges in the Evolving STEM Education Landscape organized by a committee of the National Research Council (NRC) and the National Academy of Engineering (NAE) and held at the Carnegie Institution for Science on December 15, 2011. This summit followed a similar event organized by Dr. Jill Biden, spouse of…

  13. The Assessment of Positivity and Negativity in Social Networks: The Reliability and Validity of the Social Relationships Index

    ERIC Educational Resources Information Center

    Campo, Rebecca A.; Uchino, Bert N.; Holt-Lunstad, Julianne; Vaughn, Allison; Reblin, Maija; Smith, Timothy W.

    2009-01-01

    The Social Relationships Index (SRI) was designed to examine positivity and negativity in social relationships. Unique features of this scale include its brevity and the ability to examine relationship positivity and negativity at the level of the specific individual and social network. The SRI's psychometric properties were examined in three…

  14. Evolvable synthetic neural system

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A. (Inventor)

    2009-01-01

    An evolvable synthetic neural system includes an evolvable neural interface operably coupled to at least one neural basis function. Each neural basis function includes an evolvable neural interface operably coupled to a heuristic neural system to perform high-level functions and an autonomic neural system to perform low-level functions. In some embodiments, the evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy.

  15. NON-POINT SOURCE--STREAM NUTRIENT LEVEL RELATIONSHIPS: A NATIONWIDE STUDY. SUPPLEMENT 1: NUTRIENT MAP RELIABILITY

    EPA Science Inventory

    The National Eutrophication Survey (NES) national maps of non-point source nitrogen and phosphorus concentrations in streams were evaluated for applicability and reliability. Interpretations on these maps which were based on data from 928 sampling sites associated with non-point ...

  16. Reliability, Validity, and Associations with Sexual Behavior among Ghanaian Teenagers of Scales Measuring Four Dimensions of Relationships with Parents and Other Adults

    PubMed Central

    Bingenheimer, Jeffrey B.; Asante, Elizabeth; Ahiadeke, Clement

    2013-01-01

    Little research has been done on the social contexts of adolescent sexual behaviors in sub-Saharan Africa. As part of a longitudinal cohort study (N=1275) of teenage girls and boys in two Ghanaian towns, interviewers administered a 26-item questionnaire module intended to assess four dimensions of youth-adult relationships: monitoring, conflict, emotional support, and financial support. Confirmatory factor and traditional psychometric analyses showed the four scales to be reliable. Known-groups comparisons provided evidence of their validity. All four scales had strong bivariate associations with self-reported sexual behavior (odds ratios = 1.66, 0.74, 0.47, and 0.60 for conflict, support, monitoring, and financial support). The instrument is practical for use in sub-Saharan African settings and produces measures that are reliable, valid, and predictive of sexual behavior in youth. PMID:25821286

  17. Making Reliability Arguments in Classrooms

    ERIC Educational Resources Information Center

    Parkes, Jay; Giron, Tilia

    2006-01-01

    Reliability methodology needs to evolve, as validity has done, into an argument supported by theory and empirical evidence. Nowhere is the inadequacy of current methods more visible than in classroom assessment. Reliability arguments would also permit additional methodologies for evidencing reliability in classrooms. It would liberalize methodology…

  18. Reliability and Validity of the Parent-Child Relationship Inventory (PCRI): Evidence from a Longitudinal Cross-Informant Investigation

    ERIC Educational Resources Information Center

    Coffman, Jacqueline K.; Guerin, Diana Wright; Gottfried, Allen W.

    2006-01-01

    Psychometric properties of the Parent-Child Relationship Inventory (PCRI) were examined using data collected from adolescents and their parents in the Fullerton Longitudinal Study. Results revealed acceptable internal consistency for most scales and moderate to high 1-year stability for all scales. Both parents' PCRI scores correlated with their…

  19. Prokaryote and eukaryote evolvability.

    PubMed

    Poole, Anthony M; Phillips, Matthew J; Penny, David

    2003-05-01

    The concept of evolvability covers a broad spectrum of, often contradictory, ideas. At one end of the spectrum it is equivalent to the statement that evolution is possible; at the other end are untestable post hoc explanations, such as the suggestion that current evolutionary theory cannot explain the evolution of evolvability. We examine similarities and differences in eukaryote and prokaryote evolvability, and look for explanations that are compatible with a wide range of observations. Differences in genome organisation between eukaryotes and prokaryotes meet this criterion. The single origin of replication in prokaryote chromosomes (versus multiple origins in eukaryotes) accounts for many differences because the time to replicate a prokaryote genome limits its size (and the accumulation of junk DNA). Both prokaryotes and eukaryotes appear to switch from genetic stability to genetic change in response to stress. We examine a range of stress responses, and discuss how these impact on evolvability, particularly in unicellular organisms versus complex multicellular ones. Evolvability is also limited by environmental interactions (including competition), and we describe a model that places limits on potential evolvability. Examples are given of its application to predator competition and limits to lateral gene transfer. We suggest that unicellular organisms evolve largely through a process of metabolic change, resulting in biochemical diversity. Multicellular organisms evolve largely through morphological changes, not through extensive changes to cellular biochemistry. PMID:12689728

  20. REFLECTIONS ON EVOLVING CHANGE.

    PubMed

    Angood, Peter B

    2016-01-01

    Physician leadership is increasingly recognized as pivotal for improved change in health care. Multi-professional care teams, education and leadership are evolving trends that are important for health care's future. PMID:27295737

  1. The Skin Cancer and Sun Knowledge (SCSK) Scale: Validity, Reliability, and Relationship to Sun-Related Behaviors Among Young Western Adults.

    PubMed

    Day, Ashley K; Wilson, Carlene; Roberts, Rachel M; Hutchinson, Amanda D

    2014-08-01

    Increasing public knowledge remains one of the key aims of skin cancer awareness campaigns, yet diagnosis rates continue to rise. It is essential we measure skin cancer knowledge adequately so as to determine the nature of its relationship to sun-related behaviors. This study investigated the psychometric properties of a new measure of skin cancer knowledge, the Skin Cancer and Sun Knowledge (SCSK) scale. A total of 514 Western young adults (females n = 320, males n = 194) aged 18 to 26 years completed measures of skin type, skin cancer knowledge, tanning behavior, sun exposure, and sun protection. Two-week test-retest of the SCSK was conducted with 52 participants. Internal reliability of the SCSK scale was acceptable (KR-20 = .69), test-retest reliability was high (r = .83, n = 52), and acceptable levels of face, content, and incremental validity were demonstrated. Skin cancer knowledge (as measured by SCSK) correlated with sun protection, sun exposure, and tanning behaviors in the female sample, but not in the males. Skin cancer knowledge appears to be more relevant to the behavior of young women than that of young males. We recommend that future research establish the validity of the SCSK across a range of participant groups. PMID:24722215
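
    The internal-consistency figure quoted above (KR-20 = .69) is the Kuder-Richardson formula 20 for dichotomous items. As a hedged illustration only (this is not the authors' code, and the use of the sample variance of total scores is an assumed convention), it can be computed as:

    ```python
    import numpy as np

    def kr20(items):
        """Kuder-Richardson 20 for a respondents x items matrix of 0/1 (correct/incorrect) scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        p = items.mean(axis=0)                       # proportion answering each item correctly
        total_var = items.sum(axis=1).var(ddof=1)    # variance of respondents' total scores
        return (k / (k - 1)) * (1.0 - (p * (1.0 - p)).sum() / total_var)
    ```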

  2. Evolving Digital Ecological Networks

    PubMed Central

    Wagner, Aaron P.; Ofria, Charles

    2013-01-01

    “It is hard to realize that the living world as we know it is just one among many possibilities” [1]. Evolving digital ecological networks are webs of interacting, self-replicating, and evolving computer programs (i.e., digital organisms) that experience the same major ecological interactions as biological organisms (e.g., competition, predation, parasitism, and mutualism). Despite being computational, these programs evolve quickly in an open-ended way, and starting from only one or two ancestral organisms, the formation of ecological networks can be observed in real-time by tracking interactions between the constantly evolving organism phenotypes. These phenotypes may be defined by combinations of logical computations (hereafter tasks) that digital organisms perform and by expressed behaviors that have evolved. The types and outcomes of interactions between phenotypes are determined by task overlap for logic-defined phenotypes and by responses to encounters in the case of behavioral phenotypes. Biologists use these evolving networks to study active and fundamental topics within evolutionary ecology (e.g., the extent to which the architecture of multispecies networks shape coevolutionary outcomes, and the processes involved). PMID:23533370

  3. Relationship Between Agility Tests and Short Sprints: Reliability and Smallest Worthwhile Difference in National Collegiate Athletic Association Division-I Football Players.

    PubMed

    Mann, J Bryan; Ivey, Pat A; Mayhew, Jerry L; Schumacher, Richard M; Brechue, William F

    2016-04-01

    The Pro-Agility test (I-Test) and 3-cone drill (3-CD) are widely used in football to assess quickness in change of direction. Likewise, the 10-yard (yd) sprint, a test of sprint acceleration, is gaining popularity for testing physical competency in football players. Despite their frequent use, little information exists on the relationship between agility and sprint tests, as well as the reliability and degree of change necessary to indicate meaningful improvement resulting from training. The purpose of this study was to determine the reliability and smallest worthwhile difference (SWD) of the I-Test and 3-CD and the relationship of sprint acceleration to their performance. Division-I football players (n = 64, age = 20.5 ± 1.2 years, height = 185.2 ± 6.1 cm, body mass = 107.8 ± 20.7 kg) performed duplicate trials in each test during 2 separate weeks at the conclusion of a winter conditioning period. The better time of the 2 trials for each week was used for comparison. The 10-yd sprint was timed electronically, whereas the I-Test and 3-CD were hand timed by experienced testers. Each trial was performed on an indoor synthetic turf, with players wearing multicleated turf shoes. There was no significant difference (p > 0.06) between test weeks for the I-Test (4.53 ± 0.35 vs. 4.54 ± 0.31 seconds), 3-CD (7.45 ± 0.06 vs. 7.49 ± 0.06 seconds), or 10-yd sprint (1.85 ± 0.12 vs. 1.84 ± 0.12 seconds). The intraclass correlation coefficients (ICC) for the 3-CD (ICC = 0.962) and 10-yd sprint (ICC = 0.974) were slightly higher than for the I-Test (ICC = 0.914). These values correspond to acceptable coefficients of variation for each test (1.2, 1.2, and 1.9%, respectively). The SWD% indicated that a meaningful improvement due to training would require players to decrease their times by 6.6% for the I-Test, 3.7% for the 3-CD, and 3.8% for the 10-yd sprint. Performance in agility and short sprint tests is highly related and reliable in college football players, providing quantifiable…
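
    The record reports a coefficient of variation and a smallest worthwhile difference (SWD%) for each test. One common convention (due to Hopkins) derives the typical error from duplicate trials and takes the SWD as 0.2 times the between-subject SD; the sketch below follows that convention purely for illustration and is not necessarily the exact computation used in this study.

    ```python
    import numpy as np

    def typical_error(trial1, trial2):
        """Within-subject typical error from duplicate trials: SD of the differences / sqrt(2)."""
        d = np.asarray(trial1, dtype=float) - np.asarray(trial2, dtype=float)
        return d.std(ddof=1) / np.sqrt(2)

    def swd_percent(trial1, trial2, effect=0.2):
        """Smallest worthwhile difference as a percentage of the mean, using 0.2 x between-subject SD."""
        mean_scores = (np.asarray(trial1, dtype=float) + np.asarray(trial2, dtype=float)) / 2
        return 100.0 * effect * mean_scores.std(ddof=1) / mean_scores.mean()
    ```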

  4. On Component Reliability and System Reliability for Space Missions

    NASA Technical Reports Server (NTRS)

    Chen, Yuan; Gillespie, Amanda M.; Monaghan, Mark W.; Sampson, Michael J.; Hodson, Robert F.

    2012-01-01

    This paper addresses the basics of, the limitations of, and the relationship between component reliability and system reliability through a study of flight computing architectures and related avionics components for future NASA missions. Component reliability analysis and system reliability analysis need to be evaluated at the same time, and the limitations of each analysis and the relationship between the two analyses need to be understood.
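
    To make the component/system distinction concrete, here is a minimal sketch of how component reliabilities roll up into a system figure under the usual independence assumption; the structure and numbers are illustrative assumptions, not taken from the paper.

    ```python
    from math import prod

    def series_reliability(rs):
        """All components must work: R_sys = product of the component reliabilities."""
        return prod(rs)

    def parallel_reliability(rs):
        """Redundant components; the system works if any one works: R_sys = 1 - product(1 - R_i)."""
        return 1.0 - prod(1.0 - r for r in rs)

    # Illustrative avionics string: a processor, a dual-redundant memory pair, and an I/O card.
    r_system = series_reliability([0.999, parallel_reliability([0.98, 0.98]), 0.995])
    ```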

  5. An Evolving Astrobiology Glossary

    NASA Astrophysics Data System (ADS)

    Meech, K. J.; Dolci, W. W.

    2009-12-01

    One of the resources that evolved from the Bioastronomy 2007 meeting was an online interdisciplinary glossary of terms that might not be universally familiar to researchers in all sub-disciplines feeding into astrobiology. In order to facilitate comprehension of the presentations during the meeting, a database-driven web tool for online glossary definitions was developed and participants were invited to contribute prior to the meeting. The glossary was downloaded and included in the conference registration materials for use at the meeting. The glossary web tool has now been delivered to the NASA Astrobiology Institute so that it can continue to grow as an evolving resource for the astrobiology community.

  6. ILZRO-sponsored field data collection and analysis to determine relationships between service conditions and reliability of VRLA batteries in stationary applications

    SciTech Connect

    Taylor, P.A.; Moseley, P.T.; Butler, P.C.

    1998-09-01

    Although valve-regulated lead-acid (VRLA) batteries have served in stationary applications for more than a decade, proprietary concerns of battery manufacturers and users and varying approaches to record-keeping have made the data available on performance and life relatively sparse and inconsistent. Such incomplete data are particularly detrimental to understanding the cause or causes of premature capacity loss (PCL) reported in VRLA batteries after as little as two years of service. The International Lead Zinc Research Organization (ILZRO), in cooperation with Sandia National Laboratories, has initiated a multi-phase project to characterize relationships between batteries, service conditions, and failure modes; establish the degree of correlation between specific operating procedures and PCL; identify operating procedures that mitigate PCL; identify best-fits between the operating requirements of specific applications and the capabilities of specific VRLA technologies; and recommend combinations of battery design, manufacturing processes, and operating conditions that enhance VRLA performance and reliability. This paper, prepared before preliminary conclusions were possible, presents the surveys distributed to manufacturers and end-users; discusses the analytic approach; presents an overview of the responses to the surveys and trends that emerge in the early analysis of the data; and previews the functionality of the database being constructed. The presentation of this paper will include preliminary results and information regarding the follow-on workshop for the study.

  7. Evolvable Neural Software System

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.

  8. Evolving Robust Gene Regulatory Networks

    PubMed Central

    Noman, Nasimul; Monjo, Taku; Moscato, Pablo; Iba, Hitoshi

    2015-01-01

    Design and implementation of robust network modules is essential for construction of complex biological systems through hierarchical assembly of ‘parts’ and ‘devices’. The robustness of gene regulatory networks (GRNs) is ascribed chiefly to the underlying topology. An automatic capability for designing GRN topologies that exhibit robust behavior could dramatically change current practice in synthetic biology. A recent study shows that Darwinian evolution can gradually develop higher topological robustness. Accordingly, this work presents an evolutionary algorithm that simulates natural evolution in silico to identify network topologies that are robust to perturbations. We present a Monte Carlo based method for quantifying topological robustness, and we designed a fitness approximation approach for efficient calculation of this otherwise computationally intensive measure. The proposed framework was verified using two classic GRN behaviors, oscillation and bistability, although it is general enough to evolve other types of responses. The algorithm identified robust GRN architectures, which were verified using different analyses and comparisons. Analysis of the results also shed light on the relationship among robustness, cooperativity and complexity. This study also shows that nature has already evolved very robust architectures for its crucial systems; hence simulation of this natural process can be very valuable for designing robust biological systems. PMID:25616055
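
    A Monte Carlo estimate of topological robustness of the kind described can be sketched as follows. The callables `simulate_grn` and `shows_target_behaviour` are hypothetical placeholders for a network simulator and a behaviour classifier (e.g. an oscillation or bistability detector); nothing here is taken from the paper's implementation.

    ```python
    import random

    def topological_robustness(simulate_grn, shows_target_behaviour, nominal_params,
                               n_samples=1000, spread=0.5, seed=0):
        """Fraction of randomly perturbed parameterisations (each nominal value scaled by a factor
        drawn uniformly from [1 - spread, 1 + spread]) that still exhibit the target behaviour."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n_samples):
            params = {name: value * rng.uniform(1.0 - spread, 1.0 + spread)
                      for name, value in nominal_params.items()}
            if shows_target_behaviour(simulate_grn(params)):
                hits += 1
        return hits / n_samples
    ```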

  9. Regolith Evolved Gas Analyzer

    NASA Technical Reports Server (NTRS)

    Hoffman, John H.; Hedgecock, Jud; Nienaber, Terry; Cooper, Bonnie; Allen, Carlton; Ming, Doug

    2000-01-01

    The Regolith Evolved Gas Analyzer (REGA) is a high-temperature furnace and mass spectrometer instrument for determining the mineralogical composition and reactivity of soil samples. REGA provides key mineralogical and reactivity data that is needed to understand the soil chemistry of an asteroid, which then aids in determining in-situ which materials should be selected for return to earth. REGA is capable of conducting a number of direct soil measurements that are unique to this instrument. These experimental measurements include: (1) Mass spectrum analysis of evolved gases from soil samples as they are heated from ambient temperature to 900 C; and (2) Identification of liberated chemicals, e.g., water, oxygen, sulfur, chlorine, and fluorine. REGA would be placed on the surface of a near earth asteroid. It is an autonomous instrument that is controlled from earth but does the analysis of regolith materials automatically. The REGA instrument consists of four primary components: (1) a flight-proven mass spectrometer, (2) a high-temperature furnace, (3) a soil handling system, and (4) a microcontroller. An external arm containing a scoop or drill gathers regolith samples. A sample is placed in the inlet orifice where the finest-grained particles are sifted into a metering volume and subsequently moved into a crucible. A movable arm then places the crucible in the furnace. The furnace is closed, thereby sealing the inner volume to collect the evolved gases for analysis. Owing to the very low g forces on an asteroid compared to Mars or the moon, the sample must be moved from inlet to crucible by mechanical means rather than by gravity. As the soil sample is heated through a programmed pattern, the gases evolved at each temperature are passed through a transfer tube to the mass spectrometer for analysis and identification. Return data from the instrument will lead to new insights and discoveries including: (1) Identification of the molecular masses of all of the gases

  10. Redfield's evolving legacy

    NASA Astrophysics Data System (ADS)

    Gruber, Nicolas; Deutsch, Curtis A.

    2014-12-01

    The ratio of nitrogen to phosphorus in organic matter is close to that in seawater, a relationship maintained through a set of biological feedbacks. The rapid delivery of nutrients from human activities may test the efficacy of these processes.

  11. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1989-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.
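
    As a rough illustration of the u-plot idea mentioned here (a sketch under the usual prequential formulation, not the authors' code): each one-step-ahead predictive distribution is evaluated at the observed inter-failure time, and the resulting u-values should look uniform on [0, 1] if the model's predictions are well calibrated.

    ```python
    import numpy as np

    def u_values(predicted_cdfs, observed_times):
        """u_i = F_i(t_i): the i-th predictive CDF evaluated at the i-th observed inter-failure time."""
        return np.array([F(t) for F, t in zip(predicted_cdfs, observed_times)])

    def u_plot_distance(u):
        """Kolmogorov distance between the empirical CDF of the u-values and the uniform CDF;
        large values signal mis-calibrated reliability predictions and motivate recalibration."""
        u = np.sort(np.asarray(u, dtype=float))
        n = len(u)
        upper = np.max(np.arange(1, n + 1) / n - u)
        lower = np.max(u - np.arange(0, n) / n)
        return max(upper, lower)
    ```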

  12. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1990-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.

  13. Fat: an evolving issue

    PubMed Central

    Speakman, John R.; O’Rahilly, Stephen

    2012-01-01

    Summary Work on obesity is evolving, and obesity is a consequence of our evolutionary history. In the space of 50 years, we have become an obese species. The reasons why can be addressed at a number of different levels. These include separating between whether the primary cause lies on the food intake or energy expenditure side of the energy balance equation, and determining how genetic and environmental effects contribute to weight variation between individuals. Opinion on whether increased food intake or decreased energy expenditure drives the obesity epidemic is still divided, but recent evidence favours the idea that food intake, rather than altered expenditure, is most important. There is more of a consensus that genetics explains most (probably around 65%) of weight variation between individuals. Recent advances in genome-wide association studies have identified many polymorphisms that are linked to obesity, yet much of the genetic variance remains unexplained. Finding the causes of this unexplained variation will be an impetus of genetic and epigenetic research on obesity over the next decade. Many environmental factors – including gut microbiota, stress and endocrine disruptors – have been linked to the risk of developing obesity. A better understanding of gene-by-environment interactions will also be key to understanding obesity in the years to come. PMID:22915015

  14. Evolving endoscopic surgery.

    PubMed

    Sakai, Paulo; Faintuch, Joel

    2014-06-01

    Since the days of Albukasim in medieval Spain, natural orifices have been regarded not only as a rather repugnant source of bodily odors, fluids and excreta, but also as a convenient invitation to explore and treat the inner passages of the organism. However, surgical ingenuity needed to be matched by appropriate tools and devices. Lack of technologically advanced instrumentation was a strong deterrent during almost a millennium until recent decades when a quantum jump materialized. Endoscopic surgery is currently a vibrant and growing subspecialty, which successfully handles millions of patients every year. Additional opportunities lie ahead which might benefit millions more, however, requiring even more sophisticated apparatuses, particularly in the field of robotics, artificial intelligence, and tissue repair (surgical suturing). This is a particularly exciting and worthwhile challenge, namely of larger and safer endoscopic interventions, followed by seamless and scarless recovery. In synthesis, the future is widely open for those who use together intelligence and creativity to develop new prototypes, new accessories and new techniques. Yet there are many challenges in the path of endoscopic surgery. In this new era of robotic endoscopy, one will likely need a virtual simulator to train and assess the performance of younger doctors. More evidence will be essential in multiple evolving fields, particularly to elucidate whether more ambitious and complex pathways, such as intrathoracic and intraperitoneal surgery via natural orifice transluminal endoscopic surgery (NOTES), are superior or not to conventional techniques. PMID:24628672

  15. Communicability across evolving networks

    NASA Astrophysics Data System (ADS)

    Grindrod, Peter; Parsons, Mark C.; Higham, Desmond J.; Estrada, Ernesto

    2011-04-01

    Many natural and technological applications generate time-ordered sequences of networks, defined over a fixed set of nodes; for example, time-stamped information about “who phoned who” or “who came into contact with who” arise naturally in studies of communication and the spread of disease. Concepts and algorithms for static networks do not immediately carry through to this dynamic setting. For example, suppose A and B interact in the morning, and then B and C interact in the afternoon. Information, or disease, may then pass from A to C, but not vice versa. This subtlety is lost if we simply summarize using the daily aggregate network given by the chain A-B-C. However, using a natural definition of a walk on an evolving network, we show that classic centrality measures from the static setting can be extended in a computationally convenient manner. In particular, communicability indices can be computed to summarize the ability of each node to broadcast and receive information. The computations involve basic operations in linear algebra, and the asymmetry caused by time’s arrow is captured naturally through the noncommutativity of matrix-matrix multiplication. Illustrative examples are given for both synthetic and real-world communication data sets. We also discuss the use of the new centrality measures for real-time monitoring and prediction.
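
    A compact sketch of the construction described above, as it is usually presented: an ordered product of matrix resolvents over the time-stamped adjacency matrices, with the downweighting parameter `a` kept below the reciprocal of the largest spectral radius. The function and variable names are mine, not the paper's.

    ```python
    import numpy as np

    def dynamic_communicability(adjacency_sequence, a=0.1):
        """Q = (I - a*A_1)^-1 (I - a*A_2)^-1 ...  for a time-ordered list of adjacency matrices.
        The ordering of the product encodes time's arrow: row sums of Q rank nodes as
        broadcasters, column sums rank them as receivers."""
        n = adjacency_sequence[0].shape[0]
        Q = np.eye(n)
        for A in adjacency_sequence:
            Q = Q @ np.linalg.inv(np.eye(n) - a * A)
        return Q, Q.sum(axis=1), Q.sum(axis=0)
    ```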

  16. Evolving synergetic interactions

    PubMed Central

    Wu, Bin; Arranz, Jordi; Du, Jinming; Zhou, Da; Traulsen, Arne

    2016-01-01

    Cooperators forgo their own interests to benefit others. This reduces their fitness and thus cooperators are not likely to spread based on natural selection. Nonetheless, cooperation is widespread on every level of biological organization ranging from bacterial communities to human society. Mathematical models can help to explain under which circumstances cooperation evolves. Evolutionary game theory is a powerful mathematical tool to depict the interactions between cooperators and defectors. Classical models typically involve either pairwise interactions between individuals or a linear superposition of these interactions. For interactions within groups, however, synergetic effects may arise: their outcome is not just the sum of its parts. This is because the payoffs via a single group interaction can be different from the sum of any collection of two-player interactions. Assuming that all interactions start from pairs, how can such synergetic multiplayer games emerge from simpler pairwise interactions? Here, we present a mathematical model that captures the transition from pairwise interactions to synergetic multiplayer ones. We assume that different social groups have different breaking rates. We show that non-uniform breaking rates do foster the emergence of synergy, even though individuals always interact in pairs. Our work sheds new light on the mechanisms underlying such synergetic interactions. PMID:27466437

  17. Evolving synergetic interactions.

    PubMed

    Wu, Bin; Arranz, Jordi; Du, Jinming; Zhou, Da; Traulsen, Arne

    2016-07-01

    Cooperators forgo their own interests to benefit others. This reduces their fitness and thus cooperators are not likely to spread based on natural selection. Nonetheless, cooperation is widespread on every level of biological organization ranging from bacterial communities to human society. Mathematical models can help to explain under which circumstances cooperation evolves. Evolutionary game theory is a powerful mathematical tool to depict the interactions between cooperators and defectors. Classical models typically involve either pairwise interactions between individuals or a linear superposition of these interactions. For interactions within groups, however, synergetic effects may arise: their outcome is not just the sum of its parts. This is because the payoffs via a single group interaction can be different from the sum of any collection of two-player interactions. Assuming that all interactions start from pairs, how can such synergetic multiplayer games emerge from simpler pairwise interactions? Here, we present a mathematical model that captures the transition from pairwise interactions to synergetic multiplayer ones. We assume that different social groups have different breaking rates. We show that non-uniform breaking rates do foster the emergence of synergy, even though individuals always interact in pairs. Our work sheds new light on the mechanisms underlying such synergetic interactions. PMID:27466437

  18. Stochastically evolving networks

    NASA Astrophysics Data System (ADS)

    Chan, Derek Y.; Hughes, Barry D.; Leong, Alex S.; Reed, William J.

    2003-12-01

    We discuss a class of models for the evolution of networks in which new nodes are recruited into the network at random times, and links between existing nodes that are not yet directly connected may also form at random times. The class contains both models that produce “small-world” networks and less tightly linked models. We produce both trees, appropriate in certain biological applications, and networks in which closed loops can appear, which model communication networks and networks of human sexual interactions. One of our models is closely related to random recursive trees, and some exact results known in that context can be exploited. The other models are more subtle and difficult to analyze. Our analysis includes a number of exact results for moments, correlations, and distributions of coordination number and network size. We report simulations and also discuss some mean-field approximations. If the system has evolved for a long time and the state of a random node (which thus has a random age) is observed, power-law distributions for properties of the system arise in some of these models.
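
    A toy simulation in the spirit of this model class (random recruitment of new nodes plus random link formation among existing nodes, so that closed loops can appear) might look like the following; the rates and the uniform attachment rule are illustrative assumptions, not the paper's exact model.

    ```python
    import random

    def evolve_network(steps, recruit_prob=0.5, link_prob=0.1, seed=None):
        """Each step: with probability recruit_prob a new node joins and attaches to a random
        existing node; with probability link_prob a random pair of existing nodes is linked."""
        rng = random.Random(seed)
        nodes, edges = [0], set()
        for _ in range(steps):
            if rng.random() < recruit_prob:
                new = len(nodes)
                edges.add((rng.choice(nodes), new))
                nodes.append(new)
            if len(nodes) > 2 and rng.random() < link_prob:
                a, b = rng.sample(nodes, 2)
                edges.add((min(a, b), max(a, b)))
        return nodes, edges
    ```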

  19. Reliability training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  20. Photovoltaic performance and reliability workshop

    NASA Astrophysics Data System (ADS)

    Mrig, L.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986-1993. PV performance and PV reliability are at least as important as PV cost, if not more. In the U.S., PV manufacturers, DOE laboratories, electric utilities, and others are engaged in the photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange the technical knowledge and field experience as related to current information in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop held in September, 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  1. Photovoltaic performance and reliability workshop

    SciTech Connect

    Mrig, L.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986--1993. PV performance and PV reliability are at least as important as PV cost, if not more. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in the photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange the technical knowledge and field experience as related to current information in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop held in September, 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  2. Hyper massive black holes in evolved galaxies

    NASA Astrophysics Data System (ADS)

    Romero-Cruz, Fernando J.

    2015-09-01

    From the SDSS DR7 we took a sample of 16733 galaxies which do not show all of the emission lines required to classify their activity according to the classical BPT diagram (Baldwin et al. 1981 PASP). Since they do not show these emission lines, they are thought to be evolved enough to host hyper massive black holes. We compared their statistical properties with those of other galaxies from the SDSS DR7 which do show emission lines and confirmed that their M-sigma relationship corresponds to HMBHs (Gültekin et al. 2009 ApJ) and also that their SFH confirms evolution. We also analyzed them with a new diagnostic diagram in the IR (Coziol et al. 2015 AJ) and found that their position in the IR color space (W3W4 vs W2W3) corresponds to AGN activity with current low SF, another confirmation of an evolved galaxy. The position of our final sample in the IR diagram is in the same region in which Holm 15A lies; this galaxy is considered to host the most massive BH in the nearby universe (Lopez-Cruz et al. 2014 ApJL). The morphology of these galaxies (all of them are classified as elliptical) confirms that they are very evolved. We claim that hyper massive BHs lie in very evolved galaxies with very low SF and without clear AGN activity in the BPT diagram.

  3. Disgust: Evolved Function and Structure

    ERIC Educational Resources Information Center

    Tybur, Joshua M.; Lieberman, Debra; Kurzban, Robert; DeScioli, Peter

    2013-01-01

    Interest in and research on disgust has surged over the past few decades. The field, however, still lacks a coherent theoretical framework for understanding the evolved function or functions of disgust. Here we present such a framework, emphasizing 2 levels of analysis: that of evolved function and that of information processing. Although there is…

  4. Reliability physics

    NASA Technical Reports Server (NTRS)

    Cuddihy, E. F.; Ross, R. G., Jr.

    1984-01-01

    Speakers whose topics relate to the reliability physics of solar arrays are listed and their topics briefly reviewed. Nine reports are reviewed ranging in subjects from studies of photothermal degradation in encapsulants and polymerizable ultraviolet stabilizers to interface bonding stability to electrochemical degradation of photovoltaic modules.

  5. The Reliability of Density Measurements.

    ERIC Educational Resources Information Center

    Crothers, Charles

    1978-01-01

    Data from a land-use study of small- and medium-sized towns in New Zealand are used to ascertain the relationship between official and effective density measures. It was found that the reliability of official measures of density is very low overall, although reliability increases with community size. (Author/RLV)

  6. Automated gas burette system for evolved hydrogen measurements

    SciTech Connect

    Zheng Feng; Rassat, Scot D.; Helderandt, David J.; Caldwell, Dustin D.; Aardahl, Christopher L.; Autrey, Tom; Linehan, John C.; Rappe, Kenneth G.

    2008-08-15

    This paper reports a simple and efficient gas burette system that allows automated determination of evolved gas volume in real time using only temperature and pressure measurements. The system is reliable and has been used successfully to study the hydrogen release kinetics of ammonia borane thermolysis. The system is especially suitable for bench scale studies involving small batches and potentially rapid reaction kinetics.
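
    The measurement principle described (evolved gas quantified from temperature and pressure alone) reduces, under an ideal-gas assumption, to a one-line calculation. The sketch below illustrates that principle only; it is not the instrument's actual data-reduction code, and the example numbers are made up.

    ```python
    R = 8.314  # gas constant, J mol^-1 K^-1

    def evolved_gas_moles(delta_pressure_pa, headspace_volume_m3, temperature_k):
        """Ideal-gas estimate of evolved gas: n = dP * V / (R * T) for a fixed, temperature-monitored headspace."""
        return delta_pressure_pa * headspace_volume_m3 / (R * temperature_k)

    # e.g. a 5 kPa pressure rise in a 100 mL headspace at 298 K
    n_h2 = evolved_gas_moles(5.0e3, 100e-6, 298.0)   # mol
    v_stp_ml = n_h2 * 22414.0                        # approximate volume at STP, mL
    ```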

  7. The Skin Cancer and Sun Knowledge (SCSK) Scale: Validity, Reliability, and Relationship to Sun-Related Behaviors among Young Western Adults

    ERIC Educational Resources Information Center

    Day, Ashley K.; Wilson, Carlene; Roberts, Rachel M.; Hutchinson, Amanda D.

    2014-01-01

    Increasing public knowledge remains one of the key aims of skin cancer awareness campaigns, yet diagnosis rates continue to rise. It is essential we measure skin cancer knowledge adequately so as to determine the nature of its relationship to sun-related behaviors. This study investigated the psychometric properties of a new measure of skin cancer…

  8. Natural Selection Promotes Antigenic Evolvability

    PubMed Central

    Graves, Christopher J.; Ros, Vera I. D.; Stevenson, Brian; Sniegowski, Paul D.; Brisson, Dustin

    2013-01-01

    The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide an experimentally tractable system to test whether natural selection has favored mechanisms that increase evolvability. Many antigenic variation systems consist of paralogous unexpressed ‘cassettes’ that recombine into an expression site to rapidly alter the expressed protein. Importantly, the magnitude of antigenic change is a function of the genetic diversity among the unexpressed cassettes. Thus, evidence that selection favors among-cassette diversity is direct evidence that natural selection promotes antigenic evolvability. We used the Lyme disease bacterium, Borrelia burgdorferi, as a model to test the prediction that natural selection favors amino acid diversity among unexpressed vls cassettes and thereby promotes evolvability in a primary surface antigen, VlsE. The hypothesis that diversity among vls cassettes is favored by natural selection was supported in each B. burgdorferi strain analyzed using both classical (dN/dS ratios) and Bayesian population genetic analyses of genetic sequence data. This hypothesis was also supported by the conservation of highly mutable tandem-repeat structures across B. burgdorferi strains despite a near complete absence of sequence conservation. Diversification among vls cassettes due to natural selection and mutable repeat structures promotes long-term antigenic evolvability of VlsE. These findings provide a direct demonstration that molecular mechanisms that enhance evolvability of surface antigens are an evolutionary adaptation. The molecular evolutionary processes identified here can serve as a model for the evolution of antigenic evolvability in many pathogens which utilize similar strategies to establish chronic infections

  9. Spacetimes containing slowly evolving horizons

    SciTech Connect

    Kavanagh, William; Booth, Ivan

    2006-08-15

    Slowly evolving horizons are trapping horizons that are ''almost'' isolated horizons. This paper reviews their definition and discusses several spacetimes containing such structures. These include certain Vaidya and Tolman-Bondi solutions as well as (perturbatively) tidally distorted black holes. Taking into account the mass scales and orders of magnitude that arise in these calculations, we conjecture that slowly evolving horizons are the norm rather than the exception in astrophysical processes that involve stellar-scale black holes.

  10. On the Discovery of Evolving Truth

    PubMed Central

    Li, Yaliang; Li, Qi; Gao, Jing; Su, Lu; Zhao, Bo; Fan, Wei; Han, Jiawei

    2015-01-01

    In the era of big data, information regarding the same objects can be collected from increasingly more sources. Unfortunately, there usually exist conflicts among the information coming from different sources. To tackle this challenge, truth discovery, i.e., to integrate multi-source noisy information by estimating the reliability of each source, has emerged as a hot topic. In many real world applications, however, the information may come sequentially, and as a consequence, the truth of objects as well as the reliability of sources may be dynamically evolving. Existing truth discovery methods, unfortunately, cannot handle such scenarios. To address this problem, we investigate the temporal relations among both object truths and source reliability, and propose an incremental truth discovery framework that can dynamically update object truths and source weights upon the arrival of new data. Theoretical analysis is provided to show that the proposed method is guaranteed to converge at a fast rate. The experiments on three real world applications and a set of synthetic data demonstrate the advantages of the proposed method over state-of-the-art truth discovery methods. PMID:26705502
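
    A much-simplified stand-in for the kind of incremental update loop described here: truths are re-estimated from source claims weighted by current source reliability, then source weights are updated from each source's distance to the new truths, with older evidence decayed. The structure, names, and decay scheme are illustrative assumptions, not the paper's algorithm.

    ```python
    import numpy as np

    def incremental_round(truths, weights, claims, decay=0.9):
        """claims[source][obj] is the numeric value a source reports for an object in the current batch.
        Returns updated object truths and source weights."""
        objects = {o for per_source in claims.values() for o in per_source}
        new_truths = dict(truths)
        for o in objects:
            votes = [(vals[o], weights.get(s, 1.0)) for s, vals in claims.items() if o in vals]
            values, ws = zip(*votes)
            new_truths[o] = float(np.average(values, weights=ws))
        new_weights = {}
        for s, vals in claims.items():
            err = np.mean([abs(v - new_truths[o]) for o, v in vals.items()])
            new_weights[s] = decay * weights.get(s, 1.0) + (1.0 - decay) / (err + 1e-9)
        return new_truths, new_weights
    ```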

  11. Proposed reliability cost model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between the technical disciplines on one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach is dependent upon the use of a series of subsystem-oriented CERs and, where possible, CTRs in devising a suitable cost-effective policy.

  12. Network reliability

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1985-01-01

    Network control (or network management) functions are essential for efficient and reliable operation of a network. Some control functions are currently included as part of the Open System Interconnection model. For local area networks, it is widely recognized that there is a need for additional control functions, including fault isolation functions, monitoring functions, and configuration functions. These functions can be implemented in either a central or distributed manner. The Fiber Distributed Data Interface Medium Access Control and Station Management protocols provide an example of distributed implementation. Relative information is presented here in outline form.

  13. Robustness to Faults Promotes Evolvability: Insights from Evolving Digital Circuits

    PubMed Central

    Nolfi, Stefano

    2016-01-01

    We demonstrate how the need to cope with operational faults enables evolving circuits to find more fit solutions. The analysis of the results obtained in different experimental conditions indicates that, in the absence of faults, evolution tends to select circuits that are small and have low phenotypic variability and evolvability. The need to face operational faults, instead, drives evolution toward the selection of larger circuits that are truly robust with respect to genetic variations and that have a greater level of phenotypic variability and evolvability. Overall, our results indicate that the need to cope with operational faults leads to the selection of circuits that have a greater probability of generating better circuits as a result of genetic variation, with respect to a control condition in which circuits are not subjected to faults. PMID:27409589

  14. Evolving phenotypic networks in silico.

    PubMed

    François, Paul

    2014-11-01

    Evolved gene networks are constrained by natural selection. Their structures and functions are consequently far from random, as exemplified by the multiple instances of parallel/convergent evolution. One can thus ask whether features of actual gene networks can be recovered from evolutionary first principles. I review a method for in silico evolution of small models of gene networks aimed at performing predefined biological functions. I summarize the current implementation of the algorithm, with emphasis on the construction of a proper "fitness" function. I illustrate the approach on three examples: biochemical adaptation, ligand discrimination and vertebrate segmentation (somitogenesis). While the structure of the evolved networks is variable, their dynamics are usually constrained and share many features with actual gene networks, including properties that were not explicitly selected for. In silico evolution can thus be used to predict biological behaviours without a detailed knowledge of the mapping between genotype and phenotype. PMID:24956562
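
    The method reviewed is, at its core, a mutation/selection loop over small network models driven by a hand-crafted fitness function. A skeletal sketch follows; the `mutate` and `fitness` callables stand in for the genotype operators and fitness construction discussed in the record, and everything else (names, elitist selection scheme, defaults) is an assumption.

    ```python
    import random

    def evolve_networks(seed_network, mutate, fitness, generations=500, population=50, elite=5, rng_seed=0):
        """Keep the `elite` fittest networks each generation and refill the population with
        mutated copies of them; return the fittest network found."""
        rng = random.Random(rng_seed)
        pop = [seed_network for _ in range(population)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[:elite]
            pop = parents + [mutate(rng.choice(parents), rng) for _ in range(population - elite)]
        return max(pop, key=fitness)
    ```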

  15. The evolved function of the oedipal conflict.

    PubMed

    Josephs, Lawrence

    2010-08-01

    Freud based his oedipal theory on three clinical observations of adult romantic relationships: (1) Adults tend to split love and lust; (2) There tend to be sex differences in the ways that men and women split love and lust; (3) Adult romantic relationships are unconsciously structured by the dynamics of love triangles in which dramas of seduction and betrayal unfold. Freud believed that these aspects of adult romantic relationships were derivative expressions of a childhood oedipal conflict that has been repressed. Recent research conducted by evolutionary psychologists supports many of Freud's original observations and suggests that Freud's oedipal conflict may have evolved as a sexually selected adaptation for reproductive advantage. The evolution of bi-parental care based on sexually exclusive romantic bonds made humans vulnerable to the costs of sexual infidelity, a situation of danger that seriously threatens monogamous bonds. A childhood oedipal conflict enables humans to better adapt to this longstanding evolutionary problem by providing the child with an opportunity to develop working models of love triangles. On the one hand, the oedipal conflict facilitates monogamous resolutions by creating intense anxiety about the dangers of sexual infidelity and mate poaching. On the other hand, the oedipal conflict in humans may facilitate successful cheating and mate poaching by cultivating a talent for hiding our true sexual intentions from others and even from ourselves. The oedipal conflict in humans may be disguised by evolutionary design in order to facilitate tactical deception in adult romantic relationships. PMID:20840647

  16. Slippery Texts and Evolving Literacies

    ERIC Educational Resources Information Center

    Mackey, Margaret

    2007-01-01

    The idea of "slippery texts" provides a useful descriptor for materials that mutate and evolve across different media. Eight adult gamers, encountering the slippery text "American McGee's Alice," demonstrate a variety of ways in which players attempt to manage their attention as they encounter a new text with many resonances. The range of their…

  17. Sequentially evolved bilateral epidural haematomas.

    PubMed

    Rochat, P; Johannesen, H H; Poulsgård, L; Bøgeskov, L

    2002-12-01

    Sequentially evolved bilateral epidural haematomas, where the second haematoma evolves after surgical removal of the first haematoma, are rarely reported. We report two cases of this entity. One patient was involved in a road traffic accident and the other was suffering from a head injury after an assault. CT scans showed that both patients had a unilateral epidural haematoma with a thin, presumably epidural, haemorrhage on the opposite side. Both patients were operated on for their epidural haematomas, but did not improve after surgical treatment, and postoperative CT scans revealed an evolving epidural haematoma on the opposite side. After evacuation of the second epidural haematoma both patients recovered quickly. Sequentially evolved bilateral epidural haematomas are rare, but must be considered in the postoperative intensive care treatment of patients with epidural haematomas. Both cases emphasize the need for intensive care monitoring after an operation for an epidural haematoma and the need for CT scans if the patient does not improve quickly after removal of the haematoma. This is especially important if a small contralateral haematoma is seen on the initial CT scan. PMID:12445923

  18. Signing Apes and Evolving Linguistics.

    ERIC Educational Resources Information Center

    Stokoe, William C.

    Linguistics retains from its antecedents, philology and the study of sacred writings, some of their apologetic and theological bias. Thus it has not been able to face squarely the question how linguistic function may have evolved from animal communication. Chimpanzees' use of signs from American Sign Language forces re-examination of language…

  19. You 3.0: The Most Important Evolving Technology

    ERIC Educational Resources Information Center

    Tamarkin, Molly; Bantz, David A.; Childs, Melody; diFilipo, Stephen; Landry, Stephen G.; LoPresti, Frances; McDonald, Robert H.; McGuthry, John W.; Meier, Tina; Rodrigo, Rochelle; Sparrow, Jennifer; Diggs, D. Teddy; Yang, Catherine W.

    2010-01-01

    That technology evolves is a given. Not as well understood is the impact of technological evolution on each individual--on oneself, one's skill development, one's career, and one's relationship with the work community. The authors believe that everyone in higher education will become an IT worker and that IT workers will be managing a growing…

  20. Diversity sustains an evolving network

    PubMed Central

    Mehrotra, Ravi; Soni, Vikram; Jain, Sanjay

    2009-01-01

    We study an evolutionary model of a complex system that evolves under catalytic dynamics and Darwinian selection and exhibits spontaneous growth, stasis and then a collapse of its structure. We find that the typical lifetime of the system increases sharply with the diversity of its components or species. We also find that the prime reason for crashes is a naturally occurring internal fragility of the system. This fragility is captured in the network organizational character and is related to a reduced multiplicity of pathways or feedback loops between its components. These results apply to several generalizations of the model as well. This work suggests new parameters for understanding the robustness of evolving molecular networks, ecosystems, societies and markets. PMID:19033136

  1. Evolvable Hardware for Space Applications

    NASA Technical Reports Server (NTRS)

    Lohn, Jason; Globus, Al; Hornby, Gregory; Larchev, Gregory; Kraus, William

    2004-01-01

    This article surveys the research of the Evolvable Systems Group at NASA Ames Research Center. Over the past few years, our group has developed the ability to use evolutionary algorithms in a variety of NASA applications, ranging from spacecraft antenna design and fault tolerance for programmable logic chips to atomic force field parameter fitting, analog circuit design, and Earth-observing satellite scheduling. In some of these applications, evolutionary algorithms match or improve on human performance.

  2. When did oxygenic photosynthesis evolve?

    PubMed

    Buick, Roger

    2008-08-27

    The atmosphere has apparently been oxygenated since the 'Great Oxidation Event' ca 2.4 Ga ago, but when the photosynthetic oxygen production began is debatable. However, geological and geochemical evidence from older sedimentary rocks indicates that oxygenic photosynthesis evolved well before this oxygenation event. Fluid-inclusion oils in ca 2.45 Ga sandstones contain hydrocarbon biomarkers evidently sourced from similarly ancient kerogen, preserved without subsequent contamination, and derived from organisms producing and requiring molecular oxygen. Mo and Re abundances and sulphur isotope systematics of slightly older (2.5 Ga) kerogenous shales record a transient pulse of atmospheric oxygen. As early as ca 2.7 Ga, stromatolites and biomarkers from evaporative lake sediments deficient in exogenous reducing power strongly imply that oxygen-producing cyanobacteria had already evolved. Even at ca 3.2 Ga, thick and widespread kerogenous shales are consistent with aerobic photoautotrophic marine plankton, and U-Pb data from ca 3.8 Ga metasediments suggest that this metabolism could have arisen by the start of the geological record. Hence, the hypothesis that oxygenic photosynthesis evolved well before the atmosphere became permanently oxygenated seems well supported. PMID:18468984

  3. Evolving Systems and Adaptive Key Component Control

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Balas, Mark J.

    2009-01-01

    We propose a new framework called Evolving Systems to describe the self-assembly, or autonomous assembly, of actively controlled dynamical subsystems into an Evolved System with a higher purpose. An introduction to Evolving Systems and an exploration of the essential topics of their control and stability properties are provided. This chapter defines a framework for Evolving Systems, develops theory and control solutions for fundamental characteristics of Evolving Systems, and provides illustrative examples of Evolving Systems and their control with adaptive key component controllers.

  4. Development and the evolvability of human limbs

    PubMed Central

    Young, Nathan M.; Wagner, Günter P.; Hallgrímsson, Benedikt

    2010-01-01

    The long legs and short arms of humans are distinctive for a primate, the result of selection acting in opposite directions on each limb at different points in our evolutionary history. This mosaic pattern challenges our understanding of the relationship of development and evolvability because limbs are serially homologous and genetic correlations should act as a significant constraint on their independent evolution. Here we test a developmental model of limb covariation in anthropoid primates and demonstrate that both humans and apes exhibit significantly reduced integration between limbs when compared to quadrupedal monkeys. This result indicates that fossil hominins likely escaped constraints on independent limb variation via reductions to genetic pleiotropy in an ape-like last common ancestor (LCA). This critical change in integration among hominoids, which is reflected in macroevolutionary differences in the disparity between limb lengths, facilitated selection for modern human limb proportions and demonstrates how development helps shape evolutionary change. PMID:20133636

  5. Bolometric Flux Estimation for Cool Evolved Stars

    NASA Astrophysics Data System (ADS)

    van Belle, Gerard T.; Creech-Eakman, Michelle J.; Ruiz-Velasco, Alma E.

    2016-07-01

    Estimation of bolometric fluxes (F_BOL) is an essential component of stellar effective temperature determination with optical and near-infrared interferometry. Reliable estimation of F_BOL simply from broadband K-band photometry data is a useful tool in those cases where contemporaneous and/or wide-range photometry is unavailable for a detailed spectral energy distribution (SED) fit, as was demonstrated in Dyck et al. Recalibrating the intrinsic F_BOL versus observed F_2.2μm relationship of that study with modern SED fitting routines, which incorporate the significantly non-blackbody, empirical spectral templates of the INGS spectral library (an update of the library in Pickles) and estimation of reddening, serves to greatly improve the accuracy and observational utility of this relationship. We find that the predicted F_BOL values are roughly 11% less than the corresponding values predicted in Dyck et al., indicating the effects of SED absorption features across bolometric flux curves.

  6. A slowly evolving host moves first in symbiotic interactions

    NASA Astrophysics Data System (ADS)

    Damore, James; Gore, Jeff

    2011-03-01

    Symbiotic relationships, both parasitic and mutualistic, are ubiquitous in nature. Understanding how these symbioses evolve, from bacteria and their phages to humans and our gut microflora, is crucial in understanding how life operates. Often, symbioses consist of a slowly evolving host species with each host only interacting with its own sub-population of symbionts. The Red Queen hypothesis describes coevolutionary relationships as constant arms races with each species rushing to evolve an advantage over the other, suggesting that faster evolution is favored. Here, we use a simple game theoretic model of host-symbiont coevolution that includes population structure to show that if the symbionts evolve much faster than the host, the equilibrium distribution is the same as it would be if it were a sequential game where the host moves first against its symbionts. For the slowly evolving host, this will prove to be advantageous in mutualisms and a handicap in antagonisms. The model allows for symbiont adaptation to its host, a result that is robust to changes in the parameters and generalizes to continuous and multiplayer games. Our findings provide insight into a wide range of symbiotic phenomena and help to unify the field of coevolutionary theory.
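
    The "host moves first" result can be illustrated with a toy sequential game. In the Python sketch below, the payoff numbers and strategy labels are invented for illustration only: because the symbionts adapt much faster, their population is assumed to settle on the best response to whatever the slowly evolving host does, so the host effectively chooses while anticipating that response.

      # Illustrative 2x2 host-symbiont game; payoffs are made up for the sketch.
      HOST = {("defect", "defect"): 1, ("defect", "cooperate"): 4,
              ("cooperate", "defect"): 0, ("cooperate", "cooperate"): 3}
      SYMB = {("defect", "defect"): 1, ("defect", "cooperate"): 0,
              ("cooperate", "defect"): 2, ("cooperate", "cooperate"): 3}
      STRATEGIES = ("defect", "cooperate")

      def symbiont_best_response(host_move):
          # Fast-evolving symbionts equilibrate to their best reply.
          return max(STRATEGIES, key=lambda s: SYMB[(host_move, s)])

      def host_moves_first():
          # The slow host "moves first", anticipating the symbionts' adaptation.
          best = max(STRATEGIES, key=lambda h: HOST[(h, symbiont_best_response(h))])
          return best, symbiont_best_response(best)

      print(host_moves_first())   # -> ('cooperate', 'cooperate') with these payoffs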

  7. Reliability and Confidence.

    ERIC Educational Resources Information Center

    Test Service Bulletin, 1952

    1952-01-01

    Some aspects of test reliability are discussed. Topics covered are: (1) how high should a reliability coefficient be?; (2) two factors affecting the interpretation of reliability coefficients--range of talent and interval between testings; (3) some common misconceptions--reliability of speed tests, part vs. total reliability, reliability for what…
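
    The bulletin's topic of part versus total reliability is conventionally summarized by the Spearman-Brown prophecy formula; it is reproduced below for reference, since the truncated abstract above does not state it.

      \[
        r_{kk} \;=\; \frac{k\,r_{11}}{1 + (k - 1)\,r_{11}}
      \]

    where r_11 is the reliability of a single part (for example, one half of a split test) and r_kk the reliability expected when the test is lengthened by a factor of k with comparable items.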

  8. Synchronization in an evolving network

    NASA Astrophysics Data System (ADS)

    Singh, R. K.; Bagarti, Trilochan

    2015-09-01

    In this work we study the dynamics of Kuramoto oscillators on a stochastically evolving network whose evolution is governed by the phases of the individual oscillators and degree distribution. Synchronization is achieved after a threshold connection density is reached. This cumulative effect of topology and dynamics has many real-world implications, where synchronization in a system emerges as a collective property of its components in a self-organizing manner. The synchronous state remains stable as long as the connection density remains above the threshold value, with additional links providing resilience against network fluctuations.
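
    A minimal numerical sketch of the dynamics described above is given below in Python: Kuramoto phase oscillators are integrated while links are added at random, so the connection density grows over time and the order parameter can rise once it passes a threshold. The random link-addition rule is a simple stand-in for the paper's phase- and degree-dependent evolution rule, the coupling is degree-normalised for robustness, and all parameter values are illustrative.

      import numpy as np

      def kuramoto_on_growing_network(n=50, steps=5000, dt=0.02, coupling=2.0,
                                      add_prob=0.1, seed=0):
          rng = np.random.default_rng(seed)
          theta = rng.uniform(0, 2 * np.pi, n)     # oscillator phases
          omega = rng.normal(0, 0.5, n)            # natural frequencies
          adj = np.zeros((n, n))                   # evolving adjacency matrix
          order = []
          for _ in range(steps):
              if rng.random() < add_prob:          # occasionally add a random link
                  i, j = rng.integers(0, n, size=2)
                  if i != j:
                      adj[i, j] = adj[j, i] = 1.0
              # Degree-normalised Kuramoto update:
              # dtheta_i/dt = omega_i + K * mean over neighbours j of sin(theta_j - theta_i)
              diff = theta[None, :] - theta[:, None]            # diff[i, j] = theta_j - theta_i
              deg = np.maximum(adj.sum(axis=1), 1.0)
              theta = theta + dt * (omega + coupling * (adj * np.sin(diff)).sum(axis=1) / deg)
              order.append(np.abs(np.exp(1j * theta).mean()))   # order parameter r in [0, 1]
          return order

      r = kuramoto_on_growing_network()
      print(f"order parameter: start {r[0]:.2f}, end {r[-1]:.2f}")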

  9. International Lead Zinc Research Organization-sponsored field-data collection and analysis to determine relationships between service conditions and reliability of valve-regulated lead-acid batteries in stationary applications

    NASA Astrophysics Data System (ADS)

    Taylor, P. A.; Moseley, P. T.; Butler, P. C.

    The International Lead Zinc Research Organization (ILZRO), in cooperation with Sandia National Laboratories, has initiated a multi-phase project with the following aims: to characterize relationships between valve-regulated lead-acid (VRLA) batteries, service conditions, and failure modes; to establish the degree of correlation between specific operating procedures and premature capacity loss (PCL); to identify operating procedures that mitigate PCL; to identify best-fits between the operating requirements of specific applications and the capabilities of specific VRLA technologies; to recommend combinations of battery design, manufacturing processes, and operating conditions that enhance VRLA performance and reliability. In the first phase of this project, ILZRO has contracted with Energetics to identify and survey manufacturers and users of VRLA batteries for stationary applications (including electric utilities, telecommunications companies, and government facilities). The confidential survey is collecting the service conditions of specific applications and performance records for specific VRLA technologies. From the data collected, Energetics is constructing a database of the service histories and analyzing the data to determine trends in performance for particular technologies in specific service conditions. ILZRO plans to make the final report of the analysis and a version of the database (that contains no proprietary information) available to ILZRO members, participants in the survey, and participants in a follow-on workshop for stakeholders in VRLA reliability. This paper presents the surveys distributed to manufacturers and end-users, discusses the analytic approach, presents an overview of the responses to the surveys and trends that have emerged in the early analysis of the data, and previews the functionality of the database being constructed.

  10. canEvolve: A Web Portal for Integrative Oncogenomics

    PubMed Central

    Yan, Zhenyu; Wang, Xujun; Cao, Qingyi; Munshi, Nikhil C.; Li, Cheng

    2013-01-01

    Background & Objective Genome-wide profiles of tumors obtained using functional genomics platforms are being deposited to the public repositories at an astronomical scale, as a result of focused efforts by individual laboratories and large projects such as the Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium. Consequently, there is an urgent need for reliable tools that integrate and interpret these data in light of current knowledge and disseminate results to biomedical researchers in a user-friendly manner. We have built the canEvolve web portal to meet this need. Results canEvolve query functionalities are designed to fulfill the most frequent analysis needs of cancer researchers with a view to generating novel hypotheses. canEvolve stores gene, microRNA (miRNA) and protein expression profiles, copy number alterations for multiple cancer types, and protein-protein interaction information. canEvolve allows querying of results of primary analysis, integrative analysis and network analysis of oncogenomics data. The querying for primary analysis includes differential gene and miRNA expression as well as changes in gene copy number measured with SNP microarrays. canEvolve provides results of integrative analysis of gene expression profiles with copy number alterations and with miRNA profiles as well as generalized integrative analysis using gene set enrichment analysis. The network analysis capability includes storage and visualization of gene co-expression, inferred gene regulatory networks and protein-protein interaction information. Finally, canEvolve provides correlations between gene expression and clinical outcomes in terms of univariate survival analysis. Conclusion At present canEvolve provides different types of information extracted from 90 cancer genomics studies comprising more than 10,000 patients. The presence of multiple data types, novel integrative analysis for identifying regulators of oncogenesis, network analysis and ability

  11. The Evolving Relationship between Researchers and Public Policy

    ERIC Educational Resources Information Center

    Henig, Jeffrey R.

    2008-01-01

    When it comes to the role of research in shaping public policy and debate, one might reasonably argue that this is the best of times. No Child Left Behind (NCLB), with its frequent mention of evidence-based decision making, has underscored the role that objective knowledge should play in a democratic society. The Institute of Education Sciences,…

  12. HIV and HLA Class I: an evolving relationship

    PubMed Central

    Goulder, Philip J.R.; Walker, Bruce D

    2014-01-01

    Successful vaccine development for infectious diseases has largely been achieved in settings where natural immunity to the pathogen results in clearance in at least some individuals. HIV presents an additional challenge in that natural clearance of infection does not occur, and the correlates of immune protection are still uncertain. However, partial control of viremia and markedly different outcomes of disease are observed in HIV infected persons. Here we examine the antiviral mechanisms implicated by one variable that has been consistently associated with extremes of outcome, namely HLA class I alleles, and in particular HLA-B, and examine the mechanisms by which this modulation is likely to occur, and the impact of these interactions on evolution of the virus and the host. Studies to date provide evidence for both HLA-dependent and epitope-dependent influences on viral control and viral evolution, and have important implications for the continued quest for an effective HIV vaccine. PMID:22999948

  13. Primordial evolvability: Impasses and challenges.

    PubMed

    Vasas, Vera; Fernando, Chrisantha; Szilágyi, András; Zachár, István; Santos, Mauro; Szathmáry, Eörs

    2015-09-21

    While it is generally agreed that some kind of replicating non-living compounds were the precursors of life, there is much debate over their possible chemical nature. Metabolism-first approaches propose that mutually catalytic sets of simple organic molecules could be capable of self-replication and rudimentary chemical evolution. In particular, the graded autocatalysis replication domain (GARD) model, depicting assemblies of amphiphilic molecules, has received considerable interest. The system propagates compositional information across generations and is suggested to be a target of natural selection. However, evolutionary simulations indicate that the system lacks selectability (i.e. selection has negligible effect on the equilibrium concentrations). We elaborate on the lessons learnt from the example of the GARD model and, more widely, on the issue of evolvability, and discuss the implications for similar metabolism-first scenarios. We found that simple incorporation-type chemistry based on non-covalent bonds, as assumed in GARD, is unlikely to result in alternative autocatalytic cycles when catalytic interactions are randomly distributed. An even more serious problem stems from the lognormal distribution of catalytic factors, causing inherent kinetic instability of such loops, due to the dominance of efficiently catalyzed components that fail to return catalytic aid. Accordingly, the dynamics of the GARD model is dominated by strongly catalytic, but not auto-catalytic, molecules. Without effective autocatalysis, stable hereditary propagation is not possible. Many repetitions and different scaling of the model come to no rescue. Despite all attempts to show the contrary, the GARD model is not evolvable, in contrast to reflexively autocatalytic networks, complemented by rare uncatalyzed reactions and compartmentation. The latter networks, resting on the creation and breakage of chemical bonds, can generate novel ('mutant') autocatalytic loops from a given set of

  14. Isotopic Analysis and Evolved Gases

    NASA Technical Reports Server (NTRS)

    Swindle, Timothy D.; Boynton, William V.; Chutjian, Ara; Hoffman, John H.; Jordan, Jim L.; Kargel, Jeffrey S.; McEntire, Richard W.; Nyquist, Larry

    1996-01-01

    Precise measurements of the chemical, elemental, and isotopic composition of planetary surface material and gases, and observed variations in these compositions, can contribute significantly to our knowledge of the source(s), ages, and evolution of solar system materials. The analyses discussed in this paper are mostly made by mass spectrometers or some other type of mass analyzer, and address three broad areas of interest: (1) atmospheric composition - isotopic, elemental, and molecular, (2) gases evolved from solids, and (3) solids. Current isotopic data on nine elements, mostly from in situ analysis but also from meteorites and telescopic observations, are summarized. Potential instruments for isotopic analysis of lunar, Martian, Venusian, Mercury, and Pluto surfaces, along with asteroid, cometary, and icy satellite surfaces, are discussed.

  15. Drastic events make evolving networks

    NASA Astrophysics Data System (ADS)

    Ausloos, M.; Lambiotte, R.

    2007-05-01

    Co-authorship networks of neighbouring scientific disciplines, i.e. granular (G) media and networks (N), are studied in order to observe drastic structural changes in evolving networks. The data are taken from arXiv. The system is described as coupled networks. By considering the 1995-2005 time interval and scanning the author-article network evolution with a mobile time window, we focus on the properties of the links, as well as on the time evolution of the nodes. They can be in three states, N, G or multi-disciplinary (M). This leads to drastic jumps in a so-called order parameter, i.e. the link proportion of a given type forming the main island, reminiscent of features appearing at percolation and during metastable (aggregation-desaggregation) processes. The data analysis also focuses on the way different kinds (N, G or M) of authors collaborate, and on the kind of the resulting collaboration.

  16. Speech processing: An evolving technology

    SciTech Connect

    Crochiere, R.E.; Flanagan, J.L.

    1986-09-01

    As we enter the information age, speech processing is emerging as an important technology for making machines easier and more convenient for humans to use. It is both an old and a new technology - dating back to the invention of the telephone and forward, at least in aspirations, to the capabilities of HAL in 2001. Explosive advances in microelectronics now make it possible to implement economical real-time hardware for sophisticated speech processing - processing that formerly could be demonstrated only in simulations on main-frame computers. As a result, fundamentally new product concepts - as well as new features and functions in existing products - are becoming possible and are being explored in the marketplace. As the introductory piece to this issue, the authors draw a brief perspective on the evolving field of speech processing and assess the technology in the three constituent sectors: speech coding, synthesis, and recognition.

  17. Planets in Evolved Binary Systems

    NASA Astrophysics Data System (ADS)

    Perets, Hagai B.

    2011-03-01

    Exo-planets are typically thought to form in protoplanetary disks left over from the protostellar disk of their newly formed host star. However, additional planetary formation and evolution routes may exist in old evolved binary systems. Here we discuss the implications of binary stellar evolution on planetary systems in such environments. In these binary systems stellar evolution could lead to the formation of symbiotic stars, where mass is lost from one star and could be transferred to its binary companion, and may form an accretion disk around it. This raises the possibility that such a disk could provide the necessary environment for the formation of a new, second generation of planets in both circumstellar or circumbinary configurations. Pre-existing first generation planets surviving the post-MS evolution of such systems would be dynamically affected by the mass loss in the systems and may also interact with the newly formed disk. Such planets and/or planetesimals may also serve as seeds for the formation of the second generation planets, and/or interact with them, possibly forming atypical planetary systems. Second generation planetary systems should typically be found in white dwarf binary systems, and may show various observational signatures. Most notably, second generation planets could form in environments which are inaccessible, or less favorable, for first generation planets. The orbital phase space available for the second generation planets could be forbidden (in terms of the system stability) to first generation planets in the pre-evolved progenitor binaries. In addition, planets could form in metal poor environments such as globular clusters and/or in double compact object binaries. Observations of exo-planets in such forbidden or unfavorable regions could possibly serve to uniquely identify their second generation character. Finally, we point out a few observed candidate second generation planetary systems, including Gl 86, HD 27442 and all of the

  18. Evolving toward Laughter in Learning

    ERIC Educational Resources Information Center

    Strean, William B.

    2008-01-01

    Lowman (1995) described the relationship between teacher and student and student engagement as the two most important ingredients in learning in higher education. Humour builds teacher-student connection (Berk, 1998) and engages students in the learning process. The bond between student and teacher is essential for learning, satisfaction, and…

  19. Carl Thoresen: The Evolving Pioneer

    ERIC Educational Resources Information Center

    Harris, Alex H. S.

    2009-01-01

    This interview with Carl E. Thoresen highlights the experiences, relationships, and ideas that have influenced this pioneering psychologist throughout the past half century. His scholarly work, professional service, teaching, and mentorship have motivated many counseling psychologists to radically expand their areas of inquiry. He was among the…

  20. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  1. Reliability model generator

    NASA Technical Reports Server (NTRS)

    McMann, Catherine M. (Inventor); Cohen, Gerald C. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
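
    As a rough illustration of the aggregation idea, the Python sketch below combines component-level exponential reliability models according to a nested series/parallel architecture description. The component names, failure rates, and tuple-based architecture format are hypothetical, chosen only to show the aggregation step; the patented generator produces models for a reliability evaluation tool rather than this toy format.

      import math

      # Hypothetical low-level reliability models: component -> failure rate per hour.
      FAILURE_RATES = {"cpu": 1e-5, "bus": 2e-6, "sensor": 5e-5}

      def component_reliability(name, hours):
          # Exponential low-level model: R(t) = exp(-lambda * t).
          return math.exp(-FAILURE_RATES[name] * hours)

      def system_reliability(arch, hours):
          # Architecture description: either a component name (leaf) or a
          # ("series" | "parallel", [children]) tuple describing the interconnection.
          if isinstance(arch, str):
              return component_reliability(arch, hours)
          kind, children = arch
          rel = [system_reliability(child, hours) for child in children]
          if kind == "series":                      # every block must work
              return math.prod(rel)
          if kind == "parallel":                    # at least one block must work
              return 1.0 - math.prod(1.0 - r for r in rel)
          raise ValueError(f"unknown block type: {kind}")

      arch = ("series", ["bus", ("parallel", ["cpu", "cpu"]), "sensor"])
      print(f"R(1000 h) = {system_reliability(arch, 1000):.4f}")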

  2. Multiscale modelling of evolving foams

    NASA Astrophysics Data System (ADS)

    Saye, R. I.; Sethian, J. A.

    2016-06-01

    We present a set of multi-scale interlinked algorithms to model the dynamics of evolving foams. These algorithms couple the key effects of macroscopic bubble rearrangement, thin film drainage, and membrane rupture. For each of the mechanisms, we construct consistent and accurate algorithms, and couple them together to work across the wide range of space and time scales that occur in foam dynamics. These algorithms include second order finite difference projection methods for computing incompressible fluid flow on the macroscale, second order finite element methods to solve thin film drainage equations in the lamellae and Plateau borders, multiphase Voronoi Implicit Interface Methods to track interconnected membrane boundaries and capture topological changes, and Lagrangian particle methods for conservative liquid redistribution during rearrangement and rupture. We derive a full set of numerical approximations that are coupled via interface jump conditions and flux boundary conditions, and show convergence for the individual mechanisms. We demonstrate our approach by computing a variety of foam dynamics, including coupled evolution of three-dimensional bubble clusters attached to an anchored membrane and collapse of a foam cluster.

  3. Circumstellar Crystalline Silicates: Evolved Stars

    NASA Astrophysics Data System (ADS)

    Tartar, Josh; Speck, A. K.

    2008-05-01

    One of the most exciting developments in astronomy in the last 15 years was the discovery of crystalline silicate stardust by the Short Wavelength Spectrometer (SWS) on board ISO; the discovery of the crystalline grains was indeed one of the biggest surprises of the ISO mission. Initially discovered around AGB stars (evolved stars in the mass range 0.8 < M/M☉ < 8) at far-infrared (IR) wavelengths, crystalline silicates have since been seen in many astrophysical environments including young stellar objects (T Tauri and Herbig Ae/Be), comets and Ultra Luminous Infrared Galaxies. Low and intermediate mass stars (LIMS) comprise 95% of the contributors to the ISM, so study of the formation of crystalline silicates is critical to our understanding of the ISM, which is thought to be primarily amorphous (one would expect an almost exact match between the composition of AGB dust shells and the dust in the ISM). Whether the crystalline dust is merely undetectable or amorphized remains a mystery. The FORCAST instrument on SOFIA as well as the PACS instrument on Herschel will provide exciting observing opportunities for the further study of crystalline silicates.

  4. How do drumlin patterns evolve?

    NASA Astrophysics Data System (ADS)

    Ely, Jeremy; Clark, Chris; Spagnolo, Matteo; Hughes, Anna

    2016-04-01

    The flow of a geomorphic agent over a sediment bed creates patterns in the substrate composed of bedforms. Ice is no exception to this, organising soft sedimentary substrates into subglacial bedforms. As we are yet to fully observe their initiation and evolution beneath a contemporary ice mass, little is known about how patterns in subglacial bedforms develop. Here we study 36,222 drumlins, divided into 72 flowsets, left behind by the former British-Irish Ice sheet. These flowsets provide us with 'snapshots' of drumlin pattern development. The probability distribution functions of the size and shape metrics of drumlins within these flowsets were analysed to determine whether behaviour that is common of other patterned phenomena has occurred. Specifically, we ask whether drumlins i) are printed at a specific scale; ii) grow or shrink after they initiate; iii) stabilise at a specific size and shape; and iv) migrate. Our results indicate that drumlins initiate at a minimum size and spacing. After initiation, the log-normal distribution of drumlin size and shape metrics suggests that drumlins grow, or possibly shrink, as they develop. We find no evidence for stabilisation in drumlin length, supporting the idea of a subglacial bedform continuum. Drumlin migration is difficult to determine from the palaeo-record. However, there are some indications that a mixture of static and mobile drumlins occurs, which could potentially lead to collisions, cannibalisation and coarsening. Further images of modern drumlin fields evolving beneath ice are required to capture stages of drumlin pattern evolution.

  5. Magnetic fields around evolved stars

    NASA Astrophysics Data System (ADS)

    Leal-Ferreira, M.; Vlemmings, W.; Kemball, A.; Amiri, N.; Maercker, M.; Ramstedt, S.; Olofsson, G.

    2014-04-01

    A number of mechanisms, such as magnetic fields, (binary) companions and circumstellar disks, have been suggested to be the cause of non-spherical PNe and in particular collimated outflows. This work investigates one of these mechanisms: the magnetic fields. While MHD simulations show that the fields can indeed be important, few observations of magnetic fields have been done so far. We used the VLBA to observe five evolved stars, with the goal of detecting the magnetic field by means of water maser polarization. The sample consists of four AGB stars (IK Tau, RT Vir, IRC+60370 and AP Lyn) and one pPN (OH231.8+4.2). In four of the five sources, several strong maser features were detected, allowing us to measure the linear and/or circular polarization. Based on the circular polarization detections, we infer the strength of the component of the field along the line of sight to be between ~30 mG and ~330 mG in the water maser regions of these four sources. When extrapolated to the surface of the stars, the magnetic field strength would be between a few hundred mG and a few Gauss when assuming a toroidal field geometry and higher when assuming more complex magnetic fields. We conclude that the magnetic energy we derived in the water maser regions is higher than the thermal and kinetic energy, leading to the conclusion that, indeed, magnetic fields probably play an important role in shaping Planetary Nebulae.

  6. Submillimeter observations of evolved stars

    NASA Technical Reports Server (NTRS)

    Sopka, R. J.; Hildebrand, R.; Jaffe, D. T.; Gatley, I.; Roellig, T.

    1985-01-01

    Broadband submillimeter observations of thermal emission from several evolved stars have been obtained using the United Kingdom Infrared Telescope on Mauna Kea, Hawaii. The observations were carried out at an effective wavelength of 400 microns in order to estimate the mass loss rates in dust from the stars. Direct estimates of mass loss rates are in the range 10^-9 to 10^-6 solar masses per year. Analysis of the spectrum of IRC + 10216 confirmed previous estimates of dust grain emissivity in the range 10-1000 microns. The infrared properties of IRC + 10216 are found to be similar to the carbon rich object CRL 3068. No systematic difference was found between the dust masses of carbon rich and oxygen rich envelopes. The largest mass loss rates in dust were obtained for the bipolar objects OH 231.8+4.2, CRL 2688, CRL 618, and NGC 7027. It is suggested that the ratios of gas to dust, and the slopes of the far infrared to submillimeter wavelength continua of these objects, are probably representative of amorphous rather than crystalline grains.

  7. Exploring Evolving Media Discourse Through Event Cueing.

    PubMed

    Lu, Yafeng; Steptoe, Michael; Burke, Sarah; Wang, Hong; Tsai, Jiun-Yi; Davulcu, Hasan; Montgomery, Douglas; Corman, Steven R; Maciejewski, Ross

    2016-01-01

    Online news, microblogs and other media documents all contain valuable insight regarding events and responses to events. Underlying these documents is the concept of framing, a process in which communicators act (consciously or unconsciously) to construct a point of view that encourages facts to be interpreted by others in a particular manner. As media discourse evolves, how topics and documents are framed can undergo change, shifting the discussion to different viewpoints or rhetoric. What causes these shifts can be difficult to determine directly; however, by linking secondary datasets and enabling visual exploration, we can enhance the hypothesis generation process. In this paper, we present a visual analytics framework for event cueing using media data. As discourse develops over time, our framework applies a time series intervention model which tests to see if the level of framing is different before or after a given date. If the model indicates that the times before and after are statistically significantly different, this cues an analyst to explore related datasets to help enhance their understanding of what (if any) events may have triggered these changes in discourse. Our framework consists of entity extraction and sentiment analysis as lenses for data exploration and uses two different models for intervention analysis. To demonstrate the usage of our framework, we present a case study on exploring potential relationships between climate change framing and conflicts in Africa. PMID:26529702
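
    The intervention test can be sketched with a simple interrupted time-series regression. In the Python snippet below, a step regressor that turns on at the event date is added to a linear trend; the data, the event index, and the ordinary-least-squares formulation are illustrative stand-ins for the framework's two intervention models.

      import numpy as np

      def intervention_level_shift(series, event_index):
          # Regress the framing level on an intercept, a linear trend, and a
          # step that switches on at the event date; return the step coefficient.
          t = np.arange(len(series), dtype=float)
          step = (t >= event_index).astype(float)
          X = np.column_stack([np.ones_like(t), t, step])
          coef, *_ = np.linalg.lstsq(X, np.asarray(series, dtype=float), rcond=None)
          return coef[2]

      rng = np.random.default_rng(1)
      framing = np.concatenate([rng.normal(2.0, 0.3, 40),    # discourse before the event
                                rng.normal(3.1, 0.3, 40)])   # level shift after the event
      print(f"estimated level shift: {intervention_level_shift(framing, 40):.2f}")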

  8. Evolving the ingredients for reciprocity and spite

    PubMed Central

    Hauser, Marc; McAuliffe, Katherine; Blake, Peter R.

    2009-01-01

    Darwin never provided a satisfactory account of altruism, but posed the problem beautifully in light of the logic of natural selection. Hamilton and Williams delivered the necessary satisfaction by appealing to kinship, and Trivers showed that kinship was not necessary as long as the originally altruistic act was conditionally reciprocated. From the late 1970s to the present, the kinship theories in particular have been supported by considerable empirical data and elaborated to explore a number of other social interactions such as cooperation, selfishness and punishment, giving us what is now a rich description of the nature of social relationships among organisms. There are, however, two forms of theoretically possible social interactions—reciprocity and spite—that appear absent or nearly so in non-human vertebrates, despite considerable research efforts on a wide diversity of species. We suggest that the rather weak comparative evidence for these interactions is predicted once we consider the requisite socioecological pressures and psychological mechanisms. That is, a consideration of ultimate demands and proximate prerequisites leads to the prediction that reciprocity and spite should be rare in non-human animals, and common in humans. In particular, reciprocity and spite evolved in humans because of adaptive demands on cooperation among unrelated individuals living in large groups, and the integrative capacities of inequity detection, future-oriented decision-making and inhibitory control. PMID:19805432

  9. Reliability Generalization: "Lapsus Linguae"

    ERIC Educational Resources Information Center

    Smith, Julie M.

    2011-01-01

    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  10. Submillimeter observations of evolved stars

    SciTech Connect

    Sopka, R.J.; Hildebrand, R.; Jaffe, D.T.; Gatley, I.; Roellig, T.; Werner, M.; Jura, M.; Zuckerman, B.

    1985-07-01

    Broad-band submillimeter observations of the thermal emission from evolved stars have been obtained with the United Kingdom Infrared Telescope on Mauna Kea, Hawaii. These observations, at an effective wavelength of 400 ..mu..m, provide the most direct method for estimating the mass loss rate in dust from these stars and also help to define the long-wavelength thermal spectrum of the dust envelopes. The mass loss rates in dust that we derive range from 10/sup -9/ to 10/sup -6/ M/sub sun/ yr/sup -1/ and are compared with mass loss rates derived from molecular line observations to estimate gas-to-dust ratios in outflowing envelopes. These values are found to be generally compatible with the interstellar gas-to-dust ratio of approx.100 if submillimeter emissivities appropriate to amorphous grain structures are assumed. Our analysis of the spectrum of IRC+10216 confirms previous suggestions that the grain emissivity varies as lambda/sup -1.2/ rather than as lambda/sup -2/ for 10

  11. Voyages Through Time: Everything Evolves

    NASA Astrophysics Data System (ADS)

    Pendleton, Y. J.; Tarter, J. C.; DeVore, E. K.; O'Sullivan, K. A.; Taylor, S. M.

    2001-12-01

    Evolutionary change is a powerful framework for studying our world and our place therein. It is a recurring theme in every realm of science: over time, the universe, the planet Earth, life, and human technologies all change, albeit on vastly different scales. Evolution offers scientific explanations for the age-old question, "Where did we come from?" In addition, historical perspectives of science show how our understanding has evolved over time. The complexities of all of these systems will never reveal a "finished" story. But it is a story of epic size, capable of inspiring awe and of expanding our sense of time and place, and eminently worthy of investigating. This story is the basis of Voyages Through Time. Voyages Through Time (VTT) provides teachers with not only background science content and pedagogy, but also with materials and resources for the teaching of evolution. The six modules, Cosmic Evolution, Planetary Evolution, Origin of Life, Evolution of Life, Hominid Evolution, and Evolution of Technology, emphasize student inquiry, and promote the nature of science, as recommended in the NSES and BSL. The modules are unified by the overarching theme of evolution and the meta questions: "What is changing?" "What is the rate of change?" and "What is the mechanism of change?" Determination of student outcomes for the project required effective collaboration of scientists, teachers, students and media specialists. The broadest curricular student outcomes are 1) an enjoyment of science, 2) an understanding of the nature of science, especially the understanding of evidence and re-evaluation, and 3) key science content. The curriculum is being developed by the SETI Institute, NASA Ames Research Center, California Academy of Sciences, and San Francisco State University, and is funded by the NSF (IMD 9730693), with support from Hewlett-Packard Company, The Foundation for Microbiology, Combined Federated Charities, NASA Astrobiology Institute, and NASA Fundamental

  12. Evolving issues in surrogate motherhood.

    PubMed

    Erlen, J A; Holzman, I R

    1990-01-01

    Surrogate mothering is an arrangement whereby a woman who gives birth to an infant intends--through a contractual agreement--to give that baby to another couple. The recent Baby M case in the United States has raised numerous legal concerns causing many legislative bodies to consider possible statutes to regulate or prohibit surrogacy. The competing interests among and between the individuals involved in this relationship (i.e., the surrogate mother, the couple, the baby, and society) suggest various ethical issues related to benefits, risks, and autonomy. Legal and ethical concerns surrounding the technologically possible procedure of surrogate motherhood are discussed. PMID:2391288

  13. HELIOS Critical Design Review: Reliability

    NASA Technical Reports Server (NTRS)

    Benoehr, H. C.; Herholz, J.; Prem, H.; Mann, D.; Reichert, L.; Rupp, W.; Campbell, D.; Boettger, H.; Zerwes, G.; Kurvin, C.

    1972-01-01

    This paper presents the Helios Critical Design Review on reliability, held October 16-20, 1972. The topics include: 1) Reliability Requirement; 2) Reliability Apportionment; 3) Failure Rates; 4) Reliability Assessment; 5) Reliability Block Diagram; and 6) Reliability Information Sheet.

  14. Can There Be Reliability without "Reliability?"

    ERIC Educational Resources Information Center

    Mislevy, Robert J.

    2004-01-01

    An "Educational Researcher" article by Pamela Moss (1994) asks the title question, "Can there be validity without reliability?" Yes, she answers, if by reliability one means "consistency among independent observations intended as interchangeable" (Moss, 1994, p. 7), quantified by internal consistency indices such as KR-20 coefficients and…

  15. Compound estimation procedures in reliability

    NASA Technical Reports Server (NTRS)

    Barnes, Ron

    1990-01-01

    At NASA, components and subsystems of components in the Space Shuttle and Space Station generally go through a number of redesign stages. While data on failures for various design stages are sometimes available, the classical procedures for evaluating reliability only utilize the failure data on the present design stage of the component or subsystem. Often, few or no failures have been recorded on the present design stage. Previously, Bayesian estimators for the reliability of a single component, conditioned on the failure data for the present design, were developed. These new estimators permit NASA to evaluate the reliability, even when few or no failures have been recorded. Point estimates for the latter evaluation were not possible with the classical procedures. Since different design stages of a component (or subsystem) generally have a good deal in common, the development of new statistical procedures for evaluating the reliability, which consider the entire failure record for all design stages, has great intuitive appeal. A typical subsystem consists of a number of different components and each component has evolved through a number of redesign stages. The present investigations considered compound estimation procedures and related models. Such models permit the statistical consideration of all design stages of each component and thus incorporate all the available failure data to obtain estimates for the reliability of the present version of the component (or subsystem). A number of models were considered to estimate the reliability of a component conditioned on its total failure history from two design stages. It was determined that reliability estimators for the present design stage, conditioned on the complete failure history for two design stages have lower risk than the corresponding estimators conditioned only on the most recent design failure data. Several models were explored and preliminary models involving bivariate Poisson distribution and the
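
    A minimal sketch of the compound (multi-stage) idea, assuming a conjugate Gamma-Poisson model: failure data from an earlier design stage is down-weighted and used as the prior for the current stage, so a point estimate exists even when the current stage has recorded no failures. The weights, counts, and exposure times below are invented for illustration and are not NASA data or the models developed in the investigation.

      def gamma_poisson_posterior(failures, exposure, prior_shape, prior_rate):
          # Conjugate update: Gamma(a, b) prior on the failure rate, Poisson failure counts.
          return prior_shape + failures, prior_rate + exposure

      # Stage 1 (earlier design): 3 failures in 2000 hours, starting from a vague prior.
      a1, b1 = gamma_poisson_posterior(failures=3, exposure=2000.0,
                                       prior_shape=0.5, prior_rate=1.0)

      # Stage 2 (current design): 0 failures in 800 hours. The stage-1 posterior is
      # down-weighted (here by 0.5) before serving as the prior, reflecting only
      # partial relevance of the earlier design's failure history.
      w = 0.5
      a2, b2 = gamma_poisson_posterior(failures=0, exposure=800.0,
                                       prior_shape=w * a1, prior_rate=w * b1)

      print(f"posterior mean failure rate: {a2 / b2:.2e} per hour")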

  16. The Problem of Evolving a Genetic Code

    ERIC Educational Resources Information Center

    Woese, Carl R.

    1970-01-01

    Proposes models for the evolution of the genetic code and translation mechanisms. Suggests that the translation process is so complex and precise that it must have evolved in many stages, and that the evolution of the code was influenced by the constraints imposed by the evolving translation mechanism. (EB)

  17. What Technology? Reflections on Evolving Services

    ERIC Educational Resources Information Center

    Collins, Sharon

    2009-01-01

    Each year, the members of the EDUCAUSE Evolving Technologies Committee identify and research the evolving technologies that are having--or are predicted to have--the most direct impact on higher education institutions. The committee members choose the relevant topics, write white papers, and present their findings at the EDUCAUSE annual…

  18. Reliability computation from reliability block diagrams

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.; Eckstein, R. E.

    1971-01-01

    A method and a computer program are presented to calculate probability of system success from an arbitrary reliability block diagram. The class of reliability block diagrams that can be handled include any active/standby combination of redundancy, and the computations include the effects of dormancy and switching in any standby redundancy. The mechanics of the program are based on an extension of the probability tree method of computing system probabilities.
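
    The flavor of such a computation can be sketched as follows. The snippet contrasts an active-parallel pair with a two-unit cold-standby pair using standard closed-form results rather than the probability-tree method the program itself implements; the failure rate, mission time, and switch reliability are illustrative values.

      import math

      def active_parallel(unit_reliabilities):
          # Active redundancy: the pair works if at least one unit works.
          p_all_fail = 1.0
          for r in unit_reliabilities:
              p_all_fail *= (1.0 - r)
          return 1.0 - p_all_fail

      def two_unit_cold_standby(lam, t, switch_ok=1.0):
          # Two identical units with failure rate lam, zero dormant failure rate,
          # and a switch that succeeds with probability switch_ok:
          # R(t) = exp(-lam * t) * (1 + switch_ok * lam * t)
          return math.exp(-lam * t) * (1.0 + switch_ok * lam * t)

      lam, t = 1e-4, 5000.0
      print("active pair  :", round(active_parallel([math.exp(-lam * t)] * 2), 4))
      print("cold standby :", round(two_unit_cold_standby(lam, t, switch_ok=0.95), 4))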

  19. Human Reliability Program Overview

    SciTech Connect

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  20. Power electronics reliability analysis.

    SciTech Connect

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
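
    The fault-tree step of the process can be sketched in a few lines. The component probabilities and tree structure below are invented for a fictitious converter, in the spirit of the report's fictitious-device example, and independence of the basic events is assumed.

      # Basic-event probabilities over one year of operation (invented values).
      P = {"converter": 0.02, "controller": 0.05, "backup_controller": 0.05}

      def p_and(*probs):
          # AND gate for independent events: all inputs must occur.
          out = 1.0
          for p in probs:
              out *= p
          return out

      def p_or(*probs):
          # OR gate for independent events: at least one input occurs.
          none = 1.0
          for p in probs:
              none *= (1.0 - p)
          return 1.0 - none

      # Top event: the converter fails, OR both the controller and its backup fail.
      top = p_or(P["converter"], p_and(P["controller"], P["backup_controller"]))
      print(f"probability of system failure: {top:.4f}")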

  1. Evolvable Cryogenics (ECRYO) Pressure Transducer Calibration Test

    NASA Technical Reports Server (NTRS)

    Diaz, Carlos E., Jr.

    2015-01-01

    This paper provides a summary of the findings of recent activities conducted by Marshall Space Flight Center's (MSFC) In-Space Propulsion Branch and MSFC's Metrology and Calibration Lab to assess the performance of current "state of the art" pressure transducers for use in long duration storage and transfer of cryogenic propellants. A brief historical narrative in this paper describes the Evolvable Cryogenics program and the relevance of these activities to the program. This paper also provides a review of three separate test activities performed throughout this effort, including: (1) the calibration of several pressure transducer designs in a liquid nitrogen cryogenic environmental chamber, (2) the calibration of a pressure transducer in a liquid helium Dewar, and (3) the calibration of several pressure transducers at temperatures ranging from 20 to 70 degrees Kelvin (K) using a "cryostat" environmental chamber. These three separate test activities allowed for study of the sensors along a temperature range from 4 to 300 K. The combined data shows that both the slope and intercept of the sensor's calibration curve vary as a function of temperature. This homogeneous function is contrary to the linearly decreasing relationship assumed at the start of this investigation. Consequently, the data demonstrates the need for lookup tables to change the slope and intercept used by any data acquisition system. This ultimately would allow for more accurate pressure measurements at the desired temperature range. This paper concludes with a review of a request for information (RFI) survey conducted amongst different suppliers to determine the availability of current "state of the art" flight-qualified pressure transducers. The survey identifies requirements that are most difficult for the suppliers to meet, most notably the capability to validate the sensor's performance at temperatures below 70 K.
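
    The lookup-table correction described above might look like the following sketch, where a data-acquisition routine interpolates temperature-dependent slope and intercept values before converting the raw sensor output to pressure. All numbers in the table are hypothetical and are not the calibration results reported in the paper.

      import numpy as np

      # Hypothetical calibration table: slope and intercept of the transducer's
      # linear response, tabulated against temperature in kelvin.
      TEMPS_K    = np.array([  4.0,  20.0,  70.0, 150.0, 300.0])
      SLOPES     = np.array([101.3, 100.8, 100.1,  99.7,  99.2])   # kPa per volt
      INTERCEPTS = np.array([ -1.9,  -1.4,  -0.8,  -0.3,   0.1])   # kPa

      def pressure_kpa(raw_volts, temp_k):
          # Interpolate both calibration coefficients at the operating temperature
          # instead of assuming a single, linearly decreasing relationship.
          slope = np.interp(temp_k, TEMPS_K, SLOPES)
          intercept = np.interp(temp_k, TEMPS_K, INTERCEPTS)
          return slope * raw_volts + intercept

      print(f"{pressure_kpa(raw_volts=2.50, temp_k=35.0):.1f} kPa")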

  2. Reliable Design Versus Trust

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  3. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  4. Reliability model generator specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Mccann, Catherine

    1990-01-01

    The Reliability Model Generator (RMG), a program which produces reliability models from block diagrams for ASSIST, the interface for the reliability evaluation tool SURE is described. An account is given of motivation for RMG and the implemented algorithms are discussed. The appendices contain the algorithms and two detailed traces of examples.

  5. Predicting software reliability

    NASA Technical Reports Server (NTRS)

    Littlewood, B.

    1989-01-01

    A detailed look is given to software reliability techniques. A conceptual model of the failure process is examined, and some software reliability growth models are discussed. Problems for which no current solutions exist are addressed, emphasizing the very difficult problem of safety-critical systems for which the reliability requirements can be enormously demanding.
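
    One widely used example of the growth models referred to above is the Goel-Okumoto non-homogeneous Poisson process; the sketch below evaluates its mean value function and the implied probability of surviving a further period without failure. The parameter values are illustrative and are not taken from the paper.

      import math

      def goel_okumoto_mean(t, a, b):
          # Expected cumulative failures by time t: m(t) = a * (1 - exp(-b * t)),
          # where a is the total expected fault content and b the detection rate.
          return a * (1.0 - math.exp(-b * t))

      def conditional_reliability(x, t, a, b):
          # Probability of no failure in the next x hours, given testing up to time t.
          return math.exp(-(goel_okumoto_mean(t + x, a, b) - goel_okumoto_mean(t, a, b)))

      a, b = 120.0, 0.004                      # illustrative parameters
      print(f"expected failures by 500 h : {goel_okumoto_mean(500, a, b):.1f}")
      print(f"P(no failure in next 24 h) : {conditional_reliability(24, 500, a, b):.3f}")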

  6. Properties of artificial networks evolved to contend with natural spectra.

    PubMed

    Morgenstern, Yaniv; Rostami, Mohammad; Purves, Dale

    2014-07-22

    Understanding why spectra that are physically the same appear different in different contexts (color contrast), whereas spectra that are physically different appear similar (color constancy) presents a major challenge in vision research. Here, we show that the responses of biologically inspired neural networks evolved on the basis of accumulated experience with spectral stimuli automatically generate contrast and constancy. The results imply that these phenomena are signatures of a strategy that biological vision uses to circumvent the inverse optics problem as it pertains to light spectra, and that double-opponent neurons in early-level vision evolve to serve this purpose. This strategy provides a way of understanding the peculiar relationship between the objective world and subjective color experience, as well as rationalizing the relevant visual circuitry without invoking feature detection or image representation. PMID:25024184

  7. Properties of artificial networks evolved to contend with natural spectra

    PubMed Central

    Morgenstern, Yaniv; Rostami, Mohammad; Purves, Dale

    2014-01-01

    Understanding why spectra that are physically the same appear different in different contexts (color contrast), whereas spectra that are physically different appear similar (color constancy) presents a major challenge in vision research. Here, we show that the responses of biologically inspired neural networks evolved on the basis of accumulated experience with spectral stimuli automatically generate contrast and constancy. The results imply that these phenomena are signatures of a strategy that biological vision uses to circumvent the inverse optics problem as it pertains to light spectra, and that double-opponent neurons in early-level vision evolve to serve this purpose. This strategy provides a way of understanding the peculiar relationship between the objective world and subjective color experience, as well as rationalizing the relevant visual circuitry without invoking feature detection or image representation. PMID:25024184

  8. Systems approaches in understanding evolution and evolvability.

    PubMed

    Agarwal, Sumeet

    2013-12-01

    Systems and network-based approaches are becoming increasingly popular in cellular biology. One contribution of such approaches has been to shed some light on the evolutionary origins of core organisational principles in biological systems, such as modularity, robustness, and evolvability. Models of interactions between genes (epistasis) have also provided insight into how sexual reproduction may have evolved. Additionally, recent work on viewing evolution as a form of learning from the environment has indicated certain bounds on the complexity of the genetic circuits that can evolve within feasible quantities of time and resources. Here we review the key studies and results in these areas, and discuss possible connections between them. In particular, we speculate on the link between the two notions of 'evolvability': the evolvability of a system in terms of how agile it is in responding to novel goals or environments, and the evolvability of certain kinds of gene network functionality in terms of its computational complexity. Drawing on some recent work on the complexity of graph-theoretic problems on modular networks, we suggest that modularity as an organising principle may have its raison d'etre in its ability to enhance evolvability, in both its senses. PMID:24120732

  9. Evolving communicative complexity: insights from rodents and beyond.

    PubMed

    Pollard, Kimberly A; Blumstein, Daniel T

    2012-07-01

    Social living goes hand in hand with communication, but the details of this relationship are rarely simple. Complex communication may be described by attributes as diverse as a species' entire repertoire, signallers' individualistic signatures, or complex acoustic phenomena within single calls. Similarly, attributes of social complexity are diverse and may include group size, social role diversity, or networks of interactions and relationships. How these different attributes of social and communicative complexity co-evolve is an active question in behavioural ecology. Sciurid rodents (ground squirrels, prairie dogs and marmots) provide an excellent model system for studying these questions. Sciurid studies have found that demographic role complexity predicts alarm call repertoire size, while social group size predicts alarm call individuality. Along with other taxa, sciurids reveal an important insight: different attributes of sociality are linked to different attributes of communication. By breaking social and communicative complexity down to different attributes, focused studies can better untangle the underlying evolutionary relationships and move us closer to a comprehensive theory of how sociality and communication evolve. PMID:22641825

  10. Interactions between planets and evolved stars

    NASA Astrophysics Data System (ADS)

    Shengbang, Qian; Zhongtao, Han; Fernández Lajús, E.; Liying, Zhu; Wenping, Liao; Miloslav, Zejda; Linjia, Li; Voloshina, Irina; Liang, Liu; Jiajia, He

    2016-07-01

    Searching for planetary companions to evolved stars (e.g., white dwarfs (WD) and Cataclysmic Variables (CV)) can provide insight into the interaction between planets and evolved stars as well as on the ultimate fate of planets. We have monitored dozens of CVs and their progenitors, including some detached WD binaries, since 2006 to search for planets orbiting these systems. In the present paper, we will show some observational results of circumbinary planets in orbits around CVs and their progenitors. Some of our findings include planets with the shortest distance to the central evolved binaries and a few multiple planetary systems orbiting binary stars. Finally, by comparing the observational properties of planetary companions to single WDs and WD binaries, the interaction between planets and evolved stars and the ultimate fate of planets are discussed.

  11. Neural mechanisms underlying the evolvability of behaviour

    PubMed Central

    Katz, Paul S.

    2011-01-01

    The complexity of nervous systems alters the evolvability of behaviour. Complex nervous systems are phylogenetically constrained; nevertheless particular species-specific behaviours have repeatedly evolved, suggesting a predisposition towards those behaviours. Independently evolved behaviours in animals that share a common neural architecture are generally produced by homologous neural structures, homologous neural pathways and even in the case of some invertebrates, homologous identified neurons. Such parallel evolution has been documented in the chromatic sensitivity of visual systems, motor behaviours and complex social behaviours such as pair-bonding. The appearance of homoplasious behaviours produced by homologous neural substrates suggests that there might be features of these nervous systems that favoured the repeated evolution of particular behaviours. Neuromodulation may be one such feature because it allows anatomically defined neural circuitry to be re-purposed. The developmental, genetic and physiological mechanisms that contribute to nervous system complexity may also bias the evolution of behaviour, thereby affecting the evolvability of species-specific behaviour. PMID:21690127

  12. Human reliability analysis

    SciTech Connect

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory. Draws upon reliability analysis, psychology, human factors engineering, and statistics, integrating elements of these fields within a systems framework. Provides a history of human reliability analysis, and includes examples of the application of the systems approach.

  13. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is 'What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?' Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based on hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.

  14. Reliability of fluid systems

    NASA Astrophysics Data System (ADS)

    Kopáček, Jaroslav; Fojtášek, Kamil; Dvořák, Lukáš

    2016-03-01

    This paper focuses on the importance of detection reliability, especially in complex fluid systems for demanding production technology. The initial criterion for assessing reliability is the failure of an object (element), which is treated as a random variable whose data (values) can be processed using the mathematical methods of probability theory and statistics. The basic indicators of reliability are defined, together with their applications in calculations for serial, parallel and backed-up systems. For illustration, calculation examples of the reliability indicators are given for various elements of the system and for a selected pneumatic circuit.
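
    As a rough illustration of the serial, parallel and backed-up calculations mentioned above (the formulas, structure and failure rates here are generic textbook ones, not taken from the paper), a minimal Python sketch:

        import math

        def element_reliability(failure_rate, t):
            # R(t) = exp(-lambda * t) for an element with a constant failure rate
            return math.exp(-failure_rate * t)

        def series(reliabilities):
            # a serial system works only if every element works
            r = 1.0
            for ri in reliabilities:
                r *= ri
            return r

        def parallel(reliabilities):
            # a parallel system fails only if every element fails
            q = 1.0
            for ri in reliabilities:
                q *= (1.0 - ri)
            return 1.0 - q

        def cold_standby(failure_rate, t, n_units):
            # backed-up (cold standby) group of identical units with perfect switching:
            # survival equals the Poisson probability of fewer than n_units failures
            lt = failure_rate * t
            return math.exp(-lt) * sum(lt ** k / math.factorial(k) for k in range(n_units))

        # hypothetical pneumatic circuit: three valves in series, a duplicated
        # compressor, and a pump with one cold spare (failure rates per hour)
        t = 1000.0
        valves = [element_reliability(2e-4, t)] * 3
        compressor = parallel([element_reliability(5e-4, t)] * 2)
        pump = cold_standby(1e-4, t, n_units=2)
        print("system reliability:", series(valves + [compressor, pump]))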

  15. Power Quality and Reliability Project

    NASA Technical Reports Server (NTRS)

    Attia, John O.

    2001-01-01

    One area where universities and industry can link is in the area of power systems reliability and quality - key concepts in the commercial, industrial and public sector engineering environments. Prairie View A&M University (PVAMU) has established a collaborative relationship with the University of Texas at Arlington (UTA), NASA/Johnson Space Center (JSC), and EP&C Engineering and Technology Group (EP&C), a small disadvantaged business that specializes in power quality and engineering services. The primary goal of this collaboration is to facilitate the development and implementation of a Strategic Integrated Power/Systems Reliability and Curriculum Enhancement Program. The objectives of the first phase of this work are: (a) to develop a course in power quality and reliability, (b) to use the campus of Prairie View A&M University as a laboratory for the study of systems reliability and quality issues, and (c) to provide students with NASA/EP&C shadowing and internship experience. In this work, a course titled "Reliability Analysis of Electrical Facilities" was developed and taught for two semesters. About thirty-seven students have benefited directly from this course. A laboratory accompanying the course was also developed. Four facilities at Prairie View A&M University were surveyed. Some of the tests performed were (i) earth-ground testing, (ii) measurement of voltage, amperage and harmonics at various panels in the buildings, (iii) checking wire sizes to see if they were right for the loads they were carrying, (iv) vibration tests to assess the condition of the engines or chillers and water pumps, and (v) infrared testing to detect arcing or misfiring of electrical or mechanical systems.

  16. The transcriptomics of an experimentally evolved plant-virus interaction

    PubMed Central

    Hillung, Julia; García-García, Francisco; Dopazo, Joaquín; Cuevas, José M.; Elena, Santiago F.

    2016-01-01

    Models of plant-virus interaction assume that the ability of a virus to infect a host genotype depends on the matching between virulence and resistance genes. Recently, we evolved tobacco etch potyvirus (TEV) lineages on different ecotypes of Arabidopsis thaliana, and found that some ecotypes selected for specialist viruses whereas others selected for generalists. Here we sought to evaluate the transcriptomic basis of such relationships. We have characterized the transcriptomic responses of five ecotypes infected with the ancestral and evolved viruses. Genes and functional categories differentially expressed by plants infected with local TEV isolates were identified, showing heterogeneous responses among ecotypes, although significant parallelism existed among lineages evolved in the same ecotype. Although genes involved in immune responses were altered upon infection, other functional groups were also pervasively over-represented, suggesting that plant resistance genes were not the only drivers of viral adaptation. Finally, the transcriptomic consequences of infection with the generalist and specialist lineages were compared. Whilst the generalist induced very similar perturbations in the transcriptomes of the different ecotypes, the perturbations induced by the specialist were divergent. Plant defense mechanisms were activated when the infecting virus was specialist but they were down-regulated when infecting with generalist. PMID:27113435

  17. The transcriptomics of an experimentally evolved plant-virus interaction.

    PubMed

    Hillung, Julia; García-García, Francisco; Dopazo, Joaquín; Cuevas, José M; Elena, Santiago F

    2016-01-01

    Models of plant-virus interaction assume that the ability of a virus to infect a host genotype depends on the matching between virulence and resistance genes. Recently, we evolved tobacco etch potyvirus (TEV) lineages on different ecotypes of Arabidopsis thaliana, and found that some ecotypes selected for specialist viruses whereas others selected for generalists. Here we sought to evaluate the transcriptomic basis of such relationships. We have characterized the transcriptomic responses of five ecotypes infected with the ancestral and evolved viruses. Genes and functional categories differentially expressed by plants infected with local TEV isolates were identified, showing heterogeneous responses among ecotypes, although significant parallelism existed among lineages evolved in the same ecotype. Although genes involved in immune responses were altered upon infection, other functional groups were also pervasively over-represented, suggesting that plant resistance genes were not the only drivers of viral adaptation. Finally, the transcriptomic consequences of infection with the generalist and specialist lineages were compared. Whilst the generalist induced very similar perturbations in the transcriptomes of the different ecotypes, the perturbations induced by the specialist were divergent. Plant defense mechanisms were activated when the infecting virus was specialist but they were down-regulated when infecting with generalist. PMID:27113435

  18. Evolving treatment plan quality criteria from institution-specific experience

    SciTech Connect

    Ruan, D.; Shao, W.; DeMarco, J.; Tenn, S.; King, C.; Low, D.; Kupelian, P.; Steinberg, M.

    2012-05-15

    Purpose: The dosimetric aspects of radiation therapy treatment plan quality are usually evaluated and reported with dose volume histogram (DVH) endpoints. For clinical practicality, a small number of representative quantities derived from the DVH are often used as dose endpoints to summarize the plan quality. National guidelines on reference values for such quantities for some standard treatment approaches are often used as acceptance criteria to trigger treatment plan review. On the other hand, treatment prescription and planning approaches specific to each institution warrants the need to report plan quality in terms of practice consistency and with respect to institution-specific experience. The purpose of this study is to investigate and develop a systematic approach to record and characterize the institution-specific plan experience and use such information to guide the design of plan quality criteria. In the clinical setting, this approach will assist in (1) improving overall plan quality and consistency and (2) detecting abnormal plan behavior for retrospective analysis. Methods: The authors propose a self-evolving methodology and have developed an in-house prototype software suite that (1) extracts the dose endpoints from a treatment plan and evaluates them against both national standard and institution-specific criteria and (2) evolves the statistics for the dose endpoints and updates institution-specific criteria. Results: The validity of the proposed methodology was demonstrated with a database of prostate stereotactic body radiotherapy cases. As more data sets are accumulated, the evolving institution-specific criteria can serve as a reliable and stable consistency measure for plan quality and reveals the potential use of the ''tighter'' criteria than national standards or projected criteria, leading to practice that may push to shrink the gap between plans deemed acceptable and the underlying unknown optimality. Conclusions: The authors have developed
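
    The record above describes evolving institution-specific dose-endpoint criteria as cases accumulate. A minimal sketch of one way such a criterion could be tracked (a percentile threshold recomputed from local history and capped by a national reference value; the endpoint, limit and percentile are invented, and this is not the authors' software):

        import statistics

        class EvolvingCriterion:
            # tracks one dose endpoint (e.g. a hypothetical organ-at-risk dose in Gy)
            # across accepted plans and flags new plans outside local experience
            def __init__(self, national_limit, percentile=90):
                self.national_limit = national_limit   # published reference value
                self.percentile = percentile           # institution-specific cut
                self.history = []                      # endpoint values of past plans

            def institutional_limit(self):
                # fall back to the national limit until enough local data exist
                if len(self.history) < 20:
                    return self.national_limit
                qs = statistics.quantiles(self.history, n=100)
                return min(self.national_limit, qs[self.percentile - 1])

            def review(self, value):
                flagged = value > self.institutional_limit()
                self.history.append(value)             # evolve the statistics
                return flagged

        # hypothetical usage with made-up endpoint values
        crit = EvolvingCriterion(national_limit=50.0)
        for v in [38.2, 41.0, 36.5, 52.3, 44.1]:
            print(v, "flag" if crit.review(v) else "ok")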

  19. Zygomorphy evolved from disymmetry in Fumarioideae (Papaveraceae, Ranunculales): new evidence from an expanded molecular phylogenetic framework

    PubMed Central

    Sauquet, Hervé; Carrive, Laetitia; Poullain, Noëlie; Sannier, Julie; Damerval, Catherine; Nadot, Sophie

    2015-01-01

    Background and Aims Fumarioideae (20 genera, 593 species) is a clade of Papaveraceae (Ranunculales) characterized by flowers that are either disymmetric (i.e. two perpendicular planes of bilateral symmetry) or zygomorphic (i.e. one plane of bilateral symmetry). In contrast, the other subfamily of Papaveraceae, Papaveroideae (23 genera, 230 species), has actinomorphic flowers (i.e. more than two planes of symmetry). Understanding of the evolution of floral symmetry in this clade has so far been limited by the lack of a reliable phylogenetic framework. Pteridophyllum (one species) shares similarities with Fumarioideae but has actinomorphic flowers, and the relationships among Pteridophyllum, Papaveroideae and Fumarioideae have remained unclear. This study reassesses the evolution of floral symmetry in Papaveraceae based on new molecular phylogenetic analyses of the family. Methods Maximum likelihood, Bayesian and maximum parsimony phylogenetic analyses of Papaveraceae were conducted using six plastid markers and one nuclear marker, sampling Pteridophyllum, 18 (90 %) genera and 73 species of Fumarioideae, 11 (48 %) genera and 11 species of Papaveroideae, and a wide selection of outgroup taxa. Floral characters recorded from the literature were then optimized onto phylogenetic trees to reconstruct ancestral states using parsimony, maximum likelihood and reversible-jump Bayesian approaches. Key Results Pteridophyllum is not nested in Fumarioideae. Fumarioideae are monophyletic and Hypecoum (18 species) is the sister group of the remaining genera. Relationships within the core Fumarioideae are well resolved and supported. Dactylicapnos and all zygomorphic genera form a well-supported clade nested among disymmetric taxa. Conclusions Disymmetry of the corolla is a synapomorphy of Fumarioideae and is strongly correlated with changes in the androecium and differentiation of middle and inner tepal shape (basal spurs on middle tepals). Zygomorphy subsequently evolved from

  20. Hawaii electric system reliability.

    SciTech Connect

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability "worth" and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  1. Control pole placement relationships

    NASA Technical Reports Server (NTRS)

    Ainsworth, O. R.

    1982-01-01

    Using a simplified Large Space Structure (LSS) model, a technique was developed which gives algebraic relationships for the unconstrained poles. The relationships, which were obtained by this technique, are functions of the structural characteristics and the control gains. Extremely interesting relationships evolve for the case when the structural damping is zero. If the damping is zero, the constrained poles are uncoupled from the structural mode shapes. These relationships, which are derived for structural damping and without structural damping, provide new insight into the migration of the unconstrained poles for the CFPPS.

  2. Quantifying evolvability in small biological networks

    SciTech Connect

    Nemenman, Ilya; Mugler, Andrew; Ziv, Etay; Wiggins, Chris H

    2008-01-01

    The authors introduce a quantitative measure of the capacity of a small biological network to evolve. The measure is applied to a stochastic description of the experimental setup of Guet et al. (Science 2002, 296, pp. 1466), treating chemical inducers as functional inputs to biochemical networks and the expression of a reporter gene as the functional output. The authors take an information-theoretic approach, allowing the system to set parameters that optimise signal processing ability, thus enumerating each network's highest-fidelity functions. All networks studied are highly evolvable by the measure, meaning that change in function has little dependence on change in parameters. Moreover, each network's functions are connected by paths in the parameter space along which information is not significantly lowered, meaning a network may continuously change its functionality without completely losing it along the way. This property further underscores the evolvability of the networks.
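
    As a sketch of the kind of information-theoretic quantity such an approach optimizes (the network model, inputs and samples below are invented, and this is not the authors' exact measure), the mutual information between a discrete inducer input and a binarized reporter output can be estimated from samples:

        import math
        from collections import Counter

        def mutual_information(pairs):
            # I(X;Y) in bits from a list of (input_state, output_state) samples
            n = len(pairs)
            pxy = Counter(pairs)
            px = Counter(x for x, _ in pairs)
            py = Counter(y for _, y in pairs)
            mi = 0.0
            for (x, y), c in pxy.items():
                p_xy = c / n
                mi += p_xy * math.log2(p_xy / ((px[x] / n) * (py[y] / n)))
            return mi

        # hypothetical samples: inducer combination -> reporter ON/OFF
        samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)] * 25
        print("I(inputs; reporter) =", mutual_information(samples), "bits")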

  3. Metanetworks of artificially evolved regulatory networks

    NASA Astrophysics Data System (ADS)

    Danacı, Burçin; Erzan, Ayşe

    2016-04-01

    We study metanetworks arising in genotype and phenotype spaces, in the context of a model population of Boolean graphs evolved under selection for short dynamical attractors. We define the adjacency matrix of a graph as its genotype, which gets mutated in the course of evolution, while its phenotype is its set of dynamical attractors. Metanetworks in the genotype and phenotype spaces are formed, respectively, by genetic proximity and by phenotypic similarity, the latter weighted by the sizes of the basins of attraction of the shared attractors. We find that evolved populations of Boolean graphs form tree-like giant clusters in genotype space, while random populations of Boolean graphs are typically so far removed from each other genetically that they cannot form a metanetwork. In phenotype space, the metanetworks of evolved populations are super robust both under the elimination of weak connections and random removal of nodes.

  4. Evolving networks in the human epileptic brain

    NASA Astrophysics Data System (ADS)

    Lehnertz, Klaus; Ansmann, Gerrit; Bialonski, Stephan; Dickten, Henning; Geier, Christian; Porz, Stephan

    2014-01-01

    Network theory provides novel concepts that promise an improved characterization of interacting dynamical systems. Within this framework, evolving networks can be considered as being composed of nodes, representing systems, and of time-varying edges, representing interactions between these systems. This approach is highly attractive to further our understanding of the physiological and pathophysiological dynamics in human brain networks. Indeed, there is growing evidence that the epileptic process can be regarded as a large-scale network phenomenon. We here review methodologies for inferring networks from empirical time series and for a characterization of these evolving networks. We summarize recent findings derived from studies that investigate human epileptic brain networks evolving on timescales ranging from few seconds to weeks. We point to possible pitfalls and open issues, and discuss future perspectives.

  5. Evolvable, reconfigurable hardware for future space systems

    NASA Technical Reports Server (NTRS)

    Stoica, A.; Zebulum, R. S.; Keymeulen, D.; Ferguson, M. I.; Thakoor, A.

    2002-01-01

    This paper overviews Evolvable Hardware (EHW) technology, examining its potential for enhancing survivability and flexibility of future space systems. EHW refers to self-configuration of electronic hardware by evolutionary/genetic search mechanisms. Evolvable Hardware can maintain existing functionality in the presence of faults and degradations due to aging, temperature and radiation. It can also configure itself for new functionality when required for mission changes or encountered opportunities. The paper illustrates hardware evolution in silicon using a JPL-designed programmable device reconfigurable at transistor level as the platform and a genetic algorithm running on a DSP as the reconfiguration mechanism. Rapid reconfiguration allows convergence to circuit solutions in the order of seconds. The experiments demonstrate functional recovery from faults as well as from degradation at extreme temperatures, indicating the possibility of expanding the operational range of extreme electronics through evolved circuit solutions.

  6. Distribution characteristics of weighted bipartite evolving networks

    NASA Astrophysics Data System (ADS)

    Zhang, Danping; Dai, Meifeng; Li, Lei; Zhang, Cheng

    2015-06-01

    Motivated by an evolving model of online bipartite networks, we introduce a model of weighted bipartite evolving networks. In this model, there are two disjoint sets of nodes, called the user node set and the object node set. Edges only exist between the two disjoint sets. Edge weights represent the usage amount between a pair of user and object nodes. This model not only captures the internal growth mechanism of bipartite networks, but also takes into account the deterioration of object strength over time. User strength and object strength follow power-law distributions, respectively. The weighted bipartite evolving networks have the scale-free property in certain situations. Numerical simulation results agree with the theoretical analyses.
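
    A toy simulation of a weighted bipartite growth process is sketched below to show the kind of mechanism such models formalize; the growth rules here (preferential attachment by object strength plus a per-step decay) are our own simplification, not the paper's exact model:

        import random

        def simulate(steps=5000, decay=0.999, seed=1):
            # toy weighted bipartite growth: each step one user uses one object;
            # object choice is preferential in current strength, strengths decay
            random.seed(seed)
            user_strength = {0: 1.0}
            object_strength = {0: 1.0}
            for _ in range(steps):
                # occasionally add a new user or a new object
                if random.random() < 0.05:
                    user_strength[len(user_strength)] = 0.0
                if random.random() < 0.05:
                    object_strength[len(object_strength)] = 1.0
                user = random.choice(list(user_strength))
                # preferential choice of object, proportional to its strength
                objs, weights = zip(*object_strength.items())
                obj = random.choices(objs, weights=weights)[0]
                w = random.random()                 # usage amount on this edge
                user_strength[user] += w
                object_strength[obj] += w
                # deterioration of object strength over time
                for o in object_strength:
                    object_strength[o] *= decay
            return user_strength, object_strength

        users, objects = simulate()
        print("heaviest object strength:", max(objects.values()))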

  7. Access to space: The Space Shuttle's evolving role

    NASA Astrophysics Data System (ADS)

    Duttry, Steven R.

    1993-04-01

    Access to space is of extreme importance to our nation and the world. Military, civil, and commercial space activities all depend on reliable space transportation systems for access to space at a reasonable cost. The Space Transportation System or Space Shuttle was originally planned to provide transportation to and from a manned Earth-orbiting space station. To justify the development and operations costs, the Space Shuttle took on other space transportation requirements to include DoD, civil, and a growing commercial launch market. This research paper or case study examines the evolving role of the Space Shuttle as our nation's means of accessing space. The case study includes a review of the events leading to the development of the Space Shuttle, identifies some of the key players in the decision-making process, examines alternatives developed to mitigate the risks associated with sole reliance on the Space Shuttle, and highlights the impacts of this national space policy following the Challenger accident.

  8. JavaGenes: Evolving Graphs with Crossover

    NASA Technical Reports Server (NTRS)

    Globus, Al; Atsatt, Sean; Lawton, John; Wipke, Todd

    2000-01-01

    Genetic algorithms usually use string or tree representations. We have developed a novel crossover operator for a directed and undirected graph representation, and used this operator to evolve molecules and circuits. Unlike strings or trees, a single point in the representation cannot divide every possible graph into two parts, because graphs may contain cycles. Thus, the crossover operator is non-trivial. A steady-state, tournament selection genetic algorithm code (JavaGenes) was written to implement and test the graph crossover operator. All runs were executed by cycle-scavenging on networked workstations using the Condor batch processing system. The JavaGenes code has evolved pharmaceutical drug molecules and simple digital circuits. Results to date suggest that JavaGenes can evolve moderate-sized drug molecules and very small circuits in reasonable time. The algorithm has greater difficulty with somewhat larger circuits, suggesting that directed graphs (circuits) are more difficult to evolve than undirected graphs (molecules), although necessary differences in the crossover operator may also explain the results. In principle, JavaGenes should be able to evolve other graph-representable systems, such as transportation networks, metabolic pathways, and computer networks. However, large graphs evolve significantly slower than smaller graphs, presumably because the space-of-all-graphs explodes combinatorially with graph size. Since the representation strongly affects genetic algorithm performance, adding graphs to the evolutionary programmer's bag-of-tricks should be beneficial. Also, since graph evolution operates directly on the phenotype, the genotype-phenotype translation step, common in genetic algorithm work, is eliminated.
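
    A much-simplified sketch of a steady-state, tournament-selection genetic algorithm over graph genomes is given below; the crossover shown (keep a random half of each parent's nodes and their internal edges, then add bridging edges) is only loosely inspired by the description above and is not the JavaGenes operator:

        import random

        def random_graph(n, p=0.3):
            # undirected graph as a dict: node -> set of neighbours
            g = {i: set() for i in range(n)}
            for i in range(n):
                for j in range(i + 1, n):
                    if random.random() < p:
                        g[i].add(j); g[j].add(i)
            return g

        def crossover(g1, g2):
            # toy graph crossover: keep a random "half" of each parent's nodes and
            # their internal edges, then add random bridging edges (cycles prevent
            # a clean single cut point, hence this blunt approach)
            h1 = set(random.sample(sorted(g1), len(g1) // 2))
            h2 = set(random.sample(sorted(g2), len(g2) - len(g1) // 2))
            child, relabel = {}, {}
            for src, keep in ((g1, h1), (g2, h2)):
                for node in keep:
                    relabel[(id(src), node)] = len(relabel)
            for src, keep in ((g1, h1), (g2, h2)):
                for node in keep:
                    a = relabel[(id(src), node)]
                    child.setdefault(a, set())
                    for nb in src[node]:
                        if nb in keep:
                            b = relabel[(id(src), nb)]
                            child[a].add(b); child.setdefault(b, set()).add(a)
            nodes = list(child)
            for _ in range(2):                      # bridge the two halves
                a, b = random.sample(nodes, 2)
                child[a].add(b); child[b].add(a)
            return child

        def fitness(g):
            # stand-in objective: prefer graphs whose mean degree is close to 3
            mean_deg = sum(len(v) for v in g.values()) / len(g)
            return -abs(mean_deg - 3.0)

        # steady-state GA with tournament selection, as in the abstract
        random.seed(0)
        pop = [random_graph(10) for _ in range(30)]
        for _ in range(500):
            a, b, worst = random.sample(range(len(pop)), 3)
            p1 = max(a, b, key=lambda i: fitness(pop[i]))
            p2 = min(a, b, key=lambda i: fitness(pop[i]))
            child = crossover(pop[p1], pop[p2])
            if fitness(child) > fitness(pop[worst]):
                pop[worst] = child
        print("best fitness:", max(fitness(g) for g in pop))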

  9. Evolved Massive Stars in the Local Group

    NASA Astrophysics Data System (ADS)

    Drout, M. R.; Massey, P.

    2015-05-01

    In this manuscript we describe a number of recent advances in the study of evolved massive stars in the Local Group, with an emphasis on how representative populations of these stars can be used to test models of massive star evolution. In honor of the 50th anniversary of the Cerro Tololo Inter-American Observatory (CTIO) we attempt to put these findings in some historical context by discussing how our understanding of the various stages in the lives of massive stars has evolved since Cerro Tololo was first selected as the site for the observatory which would become CTIO.

  10. A Stefan problem on an evolving surface

    PubMed Central

    Alphonse, Amal; Elliott, Charles M.

    2015-01-01

    We formulate a Stefan problem on an evolving hypersurface and study the well posedness of weak solutions given L1 data. To do this, we first develop function spaces and results to handle equations on evolving surfaces in order to give a natural treatment of the problem. Then, we consider the existence of solutions for data; this is done by regularization of the nonlinearity. The regularized problem is solved by a fixed point theorem and then uniform estimates are obtained in order to pass to the limit. By using a duality method, we show continuous dependence, which allows us to extend the results to L1 data. PMID:26261364

  11. Dust around main sequence and evolved stars

    NASA Astrophysics Data System (ADS)

    Walker, H. J.; Heinrichsen, I.; Richards, P. J.

    Data for several main sequence and evolved stars, from the photopolarimeter on ISO (ISOPHOT), are presented. Dust shells are resolved for Y CVn and RS Lib at 60 μm. Low resolution spectra from ISOPHOT are shown for several evolved stars, and compared to the spectrum of Vega (a stellar photosphere) and HD 169142 (showing emission features from Polycyclic Aromatic Hydrocarbons). W Lyr shows the signature of oxygen-rich circumstellar material around 3 μm, V Aql and Y CVn the signature of carbon-rich material.

  12. An Evolvable Multi-Agent Approach to Space Operations Engineering

    NASA Technical Reports Server (NTRS)

    Mandutianu, Sanda; Stoica, Adrian

    1999-01-01

    A complex system of spacecraft and ground tracking stations, as well as a constellation of satellites or spacecraft, has to be able to reliably withstand sudden environment changes, resource fluctuations, dynamic resource configuration, limited communication bandwidth, etc., while maintaining the consistency of the system as a whole. It is not known in advance when a change in the environment might occur or when a particular exchange will happen. A higher degree of sophistication for the communication mechanisms between different parts of the system is required. The actual behavior has to be determined while the system is performing and the course of action can be decided at the individual level. Under such circumstances, the solution will highly benefit from increased on-board and on the ground adaptability and autonomy. An evolvable architecture based on intelligent agents that communicate and cooperate with each other can offer advantages in this direction. This paper presents an architecture of an evolvable agent-based system (software and software/hardware hybrids) as well as some plans for further implementation.

  13. A guide for the design of evolve and resequencing studies.

    PubMed

    Kofler, Robert; Schlötterer, Christian

    2014-02-01

    Standing genetic variation provides a rich reservoir of potentially useful mutations facilitating the adaptation to novel environments. Experimental evolution studies have demonstrated that rapid and strong phenotypic responses to selection can also be obtained in the laboratory. When combined with the next-generation sequencing technology, these experiments promise to identify the individual loci contributing to adaption. Nevertheless, until now, very little is known about the design of such evolve & resequencing (E&R) studies. Here, we use forward simulations of entire genomes to evaluate different experimental designs that aim to maximize the power to detect selected variants. We show that low linkage disequilibrium in the starting population, population size, duration of the experiment, and the number of replicates are the key factors in determining the power and accuracy of E&R studies. Furthermore, replication of E&R is more important for detecting the targets of selection than increasing the population size. Using an optimized design, beneficial loci with a selective advantage as low as s = 0.005 can be identified at the nucleotide level. Even when a large number of loci are selected simultaneously, up to 56% can be reliably detected without incurring large numbers of false positives. Our computer simulations suggest that, with an adequate experimental design, E&R studies are a powerful tool to identify adaptive mutations from standing genetic variation and thereby provide an excellent means to analyze the trajectories of selected alleles in evolving populations. PMID:24214537
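
    To illustrate the kind of forward simulation behind such design comparisons (replicates versus population size), a stripped-down Wright-Fisher sketch of a single selected allele is shown below; the population sizes, selection coefficient and detection rule are placeholders, not values from the study:

        import random

        def wright_fisher(pop_size, generations, s, p0, seed):
            # allele-frequency trajectory under selection s with drift
            random.seed(seed)
            p = p0
            for _ in range(generations):
                w_mean = p * (1 + s) + (1 - p)          # mean fitness
                p_sel = p * (1 + s) / w_mean            # frequency after selection
                # binomial sampling of 2N gametes (genetic drift)
                p = sum(random.random() < p_sel for _ in range(2 * pop_size)) / (2 * pop_size)
            return p

        def detected(final_freqs, threshold=0.2):
            # call the locus "selected" if the mean final frequency across
            # replicates exceeds a simple threshold (a stand-in for a real test)
            return sum(final_freqs) / len(final_freqs) > threshold

        # compare two designs with the same total census size:
        # 10 replicates of N=300 versus 2 replicates of N=1500
        start = 0.05
        design_a = [wright_fisher(300, 60, s=0.05, p0=start, seed=i) for i in range(10)]
        design_b = [wright_fisher(1500, 60, s=0.05, p0=start, seed=100 + i) for i in range(2)]
        print("10 x N=300  detected:", detected(design_a))
        print(" 2 x N=1500 detected:", detected(design_b))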

  14. Surveying The Digital Landscape: Evolving Technologies 2004. The EDUCAUSE Evolving Technologies Committee

    ERIC Educational Resources Information Center

    EDUCAUSE Review, 2004

    2004-01-01

    Each year, the members of the EDUCAUSE Evolving Technologies Committee identify and research the evolving technologies that are having the most direct impact on higher education institutions. The committee members choose the relevant topics, write white papers, and present their findings at the EDUCAUSE annual conference. This year, under the…

  15. Photovoltaic system reliability

    SciTech Connect

    Maish, A.B.; Atcitty, C.; Greenberg, D.

    1997-10-01

    This paper discusses the reliability of several photovoltaic projects including SMUD's PV Pioneer project, various projects monitored by Ascension Technology, and the Colorado Parks project. System times-to-failure range from 1 to 16 years, and maintenance costs range from 1 to 16 cents per kilowatt-hour. Factors contributing to the reliability of these systems are discussed, and practices are recommended that can be applied to future projects. This paper also discusses the methodology used to collect and analyze PV system reliability data.

  16. Field-evolved resistance to Bt toxins

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Transgenic cotton expressing Bacillus thuringiensis Cry1Ac (Bt cotton) has been used commercially in the United States since 1996. An article by Tabashnik et al. 2008, Nature Biotechnology 26:199-202, states that, for the first time, there is field-evolved Bt resistance in bollworm, Helicoverpa zea...

  17. Apollo 16 Evolved Lithology Sodic Ferrogabbro

    NASA Technical Reports Server (NTRS)

    Zeigler, Ryan; Jolliff, B. L.; Korotev, R. L.

    2014-01-01

    Evolved lunar igneous lithologies, often referred to as the alkali suite, are a minor but important component of the lunar crust. These evolved samples are incompatible-element rich samples, and are, not surprisingly, most common in the Apollo sites in (or near) the incompatible-element rich region of the Moon known as the Procellarum KREEP Terrane (PKT). The most commonly occurring lithologies are granites (A12, A14, A15, A17), monzogabbro (A14, A15), alkali anorthosites (A12, A14), and KREEP basalts (A15, A17). The Feldspathic Highlands Terrane is not entirely devoid of evolved lithologies, and rare clasts of alkali gabbronorite and sodic ferrogabbro (SFG) have been identified in Apollo 16 station 11 breccias 67915 and 67016. Curiously, nearly all pristine evolved lithologies have been found as small clasts or soil particles, exceptions being KREEP basalts 15382/6 and granitic sample 12013 (which is itself a breccia). Here we reexamine the petrography and geochemistry of two SFG-like particles found in a survey of Apollo 16 2-4 mm particles from the Cayley Plains 62283,7-15 and 62243,10-3 (hereafter 7-15 and 10-3 respectively). We will compare these to previously reported SFG samples, including recent analyses on the type specimen of SFG from lunar breccia 67915.

  18. A Course Evolves-Physical Anthropology.

    ERIC Educational Resources Information Center

    O'Neil, Dennis

    2001-01-01

    Describes the development of an online physical anthropology course at Palomar College (California) that evolved from online tutorials. Discusses the ability to update materials on the Web more quickly than in traditional textbooks; creating Web pages that are readable by most Web browsers; test security issues; and clarifying ownership of online…

  19. Project Evolve User-Adopter Manual.

    ERIC Educational Resources Information Center

    Joiner, Lee M.

    An adult basic education (ABE) program for mentally retarded young adults between the ages of 14 and 26 years, Project Evolve can provide education agencies for educationally handicapped children with detailed information concerning an innovative program. The manual format was developed through interviews with professional educators concerning the…

  20. The Evolving Leadership Path of Visual Analytics

    SciTech Connect

    Kluse, Michael; Peurrung, Anthony J.; Gracio, Deborah K.

    2012-01-02

    This is a requested book chapter for an internationally authored book on visual analytics and related fields, coordinated by a UK university and to be published by Springer in 2012. This chapter is an overview of the leadership strategies that PNNL's Jim Thomas and other stakeholders used to establish visual analytics as a field, and how those strategies may evolve in the future.

  1. The Evolving Office of the Registrar

    ERIC Educational Resources Information Center

    Pace, Harold L.

    2011-01-01

    A healthy registrar's office will continue to evolve as it considers student, faculty, and institutional needs; staff talents and expectations; technological opportunities; economic realities; space issues; work environments; and where the strategic plan is taking the institution in support of the mission. Several recognized leaders in the field…

  2. Did Language Evolve Like the Vertebrate Eye?

    ERIC Educational Resources Information Center

    Botha, Rudolf P.

    2002-01-01

    Offers a critical appraisal of the way in which the idea that human language or some of its features evolved like the vertebrate eye by natural selection is articulated in Pinker and Bloom's (1990) selectionist account of language evolution. Argues that this account is less than insightful because it fails to draw some of the conceptual…

  3. Origins of multicellular evolvability in snowflake yeast.

    PubMed

    Ratcliff, William C; Fankhauser, Johnathon D; Rogers, David W; Greig, Duncan; Travisano, Michael

    2015-01-01

    Complex life has arisen through a series of 'major transitions' in which collectives of formerly autonomous individuals evolve into a single, integrated organism. A key step in this process is the origin of higher-level evolvability, but little is known about how higher-level entities originate and gain the capacity to evolve as an individual. Here we report a single mutation that not only creates a new level of biological organization, but also potentiates higher-level evolvability. Disrupting the transcription factor ACE2 in Saccharomyces cerevisiae prevents mother-daughter cell separation, generating multicellular 'snowflake' yeast. Snowflake yeast develop through deterministic rules that produce geometrically defined clusters that preclude genetic conflict and display a high broad-sense heritability for multicellular traits; as a result they are preadapted to multicellular adaptation. This work demonstrates that simple microevolutionary changes can have profound macroevolutionary consequences, and suggests that the formation of clonally developing clusters may often be the first step to multicellularity. PMID:25600558

  4. Organizational Innovation: Current Research and Evolving Concepts

    ERIC Educational Resources Information Center

    Rowe, Lloyd A.; Boise, William B.

    1974-01-01

    A conceptual framework for organizational innovation can evolve from such ideas as the process of innovation, the climate(s) required, the organizational and societal space affected by an innovation, innovation radicalness, and innovation strategies such as organizational development, functional specialization, and periodicity. (Author/WM)

  5. Leadership for Literacy Coaching: Evolving Research

    ERIC Educational Resources Information Center

    Taylor, Rosemarye T.; Moxley, Dale E.

    2008-01-01

    Leadership for literacy coaching is evolving in both the skills of the literacy coaches and the skills of those they coach. Issues of role clarification, communication with administration, and hesitancy to provide authentic feedback are consistently identified. Trends associated with literacy coaching indicate that they continue on their…

  6. [Families and psychiatry: models and evolving links].

    PubMed

    Frankhauser, Adeline

    2016-01-01

    The role of the families of persons with severe psychiatric disorders (schizophrenia in particular) in the care of their relatives has recently evolved: once seen as pathogenic to be kept at a distance, the family is now recognised by professionals as a partner in the care process. The links between families and psychiatric institutions remain complex and marked by ambivalence and paradoxes. PMID:27157191

  7. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Wilson, Larry W.

    1989-01-01

    The long-term goal of this research is to identify or create a model for use in analyzing the reliability of flight control software. The immediate tasks addressed are the creation of data useful to the study of software reliability and production of results pertinent to software reliability through the analysis of existing reliability models and data. The completed data creation portion of this research consists of a Generic Checkout System (GCS) design document created in cooperation with NASA and Research Triangle Institute (RTI) experimenters. This will lead to design and code reviews with the resulting product being one of the versions used in the Terminal Descent Experiment being conducted by the Systems Validations Methods Branch (SVMB) of NASA/Langley. An appended paper details an investigation of the Jelinski-Moranda and Geometric models for software reliability. The models were given data from a process that they have correctly simulated and asked to make predictions about the reliability of that process. It was found that either model will usually fail to make good predictions. These problems were attributed to randomness in the data and replication of data was recommended.
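
    For reference, the Jelinski-Moranda model mentioned above assumes N initial faults of equal hazard contribution, so the failure rate before the i-th failure is phi * (N - i + 1). A generic textbook fit of N and phi from inter-failure times (not the appended paper's code; the times below are invented) looks like:

        import math

        def jm_log_likelihood(N, phi, times):
            # log-likelihood of inter-failure times under Jelinski-Moranda:
            # the hazard before the i-th failure (1-based) is phi * (N - i + 1)
            ll = 0.0
            for i, t in enumerate(times, start=1):
                lam = phi * (N - i + 1)
                if lam <= 0:
                    return float("-inf")
                ll += math.log(lam) - lam * t
            return ll

        def fit_jelinski_moranda(times, max_extra_faults=200):
            # crude grid fit: for each candidate N the MLE of phi is
            # n / sum((N - i + 1) * t_i); keep the N with the best likelihood
            n = len(times)
            best = None
            for N in range(n, n + max_extra_faults):
                denom = sum((N - i + 1) * t for i, t in enumerate(times, start=1))
                phi = n / denom
                ll = jm_log_likelihood(N, phi, times)
                if best is None or ll > best[2]:
                    best = (N, phi, ll)
            return best

        # illustrative inter-failure times (hours) that stretch out as faults are removed
        times = [4, 6, 9, 11, 15, 22, 31, 45, 70, 120]
        N_hat, phi_hat, _ = fit_jelinski_moranda(times)
        remaining = N_hat - len(times)
        print("estimated remaining faults:", remaining)
        if remaining > 0:
            print("expected time to next failure:", 1.0 / (phi_hat * remaining))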

  8. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  9. Evolving impact of Ada on a production software environment

    NASA Technical Reports Server (NTRS)

    Mcgarry, F.; Esker, L.; Quimby, K.

    1988-01-01

    Many aspects of software development with Ada have evolved as our Ada development environment has matured and personnel have become more experienced in the use of Ada. The Software Engineering Laboratory (SEL) has seen differences in the areas of cost, reliability, reuse, size, and use of Ada features. A first Ada project can be expected to cost about 30 percent more than an equivalent FORTRAN project. However, the SEL has observed significant improvements over time as a development environment progresses to second and third uses of Ada. The reliability of Ada projects is initially similar to what is expected in a mature FORTRAN environment. However, with time, one can expect to gain improvements as experience with the language increases. Reuse is one of the most promising aspects of Ada. The proportion of reusable Ada software on our Ada projects exceeds the proportion of reusable FORTRAN software on our FORTRAN projects. This result was noted fairly early in our Ada projects, and experience shows an increasing trend over time.

  10. Scaled CMOS Technology Reliability Users Guide

    NASA Technical Reports Server (NTRS)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect to the high reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope β = 1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is
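
    The multiple-failure-mechanism idea can be sketched by summing per-mechanism failure rates, each de-rated from stress to use conditions by its own Arrhenius acceleration factor; the activation energies, temperatures and FIT values below are placeholders rather than data from the study:

        import math

        K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

        def arrhenius_af(ea_ev, t_stress_c, t_use_c):
            # Arrhenius acceleration factor from stress temperature to use temperature
            t_stress = t_stress_c + 273.15
            t_use = t_use_c + 273.15
            return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

        def use_condition_fit(mechanisms, t_stress_c, t_use_c):
            # multiple-failure-mechanism model: the total use-condition failure rate
            # is the sum of each mechanism's stress-test rate divided by its own
            # acceleration factor; rates in FIT (failures per 1e9 device-hours)
            total = 0.0
            for _name, fit_at_stress, ea in mechanisms:
                total += fit_at_stress / arrhenius_af(ea, t_stress_c, t_use_c)
            return total

        # placeholder mechanisms observed in an accelerated test at 125 C
        mechanisms = [
            ("early (weak-bit) retention loss", 400.0, 0.3),   # FIT at stress, Ea in eV
            ("main-population wearout",        1200.0, 0.6),
        ]
        print("use-condition failure rate (FIT at 55 C):",
              round(use_condition_fit(mechanisms, t_stress_c=125, t_use_c=55), 1))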

  11. Mission Reliability Estimation for Repairable Robot Teams

    NASA Technical Reports Server (NTRS)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the
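
    A toy Monte Carlo version of the redundancy-versus-repairability trade described above is sketched below; the module failure probabilities and costs are invented, and the published method builds analytic reliability-cost characteristics rather than simulating:

        import random

        # hypothetical per-mission module failure probabilities and unit costs
        MODULES = {"mobility": (0.10, 30.0), "arm": (0.15, 20.0), "sensor": (0.05, 10.0)}

        def mission_success(n_robots, robots_needed, spares, rng):
            # one simulated mission: a robot fails if any of its modules fails and
            # no spare of that type remains to swap in (repairability)
            spares = dict(spares)
            working = 0
            for _ in range(n_robots):
                ok = True
                for name, (p_fail, _cost) in MODULES.items():
                    if rng.random() < p_fail:
                        if spares.get(name, 0) > 0:
                            spares[name] -= 1      # repair with a spare component
                        else:
                            ok = False
                            break
                working += ok
            return working >= robots_needed

        def estimate(n_robots, robots_needed, spares, trials=20000, seed=7):
            rng = random.Random(seed)
            wins = sum(mission_success(n_robots, robots_needed, spares, rng)
                       for _ in range(trials))
            module_cost = {name: cost for name, (_p, cost) in MODULES.items()}
            cost = n_robots * sum(module_cost.values()) \
                 + sum(module_cost[k] * n for k, n in spares.items())
            return wins / trials, cost

        # redundancy (an extra robot) versus repairability (spare components)
        print("3 robots, no spares:          ", estimate(3, 2, {}))
        print("2 robots + spare arm & sensor:", estimate(2, 2, {"arm": 1, "sensor": 1}))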

  12. Reliable aluminum contact formation by electrostatic bonding

    NASA Astrophysics Data System (ADS)

    Kárpáti, T.; Pap, A. E.; Radnóczi, Gy; Beke, B.; Bársony, I.; Fürjes, P.

    2015-07-01

    The paper presents a detailed study of a reliable method developed for aluminum fusion wafer bonding assisted by the electrostatic force evolving during the anodic bonding process. The IC-compatible procedure described allows the parallel formation of electrical and mechanical contacts, facilitating a reliable packaging of electromechanical systems with backside electrical contacts. This fusion bonding method supports the fabrication of complex microelectromechanical systems (MEMS) and micro-opto-electromechanical systems (MOEMS) structures with enhanced temperature stability, which is crucial in mechanical sensor applications such as pressure or force sensors. Due to the applied electrical potential of  -1000 V the Al metal layers are compressed by electrostatic force, and at the bonding temperature of 450 °C intermetallic diffusion causes aluminum ions to migrate between metal layers.

  13. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  14. Orbiter Autoland reliability analysis

    NASA Technical Reports Server (NTRS)

    Welch, D. Phillip

    1993-01-01

    The Space Shuttle Orbiter is the only space reentry vehicle in which the crew is seated upright. This position presents some physiological effects requiring countermeasures to prevent a crewmember from becoming incapacitated. This also introduces a potential need for automated vehicle landing capability. Autoland is a primary procedure that was identified as a requirement for landing following an extended-duration orbiter mission. This report documents the results of the reliability analysis performed on the hardware required for an automated landing. A reliability block diagram was used to evaluate system reliability. The analysis considers the manual and automated landing modes currently available on the Orbiter. (Autoland is presently a backup system only.) Results of this study indicate a +/- 36 percent probability of successfully extending a nominal mission to 30 days. Enough variations were evaluated to verify that the reliability could be altered with mission planning and procedures. If the crew is modeled as being fully capable after 30 days, the probability of a successful manual landing is comparable to that of Autoland because much of the hardware is used for both manual and automated landing modes. The analysis indicates that the reliability for the manual mode is limited by the hardware and depends greatly on crew capability. Crew capability for a successful landing after 30 days has not been determined yet.

  15. Photovoltaic module reliability workshop

    NASA Astrophysics Data System (ADS)

    Mrig, L.

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of Workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986 to 1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if the PV technology is to make a major impact in the power generation market and to compete with the conventional electricity producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, there is still a need for substantial research and testing to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the U.S., PV manufacturers, DOE laboratories, electric utilities and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  16. Photovoltaic module reliability workshop

    SciTech Connect

    Mrig, L.

    1990-01-01

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of Workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986-1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if the PV technology is to make a major impact in the power generation market and to compete with the conventional electricity producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, there is still a need for substantial research and testing to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the US, PV manufacturers, DOE laboratories, electric utilities and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  17. Evolving networks-Using past structure to predict the future

    NASA Astrophysics Data System (ADS)

    Shang, Ke-ke; Yan, Wei-sheng; Small, Michael

    2016-08-01

    Many previous studies on link prediction have focused on using common neighbors to predict the existence of links between pairs of nodes. More broadly, research into the structural properties of evolving temporal networks and temporal link prediction methods has recently attracted increasing attention. In this study, for the first time, we examine the use of links between a pair of nodes to predict their common neighbors and analyze the relationship between weight and structure in static networks, evolving networks, and the corresponding randomized networks. We propose both new unweighted and weighted prediction methods and use six kinds of real networks to test our algorithms. In unweighted networks, we find that if a pair of nodes connect to each other in the current network, they have a higher probability of connecting to common nodes in both the current and the future networks, and this probability decreases as the number of neighbors increases. Furthermore, we find that the original networks have particular structural and statistical characteristics that benefit link prediction. In weighted networks, the prediction performance for networks dominated by human factors decreases as link weight decreases and is in general better in static networks. Furthermore, we find that geographical position and link weight both have a significant influence on the transport network. Moreover, the evolving financial network has the lowest predictability. In addition, we find that the structure of non-social networks is more robust than that of social networks. The structure of engineering networks has both the best predictability and the best robustness.
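
    As background for the neighborhood-based setting described above, the sketch below computes the standard common-neighbor score for unlinked node pairs in an unweighted graph. It is a minimal illustration of common-neighbor link prediction, not the authors' weighted algorithm, and the toy graph is a hypothetical example.

        import itertools

        def common_neighbor_scores(adjacency):
            """Score each unlinked node pair by its number of common neighbors."""
            scores = {}
            for u, v in itertools.combinations(adjacency, 2):
                if v in adjacency[u]:
                    continue  # only score pairs that are not yet linked
                scores[(u, v)] = len(adjacency[u] & adjacency[v])
            return scores

        # Hypothetical toy network given as an adjacency-set dictionary.
        toy_graph = {
            "a": {"b", "c"},
            "b": {"a", "c", "d"},
            "c": {"a", "b"},
            "d": {"b"},
        }
        ranked = sorted(common_neighbor_scores(toy_graph).items(),
                        key=lambda kv: kv[1], reverse=True)
        print(ranked)  # pairs with more common neighbors rank higher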

  18. Where the Wild Things Are: The Evolving Iconography of Rural Fauna

    ERIC Educational Resources Information Center

    Buller, Henry

    2004-01-01

    This paper explores the changing relationship between "nature" and rurality through an examination of the shifting iconography of animals, and particularly "wild" animals, in a rural setting. Drawing upon a set of examples, the paper argues that the faunistic icons of rural areas are evolving as alternative conceptions of the countryside, of…

  19. Evolved gas analysis of secondary organic aerosols

    SciTech Connect

    Grosjean, D.; Williams, E. L., II; Grosjean, E.; Novakov, T.

    1994-11-01

    Secondary organic aerosols have been characterized by evolved gas analysis (EGA). Hydrocarbons selected as aerosol precursors were representative of anthropogenic emissions (cyclohexene, cyclopentene, 1-decene and 1-dodecene, n-dodecane, o-xylene, and 1,3,5-trimethylbenzene) and of biogenic emissions (the terpenes α-pinene, β-pinene and d-limonene and the sesquiterpene trans-caryophyllene). Also analyzed by EGA were samples of secondary, primary (highway tunnel), and ambient (urban) aerosols before and after exposure to ozone and other photochemical oxidants. The major features of the EGA thermograms (amount of CO₂ evolved as a function of temperature) are described. The usefulness and limitations of EGA data for source apportionment of atmospheric particulate carbon are briefly discussed. 28 refs., 7 figs., 4 tabs.

  20. Traceless protein splicing utilizing evolved split inteins

    PubMed Central

    Lockless, Steve W.; Muir, Tom W.

    2009-01-01

    Split inteins are parasitic genetic elements frequently found inserted into reading frames of essential proteins. Their association and excision restore host protein function through a protein self-splicing reaction. They have gained an increasingly important role in the chemical modification of proteins to create cyclical, segmentally labeled, and fluorescently tagged proteins. Ideally, inteins would seamlessly splice polypeptides together with no remnant sequences and at high efficiency. Here, we describe experiments that identify the branched intermediate, a transient step in the overall splicing reaction, as a key determinant of the splicing efficiency at different splice-site junctions. To alter intein specificity, we developed a cell-based selection scheme to evolve split inteins that splice with high efficiency at different splice junctions and at higher temperatures. Mutations within these evolved inteins occur at sites distant from the active site. We present a hypothesis that a network of conserved coevolving amino acids in inteins mediates these long-range effects. PMID:19541616

  1. Nursing administration research: an evolving science.

    PubMed

    Murphy, Lyn Stankiewicz; Scott, Elaine S; Warshawsky, Nora E

    2014-12-01

    The nature and focus of nursing administrative research have evolved over time. Recently, the research agenda has primarily reflected the national health policy agenda. Although nursing research has traditionally been dominated by clinical interests, nursing administrative research has historically addressed the interface of reimbursement, quality, and care delivery systems. This article traces the evolution of nursing administrative research to answer questions relevant to scope, practice, and policy and suggests future directions. PMID:25393136

  2. Design Space Issues for Intrinsic Evolvable Hardware

    NASA Technical Reports Server (NTRS)

    Hereford, James; Gwaltney, David

    2004-01-01

    This paper discusses the problem of increased programming time for intrinsic evolvable hardware (EHW) as the complexity of the circuit grows. As the circuit becomes more complex, more components are needed and a longer programming string, L, is required. We develop equations for the size of the population, n, and the number of generations required for the population to converge, based on L. Our analytical results show that even though the design search space grows as 2^L (assuming a binary programming string), the number of circuit evaluations, n*ngen, only grows as O(Lg3), or slightly less than O(L). This makes evolvable techniques a good tool for exploring large design spaces. The major hurdle for intrinsic EHW is evaluation time for each possible circuit. The evaluation time involves downloading the bit string to the device, updating the device configuration, measuring the output and then transferring the output data to the control processor. Each of these steps must be done for each member of the population. The processing time of the computer becomes negligible since the selection/crossover/mutation steps are only done once per generation. Evaluation time presently limits intrinsic evolvable hardware techniques to designing only small or medium-sized circuits. To evolve large or complicated circuits, several researchers have proposed using hierarchical design or reuse techniques where submodules are combined to form complex circuits. However, these practical approaches limit the search space of available designs and preclude utilizing parasitic coupling or other effects within the programmable device. The practical approaches also raise the issue of why intrinsic EHW techniques do not easily apply to large design spaces, since the analytical results show only an O(L) complexity growth.
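
    The population-size and generation-count scaling discussed above refers to a standard generational genetic algorithm over a binary programming string of length L. The sketch below is a generic GA loop of that kind (tournament selection, one-point crossover, bit-flip mutation); the fitness function is a hypothetical software placeholder, since real intrinsic EHW evaluates each bit string on reconfigurable hardware.

        import random

        def evolve(L=32, n=40, generations=100, p_mut=0.02, fitness=None):
            """Generic generational GA over binary strings of length L."""
            fitness = fitness or (lambda bits: sum(bits))  # placeholder objective (count of ones)
            pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(n)]
            for _ in range(generations):
                scored = [(fitness(ind), ind) for ind in pop]  # one evaluation per member
                def tournament():
                    return max(random.sample(scored, 2))[1]
                nxt = []
                while len(nxt) < n:
                    a, b = tournament(), tournament()
                    cut = random.randrange(1, L)                                 # one-point crossover
                    child = a[:cut] + b[cut:]
                    child = [bit ^ (random.random() < p_mut) for bit in child]   # bit-flip mutation
                    nxt.append(child)
                pop = nxt
            return max(pop, key=fitness)

        print(sum(evolve()))  # best individual's score under the placeholder objective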

  3. The evolving epidemiology of stone disease.

    PubMed

    Roudakova, Ksenia; Monga, Manoj

    2014-01-01

    The epidemiology of kidney stones is evolving - not only is the prevalence increasing, but also the gender gap has narrowed. What drives these changes? Diet, obesity or environmental factors? This article will review the possible explanations for a shift in the epidemiology, with the hope of gaining a better understanding of the extent to which modifiable risk factors play a role on stone formation and what measures may be undertaken for disease prevention in view of these changing trends. PMID:24497682

  4. Quantum games on evolving random networks

    NASA Astrophysics Data System (ADS)

    Pawela, Łukasz

    2016-09-01

    We study the advantages of quantum strategies in evolutionary social dilemmas on evolving random networks. We focus our study on the two-player games: prisoner's dilemma, snowdrift and stag-hunt games. The obtained results show the benefits of quantum strategies for the prisoner's dilemma game. For the other two games, we obtain regions of parameters where the quantum strategies dominate, as well as regions where the classical strategies coexist.

  5. Chemical evolution of viscously evolving galactic discs

    NASA Technical Reports Server (NTRS)

    Clarke, Catherine J.

    1989-01-01

    The ability of the Lin-Pringle (1987) model of galactic disk formation to reproduce the observed radial distributions of total gas surface density and metals in disk galaxies is investigated. It is found that a satisfactory fit is obtained provided that there exists an outer cut-off to the star-forming disk beyond which gas is allowed to viscously evolve. The metallicity gradient is then established by radial inflow of gas from beyond this cut-off.

  6. Continuous evaluation of evolving behavioral intervention technologies.

    PubMed

    Mohr, David C; Cheung, Ken; Schueller, Stephen M; Hendricks Brown, C; Duan, Naihua

    2013-10-01

    Behavioral intervention technologies (BITs) are web-based and mobile interventions intended to support patients and consumers in changing behaviors related to health, mental health, and well-being. BITs are provided to patients and consumers in clinical care settings and commercial marketplaces, frequently with little or no evaluation. Current evaluation methods, including RCTs and implementation studies, can require years to validate an intervention. This timeline is fundamentally incompatible with the BIT environment, where technology advancement and changes in consumer expectations occur quickly, necessitating rapidly evolving interventions. However, BITs can routinely and iteratively collect data in a planned and strategic manner and generate evidence through systematic prospective analyses, thereby creating a system that can "learn." A methodologic framework, Continuous Evaluation of Evolving Behavioral Intervention Technologies (CEEBIT), is proposed that can support the evaluation of multiple BITs or evolving versions, eliminating those that demonstrate poorer outcomes, while allowing new BITs to be entered at any time. CEEBIT could be used to ensure the effectiveness of BITs provided through deployment platforms in clinical care organizations or BIT marketplaces. The features of CEEBIT are described, including criteria for the determination of inferiority, determination of BIT inclusion, methods of assigning consumers to BITs, definition of outcomes, and evaluation of the usefulness of the system. CEEBIT offers the potential to collapse initial evaluation and postmarketing surveillance, providing ongoing assurance of safety and efficacy to patients and consumers, payers, and policymakers. PMID:24050429

  7. Transistor Level Circuit Experiments using Evolvable Hardware

    NASA Technical Reports Server (NTRS)

    Stoica, A.; Zebulum, R. S.; Keymeulen, D.; Ferguson, M. I.; Daud, Taher; Thakoor, A.

    2005-01-01

    The Jet Propulsion Laboratory (JPL) performs research in fault tolerant, long life, and space survivable electronics for the National Aeronautics and Space Administration (NASA). With that focus, JPL has been involved in Evolvable Hardware (EHW) technology research for the past several years. We have advanced the technology not only by simulation and evolution experiments, but also by designing, fabricating, and evolving a variety of transistor-based analog and digital circuits at the chip level. EHW refers to self-configuration of electronic hardware by evolutionary/genetic search mechanisms, thereby maintaining existing functionality in the presence of degradations due to aging, temperature, and radiation. In addition, EHW has the capability to reconfigure itself for new functionality when required for mission changes or encountered opportunities. Evolution experiments are performed using a genetic algorithm running on a DSP as the reconfiguration mechanism and controlling the evolvable hardware mounted on a self-contained circuit board. Rapid reconfiguration allows convergence to circuit solutions on the order of seconds. The paper illustrates hardware evolution results of electronic circuits and their ability to perform at temperatures of 230 C as well as under radiation doses of up to 250 kRad.

  8. Continuous Evaluation of Evolving Behavioral Intervention Technologies

    PubMed Central

    Mohr, David C.; Cheung, Ken; Schueller, Stephen M.; Brown, C. Hendricks; Duan, Naihua

    2013-01-01

    Behavioral intervention technologies (BITs) are web-based and mobile interventions intended to support patients and consumers in changing behaviors related to health, mental health, and well-being. BITs are provided to patients and consumers in clinical care settings and commercial marketplaces, frequently with little or no evaluation. Current evaluation methods, including RCTs and implementation studies, can require years to validate an intervention. This timeline is fundamentally incompatible with the BIT environment, where technology advancement and changes in consumer expectations occur quickly, necessitating rapidly evolving interventions. However, BITs can routinely and iteratively collect data in a planned and strategic manner and generate evidence through systematic prospective analyses, thereby creating a system that can “learn.” A methodologic framework, Continuous Evaluation of Evolving Behavioral Intervention Technologies (CEEBIT), is proposed that can support the evaluation of multiple BITs or evolving versions, eliminating those that demonstrate poorer outcomes, while allowing new BITs to be entered at any time. CEEBIT could be used to ensure the effectiveness of BITs provided through deployment platforms in clinical care organizations or BIT marketplaces. The features of CEEBIT are described, including criteria for the determination of inferiority, determination of BIT inclusion, methods of assigning consumers to BITs, definition of outcomes, and evaluation of the usefulness of the system. CEEBIT offers the potential to collapse initial evaluation and postmarketing surveillance, providing ongoing assurance of safety and efficacy to patients and consumers, payers, and policymakers. PMID:24050429

  9. Evolving hardware as model of enzyme evolution.

    PubMed

    Lahoz-Beltra, R

    2001-06-01

    Organism growth and survival is based on thousands of enzymes organized in networks. The motivation to understand how a large number of enzymes evolved so fast inside cells may be relevant to explaining the origin and maintenance of life on Earth. This paper presents electronic circuits called 'electronic enzymes' that model the catalytic function performed by biological enzymes. Electronic enzymes are the hardware realization of enzymes defined as molecular automata with a finite number of internal conformational states and a set of Boolean operators modelling the active groups of the active site. One of the main features of electronic enzymes is the possibility of evolution finding the proper active site by means of a genetic algorithm, yielding a metabolic ring or k-cycle that bears a resemblance to the Krebs (k=7) or Calvin (k=4) cycles present in organisms. The simulations are consistent with results obtained by evolving enzymes in vitro using the polymerase chain reaction (PCR), as well as with the general view that recombination plays the main role during enzyme evolution. The proposed methodology shows how molecular automata with evolvable features that model enzymes or other processing molecules provide an experimental framework for simulating the principles governing metabolic pathway evolution and self-organization. PMID:11448522

  10. Software reliability perspectives

    NASA Technical Reports Server (NTRS)

    Wilson, Larry; Shen, Wenhui

    1987-01-01

    Software which is used in life-critical functions must be known to be highly reliable before installation. This requires a strong testing program to estimate the reliability, since neither formal methods, software engineering nor fault tolerant methods can guarantee perfection. Prior to final testing, software goes through a debugging period, and many models have been developed to try to estimate reliability from the debugging data. However, the existing models are poorly validated and often give poor performance. This paper emphasizes the fact that part of their failures can be attributed to the random nature of the debugging data given to these models as input, and it poses the problem of correcting this defect as an area of future research.

  11. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  12. Evolving Stochastic Learning Algorithm based on Tsallis entropic index

    NASA Astrophysics Data System (ADS)

    Anastasiadis, A. D.; Magoulas, G. D.

    2006-03-01

    In this paper, inspired by our previous algorithm, which was based on the theory of Tsallis statistical mechanics, we develop a new evolving stochastic learning algorithm for neural networks. The new algorithm combines deterministic and stochastic search steps by employing a different adaptive stepsize for each network weight, and applies a form of noise that is characterized by the nonextensive entropic index q, regulated by a weight decay term. The behavior of the learning algorithm can be made more stochastic or deterministic depending on the trade-off between the temperature T and the q values. This is achieved by introducing a formula that defines a time-dependent relationship between these two important learning parameters. Our experimental study verifies that there are indeed improvements in the convergence speed of this new evolving stochastic learning algorithm, which makes learning faster than using the original Hybrid Learning Scheme (HLS). In addition, experiments are conducted to explore the influence of the entropic index q and temperature T on the convergence speed and stability of the proposed method.
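
    As a rough illustration of the kind of update described above, the sketch below combines a deterministic gradient step with noise whose scale is annealed by a temperature schedule. It is a generic noisy-descent sketch under assumed forms for the schedule and a placeholder quadratic objective, not the authors' Tsallis-based update rule.

        import random

        def quadratic_grad(w, target):
            """Gradient of a placeholder objective 0.5 * sum((w_i - target_i)^2)."""
            return [wi - ti for wi, ti in zip(w, target)]

        def noisy_descent(w, target, steps=500, step_size=0.05, t0=1.0, decay=0.99):
            """Deterministic gradient step plus temperature-scaled Gaussian noise per weight."""
            temperature = t0
            for _ in range(steps):
                grad = quadratic_grad(w, target)
                w = [wi - step_size * gi + random.gauss(0.0, temperature * step_size)
                     for wi, gi in zip(w, grad)]
                temperature *= decay  # assumed annealing schedule; cools toward deterministic search
            return w

        print(noisy_descent([5.0, -3.0], target=[1.0, 2.0]))  # should end close to [1.0, 2.0]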

  13. Reliability Degradation Due to Stockpile Aging

    SciTech Connect

    Robinson, David G.

    1999-04-01

    The objective of this research is the investigation of alternative methods for characterizing the reliability of systems with time dependent failure modes associated with stockpile aging. Reference to 'reliability degradation' has, unfortunately, come to be associated with all types of aging analyses: both deterministic and stochastic. In this research, in keeping with the true theoretical definition, reliability is defined as a probabilistic description of system performance as a function of time. Traditional reliability methods used to characterize stockpile reliability depend on the collection of a large number of samples or observations. Clearly, after the experiments have been performed and the data has been collected, critical performance problems can be identified. A major goal of this research is to identify existing methods and/or develop new mathematical techniques and computer analysis tools to anticipate stockpile problems before they become critical issues. One of the most popular methods for characterizing the reliability of components, particularly electronic components, assumes that failures occur in a completely random fashion, i.e., uniformly across time. This method is based primarily on the use of constant failure rates for the various elements that constitute the weapon system, i.e., the systems do not degrade while in storage. Experience has shown that predictions based upon this approach should be regarded with great skepticism since the relationship between the life predicted and the observed life has been difficult to validate. In addition to this fundamental problem, the approach does not recognize that there are time dependent material properties and variations associated with the manufacturing process and the operational environment. To appreciate the uncertainties in predicting system reliability a number of alternative methods are explored in this report. All of the methods are very different from those currently used to assess stockpile
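
    For reference, the constant-failure-rate assumption criticized above corresponds to an exponential reliability model, R(t) = exp(-lambda*t), with a series system multiplying its components' reliabilities; the short sketch below shows that textbook calculation with hypothetical failure rates.

        import math

        def exponential_reliability(failure_rate, hours):
            """Reliability of a constant-failure-rate component: R(t) = exp(-lambda * t)."""
            return math.exp(-failure_rate * hours)

        def series_system_reliability(failure_rates, hours):
            """A series system works only if every component works."""
            r = 1.0
            for lam in failure_rates:
                r *= exponential_reliability(lam, hours)
            return r

        # Hypothetical failure rates (failures per hour) for three components.
        rates = [1e-6, 5e-7, 2e-6]
        print(series_system_reliability(rates, hours=10000))  # reliability over 10,000 hours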

  14. Gearbox Reliability Collaborative Update (Presentation)

    SciTech Connect

    Sheng, S.

    2013-10-01

    This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.

  15. Materials reliability issues in microelectronics

    SciTech Connect

    Lloyd, J. R.; Yost, F. G.; Ho, P. S.

    1991-01-01

    This book covers the proceedings of an MRS symposium on materials reliability in microelectronics. Topics include: electromigration; stress effects on reliability; stress and packaging; metallization; device, oxide and dielectric reliability; new investigative techniques; and corrosion.

  16. An Examination of the Reliability of Scores from Zuckerman's Sensation Seeking Scales, Form V.

    ERIC Educational Resources Information Center

    Deditius-Island, Heide K.; Caruso, John C.

    2002-01-01

    Conducted a reliability generalization study on Zuckerman's Sensation Seeking Scale (M. Zuckerman and others, 1964) using 113 reliability coefficients from 21 published studies. The reliability of scores was marginal for four of the five scales, and low for the other. Mean age of subjects has a significant relationship with score reliability. (SLD)

  17. The organization and control of an evolving interdependent population

    PubMed Central

    Vural, Dervis C.; Isakov, Alexander; Mahadevan, L.

    2015-01-01

    Starting with Darwin, biologists have asked how populations evolve from a low fitness state that is evolutionarily stable to a high fitness state that is not. Specifically of interest is the emergence of cooperation and multicellularity where the fitness of individuals often appears in conflict with that of the population. Theories of social evolution and evolutionary game theory have produced a number of fruitful results employing two-state two-body frameworks. In this study, we depart from this tradition and instead consider a multi-player, multi-state evolutionary game, in which the fitness of an agent is determined by its relationship to an arbitrary number of other agents. We show that populations organize themselves in one of four distinct phases of interdependence depending on one parameter, selection strength. Some of these phases involve the formation of specialized large-scale structures. We then describe how the evolution of independence can be manipulated through various external perturbations. PMID:26040593

  18. The organization and control of an evolving interdependent population.

    PubMed

    Vural, Dervis C; Isakov, Alexander; Mahadevan, L

    2015-07-01

    Starting with Darwin, biologists have asked how populations evolve from a low fitness state that is evolutionarily stable to a high fitness state that is not. Specifically of interest is the emergence of cooperation and multicellularity where the fitness of individuals often appears in conflict with that of the population. Theories of social evolution and evolutionary game theory have produced a number of fruitful results employing two-state two-body frameworks. In this study, we depart from this tradition and instead consider a multi-player, multi-state evolutionary game, in which the fitness of an agent is determined by its relationship to an arbitrary number of other agents. We show that populations organize themselves in one of four distinct phases of interdependence depending on one parameter, selection strength. Some of these phases involve the formation of specialized large-scale structures. We then describe how the evolution of independence can be manipulated through various external perturbations. PMID:26040593

  19. Wild Origins: The Evolving Nature of Animal Behavior

    NASA Astrophysics Data System (ADS)

    Flores, Ifigenia

    For billions of years, evolution has been the driving force behind the incredible range of biodiversity on our planet. Wild Origins is a concept plan for an exhibition at the National Zoo that uses case studies of animal behavior to explain the theory of evolution. Behaviors evolve, just as physical forms do. Understanding natural selection can help us interpret animal behavior and vice-versa. A living collection, digital media, interactives, fossils, and photographs will relay stories of social behavior, sex, navigation and migration, foraging, domestication, and relationships between different species. The informal learning opportunities visitors are offered at the zoo will create a connection with the exhibition's teaching points. Visitors will leave with an understanding and sense of wonder at the evolutionary view of life.

  20. Bioharness™ Multivariable Monitoring Device: Part. II: Reliability

    PubMed Central

    Johnstone, James A.; Ford, Paul A.; Hughes, Gerwyn; Watson, Tim; Garrett, Andrew T.

    2012-01-01

    The Bioharness™ monitoring system may provide physiological information on human performance, but the reliability of these data is fundamental for confidence in the equipment being used. The objective of this study was to assess the reliability of each of the 5 Bioharness™ variables using a treadmill-based protocol. Ten healthy males participated. A between- and within-subject design to assess the reliability of heart rate (HR), breathing frequency (BF), accelerometry (ACC) and infra-red skin temperature (ST) was completed via a repeated, discontinuous, incremental treadmill protocol. Posture (P) was assessed by a tilt table moved through 160°. Between-subject data showed a low coefficient of variation (CV) and strong correlations (r) for ACC and P (CV < 7.6; r = 0.99, p < 0.01). In contrast, HR and BF (CV ~19.4; r ~0.70, p < 0.01) and ST (CV 3.7; r = 0.61, p < 0.01) presented more variable data. Intra- and inter-device data presented strong relationships (r > 0.89, p < 0.01) and low CV (<10.1) for HR, ACC, P and ST. BF produced weaker relationships (r < 0.72) and higher CV (<17.4). Compared with the other variables, BF consistently presented less reliability. Global results suggest that the Bioharness™ is a reliable multivariable monitoring device during laboratory testing within the limits presented. Key points: heart rate and breathing frequency data increased in variance at higher velocities (i.e., ≥ 10 km.h-1); in comparison to the between-subject testing, the intra- and inter-device testing presented good reliability, suggesting that placement or position of the device relative to the performer could be important for data collection; understanding a device's variability in measurement is important before it can be used within an exercise testing or monitoring setting. PMID:24149347
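
    The reliability statistics reported above (coefficient of variation and Pearson's r between repeated trials) are straightforward to compute; the sketch below evaluates both for two hypothetical repeated-measurement series.

        import statistics

        def coefficient_of_variation(values):
            """CV expressed as a percentage of the mean."""
            return 100.0 * statistics.stdev(values) / statistics.mean(values)

        def pearson_r(x, y):
            """Pearson correlation between paired trials."""
            mx, my = statistics.mean(x), statistics.mean(y)
            num = sum((a - mx) * (b - my) for a, b in zip(x, y))
            den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
            return num / den

        # Hypothetical heart-rate readings (beats per minute) from two repeated trials.
        trial_1 = [92, 110, 128, 145, 160]
        trial_2 = [95, 108, 131, 142, 163]
        print(coefficient_of_variation(trial_1))
        print(pearson_r(trial_1, trial_2))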

  1. Reliable solar cookers

    SciTech Connect

    Magney, G.K.

    1992-12-31

    The author describes the activities of SERVE, a Christian relief and development agency, to introduce solar ovens to the Afghan refugees in Pakistan. It has provided 5,000 solar cookers since 1984. The experience has demonstrated the potential of the technology and the need for a durable and reliable product. Common complaints about the cookers are discussed and the ideal cooker is described.

  2. Parametric Mass Reliability Study

    NASA Technical Reports Server (NTRS)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon board of PCBs, typically are the most reliable. Meanwhile, components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs and the mass of ORU subcomponents to reliability.

  3. Designing reliability into accelerators

    SciTech Connect

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the "factories," reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: Concept; design; motivation; management techniques; and fault diagnosis.

  5. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab-type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost-effective in the real world; thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost-effective manner. The context of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data, which were then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens which are in AIR-LAB to measure the performance of reliability models.
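
    The Basic and Log-Poisson models mentioned above are commonly associated with Musa's basic execution-time model and the Musa-Okumoto logarithmic Poisson model, whose mean-value functions give the expected number of failures observed by execution time t. The sketch below evaluates both curves for hypothetical parameter values; treat the attribution and the parameters as assumptions rather than details of this report.

        import math

        def basic_mean_failures(t, total_failures, initial_intensity):
            """Musa basic execution-time model: mu(t) = v0 * (1 - exp(-lambda0 * t / v0))."""
            return total_failures * (1.0 - math.exp(-initial_intensity * t / total_failures))

        def log_poisson_mean_failures(t, initial_intensity, decay):
            """Musa-Okumoto logarithmic Poisson model: mu(t) = ln(lambda0 * theta * t + 1) / theta."""
            return math.log(initial_intensity * decay * t + 1.0) / decay

        # Hypothetical parameters: 100 expected total faults, initial intensity 0.5 failures/hour.
        for hours in (10, 100, 1000):
            print(hours,
                  round(basic_mean_failures(hours, total_failures=100, initial_intensity=0.5), 1),
                  round(log_poisson_mean_failures(hours, initial_intensity=0.5, decay=0.05), 1))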

  6. Software Reliability Measurement Experience

    NASA Technical Reports Server (NTRS)

    Nikora, A. P.

    1993-01-01

    In this chapter, we describe a recent study of software reliability measurement methods that was conducted at the Jet Propulsion Laboratory. The first section of the chapter, section 8.1, summarizes the study, characterizes the participating projects, describes the available data, and summarizes the study's results.

  7. Nonparametric Methods in Reliability

    PubMed Central

    Hollander, Myles; Peña, Edsel A.

    2005-01-01

    Probabilistic and statistical models for the occurrence of a recurrent event over time are described. These models have applicability in the reliability, engineering, biomedical and other areas where a series of events occurs for an experimental unit as time progresses. Nonparametric inference methods, in particular, the estimation of a relevant distribution function, are described. PMID:16710444
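
    One simple nonparametric summary for recurrent-event data of this kind is the sample mean cumulative function, the average number of events per unit observed by each time point. The sketch below estimates it from hypothetical repair times, assuming every unit is followed over the same observation window; it is an illustration of the general setting, not the specific estimators described in the article.

        def mean_cumulative_function(event_times_per_unit):
            """Average cumulative number of recurrent events per unit at each event time."""
            n_units = len(event_times_per_unit)
            all_times = sorted({t for times in event_times_per_unit for t in times})
            mcf = []
            for t in all_times:
                events_by_t = sum(sum(1 for e in times if e <= t) for times in event_times_per_unit)
                mcf.append((t, events_by_t / n_units))
            return mcf

        # Hypothetical repair times (hours) for three units followed over the same window.
        repairs = [[120, 540, 910], [300, 880], [150, 400, 760, 990]]
        for t, value in mean_cumulative_function(repairs):
            print(t, round(value, 2))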

  8. Conceptualizing Essay Tests' Reliability and Validity: From Research to Theory

    ERIC Educational Resources Information Center

    Badjadi, Nour El Imane

    2013-01-01

    The current paper on writing assessment surveys the literature on the reliability and validity of essay tests. The paper aims to examine the two concepts in relationship with essay testing as well as to provide a snapshot of the current understandings of the reliability and validity of essay tests as drawn in recent research studies. Bearing in…

  9. The Mineralogy of Dust Around Evolved Stars

    NASA Astrophysics Data System (ADS)

    Speck, A. K.

    1998-11-01

    Infrared (IR) observations of evolved red giant stars (AGB stars) have shown that many are surrounded by dust envelopes, which are ejected into the interstellar medium and seed the next generation of stars and planets. By studying these one can understand the origins of interstellar and solar system materials. AGB stars fall into two main categories: oxygen-rich and carbon-rich. The prominent features of the IR spectra of AGB stars are: the 11.3 micron feature of C-stars, attributed to silicon carbide (SiC); and the 9.7 micron feature of O-rich stars, attributed to silicates. There are also various minor features with less secure identifications. Identifying dust around stars requires the use of laboratory spectra of dust species analogous to those one expects to observe. I have compiled a database of such spectra, and thereby constrained the identifications of circumstellar dust, which I have also tried to ensure are compatible with data from meteoritic presolar grains. Some laboratory spectra need to be modified before they are relevant to the problem in hand, i.e. stardust. The techniques used for such modifications are outlined in the thesis. In order to fully comprehend the problems that can arise from using laboratory spectra, the way in which light interacts with matter must be understood. To this end the optical properties of matter are discussed. While the mineral constituents of the Earth have been reprocessed so extensively that they no longer contain any evidence of their stellar origins, the same is not true of primitive meteorites which contain "presolar" dust grains with isotopic fingerprints identifying their stellar sources. By comparing these presolar grains with nucleosynthesis models, grains expected to form around various stars and observational evidence of dust, we can gain a better picture of the formation mechanisms and sites of the various dust grains. I have investigated the mineralogy of SiC of 32 C-stars and its relationship to

  10. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection

    NASA Astrophysics Data System (ADS)

    Janković, Srdja; Ćirković, Milan M.

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.

  11. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection.

    PubMed

    Janković, Srdja; Ćirković, Milan M

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection. PMID:26419865

  12. Information entropy as a measure of genetic diversity and evolvability in colonization.

    PubMed

    Day, Troy

    2015-05-01

    In recent years, several studies have examined the relationship between genetic diversity and establishment success in colonizing species. Many of these studies have shown that genetic diversity enhances establishment success. There are several hypotheses that might explain this pattern, and here I focus on the possibility that greater genetic diversity results in greater evolvability during colonization. Evaluating the importance of this mechanism first requires that we quantify evolvability. Currently, most measures of evolvability have been developed for quantitative traits whereas many studies of colonization success deal with discrete molecular markers or phenotypes. The purpose of this study is to derive a suitable measure of evolvability for such discrete data. I show that under certain assumptions, Shannon's information entropy of the allelic distribution provides a natural measure of evolvability. This helps to alleviate previous concerns about the interpretation of information entropy for genetic data. I also suggest that information entropy provides a natural generalization to previous measures of evolvability for quantitative traits when the trait distributions are not necessarily multivariate normal. PMID:25604806
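
    For discrete allelic data, the measure discussed above is Shannon's entropy H = -sum(p_i * ln p_i) over the allele frequencies; the sketch below computes it from hypothetical allele counts.

        import math

        def shannon_entropy(allele_counts):
            """Shannon information entropy (in nats) of an allele-frequency distribution."""
            total = sum(allele_counts)
            frequencies = [c / total for c in allele_counts if c > 0]
            return -sum(p * math.log(p) for p in frequencies)

        # Hypothetical counts of four alleles at one locus in a founding population.
        print(shannon_entropy([12, 7, 3, 1]))   # more even distributions give higher entropy
        print(shannon_entropy([23, 0, 0, 0]))   # a fixed allele gives zero entropy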

  13. Survivability Is More Fundamental Than Evolvability

    PubMed Central

    Palmer, Michael E.; Feldman, Marcus W.

    2012-01-01

    For a lineage to survive over long time periods, it must sometimes change. This has given rise to the term evolvability, meaning the tendency to produce adaptive variation. One lineage may be superior to another in terms of its current standing variation, or it may tend to produce more adaptive variation. However, evolutionary outcomes depend on more than standing variation and produced adaptive variation: deleterious variation also matters. Evolvability, as most commonly interpreted, is not predictive of evolutionary outcomes. Here, we define a predictive measure of the evolutionary success of a lineage that we call the k-survivability, defined as the probability that the lineage avoids extinction for k generations. We estimate the k-survivability using multiple experimental replicates. Because we measure evolutionary outcomes, the initial standing variation, the full spectrum of generated variation, and the heritability of that variation are all incorporated. Survivability also accounts for the decreased joint likelihood of extinction of sub-lineages when they 1) disperse in space, or 2) diversify in lifestyle. We illustrate measurement of survivability with in silico models, and suggest that it may also be measured in vivo using multiple longitudinal replicates. The k-survivability is a metric that enables the quantitative study of, for example, the evolution of 1) mutation rates, 2) dispersal mechanisms, 3) the genotype-phenotype map, and 4) sexual reproduction, in temporally and spatially fluctuating environments. Although these disparate phenomena evolve by well-understood microevolutionary rules, they are also subject to the macroevolutionary constraint of long-term survivability. PMID:22723844
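
    Since k-survivability is the probability that a lineage avoids extinction for k generations, it can be estimated as the fraction of replicate lineages still alive after k generations. The sketch below does this for a hypothetical branching-process lineage standing in for the in silico populations described above; the offspring distribution and parameter values are assumptions for illustration only.

        import math
        import random

        def poisson(lam):
            """Poisson sample via Knuth's multiplication method."""
            threshold = math.exp(-lam)
            k, p = 0, 1.0
            while True:
                p *= random.random()
                if p <= threshold:
                    return k
                k += 1

        def lineage_survives(k, mean_offspring=1.05, founders=10):
            """Hypothetical lineage model: each individual leaves a Poisson number of offspring."""
            population = founders
            for _ in range(k):
                population = sum(poisson(mean_offspring) for _ in range(population))
                if population == 0:
                    return False
            return True

        def k_survivability(k, replicates=200):
            """Fraction of replicate lineages that avoid extinction for k generations."""
            return sum(lineage_survives(k) for _ in range(replicates)) / replicates

        print(k_survivability(k=50))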

  14. Production and decay of evolving horizons

    NASA Astrophysics Data System (ADS)

    Nielsen, Alex B.; Visser, Matt

    2006-07-01

    We consider a simple physical model for an evolving horizon that is strongly interacting with its environment, exchanging arbitrarily large quantities of matter with its environment in the form of both infalling material and outgoing Hawking radiation. We permit fluxes of both lightlike and timelike particles to cross the horizon, and ask how the horizon grows and shrinks in response to such flows. We place a premium on providing a clear and straightforward exposition with simple formulae. To be able to handle such a highly dynamical situation in a simple manner we make one significant physical restriction—that of spherical symmetry—and two technical mathematical restrictions: (1) we choose to slice the spacetime in such a way that the spacetime foliations (and hence the horizons) are always spherically symmetric; (2) we adopt Painlevé-Gullstrand coordinates (which are well suited to the problem because they are nonsingular at the horizon) in order to simplify the relevant calculations. Of course physics results are ultimately independent of the choice of coordinates, but this particular coordinate system yields a clean physical interpretation of the relevant physics. We find particularly simple forms for surface gravity, and for the first and second law of black hole thermodynamics, in this general evolving horizon situation. Furthermore, we relate our results to Hawking's apparent horizon, Ashtekar and co-workers' isolated and dynamical horizons, and Hayward's trapping horizon. The evolving black hole model discussed here will be of interest, both from an astrophysical viewpoint in terms of discussing growing black holes and from a purely theoretical viewpoint in discussing black hole evaporation via Hawking radiation.
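
    For reference, the Painlevé-Gullstrand slicing referred to above writes the spherically symmetric line element in a form that is regular at the horizon. A standard way of writing it, here with a general mass function m(t, r) as an assumed notation rather than the paper's exact conventions, is

        ds^{2} = -\left(1 - \frac{2m(t,r)}{r}\right) dt^{2}
                 + 2\sqrt{\frac{2m(t,r)}{r}}\, dt\, dr + dr^{2} + r^{2}\, d\Omega^{2},

    so that the metric components remain finite where 2m(t,r) = r, which is what makes these coordinates convenient for tracking an evolving horizon.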

  15. Evolving Black Holes with Wavy Initial Data

    NASA Astrophysics Data System (ADS)

    Kelly, Bernard; Tichy, Wolfgang; Zlochower, Yosef; Campanelli, Manuela; Whiting, Bernard

    2009-05-01

    In Kelly et al. [Phys. Rev. D v. 76, 024008 (2007)], we presented new binary black-hole initial data adapted to puncture evolutions in numerical relativity. This data satisfies the constraint equations to 2.5 post-Newtonian order, and contains a transverse-traceless "wavy" metric contribution, violating the standard assumption of conformal flatness. We report on progress in evolving this data with a modern moving-puncture implementation of the BSSN equations in several numerical codes. We will discuss the effect of the new metric terms on junk radiation and continuity of physical radiation extracted.

  16. Present weather and climate: evolving conditions

    USGS Publications Warehouse

    Hoerling, Martin P; Dettinger, Michael; Wolter, Klaus; Lukas, Jeff; Eischeid, Jon K.; Nemani, Rama; Liebmann, Brant; Kunkel, Kenneth E.

    2013-01-01

    This chapter assesses weather and climate variability and trends in the Southwest, using observed climate and paleoclimate records. It analyzes the last 100 years of climate variability in comparison to the last 1,000 years, and links the important features of evolving climate conditions to river flow variability in four of the region’s major drainage basins. The chapter closes with an assessment of the monitoring and scientific research needed to increase confidence in understanding when climate episodes, events, and phenomena are attributable to human-caused climate change.

  17. Evolvable circuit with transistor-level reconfigurability

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian (Inventor); Salazar-Lazaro, Carlos Harold (Inventor)

    2004-01-01

    An evolvable circuit includes a plurality of reconfigurable switches, a plurality of transistors within a region of the circuit, the plurality of transistors having terminals, the plurality of transistors being coupled between a power source terminal and a power sink terminal so as to be capable of admitting power between the power source terminal and the power sink terminal, the plurality of transistors being coupled so that every transistor terminal to transistor terminal coupling within the region of the circuit comprises a reconfigurable switch.

  18. Earth As an Evolving Planetary System

    NASA Astrophysics Data System (ADS)

    Meert, Joseph G.

    2005-05-01

    "System" is an overused buzzword in textbooks covering geological sciences. Describing the Earth as a system of component parts is a reasonable concept, but providing a comprehensive framework for detailing the system is a more formidable task. Kent Condie lays out the systems approach in an easy-to-read introductory chapter in Earth as an Evolving Planetary System. In the book, Condie makes a valiant attempt at taking the mélange of diverse subjects in the solid Earth sciences and weaving them into a coherent tapestry.

  19. Mobile computing acceptance grows as applications evolve.

    PubMed

    Porn, Louis M; Patrick, Kelly

    2002-01-01

    Handheld devices are becoming more cost-effective to own, and their use in healthcare environments is increasing. Handheld devices currently are being used for e-prescribing, charge capture, and accessing daily schedules and reference tools. Future applications may include education on medications, dictation, order entry, and test-results reporting. Selecting the right handheld device requires careful analysis of current and future applications, as well as vendor expertise. It is important to recognize the technology will continue to evolve over the next three years. PMID:11806321

  20. Scar State on Time-evolving Wavepacket

    NASA Astrophysics Data System (ADS)

    Tomiya, Mitsuyoshi; Tsuyuki, Hiroyoshi; Kawamura, Kentaro; Sakamoto, Shoichi; Heller, Eric J.

    2015-09-01

    The scar-like enhancement is found in the accumulation of the time-evolving wavepacket in the stadium billiard. It appears close to unstable periodic orbits when the wavepackets are launched along the orbits. The enhancement is essentially due to the same mechanism as the well-known scar states in stationary eigenstates. The weighted spectral function reveals that the enhancement is the pileup of contributions from scar states on the same periodic orbit. The availability of the weighted spectrum to the semiclassical approximation is also discussed.

  1. Evolving techniques for gastrointestinal endoscopic hemostasis treatment.

    PubMed

    Ghassemi, Kevin A; Jensen, Dennis M

    2016-05-01

    With mortality due to gastrointestinal (GI) bleeding remaining stable, the focus on endoscopic hemostasis has been on improving other outcomes such as rebleeding rate, need for transfusions, and need for angiographic embolization or surgery. Over the past few years, a number of devices have emerged to help endoscopically assess and treat bleeding GI lesions. These include the Doppler endoscopic probe, hemostatic powder, and over-the-scope clip. Also, new applications have been described for radiofrequency ablation. In this article, we will discuss these evolving tools and techniques that have been developed, including an analysis of their efficacy and limitations. PMID:26651414

  2. Reliability-based design optimization under stationary stochastic process loads

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Du, Xiaoping

    2016-08-01

    Time-dependent reliability-based design ensures the satisfaction of reliability requirements for a given period of time, but with a high computational cost. This work improves the computational efficiency by extending the sequential optimization and reliability analysis (SORA) method to time-dependent problems with both stationary stochastic process loads and random variables. The challenge of the extension is the identification of the most probable point (MPP) associated with time-dependent reliability targets. Since a direct relationship between the MPP and reliability target does not exist, this work defines the concept of equivalent MPP, which is identified by the extreme value analysis and the inverse saddlepoint approximation. With the equivalent MPP, the time-dependent reliability-based design optimization is decomposed into two decoupled loops: deterministic design optimization and reliability analysis, and both are performed sequentially. Two numerical examples are used to show the efficiency of the proposed method.
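
    In the generic notation commonly used for such problems (an assumption about notation, not necessarily the authors' exact symbols), the time-dependent reliability requirement over a design life [0, T] can be written as

        R(0, T) = \Pr\left\{\, g\big(\mathbf{d}, \mathbf{X}, \mathbf{Y}(t)\big) > 0, \;\; \forall\, t \in [0, T] \,\right\} \ge R_{\text{target}},

    where d collects the design variables, X the random variables, Y(t) the stationary stochastic process loads, and g > 0 defines the safe state; SORA-style methods decouple the optimization over d from the reliability analysis that checks this constraint.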

  3. Initial value sensitivity of the Chinese stock market and its relationship with the investment psychology

    NASA Astrophysics Data System (ADS)

    Ying, Shangjun; Li, Xiaojun; Zhong, Xiuqin

    2015-04-01

    This paper discusses the initial value sensitivity (IVS) of the Chinese stock market, including the single stock market and the Chinese A-share stock market, with respect to real markets and evolving models. The aim is to explore the relationship between the IVS of the Chinese A-share stock market and investment psychology based on the evolving model of a genetic cellular automaton (GCA). We find: (1) The Chinese stock market is sensitively dependent on the initial conditions. (2) The GCA model provides considerable reliability in complexity simulation (e.g. the IVS). (3) The IVS of the stock market is positively correlated with the imitation probability when the intensity of the imitation psychology reaches a certain threshold. The paper suggests that the government should seek to keep the imitation psychology below a certain level; otherwise it may induce severe fluctuations in the market.

  4. Space Shuttle Propulsion System Reliability

    NASA Technical Reports Server (NTRS)

    Welzyn, Ken; VanHooser, Katherine; Moore, Dennis; Wood, David

    2011-01-01

    This session includes the following presentations: (1) External Tank (ET) System Reliability and Lessons, (2) Space Shuttle Main Engine (SSME), Reliability Validated by a Million Seconds of Testing, (3) Reusable Solid Rocket Motor (RSRM) Reliability via Process Control, and (4) Solid Rocket Booster (SRB) Reliability via Acceptance and Testing.

  5. Human Reliability Program Workshop

    SciTech Connect

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  6. Reliable broadcast protocols

    NASA Technical Reports Server (NTRS)

    Joseph, T. A.; Birman, Kenneth P.

    1989-01-01

    A number of broadcast protocols that are reliable subject to a variety of ordering and delivery guarantees are considered. Developing applications that are distributed over a number of sites and/or must tolerate the failures of some of them becomes a considerably simpler task when such protocols are available for communication. Without such protocols the kinds of distributed applications that can reasonably be built will have a very limited scope. As the trend towards distribution and decentralization continues, it will not be surprising if reliable broadcast protocols have the same role in distributed operating systems of the future that message passing mechanisms have in the operating systems of today. On the other hand, the problems of engineering such a system remain large. For example, deciding which protocol is the most appropriate to use in a certain situation or how to balance the latency-communication-storage costs is not an easy question.

  7. Data networks reliability

    NASA Astrophysics Data System (ADS)

    Gallager, Robert G.

    1988-10-01

    The research from 1984 to 1986 on Data Network Reliability had the objective of developing general principles governing the reliable and efficient control of data networks. The research was centered around three major areas: congestion control, multiaccess networks, and distributed asynchronous algorithms. The major topics within congestion control were the use of flow control to reduce congestion and the use of routing to reduce congestion. The major topics within multiaccess networks were the communication properties of multiaccess channels, collision resolution, and packet radio networks. The major topics within asynchronous distributed algorithms were failure recovery, time vs. communication tradeoffs, and the general theory of distributed algorithms.

  8. Reliability of photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1986-01-01

    In order to assess the reliability of photovoltaic modules, four categories of known array failure and degradation mechanisms are discussed, and target reliability allocations have been developed within each category based on the available technology and the life-cycle-cost requirements of future large-scale terrestrial applications. Cell-level failure mechanisms associated with open-circuiting or short-circuiting of individual solar cells generally arise from cell cracking or the fatigue of cell-to-cell interconnects. Power degradation mechanisms considered include gradual power loss in cells, light-induced effects, and module optical degradation. Module-level failure mechanisms and life-limiting wear-out mechanisms are also explored.
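
    To illustrate how per-category failure mechanisms can roll up into a module-level reliability figure (and hence into category-by-category allocations), the sketch below assumes independent, constant failure rates for three illustrative categories and multiplies the resulting exponential reliabilities. The rates and the 20-year service life are hypothetical assumptions, not the allocations developed in the paper.

```python
# Minimal sketch of rolling per-category failure rates up into a module-level
# reliability figure. All rates and the mission time are made-up assumptions.

import math

# Assumed constant failure rates (failures per hour) for illustrative categories.
failure_rates = {
    "cell_open_or_short": 2.0e-7,   # cell cracking, interconnect fatigue
    "gradual_power_loss": 1.0e-7,   # light-induced and optical degradation
    "module_level_faults": 0.5e-7,  # encapsulant, bypass diode, wiring
}

mission_hours = 20 * 365 * 24       # an assumed 20-year service life

# For independent exponential failure modes in series, reliabilities multiply:
# R_module(t) = exp(-(sum of rates) * t)
total_rate = sum(failure_rates.values())
r_module = math.exp(-total_rate * mission_hours)

print(f"Module reliability over {mission_hours} h: {r_module:.3f}")
for name, rate in failure_rates.items():
    print(f"  {name}: R = {math.exp(-rate * mission_hours):.3f}")
```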

  9. Early formation of evolved asteroidal crust.

    PubMed

    Day, James M D; Ash, Richard D; Liu, Yang; Bellucci, Jeremy J; Rumble, Douglas; McDonough, William F; Walker, Richard J; Taylor, Lawrence A

    2009-01-01

    Mechanisms for the formation of crust on planetary bodies remain poorly understood. It is generally accepted that Earth's andesitic continental crust is the product of plate tectonics, whereas the Moon acquired its feldspar-rich crust by way of plagioclase flotation in a magma ocean. Basaltic meteorites provide evidence that, like the terrestrial planets, some asteroids generated crust and underwent large-scale differentiation processes. Until now, however, no evolved felsic asteroidal crust has been sampled or observed. Here we report age and compositional data for the newly discovered, paired and differentiated meteorites Graves Nunatak (GRA) 06128 and GRA 06129. These meteorites are feldspar-rich, with andesite bulk compositions. Their age of 4.52 +/- 0.06 Gyr demonstrates formation early in Solar System history. The isotopic and elemental compositions, degree of metamorphic re-equilibration and sulphide-rich nature of the meteorites are most consistent with an origin as partial melts from a volatile-rich, oxidized asteroid. GRA 06128 and 06129 are the result of a newly recognized style of evolved crust formation, bearing witness to incomplete differentiation of their parent asteroid and to previously unrecognized diversity of early-formed materials in the Solar System. PMID:19129845

  10. Have plants evolved to self-immolate?

    PubMed Central

    Bowman, David M. J. S.; French, Ben J.; Prior, Lynda D.

    2014-01-01

    By definition, fire-prone ecosystems have highly combustible plants, leading to the hypothesis, first formally stated by Mutch in 1970, that community flammability is the product of natural selection of flammable traits. However, proving the “Mutch hypothesis” has presented an enormous challenge for fire ecologists, given the difficulty in establishing cause and effect between landscape fire and flammable plant traits. Individual plant traits (such as leaf moisture content, retention of dead branches and foliage, and oil-rich foliage) are known to affect the flammability of plants, but there is no evidence these characters evolved specifically to self-immolate, although some of these traits may have been secondarily modified to increase the propensity to burn. Demonstrating individual benefits from self-immolation is extraordinarily difficult, given the intersection of the physical environmental factors that control landscape fire (fuel production, dryness and ignitions) with community flammability properties that emerge from numerous traits of multiple species (canopy cover and litter bed bulk density). It is more parsimonious to conclude that plants have evolved mechanisms to tolerate, but not promote, landscape fire. PMID:25414710