Science.gov

Sample records for evolving reliable relationships

  1. An Evolving Relationship.

    ERIC Educational Resources Information Center

    May, Therese M.

    1990-01-01

    Responds to five major articles by Duckworth, Goldman, Healy, Sampson, and Goodyear on issues pertaining to testing and assessment in counseling psychology. Suggests that the interactive, collaborative aspects of the assessment relationship between psychologist and client need more attention. (TE)

  2. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, T.; Toon, J.; Conner, A.; Adams, T.; Miranda, D.

    2016-01-01

    This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program's subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. These new allocations are based on updated designs and maintainability characteristics of the components. It was found that trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.

  3. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, Troy; Toon, Jamie; Conner, Angelo C.; Adams, Timothy C.; Miranda, David J.

    2016-01-01

    This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program’s subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. These new allocations are based on updated designs and maintainability characteristics of the components. It was found that trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.

  4. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, Jamie; Toon, Troy; Adams, Timothy C.; Miranda, David J.

    2016-01-01

    This paper describes the methodology that was developed to allocate reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program's subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. Allocation is an iterative process; as systems moved beyond their conceptual and preliminary design phases, the reliability engineering team had an opportunity to reevaluate allocations based on updated designs and maintainability characteristics of the components. Trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper will discuss the value of modifying reliability and maintainability allocations made for the GSDO subsystems as the program nears the end of its design phase.
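
    The apportionment step these abstracts describe can be sketched as a simple weighted allocation. A minimal sketch, assuming a series system (subsystem reliabilities multiply up to the system target); the subsystem names, weights, and target are illustrative, not actual GSDO values:

```python
import math

def allocate_reliability(r_system_target, weights):
    """Apportion a series-system reliability target among subsystems.

    Each subsystem i receives R_i = R_target ** (w_i / sum(w)), so the
    product of the subsystem allocations recovers the system target.
    Heavier weights (more complex subsystems) get lower allocations.
    """
    total = sum(weights.values())
    return {name: r_system_target ** (w / total) for name, w in weights.items()}

# Illustrative subsystem names and weights -- hypothetical, not GSDO values
weights = {"cryogenics": 3.0, "power": 2.0, "comm": 1.0}
alloc = allocate_reliability(0.95, weights)
product = math.prod(alloc.values())  # multiplies back to the 0.95 target
```

    Reallocating after a design change is then just a matter of updating the weights (or the target) and rerunning the apportionment, which is the iterative loop the abstracts describe.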

  5. Evolvability

    PubMed Central

    Kirschner, Marc; Gerhart, John

    1998-01-01

    Evolvability is an organism’s capacity to generate heritable phenotypic variation. Metazoan evolution is marked by great morphological and physiological diversification, although the core genetic, cell biological, and developmental processes are largely conserved. Metazoan diversification has entailed the evolution of various regulatory processes controlling the time, place, and conditions of use of the conserved core processes. These regulatory processes, and certain of the core processes, have special properties relevant to evolutionary change. The properties of versatile protein elements, weak linkage, compartmentation, redundancy, and exploratory behavior reduce the interdependence of components and confer robustness and flexibility on processes during embryonic development and in adult physiology. They also confer evolvability on the organism by reducing constraints on change and allowing the accumulation of nonlethal variation. Evolvability may have been generally selected in the course of selection for robust, flexible processes suitable for complex development and physiology and specifically selected in lineages undergoing repeated radiations. PMID:9671692

  6. Host-parasite relationship in cystic echinococcosis: an evolving story.

    PubMed

    Siracusano, Alessandra; Delunardo, Federica; Teggi, Antonella; Ortona, Elena

    2012-01-01

    The larval stage of Echinococcus granulosus causes cystic echinococcosis, a neglected infectious disease that constitutes a major public health problem in developing countries. Despite being under constant barrage by the immune system, E. granulosus modulates antiparasite immune responses and persists in the human hosts with detectable humoral and cellular responses against the parasite. In vitro and in vivo immunological approaches, together with molecular biology and immunoproteomic technologies, provided us exciting insights into the mechanisms involved in the initiation of E. granulosus infection and the consequent induction and regulation of the immune response. Although the last decade has clarified many aspects of host-parasite relationship in human cystic echinococcosis, establishing the full mechanisms that cause the disease requires more studies. Here, we review some of the recent developments and discuss new avenues in this evolving story of E. granulosus infection in man. PMID:22110535

  7. Towards resolving Lamiales relationships: insights from rapidly evolving chloroplast sequences

    PubMed Central

    2010-01-01

    Background In the large angiosperm order Lamiales, a diverse array of highly specialized life strategies such as carnivory, parasitism, epiphytism, and desiccation tolerance occur, and some lineages possess drastically accelerated DNA substitutional rates or miniaturized genomes. However, understanding the evolution of these phenomena in the order, and clarifying borders of and relationships among lamialean families, has been hindered by largely unresolved trees in the past. Results Our analysis of the rapidly evolving trnK/matK, trnL-F and rps16 chloroplast regions enabled us to infer more precise phylogenetic hypotheses for the Lamiales. Relationships among the nine first-branching families in the Lamiales tree are now resolved with very strong support. Subsequent to Plocospermataceae, a clade consisting of Carlemanniaceae plus Oleaceae branches, followed by Tetrachondraceae and a newly inferred clade composed of Gesneriaceae plus Calceolariaceae, which is also supported by morphological characters. Plantaginaceae (incl. Gratioleae) and Scrophulariaceae are well separated in the backbone grade; Lamiaceae and Verbenaceae appear in distant clades, while the recently described Linderniaceae are confirmed to be monophyletic and in an isolated position. Conclusions Confidence about deep nodes of the Lamiales tree is an important step towards understanding the evolutionary diversification of a major clade of flowering plants. The degree of resolution obtained here now provides a first opportunity to discuss the evolution of morphological and biochemical traits in Lamiales. The multiple independent evolution of the carnivorous syndrome, once in Lentibulariaceae and a second time in Byblidaceae, is strongly supported by all analyses and topological tests. The evolution of selected morphological characters such as flower symmetry is discussed. 
The addition of further sequence data from introns and spacers holds promise to eventually obtain a fully resolved plastid tree of

  8. Cats: their history and our evolving relationship with them.

    PubMed

    2016-07-01

    Cats have had a long relationship with people, and their history as a domesticated animal can be traced back as far as 2000 BC. Delegates at a recent conference titled 'People, cats and vets through history' delved a little deeper into the changing nature of this relationship. Georgina Mills reports. PMID:27389749

  9. Young People and Alcohol in Italy: An Evolving Relationship

    ERIC Educational Resources Information Center

    Beccaria, Franca; Prina, Franco

    2010-01-01

    In Italy, commonly held opinions and interpretations about the relationship between young people and alcohol are often expressed as generalizations and approximations. In order to further understanding of the relationship between young people and alcohol in contemporary Italy, we have gathered, compared and discussed all the available data, both…

  10. Models of Shared Leadership: Evolving Structures and Relationships.

    ERIC Educational Resources Information Center

    Hallinger, Philip; Richardson, Don

    1988-01-01

    Explores potential changes in the power relationships among teachers and principals. Describes and analyzes the following models of teacher decision-making: (1) Instructional Leadership Teams; (2) Principals' Advisory Councils; (3) School Improvement Teams; and (4) Lead Teacher Committees. (FMW)

  11. Risk and responsibility: a complex and evolving relationship.

    PubMed

    Kermisch, Céline

    2012-03-01

    This paper analyses the nature of the relationship between risk and responsibility. Since neither the concept of risk nor the concept of responsibility has an unequivocal definition, it is obvious that there is no single interpretation of their relationship. After introducing the different meanings of responsibility used in this paper, we analyse four conceptions of risk. This allows us to make their link with responsibility explicit and to determine if a shift in the connection between risk and responsibility can be outlined. (1) In the engineer's paradigm, the quantitative conception of risk does not include any concept of responsibility. Their relationship is indirect, the locus of responsibility being risk management. (2) In Mary Douglas' cultural theory, risks are constructed through the responsibilities they engage. (3) Rayner and (4) Wolff go further by integrating forms of responsibility in the definition of risk itself. Analysis of these four frameworks shows that the concepts of risk and responsibility are increasingly intertwined. This tendency is reinforced by increasing public awareness and a call for the integration of a moral dimension in risk management. Therefore, we suggest that a form of virtue-responsibility should also be integrated in the concept of risk. PMID:21103951

  12. Evolving Cross-Group Relationships: The Story of Miller High, 1950-2000

    ERIC Educational Resources Information Center

    Eick, Caroline

    2011-01-01

    This paper examines students' evolving cross-group relationships in a comprehensive high school in Baltimore County, Maryland, USA, between 1950 and 2000. The findings of this research, situated at the intersections of two lenses of inquiry: oral historical analysis and critical studies, uncover both the power of students accustomed to integrated…

  13. [Creating a reliable therapeutic relationship with the patient].

    PubMed

    Matsuki, Kunihiro

    2012-01-01

    The factors necessary to create a reliable therapeutic relationship are presented in this paper. They include a demeanor and calmness of temperament as a psychiatric professional, a feeling of respect for the patient that is based on our common sense as human beings, an attitude of listening attentively to what the patient is revealing, maintaining an attitude of receptive neutrality, the ability to withstand the emotional burdens imposed on one by the patient, patience with one's own difficulty in understanding the patient, the ability to communicate clearly, including about the patient's negative aspects, and the ability to end psychiatric consultation sessions in a friendly and intimate manner. Creating a beneficial therapeutic relationship is about the building of a trusting relationship, in which the patient can constructively endure being questioned by us, or cope with the tough burdens we may place on them. However, a reliable relationship such as this contains paradoxes. Patients are able to talk to us about their suspicions, anxieties, dissatisfactions or anger only if the therapeutic relationship is good or based on trust. In other words, just like our patients, psychiatrists, too, must deal with what the patient brings and directs toward us. It is at this point that what we call a true therapeutic relationship starts. PMID:23367840

  14. ELECTRICAL SUBSTATION RELIABILITY EVALUATION WITH EMPHASIS ON EVOLVING INTERDEPENDENCE ON COMMUNICATION INFRASTRUCTURE.

    SciTech Connect

    AZARM, M.A.; BARI, R.; YUE, M.; MUSICKI, Z.

    2004-09-12

    This study developed a probabilistic methodology for assessment of the reliability and security of electrical energy distribution networks. This included consideration of the future grid system, which will rely heavily on the existing digitally based communication infrastructure for monitoring and protection. Event tree and fault tree methods were utilized. The approach extensively modeled the types of faults that a grid could potentially experience, the response of the grid, and the specific design of the protection schemes. We demonstrated the methods by applying them to a small sub-section of a hypothetical grid based on an existing electrical grid system of a metropolitan area. The results showed that for a typical design that relies on a communication network for protection, the communication network reliability could contribute significantly to the frequency of loss of electrical power. The reliability of the communication network could become a more important contributor to the electrical grid reliability as the utilization of the communication network significantly increases in the near future to support "smart" transmission and/or distributed generation.

  15. Relationship Skills in a Clinical Performance Examination: Reliability and Validity of the Relationship Instrument.

    ERIC Educational Resources Information Center

    Bolton, Cynthia; And Others

    Among the repertoire of clinical skills necessary for the professional development of medical students is the ability to create a positive doctor-patient relationship through effective communication skills. The purpose of this study was to create an instrument that reliably measures the relationship between physician and patient. The Relationship…

  16. ELECTRICAL SUBSTATION RELIABILITY EVALUATION WITH EMPHASIS ON EVOLVING INTERDEPENDENCE ON COMMUNICATION INFRASTRUCTURE.

    SciTech Connect

    AZARM, M.A.; BARI, R.A.; MUSICKI, Z.

    2004-01-15

    The objective of this study is to develop a methodology for a probabilistic assessment of the reliability and security of electrical energy distribution networks. This includes consideration of the future grid system, which will rely heavily on the existing digitally based communication infrastructure for monitoring and protection. Another important objective of this study is to provide information and insights from this research to Consolidated Edison Company (Con Edison) that could be useful in the design of the new network segment to be installed in the area of the World Trade Center in lower Manhattan. Our method is microscopic in nature and relies heavily on the specific design of the portion of the grid being analyzed. It extensively models the types of faults that a grid could potentially experience, the response of the grid, and the specific design of the protection schemes. We demonstrate that the existing technology can be extended and applied to the electrical grid and to the supporting communication network. A small subsection of a hypothetical grid based on the existing New York City electrical grid system of Con Edison is used to demonstrate the methods. Sensitivity studies show that in the current design the frequency for the loss of the main station is sensitive to the communication network reliability. The reliability of the communication network could become a more important contributor to the electrical grid reliability as the utilization of the communication network significantly increases in the near future to support "smart" transmission and/or distributed generation. The identification of potential failure modes and their likelihood can support decisions on potential modifications to the network including hardware, monitoring instrumentation, and protection systems.
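
    The fault tree methods used in these two studies combine basic-event probabilities through logic gates. A minimal sketch, assuming independent basic events; the failure probabilities and the relay/redundant-link scenario are illustrative, not taken from the Con Edison analysis:

```python
def or_gate(probs):
    """P(top) for an OR gate over independent basic events."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    """P(top) for an AND gate over independent basic events."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical scenario: protection is lost if the relay fails OR
# both redundant communication links fail.
p_relay = 1e-3
p_link = 5e-2                        # per-link failure probability
p_comm = and_gate([p_link, p_link])  # redundant links: 2.5e-3
p_loss = or_gate([p_relay, p_comm])  # top event: loss of protection
```

    Even with this toy tree, the sensitivity the abstract reports is visible: the communication term dominates the top-event frequency once per-link reliability degrades.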

  17. Craniosacral rhythm: reliability and relationships with cardiac and respiratory rates.

    PubMed

    Hanten, W P; Dawson, D D; Iwata, M; Seiden, M; Whitten, F G; Zink, T

    1998-03-01

    Craniosacral rhythm (CSR) has long been the subject of debate, both over its existence and its use as a therapeutic tool in evaluation and treatment. Origins of this rhythm are unknown, and palpatory findings lack scientific support. The purpose of this study was to determine the intra- and inter-examiner reliabilities of the palpation of the rate of the CSR and the relationship between the rate of the CSR and the heart or respiratory rates of subjects and examiners. The rates of the CSR of 40 healthy adults were palpated twice by each of two examiners. The heart and respiratory rates of the examiners and the subjects were recorded while the rates of the subjects' CSR were palpated by the examiners. Intraclass correlation coefficients were calculated to determine the intra- and inter-examiner reliabilities of the palpation. Two multiple regression analyses, one for each examiner, were conducted to analyze the relationships between the rate of the CSR and the heart and respiratory rates of the subjects and the examiners. The intraexaminer reliability coefficients were 0.78 for examiner A and 0.83 for examiner B, and the interexaminer reliability coefficient was 0.22. The result of the multiple regression analysis for examiner A was R = 0.46 and adjusted R2 = 0.12 (p = 0.078) and for examiner B was R = 0.63 and adjusted R2 = 0.32 (p = 0.001). The highest bivariate correlation was found between the CSR and the subject's heart rate (r = 0.30) for examiner A and between the CSR and the examiner's heart rate (r = 0.42) for examiner B. The results indicated that a single examiner may be able to palpate the rate of the CSR consistently, if that is what we truly measured. It is possible that the perception of CSR is illusory. The rate of the CSR palpated by two examiners is not consistent. The results of the regression analysis of one examiner offered no validation to those of the other. 
It appears that a subject's CSR is not related to the heart or respiratory rates of the
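
    The intraclass correlation coefficients reported above are computed from a subjects-by-raters table. A minimal sketch of one common form, the one-way random-effects ICC(1,1); the data values are illustrative, and the study may have used a different ICC variant:

```python
import numpy as np

def icc_1_1(ratings):
    """One-way random-effects ICC(1,1) for an (n_subjects, k_raters) array."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    subj_means = ratings.mean(axis=1)
    # Between-subjects and within-subjects mean squares from one-way ANOVA
    msb = k * ((subj_means - grand) ** 2).sum() / (n - 1)
    msw = ((ratings - subj_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Two repeated palpation ratings per subject (illustrative data)
data = [[6.0, 6.5], [7.0, 7.2], [5.5, 5.4], [8.0, 7.8]]
icc = icc_1_1(data)
```

    Intra-examiner reliability uses each examiner's two trials as the k raters; inter-examiner reliability uses the two examiners, which is how the same subjects can yield 0.78 and 0.83 within examiners but only 0.22 between them.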

  18. How Mentoring Relationships Evolve: A Longitudinal Study of Academic Pediatricians in a Physician Educator Faculty Development Program

    ERIC Educational Resources Information Center

    Balmer, Dorene; D'Alessandro, Donna; Risko, Wanessa; Gusic, Maryellen E.

    2011-01-01

    Introduction: Mentoring is increasingly recognized as central to career development. Less attention has been paid, however, to how mentoring relationships evolve over time. To provide a more complete picture of these complex relationships, the authors explored mentoring from a mentee's perspective within the context of a three-year faculty…

  19. The Unidimensional Relationship Closeness Scale (URCS): Reliability and Validity Evidence for a New Measure of Relationship Closeness

    ERIC Educational Resources Information Center

    Dibble, Jayson L.; Levine, Timothy R.; Park, Hee Sun

    2012-01-01

    A fundamental dimension along which all social and personal relationships vary is closeness. The Unidimensional Relationship Closeness Scale (URCS) is a 12-item self-report scale measuring the closeness of social and personal relationships. The reliability and validity of the URCS were assessed with college dating couples (N = 192), female friends…

  20. Reliability assurance program and its relationship to other regulations

    SciTech Connect

    Polich, T.J.

    1994-12-31

    The need for a safety-oriented reliability effort for the nuclear industry was identified by the U.S. Nuclear Regulatory Commission (NRC) in the Three Mile Island Action Plan (NUREG-0660) Item II.C.4. In SECY-89-013, "Design Requirements Related to the Evolutionary ALWR," the staff stated that the reliability assurance program (RAP) would be required for design certification to ensure that the design reliability of safety-significant structures, systems, and components (SSCs) is maintained over the life of a plant. In November 1988, the staff informed the advanced light water reactor (ALWR) vendors and the Electric Power Research Institute (EPRI) that it was considering this matter. Since that time, the staff has had numerous interactions with industry regarding RAP. These include discussions and subsequent safety evaluation reports on the EPRI utilities requirements document and for both Evolutionary Designs. The RAP has also been discussed in SECY-93-087, "Policy, Technical, and Licensing Issues Pertaining to Evolutionary and Advanced Light-Water Reactor (ALWR) Designs," and SECY-94-084, "Policy and Technical Issues Associated With the Regulatory Treatment of Non-Safety Systems in Passive Plant Designs."

  1. The relationship between reliability and bonding techniques in hybrid microcircuits

    NASA Technical Reports Server (NTRS)

    Caruso, S. V.; Kinser, D. L.; Graff, S. M.; Allen, R. V.

    1975-01-01

    Differential thermal expansion was shown to be responsible for many observed failures in ceramic chip capacitors mounted on alumina substrates. It is shown that the mounting techniques used in bonding the capacitors have a marked effect upon the thermally induced mechanical stress and thus the failure rate. A mathematical analysis was conducted of a composite model of the capacitor-substrate system to predict the magnitude of thermally induced stresses. It was experimentally observed that the stresses in more compliant bonding systems such as soft lead tin and indium solders are significantly lower than those in hard solder and epoxy systems. The marked dependence upon heating and cooling rate was proven to be a determining factor in the prediction of failure solder systems. It was found that the harder or higher melting solders are less susceptible to thermal cycling effects but that they are more likely to fail during initial processing operations. Strain gage techniques were used to determine thermally induced expansion stresses of the capacitors and the alumina substrates. The compliance of the different bonding mediums was determined. From the data obtained, several recommendations are made concerning the optimum bonding system for the achievement of maximum reliability.
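
    The thermally induced stress discussed above can be bounded with the classic biaxial mismatch formula for a thin component rigidly bonded to a substrate. A minimal sketch; the material constants are illustrative textbook-order values, not the paper's measurements:

```python
def thermal_mismatch_stress(e_mod_pa, poisson, d_alpha_per_k, d_temp_k):
    """Biaxial stress in a thin chip rigidly bonded to a substrate:
    sigma = E * d_alpha * dT / (1 - nu). Compliant bonds (soft lead-tin
    or indium solders) relieve much of this; the rigid-bond value is an
    upper bound, which is why hard solder and epoxy systems fail more."""
    return e_mod_pa * d_alpha_per_k * d_temp_k / (1.0 - poisson)

# Illustrative values (not from the paper): ceramic chip capacitor on alumina
e_cap = 110e9      # Pa, elastic modulus of the capacitor ceramic
nu = 0.3           # Poisson's ratio
d_alpha = 3e-6     # 1/K, CTE mismatch between capacitor and substrate
d_temp = 100.0     # K, temperature excursion
sigma = thermal_mismatch_stress(e_cap, nu, d_alpha, d_temp)  # tens of MPa
```

    A compliant bond layer reduces the effective constraint, trading the thermal-cycling stress above for the lower processing-temperature tolerance the abstract notes.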

  2. On the Relationships between Generative Encodings, Regularity, and Learning Abilities when Evolving Plastic Artificial Neural Networks

    PubMed Central

    Tonelli, Paul; Mouret, Jean-Baptiste

    2013-01-01

    A major goal of bio-inspired artificial intelligence is to design artificial neural networks with abilities that resemble those of animal nervous systems. It is commonly believed that two keys for evolving nature-like artificial neural networks are (1) the developmental process that links genes to nervous systems, which enables the evolution of large, regular neural networks, and (2) synaptic plasticity, which allows neural networks to change during their lifetime. So far, these two topics have been mainly studied separately. The present paper shows that they are actually deeply connected. Using a simple operant conditioning task and a classic evolutionary algorithm, we compare three ways to encode plastic neural networks: a direct encoding, a developmental encoding inspired by computational neuroscience models, and a developmental encoding inspired by morphogen gradients (similar to HyperNEAT). Our results suggest that using a developmental encoding could improve the learning abilities of evolved, plastic neural networks. Complementary experiments reveal that this result is likely the consequence of the bias of developmental encodings towards regular structures: (1) in our experimental setup, encodings that tend to produce more regular networks yield networks with better general learning abilities; (2) whatever the encoding, the most regular networks are statistically those with the best learning abilities. PMID:24236099
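
    The synaptic plasticity the abstract refers to lets weights change during a network's lifetime, while evolution tunes the parameters (and, in the paper's setting, the encoding) that shape that change. A minimal Hebbian-style sketch; the rule, sizes, and learning rate here are illustrative, not the paper's encoding-specific rule:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny plastic network: weights change during the "lifetime" via a
# Hebbian-style update; an evolutionary algorithm would tune eta and
# the process generating the initial weights.
n_in, n_out = 4, 2
w = rng.normal(0.0, 0.1, size=(n_out, n_in))
eta = 0.05  # plasticity rate, a parameter evolution could act on

def step(w, x):
    y = np.tanh(w @ x)
    w = w + eta * np.outer(y, x)  # co-active input/output pairs strengthen
    return w, y

x = np.array([1.0, 0.0, 1.0, 0.0])
for _ in range(20):
    w, y = step(w, x)
```

    Under a developmental encoding, the initial `w` would be generated from a compact genotype biased toward regular structure rather than drawn independently per weight, which is the contrast the paper studies.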

  3. Animals Used in Research and Education, 1966-2016: Evolving Attitudes, Policies, and Relationships.

    PubMed

    Lairmore, Michael D; Ilkiw, Jan

    2015-01-01

    Since the inception of the Association of American Veterinary Medical Colleges (AAVMC), the use of animals in research and education has been a central element of the programs of member institutions. As veterinary education and research programs have evolved over the past 50 years, so too have societal views and regulatory policies. AAVMC member institutions have continually responded to these events by exchanging best practices in training their students in the framework of comparative medicine and the needs of society. Animals provide students and faculty with the tools to learn the fundamental knowledge and skills of veterinary medicine and scientific discovery. The study of animal models has contributed extensively to medicine, veterinary medicine, and basic sciences as these disciplines seek to understand life processes. Changing societal views over the past 50 years have provided active examination and continued refinement of the use of animals in veterinary medical education and research. The future use of animals to educate and train veterinarians will likely continue to evolve as technological advances are applied to experimental design and educational systems. Natural animal models of both human and animal health will undoubtedly continue to serve a significant role in the education of veterinarians and in the development of new treatments of animal and human disease. As it looks to the future, the AAVMC as an organization will need to continue to support and promote best practices in the humane care and appropriate use of animals in both education and research. PMID:26673210

  4. The Evolving Symbiotic Relationship of Arts Education and U.S. Business.

    ERIC Educational Resources Information Center

    Sterling, Carol

    1995-01-01

    Proposes an extension of the arts education/business community relationship moving beyond issues of patronage and support. Maintains that the complexity of the 21st-century economy and society will be well served by the flexibility and creativity manifested in arts education. Recommends national goals and standards for arts education. (MJP)

  5. Suprafamilial relationships among Rodentia and the phylogenetic effect of removing fast-evolving nucleotides in mitochondrial, exon and intron fragments

    PubMed Central

    2008-01-01

    Background The number of rodent clades identified above the family level is contentious, and to date, no consensus has been reached on the basal evolutionary relationships among all rodent families. Rodent suprafamilial phylogenetic relationships are investigated in the present study using ~7600 nucleotide characters derived from two mitochondrial genes (Cytochrome b and 12S rRNA), two nuclear exons (IRBP and vWF) and four nuclear introns (MGF, PRKC, SPTBN, THY). Because increasing the number of nucleotides does not necessarily increase phylogenetic signal (especially if the data is saturated), we assess the potential impact of saturation for each dataset by removing the fastest-evolving positions that have been recognized as sources of inconsistencies in phylogenetics. Results Taxonomic sampling included multiple representatives of all five rodent suborders described. Fast-evolving positions for each dataset were identified individually using a discrete gamma rate category and sites belonging to the most rapidly evolving eighth gamma category were removed. Phylogenetic tree reconstructions were performed on individual and combined datasets using Parsimony, Bayesian, and partitioned Maximum Likelihood criteria. Removal of fast-evolving positions enhanced the phylogenetic signal to noise ratio but the improvement in resolution was not consistent across different data types. The results suggested that elimination of fastest sites only improved the support for nodes moderately affected by homoplasy (the deepest nodes for introns and more recent nodes for exons and mitochondrial genes). Conclusion The present study based on eight DNA fragments supports a fully resolved higher level rodent phylogeny with moderate to significant nodal support. Two inter-suprafamilial associations emerged. The first comprised a monophyletic assemblage containing the Anomaluromorpha (Anomaluridae + Pedetidae) + Myomorpha (Muridae + Dipodidae) as sister clade to the Castorimorpha
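
    The site-removal step above assigns sites to discrete gamma rate categories under a maximum-likelihood model and drops the fastest (eighth) category. A much-simplified sketch of the idea: rank alignment columns by Shannon entropy as a crude variability proxy (an assumption for illustration, not the paper's model-based rate estimate) and remove the most variable eighth:

```python
from collections import Counter
import math

def column_entropy(col):
    """Shannon entropy of one alignment column (gaps ignored) --
    a crude stand-in for a model-based per-site rate estimate."""
    bases = [c for c in col if c in "ACGT"]
    if not bases:
        return 0.0
    n = len(bases)
    return -sum((k / n) * math.log2(k / n) for k in Counter(bases).values())

def drop_fastest_eighth(seqs):
    """Remove the most variable eighth of columns from equal-length sequences."""
    cols = list(zip(*seqs))
    ranked = sorted(range(len(cols)), key=lambda i: column_entropy(cols[i]))
    keep = set(ranked[: len(cols) - len(cols) // 8])
    return ["".join(s[i] for i in sorted(keep)) for s in seqs]

# Toy 4-taxon, 8-column alignment; the last column is the most variable
aln = ["ACGTACGT", "ACGTACGA", "ACGAACGC", "ACGTACGG"]
filtered = drop_fastest_eighth(aln)
```

    The real analysis estimates per-site rates jointly with the tree, but the effect is the same: saturated, homoplasy-prone positions are excluded before tree reconstruction.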

  6. Evolving Relationship Structures in Multi-sourcing Arrangements: The Case of Mission Critical Outsourcing

    NASA Astrophysics Data System (ADS)

    Heitlager, Ilja; Helms, Remko; Brinkkemper, Sjaak

    Information Technology Outsourcing practice and research mainly consider the outsourcing phenomenon as a generic fulfilment of the IT function by external parties. Inspired by the logic of commoditisation, core competencies, and economies of scale, organisations transfer assets, existing departments, and IT functions to external parties. Although the generic approach might work for desktop outsourcing, where standardisation is the dominant factor, it does not work for the management of mission-critical applications. Managing mission-critical applications requires a different approach, in which building relationships is critical. These relationships involve inter- and intra-organisational parties in a multi-sourcing arrangement, called an IT service chain: multiple (specialist) parties that must collaborate closely to deliver high-quality services.

  7. A Proposed New "What If Reliability" Analysis for Assessing the Statistical Significance of Bivariate Relationships.

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Daniel, Larry G.; Roberts, J. Kyle

    The purpose of this paper is to illustrate how displaying disattenuated correlation coefficients along with their unadjusted counterparts will allow the reader to assess the impact of unreliability on each bivariate relationship. The paper also demonstrates how a proposed new "what if reliability" analysis can complement the conventional null…

  8. A Proposed New "What if Reliability" Analysis for Assessing the Statistical Significance of Bivariate Relationships

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Roberts, J. Kyle; Daniel, Larry G.

    2005-01-01

    In this article, the authors (a) illustrate how displaying disattenuated correlation coefficients alongside their unadjusted counterparts will allow researchers to assess the impact of unreliability on bivariate relationships and (b) demonstrate how a proposed new "what if reliability" analysis can complement null hypothesis significance tests of…
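The "what if reliability" analysis described above builds on the classical correction for attenuation, which estimates what an observed correlation would be if both measures were perfectly reliable. A minimal sketch of that correction (the function name and example values are illustrative, not taken from the article):

```python
import math

def disattenuate(r_xy, rel_x, rel_y):
    """Correct an observed correlation for measurement unreliability.

    r_xy  : observed correlation between X and Y
    rel_x : reliability estimate for X (e.g., Cronbach's alpha)
    rel_y : reliability estimate for Y
    """
    return r_xy / math.sqrt(rel_x * rel_y)

# Example: an observed r of .30 measured with reliabilities .80 and .90
r_corrected = disattenuate(0.30, 0.80, 0.90)
```

Displaying `r_corrected` alongside the unadjusted coefficient shows how much unreliability attenuates the observed relationship; the lower the reliabilities, the larger the gap.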

  9. Assessing the Complex and Evolving Relationship between Charges and Payments in US Hospitals: 1996 – 2012

    PubMed Central

    Bulchis, Anne G.; Lomsadze, Liya; Joseph, Jonathan; Baral, Ranju; Bui, Anthony L.; Horst, Cody; Johnson, Elizabeth; Dieleman, Joseph L.

    2016-01-01

    Background In 2013 the United States spent $2.9 trillion on health care, more than in any previous year. Much of the debate around slowing health care spending growth focuses on the complicated pricing system for services. Our investigation contributes to knowledge of health care spending by assessing the relationship between charges and payments in the inpatient hospital setting. In the US, charges and payments differ because of a complex set of incentives that connect health care providers and funders. Our methodology can also be applied to adjust charge data to reflect actual spending. Methods We extracted cause of health care encounter (cause), primary payer (payer), charge, and payment information for 50,172 inpatient hospital stays from 1996 through 2012. We used linear regression to assess the relationship between charges and payments, stratified by payer, year, and cause. We applied our estimates to a large, nationally representative hospital charge sample to estimate payments. Results The average amount paid per $1 charged varies significantly across three dimensions: payer, year, and cause. Among the 10 largest causes of health care spending, average payments range from 23 to 55 cents per dollar charged. Over time, the amount paid per dollar charged is decreasing for those with private or public insurance, signifying that inpatient charges are increasing faster than the amount insurers pay. Conversely, the amount paid by out-of-pocket payers per dollar charged is increasing over time for several causes. Applying our estimates to a nationally representative hospital charge sample generates payment estimates which align with the official US estimates of inpatient spending. Conclusions The amount paid per $1 charged fluctuates significantly depending on the cause of a health care encounter and the primary payer. In addition, the amount paid per charge is changing over time. Transparent accounting of hospital spending requires a detailed assessment of the
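The central quantity in an analysis like this, the average amount paid per dollar charged, can be estimated as the slope of a zero-intercept least-squares regression of payments on charges within one payer/year/cause stratum. The helper name and data below are invented for illustration; this is not the authors' code:

```python
def paid_per_dollar_charged(charges, payments):
    """Slope of a least-squares regression of payment on charge
    forced through the origin: sum(c*p) / sum(c*c)."""
    num = sum(c * p for c, p in zip(charges, payments))
    den = sum(c * c for c in charges)
    return num / den

# Hypothetical inpatient stays for a single payer and cause
charges = [10_000, 25_000, 40_000]
payments = [3_500, 8_000, 14_500]
ratio = paid_per_dollar_charged(charges, payments)

# The stratum-specific ratio can then adjust a charge-only sample
# to approximate actual payments.
estimated_payments = [c * ratio for c in charges]
```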

  10. The relationship between unstandardized and standardized alpha, true reliability, and the underlying measurement model.

    PubMed

    Falk, Carl F; Savalei, Victoria

    2011-01-01

    Popular computer programs print 2 versions of Cronbach's alpha: unstandardized alpha, α(Σ), based on the covariance matrix, and standardized alpha, α(R), based on the correlation matrix. Sources that accurately describe the theoretical distinction between the 2 coefficients are lacking, which can lead to the misconception that the differences between α(R) and α(Σ) are unimportant and to the temptation to report the larger coefficient. We explore the relationship between α(R) and α(Σ) and the reliability of the standardized and unstandardized composite under 3 popular measurement models; we clarify the theoretical meaning of each coefficient and conclude that researchers should choose an appropriate reliability coefficient based on theoretical considerations. We also illustrate that α(R) and α(Σ) estimate the reliability of different composite scores, and in most cases cannot be substituted for one another. PMID:21859284
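The distinction the abstract draws can be made concrete: unstandardized alpha is computed from raw item scores (covariances), while standardized alpha is equivalent to computing alpha after z-scoring each item (correlations). A minimal sketch under that equivalence; the function name is ours, not from the article:

```python
def cronbach_alpha(items, standardize=False):
    """Cronbach's alpha for a set of items (one list of scores per item).

    standardize=False -> unstandardized alpha (covariance-based)
    standardize=True  -> standardized alpha (correlation-based),
                         obtained here by z-scoring each item first.
    """
    def mean(xs):
        return sum(xs) / len(xs)

    def var(xs):  # population variance
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    if standardize:
        items = [[(x - mean(it)) / var(it) ** 0.5 for x in it] for it in items]

    k = len(items)
    totals = [sum(person) for person in zip(*items)]  # composite per respondent
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Two perfectly correlated items: both coefficients equal 1,
# but with unequal item variances the two versions diverge.
a_sigma = cronbach_alpha([[1, 2, 3, 4, 5], [2, 3, 4, 5, 6]])
a_r = cronbach_alpha([[1, 2, 3, 4, 5], [2, 3, 4, 5, 6]], standardize=True)
```

Because the two coefficients estimate the reliability of different composites (unweighted sum vs. sum of z-scores), reporting whichever is larger, as the authors warn, mixes up two different quantities.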

  11. Performance and reliability of empirical mobility relationships for the prediction of Debris Flow inundated areas

    NASA Astrophysics Data System (ADS)

    Simoni, Alessandro; Berti, Matteo; Mammoliti, Maria

    2010-05-01

    Empirical mobility relationships can be used for preliminary debris-flow (DF) hazard assessment. An adaptation of the original relationships has been proposed for alpine debris flows (DFLOWZ model; Berti and Simoni, 2007). Once a reference debris-flow volume is chosen, the DFLOWZ code estimates the area potentially affected by the event based on the mutual relationships between channel cross-sectional area, planimetric area of the deposit, and overall volume. We back-analyzed 25 DF events that occurred in the Bolzano province (Italy), ranging in volume from 3,000 to 300,000 m3, and evaluated the performance of the automated method through an objective reliability index. Our aims are to: evaluate the effects of uncertainty associated with the empirical mobility relationships; and assess other possible sources of error or violations of the assumptions that underlie the model. Results indicate that a high-resolution DEM (≤ 2.5 m) is essential for a reliable inundation prediction over a fan. The code itself performs well in a wide range of situations, demonstrating the conceptual correctness of the underlying assumptions. The most relevant source of error remains the uncertainty associated with the empirical mobility relationships, due mainly to errors in volume measurements of DF deposits. Improvement can be achieved through the collection of high-quality field data on DF events.
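Mobility relationships of this family typically scale the inundated channel cross-sectional area A and planimetric deposit area B with event volume V as A = αV^(2/3) and B = βV^(2/3). A sketch of that scaling; the default coefficients below are illustrative placeholders, not the calibrated DFLOWZ values:

```python
def inundated_areas(volume_m3, alpha=0.1, beta=20.0):
    """Empirical mobility scaling: cross-sectional area A and
    planimetric area B both grow as V**(2/3).
    alpha and beta are placeholder coefficients for illustration."""
    a = alpha * volume_m3 ** (2 / 3)
    b = beta * volume_m3 ** (2 / 3)
    return a, b

# The V**(2/3) exponent means an 8-fold volume increase
# quadruples both predicted areas.
a1, b1 = inundated_areas(3_000)
a2, b2 = inundated_areas(24_000)
```

Uncertainty in the fitted coefficients (and in the deposit volumes used to calibrate them) propagates directly into the predicted areas, which is why the abstract identifies the empirical relationships as the dominant error source.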

  12. Structural and reliability analysis of quality of relationship index in cancer patients.

    PubMed

    Cousson-Gélie, Florence; de Chalvron, Stéphanie; Zozaya, Carole; Lafaye, Anaïs

    2013-01-01

    Among the psychosocial factors affecting emotional adjustment and quality of life, social support is one of the most important and most widely studied in cancer patients, but little is known about the perception of support within specific significant relationships. This study examined the psychometric properties of the Quality of Relationship Inventory (QRI) by evaluating its factor structure and its convergent and discriminant validity in a sample of cancer patients. A total of 388 patients completed the QRI. Convergent validity was evaluated by testing the correlations between the QRI subscales and measures of general social support, anxiety and depression symptoms. Discriminant validity was examined by testing group comparisons. The QRI's invariance across time was also tested. Principal axis factor analysis with promax rotation identified three factors accounting for 42.99% of the variance: perceived social support, depth, and interpersonal conflict. Estimates of reliability with McDonald's ω coefficient were satisfactory for all the QRI subscales (ω ranging from 0.75 to 0.85). Satisfaction with general social support was correlated negatively with the interpersonal conflict subscale and positively with the depth subscale. The interpersonal conflict and social support scales were correlated with depression and anxiety scores. We also found relative stability of the QRI subscales (measured 3 months after the first evaluation) and differences between partner status and gender groups. The Quality of Relationship Inventory is a valid tool for assessing the quality of social support in a particular relationship in cancer patients. PMID:23514252

  13. Impact of relationships between test and training animals and among training animals on reliability of genomic prediction.

    PubMed

    Wu, X; Lund, M S; Sun, D; Zhang, Q; Su, G

    2015-10-01

    One of the factors affecting the reliability of genomic prediction is the relationship among the animals of interest. This study investigated the reliability of genomic prediction in various scenarios with regard to the relationship between test and training animals, and among animals within the training data set. Different training data sets were generated from EuroGenomics data and a group of Nordic Holstein bulls (born in 2005 and afterwards) as a common test data set. Genomic breeding values were predicted using a genomic best linear unbiased prediction model and a Bayesian mixture model. The results showed that a closer relationship between test and training animals led to a higher reliability of genomic predictions for the test animals, while a closer relationship among training animals resulted in a lower reliability. In addition, the Bayesian mixture model in general led to a slightly higher reliability of genomic prediction, especially for the scenario of distant relationships between training and test animals. Therefore, to prevent a decrease in reliability, constant updates of the training population with animals from more recent generations are required. Moreover, a training population consisting of less-related animals is favourable for reliability of genomic prediction. PMID:26010512

  14. Palmar Creases: Classification, Reliability and Relationships to Fetal Alcohol Spectrum Disorders (FASD).

    PubMed

    Mattison, Siobhán M; Brunson, Emily K; Holman, Darryl J

    2015-09-01

    A normal human palm contains 3 major creases: the distal transverse crease, the proximal transverse crease, and the thenar crease. Because permanent crease patterns are thought to be laid down during the first trimester, researchers have speculated that deviations in crease patterns could be indicative of insults during fetal development. The purpose of this study was twofold: (1) to compare the efficacy and reliability of two coding methods, the first (M1) classifying both "simiana" and Sydney line variants and the second (M2) counting the total number of crease points of origin on the radial border of the hand; and (2) to ascertain the relationship between palmar crease patterns and fetal alcohol spectrum disorders (FASD). Bilateral palm prints were taken using the carbon paper and tape method from 237 individuals diagnosed with FASD and 190 unexposed controls. All prints were coded for crease variants under M1 and M2. Additionally, a random sample of 98 matched (right and left) prints was selected from the controls to determine the reliabilities of M1 and M2. For this analysis, each palm was read twice, at different times, by two readers. Intra-observer Kappa coefficients were similar under both methods, ranging from 0.804 to 0.910. Inter-observer Kappa coefficients ranged from 0.582 to 0.623 under M1 and from 0.647 to 0.757 under M2. Using data from the entire sample of 427 prints and controlling for sex and ethnicity (white v. non-white), no relationship was found between palmar crease variants and FASD. Our results suggest that palmar creases can be classified reliably, but palmar crease patterns may not be affected by fetal alcohol exposure. PMID:26898079
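The intra- and inter-observer agreement coefficients reported above are Cohen's kappa, which discounts the agreement two readers would reach by chance. A minimal two-rater sketch (the example codes are invented, not from the study):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters assigning categorical codes:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater1)
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    p_exp = sum(c1[cat] / n * c2[cat] / n for cat in c1)
    return (p_obs - p_exp) / (1 - p_exp)

# Two readers coding the same 8 palms (invented crease categories)
r1 = ["A", "A", "B", "B", "A", "B", "A", "B"]
r2 = ["A", "A", "B", "B", "A", "B", "B", "A"]
kappa = cohens_kappa(r1, r2)
```

Here 6 of 8 palms agree (75%), but with two equally common categories half that agreement is expected by chance, so kappa lands well below the raw agreement rate.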

  15. Fold and fabric relationships in temporally and spatially evolving slump systems: A multi-cell flow model

    NASA Astrophysics Data System (ADS)

    Alsop, G. Ian; Marco, Shmuel

    2014-06-01

    Folds generated in ductile metamorphic terranes and within unlithified sediments affected by slumping are geometrically identical to one another, and distinguishing the origin of such folds in ancient lithified rocks is therefore challenging. Foliation is observed to lie broadly parallel to the axial planes of tectonic folds, whilst it is frequently regarded as absent in slump folds. The presence of foliation is therefore often considered a reliable criterion for distinguishing tectonic folds from those created during slumping. To test this assertion, we have examined a series of well exposed slump folds within the late Pleistocene Lisan Formation of the Dead Sea Basin. These slumps contain a number of different foliation types, including an axial-planar grain-shape fabric and a crenulation cleavage formed via microfolding of bedding laminae. Folds also contain a spaced disjunctive foliation characterised by extensional displacements across shear fractures. This spaced foliation fans around recumbent fold hinges, with kinematics reversing across the axial plane, indicating a flexural shear fold mechanism. Overall, the spaced foliation is penecontemporaneous with each individual slump where it occurs, although in detail it may pre-date, accompany or post-date the local folds. The identification of foliations within undoubted slump folds indicates that the presence or absence of foliation is not in itself a robust criterion to distinguish tectonic from soft-sediment folds. Extensional shear fractures displaying a range of temporal relationships with slump folds suggest that traditional single-cell flow models, in which extension is focussed at the head and contraction in the lower toe of the slump, are a gross simplification. We therefore propose a new multi-cell flow model involving coeval second-order flow cells that interact with neighbouring cells during translation of the slump.

  16. Food Thought Suppression Inventory: Test-retest reliability and relationship to weight loss treatment outcomes.

    PubMed

    Barnes, Rachel D; Ivezaj, Valentina; Grilo, Carlos M

    2016-08-01

    This study examined the test-retest reliability of the Food Thought Suppression Inventory (FTSI) and its relationship with weight loss during weight loss treatment. Participants were 89 adults with and without binge eating disorder (BED) recruited through primary care for weight loss treatment who completed the FTSI twice prior to starting treatment. Intra-class correlations for the FTSI ranged from .74 to .93. Participants with BED scored significantly higher on the FTSI than those without BED at baseline only. Percent weight loss from baseline to mid-treatment was significantly negatively correlated with the FTSI at baseline and at post-treatment. Participants reaching a 5% loss of original body weight by post-treatment had significantly lower FTSI scores at the post-treatment assessment than those who did not reach this weight loss goal. While baseline binge-eating episodes were significantly positively correlated with baseline FTSI scores, changes in binge-eating episodes during treatment were not significantly related to FTSI scores. The FTSI showed satisfactory one-week test-retest reliability. Higher levels of food thought suppression may impair individuals' ability to lose weight while receiving weight loss treatment. PMID:27112114

  17. Establishing a Reliable Depth-Age Relationship for the Denali Ice Core

    NASA Astrophysics Data System (ADS)

    Wake, C. P.; Osterberg, E. C.; Winski, D.; Ferris, D.; Kreutz, K. J.; Introne, D.; Dalton, M.

    2015-12-01

    Reliable climate reconstruction from ice core records requires the development of a reliable depth-age relationship. We have established a sub-annual resolution depth-age relationship for the upper 198 meters of a 208 m ice core recovered in 2013 from Mt. Hunter (3,900 m asl), Denali National Park, central Alaska. The dating of the ice core was accomplished via annual layer counting of glaciochemical time-series combined with the identification of reference horizons from volcanic eruptions and atmospheric nuclear weapons testing. Using the continuous ice core melter system at Dartmouth College, sub-seasonal samples have been collected and analyzed for major ions, liquid conductivity, particle size and concentration, and stable isotope ratios. Annual signals are apparent in several of the chemical species measured in the ice core samples. Calcium and magnesium peak in the spring, ammonium peaks in the summer, methanesulfonic acid (MSA) peaks in the autumn, and stable isotopes display a strong seasonal cycle with the most depleted values occurring during the winter. Thin ice layers representing infrequent summertime melt were also used to identify summer layers in the core. Analysis of approximately one-meter sections of the core via nondestructive gamma spectrometry over depths from 84 to 124 m identified a strong radioactive cesium-137 peak at 89 m, which corresponds to the 1963 layer deposited during extensive atmospheric nuclear weapons testing. Peaks in the sulfate and chloride records have been used for the preliminary identification of volcanic signals preserved in the ice core, including ten events since 1883. We are confident that the combination of robust annual layers and reference horizons provides a timescale for the 20th century with an error of less than 0.5 years, making calibrations between ice core records and instrumental climate data particularly robust. Initial annual layer counting through the entire 198 m suggests the Denali Ice

  18. An Examination of Coach and Player Relationships According to the Adapted LMX 7 Scale: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Caliskan, Gokhan

    2015-01-01

    The current study aims to test the reliability and validity of the Leader-Member Exchange (LMX 7) scale with regard to coach--player relationships in sports settings. A total of 330 professional soccer players from the Turkish Super League as well as from the First and Second Leagues participated in this study. Factor analyses were performed to…

  19. Merlino-Perkins Father-Daughter Relationship Inventory (MP-FDI): Construction, Reliability, Validity, and Implications for Counseling and Research

    ERIC Educational Resources Information Center

    Merlino Perkins, Rose J.

    2008-01-01

    The Merlino-Perkins Father-Daughter Relationship Inventory, a self-report instrument, assesses women's childhood interactions with supportive, doting, distant, controlling, tyrannical, physically abusive, absent, and seductive fathers. Item and scale development, psychometric findings drawn from factor analyses, reliability assessments, and…

  20. Quality of Relationships between Youth and Community Service Providers: Reliability and Validity of the Trusting Relationship Questionnaire

    ERIC Educational Resources Information Center

    Mustillo, Sarah A.; Dorsey, Shannon; Farmer, Elizabeth M. Z.

    2005-01-01

    We examined the factor structure and psychometric properties of the Trusting Relationship Questionnaire, a brief measure of relationship quality between youth and community-based service providers involved in their care. Data on youth residing in Therapeutic Foster Care and in Group Homes (N = 296) were collected. We identified a one-factor…

  1. The Relationship Quality Interview: Evidence of Reliability, Convergent and Divergent Validity, and Incremental Utility

    ERIC Educational Resources Information Center

    Lawrence, Erika; Barry, Robin A.; Brock, Rebecca L.; Bunde, Mali; Langer, Amie; Ro, Eunyoe; Fazio, Emily; Mulryan, Lorin; Hunt, Sara; Madsen, Lisa; Dzankovic, Sandra

    2011-01-01

    Relationship satisfaction and adjustment have been the target outcome variables for almost all couple research and therapies. In contrast, far less attention has been paid to the assessment of relationship quality. The present study introduces the Relationship Quality Interview (RQI), a semistructured, behaviorally anchored individual interview.…

  2. On the Relationship between Maximal Reliability and Maximal Validity of Linear Composites

    ERIC Educational Resources Information Center

    Penev, Spiridon; Raykov, Tenko

    2006-01-01

    A linear combination of a set of measures is often sought as an overall score summarizing subject performance. The weights in this composite can be selected to maximize its reliability or to maximize its validity, and the optimal choice of weights is in general not the same for these two optimality criteria. We explore several relationships…

  3. Reliable Attention Network Scores and Mutually Inhibited Inter-network Relationships Revealed by Mixed Design and Non-orthogonal Method

    PubMed Central

    Wang, Yi-Feng; Jing, Xiu-Juan; Liu, Feng; Li, Mei-Ling; Long, Zhi-Liang; Yan, Jin H.; Chen, Hua-Fu

    2015-01-01

    The attention system can be divided into alerting, orienting, and executive control networks. The efficiency and independence of attention networks have been widely tested with the attention network test (ANT) and its revised versions. However, many studies have failed to find effects of attention network scores (ANSs) and inter-network relationships (INRs). Moreover, the low reliability of ANSs cannot meet the demands of theoretical and empirical investigations. Two methodological factors (the inter-trial influence in the event-related design and the inter-network interference in orthogonal contrast) may be responsible for the unreliability of the ANT. In this study, we combined a mixed design and a non-orthogonal method to explore ANSs and directional INRs. With a small number of trials, we obtained reliable and independent ANSs (split-half reliability of alerting: 0.684; orienting: 0.588; and executive control: 0.616), suggesting an individual and specific attention system. Furthermore, mutual inhibition was observed when two networks operated simultaneously, indicating a differentiated but integrated attention system. Overall, the reliable and individually specific ANSs and mutually inhibited INRs provide novel insight into the understanding of the developmental, physiological and pathological mechanisms of attention networks, and can benefit future experimental and clinical investigations of attention using the ANT. PMID:25997025
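Split-half reliabilities like those reported above are commonly obtained by correlating per-subject scores from two halves of the trials (e.g., odd vs. even) and then applying the Spearman-Brown correction for full test length. Assuming that standard procedure, a sketch with invented scores:

```python
def pearson_r(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def split_half_reliability(half1, half2):
    """Correlate the two halves, then apply the Spearman-Brown
    prophecy formula to estimate full-length reliability."""
    r = pearson_r(half1, half2)
    return 2 * r / (1 + r)

# Per-subject attention scores from odd vs. even trials (invented data)
odd = [31, 45, 28, 52, 39, 47]
even = [29, 48, 30, 50, 41, 44]
rel = split_half_reliability(odd, even)
```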

  4. An Adaptation, Validity and Reliability of the Lifespan Sibling Relationship Scale to the Turkish Adolescents

    ERIC Educational Resources Information Center

    Öz, F. Selda

    2015-01-01

    The purpose of this study is to adapt the Lifespan Sibling Relationship Scale (LSRS) developed by Riggio (2000) to Turkish. The scale with its original form in English consists of 48 items in total. The original scale was translated into Turkish by three instructors who are proficient both in the field and the language. Later, the original and…

  5. Relationship between evolving epileptiform activity and delayed loss of mitochondrial activity after asphyxia measured by near-infrared spectroscopy in preterm fetal sheep

    PubMed Central

    Bennet, L; Roelfsema, V; Pathipati, P; Quaedackers, J S; Gunn, A J

    2006-01-01

    Early onset cerebral hypoperfusion after birth is highly correlated with neurological injury in premature infants, but the relationship with the evolution of injury remains unclear. We studied changes in cerebral oxygenation and cytochrome oxidase (CytOx) using near-infrared spectroscopy in preterm fetal sheep (103–104 days of gestation, term is 147 days) during recovery from a profound asphyxial insult (n = 7) that we have shown produces severe subcortical injury, or sham asphyxia (n = 7). From 1 h after asphyxia there was a significant secondary fall in carotid blood flow (P < 0.001) and total cerebral blood volume, as reflected by total haemoglobin (P < 0.005), which only partially recovered after 72 h. Intracerebral oxygenation (the difference between oxygenated and deoxygenated haemoglobin concentrations) fell transiently at 3 and 4 h after asphyxia (P < 0.01), followed by a substantial increase to well over sham control levels (P < 0.001). CytOx levels were normal in the first hour after occlusion, were greater than sham control values at 2–3 h (P < 0.05), but then progressively fell and became significantly suppressed from 10 h onward (P < 0.01). In the early hours after reperfusion the fetal EEG was highly suppressed, with a superimposed mixture of fast and slow epileptiform transients; overt seizures developed from 8 ± 0.5 h. These data strongly indicate that severe asphyxia leads to a delayed, evolving loss of mitochondrial oxidative metabolism, accompanied by late seizures and relative luxury perfusion. In contrast, the combination of relative cerebral deoxygenation with evolving epileptiform transients in the early recovery phase raises the possibility that these early events accelerate or worsen the subsequent mitochondrial failure. PMID:16484298

  6. The Inventory of Teacher-Student Relationships: Factor Structure, Reliability, and Validity among African American Youth in Low-Income Urban Schools

    ERIC Educational Resources Information Center

    Murray, Christopher; Zvoch, Keith

    2011-01-01

    This study investigates the factor structure, reliability, and validity of the Inventory of Teacher-Student Relationships (IT-SR), a measure that was developed by adapting the widely used Inventory of Parent and Peer Attachments (Armsden & Greenberg, 1987) for use in the context of teacher-student relationships. The instrument was field tested…

  7. The Bindex(®) ultrasound device: reliability of cortical bone thickness measures and their relationship to regional bone mineral density.

    PubMed

    Behrens, Martin; Felser, Sabine; Mau-Moeller, Anett; Weippert, Matthias; Pollex, Johannes; Skripitz, Ralf; Herlyn, Philipp K E; Fischer, Dagmar-C; Bruhn, Sven; Schober, Hans-Christof; Zschorlich, Volker; Mittlmeier, Thomas

    2016-09-01

    The Bindex(®) quantitative ultrasound (QUS) device is commercially available; this study analyzed (I) its relative and absolute intra- and inter-session reliability and (II) the relationship between the data provided by Bindex(®)-QUS and the bone mineral density (BMD) measured by dual-energy x-ray absorptiometry at corresponding skeletal sites in young and healthy subjects (age: 25.0 ± 3.6 years). Bindex(®)-QUS calculates a density index on the basis of the thickness of cortical bone measured at the distal radius and the distal plus proximal tibia. The data show very good relative and absolute intra-session (ICC = 0.977, CV = 1.5%) and inter-session reliability (ICC = 0.978, CV = 1.4%) for the density index. The highest positive correlations between cortical thickness and BMD were found for the distal radius and distal tibia (r ≥ 0.71, p < 0.001). The data indicate that the Bindex(®)-QUS parameters are repeatable within and between measurement sessions. Furthermore, the measurements reflect the BMD at specific skeletal sites. Bindex(®)-QUS might be a useful tool for the measurement of skeletal adaptations. PMID:27511629

  8. [Relationship between hope and subjective well-being: reliability and validity of the dispositional Hope Scale, Japanese version].

    PubMed

    Kato, Tsukasa; Snyder, C R

    2005-08-01

    We conducted three studies to translate the Snyder Hope Scale into Japanese, examine the reliability and validity of the Japanese version, and investigate the relationship between the tendency to be hopeful and subjective well-being. In Study 1, confirmatory factor analysis of the Japanese version supported the two-factor structure of the Hope Scale: agency and pathways. Its test-retest reliability coefficients, for data from 113 undergraduates, ranged from .81 to .84. In Study 2, the concurrent validity of the Japanese version of the Hope Scale was examined with data from 550 respondents, looking at the correlations between hope and optimism, self-esteem, and self-efficacy. Results suggested that the Japanese version had high validity. In addition, the tendency to be hopeful had negative correlations with stress response, hopelessness, depressive tendency, and trait anxiety, and a positive correlation with feelings of happiness. In Study 3, 175 undergraduates completed the Hope Scale and the State-Trait Anxiety Inventory (STAI) immediately prior to final examinations. Results of regression analysis suggested that the tendency to be hopeful moderated examination anxiety. Taken together, the results of the studies supported the hypothesis that hope has positive effects on subjective well-being. PMID:16200877

  9. Self Evolving Modular Network

    NASA Astrophysics Data System (ADS)

    Tokunaga, Kazuhiro; Kawabata, Nobuyuki; Furukawa, Tetsuo

    We propose a novel modular network called the Self-Evolving Modular Network (SEEM). The SEEM has a modular network architecture with a graph structure and the following advantages: (1) new modules are added incrementally, allowing the network to adapt in a self-organizing manner, and (2) paths in the graph are formed based on the relationships between the models represented by the modules. The SEEM is expected to be applicable to evolving the functions of an autonomous robot in a self-organizing manner through interaction with the robot's environment, and to categorizing large-scale information. This paper presents the architecture and an algorithm for the SEEM. Moreover, the performance characteristics and effectiveness of the network are shown by simulations using cubic functions and a set of 3D objects.

  10. How reliable are randomised controlled trials for studying the relationship between diet and disease? A narrative review.

    PubMed

    Temple, Norman J

    2016-08-01

    Large numbers of randomised controlled trials (RCT) have been carried out in order to investigate diet-disease relationships. This article examines eight sets of studies and compares the findings with those from epidemiological studies (cohort studies in seven of the cases). The studies cover the role of dietary factors in blood pressure, body weight, cancer and heart disease. In some cases, the findings from the two types of study are consistent, whereas in other cases the findings appear to be in conflict. A critical evaluation of this evidence suggests factors that may account for conflicting findings. Very often RCT recruit subjects with a history of the disease under study (or at high risk of it) and have a follow-up of only a few weeks or months. Cohort studies, in contrast, typically recruit healthy subjects and have a follow-up of 5-15 years. Owing to these differences, findings from RCT are not necessarily more reliable than those from well-designed prospective cohort studies. We cannot assume that the results of RCT can be freely applied beyond the specific features of the studies. PMID:27267302

  11. An Investigation of the Relationship between Reliability, Power, and the Type I Error Rate of the Mantel-Haenszel and Simultaneous Item Bias Detection Procedures.

    ERIC Educational Resources Information Center

    Ackerman, Terry A.; Evans, John A.

    The relationship between levels of reliability and the power of two bias and differential item functioning (DIF) detection methods is examined. Both methods, the Mantel-Haenszel (MH) procedure of P. W. Holland and D. T. Thayer (1988) and the Simultaneous Item Bias (SIB) procedure of R. Shealy and W. Stout (1991), use examinees' raw scores as a…

  12. Changing and Evolving Relationships between Two- and Four-Year Colleges and Universities: They're Not Your Parents' Community Colleges Anymore

    ERIC Educational Resources Information Center

    Labov, Jay B.

    2012-01-01

    This paper describes a summit on Community Colleges in the Evolving STEM Education Landscape organized by a committee of the National Research Council (NRC) and the National Academy of Engineering (NAE) and held at the Carnegie Institution for Science on December 15, 2011. This summit followed a similar event organized by Dr. Jill Biden, spouse of…

  13. The Assessment of Positivity and Negativity in Social Networks: The Reliability and Validity of the Social Relationships Index

    ERIC Educational Resources Information Center

    Campo, Rebecca A.; Uchino, Bert N.; Holt-Lunstad, Julianne; Vaughn, Allison; Reblin, Maija; Smith, Timothy W.

    2009-01-01

    The Social Relationships Index (SRI) was designed to examine positivity and negativity in social relationships. Unique features of this scale include its brevity and the ability to examine relationship positivity and negativity at the level of the specific individual and social network. The SRI's psychometric properties were examined in three…

  14. Evolvable synthetic neural system

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A. (Inventor)

    2009-01-01

    An evolvable synthetic neural system includes an evolvable neural interface operably coupled to at least one neural basis function. Each neural basis function includes an evolvable neural interface operably coupled to a heuristic neural system to perform high-level functions and an autonomic neural system to perform low-level functions. In some embodiments, the evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy.

  15. NON-POINT SOURCE--STREAM NUTRIENT LEVEL RELATIONSHIPS: A NATIONWIDE STUDY. SUPPLEMENT 1: NUTRIENT MAP RELIABILITY

    EPA Science Inventory

    The National Eutrophication Survey (NES) national maps of non-point source nitrogen and phosphorus concentrations in streams were evaluated for applicability and reliability. Interpretations on these maps which were based on data from 928 sampling sites associated with non-point ...

  16. Reliability, Validity, and Associations with Sexual Behavior among Ghanaian Teenagers of Scales Measuring Four Dimensions of Relationships with Parents and Other Adults

    PubMed Central

    Bingenheimer, Jeffrey B.; Asante, Elizabeth; Ahiadeke, Clement

    2013-01-01

    Little research has been done on the social contexts of adolescent sexual behaviors in sub-Saharan Africa. As part of a longitudinal cohort study (N=1275) of teenage girls and boys in two Ghanaian towns, interviewers administered a 26-item questionnaire module intended to assess four dimensions of youth-adult relationships: monitoring, conflict, emotional support, and financial support. Confirmatory factor and traditional psychometric analyses showed the four scales to be reliable. Known-groups comparisons provided evidence of their validity. All four scales had strong bivariate associations with self-reported sexual behavior (odds ratios = 1.66, 0.74, 0.47, and 0.60 for conflict, emotional support, monitoring, and financial support). The instrument is practical for use in sub-Saharan African settings and produces measures that are reliable, valid, and predictive of sexual behavior in youth. PMID:25821286

  17. Making Reliability Arguments in Classrooms

    ERIC Educational Resources Information Center

    Parkes, Jay; Giron, Tilia

    2006-01-01

    Reliability methodology needs to evolve as validity has done into an argument supported by theory and empirical evidence. Nowhere is the inadequacy of current methods more visible than in classroom assessment. Reliability arguments would also permit additional methodologies for evidencing reliability in classrooms. It would liberalize methodology…

  18. Reliability and Validity of the Parent-Child Relationship Inventory (PCRI): Evidence from a Longitudinal Cross-Informant Investigation

    ERIC Educational Resources Information Center

    Coffman, Jacqueline K.; Guerin, Diana Wright; Gottfried, Allen W.

    2006-01-01

    Psychometric properties of the Parent-Child Relationship Inventory (PCRI) were examined using data collected from adolescents and their parents in the Fullerton Longitudinal Study. Results revealed acceptable internal consistency for most scales and moderate to high 1-year stability for all scales. Both parents' PCRI scores correlated with their…

  19. Prokaryote and eukaryote evolvability.

    PubMed

    Poole, Anthony M; Phillips, Matthew J; Penny, David

    2003-05-01

    The concept of evolvability covers a broad spectrum of often contradictory ideas. At one end of the spectrum it is equivalent to the statement that evolution is possible, at the other end are untestable post hoc explanations, such as the suggestion that current evolutionary theory cannot explain the evolution of evolvability. We examine similarities and differences in eukaryote and prokaryote evolvability, and look for explanations that are compatible with a wide range of observations. Differences in genome organisation between eukaryotes and prokaryotes meet this criterion. The single origin of replication in prokaryote chromosomes (versus multiple origins in eukaryotes) accounts for many differences because the time to replicate a prokaryote genome limits its size (and the accumulation of junk DNA). Both prokaryotes and eukaryotes appear to switch from genetic stability to genetic change in response to stress. We examine a range of stress responses, and discuss how these impact on evolvability, particularly in unicellular organisms versus complex multicellular ones. Evolvability is also limited by environmental interactions (including competition) and we describe a model that places limits on potential evolvability. Examples are given of its application to predator competition and limits to lateral gene transfer. We suggest that unicellular organisms evolve largely through a process of metabolic change, resulting in biochemical diversity. Multicellular organisms evolve largely through morphological changes, not through extensive changes to cellular biochemistry. PMID:12689728

  20. REFLECTIONS ON EVOLVING CHANGE.

    PubMed

    Angood, Peter B

    2016-01-01

    Physician leadership is increasingly recognized as pivotal for improved change in health care. Multi-professional care teams, education and leadership are evolving trends that are important for health care's future. PMID:27295737

  1. The Skin Cancer and Sun Knowledge (SCSK) Scale: Validity, Reliability, and Relationship to Sun-Related Behaviors Among Young Western Adults.

    PubMed

    Day, Ashley K; Wilson, Carlene; Roberts, Rachel M; Hutchinson, Amanda D

    2014-08-01

    Increasing public knowledge remains one of the key aims of skin cancer awareness campaigns, yet diagnosis rates continue to rise. It is essential we measure skin cancer knowledge adequately so as to determine the nature of its relationship to sun-related behaviors. This study investigated the psychometric properties of a new measure of skin cancer knowledge, the Skin Cancer and Sun Knowledge (SCSK) scale. A total of 514 Western young adults (females n = 320, males n = 194) aged 18 to 26 years completed measures of skin type, skin cancer knowledge, tanning behavior, sun exposure, and sun protection. Two-week test-retest of the SCSK was conducted with 52 participants. Internal reliability of the SCSK scale was acceptable (KR-20 = .69), test-retest reliability was high (r = .83, n = 52), and acceptable levels of face, content, and incremental validity were demonstrated. Skin cancer knowledge (as measured by SCSK) correlated with sun protection, sun exposure, and tanning behaviors in the female sample, but not in the male sample. Skin cancer knowledge appears to be more relevant to the behavior of young women than that of young men. We recommend that future research establish the validity of the SCSK across a range of participant groups. PMID:24722215
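The KR-20 coefficient reported for the SCSK is straightforward to compute from a binary item-response matrix. A minimal sketch of the standard formula (not the authors' code; the example data are invented):

```python
from statistics import pvariance

def kr20(items):
    """Kuder-Richardson formula 20 for 0/1-scored items.

    items: list of per-examinee response rows (1 = correct, 0 = incorrect).
    """
    n, k = len(items), len(items[0])
    # proportion of examinees answering each item correctly
    p = [sum(row[j] for row in items) / n for j in range(k)]
    pq_sum = sum(pi * (1 - pi) for pi in p)
    total_var = pvariance(sum(row) for row in items)  # population variance of total scores
    return (k / (k - 1)) * (1 - pq_sum / total_var)

# Perfectly consistent answer patterns yield KR-20 = 1.0
print(kr20([[1, 1, 1], [0, 0, 0]]))  # → 1.0
```

Like coefficient alpha, KR-20 rises when items covary; a value of .69, as reported above, indicates moderate internal consistency.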

  2. Evolving Digital Ecological Networks

    PubMed Central

    Wagner, Aaron P.; Ofria, Charles

    2013-01-01

    “It is hard to realize that the living world as we know it is just one among many possibilities” [1]. Evolving digital ecological networks are webs of interacting, self-replicating, and evolving computer programs (i.e., digital organisms) that experience the same major ecological interactions as biological organisms (e.g., competition, predation, parasitism, and mutualism). Despite being computational, these programs evolve quickly in an open-ended way, and starting from only one or two ancestral organisms, the formation of ecological networks can be observed in real-time by tracking interactions between the constantly evolving organism phenotypes. These phenotypes may be defined by combinations of logical computations (hereafter tasks) that digital organisms perform and by expressed behaviors that have evolved. The types and outcomes of interactions between phenotypes are determined by task overlap for logic-defined phenotypes and by responses to encounters in the case of behavioral phenotypes. Biologists use these evolving networks to study active and fundamental topics within evolutionary ecology (e.g., the extent to which the architecture of multispecies networks shape coevolutionary outcomes, and the processes involved). PMID:23533370

  3. Relationship Between Agility Tests and Short Sprints: Reliability and Smallest Worthwhile Difference in National Collegiate Athletic Association Division-I Football Players.

    PubMed

    Mann, J Bryan; Ivey, Pat A; Mayhew, Jerry L; Schumacher, Richard M; Brechue, William F

    2016-04-01

    The Pro-Agility test (I-Test) and 3-cone drill (3-CD) are widely used in football to assess quickness in change of direction. Likewise, the 10-yard (yd) sprint, a test of sprint acceleration, is gaining popularity for testing physical competency in football players. Despite their frequent use, little information exists on the relationship between agility and sprint tests as well as the reliability and degree of change necessary to indicate meaningful improvement resulting from training. The purpose of this study was to determine the reliability and smallest worthwhile difference (SWD) of the I-Test and 3-CD and the relationship of sprint acceleration to their performance. Division-I football players (n = 64, age = 20.5 ± 1.2 years, height = 185.2 ± 6.1 cm, body mass = 107.8 ± 20.7 kg) performed duplicate trials in each test during 2 separate weeks at the conclusion of a winter conditioning period. The better time of the 2 trials for each week was used for comparison. The 10-yd sprint was timed electronically, whereas the I-Test and 3-CD were hand timed by experienced testers. Each trial was performed on an indoor synthetic turf, with players wearing multicleated turf shoes. There was no significant difference (p > 0.06) between test weeks for the I-Test (4.53 ± 0.35 vs. 4.54 ± 0.31 seconds), 3-CD (7.45 ± 0.06 vs. 7.49 ± 0.06 seconds), or 10-yd sprint (1.85 ± 0.12 vs. 1.84 ± 0.12 seconds). The intraclass correlation coefficients (ICC) for 3-CD (ICC = 0.962) and 10-yd sprint (ICC = 0.974) were slightly higher than for the I-Test (ICC = 0.914). These values lead to acceptable levels of the coefficient of variation for each test (1.2, 1.2, and 1.9%, respectively). The SWD% indicated that a meaningful improvement due to training would require players to decrease their times by 6.6% for I-Test, 3.7% for 3-CD, and 3.8% for 10-yd sprint. Performance in agility and short sprint tests are highly related and reliable in college football players, providing quantifiable…
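The CV% and SWD statistics in this abstract can be derived from duplicate trials. A sketch in the common spreadsheet style popularized by Hopkins, assuming SWD = 0.2 × between-subject SD; this is an illustration with invented times, not necessarily the study's exact computation:

```python
from statistics import mean, stdev

def trial_reliability(week1, week2):
    """Typical error, CV%, and SWD% from paired trials (Hopkins-style sketch)."""
    diffs = [b - a for a, b in zip(week1, week2)]
    typical_error = stdev(diffs) / 2 ** 0.5              # within-subject variation
    grand_mean = mean(week1 + week2)
    cv_pct = 100 * typical_error / grand_mean
    subject_means = [(a + b) / 2 for a, b in zip(week1, week2)]
    swd_pct = 100 * 0.2 * stdev(subject_means) / grand_mean  # SWD = 0.2 x between-subject SD
    return cv_pct, swd_pct

# Illustrative I-Test times (seconds) for four players across two weeks:
cv_pct, swd_pct = trial_reliability([4.5, 4.6, 4.4, 4.7], [4.6, 4.5, 4.5, 4.6])
```

A training effect smaller than the SWD% cannot be distinguished from week-to-week noise, which is why the paper frames meaningful improvement in those terms.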

  4. On Component Reliability and System Reliability for Space Missions

    NASA Technical Reports Server (NTRS)

    Chen, Yuan; Gillespie, Amanda M.; Monaghan, Mark W.; Sampson, Michael J.; Hodson, Robert F.

    2012-01-01

    This paper is to address the basics, the limitations and the relationship between component reliability and system reliability through a study of flight computing architectures and related avionics components for NASA future missions. Component reliability analysis and system reliability analysis need to be evaluated at the same time, and the limitations of each analysis and the relationship between the two analyses need to be understood.
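The component-versus-system distinction the paper raises can be illustrated with the elementary series/parallel reliability formulas. A minimal sketch with hypothetical numbers (not taken from the paper):

```python
def series(*rs):
    """System of components in series: every component must work."""
    r = 1.0
    for x in rs:
        r *= x
    return r

def parallel(*rs):
    """Redundant components in parallel: the system works if any one works."""
    fail = 1.0
    for x in rs:
        fail *= 1.0 - x
    return 1.0 - fail

# Hypothetical numbers: a dual-redundant flight computer (r = 0.95 each)
# in series with a single avionics bus (r = 0.99).
system_r = series(parallel(0.95, 0.95), 0.99)
print(round(system_r, 6))  # → 0.987525
```

The example shows why the two analyses must be evaluated together: redundancy lifts the computer pair well above its component reliability, yet the single-string bus still caps the system.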

  5. An Evolving Astrobiology Glossary

    NASA Astrophysics Data System (ADS)

    Meech, K. J.; Dolci, W. W.

    2009-12-01

    One of the resources that evolved from the Bioastronomy 2007 meeting was an online interdisciplinary glossary of terms that might not be universally familiar to researchers in all sub-disciplines feeding into astrobiology. In order to facilitate comprehension of the presentations during the meeting, a database driven web tool for online glossary definitions was developed and participants were invited to contribute prior to the meeting. The glossary was downloaded and included in the conference registration materials for use at the meeting. The glossary web tool has now been delivered to the NASA Astrobiology Institute so that it can continue to grow as an evolving resource for the astrobiology community.

  6. ILZRO-sponsored field data collection and analysis to determine relationships between service conditions and reliability of VRLA batteries in stationary applications

    SciTech Connect

    Taylor, P.A.; Moseley, P.T.; Butler, P.C.

    1998-09-01

    Although valve-regulated lead-acid (VRLA) batteries have served in stationary applications for more than a decade, proprietary concerns of battery manufacturers and users and varying approaches to record-keeping have made the data available on performance and life relatively sparse and inconsistent. Such incomplete data are particularly detrimental to understanding the cause or causes of premature capacity loss (PCL) reported in VRLA batteries after as little as two years of service. The International Lead Zinc Research Organization (ILZRO), in cooperation with Sandia National Laboratories, has initiated a multi-phase project to characterize relationships between batteries, service conditions, and failure modes; establish the degree of correlation between specific operating procedures and PCL; identify operating procedures that mitigate PCL; identify best-fits between the operating requirements of specific applications and the capabilities of specific VRLA technologies; and recommend combinations of battery design, manufacturing processes, and operating conditions that enhance VRLA performance and reliability. This paper, prepared before preliminary conclusions were possible, presents the surveys distributed to manufacturers and end-users; discusses the analytic approach; presents an overview of the responses to the surveys and trends that emerge in the early analysis of the data; and previews the functionality of the database being constructed. The presentation of this paper will include preliminary results and information regarding the follow-on workshop for the study.

  7. Evolvable Neural Software System

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.

  8. Evolving Robust Gene Regulatory Networks

    PubMed Central

    Noman, Nasimul; Monjo, Taku; Moscato, Pablo; Iba, Hitoshi

    2015-01-01

    Design and implementation of robust network modules is essential for construction of complex biological systems through hierarchical assembly of ‘parts’ and ‘devices’. The robustness of gene regulatory networks (GRNs) is ascribed chiefly to the underlying topology. The automatic designing capability of GRN topology that can exhibit robust behavior can dramatically change the current practice in synthetic biology. A recent study shows that Darwinian evolution can gradually develop higher topological robustness. Subsequently, this work presents an evolutionary algorithm that simulates natural evolution in silico, for identifying network topologies that are robust to perturbations. We present a Monte Carlo-based method for quantifying topological robustness and a fitness approximation approach that makes this computationally intensive calculation efficient. The proposed framework was verified using two classic GRN behaviors: oscillation and bistability, although the framework is generalized for evolving other types of responses. The algorithm identified robust GRN architectures, which were verified through further analyses and comparisons. Analysis of the results also shed light on the relationship among robustness, cooperativity and complexity. This study also shows that nature has already evolved very robust architectures for its crucial systems; hence simulation of this natural process can be very valuable for designing robust biological systems. PMID:25616055

  9. Regolith Evolved Gas Analyzer

    NASA Technical Reports Server (NTRS)

    Hoffman, John H.; Hedgecock, Jud; Nienaber, Terry; Cooper, Bonnie; Allen, Carlton; Ming, Doug

    2000-01-01

    The Regolith Evolved Gas Analyzer (REGA) is a high-temperature furnace and mass spectrometer instrument for determining the mineralogical composition and reactivity of soil samples. REGA provides key mineralogical and reactivity data that is needed to understand the soil chemistry of an asteroid, which then aids in determining in-situ which materials should be selected for return to earth. REGA is capable of conducting a number of direct soil measurements that are unique to this instrument. These experimental measurements include: (1) Mass spectrum analysis of evolved gases from soil samples as they are heated from ambient temperature to 900 C; and (2) Identification of liberated chemicals, e.g., water, oxygen, sulfur, chlorine, and fluorine. REGA would be placed on the surface of a near earth asteroid. It is an autonomous instrument that is controlled from earth but does the analysis of regolith materials automatically. The REGA instrument consists of four primary components: (1) a flight-proven mass spectrometer, (2) a high-temperature furnace, (3) a soil handling system, and (4) a microcontroller. An external arm containing a scoop or drill gathers regolith samples. A sample is placed in the inlet orifice where the finest-grained particles are sifted into a metering volume and subsequently moved into a crucible. A movable arm then places the crucible in the furnace. The furnace is closed, thereby sealing the inner volume to collect the evolved gases for analysis. Owing to the very low g forces on an asteroid compared to Mars or the moon, the sample must be moved from inlet to crucible by mechanical means rather than by gravity. As the soil sample is heated through a programmed pattern, the gases evolved at each temperature are passed through a transfer tube to the mass spectrometer for analysis and identification. Return data from the instrument will lead to new insights and discoveries including: (1) Identification of the molecular masses of all of the gases

  10. Redfield's evolving legacy

    NASA Astrophysics Data System (ADS)

    Gruber, Nicolas; Deutsch, Curtis A.

    2014-12-01

    The ratio of nitrogen to phosphorus in organic matter is close to that in seawater, a relationship maintained through a set of biological feedbacks. The rapid delivery of nutrients from human activities may test the efficacy of these processes.

  11. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1989-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.
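The u-plot mentioned above compares the model's predicted probabilities against what was actually observed. A sketch of the summary statistic behind it (the standard construction, not the authors' implementation):

```python
def u_plot_distance(us):
    """Kolmogorov distance between a sample of u_i values and the uniform CDF.

    Each u_i is the model's predicted CDF evaluated at the observed
    inter-failure time; if the predictions are accurate, the u_i should
    look uniform on (0, 1), and the distance should be small.
    """
    us = sorted(us)
    n = len(us)
    d = 0.0
    for i, u in enumerate(us):
        d = max(d, (i + 1) / n - u, u - i / n)
    return d

# Optimistically biased predictions pile the u_i near 1:
print(u_plot_distance([0.7, 0.8, 0.9, 0.95]))  # → 0.7
print(u_plot_distance([0.2, 0.4, 0.6, 0.8]))   # → 0.2
```

The systematic departure the u-plot reveals is exactly what recalibration then corrects: the estimated mapping between predicted and true reliability is applied to future predictions.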

  12. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1990-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.

  13. Stochastically evolving networks

    NASA Astrophysics Data System (ADS)

    Chan, Derek Y.; Hughes, Barry D.; Leong, Alex S.; Reed, William J.

    2003-12-01

    We discuss a class of models for the evolution of networks in which new nodes are recruited into the network at random times, and links between existing nodes that are not yet directly connected may also form at random times. The class contains both models that produce “small-world” networks and less tightly linked models. We produce both trees, appropriate in certain biological applications, and networks in which closed loops can appear, which model communication networks and networks of human sexual interactions. One of our models is closely related to random recursive trees, and some exact results known in that context can be exploited. The other models are more subtle and difficult to analyze. Our analysis includes a number of exact results for moments, correlations, and distributions of coordination number and network size. We report simulations and also discuss some mean-field approximations. If the system has evolved for a long time and the state of a random node (which thus has a random age) is observed, power-law distributions for properties of the system arise in some of these models.
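The random recursive tree that some of these models reduce to is simple to simulate: each new node attaches to a uniformly random existing node. A minimal sketch (an illustration of that baseline model only, not the paper's full class of models):

```python
import random

def random_recursive_tree(n, seed=1):
    """Return a parent array for a random recursive tree on n nodes.

    Node i (i >= 1) attaches to a uniformly random earlier node,
    so parents[i] < i always holds; node 0 is the root.
    """
    rng = random.Random(seed)
    return [None] + [rng.randrange(i) for i in range(1, n)]

parents = random_recursive_tree(10_000)
# Tally each node's number of children (coordination number minus the
# link to its own parent, except at the root):
children = [0] * len(parents)
for p in parents[1:]:
    children[p] += 1
```

Statistics of `children` can then be compared against the exact moment and distribution results the paper derives for this model.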

  14. Fat: an evolving issue

    PubMed Central

    Speakman, John R.; O’Rahilly, Stephen

    2012-01-01

    Summary Work on obesity is evolving, and obesity is a consequence of our evolutionary history. In the space of 50 years, we have become an obese species. The reasons why can be addressed at a number of different levels. These include separating between whether the primary cause lies on the food intake or energy expenditure side of the energy balance equation, and determining how genetic and environmental effects contribute to weight variation between individuals. Opinion on whether increased food intake or decreased energy expenditure drives the obesity epidemic is still divided, but recent evidence favours the idea that food intake, rather than altered expenditure, is most important. There is more of a consensus that genetics explains most (probably around 65%) of weight variation between individuals. Recent advances in genome-wide association studies have identified many polymorphisms that are linked to obesity, yet much of the genetic variance remains unexplained. Finding the causes of this unexplained variation will be an impetus of genetic and epigenetic research on obesity over the next decade. Many environmental factors – including gut microbiota, stress and endocrine disruptors – have been linked to the risk of developing obesity. A better understanding of gene-by-environment interactions will also be key to understanding obesity in the years to come. PMID:22915015

  15. Evolving endoscopic surgery.

    PubMed

    Sakai, Paulo; Faintuch, Joel

    2014-06-01

    Since the days of Albukasim in medieval Spain, natural orifices have been regarded not only as a rather repugnant source of bodily odors, fluids and excreta, but also as a convenient invitation to explore and treat the inner passages of the organism. However, surgical ingenuity needed to be matched by appropriate tools and devices. Lack of technologically advanced instrumentation was a strong deterrent during almost a millennium until recent decades when a quantum jump materialized. Endoscopic surgery is currently a vibrant and growing subspecialty, which successfully handles millions of patients every year. Additional opportunities lie ahead which might benefit millions more, however, requiring even more sophisticated apparatuses, particularly in the field of robotics, artificial intelligence, and tissue repair (surgical suturing). This is a particularly exciting and worthwhile challenge, namely of larger and safer endoscopic interventions, followed by seamless and scarless recovery. In synthesis, the future is widely open for those who use together intelligence and creativity to develop new prototypes, new accessories and new techniques. Yet there are many challenges in the path of endoscopic surgery. In this new era of robotic endoscopy, one will likely need a virtual simulator to train and assess the performance of younger doctors. More evidence will be essential in multiple evolving fields, particularly to elucidate whether more ambitious and complex pathways, such as intrathoracic and intraperitoneal surgery via natural orifice transluminal endoscopic surgery (NOTES), are superior or not to conventional techniques. PMID:24628672

  16. Communicability across evolving networks

    NASA Astrophysics Data System (ADS)

    Grindrod, Peter; Parsons, Mark C.; Higham, Desmond J.; Estrada, Ernesto

    2011-04-01

    Many natural and technological applications generate time-ordered sequences of networks, defined over a fixed set of nodes; for example, time-stamped information about “who phoned who” or “who came into contact with who” arise naturally in studies of communication and the spread of disease. Concepts and algorithms for static networks do not immediately carry through to this dynamic setting. For example, suppose A and B interact in the morning, and then B and C interact in the afternoon. Information, or disease, may then pass from A to C, but not vice versa. This subtlety is lost if we simply summarize using the daily aggregate network given by the chain A-B-C. However, using a natural definition of a walk on an evolving network, we show that classic centrality measures from the static setting can be extended in a computationally convenient manner. In particular, communicability indices can be computed to summarize the ability of each node to broadcast and receive information. The computations involve basic operations in linear algebra, and the asymmetry caused by time’s arrow is captured naturally through the noncommutativity of matrix-matrix multiplication. Illustrative examples are given for both synthetic and real-world communication data sets. We also discuss the use of the new centrality measures for real-time monitoring and prediction.
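The morning/afternoon example above can be made concrete with the dynamic communicability matrix Q, the product of Katz-like resolvents taken in time order; row sums rank nodes as broadcasters and column sums as receivers. A small sketch assuming NumPy and a downweighting parameter a below the inverse spectral radius:

```python
import numpy as np

def dynamic_communicability(adjs, a=0.5):
    """Q = prod_k (I - a*A_k)^{-1} over a time-ordered adjacency sequence.

    Requires a < 1 / (largest spectral radius among the A_k).
    Row sums of Q score nodes as broadcasters, column sums as receivers.
    """
    n = adjs[0].shape[0]
    Q = np.eye(n)
    for A in adjs:  # order matters: time's arrow via non-commuting products
        Q = Q @ np.linalg.inv(np.eye(n) - a * A)
    return Q.sum(axis=1), Q.sum(axis=0)  # broadcast, receive

# A-B interact in the morning (A1), B-C in the afternoon (A2):
A1 = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], float)
A2 = np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0]], float)
bcast, recv = dynamic_communicability([A1, A2])
# A can reach C through B, but not vice versa:
print(bcast[0] > bcast[2], recv[2] > recv[0])  # → True True
```

Swapping the order of A1 and A2 reverses the asymmetry, which is precisely the information lost by the aggregate chain A-B-C.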

  17. Evolving synergetic interactions

    PubMed Central

    Wu, Bin; Arranz, Jordi; Du, Jinming; Zhou, Da; Traulsen, Arne

    2016-01-01

    Cooperators forgo their own interests to benefit others. This reduces their fitness and thus cooperators are not likely to spread based on natural selection. Nonetheless, cooperation is widespread on every level of biological organization ranging from bacterial communities to human society. Mathematical models can help to explain under which circumstances cooperation evolves. Evolutionary game theory is a powerful mathematical tool to depict the interactions between cooperators and defectors. Classical models typically involve either pairwise interactions between individuals or a linear superposition of these interactions. For interactions within groups, however, synergetic effects may arise: their outcome is not just the sum of its parts. This is because the payoffs via a single group interaction can be different from the sum of any collection of two-player interactions. Assuming that all interactions start from pairs, how can such synergetic multiplayer games emerge from simpler pairwise interactions? Here, we present a mathematical model that captures the transition from pairwise interactions to synergetic multiplayer ones. We assume that different social groups have different breaking rates. We show that non-uniform breaking rates do foster the emergence of synergy, even though individuals always interact in pairs. Our work sheds new light on the mechanisms underlying such synergetic interactions. PMID:27466437

  18. Evolving synergetic interactions.

    PubMed

    Wu, Bin; Arranz, Jordi; Du, Jinming; Zhou, Da; Traulsen, Arne

    2016-07-01

    Cooperators forgo their own interests to benefit others. This reduces their fitness and thus cooperators are not likely to spread based on natural selection. Nonetheless, cooperation is widespread on every level of biological organization ranging from bacterial communities to human society. Mathematical models can help to explain under which circumstances cooperation evolves. Evolutionary game theory is a powerful mathematical tool to depict the interactions between cooperators and defectors. Classical models typically involve either pairwise interactions between individuals or a linear superposition of these interactions. For interactions within groups, however, synergetic effects may arise: their outcome is not just the sum of its parts. This is because the payoffs via a single group interaction can be different from the sum of any collection of two-player interactions. Assuming that all interactions start from pairs, how can such synergetic multiplayer games emerge from simpler pairwise interactions? Here, we present a mathematical model that captures the transition from pairwise interactions to synergetic multiplayer ones. We assume that different social groups have different breaking rates. We show that non-uniform breaking rates do foster the emergence of synergy, even though individuals always interact in pairs. Our work sheds new light on the mechanisms underlying such synergetic interactions. PMID:27466437

  19. Reliability training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost, reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  20. Photovoltaic performance and reliability workshop

    SciTech Connect

    Mrig, L.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986--1993. PV performance and PV reliability are at least as important as PV cost, if not more so. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange technical knowledge and field experience related to current information in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop, held in September 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  1. Photovoltaic performance and reliability workshop

    NASA Astrophysics Data System (ADS)

    Mrig, L.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986-1993. PV performance and PV reliability are at least as important as PV cost, if not more so. In the U.S., PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange technical knowledge and field experience related to current information in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop, held in September 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  2. Hyper massive black holes in evolved galaxies

    NASA Astrophysics Data System (ADS)

    Romero-Cruz, Fernando J.

    2015-09-01

    From the SDSS DR7 we took a sample of 16733 galaxies that do not show all of the emission lines required to classify their activity according to the classical BPT diagram (Baldwin et al. 1981 PASP). Since they lack these emission lines, they are thought to be evolved enough to host hyper massive black holes. We compared their statistical properties with those of other SDSS DR7 galaxies that do show emission lines and confirmed that their M-sigma relationship corresponds to HMBHs (Gutelkin et al. 2009 ApJ) and that their SFH confirms evolution. We also analyzed them with a new diagnostic diagram in the IR (Coziol et al. 2015 AJ) and found that their position in the IR color space (W3W4 vs W2W3) corresponds to AGN activity with currently low SF, another confirmation of an evolved galaxy. The final sample occupies the same region of the IR diagram as Holm 15A, the galaxy considered to host the most massive BH in the nearby universe (Lopez-Cruz et al. 2014 ApJL). The morphology of these galaxies (all classified as elliptical) confirms that they are very evolved. We conclude that hyper massive BHs lie in very evolved galaxies with very low SF and no clear AGN activity in the BPT diagram.

  3. Disgust: Evolved Function and Structure

    ERIC Educational Resources Information Center

    Tybur, Joshua M.; Lieberman, Debra; Kurzban, Robert; DeScioli, Peter

    2013-01-01

    Interest in and research on disgust has surged over the past few decades. The field, however, still lacks a coherent theoretical framework for understanding the evolved function or functions of disgust. Here we present such a framework, emphasizing 2 levels of analysis: that of evolved function and that of information processing. Although there is…

  4. Reliability physics

    NASA Technical Reports Server (NTRS)

    Cuddihy, E. F.; Ross, R. G., Jr.

    1984-01-01

    Speakers whose topics relate to the reliability physics of solar arrays are listed and their topics briefly reviewed. Nine reports are reviewed ranging in subjects from studies of photothermal degradation in encapsulants and polymerizable ultraviolet stabilizers to interface bonding stability to electrochemical degradation of photovoltaic modules.

  5. The Reliability of Density Measurements.

    ERIC Educational Resources Information Center

    Crothers, Charles

    1978-01-01

    Data from a land-use study of small- and medium-sized towns in New Zealand are used to ascertain the relationship between official and effective density measures. It was found that the reliability of official measures of density is very low overall, although reliability increases with community size. (Author/RLV)

  6. Automated gas burette system for evolved hydrogen measurements

    SciTech Connect

    Zheng Feng; Rassat, Scot D.; Helderandt, David J.; Caldwell, Dustin D.; Aardahl, Christopher L.; Autrey, Tom; Linehan, John C.; Rappe, Kenneth G.

    2008-08-15

    This paper reports a simple and efficient gas burette system that allows automated determination of evolved gas volume in real time using only temperature and pressure measurements. The system is reliable and has been used successfully to study the hydrogen release kinetics of ammonia borane thermolysis. The system is especially suitable for bench scale studies involving small batches and potentially rapid reaction kinetics.
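
    The burette principle described above reduces to the ideal gas law: in a fixed headspace volume, the moles of evolved gas follow from the measured pressure rise at a known temperature, n = pV/(RT). A minimal sketch (all numerical inputs are hypothetical, not from the paper):

```python
# Evolved-gas bookkeeping via the ideal gas law, n = pV/(RT).

R_GAS = 8.314  # gas constant, J/(mol*K)

def moles_evolved(delta_p_pa, headspace_m3, temp_k):
    """Moles of gas evolved for a pressure rise delta_p in a fixed headspace."""
    return delta_p_pa * headspace_m3 / (R_GAS * temp_k)

def volume_at_stp(n_mol):
    """Equivalent gas volume in litres at 0 degC and 1 atm."""
    return n_mol * R_GAS * 273.15 / 101325.0 * 1000.0

# Hypothetical reading: 10 kPa rise in a 100 mL headspace at 298 K.
n = moles_evolved(delta_p_pa=10_000.0, headspace_m3=1.0e-4, temp_k=298.0)
```

    In practice a real system would also correct for solvent vapour pressure and temperature drift; this sketch shows only the core calculation.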

  7. The Skin Cancer and Sun Knowledge (SCSK) Scale: Validity, Reliability, and Relationship to Sun-Related Behaviors among Young Western Adults

    ERIC Educational Resources Information Center

    Day, Ashley K.; Wilson, Carlene; Roberts, Rachel M.; Hutchinson, Amanda D.

    2014-01-01

    Increasing public knowledge remains one of the key aims of skin cancer awareness campaigns, yet diagnosis rates continue to rise. It is essential we measure skin cancer knowledge adequately so as to determine the nature of its relationship to sun-related behaviors. This study investigated the psychometric properties of a new measure of skin cancer…

  8. Spacetimes containing slowly evolving horizons

    SciTech Connect

    Kavanagh, William; Booth, Ivan

    2006-08-15

    Slowly evolving horizons are trapping horizons that are "almost" isolated horizons. This paper reviews their definition and discusses several spacetimes containing such structures. These include certain Vaidya and Tolman-Bondi solutions as well as (perturbatively) tidally distorted black holes. Taking into account the mass scales and orders of magnitude that arise in these calculations, we conjecture that slowly evolving horizons are the norm rather than the exception in astrophysical processes that involve stellar-scale black holes.

  9. Natural Selection Promotes Antigenic Evolvability

    PubMed Central

    Graves, Christopher J.; Ros, Vera I. D.; Stevenson, Brian; Sniegowski, Paul D.; Brisson, Dustin

    2013-01-01

    The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide an experimentally tractable system to test whether natural selection has favored mechanisms that increase evolvability. Many antigenic variation systems consist of paralogous unexpressed ‘cassettes’ that recombine into an expression site to rapidly alter the expressed protein. Importantly, the magnitude of antigenic change is a function of the genetic diversity among the unexpressed cassettes. Thus, evidence that selection favors among-cassette diversity is direct evidence that natural selection promotes antigenic evolvability. We used the Lyme disease bacterium, Borrelia burgdorferi, as a model to test the prediction that natural selection favors amino acid diversity among unexpressed vls cassettes and thereby promotes evolvability in a primary surface antigen, VlsE. The hypothesis that diversity among vls cassettes is favored by natural selection was supported in each B. burgdorferi strain analyzed using both classical (dN/dS ratios) and Bayesian population genetic analyses of genetic sequence data. This hypothesis was also supported by the conservation of highly mutable tandem-repeat structures across B. burgdorferi strains despite a near complete absence of sequence conservation. Diversification among vls cassettes due to natural selection and mutable repeat structures promotes long-term antigenic evolvability of VlsE. These findings provide a direct demonstration that molecular mechanisms that enhance evolvability of surface antigens are an evolutionary adaptation. The molecular evolutionary processes identified here can serve as a model for the evolution of antigenic evolvability in many pathogens that utilize similar strategies to establish chronic infections.
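
    The dN/dS (omega) statistic used in the classical analysis above has a simple form: the per-site rate of nonsynonymous substitution divided by the per-site rate of synonymous substitution, with omega > 1 taken as a signature of positive (diversifying) selection. A minimal sketch with hypothetical counts (not the study's data):

```python
# dN/dS (omega) from substitution and site counts. omega > 1 suggests
# positive selection; omega < 1 suggests purifying selection.

def omega(nonsyn_subs, nonsyn_sites, syn_subs, syn_sites):
    """Rate of nonsynonymous change relative to synonymous change."""
    dn = nonsyn_subs / nonsyn_sites
    ds = syn_subs / syn_sites
    return dn / ds

# Hypothetical example: 30 nonsynonymous changes over 300 nonsynonymous
# sites versus 5 synonymous changes over 100 synonymous sites.
w = omega(30, 300, 5, 100)
```

    Real analyses (e.g. Nei-Gojobori or codon-model methods) additionally correct for multiple hits and estimate site counts from the genetic code; the ratio itself is what the abstract's "dN/dS ratios" refer to.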

  10. On the Discovery of Evolving Truth

    PubMed Central

    Li, Yaliang; Li, Qi; Gao, Jing; Su, Lu; Zhao, Bo; Fan, Wei; Han, Jiawei

    2015-01-01

    In the era of big data, information regarding the same objects can be collected from increasingly more sources. Unfortunately, there usually exist conflicts among the information coming from different sources. To tackle this challenge, truth discovery, i.e., to integrate multi-source noisy information by estimating the reliability of each source, has emerged as a hot topic. In many real world applications, however, the information may come sequentially, and as a consequence, the truth of objects as well as the reliability of sources may be dynamically evolving. Existing truth discovery methods, unfortunately, cannot handle such scenarios. To address this problem, we investigate the temporal relations among both object truths and source reliability, and propose an incremental truth discovery framework that can dynamically update object truths and source weights upon the arrival of new data. Theoretical analysis is provided to show that the proposed method is guaranteed to converge at a fast rate. The experiments on three real world applications and a set of synthetic data demonstrate the advantages of the proposed method over state-of-the-art truth discovery methods. PMID:26705502
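
    The incremental idea in this abstract can be sketched in a few lines. The following is a simplified illustration, not the authors' algorithm: on each arriving batch, truths are estimated by a vote weighted by current source reliability, and each source's weight is then updated from its agreement with those estimates, with exponential decay so that reliability can drift over time. All function and variable names are hypothetical.

```python
# Minimal incremental truth-discovery sketch: weighted vote + decayed
# reliability update per batch of claims.

import math
from collections import defaultdict

def process_batch(claims, weights, decay=0.5, eps=1e-6):
    """claims: {object: {source: value}}; weights: {source: float}, updated in place."""
    truths = {}
    for obj, by_source in claims.items():
        score = defaultdict(float)
        for src, val in by_source.items():
            score[val] += weights.get(src, 1.0)
        truths[obj] = max(score, key=score.get)      # weighted majority vote
    for src in {s for v in claims.values() for s in v}:
        votes = [(obj, v[src]) for obj, v in claims.items() if src in v]
        err = sum(val != truths[obj] for obj, val in votes) / len(votes)
        new_w = -math.log(max(err, eps))             # reliable -> large weight
        weights[src] = decay * weights.get(src, 1.0) + (1 - decay) * new_w
    return truths

# One batch: source A agrees with the majority on both objects, B errs on "y".
batch = {"x": {"A": 1, "B": 1, "C": 2}, "y": {"A": 3, "B": 4, "C": 3}}
weights = {}
truths = process_batch(batch, weights)
```

    After the batch, the always-consistent source ends up with the largest weight, so later batches trust it more, which is the qualitative behaviour the abstract describes.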

  11. Proposed reliability cost model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends upon the use of a series of subsystem-oriented CERs and, where possible, CTRs in devising a suitable cost-effective policy.

  12. Network reliability

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1985-01-01

    Network control (or network management) functions are essential for efficient and reliable operation of a network. Some control functions are currently included as part of the Open System Interconnection model. For local area networks, it is widely recognized that there is a need for additional control functions, including fault isolation functions, monitoring functions, and configuration functions. These functions can be implemented in either a central or distributed manner. The Fiber Distributed Data Interface Medium Access Control and Station Management protocols provide an example of distributed implementation. Relative information is presented here in outline form.

  13. Robustness to Faults Promotes Evolvability: Insights from Evolving Digital Circuits

    PubMed Central

    Nolfi, Stefano

    2016-01-01

    We demonstrate how the need to cope with operational faults enables evolving circuits to find more fit solutions. The analysis of the results obtained in different experimental conditions indicates that, in the absence of faults, evolution tends to select circuits that are small and have low phenotypic variability and evolvability. The need to face operational faults, instead, drives evolution toward the selection of larger circuits that are truly robust with respect to genetic variations and that have a greater level of phenotypic variability and evolvability. Overall, our results indicate that the need to cope with operational faults leads to the selection of circuits that have a greater probability to generate better circuits as a result of genetic variation, with respect to a control condition in which circuits are not subjected to faults. PMID:27409589
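
    The experimental setup, evolving circuits whose evaluation injects faults, can be sketched with a toy genetic algorithm. This is an illustration of the methodology only (bitstring "circuits" matched against a target truth table, with a random bit flip injected per evaluation), not the paper's circuits or results; all parameters are hypothetical.

```python
# Toy GA with fault injection during fitness evaluation.

import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0]

def true_fitness(genome):
    """Fault-free score: number of output bits matching the target table."""
    return sum(g == t for g, t in zip(genome, TARGET))

def faulty_fitness(genome, rng, n_eval=3):
    """Average score over evaluations, each with one random injected fault."""
    total = 0
    for _ in range(n_eval):
        faulty = list(genome)
        faulty[rng.randrange(len(faulty))] ^= 1   # stuck-at / transient fault
        total += true_fitness(faulty)
    return total / n_eval

def evolve(generations=200, pop_size=30, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=lambda g: faulty_fitness(g, rng), reverse=True)
        parents = ranked[: pop_size // 2]         # truncation selection
        pop = list(parents)
        while len(pop) < pop_size:
            child = list(rng.choice(parents))
            child[rng.randrange(len(child))] ^= 1  # point mutation
            pop.append(child)
    return max(pop, key=true_fitness)

best = evolve()
```

    Because selection only ever sees the faulty score, genomes that tolerate a flipped bit are implicitly favoured, which is the selection pressure the abstract attributes to operational faults.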

  14. Evolving phenotypic networks in silico.

    PubMed

    François, Paul

    2014-11-01

    Evolved gene networks are constrained by natural selection. Their structures and functions are consequently far from random, as exemplified by the multiple instances of parallel/convergent evolution. One can thus ask whether features of actual gene networks can be recovered from evolutionary first principles. I review a method for in silico evolution of small models of gene networks aimed at performing predefined biological functions. I summarize the current implementation of the algorithm, focusing on the construction of a proper "fitness" function. I illustrate the approach on three examples: biochemical adaptation, ligand discrimination and vertebrate segmentation (somitogenesis). While the structure of the evolved networks is variable, their dynamics are usually constrained and share many features with actual gene networks, including properties that were not explicitly selected for. In silico evolution can thus be used to predict biological behaviours without a detailed knowledge of the mapping between genotype and phenotype. PMID:24956562

  15. The evolved function of the oedipal conflict.

    PubMed

    Josephs, Lawrence

    2010-08-01

    Freud based his oedipal theory on three clinical observations of adult romantic relationships: (1) Adults tend to split love and lust; (2) There tend to be sex differences in the ways that men and women split love and lust; (3) Adult romantic relationships are unconsciously structured by the dynamics of love triangles in which dramas of seduction and betrayal unfold. Freud believed that these aspects of adult romantic relationships were derivative expressions of a childhood oedipal conflict that has been repressed. Recent research conducted by evolutionary psychologists supports many of Freud's original observations and suggests that Freud's oedipal conflict may have evolved as a sexually selected adaptation for reproductive advantage. The evolution of bi-parental care based on sexually exclusive romantic bonds made humans vulnerable to the costs of sexual infidelity, a situation of danger that seriously threatens monogamous bonds. A childhood oedipal conflict enables humans to better adapt to this longstanding evolutionary problem by providing the child with an opportunity to develop working models of love triangles. On the one hand, the oedipal conflict facilitates monogamous resolutions by creating intense anxiety about the dangers of sexual infidelity and mate poaching. On the other hand, the oedipal conflict in humans may facilitate successful cheating and mate poaching by cultivating a talent for hiding our true sexual intentions from others and even from ourselves. The oedipal conflict in humans may be disguised by evolutionary design in order to facilitate tactical deception in adult romantic relationships. PMID:20840647

  16. Sequentially evolved bilateral epidural haematomas.

    PubMed

    Rochat, P; Johannesen, H H; Poulsgård, L; Bøgeskov, L

    2002-12-01

    Sequentially evolved bilateral epidural haematomas, where the second haematoma evolves after surgical removal of the first, are rarely reported. We report two cases of this entity. One patient was involved in a road traffic accident and the other suffered a head injury after an assault. CT scans showed that both patients had a unilateral epidural haematoma with a thin, presumably epidural, haemorrhage on the opposite side. Both patients were operated on for their epidural haematomas but did not improve after surgical treatment, and postoperative CT scans revealed an evolving epidural haematoma on the opposite side. After evacuation of the second epidural haematoma, both patients recovered quickly. Sequentially evolved bilateral epidural haematomas are rare but must be considered in the postoperative intensive care of patients with epidural haematomas. Both cases emphasize the need for intensive care monitoring after an operation for an epidural haematoma and the need for CT scans if the patient does not improve quickly after removal of the haematoma. This is especially important if a small contralateral haematoma is seen on the initial CT scan. PMID:12445923

  17. Slippery Texts and Evolving Literacies

    ERIC Educational Resources Information Center

    Mackey, Margaret

    2007-01-01

    The idea of "slippery texts" provides a useful descriptor for materials that mutate and evolve across different media. Eight adult gamers, encountering the slippery text "American McGee's Alice," demonstrate a variety of ways in which players attempt to manage their attention as they encounter a new text with many resonances. The range of their…

  18. Signing Apes and Evolving Linguistics.

    ERIC Educational Resources Information Center

    Stokoe, William C.

    Linguistics retains from its antecedents, philology and the study of sacred writings, some of their apologetic and theological bias. Thus it has not been able to face squarely the question how linguistic function may have evolved from animal communication. Chimpanzees' use of signs from American Sign Language forces re-examination of language…

  19. Diversity sustains an evolving network

    PubMed Central

    Mehrotra, Ravi; Soni, Vikram; Jain, Sanjay

    2009-01-01

    We study an evolutionary model of a complex system that evolves under catalytic dynamics and Darwinian selection and exhibits spontaneous growth, stasis and then a collapse of its structure. We find that the typical lifetime of the system increases sharply with the diversity of its components or species. We also find that the prime reason for crashes is a naturally occurring internal fragility of the system. This fragility is captured in the network organizational character and is related to a reduced multiplicity of pathways or feedback loops between its components. These results apply to several generalizations of the model as well. This work suggests new parameters for understanding the robustness of evolving molecular networks, ecosystems, societies and markets. PMID:19033136

  20. You 3.0: The Most Important Evolving Technology

    ERIC Educational Resources Information Center

    Tamarkin, Molly; Bantz, David A.; Childs, Melody; diFilipo, Stephen; Landry, Stephen G.; LoPresti, Frances; McDonald, Robert H.; McGuthry, John W.; Meier, Tina; Rodrigo, Rochelle; Sparrow, Jennifer; Diggs, D. Teddy; Yang, Catherine W.

    2010-01-01

    That technology evolves is a given. Not as well understood is the impact of technological evolution on each individual--on oneself, one's skill development, one's career, and one's relationship with the work community. The authors believe that everyone in higher education will become an IT worker and that IT workers will be managing a growing…

  1. Evolvable Hardware for Space Applications

    NASA Technical Reports Server (NTRS)

    Lohn, Jason; Globus, Al; Hornby, Gregory; Larchev, Gregory; Kraus, William

    2004-01-01

    This article surveys the research of the Evolvable Systems Group at NASA Ames Research Center. Over the past few years, our group has developed the ability to use evolutionary algorithms in a variety of NASA applications, ranging from spacecraft antenna design and fault tolerance for programmable logic chips to atomic force field parameter fitting, analog circuit design, and Earth-observing satellite scheduling. In some of these applications, evolutionary algorithms match or improve on human performance.

  2. When did oxygenic photosynthesis evolve?

    PubMed

    Buick, Roger

    2008-08-27

    The atmosphere has apparently been oxygenated since the 'Great Oxidation Event' ca 2.4 Ga ago, but when photosynthetic oxygen production began is debatable. However, geological and geochemical evidence from older sedimentary rocks indicates that oxygenic photosynthesis evolved well before this oxygenation event. Fluid-inclusion oils in ca 2.45 Ga sandstones contain hydrocarbon biomarkers evidently sourced from similarly ancient kerogen, preserved without subsequent contamination, and derived from organisms producing and requiring molecular oxygen. Mo and Re abundances and sulphur isotope systematics of slightly older (2.5 Ga) kerogenous shales record a transient pulse of atmospheric oxygen. As early as ca 2.7 Ga, stromatolites and biomarkers from evaporative lake sediments deficient in exogenous reducing power strongly imply that oxygen-producing cyanobacteria had already evolved. Even at ca 3.2 Ga, thick and widespread kerogenous shales are consistent with aerobic photoautotrophic marine plankton, and U-Pb data from ca 3.8 Ga metasediments suggest that this metabolism could have arisen by the start of the geological record. Hence, the hypothesis that oxygenic photosynthesis evolved well before the atmosphere became permanently oxygenated seems well supported. PMID:18468984

  3. Evolving Systems and Adaptive Key Component Control

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Balas, Mark J.

    2009-01-01

    We propose a new framework called Evolving Systems to describe the self-assembly, or autonomous assembly, of actively controlled dynamical subsystems into an Evolved System with a higher purpose. An introduction to Evolving Systems and exploration of the essential topics of the control and stability properties of Evolving Systems is provided. This chapter defines a framework for Evolving Systems, develops theory and control solutions for fundamental characteristics of Evolving Systems, and provides illustrative examples of Evolving Systems and their control with adaptive key component controllers.

  4. Bolometric Flux Estimation for Cool Evolved Stars

    NASA Astrophysics Data System (ADS)

    van Belle, Gerard T.; Creech-Eakman, Michelle J.; Ruiz-Velasco, Alma E.

    2016-07-01

    Estimation of bolometric fluxes (F_BOL) is an essential component of stellar effective temperature determination with optical and near-infrared interferometry. Reliable estimation of F_BOL simply from broadband K-band photometry data is a useful tool in those cases where contemporaneous and/or wide-range photometry is unavailable for a detailed spectral energy distribution (SED) fit, as was demonstrated in Dyck et al. Recalibrating the intrinsic F_BOL versus observed F_2.2μm relationship of that study with modern SED fitting routines, which incorporate the significantly non-blackbody, empirical spectral templates of the INGS spectral library (an update of the library in Pickles) and estimation of reddening, serves to greatly improve the accuracy and observational utility of this relationship. We find that the predicted F_BOL values are roughly 11% less than the corresponding values predicted in Dyck et al., indicating the effects of SED absorption features across bolometric flux curves.
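
    The reason F_BOL matters for interferometry is the standard relation between bolometric flux, limb-darkened angular diameter, and effective temperature: F_BOL = (sigma/4) * theta^2 * Teff^4, with theta in radians. A minimal sketch of that conversion (numerical inputs are hypothetical, not values from the paper):

```python
# Effective temperature from bolometric flux and angular diameter, and the
# inverse relation, via F_BOL = (sigma/4) * theta^2 * Teff^4.

import math

SIGMA = 5.670374e-8                              # Stefan-Boltzmann, W m^-2 K^-4
MAS_TO_RAD = math.pi / (180.0 * 3600.0 * 1000.0)  # milliarcseconds to radians

def teff_from_fbol(fbol_w_m2, theta_mas):
    theta = theta_mas * MAS_TO_RAD
    return (4.0 * fbol_w_m2 / (SIGMA * theta ** 2)) ** 0.25

def fbol_from_teff(teff_k, theta_mas):
    theta = theta_mas * MAS_TO_RAD
    return SIGMA / 4.0 * theta ** 2 * teff_k ** 4

# Hypothetical cool evolved star: Teff = 3500 K, theta = 5 mas.
f = fbol_from_teff(3500.0, 5.0)
```

    The round trip teff_from_fbol(fbol_from_teff(T, theta), theta) recovers T, which is a quick sanity check on the unit conversions.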

  5. Development and the evolvability of human limbs

    PubMed Central

    Young, Nathan M.; Wagner, Günter P.; Hallgrímsson, Benedikt

    2010-01-01

    The long legs and short arms of humans are distinctive for a primate, the result of selection acting in opposite directions on each limb at different points in our evolutionary history. This mosaic pattern challenges our understanding of the relationship of development and evolvability because limbs are serially homologous and genetic correlations should act as a significant constraint on their independent evolution. Here we test a developmental model of limb covariation in anthropoid primates and demonstrate that both humans and apes exhibit significantly reduced integration between limbs when compared to quadrupedal monkeys. This result indicates that fossil hominins likely escaped constraints on independent limb variation via reductions to genetic pleiotropy in an ape-like last common ancestor (LCA). This critical change in integration among hominoids, which is reflected in macroevolutionary differences in the disparity between limb lengths, facilitated selection for modern human limb proportions and demonstrates how development helps shape evolutionary change. PMID:20133636

  6. Reliability and Confidence.

    ERIC Educational Resources Information Center

    Test Service Bulletin, 1952

    1952-01-01

    Some aspects of test reliability are discussed. Topics covered are: (1) how high should a reliability coefficient be?; (2) two factors affecting the interpretation of reliability coefficients--range of talent and interval between testings; (3) some common misconceptions--reliability of speed tests, part vs. total reliability, reliability for what…

  7. A slowly evolving host moves first in symbiotic interactions

    NASA Astrophysics Data System (ADS)

    Damore, James; Gore, Jeff

    2011-03-01

    Symbiotic relationships, both parasitic and mutualistic, are ubiquitous in nature. Understanding how these symbioses evolve, from bacteria and their phages to humans and our gut microflora, is crucial in understanding how life operates. Often, symbioses consist of a slowly evolving host species with each host only interacting with its own sub-population of symbionts. The Red Queen hypothesis describes coevolutionary relationships as constant arms races with each species rushing to evolve an advantage over the other, suggesting that faster evolution is favored. Here, we use a simple game-theoretic model of host-symbiont coevolution that includes population structure to show that if the symbionts evolve much faster than the host, the equilibrium distribution is the same as it would be in a sequential game where the host moves first against its symbionts. For the slowly evolving host, this will prove to be advantageous in mutualisms and a handicap in antagonisms. The model allows for symbiont adaptation to its host, a result that is robust to changes in the parameters and generalizes to continuous and multiplayer games. Our findings provide insight into a wide range of symbiotic phenomena and help to unify the field of coevolutionary theory.
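
    The "host moves first" outcome described above is, in game-theoretic terms, a Stackelberg equilibrium: the fast-evolving symbiont effectively best-responds to whatever the slow host does, so the host can pick the action whose induced reply it likes best. A minimal sketch with hypothetical payoff matrices (not from the paper):

```python
# Sequential ("leader-follower") outcome of a 2x2 host-symbiont game:
# the symbiont best-responds to each host row; the host picks the row
# whose induced response yields it the highest payoff.

def stackelberg(host_payoff, symb_payoff):
    """Return (host_row, symbiont_col) of the host-moves-first outcome."""
    best = None
    for i, row in enumerate(symb_payoff):
        j = max(range(len(row)), key=row.__getitem__)   # symbiont's best reply
        if best is None or host_payoff[i][j] > host_payoff[best[0]][best[1]]:
            best = (i, j)
    return best

# Toy mutualism: host chooses rows, symbiont chooses columns.
host_p = [[3, 1],
          [4, 0]]
symb_p = [[2, 3],
          [1, 0]]
outcome = stackelberg(host_p, symb_p)
```

    In this toy matrix the host prefers row 1 even though its best single cell is elsewhere, because only row 1 induces a symbiont reply favourable to the host, which is the sense in which moving first can advantage the slow evolver.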

  8. Synchronization in an evolving network

    NASA Astrophysics Data System (ADS)

    Singh, R. K.; Bagarti, Trilochan

    2015-09-01

    In this work we study the dynamics of Kuramoto oscillators on a stochastically evolving network whose evolution is governed by the phases of the individual oscillators and degree distribution. Synchronization is achieved after a threshold connection density is reached. This cumulative effect of topology and dynamics has many real-world implications, where synchronization in a system emerges as a collective property of its components in a self-organizing manner. The synchronous state remains stable as long as the connection density remains above the threshold value, with additional links providing resilience against network fluctuations.
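
    The dynamics in this abstract are the Kuramoto model, dtheta_i/dt = omega_i + (K/N) * sum_j A_ij sin(theta_j - theta_i), with synchrony measured by the order parameter r = |mean(exp(i*theta))|. A minimal sketch on a fixed network (the paper's network co-evolves with the phases; here the topology is static and all parameter values are hypothetical):

```python
# Kuramoto oscillators on a fixed network, Euler-integrated, with the
# order parameter r measuring global phase synchrony (r ~ 1: synchronized).

import cmath
import math
import random

def order_parameter(thetas):
    return abs(sum(cmath.exp(1j * t) for t in thetas) / len(thetas))

def simulate(adj, omegas, K=4.0, dt=0.05, steps=2000, seed=1):
    rng = random.Random(seed)
    n = len(omegas)
    thetas = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    for _ in range(steps):
        new = []
        for i in range(n):
            coupling = sum(adj[i][j] * math.sin(thetas[j] - thetas[i])
                           for j in range(n))
            new.append(thetas[i] + dt * (omegas[i] + K / n * coupling))
        thetas = new
    return order_parameter(thetas)

N = 20
rng = random.Random(0)
omegas = [rng.gauss(0.0, 0.5) for _ in range(N)]
dense = [[1 if i != j else 0 for j in range(N)] for i in range(N)]
empty = [[0] * N for _ in range(N)]
r_dense = simulate(dense, omegas)
r_empty = simulate(empty, omegas)
```

    Comparing a fully connected network against one with no links illustrates the threshold effect the abstract describes: above a critical connection density (here, trivially, full connectivity) the coupled system locks into a synchronous state, while the uncoupled oscillators drift incoherently.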

  9. International Lead Zinc Research Organization-sponsored field-data collection and analysis to determine relationships between service conditions and reliability of valve-regulated lead-acid batteries in stationary applications

    NASA Astrophysics Data System (ADS)

    Taylor, P. A.; Moseley, P. T.; Butler, P. C.

    The International Lead Zinc Research Organization (ILZRO), in cooperation with Sandia National Laboratories, has initiated a multi-phase project with the following aims: to characterize relationships between valve-regulated lead-acid (VRLA) batteries, service conditions, and failure modes; to establish the degree of correlation between specific operating procedures and PCL (premature capacity loss); to identify operating procedures that mitigate PCL; to identify best fits between the operating requirements of specific applications and the capabilities of specific VRLA technologies; to recommend combinations of battery design, manufacturing processes, and operating conditions that enhance VRLA performance and reliability. In the first phase of this project, ILZRO has contracted with Energetics to identify and survey manufacturers and users of VRLA batteries for stationary applications (including electric utilities, telecommunications companies, and government facilities). The confidential survey is collecting the service conditions of specific applications and performance records for specific VRLA technologies. From the data collected, Energetics is constructing a database of the service histories and analyzing the data to determine trends in performance for particular technologies in specific service conditions. ILZRO plans to make the final report of the analysis and a version of the database (that contains no proprietary information) available to ILZRO members, participants in the survey, and participants in a follow-on workshop for stakeholders in VRLA reliability. This paper presents the surveys distributed to manufacturers and end-users, discusses the analytic approach, presents an overview of the responses to the surveys and trends that have emerged in the early analysis of the data, and previews the functionality of the database being constructed.

  10. canEvolve: A Web Portal for Integrative Oncogenomics

    PubMed Central

    Yan, Zhenyu; Wang, Xujun; Cao, Qingyi; Munshi, Nikhil C.; Li, Cheng

    2013-01-01

    Background & Objective Genome-wide profiles of tumors obtained using functional genomics platforms are being deposited to the public repositories at an astronomical scale, as a result of focused efforts by individual laboratories and large projects such as the Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium. Consequently, there is an urgent need for reliable tools that integrate and interpret these data in light of current knowledge and disseminate results to biomedical researchers in a user-friendly manner. We have built the canEvolve web portal to meet this need. Results canEvolve query functionalities are designed to fulfill most frequent analysis needs of cancer researchers with a view to generate novel hypotheses. canEvolve stores gene, microRNA (miRNA) and protein expression profiles, copy number alterations for multiple cancer types, and protein-protein interaction information. canEvolve allows querying of results of primary analysis, integrative analysis and network analysis of oncogenomics data. The querying for primary analysis includes differential gene and miRNA expression as well as changes in gene copy number measured with SNP microarrays. canEvolve provides results of integrative analysis of gene expression profiles with copy number alterations and with miRNA profiles as well as generalized integrative analysis using gene set enrichment analysis. The network analysis capability includes storage and visualization of gene co-expression, inferred gene regulatory networks and protein-protein interaction information. Finally, canEvolve provides correlations between gene expression and clinical outcomes in terms of univariate survival analysis. Conclusion At present canEvolve provides different types of information extracted from 90 cancer genomics studies comprising of more than 10,000 patients. 
The presence of multiple data types, novel integrative analysis for identifying regulators of oncogenesis, network analysis and ability
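The univariate survival analysis mentioned in this abstract can be illustrated with a minimal Kaplan-Meier estimator. This is a sketch of the general technique, not the portal's implementation; the follow-up times and censoring flags below are invented.

```python
# Minimal Kaplan-Meier survival curve (illustrative sketch only).
def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = death observed, 0 = censored.
    Returns (time, survival) pairs at each observed event."""
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    s = 1.0
    curve = []
    for t, e in pairs:
        if e:
            # Ties are handled one event at a time; the running product
            # equals the usual (1 - d/n) form at each distinct event time.
            s *= (at_risk - 1) / at_risk
            curve.append((t, s))
        at_risk -= 1
    return curve

# Invented cohort: times in months, with two censored patients.
curve = kaplan_meier([5, 8, 12, 12, 15, 20], [1, 0, 1, 1, 0, 1])
print(curve)
```

Correlating such curves with gene expression (e.g. splitting patients by high vs. low expression) is the kind of univariate analysis the portal exposes.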

  11. HIV and HLA Class I: an evolving relationship

    PubMed Central

    Goulder, Philip J.R.; Walker, Bruce D

    2014-01-01

Successful vaccine development for infectious diseases has largely been achieved in settings where natural immunity to the pathogen results in clearance in at least some individuals. HIV presents an additional challenge in that natural clearance of infection does not occur, and the correlates of immune protection are still uncertain. However, partial control of viremia and markedly different outcomes of disease are observed in HIV infected persons. Here we examine the antiviral mechanisms implicated by one variable that has been consistently associated with extremes of outcome, namely HLA class I alleles, and in particular HLA-B, the mechanisms by which this modulation is likely to occur, and the impact of these interactions on the evolution of the virus and the host. Studies to date provide evidence for both HLA-dependent and epitope-dependent influences on viral control and viral evolution, and have important implications for the continued quest for an effective HIV vaccine. PMID:22999948

  12. The Evolving Relationship between Researchers and Public Policy

    ERIC Educational Resources Information Center

    Henig, Jeffrey R.

    2008-01-01

    When it comes to the role of research in shaping public policy and debate, one might reasonably argue that this is the best of times. No Child Left Behind (NCLB), with its frequent mention of evidence-based decision making, has underscored the role that objective knowledge should play in a democratic society. The Institute of Education Sciences,…

  13. Primordial evolvability: Impasses and challenges.

    PubMed

    Vasas, Vera; Fernando, Chrisantha; Szilágyi, András; Zachár, István; Santos, Mauro; Szathmáry, Eörs

    2015-09-21

While it is generally agreed that some kind of replicating non-living compounds were the precursors of life, there is much debate over their possible chemical nature. Metabolism-first approaches propose that mutually catalytic sets of simple organic molecules could be capable of self-replication and rudimentary chemical evolution. In particular, the graded autocatalysis replication domain (GARD) model, depicting assemblies of amphiphilic molecules, has received considerable interest. The system propagates compositional information across generations and is suggested to be a target of natural selection. However, evolutionary simulations indicate that the system lacks selectability (i.e. selection has negligible effect on the equilibrium concentrations). We elaborate on the lessons learnt from the example of the GARD model and, more widely, on the issue of evolvability, and discuss the implications for similar metabolism-first scenarios. We found that simple incorporation-type chemistry based on non-covalent bonds, as assumed in GARD, is unlikely to result in alternative autocatalytic cycles when catalytic interactions are randomly distributed. An even more serious problem stems from the lognormal distribution of catalytic factors, causing inherent kinetic instability of such loops, due to the dominance of efficiently catalyzed components that fail to return catalytic aid. Accordingly, the dynamics of the GARD model is dominated by strongly catalytic, but not auto-catalytic, molecules. Without effective autocatalysis, stable hereditary propagation is not possible. Many repetitions and different scalings of the model offer no rescue. Despite all attempts to show the contrary, the GARD model is not evolvable, in contrast to reflexively autocatalytic networks, complemented by rare uncatalyzed reactions and compartmentation.
The latter networks, resting on the creation and breakage of chemical bonds, can generate novel ('mutant') autocatalytic loops from a given set of

  14. Isotopic Analysis and Evolved Gases

    NASA Technical Reports Server (NTRS)

    Swindle, Timothy D.; Boynton, William V.; Chutjian, Ara; Hoffman, John H.; Jordan, Jim L.; Kargel, Jeffrey S.; McEntire, Richard W.; Nyquist, Larry

    1996-01-01

Precise measurements of the chemical, elemental, and isotopic composition of planetary surface material and gases, and observed variations in these compositions, can contribute significantly to our knowledge of the source(s), ages, and evolution of solar system materials. The analyses discussed in this paper are mostly made by mass spectrometers or some other type of mass analyzer, and address three broad areas of interest: (1) atmospheric composition - isotopic, elemental, and molecular, (2) gases evolved from solids, and (3) solids. Current isotopic data on nine elements, mostly from in situ analysis but also from meteorites and telescopic observations, are summarized. Potential instruments for isotopic analysis of the surfaces of the Moon, Mars, Venus, Mercury, and Pluto, as well as of asteroids, comets, and icy satellites, are discussed.

  15. Drastic events make evolving networks

    NASA Astrophysics Data System (ADS)

    Ausloos, M.; Lambiotte, R.

    2007-05-01

Co-authorship networks of neighbouring scientific disciplines, i.e. granular (G) media and networks (N), are studied in order to observe drastic structural changes in evolving networks. The data is taken from arXiv. The system is described as coupled networks. By considering the 1995-2005 time interval and scanning the author-article network evolution with a mobile time window, we focus on the properties of the links, as well as on the time evolution of the nodes. They can be in three states, N, G or multi-disciplinary (M). This leads to drastic jumps in a so-called order parameter, i.e. the link proportion of a given type forming the main island, reminiscent of features appearing at percolation and during metastable (aggregation-disaggregation) processes. The data analysis also focuses on the way different kinds (N, G or M) of authors collaborate, and on the kind of the resulting collaboration.

  16. Speech processing: An evolving technology

    SciTech Connect

    Crochiere, R.E.; Flanagan, J.L.

    1986-09-01

As we enter the information age, speech processing is emerging as an important technology for making machines easier and more convenient for humans to use. It is both an old and a new technology - dating back to the invention of the telephone and forward, at least in aspirations, to the capabilities of HAL in 2001. Explosive advances in microelectronics now make it possible to implement economical real-time hardware for sophisticated speech processing - processing that formerly could be demonstrated only in simulations on main-frame computers. As a result, fundamentally new product concepts - as well as new features and functions in existing products - are becoming possible and are being explored in the marketplace. As the introductory piece to this issue, the authors draw a brief perspective on the evolving field of speech processing and assess the technology in the three constituent sectors: speech coding, synthesis, and recognition.

  17. Planets in Evolved Binary Systems

    NASA Astrophysics Data System (ADS)

    Perets, Hagai B.

    2011-03-01

Exo-planets are typically thought to form in protoplanetary disks left over from the protostellar disk of their newly formed host star. However, additional planetary formation and evolution routes may exist in old evolved binary systems. Here we discuss the implications of binary stellar evolution for planetary systems in such environments. In these binary systems stellar evolution could lead to the formation of symbiotic stars, where mass is lost from one star and could be transferred to its binary companion, possibly forming an accretion disk around it. This raises the possibility that such a disk could provide the necessary environment for the formation of a new, second generation of planets in either circumstellar or circumbinary configurations. Pre-existing first generation planets surviving the post-MS evolution of such systems would be dynamically affected by the mass loss in the systems and may also interact with the newly formed disk. Such planets and/or planetesimals may also serve as seeds for the formation of the second generation planets, and/or interact with them, possibly forming atypical planetary systems. Second generation planetary systems should typically be found in white dwarf binary systems, and may show various observational signatures. Most notably, second generation planets could form in environments which are inaccessible, or less favorable, for first generation planets. The orbital phase space available for the second generation planets could be forbidden (in terms of the system stability) to first generation planets in the pre-evolved progenitor binaries. In addition, planets could form in metal poor environments such as globular clusters and/or in double compact object binaries. Observations of exo-planets in such forbidden or unfavorable regions could possibly serve to uniquely identify their second generation character. Finally, we point out a few observed candidate second generation planetary systems, including Gl 86, HD 27442 and all of the

  18. Evolving toward Laughter in Learning

    ERIC Educational Resources Information Center

    Strean, William B.

    2008-01-01

    Lowman (1995) described the relationship between teacher and student and student engagement as the two most important ingredients in learning in higher education. Humour builds teacher-student connection (Berk, 1998) and engages students in the learning process. The bond between student and teacher is essential for learning, satisfaction, and…

  19. Carl Thoresen: The Evolving Pioneer

    ERIC Educational Resources Information Center

    Harris, Alex H. S.

    2009-01-01

    This interview with Carl E. Thoresen highlights the experiences, relationships, and ideas that have influenced this pioneering psychologist throughout the past half century. His scholarly work, professional service, teaching, and mentorship have motivated many counseling psychologists to radically expand their areas of inquiry. He was among the…

  20. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  1. Reliability model generator

    NASA Technical Reports Server (NTRS)

    McMann, Catherine M. (Inventor); Cohen, Gerald C. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
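The aggregation step described in this abstract can be sketched with the standard series/parallel composition rules for component reliabilities. This is an illustrative toy under assumed component names and values, not the patented generator itself.

```python
# Series/parallel aggregation of low-level reliability models (sketch).

def series(*rs):
    """System works only if every component in the chain works."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(*rs):
    """System works if at least one redundant component works."""
    q = 1.0
    for r in rs:
        q *= (1.0 - r)
    return 1.0 - q

# Hypothetical architecture description: two redundant pumps feeding a
# single controller (pump and controller reliabilities are invented).
pump, controller = 0.95, 0.99
system_r = series(parallel(pump, pump), controller)
print(round(system_r, 4))  # 0.9875
```

A real generator would walk the system architecture description and apply these rules (plus standby and repair models) recursively.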

  2. How do drumlin patterns evolve?

    NASA Astrophysics Data System (ADS)

    Ely, Jeremy; Clark, Chris; Spagnolo, Matteo; Hughes, Anna

    2016-04-01

The flow of a geomorphic agent over a sediment bed creates patterns in the substrate composed of bedforms. Ice is no exception to this, organising soft sedimentary substrates into subglacial bedforms. As we are yet to fully observe their initiation and evolution beneath a contemporary ice mass, little is known about how patterns in subglacial bedforms develop. Here we study 36,222 drumlins, divided into 72 flowsets, left behind by the former British-Irish Ice Sheet. These flowsets provide us with 'snapshots' of drumlin pattern development. The probability distribution functions of the size and shape metrics of drumlins within these flowsets were analysed to determine whether behaviour that is common to other patterned phenomena has occurred. Specifically, we ask whether drumlins i) are printed at a specific scale; ii) grow or shrink after they initiate; iii) stabilise at a specific size and shape; and iv) migrate. Our results indicate that drumlins initiate at a minimum size and spacing. After initiation, the log-normal distribution of drumlin size and shape metrics suggests that drumlins grow, or possibly shrink, as they develop. We find no evidence for stabilisation in drumlin length, supporting the idea of a subglacial bedform continuum. Drumlin migration is difficult to determine from the palaeo-record. However, there are some indications that a mixture of static and mobile drumlins occurs, which could potentially lead to collisions, cannibalisation and coarsening. Further images of modern drumlin fields evolving beneath ice are required to capture stages of drumlin pattern evolution.
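The log-normal fit underlying the analysis above can be sketched by taking logs of the size metric and estimating normal parameters. The drumlin lengths below are invented for illustration; this is a sketch of the general technique, not the authors' code.

```python
# Fit a log-normal to a size metric by estimating normal parameters on logs.
import math
import statistics

lengths_m = [180, 220, 260, 310, 400, 520, 700, 950]  # hypothetical lengths

logs = [math.log(x) for x in lengths_m]
mu = statistics.fmean(logs)     # log-scale mean
sigma = statistics.stdev(logs)  # log-scale standard deviation

# Median and mode of the fitted log-normal; a mode well above zero is
# consistent with a minimum initiation size rather than scale-free printing.
median = math.exp(mu)
mode = math.exp(mu - sigma**2)
print(median > mode > 0)  # True for any sigma > 0
```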

  3. Magnetic fields around evolved stars

    NASA Astrophysics Data System (ADS)

    Leal-Ferreira, M.; Vlemmings, W.; Kemball, A.; Amiri, N.; Maercker, M.; Ramstedt, S.; Olofsson, G.

    2014-04-01

A number of mechanisms, such as magnetic fields, (binary) companions and circumstellar disks have been suggested to be the cause of non-spherical PNe and in particular collimated outflows. This work investigates one of these mechanisms: the magnetic fields. While MHD simulations show that the fields can indeed be important, few observations of magnetic fields have been done so far. We used the VLBA to observe five evolved stars, with the goal of detecting the magnetic field by means of water maser polarization. The sample consists of four AGB stars (IK Tau, RT Vir, IRC+60370 and AP Lyn) and one pPN (OH231.8+4.2). In four of the five sources, several strong maser features were detected allowing us to measure the linear and/or circular polarization. Based on the circular polarization detections, we infer the strength of the component of the field along the line of sight to be between ~30 mG and ~330 mG in the water maser regions of these four sources. When extrapolated to the surface of the stars, the magnetic field strength would be between a few hundred mG and a few Gauss when assuming a toroidal field geometry and higher when assuming more complex magnetic fields. We conclude that the magnetic energy we derived in the water maser regions is higher than the thermal and kinetic energy, leading to the conclusion that, indeed, magnetic fields probably play an important role in shaping Planetary Nebulae.

  4. Submillimeter observations of evolved stars

    NASA Technical Reports Server (NTRS)

    Sopka, R. J.; Hildebrand, R.; Jaffe, D. T.; Gatley, I.; Roellig, T.

    1985-01-01

Broadband submillimeter observations of thermal emission from several evolved stars have been obtained using the United Kingdom Infrared Telescope on Mauna Kea, Hawaii. The observations were carried out at an effective wavelength of 400 microns in order to estimate the mass loss rates in dust from the stars. Direct estimates of mass loss rates are in the range 10^-9 to 10^-6 solar mass/yr. Analysis of the spectrum of IRC + 10216 confirmed previous estimates of dust grain emissivity in the range 10-1000 microns. The infrared properties of IRC + 10216 are found to be similar to the carbon rich object CRL 3068. No systematic difference was found between the dust masses of carbon rich and oxygen rich envelopes. The largest mass loss rates in dust were obtained for the bipolar objects OH 231.8 + 4.2, CRL 2688, CRL 618, and NGC 7027. It is suggested that the ratios of gas to dust, and the slopes of the far infrared to submillimeter wavelength continua of these objects, are probably representative of amorphous rather than crystalline grains.

  5. Multiscale modelling of evolving foams

    NASA Astrophysics Data System (ADS)

    Saye, R. I.; Sethian, J. A.

    2016-06-01

    We present a set of multi-scale interlinked algorithms to model the dynamics of evolving foams. These algorithms couple the key effects of macroscopic bubble rearrangement, thin film drainage, and membrane rupture. For each of the mechanisms, we construct consistent and accurate algorithms, and couple them together to work across the wide range of space and time scales that occur in foam dynamics. These algorithms include second order finite difference projection methods for computing incompressible fluid flow on the macroscale, second order finite element methods to solve thin film drainage equations in the lamellae and Plateau borders, multiphase Voronoi Implicit Interface Methods to track interconnected membrane boundaries and capture topological changes, and Lagrangian particle methods for conservative liquid redistribution during rearrangement and rupture. We derive a full set of numerical approximations that are coupled via interface jump conditions and flux boundary conditions, and show convergence for the individual mechanisms. We demonstrate our approach by computing a variety of foam dynamics, including coupled evolution of three-dimensional bubble clusters attached to an anchored membrane and collapse of a foam cluster.

  6. Circumstellar Crystalline Silicates: Evolved Stars

    NASA Astrophysics Data System (ADS)

    Tartar, Josh; Speck, A. K.

    2008-05-01

One of the most exciting developments in astronomy in the last 15 years was the discovery of crystalline silicate stardust by the Short Wavelength Spectrometer (SWS) on board ISO; discovery of the crystalline grains was indeed one of the biggest surprises of the ISO mission. Initially discovered around AGB stars (evolved stars in the range 0.8 < M/M☉ < 8) at far-infrared (IR) wavelengths, crystalline silicates have since been seen in many astrophysical environments including young stellar objects (T Tauri and Herbig Ae/Be), comets and Ultra Luminous Infrared Galaxies. Low and intermediate mass stars (LIMS) comprise 95% of the contributors to the ISM, so study of the formation of crystalline silicates is critical to our understanding of the ISM, which is thought to be primarily amorphous (one would expect an almost exact match between the composition of AGB dust shells and the dust in the ISM). Whether the crystalline dust is merely undetectable or amorphized remains a mystery. The FORCAST instrument on SOFIA as well as the PACS instrument on Herschel will provide exciting observing opportunities for the further study of crystalline silicates.

  7. Reliability Generalization: "Lapsus Linguae"

    ERIC Educational Resources Information Center

    Smith, Julie M.

    2011-01-01

    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  8. Exploring Evolving Media Discourse Through Event Cueing.

    PubMed

    Lu, Yafeng; Steptoe, Michael; Burke, Sarah; Wang, Hong; Tsai, Jiun-Yi; Davulcu, Hasan; Montgomery, Douglas; Corman, Steven R; Maciejewski, Ross

    2016-01-01

Online news, microblogs and other media documents all contain valuable insight regarding events and responses to events. Underlying these documents is the concept of framing, a process in which communicators act (consciously or unconsciously) to construct a point of view that encourages facts to be interpreted by others in a particular manner. As media discourse evolves, how topics and documents are framed can undergo change, shifting the discussion to different viewpoints or rhetoric. What causes these shifts can be difficult to determine directly; however, by linking secondary datasets and enabling visual exploration, we can enhance the hypothesis generation process. In this paper, we present a visual analytics framework for event cueing using media data. As discourse develops over time, our framework applies a time series intervention model which tests whether the level of framing differs before and after a given date. If the model indicates that the times before and after are statistically significantly different, this cues an analyst to explore related datasets to help enhance their understanding of what (if any) events may have triggered these changes in discourse. Our framework consists of entity extraction and sentiment analysis as lenses for data exploration and uses two different models for intervention analysis. To demonstrate the usage of our framework, we present a case study on exploring potential relationships between climate change framing and conflicts in Africa. PMID:26529702
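The before/after intervention test described in this abstract can be sketched with a simple two-sample comparison of framing levels split at a candidate event date. The series and the cueing threshold below are illustrative assumptions, not the paper's actual model.

```python
# Compare framing levels before and after an event date (Welch's t, sketch).
import statistics

def welch_t(before, after):
    """Welch's t statistic for the difference in means of two samples."""
    m1, m2 = statistics.fmean(before), statistics.fmean(after)
    v1, v2 = statistics.variance(before), statistics.variance(after)
    n1, n2 = len(before), len(after)
    return (m2 - m1) / ((v1 / n1 + v2 / n2) ** 0.5)

# Invented daily counts of documents using a given frame, split at the date.
before = [3, 4, 2, 5, 3, 4, 3, 4]
after = [8, 9, 7, 10, 9, 8, 9, 11]

t = welch_t(before, after)
print(t > 2.0)  # a large |t| would cue the analyst to inspect linked datasets
```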

  9. Evolving the ingredients for reciprocity and spite

    PubMed Central

    Hauser, Marc; McAuliffe, Katherine; Blake, Peter R.

    2009-01-01

    Darwin never provided a satisfactory account of altruism, but posed the problem beautifully in light of the logic of natural selection. Hamilton and Williams delivered the necessary satisfaction by appealing to kinship, and Trivers showed that kinship was not necessary as long as the originally altruistic act was conditionally reciprocated. From the late 1970s to the present, the kinship theories in particular have been supported by considerable empirical data and elaborated to explore a number of other social interactions such as cooperation, selfishness and punishment, giving us what is now a rich description of the nature of social relationships among organisms. There are, however, two forms of theoretically possible social interactions—reciprocity and spite—that appear absent or nearly so in non-human vertebrates, despite considerable research efforts on a wide diversity of species. We suggest that the rather weak comparative evidence for these interactions is predicted once we consider the requisite socioecological pressures and psychological mechanisms. That is, a consideration of ultimate demands and proximate prerequisites leads to the prediction that reciprocity and spite should be rare in non-human animals, and common in humans. In particular, reciprocity and spite evolved in humans because of adaptive demands on cooperation among unrelated individuals living in large groups, and the integrative capacities of inequity detection, future-oriented decision-making and inhibitory control. PMID:19805432

  10. Voyages Through Time: Everything Evolves

    NASA Astrophysics Data System (ADS)

    Pendleton, Y. J.; Tarter, J. C.; DeVore, E. K.; O'Sullivan, K. A.; Taylor, S. M.

    2001-12-01

Evolutionary change is a powerful framework for studying our world and our place therein. It is a recurring theme in every realm of science: over time, the universe, the planet Earth, life, and human technologies all change, albeit on vastly different scales. Evolution offers scientific explanations for the age-old question, "Where did we come from?" In addition, historical perspectives of science show how our understanding has evolved over time. The complexities of all of these systems will never reveal a "finished" story. But it is a story of epic size, capable of inspiring awe and of expanding our sense of time and place, and eminently worthy of investigating. This story is the basis of Voyages Through Time. Voyages Through Time (VTT) provides teachers with not only background science content and pedagogy, but also with materials and resources for the teaching of evolution. The six modules, Cosmic Evolution, Planetary Evolution, Origin of Life, Evolution of Life, Hominid Evolution, and Evolution of Technology, emphasize student inquiry, and promote the nature of science, as recommended in the NSES and BSL. The modules are unified by the overarching theme of evolution and the meta questions: "What is changing?" "What is the rate of change?" and "What is the mechanism of change?" Determination of student outcomes for the project required effective collaboration of scientists, teachers, students and media specialists. The broadest curricular student outcomes are 1) an enjoyment of science, 2) an understanding of the nature of science, especially the understanding of evidence and re-evaluation, and 3) key science content.
The curriculum is being developed by the SETI Institute, NASA Ames Research Center, California Academy of Sciences, and San Francisco State University, and is funded by the NSF (IMD 9730693), with support from Hewlett-Packard Company, The Foundation for Microbiology, Combined Federated Charities, NASA Astrobiology Institute, and NASA Fundamental

  11. Submillimeter observations of evolved stars

    SciTech Connect

    Sopka, R.J.; Hildebrand, R.; Jaffe, D.T.; Gatley, I.; Roellig, T.; Werner, M.; Jura, M.; Zuckerman, B.

    1985-07-01

Broad-band submillimeter observations of the thermal emission from evolved stars have been obtained with the United Kingdom Infrared Telescope on Mauna Kea, Hawaii. These observations, at an effective wavelength of 400 μm, provide the most direct method for estimating the mass loss rate in dust from these stars and also help to define the long-wavelength thermal spectrum of the dust envelopes. The mass loss rates in dust that we derive range from 10^-9 to 10^-6 M_sun yr^-1 and are compared with mass loss rates derived from molecular line observations to estimate gas-to-dust ratios in outflowing envelopes. These values are found to be generally compatible with the interstellar gas-to-dust ratio of approximately 100 if submillimeter emissivities appropriate to amorphous grain structures are assumed. Our analysis of the spectrum of IRC+10216 confirms previous suggestions that the grain emissivity varies as λ^-1.2 rather than as λ^-2 for 10

  12. Evolving issues in surrogate motherhood.

    PubMed

    Erlen, J A; Holzman, I R

    1990-01-01

    Surrogate mothering is an arrangement whereby a woman who gives birth to an infant intends--through a contractual agreement--to give that baby to another couple. The recent Baby M case in the United States has raised numerous legal concerns causing many legislative bodies to consider possible statutes to regulate or prohibit surrogacy. The competing interests among and between the individuals involved in this relationship (i.e., the surrogate mother, the couple, the baby, and society) suggest various ethical issues related to benefits, risks, and autonomy. Legal and ethical concerns surrounding the technologically possible procedure of surrogate motherhood are discussed. PMID:2391288

  13. Can There Be Reliability without "Reliability?"

    ERIC Educational Resources Information Center

    Mislevy, Robert J.

    2004-01-01

    An "Educational Researcher" article by Pamela Moss (1994) asks the title question, "Can there be validity without reliability?" Yes, she answers, if by reliability one means "consistency among independent observations intended as interchangeable" (Moss, 1994, p. 7), quantified by internal consistency indices such as KR-20 coefficients and…

  14. HELIOS Critical Design Review: Reliability

    NASA Technical Reports Server (NTRS)

    Benoehr, H. C.; Herholz, J.; Prem, H.; Mann, D.; Reichert, L.; Rupp, W.; Campbell, D.; Boettger, H.; Zerwes, G.; Kurvin, C.

    1972-01-01

This paper presents the Helios Critical Design Review Reliability from October 16-20, 1972. The topics include: 1) Reliability Requirement; 2) Reliability Apportionment; 3) Failure Rates; 4) Reliability Assessment; 5) Reliability Block Diagram; and 6) Reliability Information Sheet.

  15. Compound estimation procedures in reliability

    NASA Technical Reports Server (NTRS)

    Barnes, Ron

    1990-01-01

    At NASA, components and subsystems of components in the Space Shuttle and Space Station generally go through a number of redesign stages. While data on failures for various design stages are sometimes available, the classical procedures for evaluating reliability only utilize the failure data on the present design stage of the component or subsystem. Often, few or no failures have been recorded on the present design stage. Previously, Bayesian estimators for the reliability of a single component, conditioned on the failure data for the present design, were developed. These new estimators permit NASA to evaluate the reliability, even when few or no failures have been recorded. Point estimates for the latter evaluation were not possible with the classical procedures. Since different design stages of a component (or subsystem) generally have a good deal in common, the development of new statistical procedures for evaluating the reliability, which consider the entire failure record for all design stages, has great intuitive appeal. A typical subsystem consists of a number of different components and each component has evolved through a number of redesign stages. The present investigations considered compound estimation procedures and related models. Such models permit the statistical consideration of all design stages of each component and thus incorporate all the available failure data to obtain estimates for the reliability of the present version of the component (or subsystem). A number of models were considered to estimate the reliability of a component conditioned on its total failure history from two design stages. It was determined that reliability estimators for the present design stage, conditioned on the complete failure history for two design stages have lower risk than the corresponding estimators conditioned only on the most recent design failure data. Several models were explored and preliminary models involving bivariate Poisson distribution and the
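The idea of pooling failure data across design stages can be sketched with a conjugate Gamma-Poisson update: the earlier stage's failure record informs the prior for the present stage, which is then updated with the present (possibly zero-failure) data. This is an illustrative sketch of that class of estimator, not NASA's actual model; all counts, exposure times, and the weak initial prior are invented.

```python
# Gamma-Poisson pooling of failure data across two design stages (sketch).

def gamma_posterior(alpha, beta, failures, exposure):
    """Conjugate update: Gamma(alpha, beta) prior, Poisson failure counts."""
    return alpha + failures, beta + exposure

# Stage 1: 3 failures in 2000 hours -> informs the prior for stage 2.
alpha, beta = gamma_posterior(0.5, 0.0, failures=3, exposure=2000.0)

# Stage 2 (present design): no failures yet in 1500 hours of test.
alpha, beta = gamma_posterior(alpha, beta, failures=0, exposure=1500.0)

rate_estimate = alpha / beta       # posterior mean failure rate per hour
print(rate_estimate < 3 / 2000.0)  # pooled history tempers the estimate
```

Unlike the classical point estimate, this remains defined even when the present stage has recorded no failures at all, which is the situation the abstract highlights.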

  16. The Problem of Evolving a Genetic Code

    ERIC Educational Resources Information Center

    Woese, Carl R.

    1970-01-01

    Proposes models for the evolution of the genetic code and translation mechanisms. Suggests that the translation process is so complex and precise that it must have evolved in many stages, and that the evolution of the code was influenced by the constraints imposed by the evolving translation mechanism. (EB)

  17. What Technology? Reflections on Evolving Services

    ERIC Educational Resources Information Center

    Collins, Sharon

    2009-01-01

    Each year, the members of the EDUCAUSE Evolving Technologies Committee identify and research the evolving technologies that are having--or are predicted to have--the most direct impact on higher education institutions. The committee members choose the relevant topics, write white papers, and present their findings at the EDUCAUSE annual…

  18. Reliability computation from reliability block diagrams

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.; Eckstein, R. E.

    1971-01-01

    A method and a computer program are presented to calculate probability of system success from an arbitrary reliability block diagram. The class of reliability block diagrams that can be handled include any active/standby combination of redundancy, and the computations include the effects of dormancy and switching in any standby redundancy. The mechanics of the program are based on an extension of the probability tree method of computing system probabilities.
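A toy version of the probability-tree idea described above enumerates all component up/down states and sums the probability of those states in which the block diagram's success predicate holds (this brute force works for small diagrams; the component names, values, and diagram are illustrative, and dormancy/switching effects are omitted).

```python
# System success probability from a reliability block diagram by state
# enumeration (2^n states; a sketch, not the paper's program).
from itertools import product

def system_reliability(reliabilities, success):
    """reliabilities: dict name -> P(up); success: predicate on up-state dict."""
    total = 0.0
    names = list(reliabilities)
    for states in product([True, False], repeat=len(names)):
        up = dict(zip(names, states))
        p = 1.0
        for n in names:
            p *= reliabilities[n] if up[n] else 1.0 - reliabilities[n]
        if success(up):
            total += p
    return total

# A in series with the parallel pair (B, C).
r = system_reliability(
    {"A": 0.9, "B": 0.8, "C": 0.8},
    lambda up: up["A"] and (up["B"] or up["C"]),
)
print(round(r, 4))  # 0.9 * (1 - 0.2**2) = 0.864
```

The predicate can encode any active-redundancy block diagram; standby redundancy with dormancy, as handled by the program in the paper, needs additional state beyond up/down.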

  19. Power electronics reliability analysis.

    SciTech Connect

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
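In the fault-tree approach the report describes, the same logic is written in terms of failure probabilities, with OR/AND gates combining basic events into a top event. A minimal sketch under an independence assumption (the tree and probabilities below are invented, not those of the report's fictitious device):

```python
def or_gate(*p):
    # Top event occurs if any input event occurs (independent events).
    q = 1.0
    for x in p:
        q *= (1.0 - x)
    return 1.0 - q

def and_gate(*p):
    # Top event occurs only if all input events occur.
    out = 1.0
    for x in p:
        out *= x
    return out

# Hypothetical tree: loss of output if the switching stage fails OR
# both redundant gate drivers fail (per-mission failure probabilities).
p_top = or_gate(0.001, and_gate(0.01, 0.01))
print(round(p_top, 7))  # → 0.0010999; system reliability is 1 - p_top
```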

  20. Human Reliability Program Overview

    SciTech Connect

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  1. Reliable Design Versus Trust

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following questions are addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges of verifying a reliable design versus a trusted design?

  2. Evolvable Cryogenics (ECRYO) Pressure Transducer Calibration Test

    NASA Technical Reports Server (NTRS)

    Diaz, Carlos E., Jr.

    2015-01-01

    This paper provides a summary of the findings of recent activities conducted by Marshall Space Flight Center's (MSFC) In-Space Propulsion Branch and MSFC's Metrology and Calibration Lab to assess the performance of current "state of the art" pressure transducers for use in long duration storage and transfer of cryogenic propellants. A brief historical narrative in this paper describes the Evolvable Cryogenics program and the relevance of these activities to the program. This paper also provides a review of three separate test activities performed throughout this effort, including: (1) the calibration of several pressure transducer designs in a liquid nitrogen cryogenic environmental chamber, (2) the calibration of a pressure transducer in a liquid helium Dewar, and (3) the calibration of several pressure transducers at temperatures ranging from 20 to 70 degrees Kelvin (K) using a "cryostat" environmental chamber. These three separate test activities allowed for study of the sensors along a temperature range from 4 to 300 K. The combined data shows that both the slope and intercept of the sensor's calibration curve vary as a function of temperature. This homogeneous function is contrary to the linearly decreasing relationship assumed at the start of this investigation. Consequently, the data demonstrates the need for lookup tables to change the slope and intercept used by any data acquisition system. This ultimately would allow for more accurate pressure measurements at the desired temperature range. This paper concludes with a review of a request for information (RFI) survey conducted amongst different suppliers to determine the availability of current "state of the art" flight-qualified pressure transducers. The survey identifies requirements that are most difficult for the suppliers to meet, most notably the capability to validate the sensor's performance at temperatures below 70 K.
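The lookup-table correction the abstract calls for can be sketched as piecewise-linear interpolation of the calibration slope and intercept over temperature. All table values below are invented for illustration; a real table would come from the calibration runs described above:

```python
import bisect

# Hypothetical calibration table: temperature (K) -> (slope, intercept)
# of the transducer's linear response, since both vary with temperature.
TABLE = [
    (20.0, (1.020, -0.15)),
    (70.0, (1.005, -0.05)),
    (300.0, (1.000, 0.00)),
]

def calibrate(raw, temp_k):
    """Linearly interpolate slope/intercept at temp_k, then apply them
    to the raw reading; clamp outside the table's temperature range."""
    temps = [t for t, _ in TABLE]
    i = bisect.bisect_left(temps, temp_k)
    if i == 0:
        slope, icept = TABLE[0][1]
    elif i == len(TABLE):
        slope, icept = TABLE[-1][1]
    else:
        (t0, (s0, c0)), (t1, (s1, c1)) = TABLE[i - 1], TABLE[i]
        w = (temp_k - t0) / (t1 - t0)
        slope, icept = s0 + w * (s1 - s0), c0 + w * (c1 - c0)
    return slope * raw + icept

print(round(calibrate(100.0, 45.0), 3))  # → 101.15
```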

  3. Predicting software reliability

    NASA Technical Reports Server (NTRS)

    Littlewood, B.

    1989-01-01

    A detailed look is given to software reliability techniques. A conceptual model of the failure process is examined, and some software reliability growth models are discussed. Problems for which no current solutions exist are addressed, emphasizing the very difficult problem of safety-critical systems for which the reliability requirements can be enormously demanding.

  4. Reliability model generator specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Mccann, Catherine

    1990-01-01

    The Reliability Model Generator (RMG) is described: a program that produces reliability models from block diagrams for ASSIST, the input interface to the reliability evaluation tool SURE. The motivation for RMG is given, and the implemented algorithms are discussed. The appendices contain the algorithms and two detailed traces of examples.

  5. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  6. Properties of artificial networks evolved to contend with natural spectra.

    PubMed

    Morgenstern, Yaniv; Rostami, Mohammad; Purves, Dale

    2014-07-22

    Understanding why spectra that are physically the same appear different in different contexts (color contrast), whereas spectra that are physically different appear similar (color constancy) presents a major challenge in vision research. Here, we show that the responses of biologically inspired neural networks evolved on the basis of accumulated experience with spectral stimuli automatically generate contrast and constancy. The results imply that these phenomena are signatures of a strategy that biological vision uses to circumvent the inverse optics problem as it pertains to light spectra, and that double-opponent neurons in early-level vision evolve to serve this purpose. This strategy provides a way of understanding the peculiar relationship between the objective world and subjective color experience, as well as rationalizing the relevant visual circuitry without invoking feature detection or image representation. PMID:25024184

  7. Properties of artificial networks evolved to contend with natural spectra

    PubMed Central

    Morgenstern, Yaniv; Rostami, Mohammad; Purves, Dale

    2014-01-01

    Understanding why spectra that are physically the same appear different in different contexts (color contrast), whereas spectra that are physically different appear similar (color constancy) presents a major challenge in vision research. Here, we show that the responses of biologically inspired neural networks evolved on the basis of accumulated experience with spectral stimuli automatically generate contrast and constancy. The results imply that these phenomena are signatures of a strategy that biological vision uses to circumvent the inverse optics problem as it pertains to light spectra, and that double-opponent neurons in early-level vision evolve to serve this purpose. This strategy provides a way of understanding the peculiar relationship between the objective world and subjective color experience, as well as rationalizing the relevant visual circuitry without invoking feature detection or image representation. PMID:25024184

  8. Systems approaches in understanding evolution and evolvability.

    PubMed

    Agarwal, Sumeet

    2013-12-01

    Systems and network-based approaches are becoming increasingly popular in cellular biology. One contribution of such approaches has been to shed some light on the evolutionary origins of core organisational principles in biological systems, such as modularity, robustness, and evolvability. Models of interactions between genes (epistasis) have also provided insight into how sexual reproduction may have evolved. Additionally, recent work on viewing evolution as a form of learning from the environment has indicated certain bounds on the complexity of the genetic circuits that can evolve within feasible quantities of time and resources. Here we review the key studies and results in these areas, and discuss possible connections between them. In particular, we speculate on the link between the two notions of 'evolvability': the evolvability of a system in terms of how agile it is in responding to novel goals or environments, and the evolvability of certain kinds of gene network functionality in terms of its computational complexity. Drawing on some recent work on the complexity of graph-theoretic problems on modular networks, we suggest that modularity as an organising principle may have its raison d'etre in its ability to enhance evolvability, in both its senses. PMID:24120732

  9. Evolving communicative complexity: insights from rodents and beyond.

    PubMed

    Pollard, Kimberly A; Blumstein, Daniel T

    2012-07-01

    Social living goes hand in hand with communication, but the details of this relationship are rarely simple. Complex communication may be described by attributes as diverse as a species' entire repertoire, signallers' individualistic signatures, or complex acoustic phenomena within single calls. Similarly, attributes of social complexity are diverse and may include group size, social role diversity, or networks of interactions and relationships. How these different attributes of social and communicative complexity co-evolve is an active question in behavioural ecology. Sciurid rodents (ground squirrels, prairie dogs and marmots) provide an excellent model system for studying these questions. Sciurid studies have found that demographic role complexity predicts alarm call repertoire size, while social group size predicts alarm call individuality. Along with other taxa, sciurids reveal an important insight: different attributes of sociality are linked to different attributes of communication. By breaking social and communicative complexity down to different attributes, focused studies can better untangle the underlying evolutionary relationships and move us closer to a comprehensive theory of how sociality and communication evolve. PMID:22641825

  10. Interactions between planets and evolved stars

    NASA Astrophysics Data System (ADS)

    Shengbang, Qian; Zhongtao, Han; Fernández Lajús, E.; Liying, Zhu; Wenping, Liao; Miloslav, Zejda; Linjia, Li; Voloshina, Irina; Liang, Liu; Jiajia, He

    2016-07-01

    Searching for planetary companions to evolved stars (e.g., white dwarfs (WD) and Cataclysmic Variables (CV)) can provide insight into the interaction between planets and evolved stars as well as on the ultimate fate of planets. We have monitored decades of CVs and their progenitors including some detached WD binaries since 2006 to search for planets orbiting these systems. In the present paper, we will show some observational results of circumbinary planets in orbits around CVs and their progenitors. Some of our findings include planets with the shortest distance to the central evolved binaries and a few multiple planetary systems orbiting binary stars. Finally, by comparing the observational properties of planetary companions to single WDs and WD binaries, the interaction between planets and evolved stars and the ultimate fate of planets are discussed.

  11. Neural mechanisms underlying the evolvability of behaviour

    PubMed Central

    Katz, Paul S.

    2011-01-01

    The complexity of nervous systems alters the evolvability of behaviour. Complex nervous systems are phylogenetically constrained; nevertheless particular species-specific behaviours have repeatedly evolved, suggesting a predisposition towards those behaviours. Independently evolved behaviours in animals that share a common neural architecture are generally produced by homologous neural structures, homologous neural pathways and even in the case of some invertebrates, homologous identified neurons. Such parallel evolution has been documented in the chromatic sensitivity of visual systems, motor behaviours and complex social behaviours such as pair-bonding. The appearance of homoplasious behaviours produced by homologous neural substrates suggests that there might be features of these nervous systems that favoured the repeated evolution of particular behaviours. Neuromodulation may be one such feature because it allows anatomically defined neural circuitry to be re-purposed. The developmental, genetic and physiological mechanisms that contribute to nervous system complexity may also bias the evolution of behaviour, thereby affecting the evolvability of species-specific behaviour. PMID:21690127

  12. Human reliability analysis

    SciTech Connect

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis, incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach.

  13. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?" Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based on hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedule. Assessing and estimating the reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure times and other project data to provide earlier and more accurate estimates of system reliability.

  14. Reliability of fluid systems

    NASA Astrophysics Data System (ADS)

    Kopáček, Jaroslav; Fojtášek, Kamil; Dvořák, Lukáš

    2016-03-01

    This paper focuses on the importance of reliability assessment, especially in complex fluid systems for demanding production technology. The initial criterion for assessing reliability is the failure of an object (element), which is treated as a random variable whose data (values) can be processed using the mathematical methods of probability theory and statistics. The basic indicators of reliability are defined, along with their application to calculations for serial, parallel and backed-up systems. For illustration, calculation examples of reliability indicators are given for various elements of the system and for a selected pneumatic circuit.
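For the constant-failure-rate (exponential) model that underlies many such indicator calculations, the basic quantities can be sketched as follows; the valve and its failure rate are hypothetical, and the standby formula assumes one identical cold-standby backup with perfect switching:

```python
from math import exp

def reliability(lmbda, t):
    # Exponential model with constant failure rate: R(t) = exp(-lambda * t).
    return exp(-lmbda * t)

def mtbf(lmbda):
    # Mean time between failures for the exponential model.
    return 1.0 / lmbda

def standby(lmbda, t):
    # Cold standby with one identical backup and perfect switching:
    # R(t) = exp(-lambda * t) * (1 + lambda * t).
    return exp(-lmbda * t) * (1.0 + lmbda * t)

# Hypothetical pneumatic valve with a failure rate of 2e-5 per hour.
lam = 2e-5
print(round(reliability(lam, 1000), 4))  # → 0.9802
print(round(standby(lam, 1000), 4))      # → 0.9998
```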

  15. Power Quality and Reliability Project

    NASA Technical Reports Server (NTRS)

    Attia, John O.

    2001-01-01

    One area where universities and industry can link is power systems reliability and quality - key concepts in the commercial, industrial and public-sector engineering environments. Prairie View A&M University (PVAMU) has established a collaborative relationship with the University of Texas at Arlington (UTA), NASA/Johnson Space Center (JSC), and the EP&C Engineering and Technology Group (EP&C), a small disadvantaged business that specializes in power quality and engineering services. The primary goal of this collaboration is to facilitate the development and implementation of a Strategic Integrated Power/Systems Reliability and Curriculum Enhancement Program. The objectives of the first phase of this work are: (a) to develop a course in power quality and reliability, (b) to use the campus of Prairie View A&M University as a laboratory for the study of systems reliability and quality issues, and (c) to provide students with NASA/EP&C shadowing and internship experience. In this work, a course titled "Reliability Analysis of Electrical Facilities" was developed and taught for two semesters. About thirty-seven students have benefited directly from this course. A laboratory accompanying the course was also developed. Four facilities at Prairie View A&M University were surveyed. Some tests that were performed are (i) earth-ground testing, (ii) voltage, amperage and harmonics of various panels in the buildings, (iii) checking the wire sizes to see if they were the right size for the load they were carrying, (iv) vibration tests to assess the status of the engines or chillers and water pumps, and (v) infrared testing to test for arcing or misfiring of electrical or mechanical systems.

  16. The transcriptomics of an experimentally evolved plant-virus interaction.

    PubMed

    Hillung, Julia; García-García, Francisco; Dopazo, Joaquín; Cuevas, José M; Elena, Santiago F

    2016-01-01

    Models of plant-virus interaction assume that the ability of a virus to infect a host genotype depends on the matching between virulence and resistance genes. Recently, we evolved tobacco etch potyvirus (TEV) lineages on different ecotypes of Arabidopsis thaliana, and found that some ecotypes selected for specialist viruses whereas others selected for generalists. Here we sought to evaluate the transcriptomic basis of such relationships. We have characterized the transcriptomic responses of five ecotypes infected with the ancestral and evolved viruses. Genes and functional categories differentially expressed by plants infected with local TEV isolates were identified, showing heterogeneous responses among ecotypes, although significant parallelism existed among lineages evolved in the same ecotype. Although genes involved in immune responses were altered upon infection, other functional groups were also pervasively over-represented, suggesting that plant resistance genes were not the only drivers of viral adaptation. Finally, the transcriptomic consequences of infection with the generalist and specialist lineages were compared. Whilst the generalist induced very similar perturbations in the transcriptomes of the different ecotypes, the perturbations induced by the specialist were divergent. Plant defense mechanisms were activated when the infecting virus was specialist but they were down-regulated when infecting with generalist. PMID:27113435

  17. The transcriptomics of an experimentally evolved plant-virus interaction

    PubMed Central

    Hillung, Julia; García-García, Francisco; Dopazo, Joaquín; Cuevas, José M.; Elena, Santiago F.

    2016-01-01

    Models of plant-virus interaction assume that the ability of a virus to infect a host genotype depends on the matching between virulence and resistance genes. Recently, we evolved tobacco etch potyvirus (TEV) lineages on different ecotypes of Arabidopsis thaliana, and found that some ecotypes selected for specialist viruses whereas others selected for generalists. Here we sought to evaluate the transcriptomic basis of such relationships. We have characterized the transcriptomic responses of five ecotypes infected with the ancestral and evolved viruses. Genes and functional categories differentially expressed by plants infected with local TEV isolates were identified, showing heterogeneous responses among ecotypes, although significant parallelism existed among lineages evolved in the same ecotype. Although genes involved in immune responses were altered upon infection, other functional groups were also pervasively over-represented, suggesting that plant resistance genes were not the only drivers of viral adaptation. Finally, the transcriptomic consequences of infection with the generalist and specialist lineages were compared. Whilst the generalist induced very similar perturbations in the transcriptomes of the different ecotypes, the perturbations induced by the specialist were divergent. Plant defense mechanisms were activated when the infecting virus was specialist but they were down-regulated when infecting with generalist. PMID:27113435

  18. Hawaii electric system reliability.

    SciTech Connect

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability "worth" and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
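The cost-integration method mentioned above can be sketched as picking the reserve level that minimizes capacity cost plus expected outage cost. The cost curves and numbers below are invented for illustration, not taken from the report:

```python
from math import exp

def optimal_reserve(reserves, capacity_cost, outage_cost):
    """Choose the reserve level minimizing total cost: the cost of
    carrying capacity plus the expected outage cost implied by
    customers' reliability 'worth'."""
    return min(reserves, key=lambda r: capacity_cost(r) + outage_cost(r))

# Hypothetical cost curves: linear capacity cost, exponentially
# shrinking expected outage cost as the reserve margin grows.
reserves = [10 * r for r in range(21)]  # candidate reserves, 0..200 MW
best = optimal_reserve(
    reserves,
    capacity_cost=lambda r: 50_000 * r,                 # $/yr
    outage_cost=lambda r: 40_000_000 * exp(-r / 25),    # $/yr
)
print(best)  # → 90
```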

  19. Evolving treatment plan quality criteria from institution-specific experience

    SciTech Connect

    Ruan, D.; Shao, W.; DeMarco, J.; Tenn, S.; King, C.; Low, D.; Kupelian, P.; Steinberg, M.

    2012-05-15

    Purpose: The dosimetric aspects of radiation therapy treatment plan quality are usually evaluated and reported with dose volume histogram (DVH) endpoints. For clinical practicality, a small number of representative quantities derived from the DVH are often used as dose endpoints to summarize the plan quality. National guidelines on reference values for such quantities for some standard treatment approaches are often used as acceptance criteria to trigger treatment plan review. On the other hand, treatment prescription and planning approaches specific to each institution warrant the need to report plan quality in terms of practice consistency and with respect to institution-specific experience. The purpose of this study is to investigate and develop a systematic approach to record and characterize the institution-specific plan experience and use such information to guide the design of plan quality criteria. In the clinical setting, this approach will assist in (1) improving overall plan quality and consistency and (2) detecting abnormal plan behavior for retrospective analysis. Methods: The authors propose a self-evolving methodology and have developed an in-house prototype software suite that (1) extracts the dose endpoints from a treatment plan and evaluates them against both national standard and institution-specific criteria and (2) evolves the statistics for the dose endpoints and updates institution-specific criteria. Results: The validity of the proposed methodology was demonstrated with a database of prostate stereotactic body radiotherapy cases. As more data sets are accumulated, the evolving institution-specific criteria can serve as a reliable and stable consistency measure for plan quality, and reveal the potential for using "tighter" criteria than national standards or projected criteria, leading to practice that may shrink the gap between plans deemed acceptable and the underlying unknown optimality.
Conclusions: The authors have developed
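The self-evolving criteria described above can be sketched as maintaining an empirical quantile of historical endpoint values as the institution-specific acceptance limit. The endpoint name, values, and quantile choice below are hypothetical; the abstract does not specify the statistics actually used:

```python
def evolving_limit(values, q=0.95):
    """Empirical q-quantile of historical endpoint values, used as an
    institution-specific acceptance limit that evolves as plans accrue."""
    s = sorted(values)
    k = min(len(s) - 1, int(q * len(s)))
    return s[k]

# Hypothetical rectum V36Gy values (cc) from previously accepted plans.
history = [0.8, 1.1, 0.9, 1.3, 1.0, 1.2, 0.7, 1.4]
limit = evolving_limit(history)

def review_needed(endpoint, national_limit, institutional_limit):
    # Flag a plan when it exceeds either the national guideline or the
    # (typically tighter) institution-specific criterion.
    return endpoint > min(national_limit, institutional_limit)

print(limit, review_needed(1.35, national_limit=2.0, institutional_limit=limit))
# → 1.4 False
```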

  20. Zygomorphy evolved from disymmetry in Fumarioideae (Papaveraceae, Ranunculales): new evidence from an expanded molecular phylogenetic framework

    PubMed Central

    Sauquet, Hervé; Carrive, Laetitia; Poullain, Noëlie; Sannier, Julie; Damerval, Catherine; Nadot, Sophie

    2015-01-01

    Background and Aims Fumarioideae (20 genera, 593 species) is a clade of Papaveraceae (Ranunculales) characterized by flowers that are either disymmetric (i.e. two perpendicular planes of bilateral symmetry) or zygomorphic (i.e. one plane of bilateral symmetry). In contrast, the other subfamily of Papaveraceae, Papaveroideae (23 genera, 230 species), has actinomorphic flowers (i.e. more than two planes of symmetry). Understanding of the evolution of floral symmetry in this clade has so far been limited by the lack of a reliable phylogenetic framework. Pteridophyllum (one species) shares similarities with Fumarioideae but has actinomorphic flowers, and the relationships among Pteridophyllum, Papaveroideae and Fumarioideae have remained unclear. This study reassesses the evolution of floral symmetry in Papaveraceae based on new molecular phylogenetic analyses of the family. Methods Maximum likelihood, Bayesian and maximum parsimony phylogenetic analyses of Papaveraceae were conducted using six plastid markers and one nuclear marker, sampling Pteridophyllum, 18 (90 %) genera and 73 species of Fumarioideae, 11 (48 %) genera and 11 species of Papaveroideae, and a wide selection of outgroup taxa. Floral characters recorded from the literature were then optimized onto phylogenetic trees to reconstruct ancestral states using parsimony, maximum likelihood and reversible-jump Bayesian approaches. Key Results Pteridophyllum is not nested in Fumarioideae. Fumarioideae are monophyletic and Hypecoum (18 species) is the sister group of the remaining genera. Relationships within the core Fumarioideae are well resolved and supported. Dactylicapnos and all zygomorphic genera form a well-supported clade nested among disymmetric taxa. Conclusions Disymmetry of the corolla is a synapomorphy of Fumarioideae and is strongly correlated with changes in the androecium and differentiation of middle and inner tepal shape (basal spurs on middle tepals). Zygomorphy subsequently evolved from

  1. Control pole placement relationships

    NASA Technical Reports Server (NTRS)

    Ainsworth, O. R.

    1982-01-01

    Using a simplified Large Space Structure (LSS) model, a technique was developed which gives algebraic relationships for the unconstrained poles. The relationships, which were obtained by this technique, are functions of the structural characteristics and the control gains. Extremely interesting relationships evolve for the case when the structural damping is zero. If the damping is zero, the constrained poles are uncoupled from the structural mode shapes. These relationships, which are derived for structural damping and without structural damping, provide new insight into the migration of the unconstrained poles for the CFPPS.

  2. Metanetworks of artificially evolved regulatory networks

    NASA Astrophysics Data System (ADS)

    Danacı, Burçin; Erzan, Ayşe

    2016-04-01

    We study metanetworks arising in genotype and phenotype spaces, in the context of a model population of Boolean graphs evolved under selection for short dynamical attractors. We define the adjacency matrix of a graph as its genotype, which gets mutated in the course of evolution, while its phenotype is its set of dynamical attractors. Metanetworks in the genotype and phenotype spaces are formed, respectively, by genetic proximity and by phenotypic similarity, the latter weighted by the sizes of the basins of attraction of the shared attractors. We find that evolved populations of Boolean graphs form tree-like giant clusters in genotype space, while random populations of Boolean graphs are typically so far removed from each other genetically that they cannot form a metanetwork. In phenotype space, the metanetworks of evolved populations are super robust both under the elimination of weak connections and random removal of nodes.
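The genotype/phenotype mapping above (adjacency matrix in, attractor set out) can be sketched for a tiny Boolean network. The threshold update rule here is an assumed toy rule, not the one used in the paper, and brute-force enumeration only works for small node counts:

```python
from itertools import product

def step(state, adj):
    """One synchronous update of a Boolean network whose 'genotype' is
    the signed adjacency matrix adj (entries in {-1, 0, 1}); a node
    keeps its state when its input field is zero."""
    n = len(adj)
    nxt = []
    for i in range(n):
        field = sum(adj[i][j] * (1 if state[j] else -1) for j in range(n))
        nxt.append(field > 0 if field != 0 else state[i])
    return tuple(nxt)

def attractors(adj):
    """The network's 'phenotype': the set of dynamical attractors,
    found by iterating every initial state until the trajectory repeats."""
    found = set()
    for init in product([False, True], repeat=len(adj)):
        seen, s = {}, init
        while s not in seen:
            seen[s] = len(seen)
            s = step(s, adj)
        start = seen[s]  # first index of the cycle
        cycle = tuple(sorted(st for st, idx in seen.items() if idx >= start))
        found.add(cycle)
    return found

adj = [[0, 1, -1], [1, 0, 0], [0, 1, 0]]
print(len(attractors(adj)))  # → 3 (two fixed points and one 2-cycle)
```

Basins of attraction, which weight the phenotype metanetwork in the paper, could be obtained from the same enumeration by counting how many initial states reach each cycle.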

  3. Quantifying evolvability in small biological networks

    SciTech Connect

    Nemenman, Ilya; Mugler, Andrew; Ziv, Etay; Wiggins, Chris H

    2008-01-01

    The authors introduce a quantitative measure of the capacity of a small biological network to evolve. The measure is applied to a stochastic description of the experimental setup of Guet et al. (Science 2002, 296, pp. 1466), treating chemical inducers as functional inputs to biochemical networks and the expression of a reporter gene as the functional output. The authors take an information-theoretic approach, allowing the system to set parameters that optimise signal processing ability, thus enumerating each network's highest-fidelity functions. All networks studied are highly evolvable by the measure, meaning that change in function has little dependence on change in parameters. Moreover, each network's functions are connected by paths in the parameter space along which information is not significantly lowered, meaning a network may continuously change its functionality without completely losing it along the way. This property further underscores the evolvability of the networks.

  4. Distribution characteristics of weighted bipartite evolving networks

    NASA Astrophysics Data System (ADS)

    Zhang, Danping; Dai, Meifeng; Li, Lei; Zhang, Cheng

    2015-06-01

    Motivated by an evolving model of online bipartite networks, we introduce a model of weighted bipartite evolving networks. In this model there are two disjoint sets of nodes, called the user node set and the object node set, and edges exist only between the two sets. Edge weights represent the usage amount between a pair of user node and object node. This model not only captures the internal growth mechanism of bipartite networks, but also takes into account object strength deterioration over time steps. User strength and object strength follow power-law distributions, respectively. The weighted bipartite evolving networks have the scale-free property in certain situations. Numerical simulation results agree with the theoretical analyses.
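The growth mechanism behind such strength distributions can be sketched as a rich-get-richer process: new edges attach to objects in proportion to their current strength. This is a generic sketch only; the paper's model also includes strength deterioration, which is omitted here:

```python
import random

def grow(steps, p_new=0.2, seed=1):
    """Toy weighted bipartite growth: at each step either a new object
    node enters, or an incoming user edge attaches to an existing
    object chosen preferentially by its strength."""
    random.seed(seed)
    strength = [1.0]  # object strengths (sum of incident edge weights)
    for _ in range(steps):
        if random.random() < p_new:
            strength.append(1.0)      # new object with one unit edge
        else:
            total = sum(strength)
            r = random.uniform(0.0, total)
            acc = 0.0
            for i, s in enumerate(strength):
                acc += s
                if r <= acc:
                    strength[i] += 1.0  # reinforce the chosen object
                    break
    return strength

s = grow(5000)
# Each step adds exactly one unit of strength, and the resulting
# distribution is heavy-tailed: a few objects dominate.
print(len(s), max(s))
```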

  5. Evolving networks in the human epileptic brain

    NASA Astrophysics Data System (ADS)

    Lehnertz, Klaus; Ansmann, Gerrit; Bialonski, Stephan; Dickten, Henning; Geier, Christian; Porz, Stephan

    2014-01-01

    Network theory provides novel concepts that promise an improved characterization of interacting dynamical systems. Within this framework, evolving networks can be considered as being composed of nodes, representing systems, and of time-varying edges, representing interactions between these systems. This approach is highly attractive for furthering our understanding of the physiological and pathophysiological dynamics in human brain networks. Indeed, there is growing evidence that the epileptic process can be regarded as a large-scale network phenomenon. We here review methodologies for inferring networks from empirical time series and for characterizing these evolving networks. We summarize recent findings derived from studies that investigate human epileptic brain networks evolving on timescales ranging from a few seconds to weeks. We point to possible pitfalls and open issues, and discuss future perspectives.

  6. Evolvable, reconfigurable hardware for future space systems

    NASA Technical Reports Server (NTRS)

    Stoica, A.; Zebulum, R. S.; Keymeulen, D.; Ferguson, M. I.; Thakoor, A.

    2002-01-01

    This paper overviews Evolvable Hardware (EHW) technology, examining its potential for enhancing the survivability and flexibility of future space systems. EHW refers to the self-configuration of electronic hardware by evolutionary/genetic search mechanisms. Evolvable Hardware can maintain existing functionality in the presence of faults and degradation due to aging, temperature, and radiation. It can also configure itself for new functionality when required by mission changes or encountered opportunities. The paper illustrates hardware evolution in silicon using a JPL-designed programmable device, reconfigurable at the transistor level, as the platform and a genetic algorithm running on a DSP as the reconfiguration mechanism. Rapid reconfiguration allows convergence to circuit solutions on the order of seconds. The experiments demonstrate functional recovery from faults as well as from degradation at extreme temperatures, indicating the possibility of expanding the operational range of extreme electronics through evolved circuit solutions.

  7. Access to space: The Space Shuttle's evolving role

    NASA Astrophysics Data System (ADS)

    Duttry, Steven R.

    1993-04-01

    Access to space is of extreme importance to our nation and the world. Military, civil, and commercial space activities all depend on reliable space transportation systems for access to space at a reasonable cost. The Space Transportation System, or Space Shuttle, was originally planned to provide transportation to and from a manned Earth-orbiting space station. To justify the development and operations costs, the Space Shuttle took on other space transportation requirements, including DoD, civil, and a growing commercial launch market. This case study examines the evolving role of the Space Shuttle as our nation's means of accessing space. It includes a review of the events leading to the development of the Space Shuttle, identifies some of the key players in the decision-making process, examines alternatives developed to mitigate the risks associated with sole reliance on the Space Shuttle, and highlights the impacts of this national space policy following the Challenger accident.

  8. JavaGenes: Evolving Graphs with Crossover

    NASA Technical Reports Server (NTRS)

    Globus, Al; Atsatt, Sean; Lawton, John; Wipke, Todd

    2000-01-01

    Genetic algorithms usually use string or tree representations. We have developed a novel crossover operator for a directed and undirected graph representation, and used this operator to evolve molecules and circuits. Unlike strings or trees, a single point in the representation cannot divide every possible graph into two parts, because graphs may contain cycles. Thus, the crossover operator is non-trivial. A steady-state, tournament-selection genetic algorithm code (JavaGenes) was written to implement and test the graph crossover operator. All runs were executed by cycle-scavenging on networked workstations using the Condor batch processing system. The JavaGenes code has evolved pharmaceutical drug molecules and simple digital circuits. Results to date suggest that JavaGenes can evolve moderate-sized drug molecules and very small circuits in reasonable time. The algorithm has greater difficulty with somewhat larger circuits, suggesting that directed graphs (circuits) are more difficult to evolve than undirected graphs (molecules), although necessary differences in the crossover operator may also explain the results. In principle, JavaGenes should be able to evolve other graph-representable systems, such as transportation networks, metabolic pathways, and computer networks. However, large graphs evolve significantly more slowly than smaller graphs, presumably because the space-of-all-graphs explodes combinatorially with graph size. Since the representation strongly affects genetic algorithm performance, adding graphs to the evolutionary programmer's bag-of-tricks should be beneficial. Also, since graph evolution operates directly on the phenotype, the genotype-phenotype translation step, common in genetic algorithm work, is eliminated.
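One way to picture a graph crossover of the kind described above is to cut each parent's node set, keep a fragment from each, and splice them together. The `graph_crossover` function, the half-and-half cut, and the single bridging edge are simplifications for illustration, not the actual JavaGenes operator:

```python
import random

def graph_crossover(g1, g2, seed=0):
    """Illustrative graph crossover: keep half of parent 1's nodes and
    half of parent 2's nodes (with their internal edges), then splice
    the fragments with one random bridging edge.
    Graphs are dicts: node -> set of neighbor nodes (undirected)."""
    rng = random.Random(seed)

    def fragment(g, keep):
        # Induced subgraph: drop edges that leave the kept node set.
        return {n: {m for m in nbrs if m in keep}
                for n, nbrs in g.items() if n in keep}

    n1 = list(g1); rng.shuffle(n1)
    n2 = list(g2); rng.shuffle(n2)
    keep1 = set(n1[:len(n1) // 2])
    keep2 = set(n2[len(n2) // 2:])
    child = fragment(g1, keep1)
    # Relabel parent-2 nodes so they cannot collide with parent-1 labels.
    relabel = {n: ('p2', n) for n in keep2}
    for n, nbrs in fragment(g2, keep2).items():
        child[relabel[n]] = {relabel[m] for m in nbrs}
    # Repair step: bridge the two fragments so the child is connected.
    if keep1 and keep2:
        a = rng.choice(sorted(keep1))
        b = relabel[rng.choice(sorted(keep2))]
        child[a].add(b)
        child[b].add(a)
    return child

child = graph_crossover({0: {1}, 1: {0, 2}, 2: {1}}, {'a': {'b'}, 'b': {'a'}})
print(len(child))  # 2 nodes: one fragment from each parent
```

The cycle problem the abstract mentions shows up in the `fragment` step: cutting a graph severs an unpredictable number of edges, which is why a repair/bridging policy is needed at all.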

  9. Evolved Massive Stars in the Local Group

    NASA Astrophysics Data System (ADS)

    Drout, M. R.; Massey, P.

    2015-05-01

    In this manuscript we describe a number of recent advances in the study of evolved massive stars in the Local Group, with an emphasis on how representative populations of these stars can be used to test models of massive star evolution. In honor of the 50th anniversary of the Cerro Tololo Inter-American Observatory (CTIO), we attempt to put these findings in some historical context by discussing how our understanding of the various stages in the lives of massive stars has evolved since Cerro Tololo was first selected as the site for the observatory which would become CTIO.

  10. A Stefan problem on an evolving surface

    PubMed Central

    Alphonse, Amal; Elliott, Charles M.

    2015-01-01

    We formulate a Stefan problem on an evolving hypersurface and study the well-posedness of weak solutions given L1 data. To do this, we first develop function spaces and results to handle equations on evolving surfaces in order to give a natural treatment of the problem. Then, we consider the existence of solutions for data; this is done by regularization of the nonlinearity. The regularized problem is solved by a fixed point theorem, and uniform estimates are then obtained in order to pass to the limit. By using a duality method, we show continuous dependence, which allows us to extend the results to L1 data. PMID:26261364

  11. Dust around main sequence and evolved stars

    NASA Astrophysics Data System (ADS)

    Walker, H. J.; Heinrichsen, I.; Richards, P. J.

    Data for several main sequence and evolved stars, from the photopolarimeter on ISO (ISOPHOT), are presented. Dust shells are resolved for Y CVn and RS Lib at 60 μm. Low resolution spectra from ISOPHOT are shown for several evolved stars, and compared to the spectrum of Vega (a stellar photosphere) and HD 169142 (showing emission features from Polycyclic Aromatic Hydrocarbons). W Lyr shows the signature of oxygen-rich circumstellar material around 3 μm, V Aql and Y CVn the signature of carbon-rich material.

  12. An Evolvable Multi-Agent Approach to Space Operations Engineering

    NASA Technical Reports Server (NTRS)

    Mandutianu, Sanda; Stoica, Adrian

    1999-01-01

    A complex system of spacecraft and ground tracking stations, as well as a constellation of satellites or spacecraft, has to be able to reliably withstand sudden environment changes, resource fluctuations, dynamic resource configuration, limited communication bandwidth, etc., while maintaining the consistency of the system as a whole. It is not known in advance when a change in the environment might occur or when a particular exchange will happen. A higher degree of sophistication is required for the communication mechanisms between different parts of the system. The actual behavior has to be determined while the system is performing, and the course of action can be decided at the individual level. Under such circumstances, the solution will benefit greatly from increased adaptability and autonomy, both on board and on the ground. An evolvable architecture based on intelligent agents that communicate and cooperate with each other can offer advantages in this direction. This paper presents an architecture for an evolvable agent-based system (software and software/hardware hybrids) as well as some plans for further implementation.

  13. A guide for the design of evolve and resequencing studies.

    PubMed

    Kofler, Robert; Schlötterer, Christian

    2014-02-01

    Standing genetic variation provides a rich reservoir of potentially useful mutations facilitating adaptation to novel environments. Experimental evolution studies have demonstrated that rapid and strong phenotypic responses to selection can also be obtained in the laboratory. When combined with next-generation sequencing technology, these experiments promise to identify the individual loci contributing to adaptation. Nevertheless, until now, very little has been known about the design of such evolve & resequencing (E&R) studies. Here, we use forward simulations of entire genomes to evaluate different experimental designs that aim to maximize the power to detect selected variants. We show that low linkage disequilibrium in the starting population, population size, duration of the experiment, and the number of replicates are the key factors in determining the power and accuracy of E&R studies. Furthermore, replication is more important for detecting the targets of selection than increasing the population size. Using an optimized design, beneficial loci with a selective advantage as low as s = 0.005 can be identified at the nucleotide level. Even when a large number of loci are selected simultaneously, up to 56% can be reliably detected without incurring large numbers of false positives. Our computer simulations suggest that, with an adequate experimental design, E&R studies are a powerful tool for identifying adaptive mutations from standing genetic variation, and thereby provide an excellent means to analyze the trajectories of selected alleles in evolving populations. PMID:24214537
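For scale, a selective advantage of s = 0.005 can be plugged into the textbook deterministic haploid selection recursion; `allele_trajectory` and its starting frequency are illustrative, and real E&R simulations also model drift, linkage, and sequencing noise:

```python
def allele_trajectory(p0, s, generations):
    """Expected frequency of a beneficial allele under haploid selection:
    p' = p * (1 + s) / (1 + p * s). Deterministic sketch only."""
    p = p0
    traj = [p]
    for _ in range(generations):
        p = p * (1 + s) / (1 + p * s)
        traj.append(p)
    return traj

# Even a weak advantage shifts frequency measurably over a long experiment.
traj = allele_trajectory(0.05, 0.005, 500)
print(round(traj[-1], 3))
```

This is why experiment duration appears among the key design factors: weak selection needs many generations before the frequency shift rises above the noise floor of drift and sequencing error.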

  14. Photovoltaic system reliability

    SciTech Connect

    Maish, A.B.; Atcitty, C.; Greenberg, D.

    1997-10-01

    This paper discusses the reliability of several photovoltaic projects including SMUD's PV Pioneer project, various projects monitored by Ascension Technology, and the Colorado Parks project. System times-to-failure range from 1 to 16 years, and maintenance costs range from 1 to 16 cents per kilowatt-hour. Factors contributing to the reliability of these systems are discussed, and practices are recommended that can be applied to future projects. This paper also discusses the methodology used to collect and analyze PV system reliability data.

  15. Surveying The Digital Landscape: Evolving Technologies 2004. The EDUCAUSE Evolving Technologies Committee

    ERIC Educational Resources Information Center

    EDUCAUSE Review, 2004

    2004-01-01

    Each year, the members of the EDUCAUSE Evolving Technologies Committee identify and research the evolving technologies that are having the most direct impact on higher education institutions. The committee members choose the relevant topics, write white papers, and present their findings at the EDUCAUSE annual conference. This year, under the…

  16. Project Evolve User-Adopter Manual.

    ERIC Educational Resources Information Center

    Joiner, Lee M.

    An adult basic education (ABE) program for mentally retarded young adults between the ages of 14 and 26 years, Project Evolve can provide education agencies for educationally handicapped children with detailed information concerning an innovative program. The manual format was developed through interviews with professional educators concerning the…

  17. The Evolving Leadership Path of Visual Analytics

    SciTech Connect

    Kluse, Michael; Peurrung, Anthony J.; Gracio, Deborah K.

    2012-01-02

    This is a requested book chapter for an internationally authored book on visual analytics and related fields, coordinated by a UK university and to be published by Springer in 2012. This chapter is an overview of the leadership strategies that PNNL's Jim Thomas and other stakeholders used to establish visual analytics as a field, and how those strategies may evolve in the future.

  18. The Evolving Office of the Registrar

    ERIC Educational Resources Information Center

    Pace, Harold L.

    2011-01-01

    A healthy registrar's office will continue to evolve as it considers student, faculty, and institutional needs; staff talents and expectations; technological opportunities; economic realities; space issues; work environments; and where the strategic plan is taking the institution in support of the mission. Several recognized leaders in the field…

  19. Did Language Evolve Like the Vertebrate Eye?

    ERIC Educational Resources Information Center

    Botha, Rudolf P.

    2002-01-01

    Offers a critical appraisal of the way in which the idea that human language or some of its features evolved like the vertebrate eye by natural selection is articulated in Pinker and Bloom's (1990) selectionist account of language evolution. Argues that this account is less than insightful because it fails to draw some of the conceptual…

  20. Origins of multicellular evolvability in snowflake yeast.

    PubMed

    Ratcliff, William C; Fankhauser, Johnathon D; Rogers, David W; Greig, Duncan; Travisano, Michael

    2015-01-01

    Complex life has arisen through a series of 'major transitions' in which collectives of formerly autonomous individuals evolve into a single, integrated organism. A key step in this process is the origin of higher-level evolvability, but little is known about how higher-level entities originate and gain the capacity to evolve as an individual. Here we report a single mutation that not only creates a new level of biological organization, but also potentiates higher-level evolvability. Disrupting the transcription factor ACE2 in Saccharomyces cerevisiae prevents mother-daughter cell separation, generating multicellular 'snowflake' yeast. Snowflake yeast develop through deterministic rules that produce geometrically defined clusters that preclude genetic conflict and display a high broad-sense heritability for multicellular traits; as a result they are preadapted to multicellular adaptation. This work demonstrates that simple microevolutionary changes can have profound macroevolutionary consequences, and suggests that the formation of clonally developing clusters may often be the first step to multicellularity. PMID:25600558

  1. Apollo 16 Evolved Lithology Sodic Ferrogabbro

    NASA Technical Reports Server (NTRS)

    Zeigler, Ryan; Jolliff, B. L.; Korotev, R. L.

    2014-01-01

    Evolved lunar igneous lithologies, often referred to as the alkali suite, are a minor but important component of the lunar crust. These evolved samples are incompatible-element rich samples, and are, not surprisingly, most common in the Apollo sites in (or near) the incompatible-element rich region of the Moon known as the Procellarum KREEP Terrane (PKT). The most commonly occurring lithologies are granites (A12, A14, A15, A17), monzogabbro (A14, A15), alkali anorthosites (A12, A14), and KREEP basalts (A15, A17). The Feldspathic Highlands Terrane is not entirely devoid of evolved lithologies, and rare clasts of alkali gabbronorite and sodic ferrogabbro (SFG) have been identified in Apollo 16 station 11 breccias 67915 and 67016. Curiously, nearly all pristine evolved lithologies have been found as small clasts or soil particles, exceptions being KREEP basalts 15382/6 and granitic sample 12013 (which is itself a breccia). Here we reexamine the petrography and geochemistry of two SFG-like particles found in a survey of Apollo 16 2-4 mm particles from the Cayley Plains 62283,7-15 and 62243,10-3 (hereafter 7-15 and 10-3 respectively). We will compare these to previously reported SFG samples, including recent analyses on the type specimen of SFG from lunar breccia 67915.

  2. A Course Evolves-Physical Anthropology.

    ERIC Educational Resources Information Center

    O'Neil, Dennis

    2001-01-01

    Describes the development of an online physical anthropology course at Palomar College (California) that evolved from online tutorials. Discusses the ability to update materials on the Web more quickly than in traditional textbooks; creating Web pages that are readable by most Web browsers; test security issues; and clarifying ownership of online…

  3. Leadership for Literacy Coaching: Evolving Research

    ERIC Educational Resources Information Center

    Taylor, Rosemarye T.; Moxley, Dale E.

    2008-01-01

    Leadership for literacy coaching is evolving in both the skills of the literacy coaches and the skills of those they coach. Issues of role clarification, communication with administration, and hesitancy to provide authentic feedback are consistently identified. Trends associated with literacy coaching indicate that they continue on their…

  4. Field-evolved resistance to Bt toxins

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Transgenic cotton expressing Bacillus thuringiensis Cry1Ac (Bt cotton) has been used commercially in the United States since 1996. An article by Tabashnik et al. 2008, Nature Biotechnology 26:199-202, states that, for the first time, there is field-evolved Bt resistance in bollworm, Helicoverpa zea...

  5. Organizational Innovation: Current Research and Evolving Concepts

    ERIC Educational Resources Information Center

    Rowe, Lloyd A.; Boise, William B.

    1974-01-01

    A conceptual framework for organizational innovation can evolve from such ideas as the process of innovation, the climate(s) required, the organizational and societal space affected by an innovation, innovation radicalness, and innovation strategies such as organizational development, functional specialization, and periodicity. (Author/WM)

  6. [Families and psychiatry: models and evolving links].

    PubMed

    Frankhauser, Adeline

    2016-01-01

    The role of the families of persons with severe psychiatric disorders (schizophrenia in particular) in the care of their relatives has evolved in recent years: once seen as pathogenic and kept at a distance, the family is now recognised by professionals as a partner in the care process. The links between families and psychiatric institutions nevertheless remain complex, marked by ambivalence and paradoxes. PMID:27157191

  7. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Wilson, Larry W.

    1989-01-01

    The long-term goal of this research is to identify or create a model for use in analyzing the reliability of flight control software. The immediate tasks addressed are the creation of data useful to the study of software reliability and the production of results pertinent to software reliability through the analysis of existing reliability models and data. The completed data creation portion of this research consists of a Generic Checkout System (GCS) design document created in cooperation with NASA and Research Triangle Institute (RTI) experimenters. This will lead to design and code reviews, with the resulting product being one of the versions used in the Terminal Descent Experiment being conducted by the Systems Validations Methods Branch (SVMB) of NASA/Langley. An appended paper details an investigation of the Jelinski-Moranda and Geometric models for software reliability. The models were given data from a process that they have correctly simulated and asked to make predictions about the reliability of that process. It was found that either model will usually fail to make good predictions. These problems were attributed to randomness in the data, and replication of data was recommended.
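The Jelinski-Moranda model examined in the appended paper assumes a fixed number N of initial faults and a failure rate proportional to the faults remaining, so expected inter-failure gaps lengthen as faults are fixed. A sketch (the parameter values are illustrative):

```python
def jm_expected_gaps(n_faults, phi):
    """Expected time between successive failures under the
    Jelinski-Moranda model: before the i-th failure, n_faults - i + 1
    faults remain, the failure rate is phi * (n_faults - i + 1),
    and the expected gap is its reciprocal."""
    return [1.0 / (phi * (n_faults - i)) for i in range(n_faults)]

gaps = jm_expected_gaps(50, 0.02)
# Gaps grow from 1/(phi*N) to 1/phi as the fault count drops.
print(round(gaps[0], 2), round(gaps[-1], 2))
```

The paper's finding that such models "usually fail to make good predictions" reflects how noisy the observed exponential gaps are around these expected values.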

  8. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
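At the component level, the kind of probabilistic computation being chained into system reliability can be sketched as a Monte Carlo estimate of a limit-state failure probability; the capacity/demand distributions below are invented for illustration and are not NESSUS output:

```python
import random

def failure_probability(n_samples=200_000, seed=1):
    """Monte Carlo estimate of P(g < 0) for a limit state g = R - S,
    with normally distributed capacity R and demand S. This is the
    component-level quantity a code like NESSUS feeds into system
    reliability (illustrative values only)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        r = rng.gauss(100.0, 10.0)   # capacity
        s = rng.gauss(60.0, 15.0)    # demand
        if r - s < 0.0:
            failures += 1
    return failures / n_samples

print(failure_probability())
```

The "mechanical equivalence" idea in the abstract is that the same limit-state machinery applies whether g is a stress margin, a temperature margin, or a circuit voltage margin; fast probability integration replaces this brute-force sampling with a far cheaper approximation.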

  9. Scaled CMOS Technology Reliability Users Guide

    NASA Technical Reports Server (NTRS)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect to the high reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is
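Accelerated stress testing of this kind relies on an acceleration model to map failures observed at stress conditions back to use conditions. The classic single-mechanism example is the Arrhenius factor; the activation energy and temperatures below are illustrative, and the abstract's multiple-failure-mechanism model combines several such terms:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between use and stress temperatures:
    AF = exp((Ea / k) * (1/T_use - 1/T_stress)), temperatures in kelvin."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_B) * (1.0 / t_use - 1.0 / t_stress))

# Example: Ea = 0.7 eV, 55 C use vs 125 C stress.
print(round(arrhenius_af(0.7, 55.0, 125.0), 1))
```

When several mechanisms with different activation energies compete, a single fitted Ea misleads; that is the motivation for the multiple failure mechanism model named in the abstract.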

  10. Mission Reliability Estimation for Repairable Robot Teams

    NASA Technical Reports Server (NTRS)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the
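The spare-robot half of that comparison reduces, in the simplest independent-failure case, to a binomial "k-out-of-n" computation; `team_success_prob` and the 0.90 per-robot reliability are hypothetical numbers for illustration, not the method's actual module characteristics:

```python
from math import comb

def team_success_prob(p_robot, n_robots, k_needed):
    """P(at least k of n independent robots survive the mission):
    a toy stand-in for the reliability-cost comparison described above."""
    return sum(comb(n_robots, i) * p_robot**i * (1 - p_robot)**(n_robots - i)
               for i in range(k_needed, n_robots + 1))

# Two designs with the same task requirement (2 working robots):
print(round(team_success_prob(0.90, 2, 2), 4))  # no spares
print(round(team_success_prob(0.90, 3, 2), 4))  # one spare robot
```

Comparing the two printed values is the redundancy side of the trade; the repairability side would instead model spare components and repair actions within each robot.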

  11. Evolving impact of Ada on a production software environment

    NASA Technical Reports Server (NTRS)

    Mcgarry, F.; Esker, L.; Quimby, K.

    1988-01-01

    Many aspects of software development with Ada have evolved as our Ada development environment has matured and personnel have become more experienced in the use of Ada. The Software Engineering Laboratory (SEL) has seen differences in the areas of cost, reliability, reuse, size, and use of Ada features. A first Ada project can be expected to cost about 30 percent more than an equivalent FORTRAN project. However, the SEL has observed significant improvements over time as a development environment progresses to second and third uses of Ada. The reliability of Ada projects is initially similar to what is expected in a mature FORTRAN environment. However, with time, one can expect to gain improvements as experience with the language increases. Reuse is one of the most promising aspects of Ada. The proportion of reusable Ada software on our Ada projects exceeds the proportion of reusable FORTRAN software on our FORTRAN projects. This result was noted fairly early in our Ada projects, and experience shows an increasing trend over time.

  12. Reliable aluminum contact formation by electrostatic bonding

    NASA Astrophysics Data System (ADS)

    Kárpáti, T.; Pap, A. E.; Radnóczi, Gy; Beke, B.; Bársony, I.; Fürjes, P.

    2015-07-01

    The paper presents a detailed study of a reliable method developed for aluminum fusion wafer bonding, assisted by the electrostatic force evolving during the anodic bonding process. The IC-compatible procedure described allows the parallel formation of electrical and mechanical contacts, facilitating reliable packaging of electromechanical systems with backside electrical contacts. This fusion bonding method supports the fabrication of complex microelectromechanical systems (MEMS) and micro-opto-electromechanical systems (MOEMS) structures with enhanced temperature stability, which is crucial in mechanical sensor applications such as pressure or force sensors. Due to the applied electrical potential of -1000 V, the Al metal layers are compressed by electrostatic force, and at the bonding temperature of 450 °C intermetallic diffusion causes aluminum ions to migrate between the metal layers.

  13. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  14. Photovoltaic module reliability workshop

    NASA Astrophysics Data System (ADS)

    Mrig, L.

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986 to 1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if the PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, substantial research and testing are still required to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the U.S., PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  15. Orbiter Autoland reliability analysis

    NASA Technical Reports Server (NTRS)

    Welch, D. Phillip

    1993-01-01

    The Space Shuttle Orbiter is the only space reentry vehicle in which the crew is seated upright. This position presents some physiological effects requiring countermeasures to prevent a crewmember from becoming incapacitated. This also introduces a potential need for automated vehicle landing capability. Autoland is a primary procedure that was identified as a requirement for landing following an extended-duration Orbiter mission. This report documents the results of the reliability analysis performed on the hardware required for an automated landing. A reliability block diagram was used to evaluate system reliability. The analysis considers the manual and automated landing modes currently available on the Orbiter. (Autoland is presently a backup system only.) Results of this study indicate a +/- 36 percent probability of successfully extending a nominal mission to 30 days. Enough variations were evaluated to verify that the reliability could be altered with mission planning and procedures. If the crew is modeled as being fully capable after 30 days, the probability of a successful manual landing is comparable to that of Autoland, because much of the hardware is used for both manual and automated landing modes. The analysis indicates that the reliability for the manual mode is limited by the hardware and depends greatly on crew capability. Crew capability for a successful landing after 30 days has not yet been determined.
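A reliability block diagram of the kind used in that analysis evaluates series paths (everything must work) and parallel paths (any redundant unit may work); the block values below are hypothetical, not the Orbiter's:

```python
def series(*blocks):
    """Series blocks: all must work, so reliabilities multiply."""
    r = 1.0
    for b in blocks:
        r *= b
    return r

def parallel(*blocks):
    """Parallel (redundant) blocks: the system fails only if all fail."""
    q = 1.0
    for b in blocks:
        q *= (1.0 - b)
    return 1.0 - q

# Hypothetical numbers only: a flight-control string in series with
# a dual-redundant navigation sensor pair.
r_system = series(0.995, parallel(0.98, 0.98))
print(round(r_system, 6))
```

Nesting these two operators over the block diagram reproduces the system-level figure; modeling the crew as a "block" whose reliability degrades with mission duration is what drives the manual-versus-Autoland comparison in the abstract.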

  16. Photovoltaic module reliability workshop

    SciTech Connect

    Mrig, L.

    1990-01-01

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of Workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986 to 1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties on commercial modules of as long as 12 years. However, substantial research and testing are still required to improve module field reliability to levels of 30 years or more. Several small groups of researchers around the world are involved in this research, development, and monitoring activity. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience in this important field. The papers presented here reflect this effort.

  17. Evolving networks-Using past structure to predict the future

    NASA Astrophysics Data System (ADS)

    Shang, Ke-ke; Yan, Wei-sheng; Small, Michael

    2016-08-01

    Many previous studies on link prediction have focused on using common neighbors to predict the existence of links between pairs of nodes. More broadly, research into the structural properties of evolving temporal networks and temporal link prediction methods has recently attracted increasing attention. In this study, for the first time, we examine the use of links between a pair of nodes to predict their common neighbors, and we analyze the relationship between weight and structure in static networks, evolving networks, and the corresponding randomized networks. We propose new unweighted and weighted prediction methods and use six kinds of real networks to test our algorithms. In unweighted networks, we find that if a pair of nodes connect to each other in the current network, they have a higher probability of connecting to common nodes in both the current and future networks, and this probability decreases as the number of neighbors increases. Furthermore, we find that the original networks have particular structural and statistical characteristics that benefit link prediction. In weighted networks, the prediction performance in networks dominated by human factors decreases with decreasing weight and is in general better in static networks. Furthermore, we find that geographical position and link weight both have significant influence on the transport network. Moreover, the evolving financial network has the lowest predictability. In addition, we find that the structure of non-social networks is more robust than that of social networks. The structure of engineering networks has both the best predictability and the best robustness.
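    The common-neighbor structure at the heart of this family of link-prediction methods can be sketched in a few lines. The graph below is illustrative, not one of the study's six networks, and the score shown is the classic common-neighbors count rather than the authors' new methods.

```python
# Adjacency represented as a dict mapping each node to its set of neighbors.
def common_neighbors(adj, u, v):
    return adj[u] & adj[v]

def cn_score(adj, u, v):
    """Classic link-prediction score: number of shared neighbors of u and v."""
    return len(common_neighbors(adj, u, v))

adj = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b", "d"},
    "d": {"a", "c"},
}
print(cn_score(adj, "b", "d"))  # b and d share neighbors {a, c} -> 2
```

    The study inverts this logic: instead of using common neighbors to predict a link, it uses an existing link to predict future common neighbors.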

  18. Evolved gas analysis of secondary organic aerosols

    SciTech Connect

    Grosjean, D.; Williams, E.L. II; Grosjean, E.; Novakov, T.

    1994-11-01

    Secondary organic aerosols have been characterized by evolved gas analysis (EGA). Hydrocarbons selected as aerosol precursors were representative of anthropogenic emissions (cyclohexene, cyclopentene, 1-decene and 1-dodecene, n-dodecane, o-xylene, and 1,3,5-trimethylbenzene) and of biogenic emissions (the terpenes [alpha]-pinene, [beta]-pinene and d-limonene and the sesquiterpene trans-caryophyllene). Also analyzed by EGA were samples of secondary, primary (highway tunnel), and ambient (urban) aerosols before and after exposure to ozone and other photochemical oxidants. The major features of the EGA thermograms (amount of CO[sub 2] evolved as a function of temperature) are described. The usefulness and limitations of EGA data for source apportionment of atmospheric particulate carbon are briefly discussed. 28 refs., 7 figs., 4 tabs.

  19. Traceless protein splicing utilizing evolved split inteins

    PubMed Central

    Lockless, Steve W.; Muir, Tom W.

    2009-01-01

    Split inteins are parasitic genetic elements frequently found inserted into reading frames of essential proteins. Their association and excision restore host protein function through a protein self-splicing reaction. They have gained an increasingly important role in the chemical modification of proteins to create cyclical, segmentally labeled, and fluorescently tagged proteins. Ideally, inteins would seamlessly splice polypeptides together with no remnant sequences and at high efficiency. Here, we describe experiments that identify the branched intermediate, a transient step in the overall splicing reaction, as a key determinant of the splicing efficiency at different splice-site junctions. To alter intein specificity, we developed a cell-based selection scheme to evolve split inteins that splice with high efficiency at different splice junctions and at higher temperatures. Mutations within these evolved inteins occur at sites distant from the active site. We present a hypothesis that a network of conserved coevolving amino acids in inteins mediates these long-range effects. PMID:19541616

  20. Where the Wild Things Are: The Evolving Iconography of Rural Fauna

    ERIC Educational Resources Information Center

    Buller, Henry

    2004-01-01

    This paper explores the changing relationship between "nature" and rurality through an examination of the shifting iconography of animals, and particularly "wild" animals, in a rural setting. Drawing upon a set of examples, the paper argues that the faunistic icons of rural areas are evolving as alternative conceptions of the countryside, of…

  1. The evolving epidemiology of stone disease.

    PubMed

    Roudakova, Ksenia; Monga, Manoj

    2014-01-01

    The epidemiology of kidney stones is evolving: not only is the prevalence increasing, but the gender gap has also narrowed. What drives these changes? Diet, obesity, or environmental factors? This article will review the possible explanations for this shift in the epidemiology, with the hope of gaining a better understanding of the extent to which modifiable risk factors play a role in stone formation and what measures may be undertaken for disease prevention in view of these changing trends. PMID:24497682

  2. Nursing administration research: an evolving science.

    PubMed

    Murphy, Lyn Stankiewicz; Scott, Elaine S; Warshawsky, Nora E

    2014-12-01

    The nature and focus of nursing administrative research have evolved over time. Recently, the research agenda has primarily reflected the national health policy agenda. Although nursing research has traditionally been dominated by clinical interests, nursing administrative research has historically addressed the interface of reimbursement, quality, and care delivery systems. This article traces the evolution of nursing administrative research to answer questions relevant to scope, practice, and policy and suggests future directions. PMID:25393136

  3. Design Space Issues for Intrinsic Evolvable Hardware

    NASA Technical Reports Server (NTRS)

    Hereford, James; Gwaltney, David

    2004-01-01

    This paper discusses the problem of increased programming time for intrinsic evolvable hardware (EHW) as the complexity of the circuit grows. As the circuit becomes more complex, more components are required and a longer programming string, L, is needed. We develop equations for the size of the population, n, and the number of generations, ngen, required for the population to converge, based on L. Our analytical results show that even though the design search space grows as 2^L (assuming a binary programming string), the number of circuit evaluations, n*ngen, grows only as roughly O(L). This makes evolvable techniques a good tool for exploring large design spaces. The major hurdle for intrinsic EHW is the evaluation time for each possible circuit. The evaluation time involves downloading the bit string to the device, updating the device configuration, measuring the output, and then transferring the output data to the control processor. Each of these steps must be done for each member of the population. The processing time of the computer becomes negligible, since the selection/crossover/mutation steps are done only once per generation. Evaluation time presently limits intrinsic evolvable hardware techniques to designing only small or medium-sized circuits. To evolve large or complicated circuits, several researchers have proposed using hierarchical design or reuse techniques in which submodules are combined to form complex circuits. However, these practical approaches limit the search space of available designs and preclude utilizing parasitic coupling or other effects within the programmable device. The practical approaches also raise the issue of why intrinsic EHW techniques do not easily apply to large design spaces, since the analytical results show only an O(L) complexity growth.
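    The scaling argument above can be illustrated with a toy evolutionary loop. A one-max fitness stands in for the measured circuit response (an assumption; real intrinsic EHW evaluates physical hardware), and the bookkeeping shows that the evaluation count is n*ngen rather than anything like 2^L; all parameter values are illustrative.

```python
import random

def evolve(L=16, n=20, ngen=40, seed=1):
    """Evolve length-L binary strings; return best fitness and evaluation count."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(L)] for _ in range(n)]
    evaluations = 0
    for _ in range(ngen):
        scored = []
        for ind in pop:
            evaluations += 1          # one "hardware evaluation" per member
            scored.append((sum(ind), ind))
        scored.sort(reverse=True)
        parents = [ind for _, ind in scored[: n // 2]]
        pop = []
        while len(pop) < n:           # crossover + single-bit mutation
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, L)
            child = a[:cut] + b[cut:]
            child[rng.randrange(L)] ^= 1
            pop.append(child)
        pop[: len(parents)] = parents  # elitism: keep the best half
    best = max(sum(ind) for ind in pop)
    return best, evaluations

best, evals = evolve()
print(best, evals)  # evaluations = n * ngen = 800, versus 2**16 = 65536 designs
```

    Every member of every generation costs one device evaluation, which is exactly why per-circuit evaluation time, not search-space size, dominates intrinsic EHW runtime.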

  4. Quantum games on evolving random networks

    NASA Astrophysics Data System (ADS)

    Pawela, Łukasz

    2016-09-01

    We study the advantages of quantum strategies in evolutionary social dilemmas on evolving random networks. We focus our study on the two-player games: the prisoner's dilemma, snowdrift, and stag-hunt games. The obtained results show the benefits of quantum strategies for the prisoner's dilemma game. For the other two games, we obtain regions of parameters where the quantum strategies dominate, as well as regions where the classical strategies coexist.

  5. Chemical evolution of viscously evolving galactic discs

    NASA Technical Reports Server (NTRS)

    Clarke, Catherine J.

    1989-01-01

    The ability of the Lin-Pringle (1987) model of galactic disk formation to reproduce the observed radial distributions of total gas surface density and metals in disk galaxies is investigated. It is found that a satisfactory fit is obtained provided that there exists an outer cut-off to the star-forming disk beyond which gas is allowed to viscously evolve. The metallicity gradient is then established by radial inflow of gas from beyond this cut-off.

  6. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many Lean Six Sigma (L6S) tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of an L6S project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  7. Software reliability perspectives

    NASA Technical Reports Server (NTRS)

    Wilson, Larry; Shen, Wenhui

    1987-01-01

    Software that is used in life-critical functions must be known to be highly reliable before installation. This requires a strong testing program to estimate reliability, since neither formal methods, software engineering, nor fault-tolerant methods can guarantee perfection. Prior to final testing, software goes through a debugging period, and many models have been developed to try to estimate reliability from the debugging data. However, the existing models are poorly validated and often give poor performance. This paper emphasizes that part of their failure can be attributed to the random nature of the debugging data given to these models as input, and it poses the problem of correcting this defect as an area of future research.

  8. Continuous evaluation of evolving behavioral intervention technologies.

    PubMed

    Mohr, David C; Cheung, Ken; Schueller, Stephen M; Hendricks Brown, C; Duan, Naihua

    2013-10-01

    Behavioral intervention technologies (BITs) are web-based and mobile interventions intended to support patients and consumers in changing behaviors related to health, mental health, and well-being. BITs are provided to patients and consumers in clinical care settings and commercial marketplaces, frequently with little or no evaluation. Current evaluation methods, including RCTs and implementation studies, can require years to validate an intervention. This timeline is fundamentally incompatible with the BIT environment, where technology advancement and changes in consumer expectations occur quickly, necessitating rapidly evolving interventions. However, BITs can routinely and iteratively collect data in a planned and strategic manner and generate evidence through systematic prospective analyses, thereby creating a system that can "learn." A methodologic framework, Continuous Evaluation of Evolving Behavioral Intervention Technologies (CEEBIT), is proposed that can support the evaluation of multiple BITs or evolving versions, eliminating those that demonstrate poorer outcomes, while allowing new BITs to be entered at any time. CEEBIT could be used to ensure the effectiveness of BITs provided through deployment platforms in clinical care organizations or BIT marketplaces. The features of CEEBIT are described, including criteria for the determination of inferiority, determination of BIT inclusion, methods of assigning consumers to BITs, definition of outcomes, and evaluation of the usefulness of the system. CEEBIT offers the potential to collapse initial evaluation and postmarketing surveillance, providing ongoing assurance of safety and efficacy to patients and consumers, payers, and policymakers. PMID:24050429

  9. Transistor Level Circuit Experiments using Evolvable Hardware

    NASA Technical Reports Server (NTRS)

    Stoica, A.; Zebulum, R. S.; Keymeulen, D.; Ferguson, M. I.; Daud, Taher; Thakoor, A.

    2005-01-01

    The Jet Propulsion Laboratory (JPL) performs research in fault tolerant, long life, and space survivable electronics for the National Aeronautics and Space Administration (NASA). With that focus, JPL has been involved in Evolvable Hardware (EHW) technology research for the past several years. We have advanced the technology not only by simulation and evolution experiments, but also by designing, fabricating, and evolving a variety of transistor-based analog and digital circuits at the chip level. EHW refers to the self-configuration of electronic hardware by evolutionary/genetic search mechanisms, thereby maintaining existing functionality in the presence of degradation due to aging, temperature, and radiation. In addition, EHW has the capability to reconfigure itself for new functionality when required by mission changes or encountered opportunities. Evolution experiments are performed using a genetic algorithm running on a DSP as the reconfiguration mechanism, controlling the evolvable hardware mounted on a self-contained circuit board. Rapid reconfiguration allows convergence to circuit solutions on the order of seconds. The paper illustrates hardware evolution results for electronic circuits and their ability to perform at 230 C as well as under radiation doses of up to 250 kRad.

  10. Evolving hardware as model of enzyme evolution.

    PubMed

    Lahoz-Beltra, R

    2001-06-01

    Organism growth and survival is based on thousands of enzymes organized in networks. The motivation to understand how a large number of enzymes evolved so fast inside cells may be relevant to explaining the origin and maintenance of life on Earth. This paper presents electronic circuits called 'electronic enzymes' that model the catalytic function performed by biological enzymes. Electronic enzymes are the hardware realization of enzymes defined as molecular automata with a finite number of internal conformational states and a set of Boolean operators modelling the active groups of the active site. One of the main features of electronic enzymes is the possibility of evolution finding the proper active site by means of a genetic algorithm yielding a metabolic ring or k-cycle that bears a resemblance to Krebs (k=7) or Calvin (k=4) cycles present in organisms. The simulations are consistent with those results obtained in vitro evolving enzymes based on polymerase chain reaction (PCR) as well as with the general view that suggests the main role of recombination during enzyme evolution. The proposed methodology shows how molecular automata with evolvable features that model enzymes or other processing molecules provide an experimental framework for simulation of the principles governing metabolic pathways evolution and self-organization. PMID:11448522

  11. Continuous Evaluation of Evolving Behavioral Intervention Technologies

    PubMed Central

    Mohr, David C.; Cheung, Ken; Schueller, Stephen M.; Brown, C. Hendricks; Duan, Naihua

    2013-01-01

    Behavioral intervention technologies (BITs) are web-based and mobile interventions intended to support patients and consumers in changing behaviors related to health, mental health, and well-being. BITs are provided to patients and consumers in clinical care settings and commercial marketplaces, frequently with little or no evaluation. Current evaluation methods, including RCTs and implementation studies, can require years to validate an intervention. This timeline is fundamentally incompatible with the BIT environment, where technology advancement and changes in consumer expectations occur quickly, necessitating rapidly evolving interventions. However, BITs can routinely and iteratively collect data in a planned and strategic manner and generate evidence through systematic prospective analyses, thereby creating a system that can “learn.” A methodologic framework, Continuous Evaluation of Evolving Behavioral Intervention Technologies (CEEBIT), is proposed that can support the evaluation of multiple BITs or evolving versions, eliminating those that demonstrate poorer outcomes, while allowing new BITs to be entered at any time. CEEBIT could be used to ensure the effectiveness of BITs provided through deployment platforms in clinical care organizations or BIT marketplaces. The features of CEEBIT are described, including criteria for the determination of inferiority, determination of BIT inclusion, methods of assigning consumers to BITs, definition of outcomes, and evaluation of the usefulness of the system. CEEBIT offers the potential to collapse initial evaluation and postmarketing surveillance, providing ongoing assurance of safety and efficacy to patients and consumers, payers, and policymakers. PMID:24050429

  12. Evolving Stochastic Learning Algorithm based on Tsallis entropic index

    NASA Astrophysics Data System (ADS)

    Anastasiadis, A. D.; Magoulas, G. D.

    2006-03-01

    In this paper, inspired by our previous algorithm based on the theory of Tsallis statistical mechanics, we develop a new evolving stochastic learning algorithm for neural networks. The new algorithm combines deterministic and stochastic search steps by employing a different adaptive stepsize for each network weight, and applies a form of noise that is characterized by the nonextensive entropic index q, regulated by a weight decay term. The behavior of the learning algorithm can be made more stochastic or deterministic depending on the trade-off between the temperature T and the q values. This is achieved by introducing a formula that defines a time-dependent relationship between these two important learning parameters. Our experimental study verifies that there are indeed improvements in the convergence speed of this new evolving stochastic learning algorithm, which makes learning faster than the original Hybrid Learning Scheme (HLS). In addition, experiments are conducted to explore the influence of the entropic index q and temperature T on the convergence speed and stability of the proposed method.
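    The deterministic/stochastic trade-off described above can be sketched generically. The logarithmic cooling schedule and plain Gaussian noise below are assumed stand-ins chosen for illustration; they are not the paper's actual q-dependent (Tsallis) noise term or its weight-decay coupling.

```python
import math
import random

def noisy_step(w, grad, step, t, t0=1.0, seed=None):
    """One update: per-weight adaptive stepsize plus temperature-scaled noise.

    As temperature decays with iteration t, the update becomes more
    deterministic (pure gradient descent); at high temperature it is
    dominated by the stochastic term.
    """
    rng = random.Random(seed)
    temp = t0 / math.log(2 + t)  # assumed cooling schedule, not the paper's
    new_w = []
    for wi, gi, si in zip(w, grad, step):
        noise = rng.gauss(0.0, math.sqrt(temp))
        new_w.append(wi - si * gi + si * noise)
    return new_w

w = [0.5, -0.3]          # weights
g = [0.1, -0.2]          # gradients
s = [0.05, 0.02]         # per-weight adaptive stepsizes
print(noisy_step(w, g, s, t=0, seed=0))
```

    Replacing the Gaussian draw with a q-deformed distribution, and tying the schedule for temp to q as the abstract describes, would recover the paper's nonextensive variant.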

  13. Gearbox Reliability Collaborative Update (Presentation)

    SciTech Connect

    Sheng, S.

    2013-10-01

    This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.

  14. Materials reliability issues in microelectronics

    SciTech Connect

    Lloyd, J.R.; Yost, F.G.; Ho, P.S.

    1991-01-01

    This book covers the proceedings of a MRS symposium on materials reliability in microelectronics. Topics include: electromigration; stress effects on reliability; stress and packaging; metallization; device, oxide and dielectric reliability; new investigative techniques; and corrosion.

  15. Reliability Degradation Due to Stockpile Aging

    SciTech Connect

    Robinson, David G.

    1999-04-01

    The objective of this research is the investigation of alternative methods for characterizing the reliability of systems with time-dependent failure modes associated with stockpile aging. Reference to 'reliability degradation' has, unfortunately, come to be associated with all types of aging analyses, both deterministic and stochastic. In this research, in keeping with the true theoretical definition, reliability is defined as a probabilistic description of system performance as a function of time. Traditional reliability methods used to characterize stockpile reliability depend on the collection of a large number of samples or observations. Clearly, after the experiments have been performed and the data have been collected, critical performance problems can be identified. A major goal of this research is to identify existing methods and/or develop new mathematical techniques and computer analysis tools to anticipate stockpile problems before they become critical issues. One of the most popular methods for characterizing the reliability of components, particularly electronic components, assumes that failures occur in a completely random fashion, i.e., uniformly across time. This method is based primarily on the use of constant failure rates for the various elements that constitute the weapon system, i.e., the systems do not degrade while in storage. Experience has shown that predictions based upon this approach should be regarded with great skepticism, since the relationship between the predicted life and the observed life has been difficult to validate. In addition to this fundamental problem, the approach does not recognize that there are time-dependent material properties and variations associated with the manufacturing process and the operational environment. To appreciate the uncertainties in predicting system reliability, a number of alternative methods are explored in this report. All of the methods are very different from those currently used to assess stockpile reliability.
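    The contrast drawn above between constant failure rates and aging is the difference between an exponential reliability model (memoryless, no degradation in storage) and, for example, a Weibull model whose shape parameter beta > 1 gives an increasing hazard. A minimal sketch with purely illustrative parameters:

```python
import math

def r_exponential(t, lam):
    """Constant-failure-rate model: R(t) = exp(-lambda * t)."""
    return math.exp(-lam * t)

def r_weibull(t, eta, beta):
    """Weibull model: R(t) = exp(-(t/eta)**beta); beta > 1 models wear-out."""
    return math.exp(-((t / eta) ** beta))

t = 10.0
print(round(r_exponential(t, 0.02), 4))            # -> 0.8187, memoryless
print(round(r_weibull(t, eta=50.0, beta=2.5), 4))  # -> 0.9823, aging hazard
```

    Under the exponential model the hazard never changes, so a stored system never "ages"; the Weibull shape parameter is one standard way to encode the time-dependent degradation the report argues must be captured.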

  16. An Examination of the Reliability of Scores from Zuckerman's Sensation Seeking Scales, Form V.

    ERIC Educational Resources Information Center

    Deditius-Island, Heide K.; Caruso, John C.

    2002-01-01

    Conducted a reliability generalization study on Zuckerman's Sensation Seeking Scale (M. Zuckerman and others, 1964) using 113 reliability coefficients from 21 published studies. The reliability of scores was marginal for four of the five scales, and low for the other. Mean age of subjects has a significant relationship with score reliability. (SLD)

  17. The organization and control of an evolving interdependent population.

    PubMed

    Vural, Dervis C; Isakov, Alexander; Mahadevan, L

    2015-07-01

    Starting with Darwin, biologists have asked how populations evolve from a low fitness state that is evolutionarily stable to a high fitness state that is not. Specifically of interest is the emergence of cooperation and multicellularity where the fitness of individuals often appears in conflict with that of the population. Theories of social evolution and evolutionary game theory have produced a number of fruitful results employing two-state two-body frameworks. In this study, we depart from this tradition and instead consider a multi-player, multi-state evolutionary game, in which the fitness of an agent is determined by its relationship to an arbitrary number of other agents. We show that populations organize themselves in one of four distinct phases of interdependence depending on one parameter, selection strength. Some of these phases involve the formation of specialized large-scale structures. We then describe how the evolution of independence can be manipulated through various external perturbations. PMID:26040593

  18. The organization and control of an evolving interdependent population

    PubMed Central

    Vural, Dervis C.; Isakov, Alexander; Mahadevan, L.

    2015-01-01

    Starting with Darwin, biologists have asked how populations evolve from a low fitness state that is evolutionarily stable to a high fitness state that is not. Specifically of interest is the emergence of cooperation and multicellularity where the fitness of individuals often appears in conflict with that of the population. Theories of social evolution and evolutionary game theory have produced a number of fruitful results employing two-state two-body frameworks. In this study, we depart from this tradition and instead consider a multi-player, multi-state evolutionary game, in which the fitness of an agent is determined by its relationship to an arbitrary number of other agents. We show that populations organize themselves in one of four distinct phases of interdependence depending on one parameter, selection strength. Some of these phases involve the formation of specialized large-scale structures. We then describe how the evolution of independence can be manipulated through various external perturbations. PMID:26040593

  19. Wild Origins: The Evolving Nature of Animal Behavior

    NASA Astrophysics Data System (ADS)

    Flores, Ifigenia

    For billions of years, evolution has been the driving force behind the incredible range of biodiversity on our planet. Wild Origins is a concept plan for an exhibition at the National Zoo that uses case studies of animal behavior to explain the theory of evolution. Behaviors evolve, just as physical forms do. Understanding natural selection can help us interpret animal behavior and vice-versa. A living collection, digital media, interactives, fossils, and photographs will relay stories of social behavior, sex, navigation and migration, foraging, domestication, and relationships between different species. The informal learning opportunities visitors are offered at the zoo will create a connection with the exhibition's teaching points. Visitors will leave with an understanding and sense of wonder at the evolutionary view of life.

  20. Bioharness™ Multivariable Monitoring Device: Part. II: Reliability

    PubMed Central

    Johnstone, James A.; Ford, Paul A.; Hughes, Gerwyn; Watson, Tim; Garrett, Andrew T.

    2012-01-01

    The Bioharness™ monitoring system may provide physiological information on human performance, but the reliability of these data is fundamental for confidence in the equipment being used. The objective of this study was to assess the reliability of each of the 5 Bioharness™ variables using a treadmill-based protocol. 10 healthy males participated. A between- and within-subject design to assess the reliability of heart rate (HR), breathing frequency (BF), accelerometry (ACC) and infra-red skin temperature (ST) was completed via a repeated, discontinuous, incremental treadmill protocol. Posture (P) was assessed by a tilt table moved through 160°. Between-subject data reported low coefficients of variation (CV) and strong correlations (r) for ACC and P (CV < 7.6; r = 0.99, p < 0.01). In contrast, HR and BF (CV ~19.4; r ~0.70, p < 0.01) and ST (CV 3.7; r = 0.61, p < 0.01) presented more variable data. Intra- and inter-device data presented strong relationships (r > 0.89, p < 0.01) and low CV (<10.1) for HR, ACC, P and ST. BF produced weaker relationships (r < 0.72) and higher CV (<17.4). In comparison to the other variables, BF consistently presented less reliability. Global results suggest that the Bioharness™ is a reliable multivariable monitoring device during laboratory testing within the limits presented. Key points: Heart rate and breathing frequency data increased in variance at higher velocities (i.e. ≥ 10 km.h-1). In comparison to the between-subject testing, the intra- and inter-device testing presented good reliability, suggesting that placement or position of the device relative to the performer could be important for data collection. Understanding a device's variability in measurement is important before it can be used within an exercise testing or monitoring setting. PMID:24149347

  1. Designing reliability into accelerators

    SciTech Connect

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the "factories," reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: concept; design; motivation; management techniques; and fault diagnosis.

  2. Designing reliability into accelerators

    SciTech Connect

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the "factories," reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: concept; design; motivation; management techniques; and fault diagnosis.

  3. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models that try to predict the future performance of software based on data generated by the debugging process. Unfortunately, the models appear unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab-type situations, and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world; thus the research centered on verifying the need for replication and on methodologies for generating replicated data in a cost-effective manner. The concept of the debugging graph was pursued through simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data, which was then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens in AIR-LAB to measure the performance of reliability models.

  4. Software Reliability Measurement Experience

    NASA Technical Reports Server (NTRS)

    Nikora, A. P.

    1993-01-01

    In this chapter, we describe a recent study of software reliability measurement methods that was conducted at the Jet Propulsion Laboratory. The first section of the chapter, section 8.1, summarizes the study, characterizes the participating projects, describes the available data, and summarizes the study's results.

  5. Reliable solar cookers

    SciTech Connect

    Magney, G.K.

    1992-12-31

    The author describes the activities of SERVE, a Christian relief and development agency, to introduce solar ovens to the Afghan refugees in Pakistan. It has provided 5,000 solar cookers since 1984. The experience has demonstrated the potential of the technology and the need for a durable and reliable product. Common complaints about the cookers are discussed and the ideal cooker is described.

  6. Parametric Mass Reliability Study

    NASA Technical Reports Server (NTRS)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon board of PCBs, typically are the most reliable. Meanwhile, components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs and the mass of ORU subcomponents to reliability.

  7. Nonparametric Methods in Reliability

    PubMed Central

    Hollander, Myles; Peña, Edsel A.

    2005-01-01

    Probabilistic and statistical models for the occurrence of a recurrent event over time are described. These models have applicability in the reliability, engineering, biomedical and other areas where a series of events occurs for an experimental unit as time progresses. Nonparametric inference methods, in particular, the estimation of a relevant distribution function, are described. PMID:16710444
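
For recurrent-event data of the kind described, one standard nonparametric estimate is the mean cumulative function (MCF): the expected cumulative number of events per unit as time progresses. The sketch below is a minimal illustration; the sample histories and function name are hypothetical, not drawn from the article.

```python
def mean_cumulative_function(histories):
    """Nonparametric mean cumulative function (MCF) for recurrent events.

    histories: list of (event_times, censor_time) per unit, event_times sorted.
    At each distinct event time t, the MCF increments by
    (events observed at t) / (units still under observation at t).
    """
    times = sorted({t for events, _ in histories for t in events})
    mcf, value = [], 0.0
    for t in times:
        at_risk = sum(1 for _, c in histories if c >= t)
        d = sum(events.count(t) for events, _ in histories)
        if at_risk > 0:
            value += d / at_risk
        mcf.append((t, value))
    return mcf

# Three units, each observed until its own censoring time.
histories = [
    ([1.0, 3.0, 5.0], 6.0),
    ([2.0, 5.0], 5.5),
    ([4.0], 4.5),
]
for t, m in mean_cumulative_function(histories):
    print(t, round(m, 3))
```

The estimator needs no distributional assumptions; it only requires knowing, at each event time, how many units were still under observation.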

  8. Conceptualizing Essay Tests' Reliability and Validity: From Research to Theory

    ERIC Educational Resources Information Center

    Badjadi, Nour El Imane

    2013-01-01

    The current paper on writing assessment surveys the literature on the reliability and validity of essay tests. The paper aims to examine the two concepts in relationship with essay testing as well as to provide a snapshot of the current understandings of the reliability and validity of essay tests as drawn in recent research studies. Bearing in…

  9. The Mineralogy of Dust Around Evolved Stars

    NASA Astrophysics Data System (ADS)

    Speck, A. K.

    1998-11-01

    Infrared (IR) observations of evolved red giant stars (AGB stars) have shown that many are surrounded by dust envelopes, which are ejected into the interstellar medium and seed the next generation of stars and planets. By studying these one can understand the origins of interstellar and solar system materials. AGB stars fall into two main categories: oxygen-rich and carbon-rich. The prominent features of the IR spectra of AGB stars are: the 11.3microns feature of C-stars, attributed to silicon carbide (SiC); and the 9.7microns feature of O-rich stars, attributed to silicates. There are also various minor features with less secure identifications. Identifying dust around stars requires the use of laboratory spectra of dust species analogous to those one expects to observe. I have compiled a database of such spectra, and thereby constrained the identifications of circumstellar dust, which I have also tried to ensure are compatible with data from meteoritic presolar grains. Some laboratory spectra need to be modified before they are relevant to the problem in hand, i.e. stardust. The techniques used for such modifications are outlined in the thesis. In order to fully comprehend the problems that can arise from using laboratory spectra, the way in which light interacts with matter must be understood. To this end the optical properties of matter are discussed. While the mineral constituents of the Earth have been reprocessed so extensively that they no longer contain any evidence of their stellar origins, the same is not true of primitive meteorites which contain "presolar" dust grains with isotopic fingerprints identifying their stellar sources. By comparing these presolar grains with nucleosynthesis models, grains expected to form around various stars and observational evidence of dust, we can gain a better picture of the formation mechanisms and sites of the various dust grains. I have investigated the mineralogy of SiC of 32 C-stars and its relationship to

  10. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection

    NASA Astrophysics Data System (ADS)

    Janković, Srdja; Ćirković, Milan M.

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.

  11. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection.

    PubMed

    Janković, Srdja; Ćirković, Milan M

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection. PMID:26419865

  12. Information entropy as a measure of genetic diversity and evolvability in colonization.

    PubMed

    Day, Troy

    2015-05-01

    In recent years, several studies have examined the relationship between genetic diversity and establishment success in colonizing species. Many of these studies have shown that genetic diversity enhances establishment success. There are several hypotheses that might explain this pattern, and here I focus on the possibility that greater genetic diversity results in greater evolvability during colonization. Evaluating the importance of this mechanism first requires that we quantify evolvability. Currently, most measures of evolvability have been developed for quantitative traits whereas many studies of colonization success deal with discrete molecular markers or phenotypes. The purpose of this study is to derive a suitable measure of evolvability for such discrete data. I show that under certain assumptions, Shannon's information entropy of the allelic distribution provides a natural measure of evolvability. This helps to alleviate previous concerns about the interpretation of information entropy for genetic data. I also suggest that information entropy provides a natural generalization to previous measures of evolvability for quantitative traits when the trait distributions are not necessarily multivariate normal. PMID:25604806
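
The entropy measure proposed here is simple to compute for discrete marker data. A minimal sketch (the allele samples are hypothetical):

```python
import math
from collections import Counter

def allelic_entropy(alleles):
    """Shannon information entropy (in bits) of an allele sample,
    used as a diversity/evolvability index for discrete markers."""
    counts = Counter(alleles)
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A maximally even two-allele sample carries 1 bit; a fixed locus carries 0 bits.
print(allelic_entropy(["A", "a", "A", "a"]))
print(allelic_entropy(["A", "A", "A", "A"]))
```

Unlike allele counts alone, the entropy reflects both the number of alleles and the evenness of their frequencies, which is what makes it a natural candidate measure of evolvability for discrete data.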

  13. Survivability Is More Fundamental Than Evolvability

    PubMed Central

    Palmer, Michael E.; Feldman, Marcus W.

    2012-01-01

    For a lineage to survive over long time periods, it must sometimes change. This has given rise to the term evolvability, meaning the tendency to produce adaptive variation. One lineage may be superior to another in terms of its current standing variation, or it may tend to produce more adaptive variation. However, evolutionary outcomes depend on more than standing variation and produced adaptive variation: deleterious variation also matters. Evolvability, as most commonly interpreted, is not predictive of evolutionary outcomes. Here, we define a predictive measure of the evolutionary success of a lineage that we call the k-survivability, defined as the probability that the lineage avoids extinction for k generations. We estimate the k-survivability using multiple experimental replicates. Because we measure evolutionary outcomes, the initial standing variation, the full spectrum of generated variation, and the heritability of that variation are all incorporated. Survivability also accounts for the decreased joint likelihood of extinction of sub-lineages when they 1) disperse in space, or 2) diversify in lifestyle. We illustrate measurement of survivability with in silico models, and suggest that it may also be measured in vivo using multiple longitudinal replicates. The k-survivability is a metric that enables the quantitative study of, for example, the evolution of 1) mutation rates, 2) dispersal mechanisms, 3) the genotype-phenotype map, and 4) sexual reproduction, in temporally and spatially fluctuating environments. Although these disparate phenomena evolve by well-understood microevolutionary rules, they are also subject to the macroevolutionary constraint of long-term survivability. PMID:22723844
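
The definition of k-survivability lends itself to direct Monte Carlo estimation from replicates. The in silico sketch below uses a toy branching-process lineage model; the offspring law and all parameter values are assumptions for illustration, not the authors' models.

```python
import random

def k_survivability(k, replicates, offspring, n0=1, seed=0):
    """Monte Carlo estimate of k-survivability: the probability that a
    lineage (here a toy branching process) avoids extinction for k
    generations, estimated over independent replicates.

    offspring: function(rng) -> offspring count of one individual.
    """
    rng = random.Random(seed)
    survived = 0
    for _ in range(replicates):
        n = n0
        for _ in range(k):
            n = sum(offspring(rng) for _ in range(n))
            if n == 0:
                break
        if n > 0:
            survived += 1
    return survived / replicates

# Each individual leaves 0, 1, or 2 offspring with equal probability (mean 1).
p = k_survivability(k=10, replicates=2000, offspring=lambda r: r.randrange(3))
print(p)
```

Because each replicate runs the full generative process, the estimate automatically incorporates standing variation, generated variation, and its heritability, exactly as the authors argue a predictive measure must.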

  14. Production and decay of evolving horizons

    NASA Astrophysics Data System (ADS)

    Nielsen, Alex B.; Visser, Matt

    2006-07-01

    We consider a simple physical model for an evolving horizon that is strongly interacting with its environment, exchanging arbitrarily large quantities of matter with its environment in the form of both infalling material and outgoing Hawking radiation. We permit fluxes of both lightlike and timelike particles to cross the horizon, and ask how the horizon grows and shrinks in response to such flows. We place a premium on providing a clear and straightforward exposition with simple formulae. To be able to handle such a highly dynamical situation in a simple manner we make one significant physical restriction, that of spherical symmetry, and two technical mathematical restrictions: (1) we choose to slice the spacetime in such a way that the spacetime foliations (and hence the horizons) are always spherically symmetric; (2) furthermore, we adopt Painlevé-Gullstrand coordinates (which are well suited to the problem because they are nonsingular at the horizon) in order to simplify the relevant calculations. Of course physics results are ultimately independent of the choice of coordinates, but this particular coordinate system yields a clean physical interpretation of the relevant physics. We find particularly simple forms for surface gravity, and for the first and second law of black hole thermodynamics, in this general evolving horizon situation. Furthermore, we relate our results to Hawking's apparent horizon, Ashtekar and co-workers' isolated and dynamical horizons, and Hayward's trapping horizon. The evolving black hole model discussed here will be of interest, both from an astrophysical viewpoint in terms of discussing growing black holes and from a purely theoretical viewpoint in discussing black hole evaporation via Hawking radiation.
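
For reference, the Painlevé-Gullstrand slicing of a spherically symmetric geometry with a time-dependent mass function m(t, r) takes the standard textbook form (quoted as background, not from the article itself):

```latex
ds^2 = -\left(1 - \frac{2m(t,r)}{r}\right) dt^2
       + 2\sqrt{\frac{2m(t,r)}{r}}\, dt\, dr
       + dr^2 + r^2\, d\Omega^2
```

Only the dt^2 coefficient vanishes at r = 2m(t,r), so the coordinates remain regular across the horizon, which is why they suit the highly dynamical situation described above.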

  15. Evolvable circuit with transistor-level reconfigurability

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian (Inventor); Salazar-Lazaro, Carlos Harold (Inventor)

    2004-01-01

    An evolvable circuit includes a plurality of reconfigurable switches, a plurality of transistors within a region of the circuit, the plurality of transistors having terminals, the plurality of transistors being coupled between a power source terminal and a power sink terminal so as to be capable of admitting power between the power source terminal and the power sink terminal, the plurality of transistors being coupled so that every transistor terminal to transistor terminal coupling within the region of the circuit comprises a reconfigurable switch.

  16. Earth As an Evolving Planetary System

    NASA Astrophysics Data System (ADS)

    Meert, Joseph G.

    2005-05-01

    ``System'' is an overused buzzword in textbooks covering geological sciences. Describing the Earth as a system of component parts is a reasonable concept, but providing a comprehensive framework for detailing the system is a more formidable task. Kent Condie lays out the systems approach in an easy-to-read introductory chapter in Earth as an Evolving Planetary System. In the book, Condie makes a valiant attempt at taking the mélange of diverse subjects in the solid Earth sciences and weaving them into a coherent tapestry.

  17. Mobile computing acceptance grows as applications evolve.

    PubMed

    Porn, Louis M; Patrick, Kelly

    2002-01-01

    Handheld devices are becoming more cost-effective to own, and their use in healthcare environments is increasing. Handheld devices currently are being used for e-prescribing, charge capture, and accessing daily schedules and reference tools. Future applications may include education on medications, dictation, order entry, and test-results reporting. Selecting the right handheld device requires careful analysis of current and future applications, as well as vendor expertise. It is important to recognize the technology will continue to evolve over the next three years. PMID:11806321

  18. Evolving Black Holes with Wavy Initial Data

    NASA Astrophysics Data System (ADS)

    Kelly, Bernard; Tichy, Wolfgang; Zlochower, Yosef; Campanelli, Manuela; Whiting, Bernard

    2009-05-01

    In Kelly et al. [Phys. Rev. D v. 76, 024008 (2007)], we presented new binary black-hole initial data adapted to puncture evolutions in numerical relativity. This data satisfies the constraint equations to 2.5 post-Newtonian order, and contains a transverse-traceless ``wavy'' metric contribution, violating the standard assumption of conformal flatness. We report on progress in evolving this data with a modern moving-puncture implementation of the BSSN equations in several numerical codes. We will discuss the effect of the new metric terms on junk radiation and continuity of physical radiation extracted.

  19. Present weather and climate: evolving conditions

    USGS Publications Warehouse

    Hoerling, Martin P; Dettinger, Michael; Wolter, Klaus; Lukas, Jeff; Eischeid, Jon K.; Nemani, Rama; Liebmann, Brant; Kunkel, Kenneth E.

    2013-01-01

    This chapter assesses weather and climate variability and trends in the Southwest, using observed climate and paleoclimate records. It analyzes the last 100 years of climate variability in comparison to the last 1,000 years, and links the important features of evolving climate conditions to river flow variability in four of the region’s major drainage basins. The chapter closes with an assessment of the monitoring and scientific research needed to increase confidence in understanding when climate episodes, events, and phenomena are attributable to human-caused climate change.

  20. Evolving techniques for gastrointestinal endoscopic hemostasis treatment.

    PubMed

    Ghassemi, Kevin A; Jensen, Dennis M

    2016-05-01

    With mortality due to gastrointestinal (GI) bleeding remaining stable, the focus on endoscopic hemostasis has been on improving other outcomes such as rebleeding rate, need for transfusions, and need for angiographic embolization or surgery. Over the past few years, a number of devices have emerged to help endoscopically assess and treat bleeding GI lesions. These include the Doppler endoscopic probe, hemostatic powder, and over-the-scope clip. Also, new applications have been described for radiofrequency ablation. In this article, we will discuss these evolving tools and techniques that have been developed, including an analysis of their efficacy and limitations. PMID:26651414

  1. Scar State on Time-evolving Wavepacket

    NASA Astrophysics Data System (ADS)

    Tomiya, Mitsuyoshi; Tsuyuki, Hiroyoshi; Kawamura, Kentaro; Sakamoto, Shoichi; Heller, Eric J.

    2015-09-01

    The scar-like enhancement is found in the accumulation of the time-evolving wavepacket in the stadium billiard. It appears close to unstable periodic orbits, when the wavepackets are launched along the orbits. The enhancement is essentially due to the same mechanism as the well-known scar states in stationary eigenstates. The weighted spectral function reveals that the enhancement is the pileup of contributions from scar states on the same periodic orbit. The availability of the weighted spectrum to the semiclassical approximation is also discussed.

  2. Reliability-based design optimization under stationary stochastic process loads

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Du, Xiaoping

    2016-08-01

    Time-dependent reliability-based design ensures the satisfaction of reliability requirements for a given period of time, but with a high computational cost. This work improves the computational efficiency by extending the sequential optimization and reliability analysis (SORA) method to time-dependent problems with both stationary stochastic process loads and random variables. The challenge of the extension is the identification of the most probable point (MPP) associated with time-dependent reliability targets. Since a direct relationship between the MPP and reliability target does not exist, this work defines the concept of equivalent MPP, which is identified by the extreme value analysis and the inverse saddlepoint approximation. With the equivalent MPP, the time-dependent reliability-based design optimization is decomposed into two decoupled loops: deterministic design optimization and reliability analysis, and both are performed sequentially. Two numerical examples are used to show the efficiency of the proposed method.
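
The computational cost that SORA-type decoupling avoids is easiest to see against the brute-force alternative: sampling the stationary process directly and checking the limit state over the whole time horizon. The crude sketch below does exactly that; the limit state, distributions, and correlation length are invented for illustration and are not the paper's examples.

```python
import math
import random

def time_dependent_pf(n_sims=4000, horizon=100, dt=1.0, seed=1):
    """Brute-force Monte Carlo for a time-dependent limit state g = R - S(t):
    R is a random capacity, S(t) = 4 + s(t) with s(t) a stationary
    unit-variance Gaussian process simulated as an AR(1) discretization.
    Failure means g < 0 at any time step within the horizon.
    """
    rng = random.Random(seed)
    rho = math.exp(-dt / 10.0)          # one-step autocorrelation
    fails = 0
    for _ in range(n_sims):
        R = rng.gauss(8.0, 1.0)         # random capacity
        s = rng.gauss(0.0, 1.0)         # stationary start
        failed = R - (4.0 + s) < 0.0
        for _ in range(horizon - 1):
            s = rho * s + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
            if R - (4.0 + s) < 0.0:
                failed = True
                break
        fails += failed
    return fails / n_sims

pf = time_dependent_pf()
print(pf)
```

Every design candidate evaluated this way costs n_sims full process realizations; MPP-based decoupled loops such as the extended SORA replace this inner sampling with a single equivalent-MPP search per cycle, which is the efficiency gain the paper targets.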

  3. Space Shuttle Propulsion System Reliability

    NASA Technical Reports Server (NTRS)

    Welzyn, Ken; VanHooser, Katherine; Moore, Dennis; Wood, David

    2011-01-01

    This session includes the following presentations: (1) External Tank (ET) System Reliability and Lessons, (2) Space Shuttle Main Engine (SSME), Reliability Validated by a Million Seconds of Testing, (3) Reusable Solid Rocket Motor (RSRM) Reliability via Process Control, and (4) Solid Rocket Booster (SRB) Reliability via Acceptance and Testing.

  4. Reliable broadcast protocols

    NASA Technical Reports Server (NTRS)

    Joseph, T. A.; Birman, Kenneth P.

    1989-01-01

    A number of broadcast protocols that are reliable subject to a variety of ordering and delivery guarantees are considered. Developing applications that are distributed over a number of sites and/or must tolerate the failures of some of them becomes a considerably simpler task when such protocols are available for communication. Without such protocols the kinds of distributed applications that can reasonably be built will have a very limited scope. As the trend towards distribution and decentralization continues, it will not be surprising if reliable broadcast protocols have the same role in distributed operating systems of the future that message passing mechanisms have in the operating systems of today. On the other hand, the problems of engineering such a system remain large. For example, deciding which protocol is the most appropriate to use in a certain situation or how to balance the latency-communication-storage costs is not an easy question.

  5. Data networks reliability

    NASA Astrophysics Data System (ADS)

    Gallager, Robert G.

    1988-10-01

    The research from 1984 to 1986 on Data Network Reliability had the objective of developing general principles governing the reliable and efficient control of data networks. The research was centered around three major areas: congestion control, multiaccess networks, and distributed asynchronous algorithms. The major topics within congestion control were the use of flow control to reduce congestion and the use of routing to reduce congestion. The major topics within multiaccess networks were the communication properties of multiaccess channels, collision resolution, and packet radio networks. The major topics within asynchronous distributed algorithms were failure recovery, time vs. communication tradeoffs, and the general theory of distributed algorithms.

  6. Human Reliability Program Workshop

    SciTech Connect

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat to include HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  7. Reliability of photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1986-01-01

    In order to assess the reliability of photovoltaic modules, four categories of known array failure and degradation mechanisms are discussed, and target reliability allocations have been developed within each category based on the available technology and the life-cycle-cost requirements of future large-scale terrestrial applications. Cell-level failure mechanisms associated with open-circuiting or short-circuiting of individual solar cells generally arise from cell cracking or the fatigue of cell-to-cell interconnects. Power degradation mechanisms considered include gradual power loss in cells, light-induced effects, and module optical degradation. Module-level failure mechanisms and life-limiting wear-out mechanisms are also explored.
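
Allocation exercises of this kind rest on the series-system identity: if the failure and degradation categories act independently, module reliability is the product of the category reliabilities, so a system target can be apportioned across categories. The sketch below uses equal apportionment with a 0.95 target; both are illustrative assumptions, not the study's numbers.

```python
def series_reliability(rs):
    """Reliability of independent elements in series: the product of
    the element reliabilities."""
    r = 1.0
    for x in rs:
        r *= x
    return r

def equal_apportionment(r_target, n):
    """Allocate a system reliability target equally over n series elements."""
    return r_target ** (1.0 / n)

# Hypothetical module: a 0.95 target spread over four failure/degradation
# categories, e.g. cell failures, power degradation, module failures, wear-out.
r_cat = equal_apportionment(0.95, 4)
print(round(r_cat, 4))                        # per-category allocation
print(round(series_reliability([r_cat] * 4), 4))  # recovers the system target
```

In practice allocations are weighted by the relative difficulty and cost of improving each category rather than split equally, but the multiplicative structure is the same.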

  8. Initial value sensitivity of the Chinese stock market and its relationship with the investment psychology

    NASA Astrophysics Data System (ADS)

    Ying, Shangjun; Li, Xiaojun; Zhong, Xiuqin

    2015-04-01

    This paper discusses the initial value sensitivity (IVS) of the Chinese stock market, including the single stock market and the Chinese A-share stock market, with respect to real markets and evolving models. The aim is to explore the relationship between the IVS of the Chinese A-share stock market and investment psychology, based on the evolving model of a genetic cellular automaton (GCA). We find: (1) The Chinese stock market is sensitively dependent on the initial conditions. (2) The GCA model provides considerable reliability in complexity simulation (e.g. the IVS). (3) The IVS of the stock market is positively correlated with the imitation probability when the intensity of the imitation psychology reaches a certain threshold. The paper suggests that the government should seek to keep the imitation psychology under a certain level, otherwise it may induce severe fluctuation in the market.

  9. Compact, Reliable EEPROM Controller

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Kleyner, Igor

    2010-01-01

    A compact, reliable controller for an electrically erasable, programmable read-only memory (EEPROM) has been developed specifically for a space-flight application. The design may be adaptable to other applications in which there are requirements for reliability in general and, in particular, for prevention of inadvertent writing of data in EEPROM cells. Inadvertent writes pose risks of loss of reliability in the original space-flight application and could pose such risks in other applications. Prior EEPROM controllers are large and complex and do not provide all reasonable protections (in many cases, few or no protections) against inadvertent writes. In contrast, the present controller provides several layers of protection against inadvertent writes. The controller also incorporates a write-time monitor, enabling determination of trends in the performance of an EEPROM through all phases of testing. The controller has been designed as an integral subsystem of a system that includes not only the controller and the controlled EEPROM aboard a spacecraft but also computers in a ground control station, relatively simple onboard support circuitry, and an onboard communication subsystem that utilizes the MIL-STD-1553B protocol. (MIL-STD-1553B is a military standard that encompasses a method of communication and electrical-interface requirements for digital electronic subsystems connected to a data bus. MIL-STD-1553B is commonly used in defense and space applications.) The intent was to maximize reliability while minimizing the size and complexity of onboard circuitry. In operation, control of the EEPROM is effected via the ground computers, the MIL-STD-1553B communication subsystem, and the onboard support circuitry, all of which, in combination, provide the multiple layers of protection against inadvertent writes. There is no controller software, unlike in many prior EEPROM controllers; software can be a major contributor to unreliability, particularly in fault
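
The layered-protection idea can be illustrated with a small model. The specific layers below (a global write-enable, a two-byte unlock sequence, an address window) are hypothetical stand-ins chosen for illustration; they are not the actual flight design, and no relation to the MIL-STD-1553B command set is implied.

```python
class EepromGuard:
    """Illustrative multi-layer write guard (hypothetical design):
    a write succeeds only if (1) a global write-enable is set, (2) the
    unlock command sequence immediately preceded it, and (3) the address
    lies inside the writable window. Each unlock permits a single write."""

    UNLOCK_SEQUENCE = (0xAA, 0x55)

    def __init__(self, writable=range(0x100, 0x200)):
        self.write_enable = False
        self.unlocked = False
        self.writable = writable
        self._seq = []

    def command(self, byte):
        # Layer 2: track the last two command bytes; unlock only on an
        # exact match of the required sequence.
        self._seq = (self._seq + [byte])[-2:]
        self.unlocked = tuple(self._seq) == self.UNLOCK_SEQUENCE

    def write(self, addr, value, memory):
        ok = self.write_enable and self.unlocked and addr in self.writable
        self.unlocked = False      # re-arm the lock after every attempt
        self._seq = []
        if ok:
            memory[addr] = value
        return ok

mem = {}
g = EepromGuard()
g.command(0xAA); g.command(0x55)
print(g.write(0x150, 7, mem))   # rejected: write_enable still False
g.write_enable = True
g.command(0xAA); g.command(0x55)
print(g.write(0x150, 7, mem))   # accepted: all three layers satisfied
```

The point of layering is that a single upset (a stray command byte, a flipped enable bit, a corrupted address) defeats at most one layer, so an inadvertent write requires several independent failures to coincide.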

  10. Spacecraft transmitter reliability

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A workshop on spacecraft transmitter reliability was held at the NASA Lewis Research Center on September 25 and 26, 1979, to discuss present knowledge and to plan future research areas. Since formal papers were not submitted, this synopsis was derived from audio tapes of the workshop. The following subjects were covered: users' experience with space transmitters; cathodes; power supplies and interfaces; and specifications and quality assurance. A panel discussion ended the workshop.

  11. Reliability and testing

    NASA Technical Reports Server (NTRS)

    Auer, Werner

    1996-01-01

    Reliability and its interdependence with testing are important topics for the development and manufacturing of successful products. This generally accepted fact is not only a technical statement, but must also be seen in the light of 'Human Factors.' While the background for this paper is the experience gained with electromechanical/electronic space products, including control and system considerations, it is believed that the content could also be of interest for other fields.

  12. Shaping the outflows of evolved stars

    NASA Astrophysics Data System (ADS)

    Mohamed, Shazrene

    2015-08-01

    Both hot and cool evolved stars, e.g., red (super)giants and Wolf-Rayet stars, lose copious amounts of mass, momentum and mechanical energy through powerful, dense stellar winds. The interaction of these outflows with their surroundings results in highly structured and complex circumstellar environments, often featuring knots, arcs, shells and spirals. Recent improvements in computational power and techniques have led to the development of detailed, multi-dimensional simulations that have given new insight into the origin of these structures, and better understanding of the physical mechanisms driving their formation. In this talk, I will discuss three of the main mechanisms that shape the outflows of evolved stars:
    - interaction with the interstellar medium (ISM), i.e., wind-ISM interactions;
    - interaction with a stellar wind, either from a previous phase of evolution or the wind from a companion star, i.e., wind-wind interactions;
    - interaction with a companion star that has a weak or insignificant outflow (e.g., a compact companion such as a neutron star or black hole), i.e., wind-companion interactions.
    I will also highlight the broader implications and impact of these stellar wind interactions for other phenomena, e.g., for symbiotic and X-ray binaries, supernovae and Gamma-ray bursts.

  13. Caterpillars evolved from onychophorans by hybridogenesis.

    PubMed

    Williamson, Donald I

    2009-11-24

    I reject the Darwinian assumption that larvae and their adults evolved from a single common ancestor. Rather I posit that, in animals that metamorphose, the basic types of larvae originated as adults of different lineages, i.e., larvae were transferred when, through hybridization, their genomes were acquired by distantly related animals. "Caterpillars," the name for eruciforms with thoracic and abdominal legs, are larvae of lepidopterans, hymenopterans, and mecopterans (scorpionflies). Grubs and maggots, including the larvae of beetles, bees, and flies, evolved from caterpillars by loss of legs. Caterpillar larval organs are dismantled and reconstructed in the pupal phase. Such indirect developmental patterns (metamorphoses) did not originate solely by accumulation of random mutations followed by natural selection; rather they are fully consistent with my concept of evolution by hybridogenesis. Members of the phylum Onychophora (velvet worms) are proposed as the evolutionary source of caterpillars and their grub or maggot descendants. I present a molecular biological research proposal to test my thesis. By my hypothesis 2 recognizable sets of genes are detectable in the genomes of all insects with caterpillar grub- or maggot-like larvae: (i) onychophoran genes that code for proteins determining larval morphology/physiology and (ii) sequentially expressed insect genes that code for adult proteins. The genomes of insects and other animals that, by contrast, entirely lack larvae comprise recognizable sets of genes from single animal common ancestors. PMID:19717430

  14. Early formation of evolved asteroidal crust.

    PubMed

    Day, James M D; Ash, Richard D; Liu, Yang; Bellucci, Jeremy J; Rumble, Douglas; McDonough, William F; Walker, Richard J; Taylor, Lawrence A

    2009-01-01

    Mechanisms for the formation of crust on planetary bodies remain poorly understood. It is generally accepted that Earth's andesitic continental crust is the product of plate tectonics, whereas the Moon acquired its feldspar-rich crust by way of plagioclase flotation in a magma ocean. Basaltic meteorites provide evidence that, like the terrestrial planets, some asteroids generated crust and underwent large-scale differentiation processes. Until now, however, no evolved felsic asteroidal crust has been sampled or observed. Here we report age and compositional data for the newly discovered, paired and differentiated meteorites Graves Nunatak (GRA) 06128 and GRA 06129. These meteorites are feldspar-rich, with andesite bulk compositions. Their age of 4.52 +/- 0.06 Gyr demonstrates formation early in Solar System history. The isotopic and elemental compositions, degree of metamorphic re-equilibration and sulphide-rich nature of the meteorites are most consistent with an origin as partial melts from a volatile-rich, oxidized asteroid. GRA 06128 and 06129 are the result of a newly recognized style of evolved crust formation, bearing witness to incomplete differentiation of their parent asteroid and to previously unrecognized diversity of early-formed materials in the Solar System. PMID:19129845

  15. Have plants evolved to self-immolate?

    PubMed Central

    Bowman, David M. J. S.; French, Ben J.; Prior, Lynda D.

    2014-01-01

    By definition, fire-prone ecosystems have highly combustible plants, leading to the hypothesis, first formally stated by Mutch in 1970, that community flammability is the product of natural selection of flammable traits. However, proving the “Mutch hypothesis” has presented an enormous challenge for fire ecologists given the difficulty in establishing cause and effect between landscape fire and flammable plant traits. Individual plant traits (such as leaf moisture content, retention of dead branches and foliage, and oil-rich foliage) are known to affect the flammability of plants, but there is no evidence these characters evolved specifically to self-immolate, although some of these traits may have been secondarily modified to increase the propensity to burn. Demonstrating individual benefits from self-immolation is extraordinarily difficult, given the intersection of the physical environmental factors that control landscape fire (fuel production, dryness and ignitions) with community flammability properties that emerge from numerous traits of multiple species (canopy cover and litter bed bulk density). It is more parsimonious to conclude plants have evolved mechanisms to tolerate, but not promote, landscape fire. PMID:25414710

  16. Netgram: Visualizing Communities in Evolving Networks

    PubMed Central

    Mall, Raghvendra; Langone, Rocco; Suykens, Johan A. K.

    2015-01-01

    Real-world complex networks are dynamic in nature and change over time. The change is usually observed in the interactions within the network over time. Complex networks exhibit community-like structures. A key feature of the dynamics of complex networks is the evolution of communities over time. Several methods have been proposed to detect and track the evolution of these groups over time. However, there is no generic tool which visualizes all the aspects of group evolution in dynamic networks including birth, death, splitting, merging, expansion, shrinkage and continuation of groups. In this paper, we propose Netgram: a tool for visualizing evolution of communities in time-evolving graphs. Netgram maintains evolution of communities over 2 consecutive time-stamps in tables which are used to create a query database using the SQL outer-join operation. It uses a line-based visualization technique which adheres to certain design principles and aesthetic guidelines. Netgram uses a greedy solution to order the initial community information provided by the evolutionary clustering technique such that we have fewer line cross-overs in the visualization. This makes it easier to track the progress of individual communities in time evolving graphs. Netgram is a generic toolkit which can be used with any evolutionary community detection algorithm as illustrated in our experiments. We use Netgram for visualization of topic evolution in the NIPS conference over a period of 11 years and observe the emergence and merging of several disciplines in the field of information processing systems. PMID:26356538
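
    The core bookkeeping the abstract describes can be sketched in a few lines. This is a minimal illustration, not Netgram's actual schema (the node IDs and community labels below are invented): membership tables for two consecutive time-stamps are combined with an SQL-style outer join, so that community birth, death, merging and continuation become visible as patterns of missing and repeated labels.

```python
import pandas as pd

# Hypothetical community membership at two consecutive time-stamps
# (node -> community label); not data from the paper.
t1 = pd.DataFrame({"node": [1, 2, 3, 4], "comm_t1": ["A", "A", "B", "B"]})
t2 = pd.DataFrame({"node": [1, 2, 3, 5], "comm_t2": ["A", "A", "A", "C"]})

# The outer join keeps nodes seen at either time-stamp: a missing
# label on the left marks a birth, on the right a death.
evo = t1.merge(t2, on="node", how="outer")

# Count label-to-label transitions: a many-to-one pattern indicates
# merging, a one-to-many pattern indicates splitting.
transitions = evo.groupby(["comm_t1", "comm_t2"], dropna=False).size()
print(transitions)
```

    Here community B partially merges into A (node 3), node 4 leaves the network, and node 5 arrives in a newly born community C; chaining such joined tables across all time-stamps yields the line diagram Netgram draws.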

  17. Evolving NASA's Earth Science Data Systems

    NASA Astrophysics Data System (ADS)

    Walter, J.; Behnke, J.; Murphy, K. J.; Lowe, D. R.

    2013-12-01

    NASA's Earth Science Data and Information System Project (ESDIS) is charged with managing, maintaining, and evolving NASA's Earth Observing System Data and Information System (EOSDIS) and is responsible for processing, archiving, and distributing NASA Earth science data. The system supports a multitude of missions and serves diverse science research and other user communities. Keeping up with ever-changing information technology and figuring out how to leverage those changes across such a large system in order to continuously improve and meet the needs of a diverse user community is a significant challenge. Maintaining and evolving the system architecture and infrastructure is a continuous and multi-layered effort. It requires a balance between a "top down" management paradigm that provides a coherent system view and maintaining the managerial, technological, and functional independence of the individual system elements. This presentation will describe some of the key elements of the current system architecture, some of the strategies and processes we employ to meet these challenges, current and future challenges, and some ideas for meeting those challenges.

  18. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  19. General Aviation Aircraft Reliability Study

    NASA Technical Reports Server (NTRS)

    Pettit, Duane; Turnbull, Andrew; Roelant, Henk A. (Technical Monitor)

    2001-01-01

    This reliability study was performed in order to provide the aviation community with an estimate of Complex General Aviation (GA) Aircraft System reliability. To successfully improve the safety and reliability for the next generation of GA aircraft, a study of current GA aircraft attributes was prudent. This was accomplished by benchmarking the reliability of operational Complex GA Aircraft Systems. Specifically, Complex GA Aircraft System reliability was estimated using data obtained from the logbooks of a random sample of the Complex GA Aircraft population.
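
    The abstract does not state its estimation method, but a common way to benchmark system reliability from logbook records is to compute the mean time between failures (MTBF) from accumulated flight hours and recorded failures, then apply an exponential failure model. The numbers below are illustrative assumptions, not the study's data.

```python
import math

# Hypothetical logbook totals for a sampled fleet (not from the study).
total_flight_hours = 12_000.0
recorded_failures = 30

mtbf = total_flight_hours / recorded_failures  # mean time between failures

def mission_reliability(mission_hours: float) -> float:
    """P(no failure during a mission), assuming an exponential failure model."""
    return math.exp(-mission_hours / mtbf)

print(f"MTBF: {mtbf:.0f} h")
print(f"R(2 h mission): {mission_reliability(2.0):.4f}")
```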

  20. Capturing the Interpersonal Implications of Evolved Preferences? Frequency of Sex Shapes Automatic, but Not Explicit, Partner Evaluations.

    PubMed

    Hicks, Lindsey L; McNulty, James K; Meltzer, Andrea L; Olson, Michael A

    2016-06-01

    A strong predisposition to engage in sexual intercourse likely evolved in humans because sex is crucial to reproduction. Given that meeting interpersonal preferences tends to promote positive relationship evaluations, sex within a relationship should be positively associated with relationship satisfaction. Nevertheless, prior research has been inconclusive in demonstrating such a link, with longitudinal and experimental studies showing no association between sexual frequency and relationship satisfaction. Crucially, though, all prior research has utilized explicit reports of satisfaction, which reflect deliberative processes that may override the more automatic implications of phylogenetically older evolved preferences. Accordingly, capturing the implications of sexual frequency for relationship evaluations may require implicit measurements that bypass deliberative reasoning. Consistent with this idea, one cross-sectional and one 3-year study of newlywed couples revealed a positive association between sexual frequency and automatic partner evaluations but not explicit satisfaction. These findings highlight the importance of automatic measurements to understanding interpersonal relationships. PMID:27084851

  1. Interrelation Between Safety Factors and Reliability

    NASA Technical Reports Server (NTRS)

    Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    An evaluation was performed to establish the relationship between safety factors and reliability. Results obtained show that the use of safety factors is not contradictory to the employment of probabilistic methods. In many cases the safety factors can be directly expressed by the required reliability levels. However, there is a major difference that must be emphasized: whereas safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. The establishment of the interrelation between the concepts opens an avenue to specify safety factors based on reliability. In cases where there are several forms of failure, the allocation of safety factors should be based on having the same reliability associated with each failure mode. This immediately suggests that by the probabilistic methods the existing over-design or under-design can be eliminated. The report includes three parts: Part 1 - Random Actual Stress and Deterministic Yield Stress; Part 2 - Deterministic Actual Stress and Random Yield Stress; Part 3 - Both Actual Stress and Yield Stress Are Random.
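
    The report's third case (both actual stress and yield stress random) can be illustrated with the classic stress-strength interference model: for independent normal yield stress Y and actual stress S, reliability is R = P(Y > S) = Φ((μ_Y − μ_S)/√(σ_Y² + σ_S²)), so the central safety factor μ_Y/μ_S maps directly to a reliability level. The sketch below uses illustrative numbers, not values from the report.

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def reliability(mu_y: float, sigma_y: float, mu_s: float, sigma_s: float) -> float:
    """R = P(Y > S) for independent normal yield stress Y and actual stress S."""
    z = (mu_y - mu_s) / math.hypot(sigma_y, sigma_s)
    return normal_cdf(z)

# Illustrative design point (not from the report): mean yield 400 MPa,
# mean stress 250 MPa, 10% coefficient of variation on each.
mu_y, mu_s = 400.0, 250.0
R = reliability(mu_y, 0.1 * mu_y, mu_s, 0.1 * mu_s)
safety_factor = mu_y / mu_s  # central safety factor implied by the design
print(f"central safety factor = {safety_factor:.2f}, reliability = {R:.5f}")
```

    Inverting this relation is what lets a required reliability level prescribe the safety factor, rather than allocating the factor ad hoc.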

  2. Crystalline-silicon reliability lessons for thin-film modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1985-01-01

    The reliability of crystalline silicon modules has been brought to a high level with lifetimes approaching 20 years, and excellent industry credibility and user satisfaction. The transition from crystalline modules to thin film modules is comparable to the transition from discrete transistors to integrated circuits. New cell materials and monolithic structures will require new device processing techniques, but the package function and design will evolve to a lesser extent. Although there will be new encapsulants optimized to take advantage of the mechanical flexibility and low temperature processing features of thin films, the reliability and life degradation stresses and mechanisms will remain mostly unchanged. Key reliability technologies in common between crystalline and thin film modules include hot spot heating, galvanic and electrochemical corrosion, hail impact stresses, glass breakage, mechanical fatigue, photothermal degradation of encapsulants, operating temperature, moisture sorption, circuit design strategies, product safety issues, and the process required to achieve a reliable product from a laboratory prototype.

  3. Renal cell carcinoma: Evolving and emerging subtypes.

    PubMed

    Crumley, Suzanne M; Divatia, Mukul; Truong, Luan; Shen, Steven; Ayala, Alberto G; Ro, Jae Y

    2013-12-16

    Our knowledge of renal cell carcinoma (RCC) is rapidly expanding. For those who diagnose and treat RCC, it is important to understand the new developments. In recent years, many new renal tumors have been described and defined, and our understanding of the biology and clinical correlates of these tumors is changing. Evolving concepts in Xp11 translocation carcinoma, mucinous tubular and spindle cell carcinoma, multilocular cystic clear cell RCC, and carcinoma associated with neuroblastoma are addressed within this review. Tubulocystic carcinoma, thyroid-like follicular carcinoma of kidney, acquired cystic disease-associated RCC, and clear cell papillary RCC are also described. Finally, candidate entities, including RCC with t(6;11) translocation, hybrid oncocytoma/chromophobe RCC, hereditary leiomyomatosis and RCC syndrome, and renal angiomyoadenomatous tumor are reviewed. Knowledge of these new entities is important for diagnosis, treatment and subsequent prognosis. This review provides a targeted summary of new developments in RCC. PMID:24364021

  4. The evolving classification of renal cell neoplasia.

    PubMed

    Delahunt, Brett; Srigley, John R

    2015-03-01

    The classification of renal cell neoplasia is morphologically based; however, this has evolved over the last 35 years with the incorporation of genetic characteristics into the diagnostic features of some tumors. The 2013 Vancouver classification recognized 17 morphotypes of renal parenchymal malignancy and two benign tumors. This classification included the newly established entities tubulocystic renal cell carcinoma (RCC), acquired cystic disease-associated RCC, clear cell (tubulo) papillary RCC, microphthalmia transcription factor family translocation RCC and hereditary leiomyomatosis RCC syndrome-associated RCC. In addition to these newly described forms of RCC there are a number of novel tumors that are currently recognized as emerging entities. These are likely to be incorporated into subsequent classifications and include thyroid-like follicular RCC, succinate dehydrogenase B mutation-associated RCC, ALK translocation RCC, tuberous sclerosis complex-associated RCC, and RCC with (angio) leiomyomatous stroma. PMID:25753529

  5. Evolving unipolar memristor spiking neural networks

    NASA Astrophysics Data System (ADS)

    Howard, David; Bull, Larry; De Lacy Costello, Ben

    2015-10-01

    Neuromorphic computing - brain-like computing in hardware - typically requires myriad complementary metal oxide semiconductor spiking neurons interconnected by a dense mesh of nanoscale plastic synapses. Memristors are frequently cited as strong synapse candidates due to their statefulness and potential for low-power implementations. To date, plentiful research has focused on the bipolar memristor synapse, which is capable of incremental weight alterations and can provide adaptive self-organisation under a Hebbian learning scheme. In this paper, we consider the unipolar memristor synapse - a device capable of non-Hebbian switching between only two states (conductive and resistive) through application of a suitable input voltage - and discuss its suitability for neuromorphic systems. A self-adaptive evolutionary process is used to autonomously find highly fit network configurations. Experimentation on two robotics tasks shows that unipolar memristor networks evolve task-solving controllers faster than both bipolar memristor networks and networks containing constant non-plastic connections whilst performing at least comparably.

  6. Resiliently evolving supply-demand networks

    NASA Astrophysics Data System (ADS)

    Rubido, Nicolás; Grebogi, Celso; Baptista, Murilo S.

    2014-01-01

    The ability to design a transport network such that commodities are brought from suppliers to consumers in a steady, optimal, and stable way is of great importance for distribution systems nowadays. In this work, by using the circuit laws of Kirchhoff and Ohm, we provide the exact capacities of the edges that an optimal supply-demand network should have to operate stably under perturbations, i.e., without overloading. The perturbations we consider are the evolution of the connecting topology, the decentralization of hub sources or sinks, and the intermittence of supplier and consumer characteristics. We analyze these conditions and the impact of our results, both on the current United Kingdom power-grid structure and on numerically generated evolving archetypal network topologies.

  7. Finch: A System for Evolving Java (Bytecode)

    NASA Astrophysics Data System (ADS)

    Orlov, Michael; Sipper, Moshe

    The established approach in genetic programming (GP) involves the definition of functions and terminals appropriate to the problem at hand, after which evolution of expressions using these definitions takes place. We have recently developed a system, dubbed FINCH (Fertile Darwinian Bytecode Harvester), to evolutionarily improve actual, extant software, which was not intentionally written for the purpose of serving as a GP representation in particular, nor for evolution in general. This is in contrast to existing work that uses restricted subsets of the Java bytecode instruction set as a representation language for individuals in genetic programming. The ability to evolve Java programs will hopefully lead to a valuable new tool in the software engineer's toolkit.

  8. Pulmonary Sporotrichosis: An Evolving Clinical Paradigm.

    PubMed

    Aung, Ar K; Spelman, Denis W; Thompson, Philip J

    2015-10-01

    In recent decades, sporotrichosis, caused by the thermally dimorphic fungi of the Sporothrix schenckii complex, has become an emerging infection in many parts of the world. Pulmonary infection with S. schenckii still remains relatively uncommon, possibly due to underrecognition. Pulmonary sporotrichosis presents with distinct clinical and radiological patterns in both immunocompetent and immunocompromised hosts and can often result in significant morbidity and mortality despite treatment. Current understanding regarding S. schenckii biology, epidemiology, immunopathology, clinical diagnostics, and treatment options has been evolving in recent years with increased availability of molecular sequencing techniques. However, this changing knowledge has not yet been fully translated into a better understanding of the clinical aspects of pulmonary sporotrichosis, and as such, current management guidelines remain unsupported by high-level clinical evidence. This article examines recent advances in the knowledge of sporotrichosis and its application to the difficult challenges of managing pulmonary sporotrichosis. PMID:26398541

  9. Synchronization in evolving snowdrift game model

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Wu, L.; Zhu, S. Q.

    2009-06-01

    The interaction between the evolution of the game and the underlying network structure in an evolving snowdrift game model is investigated. The constructed network follows a power-law degree distribution, typically showing a scale-free feature. The topological features of average path length, clustering coefficient, degree-degree correlations and the dynamical feature of synchronizability are studied. The synchronizability of the constructed networks is changed by this interaction, converging to a certain value when sufficient new nodes are added. It is found that the initial payoffs of nodes greatly affect the synchronizability. When initial payoffs for players are equal, low common initial payoffs may lead to more heterogeneity of the network and good synchronizability. When initial payoffs follow certain distributions, better synchronizability is obtained compared to equal initial payoffs. The result is also true for phase synchronization of nonidentical oscillators.

  10. Observations of the Dust Around Evolved Stars

    NASA Astrophysics Data System (ADS)

    Walker, H. J.; Heinrichsen, I.; Richards, P. J.

    ISOPHOT has been used to obtain low resolution spectra from 2.5µm to 5µm and 5.8µm to 11.6µm and multi-aperture photometry at 60µm of several evolved stars: oxygen-rich and carbon-rich (including the peculiar carbon-rich stars R CrB and RY Sgr). R CrB was observed early in the ISO mission, 3 weeks after it had been at minimum light. Another spectrum was obtained several months later. The second spectrum shows that the broad plateau (from around 6µm to 8µm) is still present but the flux density has declined from 60Jy to 50Jy. The spectrum for RY Sgr shows the same type of plateau. The multi-aperture data suggest that the dust shells are resolved around R CrB, RY Sgr, Y CVn and RS Lib.

  11. Evolving colon injury management: a review.

    PubMed

    Greer, Lauren T; Gillern, Suzanne M; Vertrees, Amy E

    2013-02-01

    The colon is the second most commonly injured intra-abdominal organ in penetrating trauma. Management of traumatic colon injuries has evolved significantly over the past 200 years. Traumatic colon injuries can have a wide spectrum of severity, presentation, and management options. There is strong evidence that most non-destructive colon injuries can be successfully managed with primary repair or primary anastomosis. The management of destructive colon injuries remains controversial, with most favoring resection with primary anastomosis and others favoring colonic diversion in specific circumstances. The historical management of traumatic colon injuries, common mechanisms of injury, demographics, presentation, assessment, diagnosis, management, and complications of traumatic colon injuries both in civilian and military practice are reviewed. The damage control revolution has added another layer of complexity to management with continued controversy. PMID:23336650

  12. Properties of evolving e-mail networks

    NASA Astrophysics Data System (ADS)

    Wang, Juan; de Wilde, Philippe

    2004-12-01

    Computer viruses spread by attaching to an e-mail message and sending themselves to users whose addresses are in the e-mail address book of the recipients. Here we investigate a simple model of an evolving e-mail network, with nodes as e-mail address books of users and links as the records of e-mail addresses in the address books. Within specific periods, some new links are generated and some old links are deleted. We study the statistical properties of this e-mail network and observe the effect of the evolution on the structure of the network. We also find that the balance between the generation procedure and deletion procedure is dependent on different parameters of the model.
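
    A model of this kind can be sketched in a few lines. The parameter values and the exact update rule below are illustrative assumptions, not those of the paper: at each step a randomly chosen user adds a random new address to their book with one probability and deletes an existing one with another, and the balance between the two rates sets the typical out-degree of the evolving network.

```python
import random

random.seed(1)
N = 200                  # users, each holding an e-mail address book
p_add, p_del = 0.6, 0.3  # per-step link generation / deletion probabilities
books = {u: set() for u in range(N)}  # directed links: u -> addresses in u's book

for step in range(10_000):
    u = random.randrange(N)
    if random.random() < p_add:          # record a new address
        v = random.randrange(N)
        if v != u:
            books[u].add(v)
    if random.random() < p_del and books[u]:  # delete an old record
        books[u].discard(random.choice(sorted(books[u])))

mean_out_degree = sum(len(b) for b in books.values()) / N
print(f"mean out-degree after evolution: {mean_out_degree:.1f}")
```

    With generation outpacing deletion, address books grow until duplicate additions become common; tilting the two probabilities the other way shrinks the network, which is the balance effect the abstract describes.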

  13. Regulatory mechanisms link phenotypic plasticity to evolvability.

    PubMed

    van Gestel, Jordi; Weissing, Franz J

    2016-01-01

    Organisms have a remarkable capacity to respond to environmental change. They can either respond directly, by means of phenotypic plasticity, or they can slowly adapt through evolution. Yet, how phenotypic plasticity links to evolutionary adaptability is largely unknown. Current studies of plasticity tend to adopt a phenomenological reaction norm (RN) approach, which neglects the mechanisms underlying plasticity. Focusing on a concrete question - the optimal timing of bacterial sporulation - we here also consider a mechanistic approach, the evolution of a gene regulatory network (GRN) underlying plasticity. Using individual-based simulations, we compare the RN and GRN approach and find a number of striking differences. Most importantly, the GRN model results in a much higher diversity of responsive strategies than the RN model. We show that each of the evolved strategies is pre-adapted to a unique set of unseen environmental conditions. The regulatory mechanisms that control plasticity therefore critically link phenotypic plasticity to the adaptive potential of biological populations. PMID:27087393

  14. Structural phase transition in evolving networks.

    PubMed

    Kim, Sang-Woo; Noh, Jae Dong

    2009-08-01

    A network as a substrate for dynamic processes may have its own dynamics. We propose a model for networks which evolve together with diffusing particles through a coupled dynamics and investigate the emerging structural properties. The model consists of an undirected weighted network of fixed mean degree and randomly diffusing particles of fixed density. The weight w of an edge increases by the amount of traffic through its connecting nodes or decreases by a constant factor. Edges are removed with the probability P(rew)=1/(1+w) and replaced by new ones having w=0 at random locations. We find that the model exhibits a structural phase transition between the homogeneous phase characterized by an exponentially decaying degree distribution and the heterogeneous phase characterized by the presence of hubs. The hubs emerge as a consequence of a positive feedback between the particle and the edge dynamics. PMID:19792212
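
    The coupled dynamics can be sketched directly from the abstract's rules: particles diffuse, traffic through a node reinforces its edges, weights decay by a constant factor, and each edge is removed with probability 1/(1+w) and replaced at a random location with w=0. All parameter values below (network size, particle count, decay factor) are illustrative assumptions, not those of the paper.

```python
import random

random.seed(7)
N, M, P = 100, 200, 50  # nodes, edges (count is conserved), diffusing particles
decay = 0.98            # constant multiplicative weight decrease per step

edges = {}              # (u, v) with u < v  ->  weight w
while len(edges) < M:
    u, v = random.randrange(N), random.randrange(N)
    if u != v:
        edges[(min(u, v), max(u, v))] = 0.0

particles = [random.randrange(N) for _ in range(P)]

def neighbors(n):
    return [v if u == n else u for (u, v) in edges if n in (u, v)]

for step in range(500):
    # Diffusion: each particle hops to a random neighbor.
    visits = [0] * N
    for i, n in enumerate(particles):
        nbrs = neighbors(n)
        if nbrs:
            particles[i] = random.choice(nbrs)
            visits[particles[i]] += 1
    # Reinforcement and decay: traffic through an edge's end nodes
    # increases its weight; all weights decay by a constant factor.
    for (u, v) in edges:
        edges[(u, v)] = decay * edges[(u, v)] + visits[u] + visits[v]
    # Rewiring: weak edges are removed with probability 1/(1+w) and
    # replaced by fresh edges with w = 0 at random locations.
    for e in list(edges):
        if random.random() < 1.0 / (1.0 + edges[e]):
            del edges[e]
            while True:
                a, b = random.randrange(N), random.randrange(N)
                key = (min(a, b), max(a, b))
                if a != b and key not in edges:
                    edges[key] = 0.0
                    break

degree = [0] * N
for (u, v) in edges:
    degree[u] += 1
    degree[v] += 1
print("max degree:", max(degree), "mean degree:", 2 * M / N)
```

    The positive feedback is visible in the final degree sequence: well-connected nodes attract more traffic, their edges gain weight and survive rewiring, so a max degree well above the fixed mean signals the heterogeneous, hub-dominated phase.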

  15. Modelling of the Evolving Stable Boundary Layer

    NASA Astrophysics Data System (ADS)

    Sorbjan, Zbigniew

    2014-06-01

    A single-column model of the evolving stable boundary layer (SBL) is tested for self-similar properties of the flow and effects of ambient forcing. The turbulence closure of the model is diagnostic, based on the K-theory approach, with a semi-empirical form of the mixing length, and empirical stability functions of the Richardson number. The model results, expressed in terms of local similarity scales, are universal functions, satisfied in the entire SBL. Based on similarity expression, a realizability condition is derived for the minimum allowable turbulent heat flux in the SBL. Numerical experiments show that the development of "horse-shoe" shaped, fixed-elevation hodographs in the interior of the SBL around sunrise is controlled by effects imposed by surface thermal forcing.

  16. The Evolving Theory of Evolutionary Radiations.

    PubMed

    Simões, M; Breitkreuz, L; Alvarado, M; Baca, S; Cooper, J C; Heins, L; Herzog, K; Lieberman, B S

    2016-01-01

    Evolutionary radiations have intrigued biologists for more than 100 years, and our understanding of the patterns and processes associated with these radiations continues to grow and evolve. Recently it has been recognized that there are many different types of evolutionary radiation beyond the well-studied adaptive radiations. We focus here on multifarious types of evolutionary radiations, paying special attention to the abiotic factors that might trigger diversification in clades. We integrate concepts such as exaptation, species selection, coevolution, and the turnover-pulse hypothesis (TPH) into the theoretical framework of evolutionary radiations. We also discuss other phenomena that are related to, but distinct from, evolutionary radiations that have relevance for evolutionary biology. PMID:26632984

  17. Renal cell carcinoma: Evolving and emerging subtypes

    PubMed Central

    Crumley, Suzanne M; Divatia, Mukul; Truong, Luan; Shen, Steven; Ayala, Alberto G; Ro, Jae Y

    2013-01-01

    Our knowledge of renal cell carcinoma (RCC) is rapidly expanding. For those who diagnose and treat RCC, it is important to understand the new developments. In recent years, many new renal tumors have been described and defined, and our understanding of the biology and clinical correlates of these tumors is changing. Evolving concepts in Xp11 translocation carcinoma, mucinous tubular and spindle cell carcinoma, multilocular cystic clear cell RCC, and carcinoma associated with neuroblastoma are addressed within this review. Tubulocystic carcinoma, thyroid-like follicular carcinoma of kidney, acquired cystic disease-associated RCC, and clear cell papillary RCC are also described. Finally, candidate entities, including RCC with t(6;11) translocation, hybrid oncocytoma/chromophobe RCC, hereditary leiomyomatosis and RCC syndrome, and renal angiomyoadenomatous tumor are reviewed. Knowledge of these new entities is important for diagnosis, treatment and subsequent prognosis. This review provides a targeted summary of new developments in RCC. PMID:24364021

  18. Microbial communities evolve faster in extreme environments

    PubMed Central

    Li, Sheng-Jin; Hua, Zheng-Shuang; Huang, Li-Nan; Li, Jie; Shi, Su-Hua; Chen, Lin-Xing; Kuang, Jia-Liang; Liu, Jun; Hu, Min; Shu, Wen-Sheng

    2014-01-01

    Evolutionary analysis of microbes at the community level represents a new research avenue linking ecological patterns to evolutionary processes, but remains insufficiently studied. Here we report a relative evolutionary rates (rERs) analysis of microbial communities from six diverse natural environments based on 40 metagenomic samples. We show that the rERs of microbial communities are mainly shaped by environmental conditions, and the microbes inhabiting extreme habitats (acid mine drainage, saline lake and hot spring) evolve faster than those populating benign environments (surface ocean, fresh water and soil). These findings were supported by the observation of more relaxed purifying selection and potentially frequent horizontal gene transfers in communities from extreme habitats. The mechanism of high rERs was proposed as high mutation rates imposed by stressful conditions during the evolutionary processes. This study brings us one stage closer to an understanding of the evolutionary mechanisms underlying the adaptation of microbes to extreme environments. PMID:25158668

  19. Life cycle planning: An evolving concept

    SciTech Connect

    Moore, P.J.R.; Gorman, I.G.

    1994-12-31

    Life-cycle planning is an evolving concept in the management of oil and gas projects. BHP Petroleum now interprets this idea to include all development planning from discovery and field appraisal to final abandonment, encompassing safety, environmental, technical, plant, regulatory, and staffing issues. This article describes, in the context of the Timor Sea, how, despite initial successes and continuing facilities upgrades, BHPP came to perceive that current operations could be the victim of early development successes, particularly in the areas of corrosion and maintenance. The search for analogies elsewhere led to the UK North Sea, including the experiences of Britoil and BP, both of which performed detailed Life of Field studies in the late eighties. These materials have been used to construct a format and content for total life-cycle plans in general, and the social changes required to ensure their successful application in Timor Sea operations and deployment throughout Australia.

  20. Evolving spiking networks with variable resistive memories.

    PubMed

    Howard, Gerard; Bull, Larry; de Lacy Costello, Ben; Gale, Ella; Adamatzky, Andrew

    2014-01-01

    Neuromorphic computing is a brainlike information processing paradigm that requires adaptive learning mechanisms. A spiking neuro-evolutionary system is used for this purpose; plastic resistive memories are implemented as synapses in spiking neural networks. The evolutionary design process exploits parameter self-adaptation and allows the topology and synaptic weights to be evolved for each network in an autonomous manner. Variable resistive memories are the focus of this research; each synapse has its own conductance profile which modifies the plastic behaviour of the device and may be altered during evolution. These variable resistive networks are evaluated on a noisy robotic dynamic-reward scenario against two static resistive memories and a system containing standard connections only. The results indicate that the extra behavioural degrees of freedom available to the networks incorporating variable resistive memories enable them to outperform the comparative synapse types. PMID:23614774

  1. Ultimately Reliable Pyrotechnic Systems

    NASA Technical Reports Server (NTRS)

    Scott, John H.; Hinkel, Todd

    2015-01-01

    This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high-reliability operation in extreme environments and illustrates the potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the more successful in the history of human spaceflight. No pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably in temperatures as low as -420 F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, with never a failure. The recent landing on Mars of the Curiosity rover fired 174 of NASA's pyrotechnic initiators to complete the famous '7 minutes of terror.' Even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world. Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing

  2. CR reliability testing

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Rill, Lynn; Frost, Meryll M.; Staab, Edward V.

    1998-07-01

    The purpose of this work was to develop a method for systematically testing the reliability of a CR system under realistic daily loads in a non-clinical environment prior to its clinical adoption. Once digital imaging replaces film, it will be very difficult to revert should the digital system become unreliable. Prior to the beginning of the test, a formal evaluation was performed to set the benchmarks for performance and functionality. A formal protocol was established that included all 62 imaging plates in the inventory for each 24-hour period in the study. Imaging plates were exposed using different combinations of collimation, orientation, and SID. Anthropomorphic phantoms were used to acquire images of different sizes. Each combination was chosen randomly to simulate the differences that could occur in clinical practice. The tests were performed over a wide range of times with batches of plates processed to simulate the temporal constraints required by the nature of portable radiographs taken in the Intensive Care Unit (ICU). Current patient demographics were used for the test studies so automatic routing algorithms could be tested. During the test, only three minor reliability problems occurred, two of which were not directly related to the CR unit. One plate was discovered to cause a segmentation error that essentially reduced the image to only black and white with no gray levels. This plate was removed from the inventory to be replaced. Another problem was a PACS routing problem that occurred when the DICOM server with which the CR was communicating had a problem with disk space. The final problem was a network printing failure to the laser cameras. Although the units passed the reliability test, problems with interfacing to workstations were discovered. The two issues that were identified were the interpretation of what constitutes a study for CR and the construction of the look-up table for a proper gray scale display.

  3. Evolved mechanisms in depression: the role and interaction of attachment and social rank in depression.

    PubMed

    Sloman, L; Gilbert, P; Hasey, G

    2003-04-01

    Evolved mechanisms underpinning attachment and social rank behavior may be the basis for some forms of major depression, especially those associated with chronic stress. We note the heterogeneity of depression, but suggest that some of its core symptoms, such as behavioral withdrawal, low self-esteem and anhedonia, may have evolved in order to regulate behavior and mood and convey sensitivity to threats and safety. Focusing on the evolved mental mechanisms for attachment and social rank helps to make sense of (1) depression's common early vulnerability factors (e.g., attachment disruptions, neglect and abuse), (2) the triggering events (e.g., loss of close relationships, being defeated and/or trapped in low socially rewarding or hostile environments), and (3) the psychological preoccupations of depressed people (e.g., sense of unlovableness, self as inferior and a failure). This focus offers clues as to how these two systems interact and on how to intervene. PMID:12706512

  4. Reliable VLSI sequential controllers

    NASA Technical Reports Server (NTRS)

    Whitaker, S.; Maki, G.; Shamanna, M.

    1990-01-01

    A VLSI architecture for synchronous sequential controllers is presented that has attractive qualities for producing reliable circuits. In these circuits, one hardware implementation can realize any flow table with a maximum of 2^n internal states and m inputs, and all design equations are identical. A real-time fault detection method is presented along with a strategy for verifying the correctness of the checking hardware. This self-check feature can be employed with no increase in hardware. The architecture can be modified to achieve fail-safe designs. With no increase in hardware, an adaptable circuit can be realized that allows replacement of faulty transitions with fault-free transitions.

  5. Ferrite logic reliability study

    NASA Technical Reports Server (NTRS)

    Baer, J. A.; Clark, C. B.

    1973-01-01

    Development and use of digital circuits called all-magnetic logic are reported. In these circuits the magnetic elements and their windings comprise the active circuit devices in the logic portion of a system. The ferrite logic device belongs to the all-magnetic class of logic circuits. The FLO device is novel in that it makes use of a dual or bimaterial ferrite composition in one physical ceramic body. This bimaterial feature, coupled with its potential for relatively high speed operation, makes it attractive for high reliability applications. (Maximum speed of operation approximately 50 kHz.)

  6. Fault Tree Reliability Analysis and Design-for-reliability

    Energy Science and Technology Software Center (ESTSC)

    1998-05-05

    WinR provides a fault tree analysis capability for performing systems reliability and design-for-reliability analyses. The package includes capabilities for sensitivity and uncertainty analysis, field failure data analysis, and optimization.

  7. How evolved psychological mechanisms empower cultural group selection.

    PubMed

    Henrich, Joseph; Boyd, Robert

    2016-01-01

    Driven by intergroup competition, social norms, beliefs, and practices can evolve in ways that more effectively tap into a wide variety of evolved psychological mechanisms to foster group-beneficial behavior. The more powerful such evolved mechanisms are, the more effectively culture can potentially harness and manipulate them to generate greater phenotypic variation across groups, thereby fueling cultural group selection. PMID:27561383

  8. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1990-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.
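    The pass/fail criterion can be sketched numerically. The resistance-per-unit-length calibration, the sample values, and the 5% threshold below are illustrative assumptions, not figures from the patent:

```python
# Illustrative sketch of the step-coverage check described above: the
# serpentine conductor's "measured length" is derived from its end-to-end
# resistance using the resistance per unit length calibrated on the
# uniform-height test conductors; step-induced high resistance inflates
# the apparent length beyond the design length.

def measured_length(serpentine_resistance_ohm, ohms_per_mm):
    """Infer conductor length from measured end-to-end resistance."""
    return serpentine_resistance_ohm / ohms_per_mm

def passes_reliability(serpentine_resistance_ohm, ohms_per_mm,
                       design_length_mm, max_excess_fraction=0.05):
    """Accept only if the inferred length exceeds the design length by no
    more than the allowed fraction (the threshold depends on the required
    reliability of the integrated circuit)."""
    inferred = measured_length(serpentine_resistance_ohm, ohms_per_mm)
    return (inferred - design_length_mm) / design_length_mm <= max_excess_fraction

# Example: 10 mm design length, 2 ohm/mm calibration.
print(passes_reliability(20.8, 2.0, 10.0))  # 4% excess -> True
print(passes_reliability(22.4, 2.0, 10.0))  # 12% excess -> False
```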

  9. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1988-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.

  10. Load Control System Reliability

    SciTech Connect

    Trudnowski, Daniel

    2015-04-03

    This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech in April 2006. Follow-on DOE awards and expansions to the project scope occurred in August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also received matching funds from the states of Montana and Wyoming. Project participants included Montana Tech, the University of Wyoming, Montana State University, NorthWestern Energy, Inc., and MSE. Research focused on two areas: real-time power-system load control methodologies, and power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on the second area. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  11. Understanding the Elements of Operational Reliability: A Key for Achieving High Reliability

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    2010-01-01

    This viewgraph presentation reviews operational reliability and its role in achieving high reliability through design and process reliability. The topics include: 1) Reliability Engineering Major Areas and interfaces; 2) Design Reliability; 3) Process Reliability; and 4) Reliability Applications.

  12. Sustaining an International Partnership: An Evolving Collaboration

    ERIC Educational Resources Information Center

    Pierson, Melinda R.; Myck-Wayne, Janice; Stang, Kristin K.; Basinska, Anna

    2015-01-01

    Universities across the United States have an increasing interest in international education. Increasing global awareness through educational collaborations will promote greater cross-cultural understanding and build effective relationships with diverse communities. This paper documents one university's effort to build an effective international…

  13. Fatigue Reliability of Gas Turbine Engine Structures

    NASA Technical Reports Server (NTRS)

    Cruse, Thomas A.; Mahadevan, Sankaran; Tryon, Robert G.

    1997-01-01

    The results of an investigation of fatigue reliability in engine structures are described. The description consists of two parts: Part 1 covers method development, and Part 2 is a specific case study. In Part 1, the essential concepts and practical approaches to damage tolerance design in the gas turbine industry are summarized. These have evolved over the years in response to flight safety certification requirements. The effect of Non-Destructive Evaluation (NDE) methods on these approaches is also reviewed. Assessment methods based on probabilistic fracture mechanics, with regard to both crack initiation and crack growth, are outlined. Limit state modeling techniques from structural reliability theory are shown to be appropriate for application to this problem, for both individual failure mode and system-level assessment. In Part 2, the results of a case study for the high pressure turbine of a turboprop engine are described. The response surface approach is used to construct a fatigue performance function. This performance function is used with the First Order Reliability Method (FORM) to determine the probability of failure and the sensitivity of the fatigue life to the engine parameters for the first stage disk rim of the two stage turbine. A hybrid combination of regression and Monte Carlo simulation is used to incorporate time-dependent random variables. System reliability is used to determine the system probability of failure and the sensitivity of the system fatigue life to the engine parameters of the high pressure turbine. The variation in the primary hot gas and secondary cooling air, the uncertainty of the complex mission loading, and the scatter in the material data are considered.
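    The limit-state idea behind FORM and Monte Carlo reliability assessment can be sketched in a few lines. This is an illustrative toy, not the paper's engine model; the lognormal capacity and normal demand parameters below are hypothetical choices for demonstration only:

```python
# Illustrative limit-state sketch: failure occurs when the performance
# function g(X) = capacity - demand drops below zero, and the probability
# of failure P[g(X) < 0] is estimated here by crude Monte Carlo sampling
# over the random inputs (FORM approximates the same probability
# analytically at the most probable failure point).

import random

def probability_of_failure(g, sample_inputs, n=100_000, seed=0):
    """Crude Monte Carlo estimate of P[g(X) < 0]."""
    rng = random.Random(seed)
    return sum(1 for _ in range(n) if g(*sample_inputs(rng)) < 0) / n

# Toy fatigue-like inputs (hypothetical parameters):
def sample_inputs(rng):
    capacity = rng.lognormvariate(2.0, 0.1)  # e.g. scatter in material capability
    demand = rng.gauss(5.0, 1.0)             # e.g. uncertain mission loading
    return capacity, demand

pf = probability_of_failure(lambda c, d: c - d, sample_inputs)
print(f"estimated Pf ~ {pf:.4f}")
```

    Sensitivities of the fatigue life to each input can then be probed by perturbing one distribution at a time and re-estimating the failure probability.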

  14. Heritability and evolvability of fitness and nonfitness traits: Lessons from livestock.

    PubMed

    Hoffmann, Ary A; Merilä, Juha; Kristensen, Torsten N

    2016-08-01

    Data from natural populations have suggested a disconnection between trait heritability (variance-standardized additive genetic variance, VA) and evolvability (mean-standardized VA) and emphasized the importance of environmental variation as a determinant of trait heritability but not evolvability. However, these inferences are based on heterogeneous and often small datasets across species from different environments. We surveyed the relationship between evolvability and heritability in >100 traits in farmed cattle, taking advantage of large sample sizes and consistent genetic approaches. Heritability and evolvability estimates were positively correlated (r = 0.37/0.54 on untransformed/log scales), reflecting a substantial impact of VA on both measures. Furthermore, heritabilities and residual variances were uncorrelated. The differences between this and previously described patterns may reflect lower environmental variation experienced in farmed systems, but also low and heterogeneous quality of data from natural populations. Similar to studies on wild populations, heritabilities for life-history and behavioral traits were lower than for other traits. Traits having extremely low heritabilities and evolvabilities (17% of the studied traits) were almost exclusively life-history or behavioral traits, suggesting that evolutionary constraints stemming from lack of genetic variability are likely to be most common for classical "fitness" (cf. life-history) rather than for "nonfitness" (cf. morphological) traits. PMID:27346243
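    The two standardizations contrasted above can be made concrete. The numbers below are toy values, not the cattle estimates:

```python
# Minimal sketch: heritability divides the additive genetic variance VA by
# the total phenotypic variance VP, while evolvability (mean-standardized
# VA, Houle's IA) divides VA by the squared trait mean.

def heritability(va, vp):
    """h^2 = VA / VP (variance-standardized VA)."""
    return va / vp

def evolvability(va, trait_mean):
    """IA = VA / mean^2 (mean-standardized VA)."""
    return va / trait_mean ** 2

# A trait with VA = 4, residual (environmental) variance VE = 12, mean = 50:
va, ve, trait_mean = 4.0, 12.0, 50.0
print(heritability(va, va + ve))     # -> 0.25
print(evolvability(va, trait_mean))  # -> 0.0016
```

    Because the denominators differ, a trait can rank high on one measure and low on the other, which is exactly the disconnection the authors probe: environmental variance inflates VP and so depresses heritability, but leaves evolvability untouched.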

  15. Central pedicle reduction mammoplasty: a reliable technique

    PubMed Central

    2014-01-01

    Reduction mammoplasty is one of the most frequently performed procedures in plastic surgery for macromastia or gigantomastia. Recently it has also evolved into a technique for oncoplastic breast cancer surgery, since breast-conserving surgery with radiotherapy is equivalent in outcome to mastectomy. Various techniques and modifications have been introduced to achieve a long-lasting and aesthetically good result with minimal morbidity. Central (posterior) pedicle reduction mammoplasty is known for its versatile pedicle, whose good blood supply and innervation maintain nipple sensation, without remarkable long-term complications and with proven preservation of breastfeeding function. It is one of the good, reliable options for correcting breast hypertrophy and ptosis. Various modifications were introduced by different authors to improve the technique and reduce scar formation, giving greater satisfaction to patients. PMID:25083495

  16. Central pedicle reduction mammoplasty: a reliable technique.

    PubMed

    See, Mee-Hoong

    2014-02-01

    Reduction mammoplasty is one of the most frequently performed procedures in plastic surgery for macromastia or gigantomastia. Recently it has also evolved into a technique for oncoplastic breast cancer surgery, since breast-conserving surgery with radiotherapy is equivalent in outcome to mastectomy. Various techniques and modifications have been introduced to achieve a long-lasting and aesthetically good result with minimal morbidity. Central (posterior) pedicle reduction mammoplasty is known for its versatile pedicle, whose good blood supply and innervation maintain nipple sensation, without remarkable long-term complications and with proven preservation of breastfeeding function. It is one of the good, reliable options for correcting breast hypertrophy and ptosis. Various modifications were introduced by different authors to improve the technique and reduce scar formation, giving greater satisfaction to patients. PMID:25083495

  17. Formal methods and software reliability

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.

    2004-01-01

    In this position statement I briefly describe how the software reliability problem has changed over the years, and the primary reasons for the recent creation of the Laboratory for Reliable Software at JPL.

  18. Further discussion on reliability: the art of reliability estimation.

    PubMed

    Yang, Yanyun; Green, Samuel B

    2015-01-01

    Sijtsma and van der Ark (2015) focused in their lead article on three frameworks for reliability estimation in nursing research: classical test theory (CTT), factor analysis (FA), and generalizability theory. We extend their presentation with particular attention to CTT and FA methods. We first consider the potential of yielding an overly negative or an overly positive assessment of reliability based on coefficient alpha. Next, we discuss other CTT methods for estimating reliability and how the choice of methods affects the interpretation of the reliability coefficient. Finally, we describe FA methods, which not only permit an understanding of a measure's underlying structure but also yield a variety of reliability coefficients with different interpretations. On a more general note, we discourage reporting reliability as a two-choice outcome--unsatisfactory or satisfactory; rather, we recommend that nursing researchers make a conceptual and empirical argument about when a measure might be more or less reliable, depending on its use. PMID:25738627
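    Coefficient alpha, the CTT estimator at the centre of this discussion, can be computed directly from item scores. The data below are toy values, not from the article:

```python
# Sketch of Cronbach's coefficient alpha: k/(k-1) * (1 - sum of item
# variances / variance of total scores), computed here with population
# variances on a small toy dataset.

def cronbach_alpha(items):
    """items: one inner list of scores per item, aligned across respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    sum_item_vars = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum_item_vars / var(totals))

# Three items scored by four respondents:
data = [[1, 2, 3, 4],
        [2, 3, 4, 5],
        [1, 3, 3, 5]]
print(round(cronbach_alpha(data), 3))  # -> 0.981
```

    The article's caution applies here: a single alpha value should not be read as a satisfactory/unsatisfactory verdict, since alpha can under- or overestimate reliability depending on the measure's underlying structure.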

  19. Testing for PV Reliability (Presentation)

    SciTech Connect

    Kurtz, S.; Bansal, S.

    2014-09-01

    The DOE SUNSHOT workshop is seeking input from the community about PV reliability and how the DOE might address gaps in understanding. This presentation describes the types of testing that are needed for PV reliability and introduces a discussion to identify gaps in our understanding of PV reliability testing.

  20. Business of reliability

    NASA Astrophysics Data System (ADS)

    Engel, Pierre

    1999-12-01

    The presentation is organized around three themes: (1) The decrease in reception equipment costs allows non-remote-sensing organizations to access a technology until recently reserved for a scientific elite. This means the rise of 'operational' executive agencies considering space-based technology and operations as a viable input to their daily tasks. This is possible thanks to totally dedicated ground receiving entities focusing on one application for themselves, rather than serving a vast community of users. (2) The multiplication of earth observation platforms will form the base for reliable technical and financial solutions. One obstacle to the growth of the earth observation industry is the variety of policies (commercial versus non-commercial) ruling the distribution of the data and value-added products. In particular, the high volume of data sales required for a return on investment conflicts with the traditional low-volume data use of most applications. Constant access to data sources presupposes monitoring needs as well as technical proficiency. (3) Large-volume use of data coupled with low equipment costs is only possible when the technology has proven reliable in terms of application results, financial risks, and data supply. Each of these factors is reviewed. The expectation is that international cooperation between agencies and private ventures will pave the way for future business models. As an illustration, the presentation proposes some recent non-traditional monitoring applications that may lead to significant use of earth observation data, value-added products, and services: flood monitoring, ship detection, marine oil pollution deterrent systems, and rice acreage monitoring.

  1. Molecular line emission in asymmetric envelopes of evolved stars

    NASA Astrophysics Data System (ADS)

    Sanchez, Andres Felipe Perez

    2014-06-01

    Stars with initial masses between 0.8 and 9 M⊙ eject most of their mass while evolving along the asymptotic giant branch (AGB). The ejected material eventually cools down and condenses, forming dust grains and molecular gas around the star and creating an extended circumstellar envelope (CSE). The mechanism responsible for the expansion of the dusty and dense CSEs is not completely understood. It is suggested that stellar radiation pressure on the dust particles can accelerate them outwards; then, by collisional exchange of momentum, the dust particles drag along the molecular gas. However, this scenario cannot explain the onset of asymmetries in the CSEs observed towards more evolved sources such as post-AGB sources and planetary nebulae. Part of the research in this thesis focuses on the role that the stellar magnetic field plays in the formation of the collimated high-velocity outflows observed towards post-AGB sources. Polarized maser emission towards (post-)AGB stars has become a useful tool to determine the properties of the stellar magnetic fields permeating their CSEs. However, the polarization fraction detected can be affected by non-Zeeman effects. Here I present the results of our analysis of the polarization properties of SiO, H2O and HCN maser emission in the (sub-)millimetre wavelength range. The goal of this analysis is to determine whether polarized maser emission of these molecular species can be used as a reliable tracer of the magnetic field from observations at (sub-)millimetre wavelengths. I also present the results of radio interferometric observations of both continuum and polarized maser emission towards post-AGB stars. The sources observed are characterized by H2O maser emission arising from their collimated, high-velocity outflows. The observations were carried out with the Australia Telescope Compact Array, aiming to detect both polarized maser emission and non-thermal radio continuum emission.

  2. Origins of stereoselectivity in evolved ketoreductases

    PubMed Central

    Noey, Elizabeth L.; Tibrewal, Nidhi; Jiménez-Osés, Gonzalo; Osuna, Sílvia; Park, Jiyong; Bond, Carly M.; Cascio, Duilio; Liang, Jack; Zhang, Xiyun; Huisman, Gjalt W.; Tang, Yi; Houk, Kendall N.

    2015-01-01

    Mutants of Lactobacillus kefir short-chain alcohol dehydrogenase, used here as ketoreductases (KREDs), enantioselectively reduce the pharmaceutically relevant substrates 3-thiacyclopentanone and 3-oxacyclopentanone. These substrates differ by only the heteroatom (S or O) in the ring, but the KRED mutants reduce them with different enantioselectivities. Kinetic studies show that these enzymes are more efficient with 3-thiacyclopentanone than with 3-oxacyclopentanone. X-ray crystal structures of apo- and NADP+-bound selected mutants show that the substrate-binding loop conformational preferences are modified by these mutations. Quantum mechanical calculations and molecular dynamics (MD) simulations are used to investigate the mechanism of reduction by the enzyme. We have developed an MD-based method for studying the diastereomeric transition state complexes and rationalize different enantiomeric ratios. This method, which probes the stability of the catalytic arrangement within the theozyme, shows a correlation between the relative fractions of catalytically competent poses for the enantiomeric reductions and the experimental enantiomeric ratio. Some mutations, such as A94F and Y190F, induce conformational changes in the active site that enlarge the small binding pocket, facilitating accommodation of the larger S atom in this region and enhancing S-selectivity with 3-thiacyclopentanone. In contrast, in the E145S mutant and the final variant evolved for large-scale production of the intermediate for the antibiotic sulopenem, R-selectivity is promoted by shrinking the small binding pocket, thereby destabilizing the pro-S orientation. PMID:26644568

  3. Generative Representations for Evolving Families of Designs

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2003-01-01

    Since typical evolutionary design systems encode only a single artifact with each individual, each time the objective changes a new set of individuals must be evolved. When this objective varies in a way that can be parameterized, a more general method is to use a representation in which a single individual encodes an entire class of artifacts. In addition to saving time by preventing the need for multiple evolutionary runs, the evolution of parameter-controlled designs can create families of artifacts with the same style and a reuse of parts between members of the family. In this paper an evolutionary design system is described which uses a generative representation to encode families of designs. Because a generative representation is an algorithmic encoding of a design, its input parameters are a way to control aspects of the design it generates. By evaluating individuals multiple times with different input parameters the evolutionary design system creates individuals in which the input parameter controls specific aspects of a design. This system is demonstrated on two design substrates: neural-networks which solve the 3/5/7-parity problem and three-dimensional tables of varying heights.

  4. Emerging and Evolving Ovarian Cancer Animal Models

    PubMed Central

    Bobbs, Alexander S; Cole, Jennifer M; Cowden Dahl, Karen D

    2015-01-01

    Ovarian cancer (OC) is the leading cause of death from a gynecological malignancy in the United States. By the time a woman is diagnosed with OC, the tumor has usually metastasized. Mouse models that are used to recapitulate different aspects of human OC have been evolving for nearly 40 years. Xenograft studies in immunocompromised and immunocompetent mice have enhanced our knowledge of metastasis and immune cell involvement in cancer. Patient-derived xenografts (PDXs) can accurately reflect metastasis, response to therapy, and diverse genetics found in patients. Additionally, multiple genetically engineered mouse models have increased our understanding of possible tissues of origin for OC and what role individual mutations play in establishing ovarian tumors. Many of these models are used to test novel therapeutics. As no single model perfectly copies the human disease, we can use a variety of OC animal models in hypothesis testing that will lead to novel treatment options. The goal of this review is to provide an overview of the utility of different mouse models in the study of OC and their suitability for cancer research. PMID:26380555

  5. Regulatory mechanisms link phenotypic plasticity to evolvability

    PubMed Central

    van Gestel, Jordi; Weissing, Franz J.

    2016-01-01

    Organisms have a remarkable capacity to respond to environmental change. They can either respond directly, by means of phenotypic plasticity, or they can slowly adapt through evolution. Yet, how phenotypic plasticity links to evolutionary adaptability is largely unknown. Current studies of plasticity tend to adopt a phenomenological reaction norm (RN) approach, which neglects the mechanisms underlying plasticity. Focusing on a concrete question – the optimal timing of bacterial sporulation – we here also consider a mechanistic approach, the evolution of a gene regulatory network (GRN) underlying plasticity. Using individual-based simulations, we compare the RN and GRN approach and find a number of striking differences. Most importantly, the GRN model results in a much higher diversity of responsive strategies than the RN model. We show that each of the evolved strategies is pre-adapted to a unique set of unseen environmental conditions. The regulatory mechanisms that control plasticity therefore critically link phenotypic plasticity to the adaptive potential of biological populations. PMID:27087393

  6. Sexual regret: evidence for evolved sex differences.

    PubMed

    Galperin, Andrew; Haselton, Martie G; Frederick, David A; Poore, Joshua; von Hippel, William; Buss, David M; Gonzaga, Gian C

    2013-10-01

    Regret and anticipated regret enhance decision quality by helping people avoid making and repeating mistakes. Some of people's most intense regrets concern sexual decisions. We hypothesized evolved sex differences in women's and men's experiences of sexual regret. Because of women's higher obligatory costs of reproduction throughout evolutionary history, we hypothesized that sexual actions, particularly those involving casual sex, would be regretted more intensely by women than by men. In contrast, because missed sexual opportunities historically carried higher reproductive fitness costs for men than for women, we hypothesized that poorly chosen sexual inactions would be regretted more by men than by women. Across three studies (Ns = 200, 395, and 24,230), we tested these hypotheses using free responses, written scenarios, detailed checklists, and Internet sampling to achieve participant diversity, including diversity in sexual orientation. Across all data sources, results supported predicted psychological sex differences and these differences were localized in casual sex contexts. These findings are consistent with the notion that the psychology of sexual regret was shaped by recurrent sex differences in selection pressures operating over deep time. PMID:23179233

  7. Evolving application of biomimetic nanostructured hydroxyapatite

    PubMed Central

    Roveri, Norberto; Iafisco, Michele

    2010-01-01

    By mimicking Nature, we can design and synthesize inorganic smart materials that are reactive to biological tissues. These smart materials can be utilized to design innovative third-generation biomaterials, which are able to not only optimize their interaction with biological tissues and environment, but also mimic biogenic materials in their functionalities. The biomedical applications involve increasing the biomimetic levels from chemical composition, structural organization, morphology, mechanical behavior, nanostructure, and bulk and surface chemical–physical properties until the surface becomes bioreactive and stimulates cellular materials. The chemical–physical characteristics of biogenic hydroxyapatites from bone and tooth have been described, in order to point out the elective sides, which are important to reproduce the design of a new biomimetic synthetic hydroxyapatite. This review outlines the evolving applications of biomimetic synthetic calcium phosphates, details the main characteristics of bone and tooth, where the calcium phosphates are present, and discusses the chemical–physical characteristics of biomimetic calcium phosphates, methods of synthesizing them, and some of their biomedical applications. PMID:24198477

  8. Origins of stereoselectivity in evolved ketoreductases.

    PubMed

    Noey, Elizabeth L; Tibrewal, Nidhi; Jiménez-Osés, Gonzalo; Osuna, Sílvia; Park, Jiyong; Bond, Carly M; Cascio, Duilio; Liang, Jack; Zhang, Xiyun; Huisman, Gjalt W; Tang, Yi; Houk, Kendall N

    2015-12-22

    Mutants of Lactobacillus kefir short-chain alcohol dehydrogenase, used here as ketoreductases (KREDs), enantioselectively reduce the pharmaceutically relevant substrates 3-thiacyclopentanone and 3-oxacyclopentanone. These substrates differ by only the heteroatom (S or O) in the ring, but the KRED mutants reduce them with different enantioselectivities. Kinetic studies show that these enzymes are more efficient with 3-thiacyclopentanone than with 3-oxacyclopentanone. X-ray crystal structures of apo- and NADP(+)-bound selected mutants show that the substrate-binding loop conformational preferences are modified by these mutations. Quantum mechanical calculations and molecular dynamics (MD) simulations are used to investigate the mechanism of reduction by the enzyme. We have developed an MD-based method for studying the diastereomeric transition state complexes and rationalize different enantiomeric ratios. This method, which probes the stability of the catalytic arrangement within the theozyme, shows a correlation between the relative fractions of catalytically competent poses for the enantiomeric reductions and the experimental enantiomeric ratio. Some mutations, such as A94F and Y190F, induce conformational changes in the active site that enlarge the small binding pocket, facilitating accommodation of the larger S atom in this region and enhancing S-selectivity with 3-thiacyclopentanone. In contrast, in the E145S mutant and the final variant evolved for large-scale production of the intermediate for the antibiotic sulopenem, R-selectivity is promoted by shrinking the small binding pocket, thereby destabilizing the pro-S orientation. PMID:26644568

  9. An evolving model of online bipartite networks

    NASA Astrophysics Data System (ADS)

    Zhang, Chu-Xu; Zhang, Zi-Ke; Liu, Chuang

    2013-12-01

    Understanding the structure and evolution of online bipartite networks is a significant task since they play a crucial role in various e-commerce services nowadays. Recently, various models have been proposed, resulting in either power-law or exponential degree distributions. However, many empirical results show that the user degree distribution actually follows a shifted power-law distribution, the so-called Mandelbrot’s law, which cannot be fully described by previous models. In this paper, we propose an evolving model considering two different user behaviors: random and preferential attachment. Extensive empirical results on two real bipartite networks, Delicious and CiteULike, show that the theoretical model can well characterize the structure of real networks for both user and object degree distributions. In addition, we introduce a structural parameter p to demonstrate that the hybrid user behavior leads to the shifted power-law degree distribution, and that the region of the power-law tail increases with p. The proposed model might shed some light on the underlying laws governing the structure of real online bipartite networks.
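The hybrid attachment rule described in this abstract can be sketched in a few lines. The toy simulation below (function name and parameter values are illustrative, not taken from the paper) draws each new user-object link preferentially by degree with probability p and uniformly at random otherwise; the resulting user degree distribution interpolates between exponential and power-law forms:

```python
import random
from collections import Counter

def simulate_bipartite(n_steps=50_000, p=0.7, n_users=1_000, seed=1):
    """Toy evolving bipartite model: at each step one new object is
    collected by a user chosen preferentially by degree with probability p,
    and uniformly at random otherwise (a sketch of the hybrid behavior)."""
    rng = random.Random(seed)
    degrees = [1] * n_users          # start each user with one collected object
    endpoints = list(range(n_users)) # one entry per existing link: sampling
                                     # from this list is degree-proportional
    for _ in range(n_steps):
        if rng.random() < p:
            u = rng.choice(endpoints)    # preferential attachment
        else:
            u = rng.randrange(n_users)   # purely random attachment
        degrees[u] += 1
        endpoints.append(u)
    return degrees

degrees = simulate_bipartite()
degree_counts = Counter(degrees)     # empirical user degree distribution
```

Sweeping p from 0 to 1 in this sketch shifts the tail of `degree_counts` from exponential toward power-law, which is the qualitative behavior the abstract attributes to the structural parameter p.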

  10. How does cognition evolve? Phylogenetic comparative psychology

    PubMed Central

    Matthews, Luke J.; Hare, Brian A.; Nunn, Charles L.; Anderson, Rindy C.; Aureli, Filippo; Brannon, Elizabeth M.; Call, Josep; Drea, Christine M.; Emery, Nathan J.; Haun, Daniel B. M.; Herrmann, Esther; Jacobs, Lucia F.; Platt, Michael L.; Rosati, Alexandra G.; Sandel, Aaron A.; Schroepfer, Kara K.; Seed, Amanda M.; Tan, Jingzhi; van Schaik, Carel P.; Wobber, Victoria

    2014-01-01

    Now more than ever animal studies have the potential to test hypotheses regarding how cognition evolves. Comparative psychologists have developed new techniques to probe the cognitive mechanisms underlying animal behavior, and they have become increasingly skillful at adapting methodologies to test multiple species. Meanwhile, evolutionary biologists have generated quantitative approaches to investigate the phylogenetic distribution and function of phenotypic traits, including cognition. In particular, phylogenetic methods can quantitatively (1) test whether specific cognitive abilities are correlated with life history (e.g., lifespan), morphology (e.g., brain size), or socio-ecological variables (e.g., social system), (2) measure how strongly phylogenetic relatedness predicts the distribution of cognitive skills across species, and (3) estimate the ancestral state of a given cognitive trait using measures of cognitive performance from extant species. Phylogenetic methods can also be used to guide the selection of species comparisons that offer the strongest tests of a priori predictions of cognitive evolutionary hypotheses (i.e., phylogenetic targeting). Here, we explain how an integration of comparative psychology and evolutionary biology will answer a host of questions regarding the phylogenetic distribution and history of cognitive traits, as well as the evolutionary processes that drove their evolution. PMID:21927850

  11. Consensus in evolving networks of mobile agents

    NASA Astrophysics Data System (ADS)

    Baronchelli, Andrea; Díaz-Guilera, Albert

    2012-02-01

    Populations of mobile and communicating agents describe a vast array of technological and natural systems, ranging from sensor networks to animal groups. Here, we investigate how a group-level agreement may emerge in the continuously evolving networks defined by the local interactions of the moving individuals. We adopt a general scheme of motion in two dimensions and we let the individuals interact through the minimal naming game, a prototypical scheme to investigate social consensus. We distinguish different regimes of convergence determined by the emission range of the agents and by their mobility, and we identify the corresponding scaling behaviors of the consensus time. We also rationalize the behavior of the maximum memory used during the convergence process, which determines the minimum cognitive/storage capacity needed by the individuals. Overall, we believe that the simple and general model presented in this talk can represent a helpful reference for a better understanding of the behavior of populations of mobile agents.
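Setting the mobility component aside, the minimal naming game itself is easy to sketch. The following toy version runs on a complete graph (all names and parameter values are hypothetical, chosen only to illustrate the interaction rule) and returns the number of interactions until global consensus:

```python
import random

def naming_game(n_agents=50, seed=3, max_steps=200_000):
    """Minimal naming game (mobility omitted for brevity): a random speaker
    utters a name from its inventory, inventing one if the inventory is
    empty; on success both agents collapse to that name, otherwise the
    hearer adds it. Returns the consensus time in interactions."""
    rng = random.Random(seed)
    inventories = [set() for _ in range(n_agents)]
    next_name = 0
    for step in range(1, max_steps + 1):
        speaker, hearer = rng.sample(range(n_agents), 2)
        if not inventories[speaker]:
            inventories[speaker].add(next_name)   # invent a new name
            next_name += 1
        name = rng.choice(sorted(inventories[speaker]))
        if name in inventories[hearer]:
            inventories[speaker] = {name}         # success: both collapse
            inventories[hearer] = {name}
        else:
            inventories[hearer].add(name)         # failure: hearer learns it
        if all(len(inv) == 1 and inv == inventories[0] for inv in inventories):
            return step
    return None

t_consensus = naming_game()
```

The maximum total inventory size reached during such a run is the "maximum memory" quantity whose scaling the abstract discusses.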

  12. The Evolved Pulsating CEMP Star HD 112869

    NASA Astrophysics Data System (ADS)

    Začs, Laimons; Sperauskas, Julius; Grankina, Aija; Deveikis, Viktoras; Kaminskyi, Bogdan; Pavlenko, Yakiv; Musaev, Faig A.

    2015-04-01

    Radial velocity measurements, BVRC photometry, and high-resolution spectroscopy in the wavelength region from blue to near-infrared are employed in order to clarify the evolutionary status of the carbon-enhanced metal-poor star HD 112869, which has a unique ratio of carbon isotopes in its atmosphere. An LTE abundance analysis was carried out using the method of spectral synthesis and new self-consistent 1D atmospheric models. The radial velocity monitoring confirmed semiregular variations with a peak-to-peak amplitude of about 10 km s⁻¹ and a dominant period of about 115 days. The light, color, and radial velocity variations are typical of evolved pulsating stars. The atmosphere of HD 112869 appears to be less metal-poor than previously reported, [Fe/H] = -2.3 ± 0.2 dex. The carbon-to-oxygen and carbon isotope ratios are found to be extremely high, C/O ≃ 12.6 and ¹²C/¹³C ≳ 1500, respectively. The s-process elements yttrium and barium are not enhanced, but neodymium appears to be overabundant. The magnesium abundance seems to be lower than the average found for CEMP stars, [Mg/Fe] < +0.4 dex. HD 112869 could be a single low-mass halo star in the asymptotic giant branch stage of evolution.

  13. Minority games, evolving capitals and replicator dynamics

    NASA Astrophysics Data System (ADS)

    Galla, Tobias; Zhang, Yi-Cheng

    2009-11-01

    We discuss a simple version of the minority game (MG) in which agents hold only one strategy each, but in which their capitals evolve dynamically according to their success and in which the total trading volume varies in time accordingly. This feature is known to be crucial for MGs to reproduce stylized facts of real market data. The stationary states and phase diagram of the model can be computed, and we show that the ergodicity breaking phase transition common for MGs, and marked by a divergence of the integrated response, is present also in this simplified model. An analogous majority game turns out to be relatively void of interesting features, and the total capital is found to diverge in time. Introducing a restraining force leads to a model akin to the replicator dynamics of evolutionary game theory, and we demonstrate that here a different type of phase transition is observed. Finally we briefly discuss the relation of this model with one strategy per player to more sophisticated minority games with dynamical capitals and several trading strategies per agent.
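A minimal sketch of such a game, assuming one fixed strategy table per agent and a simple multiplicative capital update (the specific update rule and parameter values here are illustrative choices, not the authors' equations):

```python
import random

def minority_game(n_agents=101, m=3, rounds=2_000, eta=0.01, seed=7):
    """Toy minority game: each agent holds ONE fixed strategy (a lookup
    table from the last m minority outcomes to an action in {-1, +1}),
    and capitals evolve multiplicatively with the agent's success."""
    rng = random.Random(seed)
    n_hist = 2 ** m
    strategies = [[rng.choice((-1, 1)) for _ in range(n_hist)]
                  for _ in range(n_agents)]
    capitals = [1.0] * n_agents
    history = 0                          # last m outcomes packed into m bits
    for _ in range(rounds):
        actions = [strategies[i][history] for i in range(n_agents)]
        total = sum(capitals)
        # capital-weighted excess demand; its sign decides the majority,
        # so trading volume varies with the evolving capitals
        A = sum(c * a for c, a in zip(capitals, actions)) / total
        for i in range(n_agents):
            # being in the minority (action opposing A) increases capital
            capitals[i] *= max(1e-9, 1.0 - eta * actions[i] * A)
        bit = 0 if A > 0 else 1          # minority outcome feeds the history
        history = ((history << 1) | bit) % n_hist
    return capitals

caps = minority_game()
```

Flipping the sign of the payoff (rewarding the majority side) turns this into the analogous majority game mentioned in the abstract, where capitals grow without bound unless a restraining force is added.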

  14. Are Electronic Cardiac Devices Still Evolving?

    PubMed Central

    Mabo, P.

    2014-01-01

    Summary Objectives The goal of this paper is to review some important developments in implantable cardiac devices during the past year. Methods The first implantable cardiac device was proposed to maintain an adequate heart rate, either because the heart’s natural pacemaker is not fast enough or because there is a block in the heart’s electrical conduction system. Over the last forty years, pacemakers have evolved considerably, becoming programmable and allowing patient-specific optimal pacing modes to be configured. Various technological aspects (electrodes, connectors, diagnostic algorithms, therapies, …) have progressed, and cardiac implants now address several clinical applications: management of arrhythmias, cardioversion/defibrillation, and cardiac resynchronization therapy. Results Observed progress includes device miniaturization and increased longevity, coupled with efficient pacing functions, multisite pacing modes, leadless pacing, and better recognition of supraventricular and ventricular tachycardias in order to deliver appropriate therapy. Subcutaneous implants, new modes of stimulation (leadless implants or ultrasound leads), quadripolar leads, and new sensors and algorithms for hemodynamic management are introduced and briefly described. In each case, the main results from the past two years are highlighted and placed in historical context, and remaining limitations are addressed. Conclusion Some important technological improvements are described. New trends for the future, such as remote patient follow-up and the treatment of heart failure by neuromodulation, are also considered in a dedicated section. PMID:25123732

  15. Gastric cancer: current and evolving treatment landscape.

    PubMed

    Sun, Weijing; Yan, Li

    2016-01-01

    Gastric (including gastroesophageal junction) cancer is the third leading cause of cancer-related death in the world. In China, an estimated 420,000 patients were diagnosed with gastric cancer in 2011, making this malignancy the second most prevalent cancer type and resulting in nearly 300,000 deaths. The treatment landscape of gastric cancer has evolved in recent years. Although systemic chemotherapy is still the mainstay treatment for metastatic disease, the introduction of agents targeting human epidermal growth factor receptor 2 and vascular endothelial growth factor/vascular endothelial growth factor receptor has brought this disease into the molecular and personalized medicine era. The preliminary yet encouraging clinical efficacy observed with immune checkpoint inhibitors, e.g., anti-programmed cell death protein 1/programmed death-ligand 1, will further shape the treatment landscape for gastric cancer. Molecular characterization of patients will play a critical role in developing new agents, as well as in implementing new treatment options for this disease. PMID:27581465

  16. Tearing Mode Stability of Evolving Toroidal Equilibria

    NASA Astrophysics Data System (ADS)

    Pletzer, A.; McCune, D.; Manickam, J.; Jardin, S. C.

    2000-10-01

    There are a number of toroidal equilibrium codes (such as JSOLVER, ESC, EFIT, and VMEC) and transport codes (such as TRANSP, BALDUR, and TSC) in our community that utilize differing equilibrium representations. There are also many heating and current drive (LSC and TORRAY) and stability (PEST1-3, GATO, NOVA, MARS, DCON, M3D) codes that require this equilibrium information. In an effort to provide seamless compatibility between the codes that produce and need these equilibria, we have developed two Fortran 90 modules, MEQ and XPLASMA, that serve as common interfaces between these two classes of codes. XPLASMA provides a common equilibrium representation for the heating and current drive applications, while MEQ provides the common equilibrium and associated metric information needed by MHD stability codes. We illustrate the utility of this approach by presenting results of PEST-3 tearing stability calculations of an NSTX discharge performed on profiles provided by the TRANSP code. Using the MEQ module, the TRANSP equilibrium data are stored in a Fortran 90 derived type and passed to PEST-3 as a subroutine argument. All calculations are performed on the fly, as the profiles evolve.

  17. Origins and evolvability of the PAX family.

    PubMed

    Paixão-Côrtes, Vanessa R; Salzano, Francisco M; Bortolini, Maria Cátira

    2015-08-01

    The paired box (PAX) family of transcription/developmental genes plays a key role in numerous stages of embryonic development, as well as in adult organogenesis. There is evidence linking the acquisition of a paired-like DNA binding domain (PD) to domestication of a Tc1/mariner transposon. Further duplication/deletion processes led to at least five paralogous metazoan protein groups, which can be classified into two supergroups, PAXB-like or PAXD-like, using ancestral defining structures; the PD plus an octapeptide motif (OP) and a paired-type homeobox DNA binding domain (PTHD), producing the PD-OP-PTHD structure characteristic of the PAXB-like group, whereas an additional domain, the paired-type homeodomain tail (PHT), is present in the PAXD-like group, producing a PD-OP-PTHD-PHT structure. We examined their patterns of distribution in various species, using both available data and new bioinformatic analyses, including vertebrate PAX genes and their shared and specific functions, as well as inter- and intraspecific variability of PAX in primates. These analyses revealed a relatively conserved PAX network, accompanied by specific changes that led to adaptive novelties. Therefore, both stability and evolvability shaped the molecular evolution of this key transcriptional network. PMID:26321496

  18. Speciation genetics: current status and evolving approaches

    PubMed Central

    Wolf, Jochen B. W.; Lindell, Johan; Backström, Niclas

    2010-01-01

    The view of species as entities subjected to natural selection and amenable to change put forth by Charles Darwin and Alfred Wallace laid the conceptual foundation for understanding speciation. Initially marred by a rudimental understanding of hereditary principles, evolutionists gained appreciation of the mechanistic underpinnings of speciation following the merger of Mendelian genetic principles with Darwinian evolution. Only recently have we entered an era where deciphering the molecular basis of speciation is within reach. Much focus has been devoted to the genetic basis of intrinsic postzygotic isolation in model organisms and several hybrid incompatibility genes have been successfully identified. However, concomitant with the recent technological advancements in genome analysis and a newfound interest in the role of ecology in the differentiation process, speciation genetic research is becoming increasingly open to non-model organisms. This development will expand speciation research beyond the traditional boundaries and unveil the genetic basis of speciation from manifold perspectives and at various stages of the splitting process. This review aims at providing an extensive overview of speciation genetics. Starting from key historical developments and core concepts of speciation genetics, we focus much of our attention on evolving approaches and introduce promising methodological approaches for future research venues. PMID:20439277

  19. Evolving role of MRI in Crohn's disease.

    PubMed

    Yacoub, Joseph H; Obara, Piotr; Oto, Aytekin

    2013-06-01

    MR enterography is playing an evolving role in the evaluation of small bowel Crohn's disease (CD). Standard MR enterography includes a combination of rapidly acquired T2 sequence, balanced steady-state acquisition, and contrast enhanced T1-weighted gradient echo sequence. The diagnostic performance of these sequences has been shown to be comparable, and in some respects superior, to other small bowel imaging modalities. The findings of CD on MR enterography have been well described in the literature. New and emerging techniques such as diffusion-weighted imaging (DWI), dynamic contrast enhanced MRI (DCE-MRI), cinematography, and magnetization transfer, may lead to improved accuracy in characterizing the disease. These advanced techniques can provide quantitative parameters that may prove to be useful in assessing disease activity, severity, and response to treatment. In the future, MR enterography may play an increasing role in management decisions for patients with small bowel CD; however, larger studies are needed to validate these emerging MRI parameters as imaging biomarkers. PMID:23712842

  20. Lower mass limit of an evolving interstellar cloud and chemistry in an evolving oscillatory cloud

    NASA Technical Reports Server (NTRS)

    Tarafdar, S. P.

    1986-01-01

    Simultaneous solution of the equation of motion, the equation of state, and the energy equation, including heating and cooling processes for the interstellar medium, yields for a collapsing cloud a lower mass limit that is significantly smaller than the Jeans mass for the same initial density. Clouds more massive than this limit collapse, whereas clouds below the critical mass pass through a maximum central density, giving apparently similar clouds (i.e., the same A_V, size, and central density) at two different phases of their evolution (i.e., with different lifetimes). Preliminary results of chemistry in such an evolving oscillatory cloud show significant differences in the abundances of some molecules in two physically similar clouds with different lifetimes. The problems of depletion and the short lifetimes of evolving clouds appear to be less severe in such an oscillatory cloud.
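For orientation, the Jeans mass that this lower limit is compared against can be evaluated from the standard textbook expression M_J = (5 k_B T / (G mu m_H))^(3/2) * (3 / (4 pi rho))^(1/2). The sketch below implements only the ordinary Jeans criterion, not the paper's modified limit; the example temperature and density are illustrative:

```python
import math

k_B   = 1.380649e-23   # Boltzmann constant [J/K]
G     = 6.674e-11      # gravitational constant [m^3 kg^-1 s^-2]
m_H   = 1.6735e-27     # mass of a hydrogen atom [kg]
M_sun = 1.989e30       # solar mass [kg]

def jeans_mass(T, n_H, mu=2.33):
    """Classical Jeans mass [solar masses] for gas at temperature T [K]
    and hydrogen number density n_H [cm^-3]; mu is the mean molecular
    weight (about 2.33 for molecular gas including helium)."""
    rho = mu * m_H * n_H * 1e6   # mass density [kg m^-3], cm^-3 -> m^-3
    M_J = ((5 * k_B * T / (G * mu * m_H)) ** 1.5
           * (3 / (4 * math.pi * rho)) ** 0.5)
    return M_J / M_sun

M_J = jeans_mass(T=10.0, n_H=1e4)   # a cold dense core: a few solar masses
```

The paper's point is that, once heating, cooling, and dynamics are solved together, the critical collapse mass falls significantly below this classical value at the same initial density.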

  1. The Evolving Context for Science and Society

    NASA Astrophysics Data System (ADS)

    Leshner, Alan I.

    2012-01-01

    The relationship between science and the rest of society is critical both to the support it receives from the public and to the receptivity of the broader citizenry to science's explanations of the nature of the world and to its other outputs. Science's ultimate usefulness depends on a receptive public. For example, given that science and technology are imbedded in virtually every issue of modern life, either as a cause or a cure, it is critical that the relationship be strong and that the role of science is well appreciated by society, or the impacts of scientific advances will fall short of their great potential. Unfortunately, a variety of problems have been undermining the science-society relationship for over a decade. Some problems emerge from within the scientific enterprise - like scientific misconduct or conflicts of interest - and tarnish or weaken its image and credibility. Other problems and stresses come from outside the enterprise. The most obvious external pressure is that the world economic situation is undermining the financial support of both the conduct and infrastructure of science. Other examples of external pressures include conflicts between what science is revealing and political or economic expediency - e.g., global climate change - or instances where scientific advances encroach upon core human values or beliefs - e.g., scientific understanding of the origins and evolution of the universe as compared to biblical accounts of creation. Significant efforts - some dramatically non-traditional for many in the scientific community - are needed to restore balance to the science-society relationship.

  2. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
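In the simplest independent-failure case, the modularization idea (computing total system reliability from separately modeled components) reduces to the textbook series/parallel combination rules. The sketch below is a generic illustration of that idea, not RML or REST itself; all names and values are hypothetical:

```python
from functools import reduce

def series(*rs):
    """Series combination: the system works only if every module works
    (independent failures assumed), so reliabilities multiply."""
    return reduce(lambda acc, r: acc * r, rs, 1.0)

def parallel(*rs):
    """Parallel (redundant) combination: the system works if at least one
    module works, so unreliabilities multiply."""
    return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), rs, 1.0)

# Hypothetical example: a controller in series with a dual-redundant sensor
sensor_pair = parallel(0.95, 0.95)       # 1 - 0.05 * 0.05 = 0.9975
system_r = series(0.99, sensor_pair)     # 0.99 * 0.9975
```

Because each module's reliability is computed and tested on its own before being combined, this mirrors the one-to-one mapping between system components and reliability modules that makes a graphical model description possible.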

  3. Early-branching or fast-evolving eukaryotes? An answer based on slowly evolving positions.

    PubMed

    Philippe, H; Lopez, P; Brinkmann, H; Budin, K; Germot, A; Laurent, J; Moreira, D; Müller, M; Le Guyader, H

    2000-06-22

    The current paradigm of eukaryotic evolution is based primarily on comparative analysis of ribosomal RNA sequences. It shows several early-emerging lineages, mostly amitochondriate, which might be living relics of a progressive assembly of the eukaryotic cell. However, the analysis of slow-evolving positions, carried out with the newly developed slow-fast method, reveals that these lineages are, in terms of nucleotide substitution, fast-evolving ones, misplaced at the base of the tree by a long branch attraction artefact. Since the fast-evolving groups are not always the same, depending on which macromolecule is used as a marker, this explains most of the observed incongruent phylogenies. The current paradigm of eukaryotic evolution thus has to be seriously re-examined as the eukaryotic phylogeny is presently best summarized by a multifurcation. This is consistent with the Big Bang hypothesis that all extant eukaryotic lineages are the result of multiple cladogeneses within a relatively brief period, although insufficiency of data is also a possible explanation for the lack of resolution. For further resolution, rare evolutionary events such as shared insertions and/or deletions or gene fusions might be helpful. PMID:10902687
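The slow-fast idea of ranking alignment positions by within-group variability can be sketched as follows. This is a deliberately crude approximation: the per-site score simply counts extra character states within each predefined monophyletic group as a proxy for inferred substitutions, and all taxa, groups, and sequences are hypothetical:

```python
def site_rates(alignment, groups):
    """Sketch of the slow-fast ranking: score each alignment column by the
    number of distinct states minus one, summed over predefined groups.
    Low scores mark slowly evolving positions to retain for tree building."""
    n_sites = len(next(iter(alignment.values())))
    rates = []
    for site in range(n_sites):
        score = 0
        for members in groups.values():
            states = {alignment[name][site] for name in members}
            score += len(states) - 1   # extra states imply changes in-group
        rates.append(score)
    return rates

# Hypothetical 4-taxon, 4-site alignment with two predefined groups
aln = {"taxA": "AAGT", "taxB": "AAGC", "taxC": "AAGG", "taxD": "ACGG"}
grps = {"g1": ["taxA", "taxB"], "g2": ["taxC", "taxD"]}
rates = site_rates(aln, grps)   # sites 0 and 2 are the slow positions here
```

Restricting a phylogenetic analysis to the lowest-scoring sites is what lets the method test whether "early-branching" lineages are instead fast-evolving ones pulled basally by long-branch attraction.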

  4. Sauropod dinosaurs evolved moderately sized genomes unrelated to body size

    PubMed Central

    Organ, Chris L.; Brusatte, Stephen L.; Stein, Koen

    2009-01-01

    Sauropodomorph dinosaurs include the largest land animals to have ever lived, some reaching up to 10 times the mass of an African elephant. Despite their status defining the upper range for body size in land animals, it remains unknown whether sauropodomorphs evolved larger-sized genomes than non-avian theropods, their sister taxon, or whether a relationship exists between genome size and body size in dinosaurs, two questions critical for understanding broad patterns of genome evolution in dinosaurs. Here we report inferences of genome size for 10 sauropodomorph taxa. The estimates are derived from a Bayesian phylogenetic generalized least squares approach that generates posterior distributions of regression models relating genome size to osteocyte lacunae volume in extant tetrapods. We estimate that the average genome size of sauropodomorphs was 2.02 pg (range of species means: 1.77–2.21 pg), a value in the upper range of extant birds (mean = 1.42 pg, range: 0.97–2.16 pg) and near the average for extant non-avian reptiles (mean = 2.24 pg, range: 1.05–5.44 pg). The results suggest that the variation in size and architecture of genomes in extinct dinosaurs was lower than the variation found in mammals. A substantial difference in genome size separates the two major clades within dinosaurs, Ornithischia (large genomes) and Saurischia (moderate to small genomes). We find no relationship between body size and estimated genome size in extinct dinosaurs, which suggests that neutral forces did not dominate the evolution of genome size in this group. PMID:19793755

  5. ACRM's evolving mission: opportunities to promote rehabilitation research.

    PubMed

    Heinemann, Allen W

    2006-02-01

    This presidential address reflects on the history and mission of the American Congress of Rehabilitation Medicine (ACRM) and considers the benefits derived from joint ownership of Archives of Physical Medicine and Rehabilitation with the American Academy of Physical Medicine and Rehabilitation (AAPM&R). Much of ACRM's history has been distinguished by collaboration with AAPM&R on essential concerns. Evolving organizational priorities have resulted in distinct association missions that have consequences for joint ownership of Archives. The journal has grown in important ways in the past 86 years from a solo editor to an editorial board and to joint ownership and sponsorship of alternating issues. The quality of Archives has improved substantially in the past decade, with an improving impact factor and an increasing number of manuscript submissions. A new contract with the publisher provides an opportunity to consider the relationship between the Congress and Archives and what kind of benefit ACRM desires it to be for its members, to the larger community of physical medicine and rehabilitation (PM&R), and to the persons with disabilities we as PM&R professionals seek to serve. Archives is well positioned to fulfill ACRM's focus on promoting rehabilitation research and facilitating information dissemination and technology transfer. An internationally respected journal is an excellent means to disseminate rehabilitation research that promotes health, independence, productivity, and quality of life for people with disabling conditions. This new chapter in the relationship between Archives and the Academy and Congress provides several opportunities for rehabilitation research leadership. More than ever, Archives provides a premier mechanism to fulfill the Congress's mission and to promote our sense of community. PMID:16442965

  6. BUBBLE DYNAMICS AT GAS-EVOLVING ELECTRODES

    SciTech Connect

    Sides, Paul J.

    1980-12-01

    Nucleation of bubbles, their growth by diffusion of dissolved gas to the bubble surface and by coalescence, and their detachment from the electrode are all very fast phenomena; furthermore, electrolytically generated bubbles range in size from ten to a few hundred microns; therefore, magnification and high-speed cinematography are required to observe bubbles and the phenomena of their growth on the electrode surface. Viewing the action from the front side (the surface on which the bubbles form) is complicated because the most important events occur close to the surface and are obscured by other bubbles passing between the camera and the electrode; therefore, oxygen was evolved on a transparent tin oxide "window" electrode and the events were viewed from the backside. The movies showed that coalescence of bubbles is very important for determining the size of bubbles and in the chain of transport processes; growth by diffusion and by coalescence proceeds in series and in parallel; coalescing bubbles cause significant fluid motion close to the electrode; bubbles can leave and reattach; and bubbles evolve in a cycle of growth by diffusion and different modes of coalescence. An analytical solution for the primary potential and current distribution around a spherical bubble in contact with a plane electrode is presented. Zero at the contact point, the current density reaches only one percent of its undisturbed value at 30 percent of the radius from that point and goes through a shallow maximum two radii away. The solution obtained for spherical bubbles is shown to apply for the small bubbles of electrolytic processes. The incremental resistance in ohms caused by sparse arrays of bubbles is given by ΔR = 1.352 af/(kS), where f is the void fraction of gas in the bubble layer, a is the bubble layer thickness, k is the conductivity of the gas-free electrolyte, and S is the electrode area. A densely populated gas bubble layer on an electrode was modeled as a hexagonal array of
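The quoted incremental-resistance expression is straightforward to evaluate numerically. A sketch with hypothetical values (any self-consistent unit system works; CGS-practical units are assumed in the example):

```python
def incremental_resistance(f, a, k, S):
    """Incremental resistance [ohms] of a sparse bubble array on an
    electrode, per the abstract's expression dR = 1.352 * a * f / (k * S).
    f: gas void fraction in the bubble layer (dimensionless)
    a: bubble layer thickness [cm]
    k: conductivity of gas-free electrolyte [S/cm]
    S: electrode area [cm^2]"""
    return 1.352 * a * f / (k * S)

# Hypothetical example: 10% void fraction in a 0.05 cm layer,
# 0.5 S/cm electrolyte, 10 cm^2 electrode
dR = incremental_resistance(f=0.1, a=0.05, k=0.5, S=10.0)
```

As the formula makes explicit, the added resistance grows linearly with the gas void fraction and layer thickness, and inversely with electrolyte conductivity and electrode area.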

  7. Complex Formation History of Highly Evolved Basaltic Shergottite, Zagami

    NASA Technical Reports Server (NTRS)

    Niihara, T.; Misawa, K.; Mikouchi, T.; Nyquist, L. E.; Park, J.; Hirata, D.

    2012-01-01

    Zagami, a basaltic shergottite, contains several kinds of lithologies, such as Normal Zagami, consisting of fine-grained (FG) and coarse-grained (CG) portions; the Dark Mottled lithology (DML); and an olivine-rich late-stage melt pocket (DN). Treiman and Sutton concluded that Zagami (Normal Zagami) is a fractional crystallization product from a single magma. It has been suggested that there were two igneous stages (a deep magma chamber and a shallow magma chamber or surface lava flow) on the basis of the chemical zoning features of pyroxenes, which have homogeneous Mg-rich cores and FeO, CaO zoning at the rims. Nyquist et al. reported that FG has a different initial Sr isotopic ratio than CG and DML, and suggested the possibility of magma mixing on Mars. Here we report new results of petrology and mineralogy for DML and the olivine-rich lithology (we do not use DN here), the most evolved lithology in this rock, to understand the relationship among lithologies and reveal Zagami's formation history.

  8. Predatory prokaryotes: predation and primary consumption evolved in bacteria

    NASA Technical Reports Server (NTRS)

    Guerrero, R.; Pedros-Alio, C.; Esteve, I.; Mas, J.; Chase, D.; Margulis, L.

    1986-01-01

    Two kinds of predatory bacteria have been observed and characterized by light and electron microscopy in samples from freshwater sulfurous lakes in northeastern Spain. The first bacterium, named Vampirococcus, is Gram-negative and ovoidal (0.6 micrometer wide). An anaerobic epibiont, it adheres to the surface of phototrophic bacteria (Chromatium spp.) by specific attachment structures and, as it grows and divides by fission, destroys its prey. An important in situ predatory role can be inferred for Vampirococcus from direct counts in natural samples. The second bacterium, named Daptobacter, is a Gram-negative, facultatively anaerobic straight rod (0.5 x 1.5 micrometers) with a single polar flagellum, which collides, penetrates, and grows inside the cytoplasm of its prey (several genera of Chromatiaceae). Considering also the well-known case of Bdellovibrio, a Gram-negative, aerobic curved rod that penetrates and divides in the periplasmic space of many chemotrophic Gram-negative bacteria, there are three types of predatory prokaryotes presently known (epibiotic, cytoplasmic, and periplasmic). Thus, we conclude that antagonistic relationships such as primary consumption, predation, and scavenging had already evolved in microbial ecosystems prior to the appearance of eukaryotes. Furthermore, because they represent methods by which prokaryotes can penetrate other prokaryotes in the absence of phagocytosis, these associations can be considered preadaptation for the origin of intracellular organelles.

  9. Emergent spacetime in stochastically evolving dimensions

    NASA Astrophysics Data System (ADS)

    Afshordi, Niayesh; Stojkovic, Dejan

    2014-12-01

    Changing the dimensionality of the space-time at the smallest and largest distances has manifold theoretical advantages. If the space is lower dimensional in the high energy regime, then there are no ultraviolet divergencies in field theories, it is possible to quantize gravity, and the theory of matter plus gravity is free of divergencies or renormalizable. If the space is higher dimensional at cosmological scales, then some cosmological problems (including the cosmological constant problem) can be attacked from a completely new perspective. In this paper, we construct an explicit model of "evolving dimensions" in which the dimensions open up as the temperature of the universe drops. We adopt the string theory framework in which the dimensions are fields that live on the string worldsheet, and add temperature dependent mass terms for them. At the Big Bang, all the dimensions are very heavy and are not excited. As the universe cools down, dimensions open up one by one. Thus, the dimensionality of the space we live in depends on the energy or temperature that we are probing. In particular, we provide a kinematic Brandenberger-Vafa argument for how a discrete causal set, and eventually a continuum (3 + 1)-dim spacetime along with Einstein gravity emerges in the Infrared from the worldsheet action. The (3 + 1)-dim Planck mass and the string scale become directly related, without any compactification. Amongst other predictions, we argue that LHC might be blind to new physics even if it comes at the TeV scale. In contrast, cosmic ray experiments, especially those that can register the very beginning of the shower, and collisions with high multiplicity and density of particles, might be sensitive to the dimensional cross-over.

  10. Evolving Recommendations on Prostate Cancer Screening.

    PubMed

    Brawley, Otis W; Thompson, Ian M; Grönberg, Henrik

    2016-01-01

    Results of a number of studies demonstrate that the serum prostate-specific antigen (PSA) in and of itself is an inadequate screening test. Today, one of the most pressing questions in prostate cancer medicine is how screening can be honed to identify those who have life-threatening disease and need aggressive treatment. A number of efforts are underway. One such effort is the assessment of men in the landmark Prostate Cancer Prevention Trial, which has led to a prostate cancer risk calculator (PCPTRC) that is available online. PCPTRC version 2.0 predicts the probability of the diagnosis of no cancer, low-grade cancer, or high-grade cancer when variables such as PSA, age, race, family history, and physical findings are input. Modern biomarker development promises to provide tests with fewer false positives and improved ability to find high-grade cancers. Stockholm III (STHLM3) is a prospective, population-based, paired, screen-positive prostate cancer diagnostic study assessing a combination of plasma protein biomarkers along with age, family history, previous biopsy, and prostate examination for prediction of prostate cancer. Multiparametric MRI incorporates anatomic and functional imaging to better characterize and predict the future behavior of tumors within the prostate. After diagnosis of cancer, several genomic tests promise to better distinguish the cancers that need treatment from those that need observation. Although the new technologies are promising, there is an urgent need for evaluation of these new tests in high-quality, large population-based studies. Until these technologies are proven, most professional organizations have evolved to a recommendation of informed or shared decision making, in which there is a discussion between the doctor and patient. PMID:27249774

  11. Evolvable Lunar Navigation and Communication Constellations

    NASA Astrophysics Data System (ADS)

    Hamera, Kathryn E.; Mosher, T.

    2008-05-01

    Several international space agencies have announced plans for future lunar exploration missions, including orbiters, rovers, and the eventual build-up of a lunar outpost. Each of these missions will have certain communication and navigation requirements. Some missions will explore parts of the lunar environment that are not directly visible from the Earth, and a lunar relay element will be necessary to provide critical communication and navigation support. Previous research has shown the advantages of using halo orbits for a lunar relay. Halo orbit insertion costs are less than those for geostationary orbit, station-keeping costs are minimal, and the Earth is visible for 100% of the spacecraft's orbital period. An example constellation was designed to show the feasibility of a halo orbit constellation. The methodologies used to construct the example constellation can be tailored to design halo orbit constellations that meet the coverage needs of any lunar mission. Additionally, the halo orbits are selected such that low-energy transfers may be used to move spacecraft between halo orbits for very small maneuver costs. This follow-on research demonstrates the capability of the constellation to evolve and reconfigure through the use of low-energy transfers. Using low-energy transfers, multiple space agencies can utilize the same lunar relay constellation by reconfiguring the spacecraft to provide optimal coverage for different missions at different times. This research was funded by a grant from NASA's Exploration System Mission Directorate to the Colorado Space Grant Consortium and a Graduate Student Research Fellowship from the National Science Foundation.

  12. Contrast-enhanced ultrasound: The evolving applications

    PubMed Central

    Xu, Hui-Xiong

    2009-01-01

    Contrast-enhanced ultrasound (CEUS) is a major breakthrough for ultrasound imaging in recent years. By using a microbubble contrast agent and contrast-specific imaging software, CEUS is able to depict the micro- and macro-circulation of the targeted organ, which in turn leads to improved performance in diagnosis. Due to the special dual blood supply system in the liver, CEUS is particularly suitable for liver imaging. It is evident that CEUS facilitates improvement for characterization of focal liver lesions (FLLs), detection of liver malignancy, guidance for interventional procedures, and evaluation of treatment response after local therapies. CEUS has been demonstrated to be equal to contrast-enhanced computed tomography or magnetic resonance imaging for the characterization of FLLs. In addition, the applicability of CEUS has expanded to non-liver structures such as gallbladder, bile duct, pancreas, kidney, spleen, breast, thyroid, and prostate. The usefulness of CEUS in these applications is confirmed by extensive literature production. Novel applications include detecting bleeding sites and hematomas in patients with abdominal trauma, guiding percutaneous injection therapy and therefore achieving the goal of using interventional ultrasonography in managing splenic trauma, assessing the activity of Crohn’s disease, and detecting suspected endoleaks after endovascular abdominal aneurysm repair. Contrast-enhanced intraoperative ultrasound (US) and intracavitary use of CEUS have been developed and clinically studied. The potential use of CEUS involves sentinel lymph node detection, drug or gene delivery, and molecular imaging. In conclusion, the advent of CEUS has greatly enhanced the usefulness of US and even changed the status of US in clinical practice. The application of CEUS in the clinic is continuously evolving and it is expected that its use will be expanded further in the future. PMID:21160717

  13. Ethical Implications of Validity-vs.-Reliability Trade-Offs in Educational Research

    ERIC Educational Resources Information Center

    Fendler, Lynn

    2016-01-01

    In educational research that calls itself empirical, the relationship between validity and reliability is that of trade-off: the stronger the bases for validity, the weaker the bases for reliability (and vice versa). Validity and reliability are widely regarded as basic criteria for evaluating research; however, there are ethical implications of…

  14. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
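
    The paper's AIS scheme refines its sampling density iteratively toward the failure domain. As a much-reduced illustration of the underlying idea (not the paper's algorithm), the sketch below estimates a small failure probability with a fixed, non-adaptive importance-sampling density shifted toward an assumed failure region; the limit-state function, shift, and sample count are all invented for illustration.

```python
import math
import random

def importance_sampling_pf(g, shift, n=200_000, seed=1):
    """Estimate the failure probability P(g(x) <= 0) for x ~ N(0, I)
    by sampling from a normal density shifted toward the failure
    region -- a fixed-shift simplification of adaptive schemes."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = [rng.gauss(s, 1.0) for s in shift]
        if g(x) <= 0:
            # likelihood ratio of the target N(0, I) to the proposal N(shift, I)
            lr = math.exp(sum(0.5 * ((xi - si) ** 2 - xi ** 2)
                              for xi, si in zip(x, shift)))
            total += lr
    return total / n

# Invented limit state: failure when x1 + x2 > 4
# (exact answer: P(Z > 4 / sqrt(2)), roughly 2.3e-3)
pf = importance_sampling_pf(lambda x: 4.0 - x[0] - x[1], shift=[2.0, 2.0])
```

    Because the proposal density concentrates samples near the failure boundary, the weighted estimate converges far faster than crude Monte Carlo for rare failures, which is the motivation for the adaptive approach in the paper.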

  15. Did the ctenophore nervous system evolve independently?

    PubMed

    Ryan, Joseph F

    2014-08-01

    Recent evidence supports the placement of ctenophores as the most distant relative to all other animals. This revised animal tree means that either the ancestor of all animals possessed neurons (and that sponges and placozoans apparently lost them) or that ctenophores developed them independently. Differentiating between these possibilities is important not only from a historical perspective, but also for the interpretation of a wide range of neurobiological results. In this short perspective paper, I review the evidence in support of each scenario and show that the relationship between the nervous system of ctenophores and other animals is an unsolved, yet tractable problem. PMID:24986234

  16. Enhancing the Principal-School Counselor Relationship: Toolkit

    ERIC Educational Resources Information Center

    College Board Advocacy & Policy Center, 2011

    2011-01-01

    The College Board, NASSP and ASCA believe that the principal-counselor relationship is a dynamic and organic relationship that evolves over time in response to the ever-changing needs of a school. The goal of an effective principal-counselor relationship is to use the strength of the relationship to collaboratively lead school reform efforts to…

  17. Approach to plant automation with evolving technology

    SciTech Connect

    White, J.D.

    1989-01-01

    The US Department of Energy has provided support to Oak Ridge National Laboratory in order to pursue research leading to advanced, automated control of new innovative liquid-metal-cooled nuclear power plants. The purpose of this effort is to conduct research that will help to ensure improved operability, reliability, and safety for advanced LMRs. The plan adopted to achieve these program goals in an efficient and timely manner consists of utilizing, and advancing where required, state-of-the-art controls technology through close interaction with other national laboratories, universities, industry and utilities. A broad range of applications for the control systems strategies and the design environment developed in the course of this program is likely. A natural evolution of automated control in nuclear power plants is envisioned by ORNL to be a phased transition from today's situation of some analog control at the subsystem level with significant operator interaction to the future capability for completely automated digital control with operator supervision. The technical accomplishments provided by this program will assist the industry to accelerate this transition and provide greater economy and safety. The development of this transition to advanced, automated control system designs is expected to have extensive benefits in reduced operating costs, fewer outages, enhanced safety, improved licensability, and improved public acceptance for commercial nuclear power plants. 24 refs.

  18. Cancer and aging. An evolving panorama.

    PubMed

    Balducci, L; Extermann, M

    2000-02-01

    This article illustrates how the nosology of cancer evolves with the patient's age. If the current trends are maintained, 70% of all neoplasms will occur in persons aged 65 years and over by the year 2020, leading to increased cancer-related morbidity among older persons. Cancer control in the older person involves chemoprevention, early diagnosis, and timely and effective treatment that entails both antineoplastic therapy and symptom management. These interventions must be individualized based on a multidimensional assessment that can predict life expectancy and treatment complications and that may evaluate the quality of life of the older person. This article suggests a number of interventions that may improve cancer control in the aged. Public education is needed to illustrate the benefits of health maintenance and early detection of cancer even among older individuals, to create realistic expectations, and to heighten awareness of early symptoms and signs of cancer. Professional education is needed to train students and practitioners in the evaluation and management of the older person. Of special interest is the current initiative of the Hartford Foundation offering combined fellowships in oncology and geriatrics and incorporating principles of geriatric medicine in medical specialty training. Prudent pharmacologic principles must be followed in managing older persons with cytotoxic chemotherapy. These principles include adjusting the dose according to the patient's renal function, using epoetin to maintain hemoglobin levels of 12 g/dL or more, and using hemopoietic growth factors in persons aged 70 years and older receiving cytotoxic chemotherapy of moderate toxicity (e.g., CHOP). To assure uniformity of data, a cooperative oncology group should formulate a geriatric package outlining a common plan for evaluating function and comorbidity. This article also suggests several important areas of research: molecular interactions of age and cancer; host

  19. Mechanics of evolving thin film structures

    NASA Astrophysics Data System (ADS)

    Liang, Jim

    In the Stranski-Krastanov system, the lattice mismatch between the film and the substrate causes the film to break into islands. During annealing, both the surface energy and the elastic energy drive the islands to coarsen. Motivated by several related studies, we suggest that stable islands should form when a stiff ceiling is placed at a small gap above the film. We show that the role of elasticity is reversed: with the ceiling, the total elastic energy stored in the system increases as the islands coarsen laterally. Consequently, the islands select an equilibrium size to minimize the combined elastic energy and surface energy. In lithographically-induced self-assembly, when a two-phase fluid confined between parallel substrates is subjected to an electric field, one phase can self-assemble into a triangular lattice of islands in another phase. We describe a theory of the stability of the island lattice. The islands select the equilibrium diameter to minimize the combined interface energy and electrostatic energy. Furthermore, we study compressed SiGe thin film islands fabricated on a glass layer, which itself lies on a silicon wafer. Upon annealing, the glass flows, and the islands relax. A small island relaxes by in-plane expansion. A large island, however, wrinkles at the center before the in-plane relaxation arrives. The wrinkles may cause significant tensile stress in the island, leading to fracture. We model the island by the von Karman plate theory and the glass layer by the Reynolds lubrication theory. Numerical simulations evolve the in-plane expansion and the wrinkles simultaneously. We determine the critical island size, below which in-plane expansion prevails over wrinkling. Finally, in devices that integrate dissimilar materials in small dimensions, crack extension in one material often accompanies inelastic deformation in another. We analyze a channel crack advancing in an elastic film under tension, while an underlayer creeps. We use a two

  20. The Evolvement of Automobile Steering System Based on TRIZ

    NASA Astrophysics Data System (ADS)

    Zhao, Xinjun; Zhang, Shuang

    Products and techniques pass through a process of birth, growth, maturity, and death, quitting the stage much like a biological evolutionary process. The development of products and techniques conforms to certain evolvement rules. If people know and apply these rules, they can design new kinds of products and forecast the development trends of existing ones. Thereby, enterprises can grasp the future technical directions of their products and carry out product and technique innovation. Below, based on TRIZ theory, the mechanism evolvement, the function evolvement, and the appearance evolvement of the automobile steering system are analyzed, and some new ideas about the future automobile steering system are put forward.

  1. The tortoise and the hare: slowly evolving T-cell responses take hastily evolving KIR

    PubMed Central

    van Bergen, Jeroen; Koning, Frits

    2010-01-01

    The killer cell immunoglobulin-like receptor (KIR) locus comprises a variable and rapidly evolving set of genes encoding multiple inhibitory and activating receptors. The activating receptors recently evolved from the inhibitory receptors and both bind HLA class I and probably also class I-like structures induced by viral infection. Although generally considered natural killer (NK) cell receptors, KIR are also expressed by a large fraction of effector memory T cells, which slowly accumulate during human life. These effector memory cells are functionally similar to NK cells, as they are immediate effector cells that are cytotoxic and produce IFN-γ. However, different rules apply to NK and T cells with respect to KIR expression and function. For example, KIR tend to modulate signals driven by the T-cell receptor (TCR) rather than to act independently, and use different signal transduction pathways to modulate only a subset of effector functions. The most important difference may lie in the rules governing tolerance: while NK cells with activating KIR binding self-HLA are hyporesponsive, the same is unlikely to apply to T cells. We argue that the expression of activating KIR on virus-specific T cells carrying TCR that weakly cross-react with autoantigens can unleash the autoreactive potential of these cells. This may be the case in rheumatoid arthritis, where cytomegalovirus-specific KIR2DS2+ T cells might cause vasculitis. Thus, the rapid evolution of activating KIR may have allowed for efficient NK-cell control of viruses, but may also have increased the risk that slowly evolving T-cell responses to persistent pathogens derail into autoimmunity. PMID:20722764

  2. Optimization of reliability allocation strategies through use of genetic algorithms

    SciTech Connect

    Campbell, J.E.; Painton, L.A.

    1996-08-01

    This paper examines a novel optimization technique called genetic algorithms and its application to the optimization of reliability allocation strategies. Reliability allocation should occur in the initial stages of design, when the objective is to determine an optimal breakdown or allocation of reliability to certain components or subassemblies in order to meet system specifications. The reliability allocation optimization is applied to the design of a cluster tool, a highly complex piece of equipment used in semiconductor manufacturing. The problem formulation is presented, including decision variables, performance measures and constraints, and genetic algorithm parameters. Piecewise "effort curves" specifying the amount of effort required to achieve a certain level of reliability for each component or subassembly are defined. The genetic algorithm evolves or picks those combinations of "effort" or reliability levels for each component which optimize the objective of maximizing Mean Time Between Failures while staying within a budget. The results show that the genetic algorithm is very efficient at finding a set of robust solutions. A time history of the optimization is presented, along with histograms of the solution space fitness, MTBF, and cost for comparative purposes.
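
    As a toy illustration of this kind of allocation (not the authors' formulation), the sketch below defines hypothetical piecewise effort curves for a three-component series system and uses a minimal genetic algorithm to pick the effort levels that maximize MTBF within a budget; every number in it is invented.

```python
import random

# Hypothetical piecewise effort curves: for each of three components,
# effort level 0..3 buys a (failure rate per hour, cost) pair;
# higher effort lowers the failure rate.
CURVES = [
    [(1e-3, 0), (5e-4, 10), (2e-4, 25), (1e-4, 50)],
    [(2e-3, 0), (1e-3, 8),  (4e-4, 20), (2e-4, 45)],
    [(5e-4, 0), (3e-4, 12), (1e-4, 30), (5e-5, 60)],
]
BUDGET = 70  # total effort budget (arbitrary units)

def mtbf(levels):
    """Series system: component failure rates add; MTBF is the reciprocal."""
    return 1.0 / sum(CURVES[i][lv][0] for i, lv in enumerate(levels))

def cost(levels):
    return sum(CURVES[i][lv][1] for i, lv in enumerate(levels))

def fitness(levels):
    """Maximize MTBF; over-budget allocations get zero fitness."""
    return mtbf(levels) if cost(levels) <= BUDGET else 0.0

def ga(pop_size=30, gens=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randrange(4) for _ in CURVES] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)       # one-point crossover
            cut = rng.randrange(1, len(CURVES))
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                # mutation
                child[rng.randrange(len(child))] = rng.randrange(4)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = ga()
```

    For this tiny 64-point search space the GA readily finds near-optimal allocations; the point of the technique is that the same evolve-and-select loop scales to allocation problems far too large to enumerate.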

  3. Reliability of wireless sensor networks.

    PubMed

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2014-01-01

    Wireless Sensor Networks (WSNs) consist of hundreds or thousands of sensor nodes with limited processing, storage, and battery capabilities. There are several strategies to reduce the power consumption of WSN nodes (thereby increasing the network lifetime) and to increase the reliability of the network (by improving the WSN Quality of Service). However, there is an inherent conflict between power consumption and reliability: an increase in reliability usually leads to an increase in power consumption. For example, routing algorithms can send the same packet through different paths (multipath strategy), which is important for reliability, but they significantly increase the WSN power consumption. In this context, this paper proposes a model for evaluating the reliability of WSNs considering the battery level as a key factor. Moreover, this model is based on routing algorithms used by WSNs. In order to evaluate the proposed models, three scenarios were considered to show the impact of the power consumption on the reliability of WSNs. PMID:25157553

  4. Nuclear weapon reliability evaluation methodology

    SciTech Connect

    Wright, D.L.

    1993-06-01

    This document provides an overview of those activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon's stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of opportunities for review that occur throughout the entire process, which assure a consistent approach and appropriate use of the data for reliability evaluation purposes.

  5. Could life have evolved in cometary nuclei

    NASA Technical Reports Server (NTRS)

    Bar-Nun, A.; Lazcano-Araujo, A.; Oro, J.

    1981-01-01

    The suggestion by Hoyle and Wickramasinghe (1978) that life might have originated in cometary nuclei rather than directly on the earth is discussed. Factors in the cometary environment including the conditions at perihelion passage leading to the ablation of cometary ices, ice temperatures, the absence of an atmosphere and discrete liquid and solid surfaces, weak cometary structure incapable of supporting a liquid core, and radiation are presented as arguments against biopoesis in comets. It is concluded that although the contribution of cometary and meteoritic matter was significant in shaping the earth environment, the view that life on earth originally arose in comets is untenable, and the proposition that the process of interplanetary infection still occurs is unlikely in view of the high specificity of host-parasite relationships.

  6. Food addiction: an evolving nonlinear science.

    PubMed

    Shriner, Richard; Gold, Mark

    2014-11-01

    The purpose of this review is to familiarize readers with the role that addiction plays in the formation and treatment of obesity, type 2 diabetes and disorders of eating. We will outline several useful models that integrate metabolism, addiction, and human relationship adaptations to eating. A special effort will be made to demonstrate how the use of simple and straightforward nonlinear models can and are being used to improve our knowledge and treatment of patients suffering from nutritional pathology. Moving forward, the reader should be able to incorporate some of the findings in this review into their own practice, research, teaching efforts or other interests in the fields of nutrition, diabetes, and/or bariatric (weight) management. PMID:25421535

  7. Food Addiction: An Evolving Nonlinear Science

    PubMed Central

    Shriner, Richard; Gold, Mark

    2014-01-01

    The purpose of this review is to familiarize readers with the role that addiction plays in the formation and treatment of obesity, type 2 diabetes and disorders of eating. We will outline several useful models that integrate metabolism, addiction, and human relationship adaptations to eating. A special effort will be made to demonstrate how the use of simple and straightforward nonlinear models can and are being used to improve our knowledge and treatment of patients suffering from nutritional pathology. Moving forward, the reader should be able to incorporate some of the findings in this review into their own practice, research, teaching efforts or other interests in the fields of nutrition, diabetes, and/or bariatric (weight) management. PMID:25421535

  8. Fault tolerant highly reliable inertial navigation system

    NASA Astrophysics Data System (ADS)

    Jeerage, Mahesh; Boettcher, Kevin

    This paper describes the development of failure detection and isolation (FDI) strategies for highly reliable inertial navigation systems. FDI strategies are developed based on the generalized likelihood ratio test (GLRT). A relationship between the detection threshold and the false alarm rate is developed in terms of the sensor parameters. A new method for correct isolation of failed sensors is presented. Evaluations of FDI performance parameters, such as false alarm rate, wrong isolation probability, and correct isolation probability, are presented. Finally, a fault recovery scheme capable of correcting false isolation of good sensors is presented.
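
    The threshold-versus-false-alarm-rate relationship can be illustrated in a much-simplified scalar setting: a zero-mean Gaussian residual monitored against a symmetric threshold. This is an illustration of the general trade-off only, not the paper's GLRT formulation, and the function name is invented.

```python
import math

def false_alarm_rate(threshold, sigma=1.0):
    """P(|r| > threshold) for a zero-mean Gaussian residual r with
    standard deviation sigma, i.e. 2 * (1 - Phi(threshold / sigma)),
    computed here via the complementary error function."""
    return math.erfc(threshold / (sigma * math.sqrt(2.0)))

# A 3-sigma threshold gives the familiar ~0.27% false alarm rate
far = false_alarm_rate(3.0)
```

    Raising the threshold lowers the false alarm rate but also lowers the probability of detecting a genuine fault of given size, which is why the paper expresses the threshold choice in terms of the sensor parameters.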

  9. A fourth generation reliability predictor

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Martensen, Anna L.

    1988-01-01

    A reliability/availability predictor computer program has been developed and is currently being beta-tested by over 30 US companies. The computer program is called the Hybrid Automated Reliability Predictor (HARP). HARP was developed to fill an important gap in reliability assessment capabilities. This gap was manifested through the use of its third-generation cousin, the Computer-Aided Reliability Estimation (CARE III) program, over a six-year development period and an additional three-year period during which CARE III has been in the public domain. The accumulated experience of the over 30 establishments now using CARE III was used in the development of the HARP program.

  10. US electric power system reliability

    NASA Astrophysics Data System (ADS)

    Electric energy supply, transmission and distribution systems are investigated in order to determine priorities for legislation. The status and the outlook for electric power reliability are discussed.

  11. The investigation of supply chain's reliability measure: a case study

    NASA Astrophysics Data System (ADS)

    Taghizadeh, Houshang; Hafezi, Ehsan

    2012-10-01

    In this paper, using the supply chain operations reference model, the reliability evaluation of the available relationships in a supply chain is investigated. For this purpose, in the first step, the chain under investigation is divided into several stages, including first- and second-tier suppliers, initial and final customers, and the producing company. Based on the relationships formed between these stages, the supply chain system is then broken down into different subsystem parts. The relationships between the stages are based on the transportation of orders between stages. Paying attention to the location of the system elements, which can take one of five forms, namely series, parallel, series/parallel, parallel/series, or their combinations, we determine the structure of relationships in the divided subsystems. According to reliability evaluation scales on the three levels of the supply chain, the reliability of each chain is then calculated. Finally, using the formulas for calculating the reliability of combined systems, the reliability of each subsystem and ultimately of the whole system is investigated.
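
    The combination formulas referred to above are the standard series/parallel reliability rules: a series structure multiplies element reliabilities, while a parallel (redundant) structure fails only if every element fails. The sketch below applies them to a hypothetical three-stage chain; the stage layout and all reliability values are invented for illustration.

```python
from math import prod

def r_series(rs):
    """Series structure: the stage works only if every element works."""
    return prod(rs)

def r_parallel(rs):
    """Parallel (redundant) structure: fails only if all elements fail."""
    return 1.0 - prod(1.0 - r for r in rs)

# Hypothetical three-stage chain: two redundant suppliers,
# a single producer, and two redundant delivery routes.
suppliers = r_parallel([0.90, 0.85])
producer = 0.95
routes = r_parallel([0.92, 0.80])
system = r_series([suppliers, producer, routes])  # ~0.92
```

    Note how redundancy lifts each parallel stage above its best single element, while the series combination pulls the whole-system reliability below its weakest stage.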

  12. Evolving Expert Knowledge Bases: Applications of Crowdsourcing and Serious Gaming to Advance Knowledge Development for Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Floryan, Mark

    2013-01-01

    This dissertation presents a novel effort to develop ITS technologies that adapt by observing student behavior. In particular, we define an evolving expert knowledge base (EEKB) that structures a domain's information as a set of nodes and the relationships that exist between those nodes. The structure of this model is not the particularly novel…

  13. Impact of Device Scaling on Deep Sub-micron Transistor Reliability: A Study of Reliability Trends using SRAM

    NASA Technical Reports Server (NTRS)

    White, Mark; Huang, Bing; Qin, Jin; Gur, Zvi; Talmor, Michael; Chen, Yuan; Heidecker, Jason; Nguyen, Duc; Bernstein, Joseph

    2005-01-01

    As microelectronics are scaled into the deep sub-micron regime, users of advanced technology CMOS, particularly in high-reliability applications, should reassess how scaling effects impact long-term reliability. An experimental reliability study of industrial-grade SRAMs, consisting of three different technology nodes, is proposed to substantiate current acceleration models for temperature and voltage life-stress relationships. This reliability study utilizes step-stress techniques to evaluate memory technologies (0.25 μm, 0.15 μm, and 0.13 μm) embedded in many of today's high-reliability space/aerospace applications. Two acceleration modeling approaches are presented to relate experimental FIT calculations to manufacturers' qualification data.

  14. Health-Literate Youth: Evolving Challenges for Health Educators

    ERIC Educational Resources Information Center

    Fetro, Joyce V.

    2010-01-01

    This article presents the author's AAHE Scholar presentation at the 2010 AAHE annual meeting in Indianapolis, Indiana. In her discussion, the author addresses what she sees to be some evolving challenges for health educators working with youth as well as some possible strategies for addressing them. These evolving challenges are: (1) understanding…

  15. The evolving velocity field around protostars

    NASA Astrophysics Data System (ADS)

    Brinch, Christian

    2008-10-01

    Using a hydrodynamical simulation of a gravitational collapse and subsequent disk formation, we calculate a time-resolved synthetic data set with a sophisticated molecular excitation and radiation transfer code. These synthetic data consist of a number of molecular gas emission lines that contain information about the density, temperature, and the velocity field. We use this simulated data set to assess how accurately we can extract information about the underlying velocity field from the lines with a simple parameterized velocity model. This model has only two free parameters, the central stellar mass and a geometric angle that describes the ratio of infall to rotation. We find that, by modeling the spectral lines, we can reliably and uniquely describe the underlying velocity field as given by the hydrodynamical simulation, and we then assume that by applying the same parameterized model to real data, we can equally well determine the velocity field of observed young stellar objects. We observe two young sources, L1489 IRS in the Taurus star forming region and IRAS2A in NGC 1333. Both sources are observed with single-dish telescopes (JCMT, OSO) and with the Submillimeter Array. For L1489 IRS, the interferometric observations reveal a kinematically distinct region on a scale of a few hundred AU, dominated by rotation, which is still surrounded by some envelope material. Contrary to this, IRAS2A shows no sign of rotation despite the fact that a compact (disk) component is needed in order to interpret the continuum measurements. We do not detect this component in the velocity field, and we conclude that IRAS2A is a considerably younger source than L1489 IRS. While this result is based on the gas flow alone, it is entirely consistent with the current classification of IRAS2A as a Class 0 object and L1489 IRS as a Class I object. This thesis also contains a treatment of CO depletion in the disk and envelope. Under certain temperature and density conditions, CO may freeze

  16. Evolving minds: Helping students with cognitive dissonance

    NASA Astrophysics Data System (ADS)

    Bramschreiber, Terry L.

    Even 150 years after Charles Darwin published On the Origin of Species, public school teachers still find themselves dealing with student resistance to learning about biological evolution. Some teachers deal with this pressure by undermining, deemphasizing, or even omitting the topic in their science curriculum. Others face the challenge and deliver solid scientific instruction in evolutionary theory despite the conflicts that may arise. The latter were the topic of this study. I interviewed five teachers who had experience dealing with resistance to learning evolution in their school community. Through these in-depth interviews, I examined the strategies these teachers use when facing resistance and how they help students deal with the cognitive dissonance that may be experienced when learning about evolution. I selected the qualitative method of educational criticism and connoisseurship to organize and categorize my data. From the interviews, the following findings emerged. Experienced teachers increased their confidence in teaching evolution by pursuing outside professional development: they learned more not only about evolutionary theory, but also about creationist arguments against evolution. These teachers front-load their curriculum, integrating the nature of science into their lessons to address misunderstandings about how science works. They also highlight the importance of learning evolutionary theory while assuring students that they have no agenda to indoctrinate them. Finally, these experienced teachers work hard to create an intellectually safe learning environment and to build trusting and respectful relationships with their students.

  17. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensuring reliable performance during the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the behavior of the fastener and joining-part materials, the structural geometry of the joining components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify the reliability, and provides results of the analysis in terms of quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joining components. Based on the results, the paper also describes guidelines to improve the reliability and verification testing.

  18. Avionics design for reliability bibliography

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A bibliography with abstracts was presented in support of AGARD lecture series No. 81. The following areas were covered: (1) program management, (2) design for high reliability, (3) selection of components and parts, (4) environment consideration, (5) reliable packaging, (6) life cycle cost, and (7) case histories.

  19. Computer-Aided Reliability Estimation

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.; Stiffler, J. J.; Bryant, L. A.; Petersen, P. L.

    1986-01-01

    CARE III (Computer-Aided Reliability Estimation, Third Generation) helps estimate reliability of complex, redundant, fault-tolerant systems. Program specifically designed for evaluation of fault-tolerant avionics systems. However, CARE III general enough for use in evaluation of other systems as well.

  20. Laplacian Estrada and normalized Laplacian Estrada indices of evolving graphs.

    PubMed

    Shang, Yilun

    2015-01-01

    Large-scale time-evolving networks have been generated by many natural and technological applications, posing challenges for computation and modeling. Thus, it is of theoretical and practical significance to probe mathematical tools tailored for evolving networks. In this paper, on top of the dynamic Estrada index, we study the dynamic Laplacian Estrada index and the dynamic normalized Laplacian Estrada index of evolving graphs. Using linear algebra techniques, we established general upper and lower bounds for these graph-spectrum-based invariants through a couple of intuitive graph-theoretic measures, including the number of vertices or edges. Synthetic random evolving small-world networks are employed to show the relevance of the proposed dynamic Estrada indices. It is found that neither the static snapshot graphs nor the aggregated graph can approximate the evolving graph itself, indicating the fundamental difference between the static and dynamic Estrada indices. PMID:25822506
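    For a single snapshot graph, the Laplacian Estrada index discussed above is simply the sum of exponentials of the Laplacian eigenvalues, LEE(G) = Σ exp(μ_i) with L = D − A; the dynamic indices extend this over a sequence of snapshots. A minimal sketch of the static computation (the path-graph example is ours, not from the paper):

    ```python
    import numpy as np

    def laplacian_estrada_index(adj):
        """Laplacian Estrada index: sum of exp(mu_i) over the
        eigenvalues mu_i of the graph Laplacian L = D - A."""
        adj = np.asarray(adj, dtype=float)
        deg = np.diag(adj.sum(axis=1))
        lap = deg - adj
        mu = np.linalg.eigvalsh(lap)   # Laplacian is symmetric
        return float(np.exp(mu).sum())

    # Path graph on 3 vertices: Laplacian eigenvalues are 0, 1, 3,
    # so the index equals e^0 + e^1 + e^3.
    P3 = [[0, 1, 0],
          [1, 0, 1],
          [0, 1, 0]]
    print(laplacian_estrada_index(P3))
    ```

    A dynamic variant would apply this per snapshot and aggregate across time steps, which is where the bounds in the paper become useful.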

  1. Laplacian Estrada and Normalized Laplacian Estrada Indices of Evolving Graphs

    PubMed Central

    Shang, Yilun

    2015-01-01

    Large-scale time-evolving networks have been generated by many natural and technological applications, posing challenges for computation and modeling. Thus, it is of theoretical and practical significance to probe mathematical tools tailored for evolving networks. In this paper, on top of the dynamic Estrada index, we study the dynamic Laplacian Estrada index and the dynamic normalized Laplacian Estrada index of evolving graphs. Using linear algebra techniques, we established general upper and lower bounds for these graph-spectrum-based invariants through a couple of intuitive graph-theoretic measures, including the number of vertices or edges. Synthetic random evolving small-world networks are employed to show the relevance of the proposed dynamic Estrada indices. It is found that neither the static snapshot graphs nor the aggregated graph can approximate the evolving graph itself, indicating the fundamental difference between the static and dynamic Estrada indices. PMID:25822506

  2. Gene Essentiality Is a Quantitative Property Linked to Cellular Evolvability.

    PubMed

    Liu, Gaowen; Yong, Mei Yun Jacy; Yurieva, Marina; Srinivasan, Kandhadayar Gopalan; Liu, Jaron; Lim, John Soon Yew; Poidinger, Michael; Wright, Graham Daniel; Zolezzi, Francesca; Choi, Hyungwon; Pavelka, Norman; Rancati, Giulia

    2015-12-01

    Gene essentiality is typically determined by assessing the viability of the corresponding mutant cells, but this definition fails to account for the ability of cells to adaptively evolve to genetic perturbations. Here, we performed a stringent screen to assess the degree to which Saccharomyces cerevisiae cells can survive the deletion of ~1,000 individual "essential" genes and found that ~9% of these genetic perturbations could in fact be overcome by adaptive evolution. Our analyses uncovered a genome-wide gradient of gene essentiality, with certain essential cellular functions being more "evolvable" than others. Ploidy changes were prevalent among the evolved mutant strains, and aneuploidy of a specific chromosome was adaptive for a class of evolvable nucleoporin mutants. These data justify a quantitative redefinition of gene essentiality that incorporates both viability and evolvability of the corresponding mutant cells and will enable selection of therapeutic targets associated with lower risk of emergence of drug resistance. PMID:26627736

  3. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  4. Diversity Against Adversity: How Adaptive Immune System Evolves Potent Antibodies

    NASA Astrophysics Data System (ADS)

    Heo, Muyoung; Zeldovich, Konstantin B.; Shakhnovich, Eugene I.

    2011-07-01

    Adaptive immunity is an amazing mechanism, whereby new protein functions—affinity of antibodies (immunoglobulins) to new antigens—evolve through mutation and selection in a matter of a few days. Despite numerous experimental studies, the fundamental physical principles underlying immune response are still poorly understood. In a considerable departure from past approaches, we here propose a microscopic multiscale model of adaptive immune response, which consists of three essential players: the host cells, viruses, and B-cells in Germinal Centers (GC). Each moiety carries a genome, which encodes proteins whose stability and interactions are determined from their sequences using the laws of statistical mechanics, providing an exact relationship between genomic sequences and the strength of interactions between pathogens and antibodies, and between antibodies and host proteins (autoimmunity). We find that the evolution of potent antibodies (the process known as Affinity Maturation (AM)) is a delicate balancing act, which has to reconcile the conflicting requirements of protein stability, lack of autoimmunity, and high affinity of antibodies to incoming antigens. This becomes possible only when antibody-producing B cells elevate their mutation rates (a process known as Somatic Hypermutation (SHM)) to fall into a certain range: not too low to find potency-increasing mutations, but not too high to destroy stable immunoglobulins and/or already achieved affinity. Potent antibodies develop through clonal expansion of initial B cells expressing marginally potent antibodies, followed by their subsequent affinity maturation through mutation and selection. As a result, in each GC the population of mature potent immunoglobulins is monoclonal, descended from a single cell in the initial (germline) pool. We developed a simple analytical theory, which provides further rationale for our findings. The model and theory reveal the molecular factors that determine the efficiency of affinity maturation.

  5. PV Reliability Development Lessons from JPL's Flat Plate Solar Array Project

    NASA Technical Reports Server (NTRS)

    Ross, Ronald G., Jr.

    2013-01-01

    Key reliability and engineering lessons learned from the 20-year history of the Jet Propulsion Laboratory's Flat-Plate Solar Array Project and thin-film module reliability research activities are presented and analyzed. Particular emphasis is placed on lessons applicable to evolving new module technologies and the organizations involved with these technologies. The user-specific demand for reliability is a strong function of the application, its location, and its expected duration. Lessons relative to effective means of specifying reliability are described, and commonly used test requirements are assessed from the standpoint of which are the most troublesome to pass and which correlate best with field experience. Module design lessons are also summarized, including the significance of the most frequently encountered failure mechanisms and the role of encapsulant and cell reliability in determining module reliability. Lessons pertaining to research, design, and test approaches include the historical role and usefulness of qualification tests and field tests.

  6. Impact of staffing parameters on operational reliability

    SciTech Connect

    Hahn, H.A.; Houghton, F.K.

    1993-01-01

    This paper reports on a project related to human resource management for the Department of Energy's (DOE's) High-Level Waste (HLW) Tank program. The safety and reliability of waste tank operations is impacted by several issues, including not only the design of the tanks themselves, but also how operations and operational personnel are managed. As demonstrated by management assessments performed by the Tiger Teams, DOE believes that the effective use of human resources impacts environment, safety, and health concerns. For the purposes of the current paper, human resource management activities are identified as "Staffing" and include the process of developing the functional responsibilities and qualifications of technical and administrative personnel. This paper discusses the importance of staffing plans and management in the overall view of safety and reliability, and describes the work activities and procedures associated with the project and a review of the results of these activities, including a summary of the literature and a preliminary analysis of the data. We conclude that although the identification of staffing issues and the development of staffing plans contribute to the overall reliability and safety of the HLW tanks, the relationship is not well understood and is in need of further development.

  7. Impact of staffing parameters on operational reliability

    SciTech Connect

    Hahn, H.A.; Houghton, F.K.

    1993-02-01

    This paper reports on a project related to human resource management for the Department of Energy's (DOE's) High-Level Waste (HLW) Tank program. The safety and reliability of waste tank operations is impacted by several issues, including not only the design of the tanks themselves, but also how operations and operational personnel are managed. As demonstrated by management assessments performed by the Tiger Teams, DOE believes that the effective use of human resources impacts environment, safety, and health concerns. For the purposes of the current paper, human resource management activities are identified as "Staffing" and include the process of developing the functional responsibilities and qualifications of technical and administrative personnel. This paper discusses the importance of staffing plans and management in the overall view of safety and reliability, and describes the work activities and procedures associated with the project and a review of the results of these activities, including a summary of the literature and a preliminary analysis of the data. We conclude that although the identification of staffing issues and the development of staffing plans contribute to the overall reliability and safety of the HLW tanks, the relationship is not well understood and is in need of further development.

  8. Fundamental mechanisms of micromachine reliability

    SciTech Connect

    DE BOER,MAARTEN P.; SNIEGOWSKI,JEFFRY J.; KNAPP,JAMES A.; REDMOND,JAMES M.; MICHALSKE,TERRY A.; MAYER,THOMAS K.

    2000-01-01

    Due to extreme surface-to-volume ratios, adhesion and friction are critical properties for the reliability of Microelectromechanical Systems (MEMS), but are not well understood. In this LDRD the authors established test structures, metrology, and numerical modeling to conduct studies on adhesion and friction in MEMS. They then concentrated on measuring the effect of environment on MEMS adhesion. Polycrystalline silicon (polysilicon) is the primary material of interest in MEMS because of its integrated circuit process compatibility, low stress, high strength, and conformal deposition nature. A plethora of useful micromachined device concepts have been demonstrated using Sandia National Laboratories' sophisticated in-house capabilities. One drawback to polysilicon is that in air the surface oxidizes, is high energy, and is hydrophilic (i.e., it wets easily). This can lead to catastrophic failure because surface forces can cause MEMS parts that are brought into contact to adhere rather than perform their intended function. A fundamental concern is how environmental constituents such as water will affect adhesion energies in MEMS. The authors first demonstrated an accurate method to measure adhesion as reported in Chapter 1. In Chapters 2 through 5, they then studied the effect of water on adhesion depending on the surface condition (hydrophilic or hydrophobic). As described in Chapter 2, they find that the adhesion energy of hydrophilic MEMS surfaces is high and increases exponentially with relative humidity (RH). Surface roughness is the controlling mechanism for this relationship. Adhesion can be reduced by several orders of magnitude by silane coupling agents applied via solution processing. They decrease the surface energy and render the surface hydrophobic (i.e., it does not wet easily). However, only a molecular monolayer coats the surface. In Chapters 3-5 the authors map out the extent to which the monolayer reduces adhesion versus RH. They find that adhesion is independent of

  9. Surface capillary currents: Rediscovery of fluid-structure interaction by forced evolving boundary theory

    NASA Astrophysics Data System (ADS)

    Wang, Chunbai; Mitra, Ambar K.

    2016-01-01

    Any boundary surface evolving in a viscous fluid is driven by surface capillary currents. Using a step function defined for the fluid-structure interface, the surface currents near a flat wall are found to take a logarithmic form. The general flat-plate boundary layer is demonstrated through the interface kinematics. The dynamics analysis elucidates the relationship of the surface currents to the adhering region as well as to the no-slip boundary condition. The wall skin-friction coefficient, displacement thickness, and the logarithmic velocity-defect law of smooth flat-plate boundary-layer flow are derived with the forced evolving boundary method. This fundamental theory has wide applications in applied science and engineering.

  10. 18 CFR 39.5 - Reliability Standards.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Reliability Standards... RELIABILITY STANDARDS § 39.5 Reliability Standards. (a) The Electric Reliability Organization shall file each Reliability Standard or modification to a Reliability Standard that it proposes to be made effective...

  11. 18 CFR 39.5 - Reliability Standards.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Reliability Standards... RELIABILITY STANDARDS § 39.5 Reliability Standards. (a) The Electric Reliability Organization shall file each Reliability Standard or modification to a Reliability Standard that it proposes to be made effective...

  12. Could life have evolved in cometary nuclei?

    NASA Astrophysics Data System (ADS)

    Bar-Nun, A.; Lazcano-Araujo, A.; Oró, J.

    1981-12-01

    circumstances. 6. Concerning viruses, the high specificity of host-parasite relationships and their coevolutionary lines of descent, rule out a cometary origin for them. In summary, the view that life originated in comets is untenable in the light of all the available evidence.

  13. Reliability Analysis and Modeling of ZigBee Networks

    NASA Astrophysics Data System (ADS)

    Lin, Cheng-Min

    The architecture of ZigBee networks focuses on developing low-cost, low-speed ubiquitous communication between devices. The ZigBee technique is based on IEEE 802.15.4, which specifies the physical layer and medium access control (MAC) for a low-rate wireless personal area network (LR-WPAN). Currently, numerous wireless sensor networks have adopted the ZigBee open standard to develop various services that promote improved communication quality in our daily lives. The problem of system and network reliability in providing stable services has become more important, because these services stop if the system and network are unreliable. The ZigBee standard defines three kinds of networks: star, tree, and mesh. This paper models the ZigBee protocol stack from the physical layer to the application layer and analyzes the reliability and mean time to failure (MTTF) of each layer. Channel resource usage, device role, network topology, and application objects are used to evaluate reliability in the physical, medium access control, network, and application layers, respectively. In star or tree networks, a series system and the reliability block diagram (RBD) technique can be used to solve the reliability problem. However, because the complexity of a mesh network is higher than that of the others, a division technique is applied: a mesh network is decomposed into several non-reducible series systems and edge-parallel systems. Hence, the reliability of mesh networks is easily solved using series-parallel systems through the proposed scheme. The numerical results demonstrate that reliability increases for mesh networks when the number of edges in the parallel systems increases, while reliability drops quickly when the number of edges and the number of nodes increase for all three networks. Greater use of resources is another factor that decreases reliability. However, lower network reliability will occur due to
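    The layer-by-layer MTTF analysis described above can be illustrated with the standard exponential-lifetime model for a series stack: failure rates add across layers, so the system MTTF is the reciprocal of the summed rates. The per-layer rates below are hypothetical placeholders, not values from the paper:

    ```python
    import math

    # Hypothetical per-layer failure rates (failures per hour) for a
    # node modeled as a series stack: PHY, MAC, NWK, APL.
    rates = {"PHY": 2e-6, "MAC": 1e-6, "NWK": 3e-6, "APL": 4e-6}

    # With exponential lifetimes, a series system fails as soon as any
    # layer fails, so the rates add and MTTF = 1 / sum(lambda_i).
    lam = sum(rates.values())
    mttf = 1.0 / lam
    print(f"system MTTF = {mttf:.0f} hours")

    # Reliability of the whole stack surviving one year of operation:
    t = 8760.0
    r = math.exp(-lam * t)
    ```

    Note how the weakest (highest-rate) layer dominates: halving the APL rate would improve the system MTTF far more than halving the MAC rate.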

  14. Calculating system reliability with SRFYDO

    SciTech Connect

    Morzinski, Jerome; Anderson - Cook, Christine M; Klamann, Richard M

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.

  15. A reliable multicast for XTP

    NASA Technical Reports Server (NTRS)

    Dempsey, Bert J.; Weaver, Alfred C.

    1990-01-01

    Multicast services needed for current distributed applications on LANs fall generally into one of three categories: datagram, semi-reliable, and reliable. Transport-layer multicast datagrams represent an unreliable service in which the transmitting context 'fires and forgets'. XTP executes these semantics when the MULTI and NOERR mode bits are both set. Distributing sensor data, and other applications in which application-level error recovery strategies are appropriate, benefit from the efficiency in multidestination delivery offered by datagram service. Semi-reliable service refers to multicasting in which the control algorithms of the transport layer (error, flow, and rate control) are used in transferring the multicast distribution to the set of receiving contexts, the multicast group. The multicast defined in XTP provides semi-reliable service. Since, under a semi-reliable service, joining a multicast group means listening on the group address and entails no coordination with other members, a semi-reliable facility can be used for communication between a client and a server group as well as for true peer-to-peer group communication. Resource location in a LAN is an important application domain. The term 'semi-reliable' refers to the fact that group membership changes go undetected: no attempt is made to assess the current membership of the group at any time, whether before, during, or after the data transfer.

  16. The effect of genetic robustness on evolvability in digital organisms

    PubMed Central

    2008-01-01

    Background Recent work has revealed that many biological systems keep functioning in the face of mutations and therefore can be considered genetically robust. However, several issues related to robustness remain poorly understood, such as its implications for evolvability (the ability to produce adaptive evolutionary innovations). Results Here, we use the Avida digital evolution platform to explore the effects of genetic robustness on evolvability. First, we obtained digital organisms with varying levels of robustness by evolving them under combinations of mutation rates and population sizes previously shown to select for different levels of robustness. Then, we assessed the ability of these organisms to adapt to novel environments in a variety of experimental conditions. The data consistently support that, for simple environments, genetic robustness fosters long-term evolvability, whereas, in the short-term, robustness is not beneficial for evolvability but may even be a counterproductive trait. For more complex environments, however, results are less conclusive. Conclusion The finding that the effect of robustness on evolvability is time-dependent is compatible with previous results obtained using RNA folding algorithms and transcriptional regulation models. A likely scenario is that, in the short-term, genetic robustness hampers evolvability because it reduces the intensity of selection, but that, in the long-term, relaxed selection facilitates the accumulation of genetic diversity and thus, promotes evolutionary innovation. PMID:18854018

  17. Reliability analysis of interdependent lattices

    NASA Astrophysics Data System (ADS)

    Limiao, Zhang; Daqing, Li; Pengju, Qin; Bowen, Fu; Yinan, Jiang; Zio, Enrico; Rui, Kang

    2016-06-01

    Network reliability analysis has drawn much attention recently due to the risks of catastrophic damage in networked infrastructures. These infrastructures are dependent on each other as a result of various interactions. However, most of the reliability analyses of these interdependent networks do not consider spatial constraints, which are found important for robustness of infrastructures including power grid and transport systems. Here we study the reliability properties of interdependent lattices with different ranges of spatial constraints. Our study shows that interdependent lattices with strong spatial constraints are more resilient than interdependent Erdös-Rényi networks. There exists an intermediate range of spatial constraints, at which the interdependent lattices have minimal resilience.

  18. Evolving role of pharmaceutical physicians in the industry: Indian perspective

    PubMed Central

    Patil, Anant; Rajadhyaksha, Viraj

    2012-01-01

    The Indian pharmaceutical industry, like any other industry, has undergone significant change in the last decade. The role of a medical advisor has always been of paramount importance in pharmaceutical companies in India. On account of evolving medical science and the competitive environment, the medical advisor's role is becoming increasingly critical. In India, with changes in regulatory rules, safety surveillance, and the concept of medical liaisons, the role of the medical advisor is evolving continuously and is likely to evolve further in the coming years in important areas like health economics, public-private partnerships, and strategic planning. PMID:22347701

  19. Reliability Quantification of Advanced Stirling Convertor (ASC) Components

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward

    2010-01-01

    The Advanced Stirling Convertor (ASC) is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17-year mission, the ASC must provide power with the desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints, so reliability demonstration must involve the application of analysis, system- and component-level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test-data verification is a viable approach to assessing the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationships among the design variables based on physics, mechanics, and material behavior models, and the interaction of different components across their respective disciplines such as structures, materials, fluids, thermal, mechanical, and electrical. In addition, these models are based on the available test data, which can be updated, and the analysis refined, as more data and information become available. The failure mechanisms and causes of failure are included in the analysis, especially in light of new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on the fundamental physical behavior of components and their relationships with other components has demonstrated itself to be a superior technique to conventional reliability approaches that utilize failure rates derived from similar equipment or simply expert judgment.

  20. Reliability analysis of common hazardous waste treatment processes

    SciTech Connect

    Waters, R.D.

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
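The abstract does not give the underlying model, but the link it describes between a process safety factor and a reliability level can be illustrated with a small Monte Carlo sketch. All distributions and parameters below are illustrative assumptions, not taken from the paper: the normalized influent load is lognormal with mean 1.0, and the process fails whenever the load exceeds the design capacity.

```python
import random

def process_reliability(safety_factor, n_trials=100_000, seed=1):
    """Estimate the probability that a treatment process sized with the
    given safety factor meets its target. Assumed model (illustrative):
    capacity = safety_factor * mean influent load, influent lognormal
    with mean 1.0, failure = influent exceeds capacity."""
    rng = random.Random(seed)
    capacity = safety_factor * 1.0  # mean load normalized to 1.0
    # lognormal with mu=-0.125, sigma=0.5 has mean exp(mu + sigma^2/2) = 1.0
    failures = sum(1 for _ in range(n_trials)
                   if rng.lognormvariate(-0.125, 0.5) > capacity)
    return 1.0 - failures / n_trials
```

Under these assumptions the simulation reproduces the expected qualitative behavior: larger safety factors yield higher reliability, with diminishing returns.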

  1. Approximation of reliability of direct genomic breeding values

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Two methods to efficiently approximate theoretical genomic reliabilities are presented. The first method is based on the direct inverse of the left hand side (LHS) of mixed model equations. It uses the genomic relationship matrix for a small subset of individuals with the highest genomic relationshi...

  2. The Reliability of Content Analysis of Computer Conference Communication

    ERIC Educational Resources Information Center

    Rattleff, Pernille

    2007-01-01

    The focus of this article is the reliability of content analysis of students' computer conference communication. Content analysis is often used when researching the relationship between learning and the use of information and communications technology in educational settings. A number of studies where content analysis is used and classification…

  3. Reliability analysis in intelligent machines

    NASA Technical Reports Server (NTRS)

    Mcinroy, John E.; Saridis, George N.

    1990-01-01

    Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques are proposed for finding the reliability corresponding to alternative subsets of control and sensing strategies, such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.
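The core idea of ranking alternative strategies by probability of success can be sketched in a few lines. This is a hedged simplification: it assumes independent step successes whose probabilities multiply, whereas the paper derives the probabilities from information-theoretic measures; the strategy names below are invented for illustration.

```python
def strategy_reliability(step_probs):
    """Reliability (probability of success) of one control/sensing
    strategy, assuming independent steps so probabilities multiply."""
    r = 1.0
    for p in step_probs:
        r *= p
    return r

def select_strategy(strategies, spec):
    """Among alternative strategies (name -> list of step success
    probabilities), return the most reliable one meeting the
    specification, or None if none qualifies."""
    feasible = [(strategy_reliability(steps), name)
                for name, steps in strategies.items()
                if strategy_reliability(steps) >= spec]
    return max(feasible)[1] if feasible else None
```

For example, a two-step strategy with step reliabilities 0.99 and 0.95 has overall reliability 0.9405 and would be preferred over a 0.9 x 0.9 = 0.81 alternative for any specification up to 0.9405.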

  4. Reliability and Maintainability (RAM) Training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Packard, Michael H. (Editor)

    2000-01-01

    The theme of this manual is failure physics: the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost, reliable products. In a broader sense the manual should do more: it should underscore the urgent need for mature attitudes toward reliability. Five of the chapters were originally presented as a classroom course to over 1000 Martin Marietta engineers and technicians. Another four chapters and three appendixes have been added. We begin with a view of reliability from the years 1940 to 2000. Chapter 2 starts the training material with a review of mathematics and a description of what elements contribute to product failures. The remaining chapters elucidate basic reliability theory and the disciplines that allow us to control and eliminate failures.

  5. An experiment in software reliability

    NASA Technical Reports Server (NTRS)

    Dunham, J. R.; Pierce, J. L.

    1986-01-01

    The results of a software reliability experiment conducted in a controlled laboratory setting are reported. The experiment was undertaken to gather data on software failures and is one in a series of experiments being pursued by the Fault Tolerant Systems Branch of NASA Langley Research Center to find a means of credibly performing reliability evaluations of flight control software. The experiment tests a small sample of implementations of radar tracking software having ultra-reliability requirements and uses n-version programming for error detection, and repetitive run modeling for failure and fault rate estimation. The experiment results agree with those of Nagel and Skrivan in that the program error rates suggest an approximate log-linear pattern and the individual faults occurred with significantly different error rates. Additional analysis of the experimental data raises new questions concerning the phenomenon of interacting faults. This phenomenon may provide one explanation for software reliability decay.

  6. Failure Analysis for Improved Reliability

    NASA Technical Reports Server (NTRS)

    Sood, Bhanu

    2016-01-01

    Outline: Section 1 - What is reliability and root cause? Section 2 - Overview of failure mechanisms. Section 3 - Failure analysis techniques (1. Non destructive analysis techniques, 2. Destructive Analysis, 3. Materials Characterization). Section 4 - Summary and Closure

  7. GaAs Reliability Database

    NASA Technical Reports Server (NTRS)

    Sacco, T.; Gonzalez, S.; Kayali, S.

    1993-01-01

    The database consists of two main sections, the data references and the device reliability records. The reference section contains 8 fields: reference number, date of publication, authors, article title, publisher, volume, and page numbers.

  8. Photovoltaics Performance and Reliability Workshop

    NASA Astrophysics Data System (ADS)

    Mrig, L.

    This document consists of papers and viewgraphs compiled from the proceedings of a workshop held in September 1992. This workshop was the fifth in a series sponsored by NREL/DOE under the general subject areas of photovoltaic module testing and reliability. PV manufacturers, DOE laboratories, electric utilities, and others exchanged technical knowledge and field experience. The topics of cell and module characterization, module and system performance, materials and module durability/reliability research, solar radiation, and applications are discussed.

  9. Accelerator Availability and Reliability Issues

    SciTech Connect

    Steve Suhring

    2003-05-01

    Maintaining reliable machine operations for existing machines as well as planning for future machines' operability present significant challenges to those responsible for system performance and improvement. Changes to machine requirements and beam specifications often reduce overall machine availability in an effort to meet user needs. Accelerator reliability issues from around the world will be presented, followed by a discussion of the major factors influencing machine availability.

  10. Qualitative Reliability Issues for In-Vessel Solid and Liquid Wall Fusion Designs

    SciTech Connect

    Cadwallader, Lee Charles; Nygren, R. E.

    2001-10-01

    This paper presents the results of a study of the qualitative aspects of plasma facing component (PFC) reliability for actively cooled solid wall and liquid wall concepts for magnetic fusion reactor vessels. These two designs have been analyzed for component failure modes. The most important results of that study are given here. A brief discussion of reliability growth in design is included to illustrate how solid wall designs have begun as workable designs and have evolved over time to become more optimized designs with better longevity. The increase in tolerable heat fluxes shows the improvement. Liquid walls could also have reliability growth if the designs had similar development efforts.

  11. Robust fusion with reliabilities weights

    NASA Astrophysics Data System (ADS)

    Grandin, Jean-Francois; Marques, Miguel

    2002-03-01

    Reliability is a measure of the degree of trust in a given measurement. We analyze and compare: ML (classical Maximum Likelihood), MLE (Maximum Likelihood weighted by Entropy), MLR (Maximum Likelihood weighted by Reliability), MLRE (Maximum Likelihood weighted by Reliability and Entropy), DS (Credibility Plausibility), and DSR (DS weighted by reliabilities). The analysis is based on a model of a dynamical fusion process composed of three sensors, each of which has its own discriminatory capacity, reliability rate, unknown bias, and measurement noise. The knowledge of uncertainties is also severely corrupted, in order to analyze the robustness of the different fusion operators. Two sensor models are used: the first type of sensor is able to estimate the probability of each elementary hypothesis (probabilistic masses); the second type delivers masses on unions of elementary hypotheses (DS masses). In the second case, probabilistic reasoning improperly splits the mass among elementary hypotheses. Compared to classical ML or DS, which achieve just 50% correct classification in some experiments, DSR, MLE, MLR, and MLRE show very good performance on all experiments (more than 80% correct classification). The experiments were performed with large variations of the reliability coefficients for each sensor (from 0 to 1) and with large variations in the knowledge of these coefficients (from 0 to 0.8). All four operators show good robustness, but MLR is uniformly dominant across the experiments in the Bayesian case and achieves the best mean performance under incomplete a priori information.
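The idea behind an MLR-style operator can be sketched generically: scale each sensor's log-likelihood by its reliability, so a sensor with reliability 0 contributes nothing to the fused decision. This is an assumed, simplified formulation for illustration, not the paper's exact operator.

```python
import math

def mlr_fuse(likelihoods, reliabilities):
    """Reliability-weighted maximum-likelihood fusion sketch.
    likelihoods: one list of per-hypothesis likelihoods per sensor;
    reliabilities: one weight in [0, 1] per sensor. Returns a fused
    probability distribution over the hypotheses."""
    n_hyp = len(likelihoods[0])
    # weighted sum of log-likelihoods; floor avoids log(0)
    scores = [sum(r * math.log(max(lik[h], 1e-12))
                  for lik, r in zip(likelihoods, reliabilities))
              for h in range(n_hyp)]
    # softmax-normalize the fused log-scores into probabilities
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]
```

With this weighting, a fully unreliable sensor (weight 0) is ignored, and two equally reliable, contradictory sensors cancel out to a uniform posterior.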

  12. Reliability measure for segmenting algorithms

    NASA Astrophysics Data System (ADS)

    Alvarez, Robert E.

    2004-05-01

    Segmenting is a key initial step in many computer-aided detection (CAD) systems. Our purpose is to develop a method to estimate the reliability of segmenting algorithm results. We use a statistical shape model computed using principal component analysis. The model retains a small number of eigenvectors, or modes, that represent a large fraction of the variance. The residuals between the segmenting result and its projection into the space of retained modes are computed. The sum of the squares of the residuals is transformed to a zero-mean, unit-standard-deviation Gaussian random variable. We also use the standardized scale parameter. The reliability measure is the probability that the transformed residuals and scale parameter are greater than the absolute value of the observed values. We tested the reliability measure on thirty chest x-ray images with leave-one-out testing. The Gaussian assumption was verified using normal probability plots. For each image, a statistical shape model was computed from the hand-digitized data of the rest of the images in the training set. The residuals and scale parameter for the automated segmenting results for that image were used to compute the reliability measure in each case. The reliability measure was significantly lower for two images in the training set with unusual lung fields or processing errors. The data and Matlab scripts for reproducing the figures are at http://www.aprendtech.com/papers/relmsr.zip. Errors detected by the new reliability measure can be used to adjust processing or warn the user.
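The residual computation at the core of this measure can be sketched with NumPy: fit a PCA shape model, project a segmented shape onto the retained modes, and score the sum of squared residuals. The transform of the residuals to a standardized Gaussian variable and the final probability step are omitted here, and the data layout (rows = shapes, columns = stacked landmark coordinates) is an assumption.

```python
import numpy as np

def fit_shape_model(train_shapes, n_modes):
    """PCA shape model: the mean shape plus the first n_modes principal
    modes of the training shapes (rows = shapes, columns = stacked
    landmark coordinates)."""
    mean = train_shapes.mean(axis=0)
    # SVD of the centered data; rows of vt are the principal modes
    _, _, vt = np.linalg.svd(train_shapes - mean, full_matrices=False)
    return mean, vt[:n_modes]

def residual_score(shape, mean, modes):
    """Sum of squared residuals left after projecting a segmented shape
    onto the retained modes; large values flag unusual, likely
    unreliable, segmentation results."""
    d = shape - mean
    r = d - modes.T @ (modes @ d)  # component outside the model subspace
    return float(r @ r)
```

A shape lying inside the model subspace scores near zero; a shape off the subspace scores high and would be flagged.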

  13. MEMS reliability: coming of age

    NASA Astrophysics Data System (ADS)

    Douglass, Michael R.

    2008-02-01

    In today's high-volume semiconductor world, one could easily take reliability for granted. As the MOEMS/MEMS industry continues to establish itself as a viable alternative to conventional manufacturing in the macro world, reliability can be of high concern. Currently, there are several emerging market opportunities in which MOEMS/MEMS is gaining a foothold. Markets such as mobile media, consumer electronics, biomedical devices, and homeland security are all showing great interest in microfabricated products. At the same time, these markets are among the most demanding when it comes to reliability assurance. To be successful, each company developing a MOEMS/MEMS device must consider reliability on an equal footing with cost, performance and manufacturability. What can this maturing industry learn from the successful development of DLP technology, air bag accelerometers and inkjet printheads? This paper discusses some basic reliability principles which any MOEMS/MEMS device development must use. Examples from the commercially successful and highly reliable Digital Micromirror Device complement the discussion.

  14. Developing Classroom Based Instructional Products: An Evolving Set of Guidelines

    ERIC Educational Resources Information Center

    Niedermeyer, Fred C.

    1976-01-01

    The guidelines suggested in this article have evolved from the development of nationally distributed instructional systems over the past seven years at SWRL, a National Institute of Education-sponsored educational research and development laboratory. (Author)

  15. An Evolved International Lunar Decade Global Exploration Roadmap

    NASA Astrophysics Data System (ADS)

    Dunlop, D.; Holder, K.

    2015-10-01

    An Evolved Global Exploration Roadmap (GER) reflecting a proposed International Lunar Decade is presented by an NSS chapter to address many of the omissions from, and the new prospective commercial mission developments since, the 2013 edition of the ISECG GER.

  16. Further Comments on Reliability and Power of Significance Tests and Reliability, Power, Functions, and Relations: A Reply to Humphreys.

    ERIC Educational Resources Information Center

    Humphreys, Lloyd G.; And Others

    1993-01-01

    Two articles discuss the controversy about the relationship between reliability and the power of significance tests in response to the discussion of Donald W. Zimmerman, Richard H. Williams, and Bruno D. Zumbo. Lloyd G. Humphreys emphasizes the differences between what statisticians can do and constraints on researchers. Zimmerman, Williams, and…

  17. The interday reliability of leg and ankle musculotendinous stiffness measures.

    PubMed

    McLachlan, Ken A; Murphy, Aron J; Watsford, Mark L; Rees, Sven

    2006-11-01

    Two popular methods of assessing lower body musculotendinous stiffness include the hopping and oscillation tests. The disparity and paucity of reliability data prompted this investigation into leg musculotendinous stiffness (Kleg) and ankle musculotendinous stiffness (Kank) measures. Kleg and Kank were assessed on three separate occasions in 20 female subjects. Kleg was determined using bilateral hopping procedures conducted at 2.2 Hz and 3.2 Hz frequencies. Kank was assessed by perturbation of the subject's ankle musculotendinous unit on an instrumented calf raise apparatus at 70% of maximum isometric force (MIF). Excellent reliability was produced for all Kleg measures between all days, whereas Kank exhibited acceptable reliability after one session of familiarization. No relationship was evident between Kleg and Kank. It was concluded that no familiarization session was required for Kleg at the test frequencies and conditions tested, whereas at least one familiarization session was needed to ensure the reliable assessment of Kank. PMID:17293626

  18. Evolved Stars: Interferometer Baby Food or Staple Diet?

    NASA Astrophysics Data System (ADS)

    Tuthill, Peter

    With their extreme red and infrared luminosities and large apparent diameters, evolved stars have nurtured generations of interferometers (beginning with Michelson's work on Betelgeuse) with unique science programs at attainable resolutions. Furthermore, the inflated photospheres and circumstellar material associated with dying stars present complex targets with asymmetric structure on many scales, encoding a wealth of poorly understood astrophysics. A brief review of the major past milestones and future prospects for interferometry's contribution to studies of circumstellar matter in evolved stars is presented.

  19. CSP Manufacturing Challenges and Assembly Reliability

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    2000-01-01

    Although the expression CSP is widely used throughout industry, from suppliers to users, its implied definition has evolved as the technology has matured. There is an "expert definition" (a package no larger than 1.5 times the die size) and an "interim definition." CSPs are miniature new packages that industry is starting to implement, and many unresolved technical issues are associated with their implementation. For example, in early 1997, packages with 1 mm pitch and lower were the dominant CSPs, whereas in early 1998 packages with 0.8 mm pitch and lower became the norm. Other changes included the use of flip chip die rather than wire bond in CSPs. Nonetheless, the emerging CSPs are competing with bare die assemblies and are becoming the package of choice for size-reduction applications. These packages provide the small size and performance benefits of bare die or flip chip with the advantages of standard die packages. The JPL-led MicrotypeBGA Consortium of enterprises representing government agencies and private companies has joined together to pool in-kind resources for developing the quality and reliability of chip scale packages (CSPs) for a variety of projects. This talk covers the consortium's experience with technology implementation challenges, including the design and build of both standard and microvia boards, the assembly of two types of test vehicles, and the most current environmental thermal cycling test results.

  20. A new evolutionary system for evolving artificial neural networks.

    PubMed

    Yao, X; Liu, Y

    1997-01-01

    This paper presents a new evolutionary system, EPNet, for evolving artificial neural networks (ANNs). The evolutionary algorithm used in EPNet is based on Fogel's evolutionary programming (EP). Unlike most previous studies on evolving ANNs, this paper puts its emphasis on evolving ANN behaviors, and the five mutation operators proposed in EPNet reflect that emphasis. Close behavioral links between parents and their offspring are maintained by various mutations, such as partial training and node splitting. EPNet evolves ANN architectures and connection weights (including biases) simultaneously in order to reduce the noise in fitness evaluation. The parsimony of evolved ANNs is encouraged by preferring node/connection deletion to addition. EPNet has been tested on a number of benchmark problems in machine learning and ANNs, such as the parity problem, medical diagnosis problems, the Australian credit card assessment problem, and the Mackey-Glass time series prediction problem. The experimental results show that EPNet can produce very compact ANNs with good generalization ability in comparison with other algorithms. PMID:18255671
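The flavor of evolutionary programming applied to neural networks can be shown with a deliberately minimal sketch: Gaussian mutation of the weights of a fixed 2-2-1 network on XOR, with elitist truncation selection. This is a generic EP loop, not EPNet; EPNet's distinctive features (architecture mutations such as node splitting and deletion, and partial training) are omitted, and all parameters below are arbitrary choices for illustration.

```python
import math
import random

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(w, x):
    """Fixed 2-2-1 tanh network; w is a flat list of 9 weights
    (two hidden units with 2 weights + bias each, then the output unit)."""
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return math.tanh(w[6] * h0 + w[7] * h1 + w[8])

def error(w):
    """Sum of squared errors over the XOR patterns."""
    return sum((forward(w, x) - y) ** 2 for x, y in XOR)

def evolve(pop_size=20, generations=300, sigma=0.4, seed=0):
    """Evolutionary programming on the weights only: each parent spawns
    one Gaussian-mutated child, and the best pop_size individuals of
    parents + children survive (elitist truncation selection)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(9)] for _ in range(pop_size)]
    for _ in range(generations):
        children = [[wi + rng.gauss(0, sigma) for wi in w] for w in pop]
        pop = sorted(pop + children, key=error)[:pop_size]
    return pop[0]
```

Because selection is elitist, the best error is non-increasing across generations; the evolved network should comfortably beat the trivial all-zero network (error 2.0 on XOR).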

  1. Reliability of BGA Packages for Highly Reliable Application and Chip Scale Package Board Level Reliability

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    1997-01-01

    Different aspects of advanced surface mount package technology have been investigated for aerospace applications. Three key areas included understanding the assembly reliability behavior of conventional surface mount, Ball Grid Arrays (BGAs), and Chip Scale Packages.

  2. The States and Higher Education: An Evolving Relationship at a Pivotal Moment

    ERIC Educational Resources Information Center

    Meotti, Michael P.

    2016-01-01

    The "proud-parent" attitude of states towards higher education between 1945 and 1970--due to the baby boom, the technological contributions that research universities had made to the war effort, and the GI Bill--began to cool in the late 1960s, when inflation and increasing demands from other state services such as Medicaid, prisons,…

  3. Crossroads and Connections: An Evolving Relationship between NASA and the Navajo Nation

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Carron, A.

    2010-08-01

    Is working with Native Americans business as usual? We live in a project-based world that operates on three-to-five-year grants, so a long-term commitment can be next to impossible to keep, even with the best of intentions. Are there things one "must know" before approaching an indigenous population? How is it best to evaluate projects and programs involving Native Americans? In the NASA and the Navajo Nation project, which will turn five in January 2010, we have compiled some key lessons learned that we hope will inform and encourage future partnerships between the space science education and Native American communities.

  4. 18 CFR 39.5 - Reliability Standards.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Reliability Standards... RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.5 Reliability Standards. (a) The Electric Reliability Organization shall file...

  5. 18 CFR 39.11 - Reliability reports.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Reliability reports. 39... RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.11 Reliability reports. (a) The Electric Reliability Organization shall...

  6. Climate in Context - How partnerships evolve in regions

    NASA Astrophysics Data System (ADS)

    Parris, A. S.

    2014-12-01

    In 2015, NOAA's RISA program will celebrate its 20th year of exploration in the development of usable climate information. In the mid-1990s, a vision emerged to develop interdisciplinary research efforts at the regional scale for several important reasons. Recognizable climate patterns, such as the El Nino Southern Oscillation (ENSO), emerge at the regional level, where our understanding of observations and models coalesce. Critical resources for society, such as water supply, are managed in a context of regional systems and human populations. Multiple scales of governance (local, state, and federal) with complex institutional relationships can be examined across a region. Climate information (i.e., data, science, research, etc.) developed within these contexts has greater potential for use. All of this work rests on a foundation of iterative engagement between scientists and decision makers. Throughout these interactions, RISAs have navigated diverse politics, extreme events and disasters, socio-economic and ecological disruptions, and advances in both science and technology. Our understanding of information needs is evolving into a richer understanding of the complex institutional, legal, political, and cultural contexts within which people can use science to make informed decisions. The outcomes of RISA work include both cases where climate information was used in decisions and cases where the capacity for using climate information and making climate-resilient decisions has increased over time. In addition to balancing the supply and demand of scientific information, RISAs are engaged in a social process of reconciling climate information use with important drivers of society. Because partnerships are critical for sustained engagement, and because engagement is critically important to the use of science, the rapid development of new capacity in regionally based science programs focused on providing climate decision support is both needed and challenging. New actors can bolster

  7. Emergence of memory-driven command neurons in evolved artificial agents.

    PubMed

    Aharonov-Barki, R; Beker, T; Ruppin, E

    2001-03-01

    Using evolutionary simulations, we develop autonomous agents controlled by artificial neural networks (ANNs). In simple lifelike tasks of foraging and navigation, high performance levels are attained by agents equipped with fully recurrent ANN controllers. In a set of experiments sharing the same behavioral task but differing in the sensory input available to the agents, we find a common structure of a command neuron switching the dynamics of the network between radically different behavioral modes. When sensory position information is available, the command neuron reflects a map of the environment, acting as a location-dependent cell sensitive to the location and orientation of the agent. When such information is unavailable, the command neuron's activity is based on a spontaneously evolving short-term memory mechanism, which underlies its apparent place-sensitive activity. A two-parameter stochastic model for this memory mechanism is proposed. We show that the parameter values emerging from the evolutionary simulations are near optimal; evolution takes advantage of seemingly harmful features of the environment to maximize the agent's foraging efficiency. The accessibility of evolved ANNs for a detailed inspection, together with the resemblance of some of the results to known findings from neurobiology, places evolved ANNs as an excellent candidate model for the study of structure and function relationship in complex nervous systems. PMID:11244562

  8. TEMPI: probabilistic modeling time-evolving differential PPI networks with multiPle information

    PubMed Central

    Kim, Yongsoo; Jang, Jin-Hyeok; Choi, Seungjin; Hwang, Daehee

    2014-01-01

    Motivation: Time-evolving differential protein–protein interaction (PPI) networks are essential to understand serial activation of differentially regulated (up- or downregulated) cellular processes (DRPs) and their interplays over time. Despite developments in the network inference, current methods are still limited in identifying temporal transition of structures of PPI networks, DRPs associated with the structural transition and the interplays among the DRPs over time. Results: Here, we present a probabilistic model for estimating Time-Evolving differential PPI networks with MultiPle Information (TEMPI). This model describes probabilistic relationships among network structures, time-course gene expression data and Gene Ontology biological processes (GOBPs). By maximizing the likelihood of the probabilistic model, TEMPI estimates jointly the time-evolving differential PPI networks (TDNs) describing temporal transition of PPI network structures together with serial activation of DRPs associated with transiting networks. This joint estimation enables us to interpret the TDNs in terms of temporal transition of the DRPs. To demonstrate the utility of TEMPI, we applied it to two time-course datasets. TEMPI identified the TDNs that correctly delineated temporal transition of DRPs and time-dependent associations between the DRPs. These TDNs provide hypotheses for mechanisms underlying serial activation of key DRPs and their temporal associations. Availability and implementation: Source code and sample data files are available at http://sbm.postech.ac.kr/tempi/sources.zip. Contact: seungjin@postech.ac.kr or dhwang@dgist.ac.kr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25161233

  9. System reliability and risk assessment task goals and status

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Mahadevan, S.

    1991-01-01

    The major focus for continued development of the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) codes is in support of system testing and certification of advanced propulsion systems. Propulsion system testing has evolved over the years from tests designed to show success, to tests designed to reveal reliability issues before service use. Such test conditions as performance envelope corners, high rotor imbalance, power dwells, and overspeed tests are designed to shake out problems that can be associated with low and high cycle fatigue, creep, and stress rupture, bearing durability, and the like. Subsystem testing supports system certification by standing as an early evaluation of the same durability and reliability concerns as for the entire system. The NESSUS software system is being further developed to support the definition of rigorous subsystem and system test definition and reliability certification. The principal technical issues are outlined which are related to system reliability, including key technology issues such as failure mode synergism, sequential failure mechanisms, and fault tree definition.

  10. Assessment of NDE reliability data

    NASA Technical Reports Server (NTRS)

    Yee, B. G. W.; Couchman, J. C.; Chang, F. H.; Packman, D. F.

    1975-01-01

    Twenty sets of relevant nondestructive test (NDT) reliability data were identified, collected, compiled, and categorized. A criterion for the selection of data for statistical analysis considerations was formulated, and a model to grade the quality and validity of the data sets was developed. Data input formats, which record the pertinent parameters of the defect/specimen and inspection procedures, were formulated for each NDE method. A comprehensive computer program was written and debugged to calculate the probability of flaw detection at several confidence limits by the binomial distribution. This program also selects the desired data sets for pooling and tests the statistical pooling criteria before calculating the composite detection reliability. An example of the calculated reliability of crack detection in bolt holes by an automatic eddy current method is presented.
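The "probability of flaw detection at several confidence limits by the binomial distribution" calculation mentioned above can be sketched as a Clopper-Pearson lower confidence bound: given k detections in n inspections, find the lowest detection probability consistent with the data at the stated confidence. This is a standard binomial construction offered as an illustration; the original program's exact method is not specified in the abstract.

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def pod_lower_bound(k, n, confidence):
    """One-sided lower confidence bound on the probability of detection
    given k detections in n inspections (Clopper-Pearson, solved by
    bisection on the binomial survival function)."""
    if k == 0:
        return 0.0
    alpha = 1.0 - confidence
    lo, hi = 0.0, 1.0
    for _ in range(60):  # bisect: binom_sf is increasing in p
        mid = (lo + hi) / 2
        if binom_sf(k, n, mid) > alpha:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2
```

With k = n the bound has the closed form (1 - C)**(1/n), which reproduces the classic NDE rule of thumb that 29 of 29 detections demonstrate roughly 90% POD at 95% confidence.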

  11. Electronics reliability and measurement technology

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S. (Editor)

    1987-01-01

    A summary is presented of the Electronics Reliability and Measurement Technology Workshop. The meeting examined the U.S. electronics industry with particular focus on reliability and state-of-the-art technology. A general consensus of the approximately 75 attendees was that "the U.S. electronics industries are facing a crisis that may threaten their existence". The workshop had specific objectives to discuss mechanisms to improve areas such as reliability, yield, and performance while reducing failure rates, delivery times, and cost. The findings of the workshop addressed various aspects of the industry from wafers to parts to assemblies. Key problem areas that were singled out for attention are identified, and action items necessary to accomplish their resolution are recommended.

  12. A Review of Score Reliability: Contemporary Thinking on Reliability Issues

    ERIC Educational Resources Information Center

    Rosen, Gerald A.

    2004-01-01

    Bruce Thompson's edited volume begins with a basic principle, one might call it a basic truth: "reliability is a property that applies to scores, and not immutably across all conceivable uses everywhere of a given measure" (p. 3). The author claims that this principle is little known and/or little understood. While that is an arguable point, the…

  13. Reliability in the design phase

    SciTech Connect

    Siahpush, A.S.; Hills, S.W.; Pham, H.; Majumdar, D.

    1991-12-01

    A study was performed to determine the common methods and tools that are available to calculate or predict a system's reliability. A literature review and software survey are included. The desired product of this developmental work is a tool for the system designer to use in the early design phase so that the final design will achieve the desired system reliability without lengthy testing and rework. Three computer programs were written which provide a first attempt at fulfilling this need. The programs are described and a case study is presented for each one. This is a continuing effort which will be furthered in FY-1992. 10 refs.

  15. Photovoltaic power system reliability considerations

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.

    1980-01-01

    An example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems is presented. This particular application is for a solar cell power system demonstration project designed to provide electric power requirements for remote villages. The techniques utilized involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of fail-safe and planned spare parts engineering philosophy.

  16. Metrological Reliability of Medical Devices

    NASA Astrophysics Data System (ADS)

    Costa Monteiro, E.; Leon, L. F.

    2015-02-01

    The prominent development of health technologies in the 20th century triggered demands for metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of biomeasurement results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with an analysis of their contributions to guaranteeing innovative health technologies' compliance with the main pillars of bioethics.

  17. Reliability growth models for NASA applications

    NASA Technical Reports Server (NTRS)

    Taneja, Vidya S.

    1991-01-01

    The objective of any reliability growth study is prediction of reliability at some future instant. Another objective is statistical inference, estimation of reliability for reliability demonstration. A cause of concern for the development engineer and management is that reliability demands an excessive number of tests for reliability demonstration. For example, the Space Transportation Main Engine (STME) program requirements call for .99 reliability at 90 pct. confidence for demonstration. This requires running 230 tests with zero failure if a classical binomial model is used. It is therefore also an objective to explore the reliability growth models for reliability demonstration and tracking and their applicability to NASA programs. A reliability growth model is an analytical tool used to monitor the reliability progress during the development program and to establish a test plan to demonstrate an acceptable system reliability.
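    The 230-test figure follows from the classical zero-failure binomial argument: after n consecutive successes, reliability R is demonstrated at confidence C once R^n ≤ 1 − C. A minimal sketch (the function name is illustrative, not from the source):

```python
import math

def zero_failure_tests(reliability: float, confidence: float) -> int:
    """Number of consecutive failure-free tests needed to demonstrate
    `reliability` at `confidence` under the classical binomial model:
    the probability that a system with true reliability below the target
    survives all n tests must drop to (1 - confidence)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# The STME requirement quoted above: 0.99 reliability at 90% confidence.
print(zero_failure_tests(0.99, 0.90))  # -> 230
```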

  18. Evolving scalable and modular adaptive networks with Developmental Symbolic Encoding.

    PubMed

    Suchorzewski, Marcin

    2011-09-01

    Evolutionary neural networks, or neuroevolution, appear to be a promising way to build versatile adaptive systems, combining evolution and learning. One of the most challenging problems of neuroevolution is finding a scalable and robust genetic representation that allows increasingly complex networks to be grown effectively for increasingly complex tasks. In this paper we propose a novel developmental encoding for networks, featuring scalability, modularity, regularity and hierarchy. The encoding can represent structural regularities of networks and build them from encapsulated and possibly reused subnetworks. These capabilities are demonstrated on several test problems. In particular, for parity and symmetry problems we evolve solutions that are fully general with respect to the number of inputs. We also evolve scalable and modular weightless recurrent networks capable of autonomous learning in a simple generic classification task. The encoding is very flexible and we demonstrate this by evolving networks capable of learning via neuromodulation. Finally, we evolve modular solutions to the retina problem, for which another well-known neuroevolution method, HyperNEAT, was previously shown to fail. The proposed encoding also outperformed HyperNEAT and Cellular Encoding in another experiment, in which certain connectivity patterns must be discovered between layers. We therefore conclude that the proposed encoding is an interesting and competitive approach to evolving networks. PMID:21957432

  19. The value of monitoring to control evolving populations

    PubMed Central

    Fischer, Andrej; Mustonen, Ville

    2015-01-01

    Populations can evolve to adapt to external changes. The capacity to evolve and adapt makes successful treatment of infectious diseases and cancer difficult. Indeed, therapy resistance has become a key challenge for global health. Therefore, ideas of how to control evolving populations to overcome this threat are valuable. Here we use the mathematical concepts of stochastic optimal control to study what is needed to control evolving populations. Following established routes to calculate control strategies, we first study how a polymorphism can be maintained in a finite population by adaptively tuning selection. We then introduce a minimal model of drug resistance in a stochastically evolving cancer cell population and compute adaptive therapies. When decisions are based in this manner on monitoring the response of the tumor, they can outperform established therapy paradigms. For both case studies, we demonstrate the importance of high-resolution monitoring of the target population to achieve a given control objective, thus quantifying the intuition that to control, one must monitor. PMID:25587136
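    The first case study, maintaining a polymorphism by adaptively tuning selection, can be caricatured in a few lines. This is a toy Wright-Fisher sketch with an assumed proportional feedback controller, not the paper's optimal-control solution:

```python
import random

def controlled_wright_fisher(n_pop=1000, generations=500, gain=2.0, seed=42):
    """Toy polymorphism maintenance: each generation, a controller sets
    the selection coefficient s in proportion to the allele's distance
    from the target frequency 0.5 (favoring the rarer allele), then the
    population reproduces by Wright-Fisher binomial sampling."""
    rng = random.Random(seed)
    x = 0.5  # allele frequency
    for _ in range(generations):
        s = gain * (0.5 - x)                        # feedback on monitored x
        p = x * (1 + s) / (x * (1 + s) + (1 - x))   # post-selection frequency
        # binomial sampling of n_pop offspring
        x = sum(1 for _ in range(n_pop) if rng.random() < p) / n_pop
    return x

x_final = controlled_wright_fisher()
# With feedback on, the allele neither fixes nor is lost: x_final stays near 0.5.
```

    Without the feedback term (gain = 0), drift would eventually fix or lose the allele, which is the intuition "to control, one must monitor" in miniature.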

  20. How Hierarchical Topics Evolve in Large Text Corpora.

    PubMed

    Cui, Weiwei; Liu, Shixia; Wu, Zhuofeng; Wei, Hao

    2014-12-01

    Using a sequence of topic trees to organize documents is a popular way to represent hierarchical and evolving topics in text corpora. However, following evolving topics in the context of topic trees remains difficult for users. To address this issue, we present an interactive visual text analysis approach to allow users to progressively explore and analyze the complex evolutionary patterns of hierarchical topics. The key idea behind our approach is to exploit a tree cut to approximate each tree and allow users to interactively modify the tree cuts based on their interests. In particular, we propose an incremental evolutionary tree cut algorithm with the goal of balancing 1) the fitness of each tree cut and the smoothness between adjacent tree cuts; 2) the historical and new information related to user interests. A time-based visualization is designed to illustrate the evolving topics over time. To preserve the mental map, we develop a stable layout algorithm. As a result, our approach can quickly guide users to progressively gain profound insights into evolving hierarchical topics. We evaluate the effectiveness of the proposed method on Amazon's Mechanical Turk and real-world news data. The results show that users are able to successfully analyze evolving topics in text data. PMID:26356942

  1. Evolving Synaptic Plasticity with an Evolutionary Cellular Development Model

    PubMed Central

    Yerushalmi, Uri; Teicher, Mina

    2008-01-01

    Since synaptic plasticity is regarded as a potential mechanism for memory formation and learning, there is growing interest in the study of its underlying mechanisms. Recently several evolutionary models of cellular development have been presented, but none have been shown to be able to evolve a range of biological synaptic plasticity regimes. In this paper we present a biologically plausible evolutionary cellular development model and test its ability to evolve different biological synaptic plasticity regimes. The core of the model is a genomic and proteomic regulation network which controls cells and their neurites in a 2D environment. The model has previously been shown to successfully evolve behaving organisms, enable gene related phenomena, and produce biological neural mechanisms such as temporal representations. Several experiments are described in which the model evolves different synaptic plasticity regimes using a direct fitness function. Other experiments examine the ability of the model to evolve simple plasticity regimes in a task-based fitness function environment. These results suggest that such evolutionary cellular development models have the potential to be used as a research tool for investigating the evolutionary aspects of synaptic plasticity and at the same time can serve as the basis for novel artificial computational systems. PMID:19002249

  2. The value of monitoring to control evolving populations.

    PubMed

    Fischer, Andrej; Vázquez-García, Ignacio; Mustonen, Ville

    2015-01-27

    Populations can evolve to adapt to external changes. The capacity to evolve and adapt makes successful treatment of infectious diseases and cancer difficult. Indeed, therapy resistance has become a key challenge for global health. Therefore, ideas of how to control evolving populations to overcome this threat are valuable. Here we use the mathematical concepts of stochastic optimal control to study what is needed to control evolving populations. Following established routes to calculate control strategies, we first study how a polymorphism can be maintained in a finite population by adaptively tuning selection. We then introduce a minimal model of drug resistance in a stochastically evolving cancer cell population and compute adaptive therapies. When decisions are based in this manner on monitoring the response of the tumor, they can outperform established therapy paradigms. For both case studies, we demonstrate the importance of high-resolution monitoring of the target population to achieve a given control objective, thus quantifying the intuition that to control, one must monitor. PMID:25587136

  3. Preliminary study of the reliability of imaging charge coupled devices

    NASA Technical Reports Server (NTRS)

    Beall, J. R.; Borenstein, M. D.; Homan, R. A.; Johnson, D. L.; Wilson, D. D.; Young, V. F.

    1978-01-01

    Imaging CCDs are capable of low light level response and high signal-to-noise ratios. In space applications they offer the user the ability to achieve extremely high resolution imaging with minimum circuitry in the photo sensor array. This work relates the CCD121H Fairchild device to the fundamentals of CCDs and the representative technologies. Several failure modes are described, construction is analyzed and test results are reported. In addition, the relationship of the device reliability to packaging principles is analyzed and test data presented. Finally, a test program is defined for more general reliability evaluation of CCDs.

  4. Creating good relationships: responsiveness, relationship quality, and interpersonal goals.

    PubMed

    Canevello, Amy; Crocker, Jennifer

    2010-07-01

    Perceived partner responsiveness is a core feature of close, satisfying relationships. But how does responsiveness originate? Can people create relationships characterized by high responsiveness and, consequently, higher quality relationships? The authors suggest that goals contribute to cycles of responsiveness between two people, improving relationship quality for both of them. The present studies examine (a) how interpersonal goals initiate responsiveness processes in close relationships, (b) the self-perpetuating nature of these processes, and (c) how responsiveness evolves dynamically over time through both intrapersonal projection and reciprocal interpersonal relationship processes. In a semester-long study of 115 roommate dyads, actors' compassionate and self-image goals predicted a cycle of responsiveness between roommates, occurring within weeks and across the semester. In a 3-week study of 65 roommate dyads, actors' goals again predicted cycles of responsiveness between roommates, which then contributed to both actors' and partners' relationship quality. Results suggest that both projection and reciprocation of responsiveness associated with compassionate goals create upward spirals of responsiveness that ultimately enhance relationship quality for both people. PMID:20565187

  5. The Reliability of College Grades

    ERIC Educational Resources Information Center

    Beatty, Adam S.; Walmsley, Philip T.; Sackett, Paul R.; Kuncel, Nathan R.; Koch, Amanda J.

    2015-01-01

    Little is known about the reliability of college grades relative to how prominently they are used in educational research, and the results to date tend to be based on small sample studies or are decades old. This study uses two large databases (N > 800,000) from over 200 educational institutions spanning 13 years and finds that both first-year…

  6. Web Awards: Are They Reliable?

    ERIC Educational Resources Information Center

    Everhart, Nancy; McKnight, Kathleen

    1997-01-01

    School library media specialists recommend quality Web sites to children based on evaluations and Web awards. This article examines three types of Web awards and who grants them, suggests ways to determine their reliability, and discusses specific award sites. Includes a bibliography of Web sites. (PEN)

  7. Reliability Analysis of Money Habitudes

    ERIC Educational Resources Information Center

    Delgadillo, Lucy M.; Bushman, Brittani S.

    2015-01-01

    Use of the Money Habitudes exercise has gained popularity among various financial professionals. This article reports on the reliability of this resource. A survey administered to young adults at a western state university was conducted, and each Habitude or "domain" was analyzed using Cronbach's alpha procedures. Results showed all six…
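    For reference, the Cronbach's alpha procedure named above is the standard ratio of summed item variances to total-score variance. A minimal sketch follows; the data layout (one list of scores per item) is an assumption for illustration, not the study's instrument:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per
    item, same respondents in the same order), via the standard formula:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(items)
    n = len(items[0])
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Two perfectly correlated items yield alpha = 1.0.
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))  # -> 1.0
```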

  8. Wind turbine reliability database update.

    SciTech Connect

    Peters, Valerie A.; Hill, Roger Ray; Stinebaugh, Jennifer A.; Veers, Paul S.

    2009-03-01

    This report documents the status of the Sandia National Laboratories' Wind Plant Reliability Database. Included in this report are updates on the form and contents of the Database, which stems from a five-step process of data partnerships, data definition and transfer, data formatting and normalization, analysis, and reporting. Selected observations are also reported.

  9. Averaging Internal Consistency Reliability Coefficients

    ERIC Educational Resources Information Center

    Feldt, Leonard S.; Charter, Richard A.

    2006-01-01

    Seven approaches to averaging reliability coefficients are presented. Each approach starts with a unique definition of the concept of "average," and no approach is more correct than the others. Six of the approaches are applicable to internal consistency coefficients. The seventh approach is specific to alternate-forms coefficients. Although the…

  10. Photovoltaic performance and reliability workshop

    SciTech Connect

    Kroposki, B

    1996-10-01

    This proceedings is a compilation of the papers presented at the ninth PV Performance and Reliability Workshop, held at the Sheraton Denver West Hotel on September 4-6, 1996. This year's workshop included presentations from 25 speakers and had over 100 attendees. All of the presentations given are included in this proceedings. Topics of the papers included: defining service lifetime and developing models for PV module lifetime; examining and determining failure and degradation mechanisms in PV modules; combining IEEE/IEC/UL testing procedures; AC module performance and reliability testing; inverter reliability/qualification testing; standardization of utility interconnect requirements for PV systems; needed activities to separate variables by testing individual components of PV systems (e.g., cells, modules, batteries, inverters, charge controllers) for individual reliability and then testing them in actual system configurations; further results from field experience with modules, inverters, batteries, and charge controllers in deployed PV systems; and system certification and standardized testing for stand-alone and grid-tied systems.

  11. Wanted: A Solid, Reliable PC

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2004-01-01

    This article discusses PC reliability, one of the most pressing issues regarding computers. Nearly a quarter century after the introduction of the first IBM PC and the outset of the personal computer revolution, PCs have largely become commodities, with little differentiating one brand from another in terms of capability and performance. Most of…

  12. Discourse Analysis Procedures: Reliability Issues.

    ERIC Educational Resources Information Center

    Hux, Karen; And Others

    1997-01-01

    A study evaluated and compared four methods of assessing reliability on one discourse analysis procedure--a modified version of Damico's Clinical Discourse Analysis. The methods were Pearson product-moment correlations; interobserver agreement; Cohen's kappa; and generalizability coefficients. The strengths and weaknesses of the methods are…

  13. Synthesis of Evolving Cells for Reconfigurable Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Padayachee, J.; Bright, G.

    2014-07-01

    The concept of Reconfigurable Manufacturing Systems (RMSs) was formulated due to the global necessity for production systems that are able to economically evolve according to changes in markets and products. Technologies and design methods are under development to enable RMSs to exhibit transformable system layouts, reconfigurable processes, cells and machines. Existing factory design methods and software have not yet advanced to include reconfigurable manufacturing concepts. This paper presents the underlying group technology framework for the design of manufacturing cells that are able to evolve according to a changing product mix by mechanisms of reconfiguration. The framework is based on a Norton-Bass forecast and time variant BOM models. An adaptation of legacy group technology methods is presented for the synthesis of evolving cells and two optimization problems are presented within this context.

  14. Hybridization Reveals the Evolving Genomic Architecture of Speciation

    PubMed Central

    Kronforst, Marcus R.; Hansen, Matthew E.B.; Crawford, Nicholas G.; Gallant, Jason R.; Zhang, Wei; Kulathinal, Rob J.; Kapan, Durrell D.; Mullen, Sean P.

    2014-01-01

    The rate at which genomes diverge during speciation is unknown, as are the physical dynamics of the process. Here, we compare full genome sequences of 32 butterflies, representing five species from a hybridizing Heliconius butterfly community, to examine genome-wide patterns of introgression and infer how divergence evolves during the speciation process. Our analyses reveal that initial divergence is restricted to a small fraction of the genome, largely clustered around known wing-patterning genes. Over time, divergence evolves rapidly, due primarily to the origin of new divergent regions. Furthermore, divergent genomic regions display signatures of both selection and adaptive introgression, demonstrating the link between microevolutionary processes acting within species and the origin of species across macroevolutionary timescales. Our results provide a uniquely comprehensive portrait of the evolving species boundary due to the role that hybridization plays in reducing the background accumulation of divergence at neutral sites. PMID:24183670

  15. Lessons learned from the comparison of evolve and chain

    NASA Astrophysics Data System (ADS)

    Reynolds, R.; Eichler, P.

    1997-05-01

    EVOLVE and CHAIN are two models for the projection of the orbital debris environment. They were developed independently using very different conceptual approaches. Consequently, their comparison has proven to be valuable for validating the debris environment projections of both programs. The project to use EVOLVE and CHAIN to validate one another and to develop a complementary use of the two codes has been documented in a series of papers. The early papers in this series were focused on a comparison of results for environment projections over the next 100 years. These comparisons produced relatively minor changes (and improvements) in both programs that could be explained by conceptual differences designed into the original codes. Later papers in the series focused on using EVOLVE to establish rate coefficients to be used by CHAIN for longer term environment projections.

  16. Composition of the Silicates around Evolved Stars and Protostars

    NASA Astrophysics Data System (ADS)

    Demyk, K.; Dartois, E.; Wiesemeyer, H.; Jones, A.; D'Hendecourt, L.; Jourdain de Muizon, M.; Heras, A. M.

    2000-11-01

    We present a study of the composition of the silicates around five evolved stars and three high-mass protostars. Around evolved stars, the oxygen-rich dust is composed of amorphous olivine, crystalline silicates (enstatite, forsterite, diopside) and some oxides (FeO, Al2O3). Using a radiative transfer code we have modelled the SED of two OH/IR stars. We estimate that the amount of crystalline silicates in these objects is of the order of 20%. Around protostars, the dust is composed of porous pyroxene and/or aluminosilicate grains containing iron oxide. We calculate that at most 1-2% of the dust mass is crystalline. The newly formed dust around evolved stars has a different structure and composition from the old dust found around protostars. This implies that some mechanism, which remains to be found, occurs during the grain lifetime and alters the chemical composition and structure of the grains.

  17. The evolved basis and adaptive functions of cognitive distortions.

    PubMed

    Gilbert, P

    1998-12-01

    This paper explores common cognitive distortions from the perspective of evolutionary psychology. It is suggested that cognitive distortions are natural consequences of using fast track defensive algorithms that are sensitive to threat. In various contexts, especially those of threat, humans evolved to think adaptively rather than logically. Hence cognitive distortions are not strictly errors in brain functioning, and it can be useful to inform patients that 'negative thinking' may be dysfunctional but is a reflection of basic brain design and not personal irrationality. The evolved nature of cognitive distortions has been implicit in cognitive therapy from its early days (Beck, 1963; Ellis, 1962) but has not been fully articulated in terms of what is now known about evolved mental processes. Many forms of cognitive distortion can be seen to use the (previously) adaptive heuristic of 'better safe than sorry'. PMID:9875955

  18. Evolving Lorentzian wormholes supported by phantom matter and cosmological constant

    SciTech Connect

    Cataldo, Mauricio; Campo, Sergio del; Minning, Paul; Salgado, Patricio

    2009-01-15

    In this paper we study the possibility of sustaining an evolving wormhole via exotic matter made of phantom energy in the presence of a cosmological constant. We derive analytical evolving wormhole geometries by supposing that the radial tension of the phantom matter, which is the negative of the radial pressure, and the pressure measured in the tangential directions have barotropic equations of state with constant state parameters. In this case the presence of a cosmological constant ensures accelerated expansion of the wormhole configurations. More specifically, for a positive cosmological constant we have wormholes which expand forever, and for a negative cosmological constant we have wormholes which expand to a maximum value and then recollapse. At spatial infinity the energy density and the pressures of the anisotropic phantom matter threading the wormholes vanish; thus these evolving wormholes are asymptotically vacuum Λ-Friedmann models with either open or closed or flat topologies.
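    For context, evolving wormhole geometries of this kind are conventionally written as a scale factor multiplying a static wormhole metric. This is a sketch of the standard ansatz from the wormhole literature, not the paper's specific solutions:

```latex
ds^2 = -dt^2 + a^2(t)\left[\frac{dr^2}{1 - b(r)/r} + r^2\left(d\theta^2 + \sin^2\theta\, d\varphi^2\right)\right]
```

    where a(t) is the scale factor and b(r) the shape function; per the abstract, a positive cosmological constant drives a(t) to expand forever, while a negative one yields expansion to a maximum followed by recollapse.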

  19. Evolving Systems: An Outcome of Fondest Hopes and Wildest Dreams

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Balas, Mark J.

    2012-01-01

    New theory is presented for evolving systems, which are autonomously controlled subsystems that self-assemble into a new evolved system with a higher purpose. Evolving systems of aerospace structures often require additional control when assembling to maintain stability during the entire evolution process. This is the concept of Adaptive Key Component Control that operates through one specific component to maintain stability during the evolution. In addition, this control must often overcome persistent disturbances that occur while the evolution is in progress. Theoretical results will be presented for Adaptive Key Component control for persistent disturbance rejection. An illustrative example will demonstrate the Adaptive Key Component controller on a system composed of rigid body and flexible body modes.

  20. How is cyber threat evolving and what do organisations need to consider?

    PubMed

    Borrett, Martin; Carter, Roger; Wespi, Andreas

    Organisations and members of the public are becoming accustomed to the increasing velocity, frequency and variety of cyber-attacks that they have been facing over the last few years. In response to this challenge, it is important to explore what can be done to offer commercial and private users a reliable and functioning environment. This paper discusses how cyber threats might evolve in the future and seeks to explore these threats more fully. Attention is paid to the changing nature of cyber-attackers and their motivations and what this means for organisations. Finally, useful and actionable steps are provided, which practitioners can use to understand how they can start to address the future challenges of cyber security. PMID:24457327

  1. A nonspecific defensive compound evolves into a competition avoidance cue and a female sex pheromone

    PubMed Central

    Weiss, Ingmar; Rössler, Thomas; Hofferberth, John; Brummer, Michael; Ruther, Joachim; Stökl, Johannes

    2013-01-01

    The evolution of chemical communication and the origin of pheromones are among the most challenging issues in chemical ecology. Current theory predicts that chemical communication can arise from compounds primarily evolved for non-communicative purposes but experimental evidence showing a gradual evolution of non-informative compounds into cues and true signals is scarce. Here we report that females of the parasitic wasp Leptopilina heterotoma use the defensive compound (−)-iridomyrmecin as a semiochemical cue to avoid interference with con- and heterospecific competitors and as the main component of a species-specific sex pheromone. Although competition avoidance is mediated by (−)-iridomyrmecin alone, several structurally related minor compounds are necessary for reliable mate attraction and recognition. Our findings provide insights into the evolution of insect pheromones by demonstrating that the increasing specificity of chemical information is accompanied by an increasing complexity of the chemical messengers involved and the evolution of the chemosensory adaptations for their exploitation. PMID:24231727

  3. A Model for Estimating the Reliability and Validity of Criterion-Referenced Measures.

    ERIC Educational Resources Information Center

    Edmonston, Leon P.; Randall, Robert S.

    A decision model designed to determine the reliability and validity of criterion referenced measures (CRMs) is presented. General procedures which pertain to the model are discussed as to: Measures of relationship, Reliability, Validity (content, criterion-oriented, and construct validation), and Item Analysis. The decision model is presented in…

  4. Differences in Reliability of Reproductive History Recall among Women in North Africa

    ERIC Educational Resources Information Center

    Soliman, Amr; Allen, Katharine; Lo, An-Chi; Banerjee, Mousumi; Hablas, Ahmed; Benider, Abdellatif; Benchekroun, Nadya; Samir, Salwa; Omar, Hoda G.; Merajver, Sofia; Mullan, Patricia

    2009-01-01

    Breast cancer is the most common cancer among women in North Africa. Women in this region have unique reproductive profiles. It is essential to obtain reliable information on reproductive histories to help better understand the relationship between reproductive health and breast cancer. We tested the reliability of a reproductive history-based…

  5. [A score in transference. BIP--experiencing the relationship in psychoanalysis].

    PubMed

    Herold, R

    1998-08-01

    The (emotional) experience of reference in psychoanalyses, presented here, has been developed by the author in continuation of a method evolved by Gill and Hoffman (1982) in Chicago, which they termed "The Patient's Experience of the Relationship with the Therapist", to describe the course of analytical work on patient resistance against transference. The "Experience of reference in psychoanalyses" is a rating method that links, by means of its two manuals, quantitative methods (which are empirical in the conventional sense) with qualitative, clinical-hermeneutical research approaches. After reliable coding of categorial data from a tape-recording transcription, a condensed description of the course and a clinical comment are obtained, the graphic representation of which reads like a score of transference. PMID:9745320

  6. Seyfert's Sextet (HCG 79): An Evolved Stephan's Quintet?

    NASA Astrophysics Data System (ADS)

    Durbala, A.; Sulentic, J.; Rosado, M.; Del Olmo, A.; Perea, J.; Plana, H.

    Scanning Fabry-Perot interferometers MOS/SIS (3.6m CFHT) + PUMA (2.1m OAN-SPM, México) and the long-slit spectrograph ALFOSC (2.5m NOT, La Palma) were used to measure the kinematics of gas and stars in Seyfert's Sextet (HCG 79). We interpret it as a highly evolved group that formed through the sequential acquisition of mostly late-type galaxies that are now slowly coalescing and undergoing strong secular evolution. We find evidence for possible feedback, as revealed by accretion and minor-merger events in two of the most evolved members.

  7. Active Printed Materials for Complex Self-Evolving Deformations

    PubMed Central

    Raviv, Dan; Zhao, Wei; McKnelly, Carrie; Papadopoulou, Athina; Kadambi, Achuta; Shi, Boxin; Hirsch, Shai; Dikovsky, Daniel; Zyracki, Michael; Olguin, Carlos; Raskar, Ramesh; Tibbits, Skylar

    2014-01-01

    We propose a new design of complex self-evolving structures that vary over time due to environmental interaction. In conventional 3D printing systems, materials are meant to be stable rather than active and fabricated models are designed and printed as static objects. Here, we introduce a novel approach for simulating and fabricating self-evolving structures that transform into a predetermined shape, changing property and function after fabrication. The new locally coordinated bending primitives combine into a single system, allowing for a global deformation which can stretch, fold and bend given environmental stimulus. PMID:25522053

  8. Active printed materials for complex self-evolving deformations.

    PubMed

    Raviv, Dan; Zhao, Wei; McKnelly, Carrie; Papadopoulou, Athina; Kadambi, Achuta; Shi, Boxin; Hirsch, Shai; Dikovsky, Daniel; Zyracki, Michael; Olguin, Carlos; Raskar, Ramesh; Tibbits, Skylar

    2014-01-01

    We propose a new design of complex self-evolving structures that vary over time due to environmental interaction. In conventional 3D printing systems, materials are meant to be stable rather than active and fabricated models are designed and printed as static objects. Here, we introduce a novel approach for simulating and fabricating self-evolving structures that transform into a predetermined shape, changing property and function after fabrication. The new locally coordinated bending primitives combine into a single system, allowing for a global deformation which can stretch, fold and bend given environmental stimulus. PMID:25522053

  9. An Approach to Generating Program Code in Quickly Evolving Environments

    NASA Astrophysics Data System (ADS)

    Ablonskis, Linas

    In model-driven engineering (MDE), program code generators are used to generate program code from abstract program models, bringing the final code closer to the program specification and saving time that would otherwise be spent coding. The current approach to program code generation from abstract program models does not work well in quickly evolving environments because of the large amount of work required to fully prepare and maintain a program code generator. This chapter presents an analysis of the current approach to program code generation and offers an alternative, tailored for generating program code in quickly evolving environments, that is based on a self-configuring program code generator.

  10. Reconstruction of evolved dynamic networks from degree correlations

    NASA Astrophysics Data System (ADS)

    Karalus, Steffen; Krug, Joachim

    2016-06-01

    We study the importance of local structural properties in networks which have been evolved for a power-law scaling in their Laplacian spectrum. To this end, the degree distribution, two-point degree correlations, and degree-dependent clustering are extracted from the evolved networks and used to construct random networks with the prescribed distributions. In the analysis of these reconstructed networks it turns out that the degree distribution alone is not sufficient to generate the spectral scaling and the degree-dependent clustering has only an indirect influence. The two-point correlations are found to be the dominant characteristic for the power-law scaling over a broader eigenvalue range.
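    The reconstruction step the abstract describes, building random networks with a prescribed degree distribution, is commonly done with a configuration model. Below is a minimal sketch of stub matching; the degree sequence is invented for illustration and is not from the paper.

```python
import random

# Configuration model: pair random edge "stubs" so every node ends up
# with its prescribed degree (self-loops/multi-edges may occur, as usual).
def configuration_model(degrees, seed=0):
    rng = random.Random(seed)
    stubs = [node for node, d in enumerate(degrees) for _ in range(d)]
    rng.shuffle(stubs)
    return [(stubs[i], stubs[i + 1]) for i in range(0, len(stubs) - 1, 2)]

degrees = [3, 3, 2, 2, 1, 1]           # illustrative sequence (even sum)
edges = configuration_model(degrees)

# Every node's realized degree matches the prescription.
realized = [0] * len(degrees)
for u, v in edges:
    realized[u] += 1
    realized[v] += 1
print(realized == degrees)
```

    Matching two-point degree correlations (the dominant characteristic found in the paper) would additionally require degree-conditioned rewiring, which this sketch omits.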

  11. Transport of nuclear-encoded proteins into secondarily evolved plastids.

    PubMed

    Hempel, Franziska; Bozarth, Andrew; Sommer, Maik S; Zauner, Stefan; Przyborski, Jude M; Maier, Uwe-G

    2007-09-01

    Many algal groups evolved by engulfment and intracellular reduction of a eukaryotic phototroph within a heterotrophic cell. Via this process, so-called secondary plastids evolved, surrounded by three or four membranes. In these organisms most of the genetic material encoding plastid functions is localized in the cell nucleus, with the result that many proteins have to pass three, four, or even five membranes to reach their final destination within the plastid. In this article, we review recent models and findings that help to explain important cellular mechanisms involved in the complex process of protein transport into secondary plastids. PMID:17696773

  12. Reliability Evaluation of Passive Systems Through Functional Reliability Assessment

    SciTech Connect

    Burgazzi, Luciano

    2003-11-15

    A methodology to quantify the reliability of passive safety systems, proposed for use in advanced reactor design, is developed. Passive systems are identified as systems that do not need any external input or energy to operate and rely only upon natural physical laws (e.g., gravity, natural circulation, heat conduction, internally stored energy) and/or intelligent use of the energy inherently available in the system (e.g., chemical reaction, decay heat). The reliability of a passive system refers to the ability of the system to carry out the required function under the prevailing conditions when required: the passive system may fail its mission, in addition to the classical mechanical failure of its components, through deviation from the expected behavior due to physical phenomena or to different boundary and initial conditions. The present research activity is aimed at the reliability estimation of passive B systems (i.e., those employing moving working fluids, per the IAEA classification); the selected system is a loop operating in natural circulation, including a heat source and a heat sink. The functional reliability concept, defined as the probability of performing the required mission, is introduced, and the R-S (Resistance-Stress) model taken from fracture mechanics is adopted. R and S are coined as expressions of the functional Requirement and the system State. The water mass flow circulating through the system is taken as the parameter defining the passive system's performance, and probability distribution functions (pdfs) are assigned to both the R and S quantities; the mission of the passive system thus defines which parameter values are considered a failure, by comparing the corresponding pdfs according to a defined safety criterion. The methodology, its application, and the results of the analysis are presented and discussed.
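    The R-S comparison of pdfs can be sketched numerically. The following is a minimal illustration, not the paper's method: it assumes normal distributions for the requirement R and the state S (the mass-flow values are invented), computes the analytic failure probability P(S < R), and cross-checks it by Monte Carlo sampling.

```python
import numpy as np
from math import erf, sqrt

# Hypothetical mass-flow parameters (kg/s); illustrative only.
mu_S, sigma_S = 12.0, 1.5   # state: delivered natural-circulation flow
mu_R, sigma_R = 8.0, 1.0    # requirement: flow needed for the mission

# Analytic failure probability for normal R and S:
# failure when S < R, i.e. the margin M = S - R is negative.
mu_M = mu_S - mu_R
sigma_M = sqrt(sigma_S**2 + sigma_R**2)
phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF
p_fail_analytic = phi(-mu_M / sigma_M)

# Monte Carlo cross-check by direct sampling of both pdfs.
rng = np.random.default_rng(0)
S = rng.normal(mu_S, sigma_S, 1_000_000)
R = rng.normal(mu_R, sigma_R, 1_000_000)
p_fail_mc = float(np.mean(S < R))

print(p_fail_analytic, p_fail_mc)
```

    The same comparison works for any assigned pdfs once sampling replaces the closed form.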

  13. Highly reliable multisensor array (MSA) smart transducers

    NASA Astrophysics Data System (ADS)

    Perotti, José; Lucena, Angel; Mackey, Paul; Mata, Carlos; Immer, Christopher

    2006-05-01

    Many developments in the field of multisensor array (MSA) transducers have taken place in the last few years. Advancements in fabrication technology, such as Micro-Electro-Mechanical Systems (MEMS) and nanotechnology, have made implementation of MSA devices a reality. NASA Kennedy Space Center (KSC) has been developing this type of technology because of the increases in safety, reliability, and performance and the reduction in operational and maintenance costs that can be achieved with these devices. To demonstrate the MSA technology benefits, KSC quantified the relationship between the number of sensors (N) and the associated improvement in sensor life and reliability. A software algorithm was developed to monitor and assess the health of each element and the overall MSA. Furthermore, the software algorithm implemented criteria on how these elements would contribute to the MSA-calculated output to ensure required performance. The hypothesis was that a greater number of statistically independent sensor elements would provide a measurable increase in measurement reliability. A computer simulation was created to answer this question. An array of N sensors underwent random failures in the simulation, and a life extension factor (LEF, expressed as a percentage of the life of a single sensor) was calculated by the program. When LEF was plotted as a function of N, a quasi-exponential behavior was detected, with marginal improvement above N = 30. The hypothesis and follow-on simulation results were then corroborated experimentally. An array composed of eight independent pressure sensors was fabricated. To accelerate sensor life cycling and failure and to simulate degradation over time, the MSA was exposed to an environmental temperature of 125°C. Every 24 hours, the experiment's environmental temperature was returned to ambient temperature (27°C), and the outputs of all the MSA sensor elements were measured. Once per week, the MSA calibration was verified at five different
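    A life-extension simulation of this kind can be sketched in a few lines. The version below is an assumed simplification, not KSC's algorithm: sensor lifetimes are exponential, and the array is taken to survive until its last element fails, which reproduces the diminishing returns at larger N.

```python
import random
import statistics

# LEF = mean array life / mean single-sensor life, estimated by simulation.
def life_extension_factor(n_sensors, trials=20_000, seed=1):
    rng = random.Random(seed)
    single_mean = 1.0  # normalized single-sensor mean life
    lifetimes = [max(rng.expovariate(1.0 / single_mean)
                     for _ in range(n_sensors))
                 for _ in range(trials)]
    return statistics.mean(lifetimes) / single_mean

for n in (1, 8, 30):
    print(n, round(life_extension_factor(n), 2))
```

    Under these assumptions the LEF grows like the harmonic number of N, so most of the gain is realized by a few tens of sensors.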

  14. JMP Applications in Photovoltaic Reliability (Presentation)

    SciTech Connect

    Jordan, D.; Gotwalt, C.

    2011-09-01

    The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted into power and how this conversion degrades over time. Accurate knowledge of the power decline over time, also known as the degradation rate, is essential and important to all stakeholders: utility companies, integrators, investors, and scientists alike. Outdoor testing plays a vital part in quantifying degradation rates of different technologies in various climates. Because of seasonal changes, however, several complete cycles (typically 3-5 years) have traditionally been needed to obtain reasonably accurate degradation rates. In a rapidly evolving industry such a time span is often unacceptable, and the need exists to determine degradation rates more accurately in a shorter period of time. Advanced time series modeling, such as ARIMA (Autoregressive Integrated Moving Average) modeling, can be utilized to decrease the required time span and is compared with some non-linear modeling. In addition, it is demonstrated how the JMP 9 map feature was used to reveal important technological trends by climate.
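    The core problem, seasonality biasing trend estimates from short records, can be shown without JMP or a full ARIMA fit. The sketch below uses synthetic data (the -0.8 %/year rate and seasonal amplitude are invented) and compares a naive linear fit on a two-year record with a fit that also models the annual cycle.

```python
import numpy as np

rng = np.random.default_rng(42)
months = np.arange(24)                 # only a two-year record
true_rate = -0.8 / 12 / 100            # -0.8 %/year, as fraction per month
perf = (1.0 + true_rate * months
        + 0.02 * np.sin(2 * np.pi * months / 12)   # annual cycle
        + rng.normal(0, 0.002, months.size))       # measurement noise

def ols_slope(x, y):
    A = np.vstack([x, np.ones_like(x)]).T
    return np.linalg.lstsq(A, y, rcond=None)[0][0]

# Naive trend fit on the raw series (biased by the seasonal term).
naive = ols_slope(months, perf)

# Joint fit of trend + annual sin/cos terms removes the seasonal bias.
X = np.column_stack([np.sin(2 * np.pi * months / 12),
                     np.cos(2 * np.pi * months / 12),
                     months, np.ones(months.size)])
deseasoned_rate = np.linalg.lstsq(X, perf, rcond=None)[0][2]

print(naive * 1200, deseasoned_rate * 1200)  # both in %/year
```

    An ARIMA model generalizes this idea by also modeling the autocorrelated residual, which further tightens the rate estimate on short records.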

  15. Significant lexical relationships

    SciTech Connect

    Pedersen, T.; Kayaalp, M.; Bruce, R.

    1996-12-31

    Statistical NLP inevitably deals with a large number of rare events. As a consequence, NLP data often violates the assumptions implicit in traditional statistical procedures such as significance testing. We describe a significance test, an exact conditional test, that is appropriate for NLP data and can be performed using freely available software. We apply this test to the study of lexical relationships and demonstrate that the results obtained using this test are both theoretically more reliable and different from the results obtained using previously applied tests.
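    An exact conditional test of the kind described (Fisher's exact test on a 2x2 co-occurrence table, with both margins fixed) can be sketched with the hypergeometric distribution. The bigram counts below are invented for illustration.

```python
from math import comb

# Two-sided Fisher's exact test for the table [[a, b], [c, d]]:
# sum the hypergeometric probabilities of all tables (same margins)
# that are no more probable than the observed one.
def fisher_exact_two_sided(a, b, c, d):
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d
    def p_table(x):
        return comb(r1, x) * comb(r2, c1 - x) / comb(n, c1)
    p_obs = p_table(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Hypothetical bigram counts: "strong tea" together 8 times, "strong"
# alone 42, "tea" alone 15, neither word 10,000 times.
p = fisher_exact_two_sided(8, 42, 15, 10_000)
print(p < 0.001)
```

    Because the computation is exact, it stays valid for the rare events that make asymptotic tests (e.g., chi-square) unreliable on NLP data.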

  16. 77 FR 26686 - Transmission Planning Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-07

    ... Standard TPL-002-0b, submitted by the North American Electric Reliability Corporation (NERC), the ... Reliability Standard TPL-002-0b, submitted by the North American Electric Reliability Corporation (NERC), the ... Reliability Standards, Notice of Proposed Rulemaking, 76 FR 66229 (Oct. 20, 2011), FERC Stats. & Regs. ...

  17. Towards Evolving Electronic Circuits for Autonomous Space Applications

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Haith, Gary L.; Colombano, Silvano P.; Stassinopoulos, Dimitris

    2000-01-01

    The relatively new field of Evolvable Hardware studies how simulated evolution can reconfigure, adapt, and design hardware structures in an automated manner. Space applications, especially those requiring autonomy, are potential beneficiaries of evolvable hardware. For example, robotic drilling from a mobile platform requires high-bandwidth controller circuits that are difficult to design. In this paper, we present automated design techniques based on evolutionary search that could potentially be used in such applications. First, we present a method of automatically generating analog circuit designs using evolutionary search and a circuit construction language. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. Using a parallel genetic algorithm, we present experimental results for five design tasks. Second, we investigate the use of coevolution in automated circuit design. We examine fitness evaluation by comparing the effectiveness of four fitness schedules. The results indicate that solution quality is highest with static and co-evolving fitness schedules as compared to the other two dynamic schedules. We discuss these results and offer two possible explanations for the observed behavior: retention of useful information, and alignment of problem difficulty with circuit proficiency.

  18. Research at the Crossroads: How Intellectual Initiatives across Disciplines Evolve

    ERIC Educational Resources Information Center

    Frost, Susan H.; Jean, Paul M.; Teodorescu, Daniel; Brown, Amy B.

    2004-01-01

    How do intellectual initiatives across disciplines evolve? This qualitative case study of 11 interdisciplinary research initiatives at Emory University identifies key factors in their development: the passionate commitments of scholarly leaders, the presence of strong collegial networks, access to timely and multiple resources, flexible practices,…

  19. The Evolving Significance of Race: Living, Learning, and Teaching

    ERIC Educational Resources Information Center

    Hughes, Sherick A., Ed.; Berry, Theodorea Regina, Ed.

    2012-01-01

    Individuals are living, learning, and teaching by questioning how to address race in a society that consistently prefers to see itself as colorblind, a society claiming to seek a "post-racial" existence. This edited volume offers evidence of the evolving significance of race from a diverse group of male and female contributors self-identifying as…

  20. Hip Hop Is Now: An Evolving Youth Culture

    ERIC Educational Resources Information Center

    Taylor, Carl; Taylor, Virgil

    2007-01-01

    Emerging from Rap music, Hip Hop has become a lifestyle to many modern youth around the world. Embodying both creativity and controversy, Hip Hop mirrors the values, violence, and hypocrisy of modern culture. The authors dispel some of the simplistic views that surround this evolving youth movement embraced by millions of young people who are…

  1. Evolving Strategies for Cancer and Autoimmunity: Back to the Future

    PubMed Central

    Lane, Peter J. L.; McConnell, Fiona M.; Anderson, Graham; Nawaf, Maher G.; Gaspal, Fabrina M.; Withers, David R.

    2014-01-01

    Although current thinking has focused on genetic variation between individuals and environmental influences as underpinning susceptibility to both autoimmunity and cancer, an alternative view is that human susceptibility to these diseases is a consequence of the way the immune system evolved. It is important to remember that the immunological genes that we inherit and the systems that they control were shaped by the drive for reproductive success rather than for individual survival. It is our view that human susceptibility to autoimmunity and cancer is the evolutionarily acceptable side effect of the immune adaptations that evolved in early placental mammals to accommodate a fundamental change in reproductive strategy. Studies of immune function in mammals show that high affinity antibodies and CD4 memory, along with its regulation, co-evolved with placentation. By dissection of the immunologically active genes and proteins that evolved to regulate this step change in the mammalian immune system, clues have emerged that may reveal ways of de-tuning both effector and regulatory arms of the immune system to abrogate autoimmune responses whilst preserving protection against infection. Paradoxically, it appears that such a detuned and deregulated immune system is much better equipped to mount anti-tumor immune responses against cancers. PMID:24782861

  2. Coevolution Drives the Emergence of Complex Traits and Promotes Evolvability

    PubMed Central

    Zaman, Luis; Meyer, Justin R.; Devangam, Suhas; Bryson, David M.; Lenski, Richard E.; Ofria, Charles

    2014-01-01

    The evolution of complex organismal traits is obvious as a historical fact, but the underlying causes—including the role of natural selection—are contested. Gould argued that a random walk from a necessarily simple beginning would produce the appearance of increasing complexity over time. Others contend that selection, including coevolutionary arms races, can systematically push organisms toward more complex traits. Methodological challenges have largely precluded experimental tests of these hypotheses. Using the Avida platform for digital evolution, we show that coevolution of hosts and parasites greatly increases organismal complexity relative to that otherwise achieved. As parasites evolve to counter the rise of resistant hosts, parasite populations retain a genetic record of past coevolutionary states. As a consequence, hosts differentially escape by performing progressively more complex functions. We show that coevolution's unique feedback between host and parasite frequencies is a key process in the evolution of complexity. Strikingly, the hosts evolve genomes that are also more phenotypically evolvable, similar to the phenomenon of contingency loci observed in bacterial pathogens. Because coevolution is ubiquitous in nature, our results support a general model whereby antagonistic interactions and natural selection together favor both increased complexity and evolvability. PMID:25514332

  3. An Evolved Wavelet Library Based on Genetic Algorithm

    PubMed Central

    Vaithiyanathan, D.; Seshasayanan, R.; Kunaraj, K.; Keerthiga, J.

    2014-01-01

    As the size of the images being captured increases, there is a need for a robust image compression algorithm that satisfies the bandwidth limitations of transmission channels and preserves the image resolution without considerable loss in image quality. Many conventional image compression algorithms use the wavelet transform, which can significantly reduce the number of bits needed to represent a pixel, and the process of quantization and thresholding further increases the compression. In this paper the authors evolve two sets of wavelet filter coefficients using a genetic algorithm (GA), one for the whole image except the edge areas and the other for the portions near the edges in the image (i.e., global and local filters). Images are initially separated into several groups based on their frequency content, edges, and textures, and the wavelet filter coefficients are evolved separately for each group. As there is a possibility of the GA settling in a local maximum, we introduce a new shuffling operator to prevent this effect. The GA used to evolve filter coefficients primarily focuses on maximizing the peak signal-to-noise ratio (PSNR). The filter coefficients evolved by the proposed method outperform the existing methods by a 0.31 dB improvement in the average PSNR and a 0.39 dB improvement in the maximum PSNR. PMID:25405225
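    The fitness measure the GA maximizes, PSNR, is straightforward to compute. A minimal sketch on synthetic image data (sizes and noise level are illustrative):

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    diff = original.astype(float) - reconstructed.astype(float)
    mse = np.mean(diff ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak**2 / mse)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64))                       # stand-in image
noisy = np.clip(img + rng.normal(0, 4, img.shape), 0, 255)  # "reconstruction"
print(round(psnr(img, noisy), 1))
```

    In an evolved-wavelet setting, `reconstructed` would be the image after forward transform, quantization, and inverse transform with a candidate filter set, so higher PSNR directly rewards better filters.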

  4. The Evolving Understanding of the Construct of Intellectual Disability

    ERIC Educational Resources Information Center

    Schalock, Robert L.

    2011-01-01

    This article addresses two major areas concerned with the evolving understanding of the construct of intellectual disability. The first part of the article discusses current answers to five critical questions that have revolved around the general question, "What is Intellectual Disability?" These five are what to call the phenomenon, how to…

  5. Tensions inherent in the evolving role of the infection preventionist

    PubMed Central

    Conway, Laurie J.; Raveis, Victoria H.; Pogorzelska-Maziarz, Monika; Uchida, May; Stone, Patricia W.; Larson, Elaine L.

    2014-01-01

    Background The role of infection preventionists (IPs) is expanding in response to demands for quality and transparency in health care. Practice analyses and survey research have demonstrated that IPs spend a majority of their time on surveillance and are increasingly responsible for prevention activities and management; however, deeper qualitative aspects of the IP role have rarely been explored. Methods We conducted a qualitative content analysis of in-depth interviews with 19 IPs at hospitals throughout the United States to describe the current IP role, specifically the ways that IPs effect improvements and the facilitators and barriers they face. Results The narratives document that the IP role is evolving in response to recent changes in the health care landscape and reveal that this progression is associated with friction and uncertainty. Tensions inherent in the evolving role of the IP emerged from the interviews as 4 broad themes: (1) expanding responsibilities outstrip resources, (2) shifting role boundaries create uncertainty, (3) evolving mechanisms of influence involve trade-offs, and (4) the stress of constant change is compounded by chronic recurring challenges. Conclusion Advances in implementation science, data standardization, and training in leadership skills are needed to support IPs in their evolving role. PMID:23880116

  6. The Evolving Theme of Teaching Multicultural Art Education. Monograph Series.

    ERIC Educational Resources Information Center

    La Pierre, Sharon Greenleaf, Ed.; Ballengee-Morris, Christine, Ed.

    This publication, sponsored by the U.S. Society of Education through Art (USSEA) as a forum of past presidents involving audience participation, aims to stimulate dialogue on the evolving theme of teaching multicultural issues and what affects student learning. Session participants were past presidents of the USSEA who prepared written statements…

  7. Evolving Nature of Sexual Orientation and Gender Identity

    ERIC Educational Resources Information Center

    Jourian, T. J.

    2015-01-01

    This chapter discusses the historical and evolving terminology, constructs, and ideologies that inform the language used by those who are lesbian, gay, bisexual, and same-gender loving, who may identify as queer, as well as those who are members of trans* communities from multiple and intersectional perspectives.

  8. A View from Above: The Evolving Sociological Landscape

    ERIC Educational Resources Information Center

    Moody, James; Light, Ryan

    2006-01-01

    How has sociology evolved over the last 40 years? In this paper, we examine networks built on thousands of sociology-relevant papers to map sociology's position in the wider social sciences and identify changes in the most prominent research fronts in the discipline. We find first that sociology seems to have traded centrality in the field of…

  9. Two New Evolved Gabbroic Samples from Apollo 16

    NASA Technical Reports Server (NTRS)

    Zeigler, R. A.; Korotev, R. L.; Jolliff, B. L.; Haskin, L. A.

    2002-01-01

    We have found petrographic and geochemical data for two evolved monomict mafic rocks collected at the Apollo 16 site. While they somewhat resemble sodic ferrogabbro, they may be fragments of the Th-rich plutonic rocks thought to underlie the PKT. Additional information is contained in the original extended abstract.

  10. Do Infants Possess an Evolved Spider-Detection Mechanism?

    ERIC Educational Resources Information Center

    Rakison, David H.; Derringer, Jaime

    2008-01-01

    Previous studies with various non-human animals have revealed that they possess an evolved predator recognition mechanism that specifies the appearance of recurring threats. We used the preferential looking and habituation paradigms in three experiments to investigate whether 5-month-old human infants have a perceptual template for spiders that…

  11. Today's control systems evolved from early pioneers' dreams

    SciTech Connect

    Smith, D.J.

    1996-04-01

    In the last 100 years, power plant controls have evolved from manual operation and simple instruments to automatic state-of-the-art computerized control systems using smart instruments. This article traces the evolution of controls. The topics of the article include early control systems, developments in the early 20th century, Bailey controls, and developments in the late 20th century.

  12. A Conceptual Framework for Evolving, Recommender Online Learning Systems

    ERIC Educational Resources Information Center

    Peiris, K. Dharini Amitha; Gallupe, R. Brent

    2012-01-01

    A comprehensive conceptual framework is developed and described for evolving recommender-driven online learning systems (ROLS). This framework describes how such systems can support students, course authors, course instructors, systems administrators, and policy makers in developing and using these ROLS. The design science information systems…

  13. A review of evolving critical priorities for irrigated agriculture

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The evolving roles and critical priorities of irrigated agriculture, as perceived by practitioners, researchers, and policy makers, were reviewed. Irrigated agriculture has played a vital role in meeting food and fiber demands on a relatively small proportion of total arable land. This role is prese...

  14. Strategic Planning for Policy Development--An Evolving Model.

    ERIC Educational Resources Information Center

    Verstegen, Deborah A.; Wagoner, Jennings L., Jr.

    1989-01-01

    Strategic planning, a necessary alternative to logical incrementalism in turbulent environments, will let educators move from a reactive to a proactive posture. This article briefly reviews strategic planning literature, focuses on environmental scanning, and describes an evolving model developed for the chief state school officers of a four-state…

  15. A New Framework for Dynamically Evolving Database Environments.

    ERIC Educational Resources Information Center

    Yannakoudakis, E. J.; Tsionos, C. X.; Kapetis, C. A.

    1999-01-01

    Describes research aimed at investigating dynamically evolving database environments and corresponding schemata that allow storage and manipulation of variable length data, a variable number of fields per record, variable length records and fields, and dynamically defined objects. Proposes a new framework for dynamic database environments.…

  16. The Evolving Military Learner Population: A Review of the Literature

    ERIC Educational Resources Information Center

    Ford, Kate; Vignare, Karen

    2015-01-01

    This literature review examines the evolving online military learner population with emphasis on current generation military learners, who are most frequently Post-9/11 veterans. The review synthesizes recent scholarly and grey literature on military learner demographics and attributes, college experiences, and academic outcomes against a backdrop…

  17. Genetic redundancy in evolving populations of simulated robots.

    PubMed

    Miglino, Orazio; Walker, Richard

    2002-01-01

    A number of authors have argued that redundancy in biological organisms contributes to their evolvability. We investigate this hypothesis via the experimental manipulation of genetic redundancy in evolving populations of simulated robots controlled by artificial neural networks. A genetic algorithm is used to simulate the evolution of robots with the ability to perform a previously studied task. Redundancy is measured using systematic lesioning. In our experiments, populations of robots with larger genotypes achieve systematically higher fitness than populations whose genotypes are smaller. It is shown that, in principle, robots with smaller genotypes have enough computational power to achieve optimal fitness. Populations with larger (redundant) genotypes appear, however, to be more evolvable and display significantly higher diversity. It is argued that this enhanced evolvability is a direct effect of genetic redundancy, which allows populations of redundant robots to explore neutral networks spanning large areas of genotype space. We conjecture that, where cost considerations allow, redundancy in functional or potentially functional components of the genome may make a valuable contribution to evolution in artificial and perhaps in biological systems. The methods described in the article provide a practical way of testing this hypothesis for the artificial case. PMID:12537686
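    The systematic-lesioning measurement can be sketched on a toy network. This is an assumed illustration, not the authors' setup: half the hidden units are made inert to stand in for redundant genotype material, and each unit is silenced in turn to see whether the output changes.

```python
import numpy as np

rng = np.random.default_rng(3)
W1 = rng.normal(size=(8, 4))   # input -> hidden weights (8 hidden units)
W1[4:] = 0.0                   # make half the units inert ("redundant")
W2 = rng.normal(size=(1, 8))   # hidden -> output weights
x = rng.normal(size=(4, 10))   # a batch of test inputs

def output(w1, w2):
    return w2 @ np.tanh(w1 @ x)

baseline = output(W1, W2)
redundant = 0
for unit in range(W1.shape[0]):
    lesioned = W1.copy()
    lesioned[unit] = 0.0       # lesion: silence one hidden unit
    if np.allclose(output(lesioned, W2), baseline):
        redundant += 1
print(redundant)
```

    The count of no-effect lesions is the redundancy measure; in an evolving population, comparing it across genotype sizes mirrors the paper's analysis.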

  18. Optimists' Creed: Brave New Cyberlearning, Evolving Utopias (Circa 2041)

    ERIC Educational Resources Information Center

    Burleson, Winslow; Lewis, Armanda

    2016-01-01

    This essay imagines the role that artificial intelligence innovations play in the integrated living, learning and research environments of 2041. Here, in 2041, in the context of increasingly complex wicked challenges, whose solutions by their very nature continue to evade even the most capable experts, society and technology have co-evolved to…

  19. The Evolving Status of Photojournalism Education. ERIC Digest.

    ERIC Educational Resources Information Center

    Cookman, Claude

    Noting that new technologies are resulting in extensive changes in the field of photojournalism, both as it is practiced and taught, this Digest reviews this rapidly evolving field of education and professional practice. It discusses what digital photography is; the history of digital photography; how digital photography has changed…

  20. The Evolving Role of the Head of Department

    ERIC Educational Resources Information Center

    Kerry, Trevor

    2005-01-01

    This paper examines three concepts relating to the role of heads of department (HoDs) in secondary schools: boundary management; the roles of subject leadership and departmental functioning as HoD activities; and the place of HoDs in evolving school hierarchies. To throw light on the last an empirical study is reported that explores hierarchies in…

  1. Gearbox Reliability Collaborative Bearing Calibration

    SciTech Connect

    van Dam, J.

    2011-10-01

    NREL has initiated the Gearbox Reliability Collaborative (GRC) to investigate the root causes of low wind turbine gearbox reliability. The GRC follows a multi-pronged approach based on a collaborative of manufacturers, owners, researchers, and consultants. The project combines analysis, field testing, dynamometer testing, condition monitoring, and the development and population of a gearbox failure database. At the core of the project are two 750 kW gearboxes that have been redesigned and rebuilt so that they are representative of the multi-megawatt gearbox topology currently used in the industry. These gearboxes are heavily instrumented and are tested in the field and on the dynamometer. This report discusses the bearing calibrations of the gearboxes.

  2. On-orbit spacecraft reliability

    NASA Technical Reports Server (NTRS)

    Bloomquist, C.; Demars, D.; Graham, W.; Henmi, P.

    1978-01-01

    Operational and historic data for 350 spacecraft from 52 U.S. space programs were analyzed for on-orbit reliability. Failure rate estimates are made for on-orbit operation of spacecraft subsystems, components, and piece parts, as well as estimates of failure probability for the same elements during launch. Confidence intervals for both parameters are also given. The results indicate that: (1) the success of spacecraft operation is only slightly affected by most reported incidents of anomalous behavior; (2) the occurrence of the majority of anomalous incidents could have been prevented prior to launch; (3) no detrimental effect of spacecraft dormancy is evident; (4) cycled components in general are not demonstrably less reliable than uncycled components; and (5) application of product assurance elements is conducive to spacecraft success.

  3. Three approaches to reliability analysis

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.

    1989-01-01

    It is noted that current reliability analysis tools differ not only in their solution techniques, but also in their approach to model abstraction. The analyst must be satisfied with the constraints that are intrinsic to any combination of solution technique and model abstraction. To get a better idea of the nature of these constraints, three reliability analysis tools (HARP, ASSIST/SURE, and CAME) were used to model portions of the Integrated Airframe/Propulsion Control System architecture. When presented with the example problem, all three tools failed to produce correct results. In all cases, either the tool or the model had to be modified. It is suggested that most of the difficulty is rooted in the large model size and long computational times which are characteristic of Markov model solutions.

  4. Assessment of NDE Reliability Data

    NASA Technical Reports Server (NTRS)

    Yee, B. G. W.; Chang, F. H.; Covchman, J. C.; Lemon, G. H.; Packman, P. F.

    1976-01-01

    Twenty sets of relevant Nondestructive Evaluation (NDE) reliability data have been identified, collected, compiled, and categorized. A criterion for the selection of data for statistical analysis considerations has been formulated. A model to grade the quality and validity of the data sets has been developed. Data input formats, which record the pertinent parameters of the defect/specimen and inspection procedures, have been formulated for each NDE method. A comprehensive computer program has been written to calculate the probability of flaw detection at several confidence levels by the binomial distribution. This program also selects the desired data sets for pooling and tests the statistical pooling criteria before calculating the composite detection reliability. Probability of detection curves at 95 and 50 percent confidence levels have been plotted for individual sets of relevant data as well as for several sets of merged data with common sets of NDE parameters.
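    The binomial probability-of-detection calculation described above can be sketched roughly like this: a one-sided Clopper-Pearson lower confidence bound on POD from hit/miss inspection data. The function name and data are illustrative, not from the report.

```python
from math import comb

def pod_lower_bound(detections, trials, confidence=0.95):
    """Clopper-Pearson one-sided lower confidence bound on the
    probability of detection (POD), given `detections` hits in
    `trials` inspections of flawed specimens."""
    alpha = 1.0 - confidence

    def upper_tail(p):
        # P(X >= detections | trials, p); increasing in p.
        return sum(comb(trials, k) * p ** k * (1.0 - p) ** (trials - k)
                   for k in range(detections, trials + 1))

    # Bisect for the largest p whose upper tail is still below alpha.
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if upper_tail(mid) < alpha:
            lo = mid
        else:
            hi = mid
    return lo
```

With 29 detections in 29 trials this reproduces the familiar "90% POD at 95% confidence" demonstration point.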

  5. What makes a family reliable?

    NASA Technical Reports Server (NTRS)

    Williams, James G.

    1992-01-01

    Asteroid families are clusters of asteroids in proper element space which are thought to be fragments from former collisions. Studies of families promise to improve understanding of large collision events, and a large event can open up the interior of a former parent body to view. While a variety of searches for families have found the same heavily populated families, and some searches have found the same families of lower population, there is much apparent disagreement between the lower-population families proposed by different investigations. Indicators of reliability, factors compromising reliability, an illustration of the influence of different data samples, and a discussion of how several investigations perceived families in the same region of proper element space are given.

  6. Reliability Research for Photovoltaic Modules

    NASA Technical Reports Server (NTRS)

    Ross, Ronald J., Jr.

    1986-01-01

    Report describes research approach used to improve reliability of photovoltaic modules. Aimed at raising useful module lifetime to 20 to 30 years. Development of cost-effective solutions to module-lifetime problem requires compromises between degradation rates, failure rates, and lifetimes, on one hand, and costs of initial manufacture, maintenance, and lost energy, on other hand. Life-cycle costing integrates disparate economic terms, allowing cost effectiveness to be quantified, allowing comparison of different design alternatives.

  7. Defining Requirements for Improved Photovoltaic System Reliability

    SciTech Connect

    Maish, A.B.

    1998-12-21

    Reliable systems are an essential ingredient of any technology progressing toward commercial maturity and large-scale deployment. This paper defines reliability as meeting system functional requirements, and then develops a framework to understand and quantify photovoltaic system reliability based on initial and ongoing costs and system value. The core elements necessary to achieve reliable PV systems are reviewed. These include appropriate system design, satisfactory component reliability, and proper installation and servicing. Reliability status, key issues, and present needs in system reliability are summarized for four application sectors.

  8. Mechanical system reliability for long life space systems

    NASA Technical Reports Server (NTRS)

    Kowal, Michael T.

    1994-01-01

    The creation of a compendium of mechanical limit states was undertaken in order to provide a reference base for the application of first-order reliability methods to mechanical systems in the context of the development of a system level design methodology. The compendium was conceived as a reference source specific to the problem of developing the noted design methodology, and not an exhaustive or exclusive compilation of mechanical limit states. The compendium is not intended to be a handbook of mechanical limit states for general use. The compendium provides a diverse set of limit-state relationships for use in demonstrating the application of probabilistic reliability methods to mechanical systems. The compendium is to be used in the reliability analysis of moderately complex mechanical systems.

  9. Methods to Improve Reliability of Video Recorded Behavioral Data

    PubMed Central

    Haidet, Kim Kopenhaver; Tate, Judith; Divirgilio-Thomas, Dana; Kolanowski, Ann; Happ, Mary Beth

    2009-01-01

    Behavioral observation is a fundamental component of nursing practice and a primary source of clinical research data. The use of video technology in behavioral research offers important advantages to nurse scientists in assessing complex behaviors and relationships between behaviors. The appeal of using this method should be balanced, however, by an informed approach to reliability issues. In this paper, we focus on factors that influence reliability, such as the use of sensitizing sessions to minimize participant reactivity and the importance of training protocols for video coders. In addition, we discuss data quality, the selection and use of observational tools, calculating reliability coefficients, and coding considerations for special populations based on our collective experiences across three different populations and settings. PMID:19434651
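    Calculating a reliability coefficient for two video coders, as discussed above, can be illustrated with Cohen's kappa, one common chance-corrected agreement statistic. This is a generic sketch, not the authors' coding protocol.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders who categorized
    the same sequence of observed events."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed proportion of events on which the coders agree.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement by chance, from each coder's marginal frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1.0 - expected)
```

Kappa is 1.0 for perfect agreement and near 0 when agreement is no better than chance; values above roughly 0.8 are conventionally read as strong inter-coder reliability.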

  10. Reliability Testing Procedure for MEMS IMUs Applied to Vibrating Environments

    PubMed Central

    De Pasquale, Giorgio; Somà, Aurelio

    2010-01-01

    The diffusion of micro electro-mechanical systems (MEMS) technology applied to navigation systems is rapidly increasing, but currently, there is a lack of knowledge about the reliability of this typology of devices, representing a serious limitation to their use in aerospace vehicles and other fields with medium and high reliability requirements. In this paper, a reliability testing procedure for inertial sensors and inertial measurement units (IMU) based on MEMS for applications in vibrating environments is presented. The sensing performances were evaluated in terms of signal accuracy, systematic errors, and accidental errors; the actual working conditions were simulated by means of an accelerated dynamic excitation. A commercial MEMS-based IMU was analyzed to validate the proposed procedure. The main weaknesses of the system have been localized by providing important information about the relationship between the reliability levels of the system and individual components. PMID:22315550

  11. Reliable and robust entanglement witness

    NASA Astrophysics Data System (ADS)

    Yuan, Xiao; Mei, Quanxin; Zhou, Shan; Ma, Xiongfeng

    2016-04-01

    Entanglement, a critical resource for quantum information processing, needs to be witnessed in many practical scenarios. Theoretically, witnessing entanglement is by measuring a special Hermitian observable, called an entanglement witness (EW), which has non-negative expected outcomes for all separable states but can have negative expectations for certain entangled states. In practice, an EW implementation may suffer from two problems. The first one is reliability. Due to unreliable realization devices, a separable state could be falsely identified as an entangled one. The second problem relates to robustness. A witness may not be optimal for a target state and fail to identify its entanglement. To overcome the reliability problem, we employ a recently proposed measurement-device-independent entanglement witness scheme, in which the correctness of the conclusion is independent of the implemented measurement devices. In order to overcome the robustness problem, we optimize the EW to draw a better conclusion given certain experimental data. With the proposed EW scheme, where only data postprocessing needs to be modified compared to the original measurement-device-independent scheme, one can efficiently take advantage of the measurement results to maximally draw reliable conclusions.
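    The sign test at the heart of an entanglement witness can be illustrated numerically. This sketch assumes the textbook witness W = I/2 - |Phi+><Phi+| for the Bell state; it is not the optimized measurement-device-independent witness developed in the paper.

```python
def witness_expectation(rho):
    """Tr(W rho) for the witness W = I/2 - |Phi+><Phi+|, where
    |Phi+> = (|00> + |11>)/sqrt(2).  The expectation is non-negative
    for every separable two-qubit state and negative for states
    sufficiently close to |Phi+>.  `rho` is a real 4x4 density matrix
    in the computational basis |00>, |01>, |10>, |11>."""
    phi = [2 ** -0.5, 0.0, 0.0, 2 ** -0.5]
    w = [[(0.5 if i == j else 0.0) - phi[i] * phi[j] for j in range(4)]
         for i in range(4)]
    # Tr(W rho) = sum_ij W[i][j] * rho[j][i]
    return sum(w[i][j] * rho[j][i] for i in range(4) for j in range(4))

bell = [[0.5, 0, 0, 0.5], [0, 0, 0, 0], [0, 0, 0, 0], [0.5, 0, 0, 0.5]]
separable = [[0, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]  # |01><01|
```

A negative value certifies entanglement; a non-negative value is inconclusive, which is exactly the robustness gap the paper's optimization addresses.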

  12. Reliability in individual monitoring service.

    PubMed

    Mod Ali, N

    2011-03-01

    As a laboratory certified to ISO 9001:2008 and accredited to ISO/IEC 17025, the Secondary Standard Dosimetry Laboratory (SSDL)-Nuclear Malaysia has incorporated an overall comprehensive system for technical and quality management in promoting a reliable individual monitoring service (IMS). Faster identification and resolution of issues regarding dosemeter preparation and issuing of reports, personnel enhancement, improved customer satisfaction and overall efficiency of laboratory activities are all results of the implementation of an effective quality system. Review of these measures and responses to observed trends provide continuous improvement of the system. By having these mechanisms, reliability of the IMS can be assured in the promotion of safe behaviour at all levels of the workforce utilising ionising radiation facilities. Upgrading the reporting program to a web-based e-SSDL system marks a major improvement in the overall reliability of Nuclear Malaysia's IMS. The system is a vital step in providing a user friendly and effective occupational exposure evaluation program in the country. It provides a higher level of confidence in the results generated for occupational dose monitoring of the IMS, thus enhancing the status of the radiation protection framework of the country. PMID:21147789

  13. Adaptive inferential sensors based on evolving fuzzy models.

    PubMed

    Angelov, Plamen; Kordon, Arthur

    2010-04-01

    A new approach to the design and use of inferential sensors in the process industry is proposed in this paper, based on the recently introduced concept of evolving fuzzy models (EFMs). They address the challenge that the modern process industry faces today, namely, to develop adaptive and self-calibrating online inferential sensors that reduce maintenance costs while keeping high precision and interpretability/transparency. The proposed methodology makes it possible for inferential sensors to recalibrate automatically, which significantly reduces the life-cycle effort for their maintenance. This is achieved by the adaptive and flexible open-structure EFM used. The novelty of this paper lies in the following: (1) the overall concept of inferential sensors with an evolving and self-developing structure learned from data streams; (2) the new methodology for online automatic selection of the input variables that are most relevant for the prediction; (3) the technique to automatically detect a shift in the data pattern using the age of the clusters (and fuzzy rules); (4) the online standardization technique used by the learning procedure of the evolving model; and (5) the application of this innovative approach to several real-life industrial processes from the chemical industry (evolving inferential sensors, namely, eSensors, were used for predicting the chemical properties of different products in The Dow Chemical Company, Freeport, TX). It should be noted, however, that the methodology and conclusions of this paper are valid for the broader area of chemical and process industries in general. The results demonstrate that interpretable inferential sensors with simple structure can be designed automatically from the data stream in real time to predict various process variables of interest. The proposed approach can be used as a basis for the development of a new generation of adaptive and evolving inferential sensors that can address the

  14. Evolvable mathematical models: A new artificial Intelligence paradigm

    NASA Astrophysics Data System (ADS)

    Grouchy, Paul

    We develop a novel Artificial Intelligence paradigm to autonomously generate artificial agents as mathematical models of behaviour. Agent/environment inputs are mapped to agent outputs via equation trees which are evolved in a manner similar to Symbolic Regression in Genetic Programming. Equations are composed of only the four basic mathematical operators (addition, subtraction, multiplication, and division), as well as input and output variables and constants. From these operations, equations can be constructed that approximate any analytic function. These Evolvable Mathematical Models (EMMs) are tested and compared to their Artificial Neural Network (ANN) counterparts on two benchmarking tasks: the double-pole balancing without velocity information benchmark and the challenging discrete Double-T Maze experiments with homing. The results from these experiments show that EMMs are capable of solving tasks typically solved by ANNs, and that they have the ability to produce agents that demonstrate learning behaviours. To further explore the capabilities of EMMs, as well as to investigate the evolutionary origins of communication, we develop NoiseWorld, an Artificial Life simulation in which interagent communication emerges and evolves from initially noncommunicating EMM-based agents. Agents develop the capability to transmit their x and y position information over a one-dimensional channel via a complex, dialogue-based communication scheme. These evolved communication schemes are analyzed and their evolutionary trajectories examined, yielding significant insight into the emergence and subsequent evolution of cooperative communication. Evolved agents from NoiseWorld are successfully transferred onto physical robots, demonstrating the transferability of EMM-based AIs from simulation into physical reality.
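    An equation tree built from the four basic operators, as described above, can be evaluated with a short recursive sketch. The tuple encoding and the protected-division convention are illustrative assumptions, not the paper's actual implementation.

```python
import operator

# The four basic operators; division is "protected" (a common Genetic
# Programming convention, assumed here) so near-zero denominators do
# not crash evaluation.
OPS = {
    "+": operator.add,
    "-": operator.sub,
    "*": operator.mul,
    "/": lambda a, b: a / b if abs(b) > 1e-9 else 1.0,
}

def evaluate(tree, env):
    """Evaluate a nested-tuple equation tree such as
    ("+", "x", ("*", 2.0, "y")).  Strings are input variables looked
    up in `env`; bare numbers are constants."""
    if isinstance(tree, tuple):
        op, left, right = tree
        return OPS[op](evaluate(left, env), evaluate(right, env))
    if isinstance(tree, str):
        return env[tree]
    return tree

model = ("+", "x", ("*", 2.0, "y"))  # x + 2y
```

An evolutionary loop would mutate and recombine such trees and select on task performance; only the evaluator is sketched here.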

  15. Developing Architectures and Technologies for an Evolvable NASA Space Communication Infrastructure

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Hayden, Jeffrey

    2004-01-01

    Space communications architecture concepts play a key role in the development and deployment of NASA's future exploration and science missions. Once a mission is deployed, the communication link to the user needs to provide maximum information delivery and flexibility to handle the expected large and complex data sets and to enable direct interaction with the spacecraft and experiments. In human and robotic missions, communication systems need to offer maximum reliability with robust two-way links for software uploads and virtual interactions. Identifying the capabilities to cost effectively meet the demanding space communication needs of 21st century missions, proper formulation of the requirements for these missions, and identifying the early technology developments that will be needed can only be resolved with architecture design. This paper will describe the development of evolvable space communication architecture models and the technologies needed to support Earth sensor web and collaborative observation formation missions; robotic scientific missions for detailed investigation of planets, moons, and small bodies in the solar system; human missions for exploration of the Moon, Mars, Ganymede, Callisto, and asteroids; human settlements in space, on the Moon, and on Mars; and great in-space observatories for observing other star systems and the universe. The resulting architectures will enable the reliable, multipoint, high data rate capabilities needed on demand to provide continuous, maximum coverage of areas of concentrated activities, such as in the vicinity of outposts in-space, on the Moon or on Mars.

  16. Reliability analysis of repairable systems using system dynamics modeling and simulation

    NASA Astrophysics Data System (ADS)

    Srinivasa Rao, M.; Naikan, V. N. A.

    2014-07-01

    The study and analysis of repairable standby systems is an important topic in reliability. Analytical techniques become very complicated and unrealistic, especially for modern complex systems. There have been attempts in the literature to evolve more realistic techniques using a simulation approach for the reliability analysis of systems. This paper proposes a hybrid approach, called the Markov system dynamics (MSD) approach, which combines the Markov approach with system dynamics simulation for reliability analysis and for studying the dynamic behavior of systems. This approach has the advantages of both the Markov and system dynamics methodologies. The proposed framework is illustrated for a standby system with repair. The results of the simulation, when compared with those obtained by traditional Markov analysis, clearly validate the MSD approach as an alternative approach for reliability analysis.
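    For a concrete, deliberately simple instance of the Markov side of such an analysis, the steady-state availability of a two-unit cold-standby system with a single repair crew follows in closed form from the birth-death balance equations. This sketch is a generic textbook example, assumed for illustration; it is not the paper's MSD framework.

```python
def standby_availability(lam, mu):
    """Steady-state availability of a two-unit cold-standby system with
    one repair crew.  States count failed units (0, 1, 2); the system
    is down only in state 2.  Balance equations with failure rate
    `lam` and repair rate `mu` give p1 = r*p0 and p2 = r*p1, r = lam/mu,
    with p0 + p1 + p2 = 1."""
    r = lam / mu
    p0 = 1.0 / (1.0 + r + r * r)
    p2 = r * r * p0
    return 1.0 - p2

# Hypothetical rates: one failure per 100 h, one repair per 10 h.
availability = standby_availability(0.01, 0.1)
```

A system dynamics simulation of the same two rates should converge to this analytical value, which is exactly the kind of cross-validation the paper performs.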

  17. Cell biology, molecular embryology, Lamarckian and Darwinian selection as evolvability.

    PubMed

    Hoenigsberg, H

    2003-01-01

    The evolvability of vertebrate systems involves various mechanisms that eventually generate cooperative and nonlethal functional variation on which Darwinian selection can operate. It is a truism that to get vertebrate animals to develop a coherent machine they first had to inherit the right multicellular ontogeny. The ontogeny of a metazoan involves cell lineages that progressively deny their own capacity for increase and for totipotency in benefit of the collective interest of the individual. To achieve such cell altruism, Darwinian dynamics rescinded its original unicellular mandate to reproduce. The distinction between heritability at the level of the cell lineage and at the level of the individual is crucial. However, its implications have seldom been explored in depth. While all-out reproduction is the Darwinian measure of success among unicellular organisms, a high replication rate of cell lineages within the organism may be deleterious to the individual as a functional unit. If a harmoniously functioning unit is to evolve, mechanisms must have evolved whereby variants that increase their own replication rate by failing to accept their own somatic duties are controlled. (For questions involving organelle origins, see Godelle and Reboud, 1995, and Hoekstra, 1990.) In other words, modifiers of conflict that control cell lineages with conflicting genes and new mutant replication rates that deviate from their somatic duties had to evolve. Our thesis is that selection at the level of the (multicellular) individual must have opposed selection at the level of the cell lineage. The metazoan embryo is not immune to this conflict, especially with the evolution of set-aside cells and other modes of self-policing modifiers (Blackstone and Ellison, 1998; Ransick et al., 1996). In fact, the conflict between the two selection processes permitted a Lamarckian soma-to-germline feedback loop.
This new element in metazoan ontogeny became the evolvability of the vertebrate adaptive

  18. Occasions and the Reliability of Classroom Observations: Alternative Conceptualizations and Methods of Analysis

    ERIC Educational Resources Information Center

    Meyer, J. Patrick; Cash, Anne H.; Mashburn, Andrew

    2011-01-01

    Student-teacher interactions are dynamic relationships that change and evolve over the course of a school year. Measuring classroom quality through observations that focus on these interactions presents challenges when observations are conducted throughout the school year. Variability in observed scores could reflect true changes in the quality of…

  19. Multi-hop routing mechanism for reliable sensor computing.

    PubMed

    Chen, Jiann-Liang; Ma, Yi-Wei; Lai, Chia-Ping; Hu, Chia-Cheng; Huang, Yueh-Min

    2009-01-01

    Current research on routing in wireless sensor computing concentrates on increasing the service lifetime, enabling scalability for large numbers of sensors, and supporting fault tolerance for battery exhaustion and broken nodes. A sensor node is naturally exposed to various sources of unreliable communication channels and node failures. Sensor nodes have many failure modes, and each failure degrades the network performance. This work develops a novel mechanism, called the Reliable Routing Mechanism (RRM), based on a hybrid cluster-based routing protocol to select the most reliable routing path for sensor computing. Table-driven intra-cluster routing and on-demand inter-cluster routing are combined by changing the relationship between clusters for sensor computing. Applying a reliable routing mechanism in sensor computing can improve routing reliability, maintain low packet loss, minimize management overhead and save energy consumption. Simulation results indicate that the reliability of the proposed RRM mechanism is around 25% higher than that of the Dynamic Source Routing (DSR) and ad hoc On-demand Distance Vector routing (AODV) mechanisms. PMID:22303165
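    Selecting the most reliable multi-hop path, when each link is assigned an independent success probability, can be sketched by running Dijkstra's algorithm on -log(p) edge weights, so that the shortest additive path is the one maximizing the product of link reliabilities. This generic example is an assumption for illustration, not the RRM protocol itself.

```python
import heapq
import math

def most_reliable_path(links, src, dst):
    """Return (path, reliability) maximizing the product of per-link
    success probabilities.  `links` is a list of undirected edges
    (u, v, p) with 0 < p <= 1.  Maximizing a product of probabilities
    is equivalent to minimizing the sum of -log(p) weights."""
    graph = {}
    for u, v, p in links:
        w = -math.log(p)
        graph.setdefault(u, []).append((v, w))
        graph.setdefault(v, []).append((u, w))

    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, math.inf):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, math.inf):
                dist[v] = d + w
                prev[v] = u
                heapq.heappush(heap, (d + w, v))

    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], math.exp(-dist[dst])

# Two 0.9 links in series (0.81) beat one direct 0.7 link.
path, rel = most_reliable_path(
    [("A", "B", 0.9), ("B", "C", 0.9), ("A", "C", 0.7)], "A", "C")
```

A real sensor-network mechanism would also weigh energy and cluster membership, which is where RRM's hybrid intra/inter-cluster design departs from this plain shortest-path sketch.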

  20. Chaos and reliability in balanced spiking networks with temporal drive

    NASA Astrophysics Data System (ADS)

    Lajoie, Guillaume; Lin, Kevin K.; Shea-Brown, Eric

    2013-05-01

    Biological information processing is often carried out by complex networks of interconnected dynamical units. A basic question about such networks is that of reliability: If the same signal is presented many times with the network in different initial states, will the system entrain to the signal in a repeatable way? Reliability is of particular interest in neuroscience, where large, complex networks of excitatory and inhibitory cells are ubiquitous. These networks are known to autonomously produce strongly chaotic dynamics—an obvious threat to reliability. Here, we show that such chaos persists in the presence of weak and strong stimuli, but that even in the presence of chaos, intermittent periods of highly reliable spiking often coexist with unreliable activity. We elucidate the local dynamical mechanisms involved in this intermittent reliability, and investigate the relationship between this phenomenon and certain time-dependent attractors arising from the dynamics. A conclusion is that chaotic dynamics do not have to be an obstacle to precise spike responses, a fact with implications for signal coding in large networks.