Science.gov

Sample records for evolving reliable relationships

  1. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, T.; Toon, J.; Conner, A.; Adams, T.; Miranda, D.

    2016-01-01

    This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program's subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. These new allocations are based on updated designs and maintainability characteristics of the components. It was found that trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.

  2. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, Jamie; Toon, Troy; Adams, Timothy C.; Miranda, David J.

    2016-01-01

    This paper describes the methodology that was developed to allocate reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program's subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. Allocating is an iterative process; as systems moved beyond their conceptual and preliminary design phases, this provided an opportunity for the reliability engineering team to reevaluate allocations based on updated designs and maintainability characteristics of the components. Trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper will discuss the value of modifying reliability and maintainability allocations made for the GSDO subsystems as the program nears the end of its design phase.

  3. Evolving Reliability and Maintainability Allocations for NASA Ground Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Gisela; Toon, Troy; Toon, Jamie; Conner, Angelo C.; Adams, Timothy C.; Miranda, David J.

    2016-01-01

    This paper describes the methodology and value of modifying allocations to reliability and maintainability requirements for the NASA Ground Systems Development and Operations (GSDO) program’s subsystems. As systems progressed through their design life cycle and hardware data became available, it became necessary to reexamine the previously derived allocations. This iterative process provided an opportunity for the reliability engineering team to reevaluate allocations as systems moved beyond their conceptual and preliminary design phases. These new allocations are based on updated designs and maintainability characteristics of the components. It was found that trade-offs in reliability and maintainability were essential to ensuring the integrity of the reliability and maintainability analysis. This paper discusses the results of reliability and maintainability reallocations made for the GSDO subsystems as the program nears the end of its design phase.
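    A recurring theme in these records is the trade-off between reliability and maintainability. That trade-off is commonly made concrete through steady-state availability, A = MTBF / (MTBF + MTTR). A minimal sketch (the numbers are illustrative only, not GSDO data):

    ```python
    def availability(mtbf_hours: float, mttr_hours: float) -> float:
        """Steady-state availability from mean time between failures (MTBF)
        and mean time to repair (MTTR)."""
        return mtbf_hours / (mtbf_hours + mttr_hours)

    # Two candidate allocations that meet the same availability target:
    # a highly reliable subsystem that is slow to repair, versus a less
    # reliable subsystem that is quick to repair.
    a1 = availability(mtbf_hours=1000.0, mttr_hours=10.0)
    a2 = availability(mtbf_hours=500.0, mttr_hours=5.0)
    ```

    Here the less reliable subsystem compensates with a faster repair time and achieves the same availability, which is the kind of reallocation the abstracts describe as designs mature.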

  4. Towards resolving Lamiales relationships: insights from rapidly evolving chloroplast sequences

    PubMed Central

    2010-01-01

    Background In the large angiosperm order Lamiales, a diverse array of highly specialized life strategies such as carnivory, parasitism, epiphytism, and desiccation tolerance occur, and some lineages possess drastically accelerated DNA substitutional rates or miniaturized genomes. However, understanding the evolution of these phenomena in the order, and clarifying borders of and relationships among lamialean families, has been hindered by largely unresolved trees in the past. Results Our analysis of the rapidly evolving trnK/matK, trnL-F and rps16 chloroplast regions enabled us to infer more precise phylogenetic hypotheses for the Lamiales. Relationships among the nine first-branching families in the Lamiales tree are now resolved with very strong support. Subsequent to Plocospermataceae, a clade consisting of Carlemanniaceae plus Oleaceae branches, followed by Tetrachondraceae and a newly inferred clade composed of Gesneriaceae plus Calceolariaceae, which is also supported by morphological characters. Plantaginaceae (incl. Gratioleae) and Scrophulariaceae are well separated in the backbone grade; Lamiaceae and Verbenaceae appear in distant clades, while the recently described Linderniaceae are confirmed to be monophyletic and in an isolated position. Conclusions Confidence about deep nodes of the Lamiales tree is an important step towards understanding the evolutionary diversification of a major clade of flowering plants. The degree of resolution obtained here now provides a first opportunity to discuss the evolution of morphological and biochemical traits in Lamiales. The multiple independent evolution of the carnivorous syndrome, once in Lentibulariaceae and a second time in Byblidaceae, is strongly supported by all analyses and topological tests. The evolution of selected morphological characters such as flower symmetry is discussed. 
The addition of further sequence data from introns and spacers holds promise to eventually obtain a fully resolved plastid tree of…

  5. Young People and Alcohol in Italy: An Evolving Relationship

    ERIC Educational Resources Information Center

    Beccaria, Franca; Prina, Franco

    2010-01-01

    In Italy, commonly held opinions and interpretations about the relationship between young people and alcohol are often expressed as generalizations and approximations. In order to further understanding of the relationship between young people and alcohol in contemporary Italy, we have gathered, compared and discussed all the available data, both…

  6. Models of Shared Leadership: Evolving Structures and Relationships.

    ERIC Educational Resources Information Center

    Hallinger, Philip; Richardson, Don

    1988-01-01

    Explores potential changes in the power relationships among teachers and principals. Describes and analyzes the following models of teacher decision-making: (1) Instructional Leadership Teams; (2) Principals' Advisory Councils; (3) School Improvement Teams; and (4) Lead Teacher Committees. (FMW)

  7. Do aggressive signals evolve towards higher reliability or lower costs of assessment?

    PubMed

    Ręk, P

    2014-12-01

    It has been suggested that the evolution of signals must be a wasteful process for the signaller, aimed at the maximization of signal honesty. However, the reliability of communication depends not only on the costs paid by signallers but also on the costs paid by receivers during assessment, and less attention has been given to the interaction between these two types of costs during the evolution of signalling systems. A signaller and receiver may accept some level of signal dishonesty by choosing signals that are cheaper in terms of assessment but that are stabilized by less reliable mechanisms. I studied the potential trade-off between signal reliability and the costs of signal assessment in the corncrake (Crex crex). I found that the birds prefer signals that are less costly to assess rather than more reliable. Despite the fact that the fundamental frequency of calls was a strong predictor of male size, it was ignored by receivers unless they could directly compare signal variants. My data revealed a response advantage of costly signals when comparison between calls differing in fundamental frequency is fast and straightforward, whereas cheap signalling is preferred in natural conditions. These data might improve our understanding of the influence of receivers on signal design because they support the hypothesis that fully honest signalling systems may be prone to dishonesty based on the effects of receiver costs and be replaced by signals that are cheaper in production and reception but more susceptible to cheating.

  8. ELECTRICAL SUBSTATION RELIABILITY EVALUATION WITH EMPHASIS ON EVOLVING INTERDEPENDENCE ON COMMUNICATION INFRASTRUCTURE.

    SciTech Connect

    AZARM,M.A.; BARI,R.; YUE,M.; MUSICKI,Z.

    2004-09-12

    This study developed a probabilistic methodology for assessment of the reliability and security of electrical energy distribution networks. This included consideration of the future grid system, which will rely heavily on the existing digitally based communication infrastructure for monitoring and protection. Event tree and fault tree methods were utilized. The approach extensively modeled the types of faults that a grid could potentially experience, the response of the grid, and the specific design of the protection schemes. We demonstrated the methods by applying them to a small sub-section of a hypothetical grid based on an existing electrical grid system of a metropolitan area. The results showed that for a typical design that relies on a communication network for protection, the communication network reliability could contribute significantly to the frequency of loss of electrical power. The reliability of the communication network could become a more important contributor to the electrical grid reliability as the utilization of the communication network significantly increases in the near future to support "smart" transmission and/or distributed generation.
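    The event tree and fault tree approach described here can be sketched as a toy two-gate model. The probabilities below are invented for illustration and are not from the study:

    ```python
    # Toy fault-tree sketch: loss of power occurs if a grid fault happens
    # AND the protection fails; protection fails if the relay fails OR the
    # communication link is unavailable. All numbers are hypothetical.
    p_fault = 0.1         # per-year probability of an initiating grid fault
    p_relay_fail = 0.01   # relay hardware failure on demand
    p_comms_fail = 0.05   # communication network unavailable on demand

    # OR gate for independent basic events: 1 minus product of successes.
    p_protection_fails = 1 - (1 - p_relay_fail) * (1 - p_comms_fail)

    # AND gate: the initiating fault and the protection failure must coincide.
    p_loss_of_power = p_fault * p_protection_fails
    ```

    Because the communication link sits under an OR gate in the protection branch, its unavailability feeds almost directly into the top-event frequency, which matches the sensitivity the study reports.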

  9. Evolving Cross-Group Relationships: The Story of Miller High, 1950-2000

    ERIC Educational Resources Information Center

    Eick, Caroline

    2011-01-01

    This paper examines students' evolving cross-group relationships in a comprehensive high school in Baltimore County, Maryland, USA, between 1950 and 2000. The findings of this research, situated at the intersections of two lenses of inquiry: oral historical analysis and critical studies, uncover both the power of students accustomed to integrated…

  10. Considering context: reliable entity networks through contextual relationship extraction

    NASA Astrophysics Data System (ADS)

    David, Peter; Hawes, Timothy; Hansen, Nichole; Nolan, James J.

    2016-05-01

    Existing information extraction techniques can only partially address the problem of exploiting unmanageably large amounts of text. When discussion of events and relationships is limited to simple, past-tense, factual descriptions of events, current NLP-based systems can identify events and relationships and extract a limited amount of additional information. But the simple subset of available information that existing tools can extract from text is only useful to a small set of users and problems. Automated systems need to find and separate information based on what is threatened or planned to occur, has occurred in the past, or could potentially occur. We address the problem of advanced event and relationship extraction with our event and relationship attribute recognition system, which labels generic, planned, recurring, and potential events. The approach is based on a combination of new machine learning methods, novel linguistic features, and crowd-sourced labeling. The attribute labeler closes the gap between structured event and relationship models and the complicated and nuanced language that people use to describe them. Our operational-quality event and relationship attribute labeler enables Warfighters and analysts to more thoroughly exploit information in unstructured text. This is made possible through 1) more precise event and relationship interpretation, 2) more detailed information about extracted events and relationships, and 3) more reliable and informative entity networks that acknowledge the different attributes of entity-entity relationships.

  11. ELECTRICAL SUBSTATION RELIABILITY EVALUATION WITH EMPHASIS ON EVOLVING INTERDEPENDENCE ON COMMUNICATION INFRASTRUCTURE.

    SciTech Connect

    AZARM, M.A.; BARI, R.A.; MUSICKI, Z.

    2004-01-15

    The objective of this study is to develop a methodology for a probabilistic assessment of the reliability and security of electrical energy distribution networks. This includes consideration of the future grid system, which will rely heavily on the existing digitally based communication infrastructure for monitoring and protection. Another important objective of this study is to provide information and insights from this research to Consolidated Edison Company (Con Edison) that could be useful in the design of the new network segment to be installed in the area of the World Trade Center in lower Manhattan. Our method is microscopic in nature and relies heavily on the specific design of the portion of the grid being analyzed. It extensively models the types of faults that a grid could potentially experience, the response of the grid, and the specific design of the protection schemes. We demonstrate that the existing technology can be extended and applied to the electrical grid and to the supporting communication network. A small subsection of a hypothetical grid based on the existing New York City electrical grid system of Con Edison is used to demonstrate the methods. Sensitivity studies show that in the current design the frequency for the loss of the main station is sensitive to the communication network reliability. The reliability of the communication network could become a more important contributor to the electrical grid reliability as the utilization of the communication network significantly increases in the near future to support "smart" transmission and/or distributed generation. The identification of potential failure modes and their likelihood can support decisions on potential modifications to the network including hardware, monitoring instrumentation, and protection systems.

  12. Changes of scaling relationships in an evolving population: The example of "sedimentary" stylolites

    NASA Astrophysics Data System (ADS)

    Peacock, D. C. P.; Korneva, I.; Nixon, C. W.; Rotevatn, A.

    2017-03-01

    Bed-parallel ("sedimentary") stylolites are used as an example of a population that evolves by the addition of new components, their growth and their merger. It is shown that this style of growth controls the changes in the scaling relationships of the population. Stylolites tend to evolve in carbonate rocks through time, for example by compaction during progressive burial. The evolution of a population of stylolites, and their likely effects on porosity, are demonstrated using simple numerical models. Starting with a power-law distribution, the adding of new stylolites, the increase in their amplitudes and their merger decrease the slope of magnitude versus cumulative frequency of the population. The population changes to a non-power-law distribution as smaller stylolites merge to form larger stylolites. The results suggest that other populations can be forward- or backward-modelled, such as fault lengths, which also evolve by the addition of components, their growth and merger. Consideration of the ways in which populations change improves understanding of scaling relationships and vice versa, and would assist in the management of geofluid reservoirs.
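    The nucleation-growth-merger process the abstract describes can be sketched as a toy simulation; all parameters and distributions below are invented for illustration, not taken from the paper's models:

    ```python
    import random

    random.seed(42)

    def pareto_sample(n, alpha=1.5, xmin=1.0):
        """Draw n amplitudes from a power-law (Pareto) distribution."""
        return [xmin * (1.0 - random.random()) ** (-1.0 / alpha) for _ in range(n)]

    # Initial power-law population of stylolite amplitudes.
    population = pareto_sample(200)

    def evolve(pop, n_new=20, growth=1.1, n_merge=10):
        """One step: grow all members, nucleate new small stylolites,
        then merge random pairs (amplitudes add on merger)."""
        pop = [a * growth for a in pop] + pareto_sample(n_new)
        for _ in range(n_merge):
            i, j = random.sample(range(len(pop)), 2)
            pop[i] += pop[j]
            pop.pop(j)
        return pop

    for _ in range(5):
        population = evolve(population)
    ```

    In such a run the merger step concentrates amplitude in fewer, larger stylolites, which is what flattens the magnitude versus cumulative-frequency slope relative to the starting power law.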

  13. Craniosacral rhythm: reliability and relationships with cardiac and respiratory rates.

    PubMed

    Hanten, W P; Dawson, D D; Iwata, M; Seiden, M; Whitten, F G; Zink, T

    1998-03-01

    Craniosacral rhythm (CSR) has long been the subject of debate, both over its existence and its use as a therapeutic tool in evaluation and treatment. Origins of this rhythm are unknown, and palpatory findings lack scientific support. The purpose of this study was to determine the intra- and inter-examiner reliabilities of the palpation of the rate of the CSR and the relationship between the rate of the CSR and the heart or respiratory rates of subjects and examiners. The rates of the CSR of 40 healthy adults were palpated twice by each of two examiners. The heart and respiratory rates of the examiners and the subjects were recorded while the rates of the subjects' CSR were palpated by the examiners. Intraclass correlation coefficients were calculated to determine the intra- and inter-examiner reliabilities of the palpation. Two multiple regression analyses, one for each examiner, were conducted to analyze the relationships between the rate of the CSR and the heart and respiratory rates of the subjects and the examiners. The intraexaminer reliability coefficients were 0.78 for examiner A and 0.83 for examiner B, and the interexaminer reliability coefficient was 0.22. The result of the multiple regression analysis for examiner A was R = 0.46 and adjusted R2 = 0.12 (p = 0.078) and for examiner B was R = 0.63 and adjusted R2 = 0.32 (p = 0.001). The highest bivariate correlation was found between the CSR and the subject's heart rate (r = 0.30) for examiner A and between the CSR and the examiner's heart rate (r = 0.42) for examiner B. The results indicated that a single examiner may be able to palpate the rate of the CSR consistently, if that is what we truly measured. It is possible that the perception of CSR is illusory. The rate of the CSR palpated by two examiners is not consistent. The results of the regression analysis of one examiner offered no validation to those of the other. 
It appears that a subject's CSR is not related to the heart or respiratory rates of the…
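The reliability coefficients reported here are intraclass correlation coefficients (ICCs). As a sketch, a one-way random-effects ICC(1,1) can be computed from an n-subjects by k-raters table as follows; the exact ICC form used in the study is not stated, and the ratings below are hypothetical:

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for a list of per-subject rows,
    each row holding k ratings of that subject."""
    n = len(ratings)
    k = len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    subject_means = [sum(row) / k for row in ratings]
    # Between-subjects and within-subjects mean squares.
    msb = k * sum((m - grand) ** 2 for m in subject_means) / (n - 1)
    ssw = sum((x - m) ** 2 for row, m in zip(ratings, subject_means) for x in row)
    msw = ssw / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Two repeated palpation ratings per subject (made-up numbers).
ratings = [[8, 9], [10, 10], [6, 7], [12, 11], [9, 9]]
icc = icc_oneway(ratings)
```

High values (as for each single examiner here) indicate consistent repeated measurements; the low inter-examiner coefficient of 0.22 reported in the abstract corresponds to raters who do not agree with each other.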

  14. On the relationship between coefficient alpha and composite reliability.

    PubMed

    Peterson, Robert A; Kim, Yeolib

    2013-01-01

    Cronbach's coefficient alpha is the most widely used estimator of the reliability of tests and scales. However, it has been criticized as being a lower bound and hence underestimating true reliability. A popular alternative to coefficient alpha is composite reliability, which is usually calculated in conjunction with structural equation modeling. A quantitative analysis of 2,524 pairs of coefficient alpha and composite reliability values derived from empirical investigations revealed that although the average composite reliability value (.86) exceeded the average corresponding coefficient alpha value (.84), the difference was relatively inconsequential for practical applications such as meta-analysis.
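    The two estimators being compared have simple closed forms: coefficient alpha uses item and total-score variances, while composite reliability is computed from standardized factor loadings. A minimal sketch with made-up data and loadings:

    ```python
    def cronbach_alpha(item_scores):
        """item_scores: list of k items, each a list of scores for the
        same n respondents. alpha = k/(k-1) * (1 - sum(item var)/total var)."""
        k = len(item_scores)
        n = len(item_scores[0])

        def var(xs):
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

        totals = [sum(item[i] for item in item_scores) for i in range(n)]
        return k / (k - 1) * (1 - sum(var(item) for item in item_scores) / var(totals))

    def composite_reliability(loadings):
        """Composite reliability from standardized loadings, with error
        variances taken as 1 - loading**2."""
        s = sum(loadings)
        theta = sum(1 - l ** 2 for l in loadings)
        return s ** 2 / (s ** 2 + theta)

    alpha = cronbach_alpha([[1, 2, 3, 4], [2, 2, 4, 4], [1, 3, 3, 5]])
    cr = composite_reliability([0.7, 0.8, 0.75])
    ```

    As the abstract notes, the two estimates typically land close together in practice, with composite reliability usually slightly higher than alpha.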

  15. How Mentoring Relationships Evolve: A Longitudinal Study of Academic Pediatricians in a Physician Educator Faculty Development Program

    ERIC Educational Resources Information Center

    Balmer, Dorene; D'Alessandro, Donna; Risko, Wanessa; Gusic, Maryellen E.

    2011-01-01

    Introduction: Mentoring is increasingly recognized as central to career development. Less attention has been paid, however, to how mentoring relationships evolve over time. To provide a more complete picture of these complex relationships, the authors explored mentoring from a mentee's perspective within the context of a three-year faculty…

  16. The Unidimensional Relationship Closeness Scale (URCS): Reliability and Validity Evidence for a New Measure of Relationship Closeness

    ERIC Educational Resources Information Center

    Dibble, Jayson L.; Levine, Timothy R.; Park, Hee Sun

    2012-01-01

    A fundamental dimension along which all social and personal relationships vary is closeness. The Unidimensional Relationship Closeness Scale (URCS) is a 12-item self-report scale measuring the closeness of social and personal relationships. The reliability and validity of the URCS were assessed with college dating couples (N = 192), female friends…

  17. Reliability assurance program and its relationship to other regulations

    SciTech Connect

    Polich, T.J.

    1994-12-31

    The need for a safety-oriented reliability effort for the nuclear industry was identified by the U.S. Nuclear Regulatory Commission (NRC) in the Three Mile Island Action Plan (NUREG-0660) Item II.C.4. In SECY-89-013, "Design Requirements Related to the Evolutionary ALWR," the staff stated that the reliability assurance program (RAP) would be required for design certification to ensure that the design reliability of safety-significant structures, systems, and components (SSCs) is maintained over the life of a plant. In November 1988, the staff informed the advanced light water reactor (ALWR) vendors and the Electric Power Research Institute (EPRI) that it was considering this matter. Since that time, the staff has had numerous interactions with industry regarding RAP. These include discussions and subsequent safety evaluation reports on the EPRI utilities requirements document and for both Evolutionary Designs. The RAP has also been discussed in SECY-93-087, "Policy, Technical, and Licensing Issues Pertaining to Evolutionary and Advanced Light-Water Reactor (ALWR) Designs," and SECY-94-084, "Policy and Technical Issues Associated With the Regulatory Treatment of Non-Safety Systems in Passive Plant Designs."

  18. Generalized storage-reliability-yield relationships for rainwater harvesting systems

    NASA Astrophysics Data System (ADS)

    Hanson, L. S.; Vogel, R. M.

    2014-07-01

    Sizing storage for rainwater harvesting (RWH) systems is often a difficult design consideration, as the system must be designed specifically for the local rainfall pattern. We introduce a generally applicable method for estimating the required storage by using regional regression equations to account for climatic differences in the behavior of RWH systems across the entire continental United States. A series of simulations for 231 locations with continuous daily precipitation records enables the development of storage-reliability-yield (SRY) relations at four useful reliabilities: 0.8, 0.9, 0.95, and 0.98. Multivariate, log-linear regression results in storage equations that include demand, collection area, and local precipitation statistics. The continental regression equations demonstrated excellent goodness-of-fit (R2 = 0.96-0.99) using only two precipitation parameters, and fits improved when three geographic regions with more homogeneous rainfall characteristics were considered. The SRY models can be used to obtain a preliminary estimate of how large to build a storage tank almost anywhere in the United States based on desired yield and reliability, collection area, and local rainfall statistics. Our methodology could be extended to other regions of the world, and the equations presented herein could be used to investigate how RWH systems would respond to changes in climatic variability. The resulting model may also prove useful in regional planning studies to evaluate the net benefits which result from the broad use of RWH to meet water supply requirements. We outline numerous other possible extensions to our work, which, when taken together, illustrate the value of our initial generalized SRY model for RWH systems.
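    The storage equations described here are log-linear in demand and precipitation statistics. A hedged sketch of that functional form follows; the coefficients and predictor set are placeholders chosen for illustration, not the paper's fitted regional values:

    ```python
    import math

    def required_storage(demand, mean_precip, cv_precip, b=(-1.0, 1.1, -0.4, 0.6)):
        """Hypothetical log-linear SRY relation:
        ln(S) = b0 + b1*ln(demand) + b2*ln(mean precip) + b3*ln(precip CV).
        Coefficients b are illustrative placeholders."""
        b0, b1, b2, b3 = b
        ln_s = (b0 + b1 * math.log(demand)
                + b2 * math.log(mean_precip)
                + b3 * math.log(cv_precip))
        return math.exp(ln_s)

    # Doubling demand at fixed climate should require more storage.
    small = required_storage(200.0, 3.0, 2.0)
    large = required_storage(400.0, 3.0, 2.0)
    ```

    The log-linear form is convenient because each coefficient acts as an elasticity: a fixed percentage change in a predictor scales the required storage by a fixed percentage.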

  19. The relationship between reliability and bonding techniques in hybrid microcircuits

    NASA Technical Reports Server (NTRS)

    Caruso, S. V.; Kinser, D. L.; Graff, S. M.; Allen, R. V.

    1975-01-01

    Differential thermal expansion was shown to be responsible for many observed failures in ceramic chip capacitors mounted on alumina substrates. It is shown that the mounting techniques used in bonding the capacitors have a marked effect upon the thermally induced mechanical stress and thus the failure rate. A mathematical analysis was conducted of a composite model of the capacitor-substrate system to predict the magnitude of thermally induced stresses. It was experimentally observed that the stresses in more compliant bonding systems such as soft lead tin and indium solders are significantly lower than those in hard solder and epoxy systems. The marked dependence upon heating and cooling rate was proven to be a determining factor in the prediction of failure solder systems. It was found that the harder or higher melting solders are less susceptible to thermal cycling effects but that they are more likely to fail during initial processing operations. Strain gage techniques were used to determine thermally induced expansion stresses of the capacitors and the alumina substrates. The compliance of the different bonding mediums was determined. From the data obtained, several recommendations are made concerning the optimum bonding system for the achievement of maximum reliability.

  20. On the relationships between generative encodings, regularity, and learning abilities when evolving plastic artificial neural networks.

    PubMed

    Tonelli, Paul; Mouret, Jean-Baptiste

    2013-01-01

    A major goal of bio-inspired artificial intelligence is to design artificial neural networks with abilities that resemble those of animal nervous systems. It is commonly believed that two keys for evolving nature-like artificial neural networks are (1) the developmental process that links genes to nervous systems, which enables the evolution of large, regular neural networks, and (2) synaptic plasticity, which allows neural networks to change during their lifetime. So far, these two topics have been mainly studied separately. The present paper shows that they are actually deeply connected. Using a simple operant conditioning task and a classic evolutionary algorithm, we compare three ways to encode plastic neural networks: a direct encoding, a developmental encoding inspired by computational neuroscience models, and a developmental encoding inspired by morphogen gradients (similar to HyperNEAT). Our results suggest that using a developmental encoding could improve the learning abilities of evolved, plastic neural networks. Complementary experiments reveal that this result is likely the consequence of the bias of developmental encodings towards regular structures: (1) in our experimental setup, encodings that tend to produce more regular networks yield networks with better general learning abilities; (2) whatever the encoding is, networks that are the more regular are statistically those that have the best learning abilities.
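    Synaptic plasticity of the kind these encodings control is often implemented as a Hebbian rule. A minimal, self-contained sketch (a generic Hebbian update, not the authors' exact model):

    ```python
    def hebbian_step(weights, pre, post, eta=0.1):
        """Hebbian update w_ij += eta * pre_i * post_j, clipped to [-1, 1]."""
        return [[max(-1.0, min(1.0, w + eta * a * b)) for b, w in zip(post, row)]
                for a, row in zip(pre, weights)]

    # A 2x2 weight matrix, initially all zero.
    w = [[0.0, 0.0], [0.0, 0.0]]

    # Repeated co-activation of the first pre/post pair strengthens
    # only the corresponding connection.
    for _ in range(5):
        w = hebbian_step(w, pre=[1.0, 0.0], post=[1.0, 0.0])
    ```

    In the evolutionary setting the paper studies, it is parameters like the learning rate and the wiring pattern that the genetic encoding specifies, while updates of this kind run during the network's lifetime.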

  1. Animals Used in Research and Education, 1966-2016: Evolving Attitudes, Policies, and Relationships.

    PubMed

    Lairmore, Michael D; Ilkiw, Jan

    2015-01-01

    Since the inception of the Association of American Veterinary Medical Colleges (AAVMC), the use of animals in research and education has been a central element of the programs of member institutions. As veterinary education and research programs have evolved over the past 50 years, so too have societal views and regulatory policies. AAVMC member institutions have continually responded to these events by exchanging best practices in training their students in the framework of comparative medicine and the needs of society. Animals provide students and faculty with the tools to learn the fundamental knowledge and skills of veterinary medicine and scientific discovery. The study of animal models has contributed extensively to medicine, veterinary medicine, and basic sciences as these disciplines seek to understand life processes. Changing societal views over the past 50 years have provided active examination and continued refinement of the use of animals in veterinary medical education and research. The future use of animals to educate and train veterinarians will likely continue to evolve as technological advances are applied to experimental design and educational systems. Natural animal models of both human and animal health will undoubtedly continue to serve a significant role in the education of veterinarians and in the development of new treatments of animal and human disease. As it looks to the future, the AAVMC as an organization will need to continue to support and promote best practices in the humane care and appropriate use of animals in both education and research.

  2. Do phytoplankton communities evolve through a self-regulatory abundance-diversity relationship?

    PubMed

    Roy, Shovonlal

    2009-02-01

    A small group of phytoplankton species that produce toxic or allelopathic chemicals has a significant effect on plankton dynamics in marine ecosystems. The species of non-toxic phytoplankton, which are large in number, are affected by the toxin-allelopathy of those species. By analysis of the abundance data of marine phytoplankton collected from the North-West coast of the Bay of Bengal, an empirical relationship between the abundance of the potential toxin-producing species and the species diversity of the non-toxic phytoplankton is formulated. A change-point analysis demonstrates that the diversity of non-toxic phytoplankton increases with the increase of toxic species up to a certain level. However, for a massive increase of the toxin-producing species the diversity of phytoplankton at species level reduces gradually. Following the results, a deterministic relationship between the abundance of toxic phytoplankton and the diversity of non-toxic phytoplankton is developed. The abundance-diversity relationship develops a unimodal pathway through which the abundance of toxic species regulates the diversity of phytoplankton. These results contribute to the current understanding of the coexistence and biodiversity of phytoplankton, the top-down vs. bottom-up debate, and to that of abundance-diversity relationship in marine ecosystems.

  3. Evolving Relationship Structures in Multi-sourcing Arrangements: The Case of Mission Critical Outsourcing

    NASA Astrophysics Data System (ADS)

    Heitlager, Ilja; Helms, Remko; Brinkkemper, Sjaak

    Information Technology Outsourcing practice and research mainly consider the outsourcing phenomenon as a generic fulfilment of the IT function by external parties. Inspired by the logic of commoditization, core competencies, and economies of scale, assets, existing departments, and IT functions are transferred to external parties. Although the generic approach might work for desktop outsourcing, where standardisation is the dominant factor, it does not work for the management of mission critical applications. Managing mission critical applications requires a different approach in which building relationships is critical. These relationships involve inter- and intra-organisational parties in a multi-sourcing arrangement, called an IT service chain, consisting of multiple (specialist) parties that have to collaborate closely to deliver high quality services.

  4. Pudor, honor, and autoridad: the evolving patient-physician relationship in Spain.

    PubMed

    Epstein, R M; Borrell i Carrió, F

    2001-10-01

    The expression of emotion and the sharing of information are determined by cultural factors, consultation time, and the structure of the health care system. Two emblematic situations in Spain - the expression of aggression in the patient-physician encounter, and the withholding of diagnostic information from the patient - have not been well-described in their sociocultural context. To explore these, the authors observed and participated in clinical practice and teaching in several settings throughout Spain and analyzed field notes using qualitative methods. In this paper, we explore three central constructs - modesty (pudor), dignity (honor), and authority (autoridad) - and their expressions in patient-physician encounters. We define two types of emotions in clinical settings - public, extroverted expressions of anger and exuberance; and private, deeply held feelings of fear and grief that tend to be expressed through the arts and religion. Premature reassurance and withholding of information are interpreted as attempts to reconstruct the honor and pudor of the patient. Physician authority and perceived loyalty to the government-run health care system generate conflict and aggression in the patient-physician relationship. These clinical behaviors are contextualized within cultural definitions of effective communication, an ideal patient-physician relationship, the role of the family, and ethical behavior. Despite agreement on the goals of medicine, the behavioral manifestations of empathy and caring in Spain contrast substantially with northern European and North American cultures.

  5. Assessing the Complex and Evolving Relationship between Charges and Payments in US Hospitals: 1996 – 2012

    PubMed Central

    Bulchis, Anne G.; Lomsadze, Liya; Joseph, Jonathan; Baral, Ranju; Bui, Anthony L.; Horst, Cody; Johnson, Elizabeth; Dieleman, Joseph L.

    2016-01-01

    Background In 2013 the United States spent $2.9 trillion on health care, more than in any previous year. Much of the debate around slowing health care spending growth focuses on the complicated pricing system for services. Our investigation contributes to knowledge of health care spending by assessing the relationship between charges and payments in the inpatient hospital setting. In the US, charges and payments differ because of a complex set of incentives that connect health care providers and funders. Our methodology can also be applied to adjust charge data to reflect actual spending. Methods We extracted cause of health care encounter (cause), primary payer (payer), charge, and payment information for 50,172 inpatient hospital stays from 1996 through 2012. We used linear regression to assess the relationship between charges and payments, stratified by payer, year, and cause. We applied our estimates to a large, nationally representative hospital charge sample to estimate payments. Results The average amount paid per $1 charged varies significantly across three dimensions: payer, year, and cause. Among the 10 largest causes of health care spending, average payments range from 23 to 55 cents per dollar charged. Over time, the amount paid per dollar charged is decreasing for those with private or public insurance, signifying that inpatient charges are increasing faster than the amount insurers pay. Conversely, the amount paid by out-of-pocket payers per dollar charged is increasing over time for several causes. Applying our estimates to a nationally representative hospital charge sample generates payment estimates which align with the official US estimates of inpatient spending. Conclusions The amount paid per $1 charged fluctuates significantly depending on the cause of a health care encounter and the primary payer. In addition, the amount paid per charge is changing over time. Transparent accounting of hospital spending requires a detailed assessment of the
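The core quantity in this study, the average amount paid per $1 charged by payer stratum, can be sketched as a no-intercept least-squares slope of payments on charges. The records, field names, and payer labels below are invented for illustration, not the study's actual data.

```python
# Sketch: cents paid per dollar charged, estimated per payer stratum as a
# least-squares slope through the origin. All records here are hypothetical.

def slope_through_origin(charges, payments):
    """Least-squares slope of payment on charge with no intercept:
    the average amount paid per $1 charged."""
    num = sum(c * p for c, p in zip(charges, payments))
    den = sum(c * c for c in charges)
    return num / den

# Toy records: (payer, charge, payment)
stays = [
    ("private", 1000.0, 400.0),
    ("private", 2000.0, 800.0),
    ("out_of_pocket", 1000.0, 250.0),
    ("out_of_pocket", 3000.0, 750.0),
]

by_payer = {}
for payer, charge, payment in stays:
    by_payer.setdefault(payer, ([], []))
    by_payer[payer][0].append(charge)
    by_payer[payer][1].append(payment)

rates = {p: slope_through_origin(c, pay) for p, (c, pay) in by_payer.items()}
print(rates)  # {'private': 0.4, 'out_of_pocket': 0.25}
```

The same slope, estimated separately by payer, year, and cause, yields the kind of payment-per-charge table the abstract describes.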

  6. Review: The evolving placenta: different developmental paths to a hemochorial relationship.

    PubMed

    Enders, A C; Carter, A M

    2012-02-01

    The way in which maternal blood is associated with trophoblast prior to the formation of the different types of hemochorial placenta may be conveniently grouped into four main patterns: a transitory endotheliochorial condition; maternal blood released into a mass of trophoblast; maternal blood confined to lacunae; and fetal villi entering preexisting maternal blood sinuses. Although it might be considered logical that developing placentas would pass through an endotheliochorial stage to become hemochorial, this developmental pattern is seen only as a transient stage in several species of bats and sciuromorph rodents. More commonly a mass of trophoblast at the junction with the endometrium serves as a meshwork through which maternal blood passes, with subsequent organization of a labyrinth when the fetal vascular component is organized. The initial trophoblast meshwork may be cellular or syncytial, often leading to a similar relationship in the spongy zone and labyrinth. Old World monkeys, apes and humans have a lacunar stage prior to establishing a villous hemochorial condition. New World monkeys lack a true lacunar stage, retaining portions of maternal vessels for some time and initially forming a trabecular arrangement similar to though differently arrived at than that in the tarsier. In armadillos, preexisting maternal venous sinuses are converted into an intervillous blood space by intruding fetal villi. Variations from the major patterns of development also occur. The way in which the definitive placental form is achieved developmentally should be considered when using placental structure to extrapolate evolution of placentation.

  7. The Neighborhood Environment Walkability Scale for the Republic of Korea: Reliability and Relationship with Walking

    PubMed Central

    KIM, Hyunshik; CHOI, Younglae; MA, Jiameng; HYUNG, Kuam; MIYASHITA, Masashi; LEE, Sunkyoung

    2016-01-01

    Background: The aim of the study was to analyze the reliability of the Korean version of the NEWS and to investigate the relationship between walking and environmental factors by gender. Methods: A total of 1407 Korean adults, aged 20–59 yr, participated in the study. Data were collected between Sep 2013 and Oct 2013. To examine test-retest reliability, 281 of the 1407 participants were asked to answer the same questionnaire (Korean NEWS-A scale) after a 7-d interval. Results: The ICC range of the entire questionnaire was 0.71–0.88. The item on land use mix-diversity had the highest ICC, and that on physical barriers had the lowest. In addition, partial correlation coefficients for walking and the NEWS-A score, adjusted for sociodemographic variables, were computed. Overall, land use mix-diversity (P<0.034) and land use mix-access (P<0.014) showed a positive relationship with walking. Discussion: Examination of the reliability of the Korean NEWS-A scale among Korean adults residing in large cities showed that all items had statistically satisfactory reliability. The Korean NEWS-A scale may be a useful measure for assessing environmental correlates of walking in the Korean population. PMID:28032060
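The test-retest reliability reported here is an intraclass correlation. As a minimal sketch, a one-way random-effects ICC(1,1) can be computed from first- and second-administration scores per respondent; the data below are fabricated, not the NEWS-A responses.

```python
# Sketch: one-way random-effects intraclass correlation, ICC(1,1), for
# test-retest data. Scores are invented for illustration.

def icc_oneway(scores):
    """scores: one list per subject of k repeated measurements."""
    n = len(scores)
    k = len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    means = [sum(row) / k for row in scores]
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)    # between subjects
    msw = sum((x - means[i]) ** 2
              for i, row in enumerate(scores) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Perfectly reproduced retest scores -> ICC of 1.0
print(icc_oneway([[1, 1], [2, 2], [3, 3], [4, 4]]))  # 1.0

# Noisier retest -> lower ICC
print(round(icc_oneway([[1, 2], [2, 1], [3, 4], [4, 3]]), 2))  # 0.68
```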

  8. Visual perspective in autobiographical memories: reliability, consistency, and relationship to objective memory performance.

    PubMed

    Siedlecki, Karen L

    2015-01-01

    Visual perspective in autobiographical memories was examined in terms of reliability, consistency, and relationship to objective memory performance in a sample of 99 individuals. Autobiographical memories may be recalled from two visual perspectives--a field perspective in which individuals experience the memory through their own eyes, or an observer perspective in which individuals experience the memory from the viewpoint of an observer in which they can see themselves. Participants recalled nine word-cued memories that differed in emotional valence (positive, negative and neutral) and rated their memories on 18 scales. Results indicate that visual perspective was the most reliable memory characteristic overall and is consistently related to emotional intensity at the time of recall and amount of emotion experienced during the memory. Visual perspective is unrelated to memory for words, stories, abstract line drawings or faces.

  9. Structural and reliability analysis of quality of relationship index in cancer patients.

    PubMed

    Cousson-Gélie, Florence; de Chalvron, Stéphanie; Zozaya, Carole; Lafaye, Anaïs

    2013-01-01

    Among psychosocial factors affecting emotional adjustment and quality of life, social support is one of the most important and widely studied in cancer patients, but little is known about the perception of support in specific significant relationships in patients with cancer. This study examined the psychometric properties of the Quality of Relationship Inventory (QRI) by evaluating its factor structure and its convergent and discriminant validity in a sample of cancer patients. A total of 388 patients completed the QRI. Convergent validity was evaluated by testing the correlations between the QRI subscales and measures of general social support, anxiety and depression symptoms. Discriminant validity was examined by testing group comparisons. The QRI's longitudinal invariance across time was also tested. Principal axis factor analysis with promax rotation identified three factors accounting for 42.99% of variance: perceived social support, depth, and interpersonal conflict. Estimates of reliability with McDonald's ω coefficient were satisfactory for all the QRI subscales (ω ranging from 0.75 to 0.85). Satisfaction from general social support was negatively correlated with the interpersonal conflict subscale and positively with the depth subscale. The interpersonal conflict and social support scales were correlated with depression and anxiety scores. We also found a relative stability of QRI subscales (measured 3 months after the first evaluation) and differences between partner status and gender groups. The Quality of Relationship Inventory is a valid tool for assessing the quality of social support within a specific relationship in patients with cancer.
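The reliability coefficient used here, McDonald's ω, follows directly from a single-factor model: the squared sum of standardized loadings over that quantity plus the summed uniquenesses. A minimal sketch, with loadings invented for illustration rather than taken from the QRI:

```python
# Sketch: McDonald's omega from standardized single-factor loadings.
# The loadings below are hypothetical, not the QRI's estimates.

def mcdonalds_omega(loadings):
    """omega = (sum of loadings)^2 / ((sum of loadings)^2 + sum of uniquenesses),
    assuming standardized items with uniqueness 1 - loading**2."""
    s = sum(loadings)
    uniq = sum(1 - l ** 2 for l in loadings)
    return s ** 2 / (s ** 2 + uniq)

print(round(mcdonalds_omega([0.7, 0.7, 0.7, 0.7]), 2))  # 0.79
```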

  10. Impact of relationships between test and training animals and among training animals on reliability of genomic prediction.

    PubMed

    Wu, X; Lund, M S; Sun, D; Zhang, Q; Su, G

    2015-10-01

    One of the factors affecting the reliability of genomic prediction is the relationship among the animals of interest. This study investigated the reliability of genomic prediction in various scenarios with regard to the relationship between test and training animals, and among animals within the training data set. Different training data sets were generated from EuroGenomics data and a group of Nordic Holstein bulls (born in 2005 and afterwards) as a common test data set. Genomic breeding values were predicted using a genomic best linear unbiased prediction model and a Bayesian mixture model. The results showed that a closer relationship between test and training animals led to a higher reliability of genomic predictions for the test animals, while a closer relationship among training animals resulted in a lower reliability. In addition, the Bayesian mixture model in general led to a slightly higher reliability of genomic prediction, especially for the scenario of distant relationships between training and test animals. Therefore, to prevent a decrease in reliability, constant updates of the training population with animals from more recent generations are required. Moreover, a training population consisting of less-related animals is favourable for reliability of genomic prediction.
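The "relationship among the animals" that drives prediction reliability is typically quantified with a genomic relationship matrix. As a hedged sketch, a VanRaden-style GRM can be built from a 0/1/2 genotype matrix; the toy genotypes below are fabricated and much smaller than any real training set.

```python
# Sketch: VanRaden-style genomic relationship matrix (GRM) from a toy
# genotype matrix (individuals x markers, coded 0/1/2 reference alleles).
# Closer relatives share more alleles, giving larger off-diagonal entries.

def grm(M):
    n, m = len(M), len(M[0])
    p = [sum(row[j] for row in M) / (2 * n) for j in range(m)]  # allele freqs
    Z = [[M[i][j] - 2 * p[j] for j in range(m)] for i in range(n)]
    denom = 2 * sum(pj * (1 - pj) for pj in p)
    return [[sum(Z[i][k] * Z[j][k] for k in range(m)) / denom
             for j in range(n)] for i in range(n)]

M = [
    [0, 1, 2, 1, 0, 2],  # individual A
    [0, 1, 2, 1, 1, 2],  # close relative of A
    [2, 1, 0, 1, 2, 0],  # distantly related individual
]
G = grm(M)
assert abs(G[0][1] - G[1][0]) < 1e-12  # G is symmetric
assert G[0][1] > G[0][2]               # A is more related to B than to C
```

In a GBLUP analysis, G (or its blocks linking test and training animals) enters the mixed-model equations, which is why test animals closely related to the training set are predicted more reliably.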

  11. Establishing a Reliable Depth-Age Relationship for the Denali Ice Core

    NASA Astrophysics Data System (ADS)

    Wake, C. P.; Osterberg, E. C.; Winski, D.; Ferris, D.; Kreutz, K. J.; Introne, D.; Dalton, M.

    2015-12-01

    Reliable climate reconstruction from ice core records requires the development of a reliable depth-age relationship. We have established a sub-annual resolution depth-age relationship for the upper 198 meters of a 208 m ice core recovered in 2013 from Mt. Hunter (3,900 m asl), Denali National Park, central Alaska. The dating of the ice core was accomplished via annual layer counting of glaciochemical time-series combined with identification of reference horizons from volcanic eruptions and atmospheric nuclear weapons testing. Using the continuous ice core melter system at Dartmouth College, sub-seasonal samples have been collected and analyzed for major ions, liquid conductivity, particle size and concentration, and stable isotope ratios. Annual signals are apparent in several of the chemical species measured in the ice core samples. Calcium and magnesium peak in the spring, ammonium peaks in the summer, methanesulfonic acid (MSA) peaks in the autumn, and stable isotopes display a strong seasonal cycle with the most depleted values occurring during the winter. Thin ice layers representing infrequent summertime melt were also used to identify summer layers in the core. Analysis of approximately one meter sections of the core via nondestructive gamma spectrometry over depths from 84 to 124 m identified a strong radioactive cesium-137 peak at 89 m which corresponds to the 1963 layer deposited during extensive atmospheric nuclear weapons testing. Peaks in the sulfate and chloride record have been used for the preliminary identification of volcanic signals preserved in the ice core, including ten events since 1883. We are confident that the combination of robust annual layers combined with reference horizons provides a timescale for the 20th century that has an error of less than 0.5 years, making calibrations between ice core records and the instrumental climate data particularly robust. Initial annual layer counting through the entire 198 m suggests the Denali Ice
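The annual-layer counting step can be illustrated with a toy version of the problem: locate one seasonal peak per year in a chemistry series. The synthetic "calcium" record below is a plain sinusoid standing in for real melter data, and the peak finder is a deliberately minimal local-maximum scan.

```python
# Sketch: annual-layer counting by finding seasonal peaks in a chemistry
# series. The record is synthetic: one cycle per year, 12 samples per year.

import math

def count_annual_layers(series):
    """Count interior local maxima, one per annual cycle."""
    return sum(1 for i in range(1, len(series) - 1)
               if series[i - 1] < series[i] > series[i + 1])

samples_per_year, years = 12, 10
record = [math.sin(2 * math.pi * i / samples_per_year)
          for i in range(samples_per_year * years)]
print(count_annual_layers(record))  # 10
```

Real records are noisy, so multi-species counting is cross-checked against reference horizons (e.g. the 1963 cesium-137 peak), exactly as the abstract describes.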

  12. Study on Precipitation Anomalies of North of China in April and Its relationship to Sea Surface Temperature Evolvement

    NASA Astrophysics Data System (ADS)

    Song, Y.; Li, Z.; Guan, Y.

    2012-04-01

    Using monthly precipitation data for northern China for 1960-2007, NCEP/NCAR monthly reanalysis data, NOAA SST (sea surface temperature) data, and the SST indices from the Climate System Monitoring Bulletin compiled by the National Climate Center, this paper studied the general circulation and large-scale weather-system anomalies and the SSTA evolution associated with above-normal April rainfall in northern China. The results showed that precipitation differences between the spring months were considerable: April precipitation in northern China was not significantly correlated with that of March or May, and its linear trend was out of phase with that of spring as a whole, so it is meaningful to study April precipitation separately. The spatial pattern of the first leading EOF mode of April precipitation indicated that rainfall varies synchronously across the region. Years with more April rainfall showed a negative phase of the EU pattern in the 500 hPa geopotential height field at high latitudes of the Northern Hemisphere; northern China lay where cold and warm air masses met, which favoured stronger southerly winds and ascending motion. Zonal circulation prevailed at middle and high latitudes, and northern China was controlled by a warm ridge and a zonal large-scale frontal zone. In years with less rainfall, meridional circulation prevailed, the large-scale frontal zone lay further north in a meridional pattern, and northern China was affected by cold air masses. In wet years, water vapour was transported strongly from the Pacific, the South China Sea, and southwest China, reaching northeast China; in years with less rainfall this transport was quite weak. The rainfall was closely related to sea surface temperature anomalies, especially in the Indian Ocean, the central and eastern Pacific, the central and southern Pacific, and the northwestern Pacific, where there were

  13. An Examination of Coach and Player Relationships According to the Adapted LMX 7 Scale: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Caliskan, Gokhan

    2015-01-01

    The current study aims to test the reliability and validity of the Leader-Member Exchange (LMX 7) scale with regard to coach--player relationships in sports settings. A total of 330 professional soccer players from the Turkish Super League as well as from the First and Second Leagues participated in this study. Factor analyses were performed to…

  14. Merlino-Perkins Father-Daughter Relationship Inventory (MP-FDI): Construction, Reliability, Validity, and Implications for Counseling and Research

    ERIC Educational Resources Information Center

    Merlino Perkins, Rose J.

    2008-01-01

    The Merlino-Perkins Father-Daughter Relationship Inventory, a self-report instrument, assesses women's childhood interactions with supportive, doting, distant, controlling, tyrannical, physically abusive, absent, and seductive fathers. Item and scale development, psychometric findings drawn from factor analyses, reliability assessments, and…

  15. Reliability and Validity of a Self-Concept Scale for Researchers in Family Relationships

    ERIC Educational Resources Information Center

    Rathus, Spencer A.; Siegel, Larry J.

    1976-01-01

    The self-concept questionnaire was shown to have high test-retest reliability, but only fair to moderate split-half (odd-even) reliability. Validity was adequate. The scale will serve as a heuristic device for family counselors who require a rapid assessment of a child's self-esteem. (Author)
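The split-half (odd-even) reliability mentioned above correlates scores on the odd-numbered items with scores on the even-numbered items, then applies the Spearman-Brown step-up to estimate full-test reliability. A minimal sketch on a fabricated item-response matrix:

```python
# Sketch: odd-even split-half reliability with the Spearman-Brown
# correction. The response matrix (rows = respondents) is invented.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def split_half_reliability(items):
    odd = [sum(row[0::2]) for row in items]    # items 1, 3, ...
    even = [sum(row[1::2]) for row in items]   # items 2, 4, ...
    r = pearson(odd, even)
    return 2 * r / (1 + r)  # Spearman-Brown step-up

responses = [
    [4, 4, 3, 4],
    [2, 2, 2, 2],
    [5, 4, 5, 5],
    [1, 2, 1, 1],
]
print(round(split_half_reliability(responses), 3))
```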

  16. Reliability of a Field Test of Defending and Attacking Agility in Australian Football and Relationships to Reactive Strength.

    PubMed

    Young, Warren B; Murray, Mitch P

    2017-02-01

    Young, WB and Murray, MP. Reliability of a field test of defending and attacking agility in Australian football and relationships to reactive strength. J Strength Cond Res 31(2): 509-516, 2017-Defending and attacking agility tests for Australian football do not exist, and it is unknown whether any physical qualities correlate with these types of agility. The purposes of this study were to develop new field tests of defending and attacking agility for Australian Rules football, to determine whether they were reliable, and to describe the relationship between the agility tests to determine their specificity. Because the reactive strength (RS) of the lower limb muscles has been previously correlated with change-of-direction speed, we also investigated the relationship between this quality and the agility tests. Nineteen male competitive recreational-level Australian Rules football players were assessed on the agility tests and a drop jump test to assess RS. Interday and interrater reliability was also assessed. The agility tests involved performing 10 trials of one-on-one agility tasks against 2 testers (opponents), in which the objective was to be in a position to tackle (defending) or to evade (attacking) the opponent. Both agility tests had good reliability (intraclass correlation > 0.8, %CV < 3, and no significant differences between test occasions [p > 0.05]), and interrater reliability was very high (r = 0.997, p < 0.001). The common variance between the agility tests was 45%, indicating that they represented relatively independent skills. There was a large correlation between RS and defending agility (r = 0.625, p = 0.004), and a very large correlation with attacking agility (r = 0.731, p < 0.001). Defending and attacking agility have different characteristics, possibly related to the footwork, physical, and cognitive demands of each. Nonetheless, RS seems to be important for agility, especially for attacking agility.

  17. Reliable Attention Network Scores and Mutually Inhibited Inter-network Relationships Revealed by Mixed Design and Non-orthogonal Method.

    PubMed

    Wang, Yi-Feng; Jing, Xiu-Juan; Liu, Feng; Li, Mei-Ling; Long, Zhi-Liang; Yan, Jin H; Chen, Hua-Fu

    2015-05-21

    The attention system can be divided into alerting, orienting, and executive control networks. The efficiency and independence of attention networks have been widely tested with the attention network test (ANT) and its revised versions. However, many studies have failed to find effects of attention network scores (ANSs) and inter-network relationships (INRs). Moreover, the low reliability of ANSs cannot meet the demands of theoretical and empirical investigations. Two methodological factors (the inter-trial influence in the event-related design and the inter-network interference in orthogonal contrast) may be responsible for the unreliability of the ANT. In this study, we combined the mixed design and non-orthogonal method to explore ANSs and directional INRs. With a small number of trials, we obtained reliable and independent ANSs (split-half reliability of alerting: 0.684; orienting: 0.588; and executive control: 0.616), suggesting an individual and specific attention system. Furthermore, mutual inhibition was observed when two networks were operated simultaneously, indicating a differentiated but integrated attention system. Overall, the reliable and individually specific ANSs and mutually inhibited INRs provide novel insight into the understanding of the developmental, physiological and pathological mechanisms of attention networks, and can benefit future experimental and clinical investigations of attention using the ANT.

  18. Resolution of phylogenetic relationships among recently evolved species as a function of amount of DNA sequence: an empirical study based on woodpeckers (Aves: Picidae).

    PubMed

    DeFilippis, V R; Moore, W S

    2000-07-01

    Synonymous substitutions in the 13 mitochondrial encoded protein genes form a large pool of characters that should approach the ideal for phylogenetic analysis of being independently and identically distributed. Pooling sequences from multiple mitochondrial protein-coding genes should result in statistically more powerful estimates of relationships among species that diverged sufficiently recently that most nucleotide substitutions are synonymous. Cytochrome oxidase I (COI) was sequenced for woodpecker species for which cytochrome b (cyt b) sequences were available. A pairing-design test based on the normal distribution indicated that cyt b evolves more rapidly than COI when all nucleotides are compared but their rates are equal for synonymous substitutions. Nearly all of the phylogenetically informative substitutions among woodpeckers are synonymous. Statistical support for relationships, as measured by bootstrap proportions, increased as the number of nucleotides increased from 1047 (cyt b) to 1512 (COI) to 2559 nucleotides (aggregate data set). Pseudo-bootstrap replicates showed the same trend and increasing the amount of sequence beyond the actual length of 2559 nucleotides to 5120 (2x) resulted in stronger bootstrap support, even though the amount of phylogenetic information was the same. However, the amount of sequence required to resolve an internode depends on the length of the internode and its depth in the phylogeny.
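The bootstrap proportions discussed here come from resampling alignment columns with replacement and re-evaluating the grouping of interest in each replicate. A deliberately simplified sketch: the sequences are invented, and the "grouping" tested is just whether taxon A is closer to B than to C by Hamming distance, standing in for a full tree search.

```python
# Sketch: nonparametric bootstrap over alignment columns, the resampling
# behind phylogenetic bootstrap proportions. Sequences are hypothetical.

import random

def hamming(s, t, cols):
    return sum(s[c] != t[c] for c in cols)

A = "ACGTACGTACGTACGTACGT"
B = "ACGTACGTACGTACGTACGA"   # 1 difference from A
C = "TGCAACGTACGTTGCAACGT"   # 8 differences from A

random.seed(1)
n_cols, reps = len(A), 1000
hits = 0
for _ in range(reps):
    cols = [random.randrange(n_cols) for _ in range(n_cols)]  # resample columns
    if hamming(A, B, cols) < hamming(A, C, cols):
        hits += 1
print(hits / reps)  # bootstrap support for (A,B), close to 1.0
```

Longer alignments make each replicate's distances less variable, which is why support tends to rise as sequence length grows, as the abstract reports.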

  19. The Relationship between Scoring Procedures and Focus and the Reliability of Direct Writing Assessment Scores.

    ERIC Educational Resources Information Center

    Wolfe, Edward W.; Kao, Chi-Wen

    This paper reports the results of an analysis of the relationship between scorer behaviors and score variability. Thirty-six essay scorers were interviewed and asked to perform a think-aloud task as they scored 24 essays. Each comment made by a scorer was coded according to its content focus (i.e. appearance, assignment, mechanics, communication,…

  20. Relationship between evolving epileptiform activity and delayed loss of mitochondrial activity after asphyxia measured by near-infrared spectroscopy in preterm fetal sheep

    PubMed Central

    Bennet, L; Roelfsema, V; Pathipati, P; Quaedackers, J S; Gunn, A J

    2006-01-01

    Early onset cerebral hypoperfusion after birth is highly correlated with neurological injury in premature infants, but the relationship with the evolution of injury remains unclear. We studied changes in cerebral oxygenation, and cytochrome oxidase (CytOx) using near-infrared spectroscopy in preterm fetal sheep (103–104 days of gestation, term is 147 days) during recovery from a profound asphyxial insult (n = 7) that we have shown produces severe subcortical injury, or sham asphyxia (n = 7). From 1 h after asphyxia there was a significant secondary fall in carotid blood flow (P < 0.001), and total cerebral blood volume, as reflected by total haemoglobin (P < 0.005), which only partially recovered after 72 h. Intracerebral oxygenation (difference between oxygenated and deoxygenated haemoglobin concentrations) fell transiently at 3 and 4 h after asphyxia (P < 0.01), followed by a substantial increase to well over sham control levels (P < 0.001). CytOx levels were normal in the first hour after occlusion, were greater than sham control values at 2–3 h (P < 0.05), but then progressively fell, and became significantly suppressed from 10 h onward (P < 0.01). In the early hours after reperfusion the fetal EEG was highly suppressed, with a superimposed mixture of fast and slow epileptiform transients; overt seizures developed from 8 ± 0.5 h. These data strongly indicate that severe asphyxia leads to delayed, evolving loss of mitochondrial oxidative metabolism, accompanied by late seizures and relative luxury perfusion. In contrast, the combination of relative cerebral deoxygenation with evolving epileptiform transients in the early recovery phase raises the possibility that these early events accelerate or worsen the subsequent mitochondrial failure. PMID:16484298

  2. The Inventory of Teacher-Student Relationships: Factor Structure, Reliability, and Validity among African American Youth in Low-Income Urban Schools

    ERIC Educational Resources Information Center

    Murray, Christopher; Zvoch, Keith

    2011-01-01

    This study investigates the factor structure, reliability, and validity of the Inventory of Teacher-Student Relationships (IT-SR), a measure that was developed by adapting the widely used Inventory of Parent and Peer Attachments (Armsden & Greenberg, 1987) for use in the context of teacher-student relationships. The instrument was field tested…

  3. Self Evolving Modular Network

    NASA Astrophysics Data System (ADS)

    Tokunaga, Kazuhiro; Kawabata, Nobuyuki; Furukawa, Tetsuo

    We propose a novel modular network called the Self-Evolving Modular Network (SEEM). The SEEM has a modular network architecture with a graph structure and the following advantages: (1) new modules are added incrementally, allowing the network to adapt in a self-organizing manner, and (2) the graph's paths are formed based on the relationships between the models represented by the modules. The SEEM is expected to be applicable to evolving the functions of an autonomous robot in a self-organizing manner through interaction with the robot's environment, and to categorizing large-scale information. This paper presents the architecture and an algorithm for the SEEM. Moreover, the performance characteristics and effectiveness of the network are shown by simulations using cubic functions and a set of 3D objects.

  4. Relationship of lung function loss to level of initial function: correcting for measurement error using the reliability coefficient.

    PubMed Central

    Irwig, L; Groeneveld, H; Becklake, M

    1988-01-01

    The regression of lung function change on the initial lung function level is biased when the initial level is measured with random error. Several methods have been proposed to obtain unbiased estimates of regression coefficients in such circumstances. We apply these methods to examine the relationship between lung function loss over 11 years and its initial level in 433 men aged about 20 when first seen. On theoretical and practical grounds the best method is the correction of the regression coefficient using the reliability coefficient. This is defined as the ratio of the error-free variance to the variance of the variable measured with error, and is easily estimated as the correlation between repeat measurements of the underlying level. In young men the loss of some lung functions (forced vital capacity [FVC], forced expiratory volume in one second [FEV1], forced expiratory flow in the middle half of expiration, and the ratio FEV1/FVC) does not appear to be related to initial level. PMID:3256581
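The correction described above divides the observed slope by the reliability coefficient, with reliability estimated as the correlation between repeat measurements. A simulated sketch (the data are synthetic, not the study's cohort): the predictor carries measurement error, so the naive slope is attenuated toward zero, and dividing by the test-retest correlation approximately recovers the true coefficient.

```python
# Sketch: disattenuating a regression slope with the reliability
# coefficient. Simulated data; true slope is 2 and reliability is ~0.5.

import random

random.seed(42)
n, beta = 2000, 2.0
true_level = [random.gauss(0, 1) for _ in range(n)]
measure_1 = [t + random.gauss(0, 1) for t in true_level]   # repeat measurement 1
measure_2 = [t + random.gauss(0, 1) for t in true_level]   # repeat measurement 2
outcome = [beta * t for t in true_level]                   # error-free outcome

def slope(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

def corr(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

naive = slope(measure_1, outcome)            # attenuated, roughly beta/2
reliability = corr(measure_1, measure_2)     # roughly 0.5 by construction
corrected = naive / reliability              # roughly beta
print(round(naive, 2), round(reliability, 2), round(corrected, 2))
```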

  5. How Does the Strength of the Relationships between Cognitive Abilities Evolve over the Life Span for Low-IQ vs High-IQ Adults?

    ERIC Educational Resources Information Center

    Facon, Bruno

    2008-01-01

    The present study was designed to examine how the correlations between cognitive abilities evolve during adulthood. Data from 1104 participants on the French version of the Wechsler Adult Intelligence Scale-Third Edition were analyzed. The entire sample was divided into four age groups (16-24 years; 25-44 years; 45-69 years and 70-89 years), which…

  6. Changing and Evolving Relationships between Two- and Four-Year Colleges and Universities: They're Not Your Parents' Community Colleges Anymore

    ERIC Educational Resources Information Center

    Labov, Jay B.

    2012-01-01

    This paper describes a summit on Community Colleges in the Evolving STEM Education Landscape organized by a committee of the National Research Council (NRC) and the National Academy of Engineering (NAE) and held at the Carnegie Institution for Science on December 15, 2011. This summit followed a similar event organized by Dr. Jill Biden, spouse of…

  7. Evolvable synthetic neural system

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A. (Inventor)

    2009-01-01

    An evolvable synthetic neural system includes an evolvable neural interface operably coupled to at least one neural basis function. Each neural basis function includes an evolvable neural interface operably coupled to a heuristic neural system to perform high-level functions and an autonomic neural system to perform low-level functions. In some embodiments, the evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy.

  8. Making Reliability Arguments in Classrooms

    ERIC Educational Resources Information Center

    Parkes, Jay; Giron, Tilia

    2006-01-01

    Reliability methodology needs to evolve as validity has done into an argument supported by theory and empirical evidence. Nowhere is the inadequacy of current methods more visible than in classroom assessment. Reliability arguments would also permit additional methodologies for evidencing reliability in classrooms. It would liberalize methodology…

  9. Reliability, Validity, and Associations with Sexual Behavior among Ghanaian Teenagers of Scales Measuring Four Dimensions Relationships with Parents and Other Adults

    PubMed Central

    Bingenheimer, Jeffrey B.; Asante, Elizabeth; Ahiadeke, Clement

    2013-01-01

    Little research has been done on the social contexts of adolescent sexual behaviors in sub-Saharan Africa. As part of a longitudinal cohort study (N=1275) of teenage girls and boys in two Ghanaian towns, interviewers administered a 26-item questionnaire module intended to assess four dimensions of youth-adult relationships: monitoring, conflict, emotional support, and financial support. Confirmatory factor and traditional psychometric analyses showed the four scales to be reliable. Known-groups comparisons provided evidence of their validity. All four scales had strong bivariate associations with self-reported sexual behavior (odds ratios = 1.66, 0.74, 0.47, and 0.60 for conflict, emotional support, monitoring, and financial support). The instrument is practical for use in sub-Saharan African settings and produces measures that are reliable, valid, and predictive of sexual behavior in youth. PMID:25821286

  10. Reprocessing the Hipparcos data of evolved stars. III. Revised Hipparcos period-luminosity relationship for galactic long-period variable stars

    NASA Astrophysics Data System (ADS)

    Knapp, G. R.; Pourbaix, D.; Platais, I.; Jorissen, A.

    2003-06-01

    We analyze the K band luminosities of a sample of galactic long-period variables using parallaxes measured by the Hipparcos mission. The parallaxes are in most cases re-computed from the Hipparcos Intermediate Astrometric Data using improved astrometric fits and chromaticity corrections. The K band magnitudes are taken from the literature and from measurements by COBE, and are corrected for interstellar and circumstellar extinction. The sample contains stars of several spectral types: M, S and C, and of several variability classes: Mira, semiregular SRa, and SRb. We find that the distribution of stars in the period-luminosity plane is independent of circumstellar chemistry, but that the different variability types have different P-L distributions. Both the Mira variables and the SRb variables have reasonably well-defined period-luminosity relationships, but with very different slopes. The SRa variables are distributed between the two classes, suggesting that they are a mixture of Miras and SRb, rather than a separate class of stars. New period-luminosity relationships are derived based on our revised Hipparcos parallaxes. The Miras show a similar period-luminosity relationship to that found for Large Magellanic Cloud Miras by Feast et al. (1989). The maximum absolute K magnitude of the sample is about -8.2 for both Miras and semi-regular stars, only slightly fainter than the expected AGB limit. We show that the stars with the longest periods (P>400 d) have high mass loss rates and are almost all Mira variables. Based on observations from the Hipparcos astrometric satellite operated by the European Space Agency (ESA). The data table is only available in electronic form at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsweb.u-strasbg.fr/cgi-bin/qcat?J/A+A/403/993

  11. Traditional vs. Sport-Specific Vertical Jump Tests: Reliability, Validity, and Relationship With the Legs Strength and Sprint Performance in Adult and Teen Soccer and Basketball Players.

    PubMed

    Rodríguez-Rosell, David; Mora-Custodio, Ricardo; Franco-Márquez, Felipe; Yáñez-García, Juan M; González-Badillo, Juan J

    2017-01-01

    Rodríguez-Rosell, D, Mora-Custodio, R, Franco-Márquez, F, Yáñez-García, JM, González-Badillo, JJ. Traditional vs. sport-specific vertical jump tests: reliability, validity, and relationship with the legs strength and sprint performance in adult and teen soccer and basketball players. J Strength Cond Res 31(1): 196-206, 2017-The vertical jump is considered an essential motor skill in many team sports. Many protocols have been used to assess vertical jump ability. However, controversy regarding test selection still exists based on the reliability and specificity of the tests. The main aim of this study was to analyze the reliability and validity of 2 standardized (countermovement jump [CMJ] and Abalakov jump [AJ]) and 2 sport-specific (run-up with 2 [2-LEGS] or 1 leg [1-LEG] take-off jump) vertical jump tests, and their usefulness as predictors of sprint and strength performance for soccer (n = 127) and basketball (n = 59) players in 3 different categories (Under-15, Under-18, and Adults). Three attempts for each of the 4 jump tests were recorded. Twenty-meter sprint time and estimated 1 repetition maximum in full squat were also evaluated. All jump tests showed high intraclass correlation coefficients (0.969-0.995) and low coefficients of variation (1.54-4.82%), although 1-LEG was the jump test with the lowest absolute and relative reliability. All selected jump tests were significantly correlated (r = 0.580-0.983). Factor analysis resulted in the extraction of one principal component, which explained 82.90-95.79% of the variance of all jump tests. The 1-LEG test showed the lowest associations with sprint and strength performance. The results of this study suggest that CMJ and AJ are the most reliable tests for the estimation of explosive force in soccer and basketball players in different age categories.

  12. Reliability, Factor Structure, and Associations With Measures of Problem Relationship and Behavior of the Personality Inventory for DSM-5 in a Sample of Italian Community-Dwelling Adolescents.

    PubMed

    Somma, Antonella; Borroni, Serena; Maffei, Cesare; Giarolli, Laura E; Markon, Kristian E; Krueger, Robert F; Fossati, Andrea

    2017-01-10

    In order to assess the reliability, factorial validity, and criterion validity of the Personality Inventory for DSM-5 (PID-5) among adolescents, 1,264 Italian high school students were administered the PID-5. Participants were also administered the Questionnaire on Relationships and Substance Use as a criterion measure. In the full sample, McDonald's ω values were adequate for the PID-5 scales (median ω = .85, SD = .06), except for Suspiciousness. However, all PID-5 scales showed average inter-item correlation values in the .20-.55 range. Exploratory structural equation modeling analyses provided moderate support for the a priori model of PID-5 trait scales. Ordinal logistic regression analyses showed that selected PID-5 trait scales predicted a significant, albeit moderate (Cox & Snell R² values ranged from .08 to .15, all ps < .001) amount of variance in Questionnaire on Relationships and Substance Use variables.

  13. Evolving Sensitivity Balances Boolean Networks

    PubMed Central

    Luo, Jamie X.; Turner, Matthew S.

    2012-01-01

    We investigate the sensitivity of Boolean Networks (BNs) to mutations. We are interested in Boolean Networks as a model of Gene Regulatory Networks (GRNs). We adopt Ribeiro and Kauffman’s Ergodic Set and use it to study the long term dynamics of a BN. We define the sensitivity of a BN to be the mean change in its Ergodic Set structure under all possible loss of interaction mutations. In silico experiments were used to selectively evolve BNs for sensitivity to losing interactions. We find that maximum sensitivity was often achievable and resulted in the BNs becoming topologically balanced, i.e. they evolve towards network structures in which they have a similar number of inhibitory and excitatory interactions. In terms of the dynamics, the dominant sensitivity strategy that evolved was to build BNs with Ergodic Sets dominated by a single long limit cycle which is easily destabilised by mutations. We discuss the relevance of our findings in the context of Stem Cell Differentiation and propose a relationship between pluripotent stem cells and our evolved sensitive networks. PMID:22586459
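    The loss-of-interaction sensitivity idea can be sketched in a few lines. The toy below is an illustrative assumption, not the paper's code: it uses the full attractor set of a small synchronous Boolean network as a crude stand-in for the Ergodic Set, and flags a mutation as "sensed" when deleting an interaction changes that set.

```python
from itertools import product

def attractors(update_fns, n):
    """All attractors of a synchronous Boolean network with n nodes,
    found by exhaustive simulation from every initial state."""
    found = set()
    for start in product((0, 1), repeat=n):
        seen, state = {}, start
        while state not in seen:
            seen[state] = len(seen)
            state = tuple(f(state) for f in update_fns)
        first = seen[state]                      # first state on the cycle
        found.add(frozenset(s for s, i in seen.items() if i >= first))
    return found

# Toy 3-node loop: node 0 inhibited by 2, node 1 excited by 0, node 2 excited by 1.
fns = [lambda s: 1 - s[2], lambda s: s[0], lambda s: s[1]]
base = attractors(fns, 3)

# "Loss of interaction" mutation: node 1 no longer reads node 0.
mutant = [fns[0], lambda s: 0, fns[2]]
changed = attractors(mutant, 3) != base          # crude sensitivity flag
```

    Averaging such change indicators over every possible interaction deletion gives a number in the spirit of the paper's sensitivity measure; the exhaustive state enumeration limits this sketch to small networks.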

  14. Analyzing Evolving Social Network 2 (EVOLVE2)

    DTIC Science & Technology

    2015-04-01

    The research had two major threads: (1) understand how networks evolve over time, and how changes in topology affect the evolution of influence and groups; and (2) understand the impact of dynamics and network flows on the network.

  15. Genomic medicine: evolving science, evolving ethics

    PubMed Central

    Soden, Sarah E; Farrow, Emily G; Saunders, Carol J; Lantos, John D

    2012-01-01

    Genomic medicine is rapidly evolving. Next-generation sequencing is changing the diagnostic paradigm by allowing genetic testing to be carried out more quickly, less expensively and with much higher resolution, pushing the envelope on existing moral norms and legal regulations. Early experience with implementation of next-generation sequencing to diagnose rare genetic conditions in symptomatic children suggests ways that genomic medicine might come to be used and some of the ethical issues that arise, impacting test design, patient selection, consent, sequencing analysis and communication of results. The ethical issues that arise from use of new technologies cannot be satisfactorily analyzed until they are understood and they cannot be understood until the technologies are deployed in the real world. PMID:23173007

  16. The Skin Cancer and Sun Knowledge (SCSK) Scale: Validity, Reliability, and Relationship to Sun-Related Behaviors Among Young Western Adults.

    PubMed

    Day, Ashley K; Wilson, Carlene; Roberts, Rachel M; Hutchinson, Amanda D

    2014-08-01

    Increasing public knowledge remains one of the key aims of skin cancer awareness campaigns, yet diagnosis rates continue to rise. It is essential we measure skin cancer knowledge adequately so as to determine the nature of its relationship to sun-related behaviors. This study investigated the psychometric properties of a new measure of skin cancer knowledge, the Skin Cancer and Sun Knowledge (SCSK) scale. A total of 514 Western young adults (females n = 320, males n = 194) aged 18 to 26 years completed measures of skin type, skin cancer knowledge, tanning behavior, sun exposure, and sun protection. Two-week test-retest of the SCSK was conducted with 52 participants. Internal reliability of the SCSK scale was acceptable (KR-20 = .69), test-retest reliability was high (r = .83, n = 52), and acceptable levels of face, content, and incremental validity were demonstrated. Skin cancer knowledge (as measured by SCSK) correlated with sun protection, sun exposure, and tanning behaviors in the female sample, but not in the males. Skin cancer knowledge appears to be more relevant to the behavior of young women than to that of young men. We recommend that future research establish the validity of the SCSK across a range of participant groups.
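    The KR-20 internal-reliability statistic quoted above is straightforward to compute from a respondents-by-items matrix of 0/1 answers. A minimal sketch (the data and function name are illustrative, not from the study):

```python
import numpy as np

def kr20(items):
    """Kuder-Richardson 20 for a respondents-by-items matrix of 0/1 scores.
    Uses the population variance of total scores; some texts use the
    sample variance instead."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    p = items.mean(axis=0)               # proportion answering 1 per item
    total_var = items.sum(axis=1).var()  # variance of respondents' totals
    return (k / (k - 1)) * (1 - (p * (1 - p)).sum() / total_var)
```

    For dichotomous items KR-20 coincides with Cronbach's alpha, so the same function doubles as an alpha check for true/false knowledge scales like the SCSK.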

  17. Relationship Between Agility Tests and Short Sprints: Reliability and Smallest Worthwhile Difference in National Collegiate Athletic Association Division-I Football Players.

    PubMed

    Mann, J Bryan; Ivey, Pat A; Mayhew, Jerry L; Schumacher, Richard M; Brechue, William F

    2016-04-01

    The Pro-Agility test (I-Test) and 3-cone drill (3-CD) are widely used in football to assess quickness in change of direction. Likewise, the 10-yard (yd) sprint, a test of sprint acceleration, is gaining popularity for testing physical competency in football players. Despite their frequent use, little information exists on the relationship between agility and sprint tests, as well as the reliability and degree of change necessary to indicate meaningful improvement resulting from training. The purpose of this study was to determine the reliability and smallest worthwhile difference (SWD) of the I-Test and 3-CD and the relationship of sprint acceleration to their performance. Division-I football players (n = 64, age = 20.5 ± 1.2 years, height = 185.2 ± 6.1 cm, body mass = 107.8 ± 20.7 kg) performed duplicate trials in each test during 2 separate weeks at the conclusion of a winter conditioning period. The better time of the 2 trials for each week was used for comparison. The 10-yd sprint was timed electronically, whereas the I-Test and 3-CD were hand timed by experienced testers. Each trial was performed on an indoor synthetic turf, with players wearing multicleated turf shoes. There was no significant difference (p > 0.06) between test weeks for the I-Test (4.53 ± 0.35 vs. 4.54 ± 0.31 seconds), 3-CD (7.45 ± 0.06 vs. 7.49 ± 0.06 seconds), or 10-yd sprint (1.85 ± 0.12 vs. 1.84 ± 0.12 seconds). The intraclass correlation coefficients (ICC) for 3-CD (ICC = 0.962) and 10-yd sprint (ICC = 0.974) were slightly higher than for the I-Test (ICC = 0.914). These values lead to acceptable levels of the coefficient of variation for each test (1.2, 1.2, and 1.9%, respectively). The SWD% indicated that a meaningful improvement due to training would require players to decrease their times by 6.6% for I-Test, 3.7% for 3-CD, and 3.8% for 10-yd sprint. Performances in agility and short sprint tests are highly related and reliable in college football players, providing quantifiable

  18. Methods Evolved by Observation

    ERIC Educational Resources Information Center

    Montessori, Maria

    2016-01-01

    Montessori's idea of the child's nature and the teacher's perceptiveness begins with amazing simplicity, and when she speaks of "methods evolved," she is unveiling a methodological system for observation. She begins with the early childhood explosion into writing, which is a familiar child phenomenon that Montessori has written about…

  19. Evolvable Neural Software System

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.

  20. Evolving Robust Gene Regulatory Networks

    PubMed Central

    Noman, Nasimul; Monjo, Taku; Moscato, Pablo; Iba, Hitoshi

    2015-01-01

    Design and implementation of robust network modules is essential for construction of complex biological systems through hierarchical assembly of ‘parts’ and ‘devices’. The robustness of gene regulatory networks (GRNs) is ascribed chiefly to the underlying topology. The automatic designing capability of GRN topology that can exhibit robust behavior can dramatically change the current practice in synthetic biology. A recent study shows that Darwinian evolution can gradually develop higher topological robustness. Subsequently, this work presents an evolutionary algorithm that simulates natural evolution in silico, for identifying network topologies that are robust to perturbations. We present a Monte Carlo based method for quantifying topological robustness, and we designed a fitness approximation approach for efficient calculation of topological robustness, which is otherwise computationally very intensive. The proposed framework was verified using two classic GRN behaviors: oscillation and bistability, although the framework is generalized for evolving other types of responses. The algorithm identified robust GRN architectures, which were verified using different analyses and comparisons. Analysis of the results also shed light on the relationship among robustness, cooperativity and complexity. This study also shows that nature has already evolved very robust architectures for its crucial systems; hence simulation of this natural process can be very valuable for designing robust biological systems. PMID:25616055
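    The Monte Carlo robustness measure described above can be sketched generically: sample perturbed parameter sets around a nominal design and count how often the target behaviour survives. Everything in this snippet (the predicate, the multiplicative noise model, the function name) is an illustrative assumption rather than the paper's implementation.

```python
import random

def robustness(behaves, nominal, n_samples=1000, noise=0.2, seed=1):
    """Monte Carlo robustness estimate: the fraction of randomly perturbed
    parameter sets for which the target behaviour (the user-supplied
    predicate `behaves`) still holds."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        # Perturb each parameter multiplicatively by up to +/- noise.
        perturbed = [p * (1 + rng.uniform(-noise, noise)) for p in nominal]
        hits += behaves(perturbed)
    return hits / n_samples
```

    In practice `behaves` would run an ODE or Boolean simulation of the GRN and test for oscillation or bistability; because each fitness evaluation repeats this sampling, one can see why the paper needed a fitness approximation to keep the evolutionary search tractable.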

  1. Highly-evolved stars

    NASA Technical Reports Server (NTRS)

    Heap, S. R.

    1981-01-01

    The ways in which the IUE has proved useful in studying highly evolved stars are reviewed. The importance of high dispersion spectra for abundance analyses of the sdO stars and for studies of the wind from the central star of NGC 6543 and the wind from the O-type component of Vela X-1 is shown. Low dispersion spectra are used for absolute spectrophotometry of the dwarf nova EX Hya. Angular resolution is important for detecting and locating UV sources in globular clusters.

  2. Regolith Evolved Gas Analyzer

    NASA Technical Reports Server (NTRS)

    Hoffman, John H.; Hedgecock, Jud; Nienaber, Terry; Cooper, Bonnie; Allen, Carlton; Ming, Doug

    2000-01-01

    The Regolith Evolved Gas Analyzer (REGA) is a high-temperature furnace and mass spectrometer instrument for determining the mineralogical composition and reactivity of soil samples. REGA provides key mineralogical and reactivity data that is needed to understand the soil chemistry of an asteroid, which then aids in determining in-situ which materials should be selected for return to earth. REGA is capable of conducting a number of direct soil measurements that are unique to this instrument. These experimental measurements include: (1) Mass spectrum analysis of evolved gases from soil samples as they are heated from ambient temperature to 900 C; and (2) Identification of liberated chemicals, e.g., water, oxygen, sulfur, chlorine, and fluorine. REGA would be placed on the surface of a near earth asteroid. It is an autonomous instrument that is controlled from earth but does the analysis of regolith materials automatically. The REGA instrument consists of four primary components: (1) a flight-proven mass spectrometer, (2) a high-temperature furnace, (3) a soil handling system, and (4) a microcontroller. An external arm containing a scoop or drill gathers regolith samples. A sample is placed in the inlet orifice where the finest-grained particles are sifted into a metering volume and subsequently moved into a crucible. A movable arm then places the crucible in the furnace. The furnace is closed, thereby sealing the inner volume to collect the evolved gases for analysis. Owing to the very low g forces on an asteroid compared to Mars or the moon, the sample must be moved from inlet to crucible by mechanical means rather than by gravity. As the soil sample is heated through a programmed pattern, the gases evolved at each temperature are passed through a transfer tube to the mass spectrometer for analysis and identification. Return data from the instrument will lead to new insights and discoveries including: (1) Identification of the molecular masses of all of the gases

  3. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1990-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.
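    The u-plot and recalibration ideas can be sketched compactly. Each u-value is the model's predicted CDF for an inter-failure time, evaluated at the time actually observed; if the predictions were accurate, the u's would be uniform on (0,1). Recalibration composes the raw predictor with an estimate G of the u-distribution. This sketch uses the empirical CDF for G, a simplifying stand-in for the smoothed version used in the recalibration literature.

```python
import numpy as np

def u_values(pred_cdfs, observed):
    """u_i: the i-th predicted CDF evaluated at the inter-failure time
    actually observed. Accurate predictors give ~Uniform(0,1) u's."""
    return np.array([F(t) for F, t in zip(pred_cdfs, observed)])

def recalibrate(pred_cdf, past_us):
    """Recalibrated predictor F*(t) = G(F(t)), with G taken here as the
    empirical CDF of past u-values."""
    us = np.sort(np.asarray(past_us))
    def f_star(t):
        return np.searchsorted(us, pred_cdf(t), side="right") / len(us)
    return f_star
```

    Plotting the sorted u's against i/n gives the u-plot itself; systematic departure from the diagonal is exactly the bias that `recalibrate` corrects.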

  4. Recalibrating software reliability models

    NASA Technical Reports Server (NTRS)

    Brocklehurst, Sarah; Chan, P. Y.; Littlewood, Bev; Snell, John

    1989-01-01

    In spite of much research effort, there is no universally applicable software reliability growth model which can be trusted to give accurate predictions of reliability in all circumstances. Further, it is not even possible to decide a priori which of the many models is most suitable in a particular context. In an attempt to resolve this problem, techniques were developed whereby, for each program, the accuracy of various models can be analyzed. A user is thus enabled to select that model which is giving the most accurate reliability predictions for the particular program under examination. One of these ways of analyzing predictive accuracy, called the u-plot, in fact allows a user to estimate the relationship between the predicted reliability and the true reliability. It is shown how this can be used to improve reliability predictions in a completely general way by a process of recalibration. Simulation results show that the technique gives improved reliability predictions in a large proportion of cases. However, a user does not need to trust the efficacy of recalibration, since the new reliability estimates produced by the technique are truly predictive and so their accuracy in a particular application can be judged using the earlier methods. The generality of this approach would therefore suggest that it be applied as a matter of course whenever a software reliability model is used.

  5. Reliability training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  6. Why did heterospory evolve?

    PubMed

    Petersen, Kurt B; Burd, Martin

    2016-10-11

    The primitive land plant life cycle featured the production of spores of unimodal size, a condition called homospory. The evolution of bimodal size distributions with small male spores and large female spores, known as heterospory, was an innovation that occurred repeatedly in the history of land plants. The importance of desiccation-resistant spores for colonization of the land is well known, but the adaptive value of heterospory has never been well established. It was an addition to a sexual life cycle that already involved male and female gametes. Its role as a precursor to the evolution of seeds has received much attention, but this is an evolutionary consequence of heterospory that cannot explain the transition from homospory to heterospory (and the lack of evolutionary reversal from heterospory to homospory). Enforced outcrossing of gametophytes has often been mentioned in connection to heterospory, but we review the shortcomings of this argument as an explanation of the selective advantage of heterospory. Few alternative arguments concerning the selective forces favouring heterospory have been proposed, a paucity of attention that is surprising given the importance of this innovation in land plant evolution. In this review we highlight two ideas that may lead us to a better understanding of why heterospory evolved. First, models of optimal resource allocation - an approach that has been used for decades in evolutionary ecology to help understand parental investment and other life-history patterns - suggest that an evolutionary increase in spore size could reach a threshold at which small spores yielding small, sperm-producing gametophytes would return greater fitness per unit of resource investment than would large spores and bisexual gametophytes. With the advent of such microspores, megaspores would evolve under frequency-dependent selection. This argument can account for the appearance of heterospory in the Devonian, when increasingly tall and complex

  7. Person Reliability

    ERIC Educational Resources Information Center

    Lumsden, James

    1977-01-01

    Person changes can be of three kinds: developmental trends, swells, and tremors. Person unreliability in the tremor sense (momentary fluctuations) can be estimated from person characteristic curves. Average person reliability for groups can be compared from item characteristic curves. (Author)

  8. Evolving a photosynthetic organelle.

    PubMed

    Nakayama, Takuro; Archibald, John M

    2012-04-24

    The evolution of plastids from cyanobacteria is believed to represent a singularity in the history of life. The enigmatic amoeba Paulinella and its 'recently' acquired photosynthetic inclusions provide a fascinating system through which to gain fresh insight into how endosymbionts become organelles. The plastids, or chloroplasts, of algae and plants evolved from cyanobacteria by endosymbiosis. This landmark event conferred on eukaryotes the benefits of photosynthesis--the conversion of solar energy into chemical energy--and in so doing had a huge impact on the course of evolution and the climate of Earth. From the present state of plastids, however, it is difficult to trace the evolutionary steps involved in this momentous development, because all modern-day plastids have fully integrated into their hosts. Paulinella chromatophora is a unicellular eukaryote that bears photosynthetic entities called chromatophores that are derived from cyanobacteria and has thus received much attention as a possible example of an organism in the early stages of organellogenesis. Recent studies have unlocked the genomic secrets of its chromatophore and provided concrete evidence that the Paulinella chromatophore is a bona fide photosynthetic organelle. The question is how Paulinella can help us to understand the process by which an endosymbiont is converted into an organelle.

  9. Evolving synergetic interactions

    PubMed Central

    Wu, Bin; Arranz, Jordi; Du, Jinming; Zhou, Da; Traulsen, Arne

    2016-01-01

    Cooperators forgo their own interests to benefit others. This reduces their fitness and thus cooperators are not likely to spread based on natural selection. Nonetheless, cooperation is widespread on every level of biological organization ranging from bacterial communities to human society. Mathematical models can help to explain under which circumstances cooperation evolves. Evolutionary game theory is a powerful mathematical tool to depict the interactions between cooperators and defectors. Classical models typically involve either pairwise interactions between individuals or a linear superposition of these interactions. For interactions within groups, however, synergetic effects may arise: their outcome is not just the sum of its parts. This is because the payoffs via a single group interaction can be different from the sum of any collection of two-player interactions. Assuming that all interactions start from pairs, how can such synergetic multiplayer games emerge from simpler pairwise interactions? Here, we present a mathematical model that captures the transition from pairwise interactions to synergetic multiplayer ones. We assume that different social groups have different breaking rates. We show that non-uniform breaking rates do foster the emergence of synergy, even though individuals always interact in pairs. Our work sheds new light on the mechanisms underlying such synergetic interactions. PMID:27466437

  10. Evolving Crashworthiness Design Criteria

    DTIC Science & Technology

    1988-12-01

    occupants in crash velocity changes of the severity cited in Figure 3. Moreover, the structure and equipment shall allow deformation in a controlled ... governed by the stopping distance and pulse duration. Figure 4 illustrates this relationship and indicates the importance of controlled energy ... fatigue life during the initial design phase of the helicopter. Figure 5 depicts the systems approach required relative to management of the crash

  11. Photovoltaic performance and reliability workshop

    SciTech Connect

    Mrig, L.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986--1993. PV performance and PV reliability are at least as important as PV cost, if not more. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in the photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange the technical knowledge and field experience as related to current information in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop held in September, 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  12. Reliability physics

    NASA Technical Reports Server (NTRS)

    Cuddihy, E. F.; Ross, R. G., Jr.

    1984-01-01

    Speakers whose topics relate to the reliability physics of solar arrays are listed and their topics briefly reviewed. Nine reports are reviewed ranging in subjects from studies of photothermal degradation in encapsulants and polymerizable ultraviolet stabilizers to interface bonding stability to electrochemical degradation of photovoltaic modules.

  13. Disgust: Evolved Function and Structure

    ERIC Educational Resources Information Center

    Tybur, Joshua M.; Lieberman, Debra; Kurzban, Robert; DeScioli, Peter

    2013-01-01

    Interest in and research on disgust has surged over the past few decades. The field, however, still lacks a coherent theoretical framework for understanding the evolved function or functions of disgust. Here we present such a framework, emphasizing 2 levels of analysis: that of evolved function and that of information processing. Although there is…

  14. The Reliability of Density Measurements.

    ERIC Educational Resources Information Center

    Crothers, Charles

    1978-01-01

    Data from a land-use study of small- and medium-sized towns in New Zealand are used to ascertain the relationship between official and effective density measures. It was found that the reliability of official measures of density is very low overall, although reliability increases with community size. (Author/RLV)

  15. Evolving virtual creatures and catapults.

    PubMed

    Chaumont, Nicolas; Egli, Richard; Adami, Christoph

    2007-01-01

    We present a system that can evolve the morphology and the controller of virtual walking and block-throwing creatures (catapults) using a genetic algorithm. The system is based on Sims' work, implemented as a flexible platform with an off-the-shelf dynamics engine. Experiments aimed at evolving Sims-type walkers resulted in the emergence of various realistic gaits while using fairly simple objective functions. Due to the flexibility of the system, drastically different morphologies and functions evolved with only minor modifications to the system and objective function. For example, various throwing techniques evolved when selecting for catapults that propel a block as far as possible. Among the strategies and morphologies evolved, we find the drop-kick strategy, as well as the systematic invention of the principle behind the wheel, when allowing mutations to the projectile.

  16. Reliability and Validity of the Persian Version of Compulsive Eating Scale (CES) in Overweight or Obese Women and Its Relationship with Some Body Composition and Dietary Intake Variables

    PubMed Central

    Mostafavi, Seyed-Ali; Keshavarz, Seyed Ali; Mohammadi, Mohammad Reza; Hosseini, Saeed; Eshraghian, Mohammad Reza; Hosseinzadeh, Payam; Chamari, Maryam; Sari, Zeinab; Akhondzadeh, Shahin

    2016-01-01

Objective: Compulsive or binge eating is a disturbed eating behavior, observed mostly among dieting women, that is associated with appetite disturbance and uncontrolled consumption of large quantities of junk food. The Compulsive Eating Scale (CES), first created by Kagan & Squires in 1984, is an eight-item self-report instrument designed to measure the severity of binge eating disorder. The aim of this study was to establish the reliability and validity of the Persian version of the Compulsive Eating Scale (CES) among overweight and obese women in Iran. Method: One hundred and twenty-six (N = 126) overweight and obese women consented to participate in this study. We measured anthropometric indices, including body weight, height, waist and hip circumferences, total body fat percentage, and visceral fat level, with a body analyzer under standard conditions. Then, the participants completed the CES. Next, to assess concurrent validity, the Beck Depression Inventory, Spielberger anxiety scale, appetite visual analogue rating scale, Food Craving questionnaire, Three-Factor Eating Questionnaire-R18, and Restraint eating visual analogue rating scale were administered simultaneously. To assess test-retest reliability, the CES was repeated for all participants two weeks later. Moreover, we report the internal consistency and factor analysis of this questionnaire, and we estimated the concurrent correlation of the CES with logically relevant questionnaires and with body composition and anthropometric indices. Results: Based on the reliability analysis and principal component factor analysis with Varimax rotation, we extracted two factors: eating because of negative feelings, and overeating. Internal consistency (Cronbach's alpha) of the CES was 0.85 (Cronbach's alpha of the factors was 0.85 and 0.74, respectively). The test-retest correlation of the CES was 0.89. Also, the split-half reliability of the questionnaire was established with the correlation coefficient
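The internal-consistency figures quoted above come from Cronbach's alpha. As a reference for the standard formula (this is not the study's own code; names and data are illustrative), a minimal sketch:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a questionnaire.

    items: one list of scores per item, all columns the same length
    (one entry per respondent). Uses population variances.
    """
    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents

    def var(xs):                        # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # total score per respondent across all items
    totals = [sum(col[i] for col in items) for i in range(n)]
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

Perfectly correlated items yield alpha = 1; weakly related items pull alpha down, which is why the reported 0.85 indicates good internal consistency.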

  17. Network reliability

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1985-01-01

Network control (or network management) functions are essential for efficient and reliable operation of a network. Some control functions are currently included as part of the Open System Interconnection model. For local area networks, it is widely recognized that there is a need for additional control functions, including fault isolation functions, monitoring functions, and configuration functions. These functions can be implemented in either a central or distributed manner. The Fiber Distributed Data Interface Medium Access Control and Station Management protocols provide an example of distributed implementation. Relevant information is presented here in outline form.

  18. Proposed reliability cost model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1973-01-01

The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements demand understanding and communication between the technical disciplines on one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends upon the use of a series of subsystem-oriented CERs and, where possible, CTRs in devising a suitable cost-effective policy.

  19. How did the cilium evolve?

    PubMed

    Satir, Peter; Mitchell, David R; Jékely, Gáspár

    2008-01-01

    The cilium is a characteristic organelle of eukaryotes constructed from over 600 proteins. Bacterial flagella are entirely different. 9 + 2 motile cilia evolved before the divergence of the last eukaryotic common ancestor (LECA). This chapter explores, compares, and contrasts two potential pathways of evolution: (1) via invasion of a centriolar-like virus and (2) via autogenous formation from a pre-existing microtubule-organizing center (MTOC). In either case, the intraflagellar transport (IFT) machinery that is nearly universally required for the assembly and maintenance of cilia derived from the evolving intracellular vesicular transport system. The sensory function of cilia evolved first and the ciliary axoneme evolved gradually with ciliary motility, an important selection mechanism, as one of the driving forces.

  20. Natural selection promotes antigenic evolvability.

    PubMed

    Graves, Christopher J; Ros, Vera I D; Stevenson, Brian; Sniegowski, Paul D; Brisson, Dustin

    2013-01-01

    The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide an experimentally tractable system to test whether natural selection has favored mechanisms that increase evolvability. Many antigenic variation systems consist of paralogous unexpressed 'cassettes' that recombine into an expression site to rapidly alter the expressed protein. Importantly, the magnitude of antigenic change is a function of the genetic diversity among the unexpressed cassettes. Thus, evidence that selection favors among-cassette diversity is direct evidence that natural selection promotes antigenic evolvability. We used the Lyme disease bacterium, Borrelia burgdorferi, as a model to test the prediction that natural selection favors amino acid diversity among unexpressed vls cassettes and thereby promotes evolvability in a primary surface antigen, VlsE. The hypothesis that diversity among vls cassettes is favored by natural selection was supported in each B. burgdorferi strain analyzed using both classical (dN/dS ratios) and Bayesian population genetic analyses of genetic sequence data. This hypothesis was also supported by the conservation of highly mutable tandem-repeat structures across B. burgdorferi strains despite a near complete absence of sequence conservation. Diversification among vls cassettes due to natural selection and mutable repeat structures promotes long-term antigenic evolvability of VlsE. These findings provide a direct demonstration that molecular mechanisms that enhance evolvability of surface antigens are an evolutionary adaptation. The molecular evolutionary processes identified here can serve as a model for the evolution of antigenic evolvability in many pathogens which utilize similar strategies to establish chronic infections.

  1. On the Discovery of Evolving Truth.

    PubMed

    Li, Yaliang; Li, Qi; Gao, Jing; Su, Lu; Zhao, Bo; Fan, Wei; Han, Jiawei

    2015-08-01

    In the era of big data, information regarding the same objects can be collected from increasingly more sources. Unfortunately, there usually exist conflicts among the information coming from different sources. To tackle this challenge, truth discovery, i.e., to integrate multi-source noisy information by estimating the reliability of each source, has emerged as a hot topic. In many real world applications, however, the information may come sequentially, and as a consequence, the truth of objects as well as the reliability of sources may be dynamically evolving. Existing truth discovery methods, unfortunately, cannot handle such scenarios. To address this problem, we investigate the temporal relations among both object truths and source reliability, and propose an incremental truth discovery framework that can dynamically update object truths and source weights upon the arrival of new data. Theoretical analysis is provided to show that the proposed method is guaranteed to converge at a fast rate. The experiments on three real world applications and a set of synthetic data demonstrate the advantages of the proposed method over state-of-the-art truth discovery methods.
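The weighted-voting idea behind truth discovery, and the incremental update of source weights as new data arrive, can be illustrated with a toy sketch. The decayed-agreement update below is an illustrative assumption, not the paper's actual framework:

```python
from collections import defaultdict

def weighted_vote(claims, weights):
    # Estimate the truth as the value backed by the largest total source weight.
    score = defaultdict(float)
    for src, val in claims.items():
        score[val] += weights[src]
    return max(score, key=score.get)

def update_weights(claims, truth, weights, decay=0.9):
    # Decay old evidence so source reliability can drift over time,
    # and reward sources that agreed with the current truth estimate.
    for src, val in claims.items():
        hit = 1.0 if val == truth else 0.0
        weights[src] = decay * weights[src] + (1.0 - decay) * hit
    return weights

def incremental_truth(stream, sources):
    # stream: iterable of {source: claimed value} dicts, one per time step.
    weights = {s: 1.0 for s in sources}
    truths = []
    for claims in stream:
        t = weighted_vote(claims, weights)
        weights = update_weights(claims, t, weights)
        truths.append(t)
    return truths, weights
```

Sources that repeatedly disagree with the consensus lose weight, so later votes trust them less, while the decay factor lets a source's reliability recover or degrade as it evolves.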

  2. On the Discovery of Evolving Truth

    PubMed Central

    Li, Yaliang; Li, Qi; Gao, Jing; Su, Lu; Zhao, Bo; Fan, Wei; Han, Jiawei

    2015-01-01

    In the era of big data, information regarding the same objects can be collected from increasingly more sources. Unfortunately, there usually exist conflicts among the information coming from different sources. To tackle this challenge, truth discovery, i.e., to integrate multi-source noisy information by estimating the reliability of each source, has emerged as a hot topic. In many real world applications, however, the information may come sequentially, and as a consequence, the truth of objects as well as the reliability of sources may be dynamically evolving. Existing truth discovery methods, unfortunately, cannot handle such scenarios. To address this problem, we investigate the temporal relations among both object truths and source reliability, and propose an incremental truth discovery framework that can dynamically update object truths and source weights upon the arrival of new data. Theoretical analysis is provided to show that the proposed method is guaranteed to converge at a fast rate. The experiments on three real world applications and a set of synthetic data demonstrate the advantages of the proposed method over state-of-the-art truth discovery methods. PMID:26705502

  3. The Skin Cancer and Sun Knowledge (SCSK) Scale: Validity, Reliability, and Relationship to Sun-Related Behaviors among Young Western Adults

    ERIC Educational Resources Information Center

    Day, Ashley K.; Wilson, Carlene; Roberts, Rachel M.; Hutchinson, Amanda D.

    2014-01-01

    Increasing public knowledge remains one of the key aims of skin cancer awareness campaigns, yet diagnosis rates continue to rise. It is essential we measure skin cancer knowledge adequately so as to determine the nature of its relationship to sun-related behaviors. This study investigated the psychometric properties of a new measure of skin cancer…

  4. Robustness to Faults Promotes Evolvability: Insights from Evolving Digital Circuits.

    PubMed

    Milano, Nicola; Nolfi, Stefano

    2016-01-01

We demonstrate how the need to cope with operational faults enables evolving circuits to find more fit solutions. The analysis of the results obtained in different experimental conditions indicates that, in the absence of faults, evolution tends to select circuits that are small and have low phenotypic variability and evolvability. The need to face operational faults, instead, drives evolution toward the selection of larger circuits that are truly robust with respect to genetic variations and that have a greater level of phenotypic variability and evolvability. Overall, our results indicate that the need to cope with operational faults leads to the selection of circuits that have a greater probability of generating better circuits as a result of genetic variation, with respect to a control condition in which circuits are not subjected to faults.
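The selection pressure described above can be mimicked with a toy (1+1) hill climber whose fitness is only ever evaluated under injected stuck-at-0 faults. The bitstring standing in for a circuit, and all parameters, are illustrative assumptions rather than the paper's setup:

```python
import random

def fault_fitness(bits, target):
    # Average match with the target over every single stuck-at-0 fault:
    # each position in turn is forced to 0 before scoring.
    total = 0
    for i in range(len(bits)):
        faulty = bits[:i] + [0] + bits[i + 1:]
        total += sum(a == b for a, b in zip(faulty, target))
    return total / len(bits)

def evolve_under_faults(target, steps=500, seed=5):
    # (1+1) hill climber: selection only ever sees the faulted fitness.
    rng = random.Random(seed)
    best = [rng.randint(0, 1) for _ in target]
    best_f = fault_fitness(best, target)
    for _ in range(steps):
        child = best[:]
        child[rng.randrange(len(child))] ^= 1   # single-bit mutation
        f = fault_fitness(child, target)
        if f > best_f:
            best, best_f = child, f
    return best
```

Because fitness is averaged over all possible faults, genotypes that degrade gracefully under damage are preferred, loosely mirroring the robustness-under-faults selection regime discussed in the abstract.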

  5. The genotype-phenotype map of an evolving digital organism.

    PubMed

    Fortuna, Miguel A; Zaman, Luis; Ofria, Charles; Wagner, Andreas

    2017-02-01

To understand how evolving systems bring forth novel and useful phenotypes, it is essential to understand the relationship between genotypic and phenotypic change. Artificial evolving systems can help us understand whether the genotype-phenotype maps of natural evolving systems are highly unusual, and it may help create evolvable artificial systems. Here we characterize the genotype-phenotype map of digital organisms in Avida, a platform for digital evolution. We consider digital organisms from a vast space of 10^141 genotypes (instruction sequences), which can form 512 different phenotypes. These phenotypes are distinguished by different Boolean logic functions they can compute, as well as by the complexity of these functions. We observe several properties with parallels in natural systems, such as connected genotype networks and asymmetric phenotypic transitions. The likely common cause is robustness to genotypic change. We describe an intriguing tension between phenotypic complexity and evolvability that may have implications for biological evolution. On the one hand, genotypic change is more likely to yield novel phenotypes in more complex organisms. On the other hand, the total number of novel phenotypes reachable through genotypic change is highest for organisms with simple phenotypes. Artificial evolving systems can help us study aspects of biological evolvability that are not accessible in vastly more complex natural systems. They can also help identify properties, such as robustness, that are required for both human-designed artificial systems and synthetic biological systems to be evolvable.
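The "robustness to genotypic change" invoked above can be quantified as the fraction of single-locus mutants that preserve the phenotype. A toy genotype-phenotype map (far simpler than Avida's instruction sequences, and purely illustrative) makes the measurement concrete:

```python
def phenotype(genotype):
    # Toy genotype->phenotype map (not Avida's): the phenotype is
    # simply the majority bit of a binary genotype.
    return int(sum(genotype) > len(genotype) // 2)

def robustness(genotype):
    # Fraction of single-locus mutants that preserve the phenotype.
    p = phenotype(genotype)
    same = 0
    for i in range(len(genotype)):
        mutant = list(genotype)
        mutant[i] ^= 1                  # flip one locus
        same += phenotype(mutant) == p
    return same / len(genotype)
```

Genotypes deep inside their phenotype's genotype network (here, all-ones) are maximally robust, while genotypes near the phenotype boundary (a bare majority) lose the phenotype under most mutations.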

  6. The genotype-phenotype map of an evolving digital organism

    PubMed Central

    Zaman, Luis; Wagner, Andreas

    2017-01-01

To understand how evolving systems bring forth novel and useful phenotypes, it is essential to understand the relationship between genotypic and phenotypic change. Artificial evolving systems can help us understand whether the genotype-phenotype maps of natural evolving systems are highly unusual, and it may help create evolvable artificial systems. Here we characterize the genotype-phenotype map of digital organisms in Avida, a platform for digital evolution. We consider digital organisms from a vast space of 10^141 genotypes (instruction sequences), which can form 512 different phenotypes. These phenotypes are distinguished by different Boolean logic functions they can compute, as well as by the complexity of these functions. We observe several properties with parallels in natural systems, such as connected genotype networks and asymmetric phenotypic transitions. The likely common cause is robustness to genotypic change. We describe an intriguing tension between phenotypic complexity and evolvability that may have implications for biological evolution. On the one hand, genotypic change is more likely to yield novel phenotypes in more complex organisms. On the other hand, the total number of novel phenotypes reachable through genotypic change is highest for organisms with simple phenotypes. Artificial evolving systems can help us study aspects of biological evolvability that are not accessible in vastly more complex natural systems. They can also help identify properties, such as robustness, that are required for both human-designed artificial systems and synthetic biological systems to be evolvable. PMID:28241039

  7. An Investigation into Reliability of Knee Extension Muscle Strength Measurements, and into the Relationship between Muscle Strength and Means of Independent Mobility in the Ward: Examinations of Patients Who Underwent Femoral Neck Fracture Surgery

    PubMed Central

    Katoh, Munenori; Kaneko, Yoshihiro

    2014-01-01

    [Purpose] The purpose of the present study was to investigate the reliability of isometric knee extension muscle strength measurement of patients who underwent femoral neck fracture surgery, as well as the relationship between independent mobility in the ward and knee muscle strength. [Subjects] The subjects were 75 patients who underwent femoral neck fracture surgery. [Methods] We used a hand-held dynamometer and a belt to measure isometric knee extension muscle strength three times, and used intraclass correlation coefficients (ICCs) to investigate the reliability of the measurements. We used a receiver operating characteristic curve to investigate the cutoff values for independent walking with walking sticks and non-independent mobility. [Results] ICCs (1, 1) were 0.9 or higher. The cutoff value for independent walking with walking sticks was 0.289 kgf/kg on the non-fractured side, 0.193 kgf/kg on the fractured side, and the average of both limbs was 0.238 kgf/kg. [Conclusion] We consider that the test-retest reliability of isometric knee extension muscle strength measurement of patients who have undergone femoral neck fracture surgery is high. We also consider that isometric knee extension muscle strength is useful for investigating means of independent mobility in the ward. PMID:24567667
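The ICC (1, 1) statistic used above is the one-way random-effects intraclass correlation. A sketch of the standard formula (not the study's software; the toy data below are invented):

```python
def icc_1_1(data):
    """One-way random-effects intraclass correlation, ICC(1,1).

    data: one list of k repeated measurements per subject.
    """
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    means = [sum(row) / k for row in data]
    # between-subjects and within-subjects mean squares
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(data, means) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

When the three repeated strength measurements of each patient agree closely relative to the differences between patients, ICC approaches 1, which is the "0.9 or higher" criterion the abstract reports.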

  8. Evolving MEMS Resonator Designs for Fabrication

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.; Kraus, William F.; Lohn, Jason D.

    2008-01-01

Because of their small size and high reliability, microelectromechanical systems (MEMS) devices have the potential to revolutionize many areas of engineering. As with conventionally-sized engineering design, there is likely to be a demand for the automated design of MEMS devices. This paper describes our current status as we progress toward our ultimate goal of using an evolutionary algorithm and a generative representation to produce designs of a MEMS device and successfully demonstrate its transfer to an actual chip. To produce designs that are likely to transfer to reality, we present two ways to modify the evaluation of designs. The first is to add location noise: differences between the actual dimensions of the design and the design blueprint, a technique we have used in our work on evolving antennas and robots. The second method is to add prestress to model the warping that occurs during the extreme heat of fabrication. In the future, we expect to fabricate and test some MEMS resonators evolved in this way.
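The "location noise" idea, scoring a design under random perturbations of its blueprint dimensions so that evolution favors fabricable solutions, can be sketched as follows (the wrapper, its parameters, and the example fitness function are illustrative assumptions):

```python
import random

def robust_fitness(design, evaluate, trials=20, noise=0.01, seed=4):
    """Score a design by averaging `evaluate` over randomly perturbed copies,
    modeling fabrication 'location noise' on each blueprint dimension."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        perturbed = [x + rng.gauss(0.0, noise) for x in design]
        total += evaluate(perturbed)
    return total / trials
```

A design whose fitness collapses under tiny dimensional perturbations scores poorly on average, steering the evolutionary search away from solutions that would fail once etched in silicon.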

  9. Slippery Texts and Evolving Literacies

    ERIC Educational Resources Information Center

    Mackey, Margaret

    2007-01-01

    The idea of "slippery texts" provides a useful descriptor for materials that mutate and evolve across different media. Eight adult gamers, encountering the slippery text "American McGee's Alice," demonstrate a variety of ways in which players attempt to manage their attention as they encounter a new text with many resonances. The range of their…

  10. Thermal and evolved gas analyzer

    NASA Technical Reports Server (NTRS)

    Williams, M. S.; Boynton, W. V.; James, R. L.; Verts, W. T.; Bailey, S. H.; Hamara, D. K.

    1998-01-01

The Thermal and Evolved Gas Analyzer (TEGA) instrument will perform calorimetry and evolved gas analysis on soil samples collected from the Martian surface. TEGA is one of three instruments, along with a robotic arm, that form the Mars Volatile and Climate Survey (MVACS) payload. The other instruments are a stereo surface imager, built by Peter Smith of the University of Arizona, and a meteorological station, built by JPL. The MVACS lander will investigate a Martian landing site at approximately 70 deg south latitude. Launch will take place from Kennedy Space Center in January 1999. The TEGA project started in February 1996. In the intervening 24 months, a flight instrument concept has been designed, prototyped, built as an engineering model and flight model, and tested. The instrument performs laboratory-quality differential scanning calorimetry (DSC) over the temperature range of Mars ambient to 1400 K. Low-temperature volatiles (water and carbon dioxide ices) and carbonates will be analyzed in this temperature range. Carbonates melt and evolve carbon dioxide at temperatures above 600 C. Evolved oxygen (down to a concentration of 1 ppm) is detected, and CO2 and water vapor, along with their isotopic variations, are detected and their concentrations measured. The isotopic composition provides important tests of the theory of solar system formation.

  11. Signing Apes and Evolving Linguistics.

    ERIC Educational Resources Information Center

    Stokoe, William C.

Linguistics retains from its antecedents, philology and the study of sacred writings, some of their apologetic and theological bias. Thus it has not been able to face squarely the question of how linguistic function may have evolved from animal communication. Chimpanzees' use of signs from American Sign Language forces re-examination of language…

  12. How evolvable are polarization machines?

    NASA Astrophysics Data System (ADS)

    Laan, Liedewij; Murray, Andrew

    2012-02-01

In many different cell types proper polarization is essential for cell function. Polarization mechanisms, however, differ between cell types, and even closely related species use a variety of polarization machines. Budding yeast, for example, depends on several parallel mechanisms to establish polarity. One mechanism (i) depends on reaction and diffusion of proteins in the membrane. Another (ii) depends on reorganization of the actin cytoskeleton. So why does yeast use several mechanisms simultaneously? Can yeast also polarize robustly in the absence of one of them? We addressed these questions by evolving budding yeast in the absence of mechanism (i) or (ii). We deleted a mechanism by deleting one or two genes that are essential for its function. After the deletion of either mechanism, the growth rate of cells decreased sharply (2-5 fold) and their cell shape was highly perturbed. Subsequently, we evolved these cells for 10 days. Surprisingly, the evolved cells rapidly overcame most of their polarity defects. They grow at 0.9x the wild-type growth rate and their cell shape is significantly less perturbed. Now we will study how these cells rescued polarization. Did they fix the deleted mechanism, strengthen other mechanisms, or evolve a completely new one?

  13. The evolved function of the oedipal conflict.

    PubMed

    Josephs, Lawrence

    2010-08-01

    Freud based his oedipal theory on three clinical observations of adult romantic relationships: (1) Adults tend to split love and lust; (2) There tend to be sex differences in the ways that men and women split love and lust; (3) Adult romantic relationships are unconsciously structured by the dynamics of love triangles in which dramas of seduction and betrayal unfold. Freud believed that these aspects of adult romantic relationships were derivative expressions of a childhood oedipal conflict that has been repressed. Recent research conducted by evolutionary psychologists supports many of Freud's original observations and suggests that Freud's oedipal conflict may have evolved as a sexually selected adaptation for reproductive advantage. The evolution of bi-parental care based on sexually exclusive romantic bonds made humans vulnerable to the costs of sexual infidelity, a situation of danger that seriously threatens monogamous bonds. A childhood oedipal conflict enables humans to better adapt to this longstanding evolutionary problem by providing the child with an opportunity to develop working models of love triangles. On the one hand, the oedipal conflict facilitates monogamous resolutions by creating intense anxiety about the dangers of sexual infidelity and mate poaching. On the other hand, the oedipal conflict in humans may facilitate successful cheating and mate poaching by cultivating a talent for hiding our true sexual intentions from others and even from ourselves. The oedipal conflict in humans may be disguised by evolutionary design in order to facilitate tactical deception in adult romantic relationships.

  14. Non-uniform Evolving Hypergraphs and Weighted Evolving Hypergraphs

    PubMed Central

    Guo, Jin-Li; Zhu, Xin-Yun; Suo, Qi; Forrest, Jeffrey

    2016-01-01

Firstly, this paper proposes a non-uniform evolving hypergraph model with nonlinear preferential attachment and an attractiveness. This model allows nodes to arrive in batches according to a Poisson process and to form hyperedges with existing batches of nodes. Both the number of arriving nodes and that of chosen existing nodes are random variables, so that the size of each hyperedge is non-uniform. This paper establishes the characteristic equation of hyperdegrees, calculates changes in the hyperdegree of each node, and obtains the stationary average hyperdegree distribution of the model by employing Poisson process theory and the characteristic equation. Secondly, this paper constructs a model for weighted evolving hypergraphs that couples the establishment of new hyperedges and nodes with the dynamical evolution of the weights. Furthermore, the stationary average hyperdegree and hyperstrength distributions are obtained, respectively, by using the hyperdegree distribution of the unweighted model established above, so that the weighted evolving hypergraph exhibits scale-free behavior in both the hyperdegree and hyperstrength distributions. PMID:27845334
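A toy simulation of batch arrivals with preferential hyperedge formation illustrates the mechanism (sampling details, fixed batch sizes, and parameters are illustrative assumptions, not the paper's model, which uses Poisson arrivals and random batch sizes):

```python
import random

def evolve_hypergraph(batches=200, new_per_batch=2, pick_existing=3,
                      attractiveness=1.0, seed=3):
    """Each arriving batch of new nodes forms one hyperedge with existing
    nodes sampled preferentially by (hyperdegree + attractiveness).
    Returns the hyperdegree of every node."""
    rng = random.Random(seed)
    hyperdeg = [1] * new_per_batch        # the initial batch shares one hyperedge
    for _ in range(batches):
        weights = [d + attractiveness for d in hyperdeg]
        chosen = set()
        # sample distinct existing nodes, biased toward high hyperdegree
        while len(chosen) < min(pick_existing, len(hyperdeg)):
            chosen.add(rng.choices(range(len(hyperdeg)), weights=weights)[0])
        new_ids = list(range(len(hyperdeg), len(hyperdeg) + new_per_batch))
        hyperdeg.extend([0] * new_per_batch)
        for i in list(chosen) + new_ids:  # everyone in the new hyperedge
            hyperdeg[i] += 1
    return hyperdeg
```

Early nodes keep getting re-chosen, so hyperdegree accumulates on a few old nodes, the rich-get-richer dynamic behind the scale-free hyperdegree distribution derived in the paper.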

  15. Coupled oscillators on evolving networks

    NASA Astrophysics Data System (ADS)

    Singh, R. K.; Bagarti, Trilochan

    2016-12-01

    In this work we study coupled oscillators on evolving networks. We find that the steady state behavior of the system is governed by the relative values of the spread in natural frequencies and the global coupling strength. For coupling strong in comparison to the spread in frequencies, the system of oscillators synchronize and when coupling strength and spread in frequencies are large, a phenomenon similar to amplitude death is observed. The network evolution provides a mechanism to build inter-oscillator connections and once a dynamic equilibrium is achieved, oscillators evolve according to their local interactions. We also find that the steady state properties change by the presence of additional time scales. We demonstrate these results based on numerical calculations studying dynamical evolution of limit-cycle and van der Pol oscillators.
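The competition between coupling strength and frequency spread described above can be demonstrated with a minimal mean-field Kuramoto sketch (the paper studies limit-cycle and van der Pol oscillators on evolving networks; this globally coupled phase-only model is an illustrative simplification):

```python
import math
import random

def order_param(theta):
    # Kuramoto order parameter r*e^{i*psi}: r -> 1 means full synchrony.
    n = len(theta)
    sx = sum(math.cos(t) for t in theta) / n
    sy = sum(math.sin(t) for t in theta) / n
    return math.hypot(sx, sy), math.atan2(sy, sx)

def kuramoto_order(n=50, coupling=2.0, spread=0.1, steps=2000, dt=0.05, seed=1):
    """Simulate globally coupled Kuramoto phase oscillators; return final r."""
    rng = random.Random(seed)
    omega = [rng.gauss(0.0, spread) for _ in range(n)]      # natural frequencies
    theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    for _ in range(steps):
        r, psi = order_param(theta)
        # mean-field form: dtheta_i/dt = omega_i + K * r * sin(psi - theta_i)
        theta = [t + dt * (w + coupling * r * math.sin(psi - t))
                 for t, w in zip(theta, omega)]
    return order_param(theta)[0]
```

Strong coupling relative to the frequency spread drives the order parameter toward 1 (synchronization), while weak coupling leaves the phases incoherent, mirroring the regime dependence reported in the abstract.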

  16. Evolvable Hardware for Space Applications

    NASA Technical Reports Server (NTRS)

    Lohn, Jason; Globus, Al; Hornby, Gregory; Larchev, Gregory; Kraus, William

    2004-01-01

This article surveys the research of the Evolvable Systems Group at NASA Ames Research Center. Over the past few years, our group has developed the ability to use evolutionary algorithms in a variety of NASA applications: spacecraft antenna design, fault tolerance for programmable logic chips, atomic force field parameter fitting, analog circuit design, and Earth-observing satellite scheduling. In some of these applications, evolutionary algorithms match or improve on human performance.

  17. Evolving Systems and Adaptive Key Component Control

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Balas, Mark J.

    2009-01-01

    We propose a new framework called Evolving Systems to describe the self-assembly, or autonomous assembly, of actively controlled dynamical subsystems into an Evolved System with a higher purpose. An introduction to Evolving Systems and exploration of the essential topics of the control and stability properties of Evolving Systems is provided. This chapter defines a framework for Evolving Systems, develops theory and control solutions for fundamental characteristics of Evolving Systems, and provides illustrative examples of Evolving Systems and their control with adaptive key component controllers.

  18. You 3.0: The Most Important Evolving Technology

    ERIC Educational Resources Information Center

    Tamarkin, Molly; Bantz, David A.; Childs, Melody; diFilipo, Stephen; Landry, Stephen G.; LoPresti, Frances; McDonald, Robert H.; McGuthry, John W.; Meier, Tina; Rodrigo, Rochelle; Sparrow, Jennifer; Diggs, D. Teddy; Yang, Catherine W.

    2010-01-01

    That technology evolves is a given. Not as well understood is the impact of technological evolution on each individual--on oneself, one's skill development, one's career, and one's relationship with the work community. The authors believe that everyone in higher education will become an IT worker and that IT workers will be managing a growing…

  19. The emotion system promotes diversity and evolvability

    PubMed Central

    Giske, Jarl; Eliassen, Sigrunn; Fiksen, Øyvind; Jakobsen, Per J.; Aksnes, Dag L.; Mangel, Marc; Jørgensen, Christian

    2014-01-01

    Studies on the relationship between the optimal phenotype and its environment have had limited focus on genotype-to-phenotype pathways and their evolutionary consequences. Here, we study how multi-layered trait architecture and its associated constraints prescribe diversity. Using an idealized model of the emotion system in fish, we find that trait architecture yields genetic and phenotypic diversity even in absence of frequency-dependent selection or environmental variation. That is, for a given environment, phenotype frequency distributions are predictable while gene pools are not. The conservation of phenotypic traits among these genetically different populations is due to the multi-layered trait architecture, in which one adaptation at a higher architectural level can be achieved by several different adaptations at a lower level. Our results emphasize the role of convergent evolution and the organismal level of selection. While trait architecture makes individuals more constrained than what has been assumed in optimization theory, the resulting populations are genetically more diverse and adaptable. The emotion system in animals may thus have evolved by natural selection because it simultaneously enhances three important functions, the behavioural robustness of individuals, the evolvability of gene pools and the rate of evolutionary innovation at several architectural levels. PMID:25100697

  20. A slowly evolving host moves first in symbiotic interactions

    NASA Astrophysics Data System (ADS)

    Damore, James; Gore, Jeff

    2011-03-01

Symbiotic relationships, both parasitic and mutualistic, are ubiquitous in nature. Understanding how these symbioses evolve, from bacteria and their phages to humans and our gut microflora, is crucial in understanding how life operates. Often, symbioses consist of a slowly evolving host species with each host only interacting with its own sub-population of symbionts. The Red Queen hypothesis describes coevolutionary relationships as constant arms races with each species rushing to evolve an advantage over the other, suggesting that faster evolution is favored. Here, we use a simple game theoretic model of host-symbiont coevolution that includes population structure to show that if the symbionts evolve much faster than the host, the equilibrium distribution is the same as it would be if it were a sequential game where the host moves first against its symbionts. For the slowly evolving host, this will prove to be advantageous in mutualisms and a handicap in antagonisms. The model allows for symbiont adaptation to its host, a result that is robust to changes in the parameters and generalizes to continuous and multiplayer games. Our findings provide insight into a wide range of symbiotic phenomena and help to unify the field of coevolutionary theory.
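The host-moves-first result can be illustrated with a tiny Stackelberg-style computation on a bimatrix game: the host commits, the fast-evolving symbiont population settles at its best response, and the host anticipates this. The payoff matrices in the test are invented for illustration:

```python
def best_response(payoffs, opponent_move):
    # Row index maximizing payoff against a fixed opponent column.
    column = [row[opponent_move] for row in payoffs]
    return column.index(max(column))

def stackelberg(host_pay, symb_pay):
    """Host commits first; the fast-evolving symbionts reach their best
    response; the host picks the commitment that anticipates this.
    host_pay[h][s] and symb_pay[s][h] are the respective payoffs."""
    best = None
    for h in range(len(host_pay)):
        s = best_response(symb_pay, h)      # symbionts equilibrate to host move h
        if best is None or host_pay[h][s] > best[0]:
            best = (host_pay[h][s], h, s)
    return best   # (host payoff, host move, symbiont move)
```

Because the host effectively chooses among the symbionts' best-response outcomes, slower evolution acts like first-mover commitment, which is why the paper finds it advantageous in mutualisms.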

  1. Evolved Expendable Launch Vehicle (EELV)

    DTIC Science & Technology

    2015-12-15

    potential NSS mission processing timelines. SpaceX is now eligible for an award of specified NSS missions to include the GPS III-2 launch service... SpaceX has also evolved their Falcon 9v1.1 configuration into the Falcon 9 Upgrade. To update the certification baseline, SpaceX and AF built Joint Work...9 v1.1 commercial launch experienced an in-flight mishap resulting in loss of vehicle on June 28, 2015. An official investigation was led by a SpaceX

  2. The 'E' factor -- evolving endodontics.

    PubMed

    Hunter, M J

    2013-03-01

    Endodontics is a constantly developing field, with new instruments, preparation techniques and sealants competing with trusted and traditional approaches to tooth restoration. Thus general dental practitioners must question and understand the significance of these developments before adopting new practices. In view of this, the aim of this article, and the associated presentation at the 2013 British Dental Conference & Exhibition, is to provide an overview of endodontic methods and constantly evolving best practice. The presentation will review current preparation techniques, comparing rotary versus reciprocation, and question current trends in restoration of the endodontically treated tooth.

  3. Regolith Evolved Gas Analyzer (REGA)

    NASA Technical Reports Server (NTRS)

    Allen, Carlton C.; McKay, David S.

    1997-01-01

    The instrument consists of five subsystems: (1) a programmable furnace which can be loaded with samples of regolith, (2) a mass spectrometer which detects and measures atmospheric gases or gases evolved during heating, (3) a tank of pressurized gas which can be introduced to the regolith material while detecting and measuring volatile reaction products, (4) a mechanism for dumping the regolith sample and repeating the experiment on a fresh sample, and (5) a data system which controls and monitors the furnace, gas system, and mass spectrometer.

  4. Transport on randomly evolving trees

    NASA Astrophysics Data System (ADS)

    Pál, L.

    2005-11-01

    The time process of transport on randomly evolving trees is investigated. By introducing the notions of living and dead nodes, a model of random tree evolution is constructed which describes the spreading in time of objects corresponding to nodes. It is assumed that at t=0 the tree consists of a single living node (root), from which the evolution may begin. At a certain time instant τ⩾0 , the root produces ν⩾0 living nodes connected by lines to the root which becomes dead at the moment of the offspring production. In the evolution process each of the new living nodes evolves further like a root independently of the others. By using the methods of the age-dependent branching processes we derive the joint distribution function of the numbers of living and dead nodes, and determine the correlation between these node numbers as a function of time. It is proved that the correlation function converges to √3/2 independently of the distributions of ν and τ when q1→1 and t→∞ . Also analyzed are the stochastic properties of the end nodes; and the correlation between the numbers of living and dead end nodes is shown to change its character suddenly at the very beginning of the evolution process. The survival probability of random trees is investigated and expressions are derived for this probability.
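    The living/dead bookkeeping in this model can be mimicked with a minimal event-driven sketch that tracks only the two node counts. The offspring distribution used below is an arbitrary stand-in, not one analyzed in the paper.

    ```python
    import random

    def evolve_tree(offspring_dist, steps, seed=0):
        """Simulate the living/dead node counts of a randomly evolving tree.
        Starts from a single living root; at each event one living node
        becomes dead and produces a random number of living offspring."""
        rng = random.Random(seed)
        living, dead = 1, 0
        history = [(living, dead)]
        for _ in range(steps):
            if living == 0:          # the tree has died out
                break
            living -= 1              # the reproducing node becomes dead
            dead += 1
            living += offspring_dist(rng)
            history.append((living, dead))
        return history

    # Illustrative offspring law: 0, 1 or 2 children with equal probability
    hist = evolve_tree(lambda rng: rng.choice([0, 1, 2]), steps=100)
    print(hist[-1])
    ```

    With a mean offspring number of exactly 1 the process is critical, so individual runs may die out early, which is consistent with the survival-probability analysis in the abstract.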

  5. Transport on randomly evolving trees.

    PubMed

    Pál, L

    2005-11-01

    The time process of transport on randomly evolving trees is investigated. By introducing the notions of living and dead nodes, a model of random tree evolution is constructed which describes the spreading in time of objects corresponding to nodes. It is assumed that at t=0 the tree consists of a single living node (root), from which the evolution may begin. At a certain time instant τ ⩾ 0, the root produces ν ⩾ 0 living nodes connected by lines to the root which becomes dead at the moment of the offspring production. In the evolution process each of the new living nodes evolves further like a root independently of the others. By using the methods of the age-dependent branching processes we derive the joint distribution function of the numbers of living and dead nodes, and determine the correlation between these node numbers as a function of time. It is proved that the correlation function converges to √3/2 independently of the distributions of ν and τ when q1 → 1 and t → ∞. Also analyzed are the stochastic properties of the end nodes; and the correlation between the numbers of living and dead end nodes is shown to change its character suddenly at the very beginning of the evolution process. The survival probability of random trees is investigated and expressions are derived for this probability.

  6. Rapidly evolving homing CRISPR barcodes.

    PubMed

    Kalhor, Reza; Mali, Prashant; Church, George M

    2017-02-01

    We present an approach for engineering evolving DNA barcodes in living cells. A homing guide RNA (hgRNA) scaffold directs the Cas9-hgRNA complex to the DNA locus of the hgRNA itself. We show that this homing CRISPR-Cas9 system acts as an expressed genetic barcode that diversifies its sequence and that the rate of diversification can be controlled in cultured cells. We further evaluate these barcodes in cell populations and show that they can be used to record lineage history and that the barcode RNA can be amplified in situ, a prerequisite for in situ sequencing. This integrated approach will have wide-ranging applications, such as in deep lineage tracing, cellular barcoding, molecular recording, dissecting cancer biology, and connectome mapping.

  7. The evolving Gleason grading system.

    PubMed

    Chen, Ni; Zhou, Qiao

    2016-02-01

    The Gleason grading system for prostate adenocarcinoma has evolved from its original scheme established in the 1960s-1970s, to a significantly modified system after two major consensus meetings conducted by the International Society of Urologic Pathology (ISUP) in 2005 and 2014, respectively. The Gleason grading system has been incorporated into the WHO classification of prostate cancer, the AJCC/UICC staging system, and the NCCN guidelines as one of the key factors in treatment decisions. Both pathologists and clinicians need to fully understand the principles and practice of this grading system. We here briefly review the historical aspects of the original scheme and the recent developments of the Gleason grading system, focusing on major changes over the years that resulted in the modern Gleason grading system, which has led to a new "Grade Group" system proposed by the 2014 ISUP consensus, and adopted by the 2016 WHO classification of tumours of the prostate.

  8. [The evolving of cardiac interventions].

    PubMed

    Billinger, Michael

    2014-12-01

    Treatment modalities for heart diseases have evolved considerably during the last 20 years. Coronary and valvular heart disease are increasingly treated by less invasive percutaneous catheter-based procedures instead of open-heart surgery. In addition, new cutting-edge interventions make it possible to cure heart diseases for which until recently only medical treatment options were available. Whilst many patients benefit from these innovative therapies, rapidly developing technologies potentially carry the risk of overtreatment. In order to select patients for the most appropriate treatment, intensive interdisciplinary teamwork between cardiologists and cardiac surgeons is a mandatory requirement. Additionally, knowledge transfer between cardiologists, their growing subspecialties, and practitioners should be encouraged. Finally, timely scientific evaluation of new therapies and their subsequent incorporation in guidelines remains crucial.

  9. Isotopic Analysis and Evolved Gases

    NASA Technical Reports Server (NTRS)

    Swindle, Timothy D.; Boynton, William V.; Chutjian, Ara; Hoffman, John H.; Jordan, Jim L.; Kargel, Jeffrey S.; McEntire, Richard W.; Nyquist, Larry

    1996-01-01

    Precise measurements of the chemical, elemental, and isotopic composition of planetary surface material and gases, and observed variations in these compositions, can contribute significantly to our knowledge of the source(s), ages, and evolution of solar system materials. The analyses discussed in this paper are mostly made by mass spectrometers or some other type of mass analyzer, and address three broad areas of interest: (1) atmospheric composition - isotopic, elemental, and molecular, (2) gases evolved from solids, and (3) solids. Current isotopic data on nine elements, mostly from in situ analysis, but also from meteorites and telescopic observations, are summarized. Potential instruments for isotopic analysis of the surfaces of the Moon, Mars, Venus, Mercury, and Pluto, along with the surfaces of asteroids, comets, and icy satellites, are discussed.

  10. Evolving networks by merging cliques

    NASA Astrophysics Data System (ADS)

    Takemoto, Kazuhiro; Oosawa, Chikoo

    2005-10-01

    We propose a model for evolving networks by merging building blocks represented as complete graphs, reminiscent of modules in biological systems or communities in sociology. The model shows power-law degree distributions, power-law clustering spectra, and high average clustering coefficients independent of network size. The analytical solutions indicate that a degree exponent is determined by the ratio of the number of merging nodes to that of all nodes in the blocks, demonstrating that the exponent is tunable, and are also applicable when the blocks are classical networks such as Erdős-Rényi or regular graphs. Our model becomes the same model as the Barabási-Albert model under a specific condition.
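    A rough sketch of such a clique-merging growth process, using plain adjacency sets, is shown below. The block size, number of merged nodes, and step count are illustrative choices, not the paper's parameters.

    ```python
    import random

    def merge_clique_network(n_steps, block_size=4, n_merge=1, seed=1):
        """Grow a network by repeatedly merging a complete graph (clique) of
        `block_size` nodes into it: `n_merge` of the block's nodes are
        identified with randomly chosen existing nodes, the rest enter as
        new nodes."""
        rng = random.Random(seed)
        adj = {i: set() for i in range(block_size)}
        for i in adj:                      # seed network: a single clique
            adj[i] = set(adj) - {i}
        for _ in range(n_steps):
            merged = rng.sample(sorted(adj), n_merge)
            new = [max(adj) + 1 + k for k in range(block_size - n_merge)]
            for v in new:
                adj[v] = set()
            members = merged + new
            for a in members:              # wire the block up as a clique
                for b in members:
                    if a != b:
                        adj[a].add(b)
        return adj

    net = merge_clique_network(200)
    degrees = sorted((len(nb) for nb in net.values()), reverse=True)
    print(len(net), "nodes, max degree", degrees[0])
    ```

    Nodes that happen to be chosen for merging many times accumulate degree, which is the intuition behind the tunable power-law exponent the authors derive.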

  11. Evolving phenotype of Marfan's syndrome

    PubMed Central

    Lipscomb, K.; Clayton-Smith, J.; Harris, R.

    1997-01-01

    Accepted 20 August 1996
 AIM—To examine evolution of the physical characteristics of Marfan's syndrome throughout childhood.
METHODS—40 children were ascertained during the development of a regional register for Marfan's syndrome. Evolution of the clinical characteristics was determined by repeat evaluation of 10 patients with sporadic Marfan's syndrome and 30 with a family history of the condition. DNA marker studies were used to facilitate diagnosis in those with the familial condition.
RESULTS—Musculoskeletal features predominated and evolved throughout childhood. Gene tracking enabled early diagnosis in children with familial Marfan's syndrome.
CONCLUSIONS—These observations may aid the clinical diagnosis of Marfan's syndrome in childhood, especially in those with the sporadic condition. Gene tracking has a role in the early diagnosis of familial Marfan's syndrome, allowing appropriate follow up and preventive care.

 PMID:9059160

  12. Reliability model generator

    NASA Technical Reports Server (NTRS)

    McMann, Catherine M. (Inventor); Cohen, Gerald C. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
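    The aggregation idea can be illustrated with a standard reliability-block-diagram computation: series blocks multiply reliabilities, while parallel (redundant) blocks combine failure probabilities. The component names and values below are hypothetical, and this is a generic sketch rather than the patented generator's actual algorithm.

    ```python
    from functools import reduce

    # Low-level component reliability models (hypothetical values)
    components = {"sensor": 0.99, "cpu": 0.995, "cpu_backup": 0.99, "actuator": 0.98}

    def series(rs):
        """All blocks must work: R = prod(R_i)."""
        return reduce(lambda acc, r: acc * r, rs, 1.0)

    def parallel(rs):
        """At least one block must work: R = 1 - prod(1 - R_i)."""
        return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), rs, 1.0)

    # Architecture description: sensor -> (cpu OR cpu_backup) -> actuator
    system_r = series([
        components["sensor"],
        parallel([components["cpu"], components["cpu_backup"]]),
        components["actuator"],
    ])
    print(round(system_r, 6))
    ```

    Because `series` and `parallel` compose, nesting them to mirror an architecture description automatically aggregates the low-level models into a system-level figure.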

  13. The Evolving Relationship between Researchers and Public Policy

    ERIC Educational Resources Information Center

    Henig, Jeffrey R.

    2008-01-01

    When it comes to the role of research in shaping public policy and debate, one might reasonably argue that this is the best of times. No Child Left Behind (NCLB), with its frequent mention of evidence-based decision making, has underscored the role that objective knowledge should play in a democratic society. The Institute of Education Sciences,…

  14. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  15. Systems Issues In Terrestrial Fiber Optic Link Reliability

    NASA Astrophysics Data System (ADS)

    Spencer, James L.; Lewin, Barry R.; Lee, T. Frank S.

    1990-01-01

    This paper reviews fiber optic system reliability issues from three different viewpoints - availability, operating environment, and evolving technologies. Present availability objectives for interoffice links and for the distribution loop must be re-examined for applications such as the Synchronous Optical Network (SONET), Fiber-to-the-Home (FTTH), and analog services. The hostile operating environments of emerging applications (such as FTTH) must be carefully considered in system design as well as reliability assessments. Finally, evolving technologies might require the development of new reliability testing strategies.

  16. Evolving toward Laughter in Learning

    ERIC Educational Resources Information Center

    Strean, William B.

    2008-01-01

    Lowman (1995) described the relationship between teacher and student and student engagement as the two most important ingredients in learning in higher education. Humour builds teacher-student connection (Berk, 1998) and engages students in the learning process. The bond between student and teacher is essential for learning, satisfaction, and…

  17. Reliability Generalization: "Lapsus Linguae"

    ERIC Educational Resources Information Center

    Smith, Julie M.

    2011-01-01

    This study examines the proposed Reliability Generalization (RG) method for studying reliability. RG employs the application of meta-analytic techniques similar to those used in validity generalization studies to examine reliability coefficients. This study explains why RG does not provide a proper research method for the study of reliability,…

  18. Increased longevity evolves from grandmothering.

    PubMed

    Kim, Peter S; Coxworth, James E; Hawkes, Kristen

    2012-12-22

    Postmenopausal longevity may have evolved in our lineage when ancestral grandmothers subsidized their daughters' fertility by provisioning grandchildren, but the verbal hypothesis has lacked mathematical support until now. Here, we present a formal simulation in which life spans similar to those of modern chimpanzees lengthen into the modern human range as a consequence of grandmother effects. Greater longevity raises the chance of living through the fertile years but is opposed by costs that differ for the sexes. Our grandmother assumptions are restrictive. Only females who are no longer fertile themselves are eligible, and female fertility extends to age 45 years. Initially, there are very few eligible grandmothers and effects are small. Grandmothers can support only one dependent at a time and do not care selectively for their daughters' offspring. They must take the oldest juveniles still relying on mothers; and infants under the age of 2 years are never eligible for subsidy. Our model includes no assumptions about brains, learning or pair bonds. Grandmother effects alone are sufficient to propel the doubling of life spans in less than sixty thousand years.

  19. Idiopathic pulmonary fibrosis: evolving concepts.

    PubMed

    Ryu, Jay H; Moua, Teng; Daniels, Craig E; Hartman, Thomas E; Yi, Eunhee S; Utz, James P; Limper, Andrew H

    2014-08-01

    Idiopathic pulmonary fibrosis (IPF) occurs predominantly in middle-aged and older adults and accounts for 20% to 30% of interstitial lung diseases. It is usually progressive, resulting in respiratory failure and death. Diagnostic criteria for IPF have evolved over the years, and IPF is currently defined as a disease characterized by the histopathologic pattern of usual interstitial pneumonia occurring in the absence of an identifiable cause of lung injury. Understanding of the pathogenesis of IPF has shifted away from chronic inflammation and toward dysregulated fibroproliferative repair in response to alveolar epithelial injury. Idiopathic pulmonary fibrosis is likely a heterogeneous disorder caused by various interactions between genetic components and environmental exposures. High-resolution computed tomography can be diagnostic in the presence of typical findings such as bilateral reticular opacities associated with traction bronchiectasis/bronchiolectasis in a predominantly basal and subpleural distribution, along with subpleural honeycombing. In other circumstances, a surgical lung biopsy may be needed. The clinical course of IPF can be unpredictable and may be punctuated by acute deteriorations (acute exacerbation). Although progress continues in unraveling the mechanisms of IPF, effective therapy has remained elusive. Thus, clinicians and patients need to reach informed decisions regarding management options including lung transplant. The findings in this review were based on a literature search of PubMed using the search terms idiopathic pulmonary fibrosis and usual interstitial pneumonia, limited to human studies in the English language published from January 1, 2000, through December 31, 2013, and supplemented by key references published before the year 2000.

  20. Increased longevity evolves from grandmothering

    PubMed Central

    Kim, Peter S.; Coxworth, James E.; Hawkes, Kristen

    2012-01-01

    Postmenopausal longevity may have evolved in our lineage when ancestral grandmothers subsidized their daughters' fertility by provisioning grandchildren, but the verbal hypothesis has lacked mathematical support until now. Here, we present a formal simulation in which life spans similar to those of modern chimpanzees lengthen into the modern human range as a consequence of grandmother effects. Greater longevity raises the chance of living through the fertile years but is opposed by costs that differ for the sexes. Our grandmother assumptions are restrictive. Only females who are no longer fertile themselves are eligible, and female fertility extends to age 45 years. Initially, there are very few eligible grandmothers and effects are small. Grandmothers can support only one dependent at a time and do not care selectively for their daughters' offspring. They must take the oldest juveniles still relying on mothers; and infants under the age of 2 years are never eligible for subsidy. Our model includes no assumptions about brains, learning or pair bonds. Grandmother effects alone are sufficient to propel the doubling of life spans in less than sixty thousand years. PMID:23097518

  1. Multicopy Suppression Underpins Metabolic Evolvability

    PubMed Central

    Patrick, Wayne M.; Quandt, Erik M.; Swartzlander, Dan B.; Matsumura, Ichiro

    2009-01-01

    Our understanding of the origins of new metabolic functions is based upon anecdotal genetic and biochemical evidence. Some auxotrophies can be suppressed by overexpressing substrate-ambiguous enzymes (i.e., those that catalyze the same chemical transformation on different substrates). Other enzymes exhibit weak but detectable catalytic promiscuity in vitro (i.e., they catalyze different transformations on similar substrates). Cells adapt to novel environments through the evolution of these secondary activities, but neither their chemical natures nor their frequencies of occurrence have been characterized en bloc. Here, we systematically identified multifunctional genes within the Escherichia coli genome. We screened 104 single-gene knockout strains and discovered that many (20%) of these auxotrophs were rescued by the overexpression of at least one noncognate E. coli gene. The deleted gene and its suppressor were generally unrelated, suggesting that promiscuity is a product of contingency. This genome-wide survey demonstrates that multifunctional genes are common and illustrates the mechanistic diversity by which their products enhance metabolic robustness and evolvability. PMID:17884825

  2. How do drumlin patterns evolve?

    NASA Astrophysics Data System (ADS)

    Ely, Jeremy; Clark, Chris; Spagnolo, Matteo; Hughes, Anna

    2016-04-01

    The flow of a geomorphic agent over a sediment bed creates patterns in the substrate composed of bedforms. Ice is no exception to this, organising soft sedimentary substrates into subglacial bedforms. As we are yet to fully observe their initiation and evolution beneath a contemporary ice mass, little is known about how patterns in subglacial bedforms develop. Here we study 36,222 drumlins, divided into 72 flowsets, left behind by the former British-Irish Ice sheet. These flowsets provide us with 'snapshots' of drumlin pattern development. The probability distribution functions of the size and shape metrics of drumlins within these flowsets were analysed to determine whether behaviour that is common of other patterned phenomena has occurred. Specifically, we ask whether drumlins i) are printed at a specific scale; ii) grow or shrink after they initiate; iii) stabilise at a specific size and shape; and iv) migrate. Our results indicate that drumlins initiate at a minimum size and spacing. After initiation, the log-normal distribution of drumlin size and shape metrics suggests that drumlins grow, or possibly shrink, as they develop. We find no evidence for stabilisation in drumlin length, supporting the idea of a subglacial bedform continuum. Drumlin migration is difficult to determine from the palaeo-record. However, there are some indications that a mixture of static and mobile drumlins occurs, which could potentially lead to collisions, cannibalisation and coarsening. Further images of modern drumlin fields evolving beneath ice are required to capture stages of drumlin pattern evolution.

  3. Magnetic fields around evolved stars

    NASA Astrophysics Data System (ADS)

    Leal-Ferreira, M.; Vlemmings, W.; Kemball, A.; Amiri, N.; Maercker, M.; Ramstedt, S.; Olofsson, G.

    2014-04-01

    A number of mechanisms, such as magnetic fields, (binary) companions and circumstellar disks have been suggested to be the cause of non-spherical PNe and in particular collimated outflows. This work investigates one of these mechanisms: the magnetic fields. While MHD simulations show that the fields can indeed be important, few observations of magnetic fields have been made so far. We used the VLBA to observe five evolved stars, with the goal of detecting the magnetic field by means of water maser polarization. The sample consists of four AGB stars (IK Tau, RT Vir, IRC+60370 and AP Lyn) and one pPN (OH231.8+4.2). In four of the five sources, several strong maser features were detected allowing us to measure the linear and/or circular polarization. Based on the circular polarization detections, we infer the strength of the component of the field along the line of sight to be between ~30 mG and ~330 mG in the water maser regions of these four sources. When extrapolated to the surface of the stars, the magnetic field strength would be between a few hundred mG and a few Gauss when assuming a toroidal field geometry and higher when assuming more complex magnetic fields. We conclude that the magnetic energy we derived in the water maser regions is higher than the thermal and kinetic energy, leading to the conclusion that, indeed, magnetic fields probably play an important role in shaping Planetary Nebulae.

  4. Circumstellar Crystalline Silicates: Evolved Stars

    NASA Astrophysics Data System (ADS)

    Tartar, Josh; Speck, A. K.

    2008-05-01

    One of the most exciting developments in astronomy in the last 15 years was the discovery of crystalline silicate stardust by the Short Wavelength Spectrometer (SWS) on board ISO; discovery of the crystalline grains was indeed one of the biggest surprises of the ISO mission. Initially discovered around AGB stars (evolved stars in the mass range 0.8 < M/M☉ < 8) at far-infrared (IR) wavelengths, crystalline silicates have since been seen in many astrophysical environments including young stellar objects (T Tauri and Herbig Ae/Be), comets and Ultra Luminous Infrared Galaxies. Low and intermediate mass stars (LIMS) comprise 95% of the contributors to the ISM, so study of the formation of crystalline silicates is critical to our understanding of the ISM, which is thought to be primarily amorphous (one would expect an almost exact match between the composition of AGB dust shells and the dust in the ISM). Whether the crystalline dust is merely undetectable or amorphized remains a mystery. The FORCAST instrument on SOFIA as well as the PACS instrument on Herschel will provide exciting observing opportunities for the further study of crystalline silicates.

  5. Multiscale modelling of evolving foams

    NASA Astrophysics Data System (ADS)

    Saye, R. I.; Sethian, J. A.

    2016-06-01

    We present a set of multi-scale interlinked algorithms to model the dynamics of evolving foams. These algorithms couple the key effects of macroscopic bubble rearrangement, thin film drainage, and membrane rupture. For each of the mechanisms, we construct consistent and accurate algorithms, and couple them together to work across the wide range of space and time scales that occur in foam dynamics. These algorithms include second order finite difference projection methods for computing incompressible fluid flow on the macroscale, second order finite element methods to solve thin film drainage equations in the lamellae and Plateau borders, multiphase Voronoi Implicit Interface Methods to track interconnected membrane boundaries and capture topological changes, and Lagrangian particle methods for conservative liquid redistribution during rearrangement and rupture. We derive a full set of numerical approximations that are coupled via interface jump conditions and flux boundary conditions, and show convergence for the individual mechanisms. We demonstrate our approach by computing a variety of foam dynamics, including coupled evolution of three-dimensional bubble clusters attached to an anchored membrane and collapse of a foam cluster.

  6. Self-regulating and self-evolving particle swarm optimizer

    NASA Astrophysics Data System (ADS)

    Wang, Hui-Min; Qiao, Zhao-Wei; Xia, Chang-Liang; Li, Liang-Yu

    2015-01-01

    In this article, a novel self-regulating and self-evolving particle swarm optimizer (SSPSO) is proposed. Learning from the idea of direction reversal, self-regulating behaviour is a modified position update rule for particles, according to which the algorithm improves the best position to accelerate convergence in situations where the traditional update rule does not work. Borrowing the idea of mutation from evolutionary computation, self-evolving behaviour acts on the current best particle in the swarm to prevent the algorithm from prematurely converging. The performance of SSPSO and four other improved particle swarm optimizers is numerically evaluated by unimodal, multimodal and rotated multimodal benchmark functions. The effectiveness of SSPSO in solving real-world problems is shown by the magnetic optimization of a Halbach-based permanent magnet machine. The results show that SSPSO has good convergence performance and high reliability, and is well matched to actual problems.
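    The two behaviours can be mimicked in a minimal particle swarm sketch: stagnating particles reverse their velocity (a stand-in for the self-regulating rule) and the global best is perturbed by Gaussian mutation (a stand-in for the self-evolving rule). These are illustrative rules assumed for the sketch, not the authors' exact update equations.

    ```python
    import random

    def pso(f, dim=2, n=20, iters=200, seed=3):
        """Minimal PSO with two SSPSO-inspired twists (illustrative only)."""
        rng = random.Random(seed)
        w, c1, c2 = 0.7, 1.5, 1.5
        x = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
        v = [[0.0] * dim for _ in range(n)]
        pbest = [xi[:] for xi in x]
        pval = [f(xi) for xi in x]
        g = min(range(n), key=lambda i: pval[i])
        gbest, gval = pbest[g][:], pval[g]
        for _ in range(iters):
            for i in range(n):
                for d in range(dim):
                    v[i][d] = (w * v[i][d]
                               + c1 * rng.random() * (pbest[i][d] - x[i][d])
                               + c2 * rng.random() * (gbest[d] - x[i][d]))
                    x[i][d] += v[i][d]
                fi = f(x[i])
                if fi < pval[i]:
                    pval[i], pbest[i] = fi, x[i][:]
                else:
                    v[i] = [-vd for vd in v[i]]   # "self-regulating": reverse
                if fi < gval:
                    gval, gbest = fi, x[i][:]
            mutant = [gd + rng.gauss(0, 0.1) for gd in gbest]  # "self-evolving"
            if f(mutant) < gval:
                gval, gbest = f(mutant), mutant
        return gbest, gval

    sphere = lambda p: sum(c * c for c in p)
    best, val = pso(sphere)
    print(val)
    ```

    Mutating only the current best particle, and accepting the mutant only if it improves, preserves monotone improvement of the global best while still injecting diversity.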

  7. Can There Be Reliability without "Reliability?"

    ERIC Educational Resources Information Center

    Mislevy, Robert J.

    2004-01-01

    An "Educational Researcher" article by Pamela Moss (1994) asks the title question, "Can there be validity without reliability?" Yes, she answers, if by reliability one means "consistency among independent observations intended as interchangeable" (Moss, 1994, p. 7), quantified by internal consistency indices such as…

  8. Evolving Communicative Complexity: Insight from Rodents and Beyond

    DTIC Science & Technology

    2012-01-01

    evolve is an active question in behavioural ecology. Sciurid rodents (ground squirrels, prairie dogs and marmots) provide an excellent model system for...the situation [46-48]. For example, many species of marmots alter the rate of their alarm calls with the urgency of the situation or the degree of risk...related genera, prairie dogs Cynomys spp., and marmots Marmota spp.), present an excellent model comparative system for studying the relationship

  9. Submillimeter observations of evolved stars

    SciTech Connect

    Sopka, R.J.; Hildebrand, R.; Jaffe, D.T.; Gatley, I.; Roellig, T.; Werner, M.; Jura, M.; Zuckerman, B.

    1985-07-01

    Broad-band submillimeter observations of the thermal emission from evolved stars have been obtained with the United Kingdom Infrared Telescope on Mauna Kea, Hawaii. These observations, at an effective wavelength of 400 μm, provide the most direct method for estimating the mass loss rate in dust from these stars and also help to define the long-wavelength thermal spectrum of the dust envelopes. The mass loss rates in dust that we derive range from 10⁻⁹ to 10⁻⁶ M☉ yr⁻¹ and are compared with mass loss rates derived from molecular line observations to estimate gas-to-dust ratios in outflowing envelopes. These values are found to be generally compatible with the interstellar gas-to-dust ratio of approximately 100 if submillimeter emissivities appropriate to amorphous grain structures are assumed. Our analysis of the spectrum of IRC+10216 confirms previous suggestions that the grain emissivity varies as λ^−1.2 rather than as λ^−2 for 10

  10. Evolving Galaxies in a Hierarchical Universe

    NASA Astrophysics Data System (ADS)

    Hahn, Changhoon

    2017-01-01

    Observations of galaxies using large surveys (SDSS, COSMOS, PRIMUS, etc.) have firmly established a global view of galaxy properties out to z~1. Galaxies are broadly divided into two classes: blue, typically disk-like star forming galaxies and red, typically elliptical quiescent ones with little star formation. The star formation rates (SFR) and stellar masses of star forming galaxies form an empirical relationship referred to as the "star formation main sequence". Over cosmic time, this sequence undergoes significant decline in SFR and causes the overall cosmic star formation decline. Simultaneously, physical processes cause significant fractions of star forming galaxies to "quench" their star formation. Hierarchical structure formation and cosmological models provide precise predictions of the evolution of the underlying dark matter, which serve as the foundation for these detailed trends and their evolution. Whatever trends we observe in galaxy properties can be interpreted within the narrative of the underlying dark matter and halo occupation framework. More importantly, through careful statistical treatment and precise measurements, this connection can be utilized to better constrain and understand key elements of galaxy evolution. In this spirit, for my dissertation I connect observations of evolving galaxy properties to the framework of the hierarchical Universe and use it to better understand physical processes responsible for the cessation of star formation in galaxies. For instance, through this approach, I constrain the quenching timescale of central galaxies and find that they are significantly longer than the quenching timescale of satellite galaxies.

  11. Evolving the ingredients for reciprocity and spite

    PubMed Central

    Hauser, Marc; McAuliffe, Katherine; Blake, Peter R.

    2009-01-01

    Darwin never provided a satisfactory account of altruism, but posed the problem beautifully in light of the logic of natural selection. Hamilton and Williams delivered the necessary satisfaction by appealing to kinship, and Trivers showed that kinship was not necessary as long as the originally altruistic act was conditionally reciprocated. From the late 1970s to the present, the kinship theories in particular have been supported by considerable empirical data and elaborated to explore a number of other social interactions such as cooperation, selfishness and punishment, giving us what is now a rich description of the nature of social relationships among organisms. There are, however, two forms of theoretically possible social interactions—reciprocity and spite—that appear absent or nearly so in non-human vertebrates, despite considerable research efforts on a wide diversity of species. We suggest that the rather weak comparative evidence for these interactions is predicted once we consider the requisite socioecological pressures and psychological mechanisms. That is, a consideration of ultimate demands and proximate prerequisites leads to the prediction that reciprocity and spite should be rare in non-human animals, and common in humans. In particular, reciprocity and spite evolved in humans because of adaptive demands on cooperation among unrelated individuals living in large groups, and the integrative capacities of inequity detection, future-oriented decision-making and inhibitory control. PMID:19805432

  12. Inherent randomness of evolving populations.

    PubMed

    Harper, Marc

    2014-03-01

    The entropy rates of the Wright-Fisher process, the Moran process, and generalizations are computed and used to compare these processes and their dependence on standard evolutionary parameters. Entropy rates are measures of the variation dependent on both short-run and long-run behaviors and allow the relationships between mutation, selection, and population size to be examined. Bounds for the entropy rate are given for the Moran process (independent of population size) and for the Wright-Fisher process (bounded for fixed population size). A generational Moran process is also presented for comparison to the Wright-Fisher Process. Results include analytic results and computational extensions.
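    The entropy rate the abstract computes is the standard Markov-chain quantity H = -Σ_i π_i Σ_j P_ij log P_ij, where π is the stationary distribution. A minimal sketch for a neutral two-type Moran chain with symmetric mutation (this particular parameterization is an illustrative assumption, not the paper's exact model):

```python
import numpy as np

def moran_matrix(N, u=0.01):
    """Transition matrix of a neutral two-type Moran process with
    symmetric mutation rate u; state i = number of type-A individuals."""
    P = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        # Offspring is type A with probability (i/N)(1-u) + ((N-i)/N)u
        pA = (i / N) * (1 - u) + ((N - i) / N) * u
        up = pA * (N - i) / N        # an A is born and a B dies
        down = (1 - pA) * i / N      # a B is born and an A dies
        if i < N:
            P[i, i + 1] = up
        if i > 0:
            P[i, i - 1] = down
        P[i, i] = 1.0 - up - down
    return P

def entropy_rate(P):
    """H = -sum_i pi_i sum_j P_ij log P_ij, with pi the stationary
    distribution (leading left eigenvector of P)."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi = np.abs(pi) / np.abs(pi).sum()
    logP = np.zeros_like(P)
    mask = P > 0
    logP[mask] = np.log(P[mask])
    return float(-(pi[:, None] * P * logP).sum())

h = entropy_rate(moran_matrix(20))
```

    With at most three transitions per state, the rate is bounded by log 3, independent of population size, consistent with the bound the abstract mentions for the Moran process.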

  13. Inherent randomness of evolving populations

    NASA Astrophysics Data System (ADS)

    Harper, Marc

    2014-03-01

    The entropy rates of the Wright-Fisher process, the Moran process, and generalizations are computed and used to compare these processes and their dependence on standard evolutionary parameters. Entropy rates are measures of the variation dependent on both short-run and long-run behaviors and allow the relationships between mutation, selection, and population size to be examined. Bounds for the entropy rate are given for the Moran process (independent of population size) and for the Wright-Fisher process (bounded for fixed population size). A generational Moran process is also presented for comparison to the Wright-Fisher Process. Results include analytic results and computational extensions.

  14. Assuring reliability program effectiveness.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1973-01-01

    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.

  15. Evolving evolutionary algorithms using linear genetic programming.

    PubMed

    Oltean, Mihai

    2005-01-01

    A new model for evolving Evolutionary Algorithms is proposed in this paper. The model is based on the Linear Genetic Programming (LGP) technique. Every LGP chromosome encodes an EA which is used for solving a particular problem. Several Evolutionary Algorithms for function optimization, the Traveling Salesman Problem and the Quadratic Assignment Problem are evolved using the considered model. Numerical experiments show that the evolved Evolutionary Algorithms perform similarly to, and sometimes even better than, standard approaches on several well-known benchmark problems.

  16. Acquiring Evolving Technologies: Web Services Standards

    DTIC Science & Technology

    2016-06-30

    Acquiring Evolving Technologies: Web Services Standards. Harry L. Levinson, Software Engineering Institute, Carnegie Mellon University, 2006.

  17. Water in evolved lunar rocks

    NASA Astrophysics Data System (ADS)

    Robinson, Katharine Lynn

    The Moon was thought to be completely anhydrous until indigenous water was found in lunar samples in 2008. This discovery raised two fundamental questions about the Moon: how much water is present in the bulk Moon, and is water uniformly distributed in the lunar interior? To address these questions, I studied a suite of lunar samples rich in a chemical component called KREEP (K, Rare Earth Elements, P), all of which are incompatible elements. Water behaves as an incompatible element in magmas, so KREEP-rich lunar samples are potentially water rich. In this dissertation, I present the results of a petrologic study of KREEP-rich lunar rocks and measurements of their water contents and deuterium (D) to hydrogen (H) ratios (D/H), and examine where these rocks fit into our understanding of water in the Moon as a whole. We performed a study of highly evolved, KREEP-rich lunar rocks called felsites and determined that they contain quartz. Using cooling rates derived from quartz-Ti thermometry, we show the felsites originated at a minimum pressure of ˜1 kbar, corresponding to a minimum depth of 20-25 km in the lunar crust. We calculate that at that pressure water would have been soluble in the melt, indicating that degassing of H2O from the felsite parental melts was likely minimal and hydrogen isotopes in intrusive rocks are likely unfractionated. We then measured D/H in apatite in KREEP-rich intrusive rocks to clarify the solar system source of the Moon's water. When viewed in the context of other lunar D/H studies, our results indicate there are at least three distinctive reservoirs in the lunar interior, including an ultra-low D reservoir that could represent a primitive component in the Moon's interior. Furthermore, our measurements of residual glass in a KREEP basalt show that the KREEP basaltic magmas contained 10 times less water than the source of the Apollo 17 pyroclastic glass beads, indicating that, though wetter than previously thought, the concentration of

  18. Power electronics reliability analysis.

    SciTech Connect

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
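    The fault-tree approach described above combines component failure probabilities through AND gates (all redundant inputs must fail) and OR gates (any series input failure propagates). A minimal sketch for a fictitious device, in the spirit of the report's example (the gate structure and probabilities are illustrative, not taken from the report):

```python
def and_gate(*p_fail):
    """All inputs must fail for the gate to fail (redundant parts)."""
    out = 1.0
    for p in p_fail:
        out *= p
    return out

def or_gate(*p_fail):
    """Any single input failure fails the gate (series parts)."""
    survive = 1.0
    for p in p_fail:
        survive *= 1.0 - p
    return 1.0 - survive

# Top event: the controller fails OR both redundant power stages fail.
p_system = or_gate(0.001, and_gate(0.01, 0.01))
```

    The resulting top-event probability is the system-level metric that a baseline model would then optimize against cost.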

  19. Human Reliability Program Overview

    SciTech Connect

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  20. Integrated avionics reliability

    NASA Technical Reports Server (NTRS)

    Alikiotis, Dimitri

    1988-01-01

    The integrated avionics reliability task is an effort to build credible reliability and/or performability models for multisensor integrated navigation and flight control. The research was initiated by the reliability analysis of a multisensor navigation system consisting of the Global Positioning System (GPS), the Long Range Navigation system (Loran C), and an inertial measurement unit (IMU). Markov reliability models were developed based on system failure rates and mission time.
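    Markov reliability models of this kind track how many redundant units remain and integrate the state probabilities over the mission time. A hedged sketch for a simple two-unit parallel system (the failure rate, mission time, and state structure are illustrative stand-ins, not the GPS/Loran-C/IMU models of the task):

```python
import numpy as np

def state_probs(Q, t, p0, steps=20000):
    """Forward-Euler integration of dp/dt = p Q for a CTMC generator Q
    (rows sum to zero); returns the state distribution at time t."""
    p = np.asarray(p0, dtype=float)
    dt = t / steps
    for _ in range(steps):
        p = p + dt * (p @ Q)
    return p

lam = 1e-3  # per-hour unit failure rate (assumed)
# States: 0 = both units up, 1 = one unit up, 2 = system failed (absorbing)
Q = np.array([[-2 * lam, 2 * lam, 0.0],
              [0.0,      -lam,    lam],
              [0.0,       0.0,    0.0]])
p = state_probs(Q, 1000.0, [1.0, 0.0, 0.0])
R = p[0] + p[1]  # reliability = probability of not being in the failed state
```

    For this simple chain the result can be checked against the closed form R(t) = 2e^(-λt) - e^(-2λt).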

  1. Reliable Design Versus Trust

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges of verifying a reliable design versus a trusted design?

  2. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  3. Reliability and structural integrity

    NASA Technical Reports Server (NTRS)

    Davidson, J. R.

    1976-01-01

    An analytic model is developed to calculate the reliability of a structure after it is inspected for cracks. The model accounts for the growth of undiscovered cracks between inspections and their effect upon the reliability after subsequent inspections. The model is based upon a differential form of Bayes' Theorem for reliability, and upon fracture mechanics for crack growth.
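    The Bayesian update at the heart of this model — the probability that a crack is present after an inspection reports none — can be sketched as follows, assuming for simplicity a known prior crack probability, a fixed probability of detection (POD), and no false calls (all three are simplifying assumptions; the actual model couples this with fracture-mechanics crack growth between inspections):

```python
def crack_prob_after_clean_inspection(p_crack, pod):
    """Bayes' theorem: P(crack present | inspection found nothing).
    pod = probability the inspection detects a crack that is present;
    a crack-free part is assumed to always pass (no false positives)."""
    p_missed = (1.0 - pod) * p_crack   # crack present, inspection missed it
    p_truly_clean = 1.0 - p_crack      # no crack to find
    return p_missed / (p_missed + p_truly_clean)

p_post = crack_prob_after_clean_inspection(0.05, 0.90)  # prior 5%, POD 90%
```

    A passed inspection thus lowers the crack probability below its prior, which is exactly why post-inspection reliability rises.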

  4. Reliability model generator specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Mccann, Catherine

    1990-01-01

    The Reliability Model Generator (RMG), a program which produces reliability models from block diagrams for ASSIST, the interface for the reliability evaluation tool SURE is described. An account is given of motivation for RMG and the implemented algorithms are discussed. The appendices contain the algorithms and two detailed traces of examples.

  5. Viking Lander reliability program

    NASA Technical Reports Server (NTRS)

    Pilny, M. J.

    1978-01-01

    The Viking Lander reliability program is reviewed with attention given to the development of the reliability program requirements, reliability program management, documents evaluation, failure modes evaluation, production variation control, failure reporting and correction, and the parts program. Lander hardware failures which have occurred during the mission are listed.

  6. Theory of reliable systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1975-01-01

    An attempt was made to refine the current notion of system reliability by identifying and investigating attributes of a system which are important to reliability considerations. Techniques which facilitate analysis of system reliability are included. Special attention was given to fault tolerance, diagnosability, and reconfigurability characteristics of systems.

  7. Predicting software reliability

    NASA Technical Reports Server (NTRS)

    Littlewood, B.

    1989-01-01

    A detailed look is given to software reliability techniques. A conceptual model of the failure process is examined, and some software reliability growth models are discussed. Problems for which no current solutions exist are addressed, emphasizing the very difficult problem of safety-critical systems for which the reliability requirements can be enormously demanding.
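    One widely used growth model of the kind surveyed here is the Goel-Okumoto NHPP model, in which the expected cumulative number of failures approaches a finite total as testing proceeds. A sketch (the parameter values below are illustrative, not drawn from the paper):

```python
import math

def goel_okumoto_mu(t, a, b):
    """Expected cumulative failures by test time t:
    mu(t) = a * (1 - exp(-b t)), with a = total expected faults
    and b = per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def failure_intensity(t, a, b):
    """lambda(t) = d(mu)/dt = a b exp(-b t): the intensity decays as
    faults are found and fixed, i.e. reliability growth."""
    return a * b * math.exp(-b * t)
```

    The decaying intensity is what makes extrapolation possible, and also why such models struggle with the enormously demanding requirements of safety-critical systems: the extrapolation must reach far beyond the observed failure data.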

  8. What Technology? Reflections on Evolving Services

    ERIC Educational Resources Information Center

    Collins, Sharon

    2009-01-01

    Each year, the members of the EDUCAUSE Evolving Technologies Committee identify and research the evolving technologies that are having--or are predicted to have--the most direct impact on higher education institutions. The committee members choose the relevant topics, write white papers, and present their findings at the EDUCAUSE annual…

  9. Directional Communication in Evolved Multiagent Teams

    DTIC Science & Technology

    2013-06-10

    How to best design a communication architecture is becoming increasingly important for evolving autonomous multiagent systems. In many tasks, each agent observes only a fraction of the observable state of the environment; in such tasks, communication facilitates sharing information among team members. Directional reception of signals, a design feature of communication

  10. Evolving Technologies: A View to Tomorrow

    ERIC Educational Resources Information Center

    Tamarkin, Molly; Rodrigo, Shelley

    2011-01-01

    Technology leaders must participate in strategy creation as well as operational delivery within higher education institutions. The future of higher education--the view to tomorrow--is irrevocably integrated and intertwined with evolving technologies. This article focuses on two specific evolving technologies: (1) alternative IT sourcing; and (2)…

  11. [Emergencies evolving from local anesthesia].

    PubMed

    Kaufman, E; Garfunkel, A; Findler, M; Elad, S; Zusman, S P; Malamed, S F; Galili, D

    2002-01-01

    Local anesthesia is without doubt the most frequently used drug in dentistry and in medicine. In spite of the safety record set by these drugs, there is evidence of adverse reactions ranging from 2.5%-11%. Most of the reactions originate from the autonomic system. A recent, well-planned study indicates that adverse reactions are highly correlated with the medical status of the patient: the higher the medical risk, the greater the chance of experiencing an adverse reaction. This study also found that adverse reactions are highly correlated with the concentration of adrenalin. Another recent study found a direct relationship between adverse reactions and the level of anxiety experienced by the patient and the dental procedure. Most of the reactions in this study occurred either immediately at injection time or within 2 hours following the injection. Since the beginning of the last century, vasoconstrictors have been added to local anesthesia solutions in order to reduce toxicity and prolong the activity of the LA. However, today it is commonly agreed that this addition to local anesthesia should not be administered to cardiac patients, especially those suffering from refractory dysrhythmias, angina pectoris, post myocardial infarction (6 months) and uncontrolled hypertension. Other contraindications to vasoconstrictors are endocrine disorders such as hyperthyroidism, hyperfunction of the adrenal medulla (pheochromocytoma) and uncontrolled diabetes mellitus. Cross-reactivity of local anesthetic solutions can occur with MAO inhibitors, non-specific beta adrenergic blockers, tricyclic antidepressants, phenothiazides and cocaine abusers. Noradrenaline added to local anesthetics as a vasoconstrictor has been described as a trigger for a great increase in blood pressure and has therefore been forbidden in many countries. This paper describes 4 cases of severe complications following injections of local anesthesia, of which three ended in fatality.

  12. Evolvable Cryogenics (ECRYO) Pressure Transducer Calibration Test

    NASA Technical Reports Server (NTRS)

    Diaz, Carlos E., Jr.

    2015-01-01

    This paper provides a summary of the findings of recent activities conducted by Marshall Space Flight Center's (MSFC) In-Space Propulsion Branch and MSFC's Metrology and Calibration Lab to assess the performance of current "state of the art" pressure transducers for use in long duration storage and transfer of cryogenic propellants. A brief historical narrative in this paper describes the Evolvable Cryogenics program and the relevance of these activities to the program. This paper also provides a review of three separate test activities performed throughout this effort, including: (1) the calibration of several pressure transducer designs in a liquid nitrogen cryogenic environmental chamber, (2) the calibration of a pressure transducer in a liquid helium Dewar, and (3) the calibration of several pressure transducers at temperatures ranging from 20 to 70 degrees Kelvin (K) using a "cryostat" environmental chamber. These three separate test activities allowed for study of the sensors along a temperature range from 4 to 300 K. The combined data show that both the slope and intercept of the sensor's calibration curve vary as a function of temperature. This behavior is contrary to the linearly decreasing relationship assumed at the start of this investigation. Consequently, the data demonstrate the need for lookup tables to change the slope and intercept used by any data acquisition system. This ultimately would allow for more accurate pressure measurements at the desired temperature range. This paper concludes with a review of a request for information (RFI) survey conducted amongst different suppliers to determine the availability of current "state of the art" flight-qualified pressure transducers. The survey identifies requirements that are most difficult for the suppliers to meet, most notably the capability to validate the sensor's performance at temperatures below 70 K.
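    The lookup-table correction the paper calls for — a slope and intercept that vary with temperature — could be applied roughly as follows. The table values here are invented placeholders, not the measured calibration data:

```python
import numpy as np

# Hypothetical calibration table: slope and intercept measured at a few
# reference temperatures (K); all values below are illustrative only.
temps      = np.array([  4.0,  20.0,  70.0, 300.0])
slopes     = np.array([1.020, 1.012, 1.005, 1.000])
intercepts = np.array([0.150, 0.080, 0.020, 0.000])

def pressure(raw, T):
    """Apply a linear calibration whose coefficients are interpolated
    from the lookup table at the sensor's operating temperature T."""
    m = np.interp(T, temps, slopes)
    b = np.interp(T, temps, intercepts)
    return m * raw + b

corrected = pressure(10.0, 45.0)
```

    A data acquisition system would hold such a table per sensor and select the interpolated coefficients at run time rather than assuming a single room-temperature fit.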

  13. Properties of artificial networks evolved to contend with natural spectra.

    PubMed

    Morgenstern, Yaniv; Rostami, Mohammad; Purves, Dale

    2014-07-22

    Understanding why spectra that are physically the same appear different in different contexts (color contrast), whereas spectra that are physically different appear similar (color constancy) presents a major challenge in vision research. Here, we show that the responses of biologically inspired neural networks evolved on the basis of accumulated experience with spectral stimuli automatically generate contrast and constancy. The results imply that these phenomena are signatures of a strategy that biological vision uses to circumvent the inverse optics problem as it pertains to light spectra, and that double-opponent neurons in early-level vision evolve to serve this purpose. This strategy provides a way of understanding the peculiar relationship between the objective world and subjective color experience, as well as rationalizing the relevant visual circuitry without invoking feature detection or image representation.

  14. Properties of artificial networks evolved to contend with natural spectra

    PubMed Central

    Morgenstern, Yaniv; Rostami, Mohammad; Purves, Dale

    2014-01-01

    Understanding why spectra that are physically the same appear different in different contexts (color contrast), whereas spectra that are physically different appear similar (color constancy) presents a major challenge in vision research. Here, we show that the responses of biologically inspired neural networks evolved on the basis of accumulated experience with spectral stimuli automatically generate contrast and constancy. The results imply that these phenomena are signatures of a strategy that biological vision uses to circumvent the inverse optics problem as it pertains to light spectra, and that double-opponent neurons in early-level vision evolve to serve this purpose. This strategy provides a way of understanding the peculiar relationship between the objective world and subjective color experience, as well as rationalizing the relevant visual circuitry without invoking feature detection or image representation. PMID:25024184

  15. Warning signals evolve to disengage Batesian mimics.

    PubMed

    Franks, Daniel W; Ruxton, Graeme D; Sherratt, Thomas N

    2009-01-01

    Prey that are unprofitable to attack are typically conspicuous in appearance. Conventional theory assumes that these warning signals have evolved in response to predator receiver biases. However, such biases might be a symptom rather than a cause of warning signals. We therefore examine an alternative theory: that conspicuousness evolves in unprofitable prey to avoid confusion with profitable prey. One might wonder why unprofitable prey do not find a cryptic means to be distinct from profitable prey, reducing both their risk of confusion with profitable prey and their rate of detection by predators. Here we present the first coevolutionary model to allow for Batesian mimicry and signals with different levels of detectability. We find that unprofitable prey do indeed evolve ways of distinguishing themselves using cryptic signals, particularly when appearance traits can evolve in multiple dimensions. However, conspicuous warning signals readily evolve in unprofitable prey when there are more ways to look different from the background than to match it. Moreover, the more unprofitable the prey species, the higher its evolved conspicuousness. Our results provide strong support for the argument that unprofitable species evolve conspicuous signals to avoid confusion with profitable prey and indicate that peak shift in conspicuousness-linked traits is a major factor in its establishment.

  16. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?" Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based in hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedule. Assessing and estimating reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.

  17. Power Quality and Reliability Project

    NASA Technical Reports Server (NTRS)

    Attia, John O.

    2001-01-01

    One area where universities and industry can link is in the area of power systems reliability and quality - key concepts in the commercial, industrial and public sector engineering environments. Prairie View A&M University (PVAMU) has established a collaborative relationship with the University of Texas at Arlington (UTA), NASA/Johnson Space Center (JSC), and EP&C Engineering and Technology Group (EP&C), a small disadvantaged business that specializes in power quality and engineering services. The primary goal of this collaboration is to facilitate the development and implementation of a Strategic Integrated Power Systems Reliability and Curriculum Enhancement Program. The objectives of the first phase of this work are: (a) to develop a course in power quality and reliability, (b) to use the campus of Prairie View A&M University as a laboratory for the study of systems reliability and quality issues, and (c) to provide students with NASA/EP&C shadowing and internship experience. In this work, a course titled "Reliability Analysis of Electrical Facilities" was developed and taught for two semesters. About thirty-seven students have benefited directly from this course. A laboratory accompanying the course was also developed. Four facilities at Prairie View A&M University were surveyed. Some of the tests performed were (i) earth-ground testing, (ii) measurement of voltage, amperage and harmonics on various panels in the buildings, (iii) checking whether the wire sizes were right for the load they were carrying, (iv) vibration tests to assess the status of the engines or chillers and water pumps, and (v) infrared testing to detect arcing or misfiring of electrical or mechanical systems.

  18. Cancer stem cells: constantly evolving and functionally heterogeneous therapeutic targets.

    PubMed

    Yang, Tao; Rycaj, Kiera; Liu, Zhong-Min; Tang, Dean G

    2014-06-01

    Elucidating the origin of and dynamic interrelationship between intratumoral cell subpopulations has clear clinical significance in helping to understand the cellular basis of treatment response, therapeutic resistance, and tumor relapse. Cancer stem cells (CSC), together with clonal evolution driven by genetic alterations, generate cancer cell heterogeneity commonly observed in clinical samples. The 2013 Shanghai International Symposium on Cancer Stem Cells brought together leaders in the field to highlight the most recent progress in phenotyping, characterizing, and targeting CSCs and in elucidating the relationship between the cell-of-origin of cancer and CSCs. Discussions from the symposium emphasize the urgent need in developing novel therapeutics to target the constantly evolving CSCs.

  19. Reliability, Recursion, and Risk.

    ERIC Educational Resources Information Center

    Henriksen, Melvin, Ed.; Wagon, Stan, Ed.

    1991-01-01

    The discrete mathematics topics of trees and computational complexity are implemented in a simple reliability program which illustrates the process advantages of the PASCAL programing language. The discussion focuses on the impact that reliability research can provide in assessment of the risks found in complex technological ventures. (Author/JJK)

  20. Monte Carlo Reliability Analysis.

    DTIC Science & Technology

    1987-10-01

    ... Lewis and Z. Tu, "Monte Carlo Reliability Modeling by Inhomogeneous Markov Processes," Reliab. Engr. 16, 277-296 (1986). (4) E. Cinlar, Introduction to Stochastic Processes, Prentice-Hall, Englewood Cliffs, NJ, 1975. (5) R. E. Barlow and F. Proschan, Statistical Theory of Reliability and Life...

  1. Hawaii electric system reliability.

    SciTech Connect

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
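    The cost-integration method the report explains — trading capacity carrying cost against customers' expected outage cost — can be sketched as a one-dimensional search. Every number below (load distribution, capacity cost, value of lost load) is an invented placeholder rather than Hawaii data:

```python
import numpy as np

rng = np.random.default_rng(0)
loads = rng.normal(1000.0, 100.0, 100_000)  # sampled hourly loads, MW (assumed)

cap_cost = 90.0   # $/MW-day carrying cost of installed capacity (assumed)
voll = 5000.0     # $/MWh value of lost load, i.e. outage "worth" (assumed)
hours = 24.0

def total_cost(capacity):
    """Capacity cost plus expected outage cost (VOLL x unserved energy)."""
    eue = np.maximum(loads - capacity, 0.0).mean() * hours  # MWh/day
    return cap_cost * capacity + voll * eue

caps = np.arange(1000.0, 1500.0, 10.0)
costs = [total_cost(c) for c in caps]
best = caps[int(np.argmin(costs))]  # optimal reserve = best minus mean load
```

    The optimum sits where the marginal cost of one more MW of capacity equals the marginal reduction in expected outage cost, which is the sense in which customers' reliability "worth" determines resource adequacy.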

  2. Hawaii Electric System Reliability

    SciTech Connect

    Loose, Verne William; Silva Monroy, Cesar Augusto

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  3. Chapter 9: Reliability

    SciTech Connect

    Algora, Carlos; Espinet-Gonzalez, Pilar; Vazquez, Manuel; Bosco, Nick; Miller, David; Kurtz, Sarah; Rubio, Francisca; McConnell,Robert

    2016-04-15

    This chapter describes the accumulated knowledge on CPV reliability, covering its fundamentals and qualification. It explains the reliability of solar cells, modules (including optics) and plants. The chapter discusses the relevant statistical distributions, namely exponential, normal and Weibull. On solar cell reliability, it covers the issues in accelerated aging tests of CPV solar cells, types of failure, and failures in real-time operation. The chapter explores accelerated life tests, namely qualitative life tests (mainly HALT) and quantitative accelerated life tests (QALT). It examines other well-proven PV cells and semiconductor devices that share similar semiconductor materials, manufacturing techniques or operating conditions, namely III-V space solar cells and light-emitting diodes (LEDs). It addresses each of the identified reliability issues and presents the current state-of-the-art knowledge for their testing and evaluation. Finally, the chapter summarizes the CPV qualification and reliability standards.
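
    The exponential and Weibull distributions named in this abstract are the standard building blocks of reliability modelling. As a brief illustrative sketch (not drawn from the chapter itself; the numbers are hypothetical):

    ```python
    import math

    def weibull_reliability(t, beta, eta):
        """Survival probability R(t) for a Weibull model with shape beta
        and characteristic life eta; beta = 1 is the constant-hazard
        (exponential) special case."""
        return math.exp(-((t / eta) ** beta))

    def exponential_reliability(t, lam):
        """R(t) = exp(-lambda * t) for a constant failure rate lambda."""
        return math.exp(-lam * t)

    # With beta = 1 and eta = 1/lambda, the two models coincide:
    t, lam = 5000.0, 1e-4
    diff = weibull_reliability(t, 1.0, 1.0 / lam) - exponential_reliability(t, lam)
    ```

    A Weibull shape parameter below 1 models infant mortality, above 1 models wear-out; the exponential case sits between the two.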

  4. The transcriptomics of an experimentally evolved plant-virus interaction

    PubMed Central

    Hillung, Julia; García-García, Francisco; Dopazo, Joaquín; Cuevas, José M.; Elena, Santiago F.

    2016-01-01

    Models of plant-virus interaction assume that the ability of a virus to infect a host genotype depends on the matching between virulence and resistance genes. Recently, we evolved tobacco etch potyvirus (TEV) lineages on different ecotypes of Arabidopsis thaliana, and found that some ecotypes selected for specialist viruses whereas others selected for generalists. Here we sought to evaluate the transcriptomic basis of such relationships. We have characterized the transcriptomic responses of five ecotypes infected with the ancestral and evolved viruses. Genes and functional categories differentially expressed by plants infected with local TEV isolates were identified, showing heterogeneous responses among ecotypes, although significant parallelism existed among lineages evolved in the same ecotype. Although genes involved in immune responses were altered upon infection, other functional groups were also pervasively over-represented, suggesting that plant resistance genes were not the only drivers of viral adaptation. Finally, the transcriptomic consequences of infection with the generalist and specialist lineages were compared. Whilst the generalist induced very similar perturbations in the transcriptomes of the different ecotypes, the perturbations induced by the specialist were divergent. Plant defense mechanisms were activated when the infecting virus was the specialist but down-regulated when it was the generalist. PMID:27113435

  5. The transcriptomics of an experimentally evolved plant-virus interaction.

    PubMed

    Hillung, Julia; García-García, Francisco; Dopazo, Joaquín; Cuevas, José M; Elena, Santiago F

    2016-04-26

    Models of plant-virus interaction assume that the ability of a virus to infect a host genotype depends on the matching between virulence and resistance genes. Recently, we evolved tobacco etch potyvirus (TEV) lineages on different ecotypes of Arabidopsis thaliana, and found that some ecotypes selected for specialist viruses whereas others selected for generalists. Here we sought to evaluate the transcriptomic basis of such relationships. We have characterized the transcriptomic responses of five ecotypes infected with the ancestral and evolved viruses. Genes and functional categories differentially expressed by plants infected with local TEV isolates were identified, showing heterogeneous responses among ecotypes, although significant parallelism existed among lineages evolved in the same ecotype. Although genes involved in immune responses were altered upon infection, other functional groups were also pervasively over-represented, suggesting that plant resistance genes were not the only drivers of viral adaptation. Finally, the transcriptomic consequences of infection with the generalist and specialist lineages were compared. Whilst the generalist induced very similar perturbations in the transcriptomes of the different ecotypes, the perturbations induced by the specialist were divergent. Plant defense mechanisms were activated when the infecting virus was the specialist but down-regulated when it was the generalist.

  6. Zygomorphy evolved from disymmetry in Fumarioideae (Papaveraceae, Ranunculales): new evidence from an expanded molecular phylogenetic framework

    PubMed Central

    Sauquet, Hervé; Carrive, Laetitia; Poullain, Noëlie; Sannier, Julie; Damerval, Catherine; Nadot, Sophie

    2015-01-01

    Background and Aims Fumarioideae (20 genera, 593 species) is a clade of Papaveraceae (Ranunculales) characterized by flowers that are either disymmetric (i.e. two perpendicular planes of bilateral symmetry) or zygomorphic (i.e. one plane of bilateral symmetry). In contrast, the other subfamily of Papaveraceae, Papaveroideae (23 genera, 230 species), has actinomorphic flowers (i.e. more than two planes of symmetry). Understanding of the evolution of floral symmetry in this clade has so far been limited by the lack of a reliable phylogenetic framework. Pteridophyllum (one species) shares similarities with Fumarioideae but has actinomorphic flowers, and the relationships among Pteridophyllum, Papaveroideae and Fumarioideae have remained unclear. This study reassesses the evolution of floral symmetry in Papaveraceae based on new molecular phylogenetic analyses of the family. Methods Maximum likelihood, Bayesian and maximum parsimony phylogenetic analyses of Papaveraceae were conducted using six plastid markers and one nuclear marker, sampling Pteridophyllum, 18 (90 %) genera and 73 species of Fumarioideae, 11 (48 %) genera and 11 species of Papaveroideae, and a wide selection of outgroup taxa. Floral characters recorded from the literature were then optimized onto phylogenetic trees to reconstruct ancestral states using parsimony, maximum likelihood and reversible-jump Bayesian approaches. Key Results Pteridophyllum is not nested in Fumarioideae. Fumarioideae are monophyletic and Hypecoum (18 species) is the sister group of the remaining genera. Relationships within the core Fumarioideae are well resolved and supported. Dactylicapnos and all zygomorphic genera form a well-supported clade nested among disymmetric taxa. Conclusions Disymmetry of the corolla is a synapomorphy of Fumarioideae and is strongly correlated with changes in the androecium and differentiation of middle and inner tepal shape (basal spurs on middle tepals). Zygomorphy subsequently evolved from

  7. Access to space: The Space Shuttle's evolving role

    NASA Astrophysics Data System (ADS)

    Duttry, Steven R.

    1993-04-01

    Access to space is of extreme importance to our nation and the world. Military, civil, and commercial space activities all depend on reliable space transportation systems for access to space at a reasonable cost. The Space Transportation System or Space Shuttle was originally planned to provide transportation to and from a manned Earth-orbiting space station. To justify the development and operations costs, the Space Shuttle took on other space transportation requirements to include DoD, civil, and a growing commercial launch market. This research paper or case study examines the evolving role of the Space Shuttle as our nation's means of accessing space. The case study includes a review of the events leading to the development of the Space Shuttle, identifies some of the key players in the decision-making process, examines alternatives developed to mitigate the risks associated with sole reliance on the Space Shuttle, and highlights the impacts of this national space policy following the Challenger accident.

  8. Quantifying evolvability in small biological networks

    SciTech Connect

    Nemenman, Ilya; Mugler, Andrew; Ziv, Etay; Wiggins, Chris H

    2008-01-01

    The authors introduce a quantitative measure of the capacity of a small biological network to evolve. The measure is applied to a stochastic description of the experimental setup of Guet et al. (Science 2002, 296, pp. 1466), treating chemical inducers as functional inputs to biochemical networks and the expression of a reporter gene as the functional output. The authors take an information-theoretic approach, allowing the system to set parameters that optimise signal processing ability, thus enumerating each network's highest-fidelity functions. All networks studied are highly evolvable by the measure, meaning that change in function has little dependence on change in parameters. Moreover, each network's functions are connected by paths in the parameter space along which information is not significantly lowered, meaning a network may continuously change its functionality without completely losing it along the way. This property further underscores the evolvability of the networks.

  9. Co-evolving prisoner's dilemma: Performance indicators and analytic approaches

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Choi, C. W.; Li, Y. S.; Xu, C.; Hui, P. M.

    2017-02-01

    Understanding the intrinsic relation between the dynamical processes in a co-evolving network and the necessary ingredients in formulating a reliable theory is an important question and a challenging task. Using two slightly different definitions of performance indicator in the context of a co-evolving prisoner's dilemma game, it is shown that very different cooperative levels result and theories of different complexity are required to understand the key features. When the payoff per opponent is used as the indicator (Case A), non-cooperative strategy has an edge and dominates in a large part of the parameter space formed by the cutting-and-rewiring probability and the strategy imitation probability. When the payoff from all opponents is used (Case B), cooperative strategy has an edge and dominates the parameter space. Two distinct phases, one homogeneous and dynamical and another inhomogeneous and static, emerge and the phase boundary in the parameter space is studied in detail. A simple theory assuming an average competing environment for cooperative agents and another for non-cooperative agents is shown to perform well in Case A. The same theory, however, fails badly for Case B. It is necessary to include more spatial correlation into a theory for Case B. We show that the local configuration approximation, which takes into account the different competing environments for agents with different strategies and degrees, is needed to give reliable results for Case B. The results illustrate that formulating a proper theory requires both a conceptual understanding of the effects of the adaptive processes in the problem and a delicate balance between simplicity and accuracy.

  10. JavaGenes: Evolving Graphs with Crossover

    NASA Technical Reports Server (NTRS)

    Globus, Al; Atsatt, Sean; Lawton, John; Wipke, Todd

    2000-01-01

    Genetic algorithms usually use string or tree representations. We have developed a novel crossover operator for a directed and undirected graph representation, and used this operator to evolve molecules and circuits. Unlike strings or trees, a single point in the representation cannot divide every possible graph into two parts, because graphs may contain cycles. Thus, the crossover operator is non-trivial. A steady-state, tournament selection genetic algorithm code (JavaGenes) was written to implement and test the graph crossover operator. All runs were executed by cycle-scavenging on networked workstations using the Condor batch processing system. The JavaGenes code has evolved pharmaceutical drug molecules and simple digital circuits. Results to date suggest that JavaGenes can evolve moderate sized drug molecules and very small circuits in reasonable time. The algorithm has greater difficulty with somewhat larger circuits, suggesting that directed graphs (circuits) are more difficult to evolve than undirected graphs (molecules), although necessary differences in the crossover operator may also explain the results. In principle, JavaGenes should be able to evolve other graph-representable systems, such as transportation networks, metabolic pathways, and computer networks. However, large graphs evolve significantly slower than smaller graphs, presumably because the space-of-all-graphs explodes combinatorially with graph size. Since the representation strongly affects genetic algorithm performance, adding graphs to the evolutionary programmer's bag-of-tricks should be beneficial. Also, since graph evolution operates directly on the phenotype, the genotype-phenotype translation step, common in genetic algorithm work, is eliminated.

  11. Structural Analysis of an Evolved Transketolase Reveals Divergent Binding Modes

    PubMed Central

    Affaticati, Pierre E.; Dai, Shao-Bo; Payongsri, Panwajee; Hailes, Helen C.; Tittmann, Kai; Dalby, Paul A.

    2016-01-01

    The S385Y/D469T/R520Q variant of E. coli transketolase was evolved previously with three successive smart libraries, each guided by different structural, bioinformatical or computational methods. Substrate-walking progressively shifted the target acceptor substrate from phosphorylated aldehydes, towards a non-phosphorylated polar aldehyde, a non-polar aliphatic aldehyde, and finally a non-polar aromatic aldehyde. Kinetic evaluations on three benzaldehyde derivatives, suggested that their active-site binding was differentially sensitive to the S385Y mutation. Docking into mutants generated in silico from the wild-type crystal structure was not wholly satisfactory, as errors accumulated with successive mutations, and hampered further smart-library designs. Here we report the crystal structure of the S385Y/D469T/R520Q variant, and molecular docking of three substrates. This now supports our original hypothesis that directed-evolution had generated an evolutionary intermediate with divergent binding modes for the three aromatic aldehydes tested. The new active site contained two binding pockets supporting π-π stacking interactions, sterically separated by the D469T mutation. While 3-formylbenzoic acid (3-FBA) preferred one pocket, and 4-FBA the other, the less well-accepted substrate 3-hydroxybenzaldehyde (3-HBA) was caught in limbo with equal preference for the two pockets. This work highlights the value of obtaining crystal structures of evolved enzyme variants, for continued and reliable use of smart library strategies. PMID:27767080

  12. An Evolvable Multi-Agent Approach to Space Operations Engineering

    NASA Technical Reports Server (NTRS)

    Mandutianu, Sanda; Stoica, Adrian

    1999-01-01

    A complex system of spacecraft and ground tracking stations, as well as a constellation of satellites or spacecraft, has to be able to reliably withstand sudden environment changes, resource fluctuations, dynamic resource configuration, limited communication bandwidth, etc., while maintaining the consistency of the system as a whole. It is not known in advance when a change in the environment might occur or when a particular exchange will happen. A higher degree of sophistication for the communication mechanisms between different parts of the system is required. The actual behavior has to be determined while the system is performing and the course of action can be decided at the individual level. Under such circumstances, the solution will highly benefit from increased on-board and on the ground adaptability and autonomy. An evolvable architecture based on intelligent agents that communicate and cooperate with each other can offer advantages in this direction. This paper presents an architecture of an evolvable agent-based system (software and software/hardware hybrids) as well as some plans for further implementation.

  13. Reliability Analysis Model

    NASA Technical Reports Server (NTRS)

    1970-01-01

    RAM program determines probability of success for one or more given objectives in any complex system. Program includes failure mode and effects, criticality and reliability analyses, and some aspects of operations, safety, flight technology, systems design engineering, and configuration analyses.
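
    The probability-of-success calculation such a tool performs can be sketched with textbook series/parallel reliability logic (an illustrative sketch under assumed independence; RAM's actual implementation is not described in the source):

    ```python
    def series(reliabilities):
        """A series system works only if every element works."""
        p = 1.0
        for r in reliabilities:
            p *= r
        return p

    def parallel(reliabilities):
        """A parallel (redundant) system fails only if every element fails."""
        q = 1.0
        for r in reliabilities:
            q *= (1.0 - r)
        return 1.0 - q

    # Hypothetical example: two redundant pumps (0.9 each) feeding
    # one controller (0.99), all failing independently.
    system = series([parallel([0.9, 0.9]), 0.99])
    # parallel pumps: 1 - 0.1 * 0.1 = 0.99; whole system: 0.99 * 0.99 = 0.9801
    ```

    Real tools layer failure-mode, criticality and configuration analyses on top of this basic block algebra.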

  14. A Stefan problem on an evolving surface

    PubMed Central

    Alphonse, Amal; Elliott, Charles M.

    2015-01-01

    We formulate a Stefan problem on an evolving hypersurface and study the well posedness of weak solutions given L1 data. To do this, we first develop function spaces and results to handle equations on evolving surfaces in order to give a natural treatment of the problem. Then, we consider the existence of solutions for data; this is done by regularization of the nonlinearity. The regularized problem is solved by a fixed point theorem and then uniform estimates are obtained in order to pass to the limit. By using a duality method, we show continuous dependence, which allows us to extend the results to L1 data. PMID:26261364

  15. The rating reliability calculator

    PubMed Central

    Solomon, David J

    2004-01-01

    Background Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer that the program will upload to the server for calculating the reliability and other statistics describing the ratings. Results When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion This simple web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to provide complete rating data. I would welcome other researchers revising and enhancing the program. PMID:15117416
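
    The Spearman-Brown prophecy step mentioned in the abstract is a standard psychometric formula; a minimal sketch (in Python rather than the utility's PHP, with hypothetical numbers):

    ```python
    def spearman_brown(single_rating_reliability, k):
        """Predicted reliability of the average of k ratings,
        given the reliability r of a single rating:
        r_k = k * r / (1 + (k - 1) * r)."""
        r = single_rating_reliability
        return k * r / (1.0 + (k - 1.0) * r)

    # If one judge's ratings have reliability 0.5, averaging 3 judges gives
    # 3 * 0.5 / (1 + 2 * 0.5) = 0.75.
    averaged = spearman_brown(0.5, 3)
    ```

    The formula shows diminishing returns: each additional judge raises the averaged reliability by less than the previous one.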

  16. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  17. Metrology automation reliability

    NASA Astrophysics Data System (ADS)

    Chain, Elizabeth E.

    1996-09-01

    At Motorola's MOS-12 facility automated measurements on 200-mm diameter wafers proceed in a hands-off 'load-and-go' mode requiring only wafer loading, measurement recipe loading, and a 'run' command for processing. Upon completion of all sample measurements, the data is uploaded to the factory's data collection software system via a SECS II interface, eliminating the requirement of manual data entry. The scope of in-line measurement automation has been extended to the entire metrology scheme from job file generation to measurement and data collection. Data analysis and comparison to part specification limits is also carried out automatically. Successful integration of automated metrology into the factory measurement system requires that automated functions, such as autofocus and pattern recognition algorithms, display a high degree of reliability. In the 24-hour factory reliability data can be collected automatically on every part measured. This reliability data is then uploaded to the factory data collection software system at the same time as the measurement data. Analysis of the metrology reliability data permits improvements to be made as needed, and provides an accurate accounting of automation reliability. This reliability data has so far been collected for the CD-SEM (critical dimension scanning electron microscope) metrology tool, and examples are presented. This analysis method can be applied to such automated in-line measurements as CD, overlay, particle and film thickness measurements.

  18. Surveying The Digital Landscape: Evolving Technologies 2004. The EDUCAUSE Evolving Technologies Committee

    ERIC Educational Resources Information Center

    EDUCAUSE Review, 2004

    2004-01-01

    Each year, the members of the EDUCAUSE Evolving Technologies Committee identify and research the evolving technologies that are having the most direct impact on higher education institutions. The committee members choose the relevant topics, write white papers, and present their findings at the EDUCAUSE annual conference. This year, under the…

  19. Mission Reliability Estimation for Repairable Robot Teams

    NASA Technical Reports Server (NTRS)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the
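
    The redundancy-versus-spares trade-off described above can be illustrated with a simple k-out-of-n reliability calculation (illustrative numbers and independence assumptions only; this is not the paper's actual model):

    ```python
    from math import comb

    def k_out_of_n(n, k, r):
        """Probability that at least k of n identical, independent
        units, each with reliability r, survive the mission."""
        return sum(comb(n, i) * r**i * (1 - r)**(n - i)
                   for i in range(k, n + 1))

    # A mission needs 2 working robots. Compare 2 high-reliability robots
    # against 3 cheaper, lower-reliability robots (one effectively a spare):
    no_spare = k_out_of_n(2, 2, 0.95)    # 0.95**2           = 0.9025
    with_spare = k_out_of_n(3, 2, 0.90)  # 0.729 + 3*0.81*0.1 = 0.972
    ```

    Under these assumed numbers the cheaper, redundant team wins, echoing the paper's finding that lower-reliability components plus spares can reduce mission cost while maintaining reliability.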

  20. Scaled CMOS Technology Reliability Users Guide

    NASA Technical Reports Server (NTRS)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect to the high reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is
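
    The FIT unit used in this abstract denotes failures per 10^9 device-hours; a one-line conversion with hypothetical numbers makes the scale concrete:

    ```python
    def failures_to_fit(failures, device_hours):
        """Failure rate in FIT: failures per 1e9 cumulative device-hours."""
        return failures * 1e9 / device_hours

    # Hypothetical: 3 failures observed across 1000 devices run for
    # 10,000 hours each (1e7 cumulative device-hours):
    rate = failures_to_fit(3, 1000 * 10_000)  # 300 FIT
    ```

    Normalizing per Gb or per cm2, as the paper does, lets failure rates be compared fairly across memory generations of different densities.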

  1. Thermal and Evolved-Gas Analyzer Illustration

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This is a computer-aided drawing of the Thermal and Evolved-Gas Analyzer, or TEGA, on NASA's Phoenix Mars Lander.

    The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  2. Evolving Neural Networks for Nonlinear Control.

    DTIC Science & Technology

    1996-09-30

    An approach to creating Amorphous Recurrent Neural Networks (ARNN) using Genetic Algorithms (GA) called 2pGA has been developed and shown to be...effective in evolving neural networks for the control and stabilization of both linear and nonlinear plants, the optimal control for a nonlinear regulator

  3. The Evolving Leadership Path of Visual Analytics

    SciTech Connect

    Kluse, Michael; Peurrung, Anthony J.; Gracio, Deborah K.

    2012-01-02

    This is a requested book chapter for an internationally authored book on visual analytics and related fields, coordinated by a UK university and to be published by Springer in 2012. This chapter is an overview of the leadership strategies that PNNL's Jim Thomas and other stakeholders used to establish visual analytics as a field, and how those strategies may evolve in the future.

  4. Toward an Evolved Concept of Landrace.

    PubMed

    Casañas, Francesc; Simó, Joan; Casals, Joan; Prohens, Jaime

    2017-01-01

    The term "landrace" has generally been defined as a cultivated, genetically heterogeneous variety that has evolved in a certain ecogeographical area and is therefore adapted to the edaphic and climatic conditions and to its traditional management and uses. Despite being considered by many to be inalterable, landraces have been and are in a constant state of evolution as a result of natural and artificial selection. Many landraces have disappeared from cultivation but are preserved in gene banks. Using modern selection and breeding technology tools to shape these preserved landraces together with the ones that are still cultivated is a further step in their evolution in order to preserve their agricultural significance. Adapting historical landraces to present agricultural conditions using cutting-edge breeding technology represents a challenging opportunity to use them in a modern sustainable agriculture, as an immediate return on the investment is highly unlikely. Consequently, we propose a more inclusive definition of landraces, namely that they consist of cultivated varieties that have evolved and may continue evolving, using conventional or modern breeding techniques, in traditional or new agricultural environments within a defined ecogeographical area and under the influence of the local human culture. This includes adaptation of landraces to new management systems and the unconscious or conscious selection made by farmers or breeders using available technology. In this respect, a mixed selection system might be established in which farmers and other social agents develop evolved landraces from the variability generated by public entities.

  5. Did Language Evolve Like the Vertebrate Eye?

    ERIC Educational Resources Information Center

    Botha, Rudolf P.

    2002-01-01

    Offers a critical appraisal of the way in which the idea that human language or some of its features evolved like the vertebrate eye by natural selection is articulated in Pinker and Bloom's (1990) selectionist account of language evolution. Argues that this account is less than insightful because it fails to draw some of the conceptual…

  6. Apollo 16 Evolved Lithology Sodic Ferrogabbro

    NASA Technical Reports Server (NTRS)

    Zeigler, Ryan; Jolliff, B. L.; Korotev, R. L.

    2014-01-01

    Evolved lunar igneous lithologies, often referred to as the alkali suite, are a minor but important component of the lunar crust. These evolved samples are incompatible-element rich samples, and are, not surprisingly, most common in the Apollo sites in (or near) the incompatible-element rich region of the Moon known as the Procellarum KREEP Terrane (PKT). The most commonly occurring lithologies are granites (A12, A14, A15, A17), monzogabbro (A14, A15), alkali anorthosites (A12, A14), and KREEP basalts (A15, A17). The Feldspathic Highlands Terrane is not entirely devoid of evolved lithologies, and rare clasts of alkali gabbronorite and sodic ferrogabbro (SFG) have been identified in Apollo 16 station 11 breccias 67915 and 67016. Curiously, nearly all pristine evolved lithologies have been found as small clasts or soil particles, exceptions being KREEP basalts 15382/6 and granitic sample 12013 (which is itself a breccia). Here we reexamine the petrography and geochemistry of two SFG-like particles found in a survey of Apollo 16 2-4 mm particles from the Cayley Plains 62283,7-15 and 62243,10-3 (hereafter 7-15 and 10-3 respectively). We will compare these to previously reported SFG samples, including recent analyses on the type specimen of SFG from lunar breccia 67915.

  7. Toward an Evolved Concept of Landrace

    PubMed Central

    Casañas, Francesc; Simó, Joan; Casals, Joan; Prohens, Jaime

    2017-01-01

    The term “landrace” has generally been defined as a cultivated, genetically heterogeneous variety that has evolved in a certain ecogeographical area and is therefore adapted to the edaphic and climatic conditions and to its traditional management and uses. Despite being considered by many to be inalterable, landraces have been and are in a constant state of evolution as a result of natural and artificial selection. Many landraces have disappeared from cultivation but are preserved in gene banks. Using modern selection and breeding technology tools to shape these preserved landraces together with the ones that are still cultivated is a further step in their evolution in order to preserve their agricultural significance. Adapting historical landraces to present agricultural conditions using cutting-edge breeding technology represents a challenging opportunity to use them in a modern sustainable agriculture, as an immediate return on the investment is highly unlikely. Consequently, we propose a more inclusive definition of landraces, namely that they consist of cultivated varieties that have evolved and may continue evolving, using conventional or modern breeding techniques, in traditional or new agricultural environments within a defined ecogeographical area and under the influence of the local human culture. This includes adaptation of landraces to new management systems and the unconscious or conscious selection made by farmers or breeders using available technology. In this respect, a mixed selection system might be established in which farmers and other social agents develop evolved landraces from the variability generated by public entities. PMID:28228769

  8. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  9. Photovoltaic module reliability workshop

    SciTech Connect

    Mrig, L.

    1990-01-01

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of Workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986-1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, substantial research and testing are still required to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  10. Photovoltaic module reliability workshop

    NASA Astrophysics Data System (ADS)

    Mrig, L.

    The papers and presentations compiled in this volume form the Proceedings of the fourth in a series of Workshops sponsored by the Solar Energy Research Institute (SERI/DOE) under the general theme of photovoltaic module reliability during the period 1986 to 1990. The reliability of photovoltaic (PV) modules/systems is exceedingly important, along with the initial cost and efficiency of modules, if PV technology is to make a major impact in the power generation market and compete with conventional electricity-producing technologies. The reliability of photovoltaic modules has progressed significantly in the last few years, as evidenced by warranties available on commercial modules of as long as 12 years. However, substantial research and testing are still required to improve module field reliability to levels of 30 years or more. Several small groups of researchers are involved in this research, development, and monitoring activity around the world. In the U.S., PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in this field were brought together under SERI/DOE sponsorship to exchange technical knowledge and field experience related to current information in this important field. The papers presented here reflect this effort.

  11. Orbiter Autoland reliability analysis

    NASA Technical Reports Server (NTRS)

    Welch, D. Phillip

    1993-01-01

    The Space Shuttle Orbiter is the only space reentry vehicle in which the crew is seated upright. This position presents some physiological effects requiring countermeasures to prevent a crewmember from becoming incapacitated, which in turn introduces a potential need for an automated vehicle landing capability. Autoland is a primary procedure that was identified as a requirement for landing following an extended duration Orbiter mission. This report documents the results of the reliability analysis performed on the hardware required for an automated landing. A reliability block diagram was used to evaluate system reliability. The analysis considers the manual and automated landing modes currently available on the Orbiter. (Autoland is presently a backup system only.) Results of this study indicate a +/- 36 percent probability of successfully extending a nominal mission to 30 days. Enough variations were evaluated to verify that the reliability could be altered through mission planning and procedures. If the crew is modeled as being fully capable after 30 days, the probability of a successful manual landing is comparable to that of Autoland, because much of the hardware is used for both manual and automated landing modes. The analysis indicates that the reliability of the manual mode is limited by the hardware and depends greatly on crew capability. Crew capability for a successful landing after 30 days has not yet been determined.
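
    The reliability-block-diagram approach mentioned in the abstract can be sketched in a few lines: series blocks multiply component reliabilities, while redundant (parallel) blocks fail only if every branch fails. A minimal illustration with hypothetical component reliabilities, not figures from the Orbiter study:

```python
def series(*rs):
    # Series blocks: all components must work, so reliabilities multiply.
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(*rs):
    # Parallel (redundant) blocks: the system fails only if every branch fails.
    q = 1.0
    for r in rs:
        q *= (1.0 - r)
    return 1.0 - q

# Hypothetical numbers, not from the Orbiter analysis: a sensor and an
# actuator in series with a redundant pair of flight-control channels.
system = series(0.99, parallel(0.95, 0.95), 0.98)
```

    With these made-up numbers the redundant pair contributes almost no unreliability; the series elements dominate, which is the kind of structure a block-diagram evaluation makes visible.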

  12. Crystalline-silicon reliability lessons for thin-film modules

    NASA Technical Reports Server (NTRS)

    Ross, Ronald G., Jr.

    1985-01-01

    Key reliability and engineering lessons learned from the 10-year history of the Jet Propulsion Laboratory's Flat-Plate Solar Array Project are presented and analyzed. Particular emphasis is placed on lessons applicable to the evolving new thin-film cell and module technologies and the organizations involved with these technologies. The user-specific demand for reliability is a strong function of the application, its location, and its expected duration. Lessons relative to effective means of specifying reliability are described, and commonly used test requirements are assessed from the standpoint of which are the most troublesome to pass, and which correlate best with field experience. Module design lessons are also summarized, including the significance of the most frequently encountered failure mechanisms and the role of encapsulant and cell reliability in determining module reliability. Lessons pertaining to research, design, and test approaches include the historical role and usefulness of qualification tests and field tests.

  13. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  14. Gearbox Reliability Collaborative Update (Presentation)

    SciTech Connect

    Sheng, S.; Keller, J.; Glinsky, C.

    2013-10-01

    This presentation was given at the Sandia Reliability Workshop in August 2013 and provides information on current statistics, a status update, next steps, and other reliability research and development activities related to the Gearbox Reliability Collaborative.

  15. Reliability Engineering Handbook

    DTIC Science & Technology

    1964-06-01

    OCR fragments from the handbook: Figure 6-4, "TWT Reliability Function, Showing the 90% Confidence Interval" (operating time in hours); "…the lower one-sided 90% confidence limit on θ is (.704)(530) = 373…"; "…greater than 977 hours, or 90% confidence that θ lies between these two bounds…"; Section 6-2-2, "Measurement of Reliability (Application of Confidence Limits)"; Section 6-2-3, "Procedural Steps."

  16. Reliability Degradation Due to Stockpile Aging

    SciTech Connect

    Robinson, David G.

    1999-04-01

    The objective of this research is the investigation of alternative methods for characterizing the reliability of systems with time-dependent failure modes associated with stockpile aging. Reference to 'reliability degradation' has, unfortunately, come to be associated with all types of aging analyses, both deterministic and stochastic. In this research, in keeping with the true theoretical definition, reliability is defined as a probabilistic description of system performance as a function of time. Traditional reliability methods used to characterize stockpile reliability depend on the collection of a large number of samples or observations. Clearly, after the experiments have been performed and the data have been collected, critical performance problems can be identified. A major goal of this research is to identify existing methods and/or develop new mathematical techniques and computer analysis tools to anticipate stockpile problems before they become critical issues. One of the most popular methods for characterizing the reliability of components, particularly electronic components, assumes that failures occur in a completely random fashion, i.e., uniformly across time. This method is based primarily on the use of constant failure rates for the various elements that constitute the weapon system, i.e., the systems do not degrade while in storage. Experience has shown that predictions based upon this approach should be regarded with great skepticism, since the relationship between the predicted life and the observed life has been difficult to validate. In addition to this fundamental problem, the approach does not recognize that there are time-dependent material properties and variations associated with the manufacturing process and the operational environment. To appreciate the uncertainties in predicting system reliability a number of alternative methods are explored in this report. All of the methods are very different from those currently used to assess stockpile
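
    The contrast the abstract draws between constant failure rates and time-dependent degradation can be made concrete with two standard reliability functions. This is a generic illustration, not the report's method, and the parameters are invented:

```python
import math

def r_const(lam, t):
    # Constant-failure-rate model: R(t) = exp(-lambda * t).  This is the
    # memoryless assumption criticized above: the system never ages.
    return math.exp(-lam * t)

def r_weibull(eta, beta, t):
    # Weibull model: R(t) = exp(-(t/eta)**beta); beta > 1 gives wear-out,
    # i.e. a failure rate that increases with age in storage.
    return math.exp(-((t / eta) ** beta))

# Illustrative-only parameters chosen so both models agree at t = 10 years.
same_at_10 = (r_const(0.025, 10), r_weibull(20.0, 2.0, 10))
at_20 = (r_const(0.025, 20), r_weibull(20.0, 2.0, 20))
```

    Both models agree at 10 years, but by 20 years the wear-out model predicts much lower reliability, which is why constant-failure-rate extrapolations of an aging stockpile deserve the skepticism the abstract describes.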

  17. Evolved gas analysis of secondary organic aerosols

    SciTech Connect

    Grosjean, D.; Williams, E.L. II; Grosjean, E.; Novakov, T.

    1994-11-01

    Secondary organic aerosols have been characterized by evolved gas analysis (EGA). Hydrocarbons selected as aerosol precursors were representative of anthropogenic emissions (cyclohexene, cyclopentene, 1-decene and 1-dodecene, n-dodecane, o-xylene, and 1,3,5-trimethylbenzene) and of biogenic emissions (the terpenes α-pinene, β-pinene and d-limonene and the sesquiterpene trans-caryophyllene). Also analyzed by EGA were samples of secondary, primary (highway tunnel), and ambient (urban) aerosols before and after exposure to ozone and other photochemical oxidants. The major features of the EGA thermograms (amount of CO₂ evolved as a function of temperature) are described. The usefulness and limitations of EGA data for source apportionment of atmospheric particulate carbon are briefly discussed. 28 refs., 7 figs., 4 tabs.

  18. Dust obscuration by an evolving galaxy population

    NASA Technical Reports Server (NTRS)

    Najita, Joan; Silk, Joseph; Wachter, Kenneth W.

    1990-01-01

    The effect of an evolving luminosity function (LF) on the ability of foreground galaxies to obscure background sources is discussed, using the Press-Schechter/CDM standard evolving LF model. Galaxies are modeled as simplified versions of local spirals, and Poisson statistics are used to estimate the fraction of sky covered by intervening dusty galaxies and the mean optical depths due to these galaxies. The results are compared to those obtained for a nonevolving luminosity function in a low-density universe. It is found that evolution of the galaxy LF does not allow the quasar dust obscuration hypothesis to be sustained for dust disks with plausible sizes. Even in a low-density universe, where evolution at z < 10 is unimportant, large disk radii are needed to achieve the desired obscuring effect. The mean fraction of sky covered is presented as a function of the redshift z, with illustrative diagrams.

  19. Design Space Issues for Intrinsic Evolvable Hardware

    NASA Technical Reports Server (NTRS)

    Hereford, James; Gwaltney, David

    2004-01-01

    This paper discusses the problem of increased programming time for intrinsic evolvable hardware (EHW) as the complexity of the circuit grows. As the circuit becomes more complex, more components are required and a longer programming string, L, is needed. We develop equations for the size of the population, n, and the number of generations required for the population to converge, based on L. Our analytical results show that even though the design search space grows as 2^L (assuming a binary programming string), the number of circuit evaluations, n*ngen, only grows as O(Lg3), or slightly less than O(L). This makes evolvable techniques a good tool for exploring large design spaces. The major hurdle for intrinsic EHW is the evaluation time for each possible circuit. The evaluation time involves downloading the bit string to the device, updating the device configuration, measuring the output, and then transferring the output data to the control processor. Each of these steps must be done for each member of the population. The processing time of the computer becomes negligible, since the selection/crossover/mutation steps are done only once per generation. Evaluation time presently limits intrinsic evolvable hardware techniques to designing only small or medium-sized circuits. To evolve large or complicated circuits, several researchers have proposed using hierarchical design or reuse techniques, where submodules are combined to form complex circuits. However, these practical approaches limit the search space of available designs and preclude utilizing parasitic coupling or other effects within the programmable device. The practical approaches also raise the issue of why intrinsic EHW techniques do not easily apply to large design spaces, since the analytical results show only an O(L) complexity growth.
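
    The population-and-generations trade-off discussed above can be sketched with a toy genetic algorithm over a binary programming string. The fitness function here is just a bit count standing in for the slow hardware-evaluation step; the selection scheme and parameters are illustrative, not taken from the paper:

```python
import random

def evolve(L, n=30, ngen=200, seed=1):
    # Minimal generational GA over a binary programming string of length L.
    # Fitness is the number of 1-bits (a stand-in for a circuit evaluation;
    # in intrinsic EHW this step is the slow download-configure-measure loop).
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(L)] for _ in range(n)]
    for _ in range(ngen):
        pop.sort(key=sum, reverse=True)
        if sum(pop[0]) == L:            # converged to the best string
            break
        parents = pop[: n // 2]         # truncation selection (top half kept)
        children = []
        while len(parents) + len(children) < n:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, L)   # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(L)] ^= 1  # single-bit mutation
            children.append(child)
        pop = parents + children
    return sum(max(pop, key=sum))       # best fitness found

best = evolve(32)
```

    Each generation costs n/2 fitness evaluations (only the new children), so total cost tracks n*ngen; the paper's point is that this product grows far more slowly than the 2^L search space.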

  20. Quantum games on evolving random networks

    NASA Astrophysics Data System (ADS)

    Pawela, Łukasz

    2016-09-01

    We study the advantages of quantum strategies in evolutionary social dilemmas on evolving random networks. We focus our study on two-player games: the prisoner's dilemma, snowdrift, and stag-hunt games. The obtained results show the benefits of quantum strategies for the prisoner's dilemma game. For the other two games, we obtain regions of parameters where the quantum strategies dominate, as well as regions where the classical strategies coexist.

  1. Evolving networks: Using past structure to predict the future

    NASA Astrophysics Data System (ADS)

    Shang, Ke-ke; Yan, Wei-sheng; Small, Michael

    2016-08-01

    Many previous studies on link prediction have focused on using common neighbors to predict the existence of links between pairs of nodes. More broadly, research into the structural properties of evolving temporal networks and temporal link prediction methods has recently attracted increasing attention. In this study, for the first time, we examine the use of links between a pair of nodes to predict their common neighbors, and we analyze the relationship between weight and structure in static networks, evolving networks, and the corresponding randomized networks. We propose both new unweighted and weighted prediction methods and use six kinds of real networks to test our algorithms. In unweighted networks, we find that if a pair of nodes connect to each other in the current network, they have a higher probability of connecting to common nodes in both the current and future networks, and this probability decreases as the number of neighbors increases. Furthermore, we find that the original networks have particular structural and statistical characteristics that benefit link prediction. In weighted networks, the performance of the prediction algorithm on networks dominated by human factors decreases with decreasing weight and is in general better in static networks. Furthermore, we find that geographical position and link weight both have a significant influence on the transport network. Moreover, the evolving financial network has the lowest predictability. In addition, we find that the structure of non-social networks is more robust than that of social networks, and the structure of engineering networks has both the best predictability and robustness.
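
    The common-neighbor score that this paper inverts (predicting common neighbors from an existing link, rather than links from common neighbors) is easy to state in code. A minimal sketch on a toy undirected graph, not one of the six real networks used in the study:

```python
def common_neighbors(adj, u, v):
    # Number of shared neighbors of u and v in an undirected graph given as
    # an adjacency dict of sets -- the classic link-prediction score.
    return len(adj[u] & adj[v])

# Toy graph for illustration only.
adj = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b", "d"},
    "d": {"a", "c"},
}
score = common_neighbors(adj, "b", "d")  # b and d share neighbors a and c
```

    Under the paper's observation, the presence of the existing link b-c would itself be evidence that b and c are likely to share (and go on to acquire) common neighbors.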

  2. Assuring Software Reliability

    DTIC Science & Technology

    2014-08-01

    OCR fragments of the report's references: resources.sei.cmu.edu/asset_files/WhitePaper/2009_019_001_29066.pdf; [Boydston 2009] Boydston, A. & Lewis, W., "Qualification and Reliability of…"; Woody, Carol, Survivability Analysis Framework (CMU/SEI-2010-TN-013), Software Engineering Institute, Carnegie Mellon University, 2010.

  3. Parametric Mass Reliability Study

    NASA Technical Reports Server (NTRS)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed around redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon boards of PCBs, are typically the most reliable. Meanwhile, components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates the mass of ORUs to reliability, as well as the mass of ORU subcomponents to reliability.
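
    The asymmetry described above, heavy subcomponents that are highly reliable versus light parts such as seals that fail first, can be illustrated with a toy series model of one ORU. All masses and reliabilities below are invented for illustration:

```python
# Hypothetical ORU subcomponents: (name, mass in kg, mission reliability).
parts = [
    ("housing", 40.0, 0.999),
    ("pump casing", 25.0, 0.998),
    ("circuit board", 5.0, 0.995),
    ("seals", 0.5, 0.90),
]

# Subcomponents in series: the ORU works only if every part works.
r_oru = 1.0
for _, _, r in parts:
    r_oru *= r

mass = sum(m for _, m, _ in parts)
```

    Here the least massive part (the seals) accounts for nearly all of the ORU's unreliability, which is exactly the mass-versus-reliability asymmetry a parametric model has to capture.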

  4. Sequential Reliability Tests.

    ERIC Educational Resources Information Center

    Eiting, Mindert H.

    1991-01-01

    A method is proposed for sequential evaluation of reliability of psychometric instruments. Sample size is unfixed; a test statistic is computed after each person is sampled and a decision is made in each stage of the sampling process. Results from a series of Monte-Carlo experiments establish the method's efficiency. (SLD)
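
    A classic instance of the sequential idea described above, computing a test statistic after each observation and deciding at every stage, is Wald's sequential probability ratio test. The sketch below applies it to Bernoulli outcomes as a stand-in; the paper's statistic for psychometric reliability differs:

```python
import math

def sprt(samples, p0=0.8, p1=0.9, alpha=0.05, beta=0.05):
    # Wald's SPRT on 0/1 outcomes: after each observation, accept H1,
    # accept H0, or keep sampling -- so sample size is not fixed in advance.
    a = math.log((1 - beta) / alpha)   # upper decision boundary
    b = math.log(beta / (1 - alpha))   # lower decision boundary
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= a:
            return "accept p1", n
        if llr <= b:
            return "accept p0", n
    return "undecided", len(samples)

# Mostly failures: the test stops early in favor of the lower value p0.
decision = sprt([0, 0, 1, 0, 0, 0, 0])
```

    The appeal, as in the abstract, is efficiency: a clear-cut data stream triggers a decision after only a handful of observations instead of a fixed sample.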

  5. Designing reliability into accelerators

    SciTech Connect

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the "factories," reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: concept; design; motivation; management techniques; and fault diagnosis.

  6. Designing reliability into accelerators

    SciTech Connect

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the "factories," reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: concept; design; motivation; management techniques; and fault diagnosis.

  7. Reliable solar cookers

    SciTech Connect

    Magney, G.K.

    1992-12-31

    The author describes the activities of SERVE, a Christian relief and development agency, to introduce solar ovens to the Afghan refugees in Pakistan. It has provided 5,000 solar cookers since 1984. The experience has demonstrated the potential of the technology and the need for a durable and reliable product. Common complaints about the cookers are discussed and the ideal cooker is described.

  8. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab-type situations, and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world; thus the research centered on verifying the need for replication and on methodologies for generating replicated data in a cost effective manner. The concept of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data, which was then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens in AIR-LAB to measure the performance of reliability models.

  9. Reliability Design Handbook

    DTIC Science & Technology

    1976-03-01

    OCR fragments from the handbook: "…prediction, failure modes and effects analysis (FMEA) and reliability growth techniques represent those prediction and design evaluation methods that…"; table fragment: Assessment / Production / Operation & Maintenance; MIL-HDBK-217; Bayesian Techniques; Probabilistic Design; FMEA; Reliability Growth; "…devices suffer thermal aging; oxidation and other chemical reactions are enhanced; viscosity reduction and evaporation of lubricants are problems."

  10. Continuous Evaluation of Evolving Behavioral Intervention Technologies

    PubMed Central

    Mohr, David C.; Cheung, Ken; Schueller, Stephen M.; Brown, C. Hendricks; Duan, Naihua

    2013-01-01

    Behavioral intervention technologies (BITs) are web-based and mobile interventions intended to support patients and consumers in changing behaviors related to health, mental health, and well-being. BITs are provided to patients and consumers in clinical care settings and commercial marketplaces, frequently with little or no evaluation. Current evaluation methods, including RCTs and implementation studies, can require years to validate an intervention. This timeline is fundamentally incompatible with the BIT environment, where technology advancement and changes in consumer expectations occur quickly, necessitating rapidly evolving interventions. However, BITs can routinely and iteratively collect data in a planned and strategic manner and generate evidence through systematic prospective analyses, thereby creating a system that can “learn.” A methodologic framework, Continuous Evaluation of Evolving Behavioral Intervention Technologies (CEEBIT), is proposed that can support the evaluation of multiple BITs or evolving versions, eliminating those that demonstrate poorer outcomes, while allowing new BITs to be entered at any time. CEEBIT could be used to ensure the effectiveness of BITs provided through deployment platforms in clinical care organizations or BIT marketplaces. The features of CEEBIT are described, including criteria for the determination of inferiority, determination of BIT inclusion, methods of assigning consumers to BITs, definition of outcomes, and evaluation of the usefulness of the system. CEEBIT offers the potential to collapse initial evaluation and postmarketing surveillance, providing ongoing assurance of safety and efficacy to patients and consumers, payers, and policymakers. PMID:24050429

  11. Continuous evaluation of evolving behavioral intervention technologies.

    PubMed

    Mohr, David C; Cheung, Ken; Schueller, Stephen M; Hendricks Brown, C; Duan, Naihua

    2013-10-01

    Behavioral intervention technologies (BITs) are web-based and mobile interventions intended to support patients and consumers in changing behaviors related to health, mental health, and well-being. BITs are provided to patients and consumers in clinical care settings and commercial marketplaces, frequently with little or no evaluation. Current evaluation methods, including RCTs and implementation studies, can require years to validate an intervention. This timeline is fundamentally incompatible with the BIT environment, where technology advancement and changes in consumer expectations occur quickly, necessitating rapidly evolving interventions. However, BITs can routinely and iteratively collect data in a planned and strategic manner and generate evidence through systematic prospective analyses, thereby creating a system that can "learn." A methodologic framework, Continuous Evaluation of Evolving Behavioral Intervention Technologies (CEEBIT), is proposed that can support the evaluation of multiple BITs or evolving versions, eliminating those that demonstrate poorer outcomes, while allowing new BITs to be entered at any time. CEEBIT could be used to ensure the effectiveness of BITs provided through deployment platforms in clinical care organizations or BIT marketplaces. The features of CEEBIT are described, including criteria for the determination of inferiority, determination of BIT inclusion, methods of assigning consumers to BITs, definition of outcomes, and evaluation of the usefulness of the system. CEEBIT offers the potential to collapse initial evaluation and postmarketing surveillance, providing ongoing assurance of safety and efficacy to patients and consumers, payers, and policymakers.

  12. Transistor Level Circuit Experiments using Evolvable Hardware

    NASA Technical Reports Server (NTRS)

    Stoica, A.; Zebulum, R. S.; Keymeulen, D.; Ferguson, M. I.; Daud, Taher; Thakoor, A.

    2005-01-01

    The Jet Propulsion Laboratory (JPL) performs research in fault tolerant, long life, and space survivable electronics for the National Aeronautics and Space Administration (NASA). With that focus, JPL has been involved in Evolvable Hardware (EHW) technology research for the past several years. We have advanced the technology not only by simulation and evolution experiments, but also by designing, fabricating, and evolving a variety of transistor-based analog and digital circuits at the chip level. EHW refers to self-configuration of electronic hardware by evolutionary/genetic search mechanisms, thereby maintaining existing functionality in the presence of degradation due to aging, temperature, and radiation. In addition, EHW has the capability to reconfigure itself for new functionality when required for mission changes or encountered opportunities. Evolution experiments are performed using a genetic algorithm running on a DSP as the reconfiguration mechanism and controlling the evolvable hardware mounted on a self-contained circuit board. Rapid reconfiguration allows convergence to circuit solutions on the order of seconds. The paper illustrates hardware evolution results of electronic circuits and their ability to perform at temperatures of up to 230 C as well as radiation doses of up to 250 krad.

  13. Evolving specialization of the arthropod nervous system.

    PubMed

    Jarvis, Erin; Bruce, Heather S; Patel, Nipam H

    2012-06-26

    The diverse array of body plans possessed by arthropods is created by generating variations upon a design of repeated segments formed during development, using a relatively small "toolbox" of conserved patterning genes. These attributes make the arthropod body plan a valuable model for elucidating how changes in development create diversity of form. As increasingly specialized segments and appendages evolved in arthropods, the nervous systems of these animals also evolved to control the function of these structures. Although there is a remarkable degree of conservation in neural development both between individual segments in any given species and between the nervous systems of different arthropod groups, the differences that do exist are informative for inferring general principles about the holistic evolution of body plans. This review describes developmental processes controlling neural segmentation and regionalization, highlighting segmentation mechanisms that create both ectodermal and neural segments, as well as recent studies of the role of Hox genes in generating regional specification within the central nervous system. We argue that this system generates a modular design that allows the nervous system to evolve in concert with the body segments and their associated appendages. This information will be useful in future studies of macroevolutionary changes in arthropod body plans, especially in understanding how these transformations can be made in a way that retains the function of appendages during evolutionary transitions in morphology.

  14. Bioharness™ Multivariable Monitoring Device: Part. II: Reliability

    PubMed Central

    Johnstone, James A.; Ford, Paul A.; Hughes, Gerwyn; Watson, Tim; Garrett, Andrew T.

    2012-01-01

    The Bioharness™ monitoring system may provide physiological information on human performance, but the reliability of these data is fundamental for confidence in the equipment being used. The objective of this study was to assess the reliability of each of the 5 Bioharness™ variables using a treadmill-based protocol. 10 healthy males participated. A between- and within-subject design to assess the reliability of heart rate (HR), breathing frequency (BF), accelerometry (ACC) and infra-red skin temperature (ST) was completed via a repeated, discontinuous, incremental treadmill protocol. Posture (P) was assessed by a tilt table moved through 160°. Between-subject data reported low coefficients of variation (CV) and strong correlations (r) for ACC and P (CV < 7.6; r = 0.99, p < 0.01). In contrast, HR and BF (CV ~ 19.4; r ~ 0.70, p < 0.01) and ST (CV 3.7; r = 0.61, p < 0.01) presented more variable data. Intra- and inter-device data presented strong relationships (r > 0.89, p < 0.01) and low CV (< 10.1) for HR, ACC, P and ST. BF produced weaker relationships (r < 0.72) and higher CV (< 17.4). In comparison to the other variables, BF consistently presents less reliability. Global results suggest that the Bioharness™ is a reliable multivariable monitoring device during laboratory testing within the limits presented. Key points: (1) heart rate and breathing frequency data increased in variance at higher velocities (i.e. ≥ 10 km.h-1); (2) in comparison to the between-subject testing, the intra- and inter-device tests presented good reliability, suggesting that placement or position of the device relative to the performer could be important for data collection; (3) understanding a device's variability in measurement is important before it can be used within an exercise testing or monitoring setting. PMID:24149347
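
    The two reliability statistics reported throughout the abstract, the coefficient of variation and Pearson's r, are straightforward to compute. A minimal sketch with invented heart-rate readings, not the study's data:

```python
import math

def cv_percent(xs):
    # Coefficient of variation: sample standard deviation as a
    # percentage of the mean.
    m = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
    return 100.0 * sd / m

def pearson_r(xs, ys):
    # Pearson correlation between paired trials, as used for the
    # trial-retrial and between-device comparisons.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical heart-rate readings (bpm) from two repeated trials.
trial1 = [92, 110, 128, 145, 160]
trial2 = [90, 112, 126, 148, 158]
r = pearson_r(trial1, trial2)
cv = cv_percent(trial1)
```

    Low CV with high r across repeated trials is the pattern the study reads as good reliability; high CV with modest r, as for breathing frequency, is the opposite.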

  15. Reliability Generalization (RG) Analysis: The Test Is Not Reliable

    ERIC Educational Resources Information Center

    Warne, Russell

    2008-01-01

    Literature shows that most researchers are unaware of some of the characteristics of reliability. This paper clarifies some misconceptions by describing the procedures, benefits, and limitations of reliability generalization while using it to illustrate the nature of score reliability. Reliability generalization (RG) is a meta-analytic method…

  16. Where the Wild Things Are: The Evolving Iconography of Rural Fauna

    ERIC Educational Resources Information Center

    Buller, Henry

    2004-01-01

    This paper explores the changing relationship between "nature" and rurality through an examination of the shifting iconography of animals, and particularly "wild" animals, in a rural setting. Drawing upon a set of examples, the paper argues that the faunistic icons of rural areas are evolving as alternative conceptions of the countryside, of…

  17. Salt tolerance evolves more frequently in C4 grass lineages.

    PubMed

    Bromham, L; Bennett, T H

    2014-03-01

    Salt tolerance has evolved many times in the grass family, and yet few cereal crops are salt tolerant. Why has it been so difficult to develop crops tolerant of saline soils when salt tolerance has evolved so frequently in nature? One possible explanation is that some grass lineages have traits that predispose them to developing salt tolerance and that without these background traits, salt tolerance is harder to achieve. One candidate background trait is photosynthetic pathway, which has also been remarkably labile in grasses. At least 22 independent origins of the C4 photosynthetic pathway have been suggested to occur within the grass family. It is possible that the evolution of C4 photosynthesis aids exploitation of saline environments, because it reduces transpiration, increases water-use efficiency and limits the uptake of toxic ions. But the observed link between the evolution of C4 photosynthesis and salt tolerance could simply be due to biases in phylogenetic distribution of halophytes or C4 species. Here, we use a phylogenetic analysis to investigate the association between photosynthetic pathway and salt tolerance in the grass family Poaceae. We find that salt tolerance is significantly more likely to occur in lineages with C4 photosynthesis than in C3 lineages. We discuss the possible links between C4 photosynthesis and salt tolerance and consider the limitations of inferring the direction of causality of this relationship.

  18. The Ames Philosophical Belief Inventory: Reliability and Validity

    ERIC Educational Resources Information Center

    Sawyer, R. N.

    1971-01-01

    This study investigated the reliability and validity of the Philosophical Belief Inventory (PBI). With the exception of the relationship between idealism and pragmatism and realism and existentialism, the PBI scales appear to be assessing independent facets of belief. (Author)

  19. Conceptualizing Essay Tests' Reliability and Validity: From Research to Theory

    ERIC Educational Resources Information Center

    Badjadi, Nour El Imane

    2013-01-01

    The current paper on writing assessment surveys the literature on the reliability and validity of essay tests. The paper aims to examine the two concepts in relationship with essay testing as well as to provide a snapshot of the current understandings of the reliability and validity of essay tests as drawn in recent research studies. Bearing in…

  20. Wild Origins: The Evolving Nature of Animal Behavior

    NASA Astrophysics Data System (ADS)

    Flores, Ifigenia

    For billions of years, evolution has been the driving force behind the incredible range of biodiversity on our planet. Wild Origins is a concept plan for an exhibition at the National Zoo that uses case studies of animal behavior to explain the theory of evolution. Behaviors evolve, just as physical forms do. Understanding natural selection can help us interpret animal behavior and vice-versa. A living collection, digital media, interactives, fossils, and photographs will relay stories of social behavior, sex, navigation and migration, foraging, domestication, and relationships between different species. The informal learning opportunities visitors are offered at the zoo will create a connection with the exhibition's teaching points. Visitors will leave with an understanding and sense of wonder at the evolutionary view of life.

  1. The organization and control of an evolving interdependent population

    PubMed Central

    Vural, Dervis C.; Isakov, Alexander; Mahadevan, L.

    2015-01-01

    Starting with Darwin, biologists have asked how populations evolve from a low fitness state that is evolutionarily stable to a high fitness state that is not. Specifically of interest is the emergence of cooperation and multicellularity where the fitness of individuals often appears in conflict with that of the population. Theories of social evolution and evolutionary game theory have produced a number of fruitful results employing two-state two-body frameworks. In this study, we depart from this tradition and instead consider a multi-player, multi-state evolutionary game, in which the fitness of an agent is determined by its relationship to an arbitrary number of other agents. We show that populations organize themselves in one of four distinct phases of interdependence depending on one parameter, selection strength. Some of these phases involve the formation of specialized large-scale structures. We then describe how the evolution of interdependence can be manipulated through various external perturbations. PMID:26040593

  2. Reliability after inspection. [of flaws on laminate surface

    NASA Technical Reports Server (NTRS)

    Davidson, J. R.

    1975-01-01

    The investigation is concerned with the derivation of relationships between the probability of having manufacturing defects, the probability of detecting a flaw, and the final reliability. Equations for the simple situation in which only one flaw can be present are used to introduce the relationships in a Bayes' theorem approach to the assessment of the final reliability. Situations which are prevalent in composites manufacturing are considered. Attention is given to a case involving the random occurrence of flaws on a laminate surface.
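
    For the single-flaw case described above, the Bayes' theorem update has a compact closed form: given a prior flaw probability and a probability of detection, an inspection that finds nothing shifts the odds toward the part being sound. The sketch below assumes a flaw, if present and undetected, causes failure; the numerical flaw and detection rates are invented for illustration and are not taken from the report.

```python
def posterior_flaw_probability(p_flaw, p_detect):
    """Probability a part still contains the flaw, given inspection found none.

    p_flaw:   prior probability the part was manufactured with the flaw
    p_detect: probability inspection detects the flaw when present
    """
    missed = p_flaw * (1 - p_detect)   # flawed, but inspection missed it
    clean = 1 - p_flaw                 # no flaw to find
    return missed / (missed + clean)

# Hypothetical numbers: 5% flaw rate, 90% probability of detection
post = posterior_flaw_probability(0.05, 0.90)
reliability = 1 - post                 # one undetected flaw assumed to cause failure
print(round(reliability, 4))           # → 0.9948
```

    The same update shows why inspection quality matters: a perfect inspection (p_detect = 1) drives the posterior flaw probability to zero, while a weak one leaves the final reliability close to the as-manufactured value.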

  3. Space Shuttle Propulsion System Reliability

    NASA Technical Reports Server (NTRS)

    Welzyn, Ken; VanHooser, Katherine; Moore, Dennis; Wood, David

    2011-01-01

    This session includes the following sessions: (1) External Tank (ET) System Reliability and Lessons, (2) Space Shuttle Main Engine (SSME), Reliability Validated by a Million Seconds of Testing, (3) Reusable Solid Rocket Motor (RSRM) Reliability via Process Control, and (4) Solid Rocket Booster (SRB) Reliability via Acceptance and Testing.

  4. Administration to innovation: the evolving management challenge in primary care.

    PubMed

    Laing, A; Marnoch, G; McKee, L; Joshi, R; Reid, J

    1997-01-01

    The concept of the primary health-care team involving an increasingly diverse range of health care professionals is widely recognized as central to the pursuit of a primary care-led health service in the UK. Although GPs are formally recognized as the team leaders, there is little by way of policy prescription as to how team roles and relationships should be developed, or evidence as to how their roles have in fact evolved. Thus the notion of the primary health-care team, while commonly employed, is in reality lacking definition, and the current contribution of practice managers to the operation of this team is poorly understood. Focusing on the career backgrounds of practice managers, their range of responsibilities, and their involvement in innovation in general practice, this paper presents a preliminary account of a chief scientist office-funded project examining the role being played by practice managers in primary health-care innovation. More specifically, utilizing data gained from the ongoing study, it contextualizes the role played by practice managers in the primary health-care team. By exploring the business environment surrounding NHS general practice, the research seeks to understand the evolving world of the practice manager. Drawing on questionnaire data, reinforced by qualitative data from the current interview phase, the paper describes the role played by practice managers in differing practice contexts. This facilitates a discussion of a set of ideal-type general practice organizational and managerial structures. It then discusses the relationships and skills required by practice managers in each of these organizational types, with reference to data gathered to date in the research.

  5. Reliability of digital reactor protection system based on extenics.

    PubMed

    Zhao, Jing; He, Ya-Nan; Gu, Peng-Fei; Chen, Wei-Hua; Gao, Feng

    2016-01-01

    After the Fukushima nuclear accident, the safety of nuclear power plants (NPPs) has drawn widespread concern. The reliability of the reactor protection system (RPS) is directly related to the safety of NPPs; however, it is difficult to evaluate the reliability of a digital RPS accurately. Methods based on estimated failure probabilities carry uncertainties, cannot reflect the reliability status of the RPS dynamically, and offer little support for maintenance and troubleshooting. In this paper, a quantitative reliability analysis method based on extenics is proposed for the digital (safety-critical) RPS, by which the relationship between the reliability and the response time of the RPS is constructed. As an example, the reliability of the RPS for a CPR1000 NPP is modeled and analyzed by the proposed method. The results show that the proposed method can estimate the RPS reliability effectively and provide support for maintenance and troubleshooting of a digital RPS.

  6. Human Reliability Program Workshop

    SciTech Connect

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  7. Reliable broadcast protocols

    NASA Technical Reports Server (NTRS)

    Joseph, T. A.; Birman, Kenneth P.

    1989-01-01

    A number of broadcast protocols that are reliable subject to a variety of ordering and delivery guarantees are considered. Developing applications that are distributed over a number of sites and/or must tolerate the failures of some of them becomes a considerably simpler task when such protocols are available for communication. Without such protocols the kinds of distributed applications that can reasonably be built will have a very limited scope. As the trend towards distribution and decentralization continues, it will not be surprising if reliable broadcast protocols have the same role in distributed operating systems of the future that message passing mechanisms have in the operating systems of today. On the other hand, the problems of engineering such a system remain large. For example, deciding which protocol is the most appropriate to use in a certain situation or how to balance the latency-communication-storage costs is not an easy question.
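
    As a concrete illustration of one ordering guarantee, the toy sketch below implements a single well-known scheme, a sequencer that assigns global sequence numbers, with receivers buffering out-of-order arrivals until they can deliver in sequence. It is not one of the protocols surveyed in the paper, and it ignores site failures and retransmission entirely.

```python
class SequencedBroadcast:
    """Toy total-order broadcast: one sequencer numbers every message,
    so all receivers deliver messages in the same global order."""

    def __init__(self, receivers):
        self.next_seq = 0
        self.receivers = receivers

    def broadcast(self, msg):
        seq = self.next_seq
        self.next_seq += 1
        for r in self.receivers:
            r.receive(seq, msg)

class Receiver:
    def __init__(self):
        self.expected = 0
        self.buffer = {}      # out-of-order messages awaiting delivery
        self.delivered = []

    def receive(self, seq, msg):
        self.buffer[seq] = msg
        while self.expected in self.buffer:   # deliver strictly in sequence order
            self.delivered.append(self.buffer.pop(self.expected))
            self.expected += 1

# Even if messages reach a receiver out of order, delivery order is fixed:
r = Receiver()
r.receive(1, "b")      # arrives early, buffered
r.receive(0, "a")
r.receive(2, "c")
print(r.delivered)     # → ['a', 'b', 'c']

# With the sequencer, every receiver delivers the same sequence:
a, b = Receiver(), Receiver()
bus = SequencedBroadcast([a, b])
bus.broadcast("x")
bus.broadcast("y")
print(a.delivered == b.delivered == ["x", "y"])   # → True
```

    Even this toy makes the engineering trade-off in the abstract visible: total order costs buffering (storage) and a round through the sequencer (latency), which is exactly the kind of balance a designer must weigh when picking a protocol.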

  8. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection

    NASA Astrophysics Data System (ADS)

    Janković, Srdja; Ćirković, Milan M.

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.

  10. Compact, Reliable EEPROM Controller

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Kleyner, Igor

    2010-01-01

    A compact, reliable controller for an electrically erasable, programmable read-only memory (EEPROM) has been developed specifically for a space-flight application. The design may be adaptable to other applications in which there are requirements for reliability in general and, in particular, for prevention of inadvertent writing of data in EEPROM cells. Inadvertent writes pose risks of loss of reliability in the original space-flight application and could pose such risks in other applications. Prior EEPROM controllers are large and complex and do not provide all reasonable protections (in many cases, few or no protections) against inadvertent writes. In contrast, the present controller provides several layers of protection against inadvertent writes. The controller also incorporates a write-time monitor, enabling determination of trends in the performance of an EEPROM through all phases of testing. The controller has been designed as an integral subsystem of a system that includes not only the controller and the controlled EEPROM aboard a spacecraft but also computers in a ground control station, relatively simple onboard support circuitry, and an onboard communication subsystem that utilizes the MIL-STD-1553B protocol. (MIL-STD-1553B is a military standard that encompasses a method of communication and electrical-interface requirements for digital electronic subsystems connected to a data bus; it is commonly used in defense and space applications.) The intent was to maximize reliability while minimizing the size and complexity of onboard circuitry. In operation, control of the EEPROM is effected via the ground computers, the MIL-STD-1553B communication subsystem, and the onboard support circuitry, all of which, in combination, provide the multiple layers of protection against inadvertent writes. There is no controller software, unlike in many prior EEPROM controllers; software can be a major contributor to unreliability, particularly in fault

  11. Designing reliability into accelerators

    NASA Astrophysics Data System (ADS)

    Hutton, A.

    1992-07-01

    Future accelerators will have to provide a high degree of reliability. Quality must be designed in right from the beginning and must remain a central theme throughout the project. The problem is similar to the problems facing US industry today, and examples of the successful application of quality engineering will be given. Different aspects of an accelerator project will be addressed: Concept, Design, Motivation, Management Techniques, and Fault Diagnosis. The importance of creating and maintaining a coherent team will be stressed.

  12. Reliability and testing

    NASA Technical Reports Server (NTRS)

    Auer, Werner

    1996-01-01

    Reliability and its interdependence with testing are important topics for development and manufacturing of successful products. This generally accepted fact is not only a technical statement, but must be also seen in the light of 'Human Factors.' While the background for this paper is the experience gained with electromechanical/electronic space products, including control and system considerations, it is believed that the content could be also of interest for other fields.

  13. Laser System Reliability

    DTIC Science & Technology

    1977-03-01

    SECTION III: RELIABILITY PREDICTION. [Scanned-document fragment; only partially recoverable.] Failure rates were drawn from a Data Exchange Program failure rate data bank; in addition, some data have been obtained from Hughes, Rocketdyne, Garrett, and the AFWL's APT failure data.

  14. Spacecraft transmitter reliability

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A workshop on spacecraft transmitter reliability was held at the NASA Lewis Research Center on September 25 and 26, 1979, to discuss present knowledge and to plan future research areas. Since formal papers were not submitted, this synopsis was derived from audio tapes of the workshop. The following subjects were covered: users' experience with space transmitters; cathodes; power supplies and interfaces; and specifications and quality assurance. A panel discussion ended the workshop.

  15. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
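
    The debugging-graph idea can be illustrated by enumerating the fault-recovery orders such a graph permits. The sketch below is an interpretation of that idea, not the authors' data structure: it encodes masking relationships as prerequisites (a fault hidden by another can only be recovered after its masker) and brute-forces the consistent removal orders, which is feasible only for small illustrative graphs.

```python
from itertools import permutations

def debugging_paths(faults, prerequisites):
    """Enumerate fault-removal orders consistent with a debugging graph.

    prerequisites maps a fault to the set of faults that must be removed
    first. Brute force over permutations; fine for small examples.
    """
    paths = []
    for order in permutations(faults):
        seen = set()
        ok = True
        for f in order:
            if not prerequisites.get(f, set()) <= seen:
                ok = False
                break
            seen.add(f)
        if ok:
            paths.append(order)
    return paths

# Hypothetical graph: fault C is masked by A (A must be fixed before C)
paths = debugging_paths("ABC", {"C": {"A"}})
print(paths)   # → [('A', 'B', 'C'), ('A', 'C', 'B'), ('B', 'A', 'C')]
```

    Feeding each such path's failure data to a reliability model, and comparing the resulting predictions, is the spirit of the experiment described above: the same model can look accurate on one path and poor on another.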

  16. General Aviation Aircraft Reliability Study

    NASA Technical Reports Server (NTRS)

    Pettit, Duane; Turnbull, Andrew; Roelant, Henk A. (Technical Monitor)

    2001-01-01

    This reliability study was performed in order to provide the aviation community with an estimate of Complex General Aviation (GA) Aircraft System reliability. To successfully improve the safety and reliability for the next generation of GA aircraft, a study of current GA aircraft attributes was prudent. This was accomplished by benchmarking the reliability of operational Complex GA Aircraft Systems. Specifically, Complex GA Aircraft System reliability was estimated using data obtained from the logbooks of a random sample of the Complex GA Aircraft population.

  17. Risky prey behavior evolves in risky habitats.

    PubMed

    Urban, Mark C

    2007-09-04

    Longstanding theory in behavioral ecology predicts that prey should evolve decreased foraging rates under high predation threat. However, an alternative perspective suggests that growth into a size refuge from gape-limited predation and the future benefits of large size can outweigh the initial survival costs of intense foraging. Here, I evaluate the relative contributions of selection from a gape-limited predator (Ambystoma opacum) and spatial location to explanations of variation in foraging, growth, and survival in 10 populations of salamander larvae (Ambystoma maculatum). Salamander larvae from populations naturally exposed to intense A. opacum predation risk foraged more actively under common garden conditions. Higher foraging rates were associated with low survival in populations exposed to free-ranging A. opacum larvae. Results demonstrate that risky foraging activity can evolve in high predation-risk habitats when the dominant predators are gape-limited. This finding invites the further exploration of diverse patterns of prey foraging behavior that depends on natural variation in predator size-selectivity. In particular, prey should adopt riskier behaviors under predation threat than expected under existing risk allocation models if foraging effort directly reduces the duration of risk by growth into a size refuge. Moreover, evidence from this study suggests that foraging has evolved over microgeographic scales despite substantial modification by regional gene flow. This interaction between local selection and spatial location suggests a joint role for adaptation and maladaptation in shaping species interactions across natural landscapes, which is a finding with implications for dynamics at the population, community, and metacommunity levels.

  18. Production and decay of evolving horizons

    NASA Astrophysics Data System (ADS)

    Nielsen, Alex B.; Visser, Matt

    2006-07-01

    We consider a simple physical model for an evolving horizon that is strongly interacting with its environment, exchanging arbitrarily large quantities of matter with its environment in the form of both infalling material and outgoing Hawking radiation. We permit fluxes of both lightlike and timelike particles to cross the horizon, and ask how the horizon grows and shrinks in response to such flows. We place a premium on providing a clear and straightforward exposition with simple formulae. To be able to handle such a highly dynamical situation in a simple manner we make one significant physical restriction—that of spherical symmetry—and two technical mathematical restrictions: (1) we choose to slice the spacetime in such a way that the spacetime foliations (and hence the horizons) are always spherically symmetric. (2) Furthermore, we adopt Painlevé Gullstrand coordinates (which are well suited to the problem because they are nonsingular at the horizon) in order to simplify the relevant calculations. Of course physics results are ultimately independent of the choice of coordinates, but this particular coordinate system yields a clean physical interpretation of the relevant physics. We find particularly simple forms for surface gravity, and for the first and second law of black hole thermodynamics, in this general evolving horizon situation. Furthermore, we relate our results to Hawking's apparent horizon, Ashtekar and co-worker's isolated and dynamical horizons, and Hayward's trapping horizon. The evolving black hole model discussed here will be of interest, both from an astrophysical viewpoint in terms of discussing growing black holes and from a purely theoretical viewpoint in discussing black hole evaporation via Hawking radiation.

  19. Investigating Evolved Compositions Around Wolf Crater

    NASA Technical Reports Server (NTRS)

    Greenhagen, B. T.; Cahill, J. T. S.; Jolliff, B. L.; Lawrence, S. J.; Glotch, T. D.

    2017-01-01

    Wolf crater is an irregularly shaped, approximately 25 km crater in the south-central portion of Mare Nubium on the lunar nearside. While not previously identified as a lunar "red spot", Wolf crater was identified as a Th anomaly by Lawrence and coworkers. We have used data from the Lunar Reconnaissance Orbiter (LRO) to determine that the area surrounding Wolf crater has a composition more similar to highly evolved, non-mare volcanic structures than to typical lunar crustal lithology. In this presentation, we will investigate the geomorphology and composition of Wolf crater and discuss implications for the origin of the anomalous terrain.

  20. Cobalt-phosphate oxygen-evolving compound.

    PubMed

    Kanan, Matthew W; Surendranath, Yogesh; Nocera, Daniel G

    2009-01-01

    The utilization of solar energy on a large scale requires efficient storage. Solar-to-fuels has the capacity to meet large-scale storage needs, as demonstrated by natural photosynthesis. This process uses sunlight to rearrange the bonds of water to furnish O2 and an H2-equivalent. We present a tutorial review of our efforts to develop an amorphous cobalt-phosphate catalyst that oxidizes water to O2. The use of earth-abundant materials, operation in water at neutral pH, and the formation of the catalyst in situ capture functional elements of the oxygen evolving complex of Photosystem II.

  1. Evolvable circuit with transistor-level reconfigurability

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian (Inventor); Salazar-Lazaro, Carlos Harold (Inventor)

    2004-01-01

    An evolvable circuit includes a plurality of reconfigurable switches, a plurality of transistors within a region of the circuit, the plurality of transistors having terminals, the plurality of transistors being coupled between a power source terminal and a power sink terminal so as to be capable of admitting power between the power source terminal and the power sink terminal, the plurality of transistors being coupled so that every transistor terminal to transistor terminal coupling within the region of the circuit comprises a reconfigurable switch.

  2. An evolving paradigm for the secretory pathway?

    PubMed Central

    Lippincott-Schwartz, Jennifer

    2011-01-01

    The paradigm that the secretory pathway consists of a stable endoplasmic reticulum and Golgi apparatus, using discrete transport vesicles to exchange their contents, gained important support from groundbreaking biochemical and genetic studies during the 1980s. However, the subsequent development of new imaging technologies with green fluorescent protein introduced data on dynamic processes not fully accounted for by the paradigm. As a result, we may be seeing an example of how a paradigm is evolving to account for the results of new technologies and their new ways of describing cellular processes. PMID:22039065

  3. Mobile computing acceptance grows as applications evolve.

    PubMed

    Porn, Louis M; Patrick, Kelly

    2002-01-01

    Handheld devices are becoming more cost-effective to own, and their use in healthcare environments is increasing. Handheld devices currently are being used for e-prescribing, charge capture, and accessing daily schedules and reference tools. Future applications may include education on medications, dictation, order entry, and test-results reporting. Selecting the right handheld device requires careful analysis of current and future applications, as well as vendor expertise. It is important to recognize the technology will continue to evolve over the next three years.

  4. Life and reliability models for helicopter transmissions

    NASA Technical Reports Server (NTRS)

    Savage, M.; Knorr, R. J.; Coy, J. J.

    1982-01-01

    Computer models of life and reliability are presented for planetary gear trains with a fixed ring gear, input applied to the sun gear, and output taken from the planet arm. For this transmission the input and output shafts are coaxial, and the input and output torques are assumed to be coaxial with these shafts. Thrust and side loading are neglected. The reliability model is based on the Weibull distributions of the individual reliabilities of the transmission components. The system model is also a Weibull distribution. The load-versus-life model for the system is a power relationship, as are the models for the individual components. The load-life exponent and basic dynamic capacity are developed as functions of the component capacities. The models are used to compare three- and four-planet, 150 kW (200 hp), 5:1 reduction transmissions with 1500 rpm input speed to illustrate their use.
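
    The component-to-system roll-up described above, in which the transmission survives only if every component survives, so system reliability is the product of the component Weibull reliabilities, can be sketched directly. All numerical parameters below are hypothetical and are not the capacities derived in the paper.

```python
from math import exp

def weibull_reliability(t, eta, beta):
    """Two-parameter Weibull reliability: R(t) = exp(-(t/eta)**beta)."""
    return exp(-((t / eta) ** beta))

def system_reliability(t, components):
    """Series system: the gear train survives to time t only if every
    component (sun gear, planet gears, bearings, ...) survives."""
    r = 1.0
    for eta, beta in components:
        r *= weibull_reliability(t, eta, beta)
    return r

# Hypothetical Weibull parameters (characteristic life eta in hours,
# shape beta) for a sun gear, three planet gears, and a bearing set
components = [(9000, 2.5), (7000, 2.5), (7000, 2.5), (7000, 2.5), (12000, 1.5)]
print(round(system_reliability(2000, components), 3))   # → 0.801
```

    Because each factor is a Weibull survival function, the product is again of Weibull form when the shapes agree, which is why the paper can express the system's load-life behavior as a single power relationship built from the component capacities.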

  5. Conditional Reliability Coefficients for Test Scores.

    PubMed

    Nicewander, W Alan

    2017-04-06

The most widely used general index of measurement precision for psychological and educational test scores is the reliability coefficient: a ratio of true variance for a test score to the true-plus-error variance of the score. In item response theory (IRT) models for test scores, the information function is the central, conditional index of measurement precision. In this inquiry, conditional reliability coefficients for a variety of score types are derived as simple transformations of information functions. It is shown, for example, that the conditional reliability coefficient for an ordinary number-correct score, X, is equal to ρ(X, X′|θ) = I(X, θ)/[I(X, θ) + 1], where θ is a latent variable measured by an observed test score X; ρ(X, X′|θ) is the conditional reliability of X at a fixed value of θ; and I(X, θ) is the score information function. This is a surprisingly simple relationship between the two basic indices of measurement precision from IRT and classical test theory (CTT). This relationship holds for item scores as well as test scores based on sums of item scores, and it holds for dichotomous as well as polytomous items, or a mix of both item types. Also, conditional reliabilities are derived for computerized adaptive test scores, and for θ-estimates used as alternatives to number-correct scores. These conditional reliabilities are all related to information in a manner similar or identical to the one given above for the number-correct (NC) score.
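The abstract's central identity, ρ(X, X′|θ) = I(X, θ)/[I(X, θ) + 1], is a one-line transformation. The sketch below only illustrates that mapping; the information values used are hypothetical, not taken from the article.

```python
def conditional_reliability(info):
    """Conditional reliability at a fixed theta from the score information
    I(X, theta): rho = I / (I + 1)."""
    return info / (info + 1.0)

# Hypothetical score-information values at three theta points.
for info in (1.0, 4.0, 9.0):
    print(info, conditional_reliability(info))
```

Reliability approaches 1 as information grows without bound, and information of 1.0 corresponds to a conditional reliability of exactly 0.5, which makes the correspondence between the two precision indices easy to read off.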

  6. Crystalline-silicon reliability lessons for thin-film modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1985-01-01

    The reliability of crystalline silicon modules has been brought to a high level with lifetimes approaching 20 years, and excellent industry credibility and user satisfaction. The transition from crystalline modules to thin film modules is comparable to the transition from discrete transistors to integrated circuits. New cell materials and monolithic structures will require new device processing techniques, but the package function and design will evolve to a lesser extent. Although there will be new encapsulants optimized to take advantage of the mechanical flexibility and low temperature processing features of thin films, the reliability and life degradation stresses and mechanisms will remain mostly unchanged. Key reliability technologies in common between crystalline and thin film modules include hot spot heating, galvanic and electrochemical corrosion, hail impact stresses, glass breakage, mechanical fatigue, photothermal degradation of encapsulants, operating temperature, moisture sorption, circuit design strategies, product safety issues, and the process required to achieve a reliable product from a laboratory prototype.

  7. Interrelation Between Safety Factors and Reliability

    NASA Technical Reports Server (NTRS)

    Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)

    2001-01-01

An evaluation was performed to establish relationships between safety factors and reliability. Results obtained show that the use of safety factors is not contradictory to the employment of probabilistic methods. In many cases the safety factors can be directly expressed by the required reliability levels. However, there is a major difference that must be emphasized: whereas safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. The establishment of the interrelation between the concepts opens an avenue to specify safety factors based on reliability. In cases where there are several modes of failure, the allocation of safety factors should be based on having the same reliability associated with each failure mode. This immediately suggests that probabilistic methods can eliminate existing over-design or under-design. The report includes three parts: Part 1, Random Actual Stress and Deterministic Yield Stress; Part 2, Deterministic Actual Stress and Random Yield Stress; Part 3, Both Actual Stress and Yield Stress Are Random.
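For the report's Part 1 case (random actual stress, deterministic yield stress), the mapping between a safety factor and a reliability level can be sketched as follows. The sketch assumes normally distributed stress with a known coefficient of variation; the normality assumption and the numbers are mine for illustration, not stated in the abstract.

```python
from statistics import NormalDist

def reliability_from_safety_factor(n, cv):
    """Reliability P(stress < yield) for a central safety factor
    n = yield / mean_stress, assuming normally distributed stress with
    coefficient of variation cv = sigma / mean_stress."""
    return NormalDist().cdf((n - 1.0) / cv)

def safety_factor_for_reliability(r, cv):
    """Inverse mapping: the central safety factor achieving target reliability r."""
    return 1.0 + NormalDist().inv_cdf(r) * cv

n = safety_factor_for_reliability(0.999, cv=0.10)
print(n)                                        # required central safety factor
print(reliability_from_safety_factor(n, 0.10))  # recovers the target reliability
```

This is the sense in which a safety factor "can be directly expressed by the required reliability level": for a fixed stress distribution, the two are interchangeable through the standard normal quantile.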

  8. Have plants evolved to self-immolate?

    PubMed Central

    Bowman, David M. J. S.; French, Ben J.; Prior, Lynda D.

    2014-01-01

By definition, fire-prone ecosystems have highly combustible plants, leading to the hypothesis, first formally stated by Mutch in 1970, that community flammability is the product of natural selection of flammable traits. However, proving the "Mutch hypothesis" has presented an enormous challenge for fire ecologists, given the difficulty in establishing cause and effect between landscape fire and flammable plant traits. Individual plant traits (such as leaf moisture content, retention of dead branches and foliage, and oil-rich foliage) are known to affect the flammability of plants, but there is no evidence that these characters evolved specifically to self-immolate, although some of these traits may have been secondarily modified to increase the propensity to burn. Demonstrating individual benefits from self-immolation is extraordinarily difficult, given the intersection of the physical environmental factors that control landscape fire (fuel production, dryness, and ignitions) with community flammability properties that emerge from numerous traits of multiple species (canopy cover and litter-bed bulk density). It is more parsimonious to conclude that plants have evolved mechanisms to tolerate, but not promote, landscape fire. PMID:25414710

  9. Evolvability of an Optimal Recombination Rate.

    PubMed

    Lobkovsky, Alexander E; Wolf, Yuri I; Koonin, Eugene V

    2015-12-10

    Evolution and maintenance of genetic recombination and its relation to the mutational process is a long-standing, fundamental problem in evolutionary biology that is linked to the general problem of evolution of evolvability. We explored a stochastic model of the evolution of recombination using additive fitness and infinite allele assumptions but no assumptions on the sign or magnitude of the epistasis and the distribution of mutation effects. In this model, fluctuating negative epistasis and predominantly deleterious mutations arise naturally as a consequence of the additive fitness and a reservoir from which new alleles arrive with a fixed distribution of fitness effects. Analysis of the model revealed a nonmonotonic effect of recombination intensity on fitness, with an optimal recombination rate value which maximized fitness in steady state. The optimal recombination rate depended on the mutation rate and was evolvable, that is, subject to selection. The predictions of the model were compatible with the observations on the dependence between genome rearrangement rate and gene flux in microbial genomes.

  10. The Comet Cometh: Evolving Developmental Systems.

    PubMed

    Jaeger, Johannes; Laubichler, Manfred; Callebaut, Werner

In a recent opinion piece, Denis Duboule has claimed that the increasing shift towards systems biology is driving evolutionary and developmental biology apart, and that a true reunification of these two disciplines within the framework of evolutionary developmental biology (EvoDevo) may easily take another 100 years. He identifies methodological, epistemological, and social differences as causes for this supposed separation. Our article provides a contrasting view. We argue that Duboule's prediction is based on a one-sided understanding of systems biology as a science that is only interested in functional, not evolutionary, aspects of biological processes. Instead, we propose a research program for an evolutionary systems biology, which is based on local exploration of the configuration space in evolving developmental systems. We call this approach, which is based on reverse engineering, simulation, and mathematical analysis, the natural history of configuration space. We discuss a number of illustrative examples that demonstrate the past success of local exploration, as opposed to global mapping, in different biological contexts. We argue that this pragmatic mode of inquiry can be extended and applied to the mathematical analysis of the developmental repertoire and evolutionary potential of evolving developmental mechanisms, and that evolutionary systems biology so conceived provides a pragmatic epistemological framework for the EvoDevo synthesis.

  11. Early formation of evolved asteroidal crust.

    PubMed

    Day, James M D; Ash, Richard D; Liu, Yang; Bellucci, Jeremy J; Rumble, Douglas; McDonough, William F; Walker, Richard J; Taylor, Lawrence A

    2009-01-08

    Mechanisms for the formation of crust on planetary bodies remain poorly understood. It is generally accepted that Earth's andesitic continental crust is the product of plate tectonics, whereas the Moon acquired its feldspar-rich crust by way of plagioclase flotation in a magma ocean. Basaltic meteorites provide evidence that, like the terrestrial planets, some asteroids generated crust and underwent large-scale differentiation processes. Until now, however, no evolved felsic asteroidal crust has been sampled or observed. Here we report age and compositional data for the newly discovered, paired and differentiated meteorites Graves Nunatak (GRA) 06128 and GRA 06129. These meteorites are feldspar-rich, with andesite bulk compositions. Their age of 4.52 +/- 0.06 Gyr demonstrates formation early in Solar System history. The isotopic and elemental compositions, degree of metamorphic re-equilibration and sulphide-rich nature of the meteorites are most consistent with an origin as partial melts from a volatile-rich, oxidized asteroid. GRA 06128 and 06129 are the result of a newly recognized style of evolved crust formation, bearing witness to incomplete differentiation of their parent asteroid and to previously unrecognized diversity of early-formed materials in the Solar System.

  12. Novel cooperation experimentally evolved between species.

    PubMed

    Harcombe, William

    2010-07-01

Cooperation violates the view of "nature red in tooth and claw" that prevails in our understanding of evolution, yet examples of cooperation abound. Most work has focused on maintenance of cooperation within a single species through mechanisms such as kin selection. The factors necessary for the evolutionary origin of aiding unrelated individuals such as members of another species have not been experimentally tested. Here, I demonstrate that cooperation between species can be evolved in the laboratory if (1) there is preexisting reciprocation or feedback for cooperation, and (2) reciprocation is preferentially received by cooperative genotypes. I used a two-species system involving Salmonella enterica ser. Typhimurium and an Escherichia coli mutant unable to synthesize an essential amino acid. In lactose media Salmonella consumes metabolic waste from E. coli, thus creating a mechanism of reciprocation for cooperation. Growth in a spatially structured environment assured that the benefits of cooperation were preferentially received by cooperative genotypes. Salmonella evolved to aid E. coli by excreting a costly amino acid; however, this novel cooperation disappeared if the waste consumption or spatial structure was removed. This study builds on previous work to demonstrate an experimental origin of interspecific cooperation, and to test the factors necessary for such interactions to arise.

  13. Collapse of cooperation in evolving games

    PubMed Central

    Stewart, Alexander J.; Plotkin, Joshua B.

    2014-01-01

    Game theory provides a quantitative framework for analyzing the behavior of rational agents. The Iterated Prisoner’s Dilemma in particular has become a standard model for studying cooperation and cheating, with cooperation often emerging as a robust outcome in evolving populations. Here we extend evolutionary game theory by allowing players’ payoffs as well as their strategies to evolve in response to selection on heritable mutations. In nature, many organisms engage in mutually beneficial interactions and individuals may seek to change the ratio of risk to reward for cooperation by altering the resources they commit to cooperative interactions. To study this, we construct a general framework for the coevolution of strategies and payoffs in arbitrary iterated games. We show that, when there is a tradeoff between the benefits and costs of cooperation, coevolution often leads to a dramatic loss of cooperation in the Iterated Prisoner’s Dilemma. The collapse of cooperation is so extreme that the average payoff in a population can decline even as the potential reward for mutual cooperation increases. Depending upon the form of tradeoffs, evolution may even move away from the Iterated Prisoner’s Dilemma game altogether. Our work offers a new perspective on the Prisoner’s Dilemma and its predictions for cooperation in natural populations; and it provides a general framework to understand the coevolution of strategies and payoffs in iterated interactions. PMID:25422421
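As a fixed-payoff reference point for the coevolution result above, here is a minimal Iterated Prisoner's Dilemma sketch. It is not the authors' framework (payoffs here do not evolve), and the payoff values R, S, T, P are conventional textbook choices, not taken from the paper.

```python
R, S, T, P = 3, 0, 5, 1  # reward, sucker, temptation, punishment (T > R > P > S)

def play(strategy_a, strategy_b, rounds=100):
    """Average per-round payoffs when each strategy sees the opponent's last move."""
    last_a, last_b = 'C', 'C'  # both players treated as having cooperated initially
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strategy_a(last_b), strategy_b(last_a)
        if a == 'C' and b == 'C':
            score_a += R; score_b += R
        elif a == 'C' and b == 'D':
            score_a += S; score_b += T
        elif a == 'D' and b == 'C':
            score_a += T; score_b += S
        else:
            score_a += P; score_b += P
        last_a, last_b = a, b
    return score_a / rounds, score_b / rounds

tit_for_tat = lambda opponent_last: opponent_last
always_defect = lambda opponent_last: 'D'

print(play(tit_for_tat, tit_for_tat))    # sustained mutual cooperation
print(play(tit_for_tat, always_defect))  # cooperation collapses after round one
```

The collapse the paper describes operates one level up from this sketch: selection acts not only on the strategies passed to `play` but on the payoff values themselves, shifting the ratio of risk to reward for cooperation.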

  14. Caterpillars evolved from onychophorans by hybridogenesis.

    PubMed

    Williamson, Donald I

    2009-11-24

I reject the Darwinian assumption that larvae and their adults evolved from a single common ancestor. Rather, I posit that, in animals that metamorphose, the basic types of larvae originated as adults of different lineages, i.e., larvae were transferred when, through hybridization, their genomes were acquired by distantly related animals. "Caterpillars," the name for eruciforms with thoracic and abdominal legs, are larvae of lepidopterans, hymenopterans, and mecopterans (scorpionflies). Grubs and maggots, including the larvae of beetles, bees, and flies, evolved from caterpillars by loss of legs. Caterpillar larval organs are dismantled and reconstructed in the pupal phase. Such indirect developmental patterns (metamorphoses) did not originate solely by accumulation of random mutations followed by natural selection; rather they are fully consistent with my concept of evolution by hybridogenesis. Members of the phylum Onychophora (velvet worms) are proposed as the evolutionary source of caterpillars and their grub or maggot descendants. I present a molecular biological research proposal to test my thesis. By my hypothesis, two recognizable sets of genes are detectable in the genomes of all insects with caterpillar, grub-, or maggot-like larvae: (i) onychophoran genes that code for proteins determining larval morphology/physiology and (ii) sequentially expressed insect genes that code for adult proteins. The genomes of insects and other animals that, by contrast, entirely lack larvae comprise recognizable sets of genes from single animal common ancestors.

  15. BOOK REVIEW: OPENING SCIENCE, THE EVOLVING GUIDE ...

    EPA Pesticide Factsheets

The way we get our funding, collaborate, do our research, and get the word out has evolved over hundreds of years, but we can imagine a more open science world, largely facilitated by the internet. The movement towards this more open way of doing and presenting science is coming, and it is not taking hundreds of years. If you are interested in these trends, and would like to find out more about where this is all headed and what it means to you, consider downloading Opening Science, edited by Sönke Bartling and Sascha Friesike, subtitled The Evolving Guide on How the Internet is Changing Research, Collaboration, and Scholarly Publishing. In 26 chapters by various authors from a range of disciplines, the book explores the developing world of open science, starting from the first scientific revolution and bringing us to the next scientific revolution, sometimes referred to as “Science 2.0”. Some of the articles deal with the impact of the changing landscape of how science is done, looking at the impact of open science on academia, or journal publishing, or medical research. Many of the articles look at the uses, pitfalls, and impact of specific tools, like microblogging (think Twitter), social networking, and reference management. There is lots of discussion and definition of terms you might use or misuse like “altmetrics” and “impact factor”. Science will probably never be completely open, and Twitter will probably never replace the journal article,

  16. Collapse of cooperation in evolving games.

    PubMed

    Stewart, Alexander J; Plotkin, Joshua B

    2014-12-09

    Game theory provides a quantitative framework for analyzing the behavior of rational agents. The Iterated Prisoner's Dilemma in particular has become a standard model for studying cooperation and cheating, with cooperation often emerging as a robust outcome in evolving populations. Here we extend evolutionary game theory by allowing players' payoffs as well as their strategies to evolve in response to selection on heritable mutations. In nature, many organisms engage in mutually beneficial interactions and individuals may seek to change the ratio of risk to reward for cooperation by altering the resources they commit to cooperative interactions. To study this, we construct a general framework for the coevolution of strategies and payoffs in arbitrary iterated games. We show that, when there is a tradeoff between the benefits and costs of cooperation, coevolution often leads to a dramatic loss of cooperation in the Iterated Prisoner's Dilemma. The collapse of cooperation is so extreme that the average payoff in a population can decline even as the potential reward for mutual cooperation increases. Depending upon the form of tradeoffs, evolution may even move away from the Iterated Prisoner's Dilemma game altogether. Our work offers a new perspective on the Prisoner's Dilemma and its predictions for cooperation in natural populations; and it provides a general framework to understand the coevolution of strategies and payoffs in iterated interactions.

  17. CR reliability testing

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Rill, Lynn; Frost, Meryll M.; Staab, Edward V.

    1998-07-01

    The purpose of this work was to develop a method for systematically testing the reliability of a CR system under realistic daily loads in a non-clinical environment prior to its clinical adoption. Once digital imaging replaces film, it will be very difficult to revert back should the digital system become unreliable. Prior to the beginning of the test, a formal evaluation was performed to set the benchmarks for performance and functionality. A formal protocol was established that included all the 62 imaging plates in the inventory for each 24-hour period in the study. Imaging plates were exposed using different combinations of collimation, orientation, and SID. Anthropomorphic phantoms were used to acquire images of different sizes. Each combination was chosen randomly to simulate the differences that could occur in clinical practice. The tests were performed over a wide range of times with batches of plates processed to simulate the temporal constraints required by the nature of portable radiographs taken in the Intensive Care Unit (ICU). Current patient demographics were used for the test studies so automatic routing algorithms could be tested. During the test, only three minor reliability problems occurred, two of which were not directly related to the CR unit. One plate was discovered to cause a segmentation error that essentially reduced the image to only black and white with no gray levels. This plate was removed from the inventory to be replaced. Another problem was a PACS routing problem that occurred when the DICOM server with which the CR was communicating had a problem with disk space. The final problem was a network printing failure to the laser cameras. Although the units passed the reliability test, problems with interfacing to workstations were discovered. The two issues that were identified were the interpretation of what constitutes a study for CR and the construction of the look-up table for a proper gray scale display.

  18. Ultimately Reliable Pyrotechnic Systems

    NASA Technical Reports Server (NTRS)

    Scott, John H.; Hinkel, Todd

    2015-01-01

This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high-reliability operation in extreme environments and illustrates the potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the more successful in the history of human spaceflight: no pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably at temperatures as low as -420 F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, with never a failure. The landing on Mars of the Curiosity rover fired 174 of NASA's pyrotechnic initiators to complete the famous "7 minutes of terror." Even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world. Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing

  19. Reliability Growth Prediction

    DTIC Science & Technology

    1986-09-01

the Duane model because the reliability growth data analyzed were reflective of a single test for each equipment as opposed to a series of tests ... fabrication) and costs which are a function of test length (e.g., chamber operations). A life-cycle cost model, Ref. 14 for example, can be exercised to ... J. Gibson and K. K. Mcain. APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED. ROME AIR DEVELOPMENT CENTER, Air Force Systems Command, Griffiss

  20. Blade reliability collaborative :

    SciTech Connect

    Ashwill, Thomas D.; Ogilvie, Alistair B.; Paquette, Joshua A.

    2013-04-01

    The Blade Reliability Collaborative (BRC) was started by the Wind Energy Technologies Department of Sandia National Laboratories and DOE in 2010 with the goal of gaining insight into planned and unplanned O&M issues associated with wind turbine blades. A significant part of BRC is the Blade Defect, Damage and Repair Survey task, which will gather data from blade manufacturers, service companies, operators and prior studies to determine details about the largest sources of blade unreliability. This report summarizes the initial findings from this work.

  1. Reliable VLSI sequential controllers

    NASA Technical Reports Server (NTRS)

    Whitaker, S.; Maki, G.; Shamanna, M.

    1990-01-01

A VLSI architecture for synchronous sequential controllers is presented that has attractive qualities for producing reliable circuits. In these circuits, one hardware implementation can realize any flow table with a maximum of 2^n internal states and m inputs, and all design equations are identical. A real-time fault-detection means is presented along with a strategy for verifying the correctness of the checking hardware. This self-check feature can be employed with no increase in hardware. The architecture can be modified to achieve fail-safe designs. With no increase in hardware, an adaptable circuit can be realized that allows replacement of faulty transitions with fault-free transitions.

  2. Ferrite logic reliability study

    NASA Technical Reports Server (NTRS)

    Baer, J. A.; Clark, C. B.

    1973-01-01

Development and use of digital circuits called all-magnetic logic are reported. In these circuits, the magnetic elements and their windings comprise the active circuit devices in the logic portion of a system. The ferrite logic (FLO) device belongs to the all-magnetic class of logic circuits. The FLO device is novel in that it makes use of a dual, or bimaterial, ferrite composition in one physical ceramic body. This bimaterial feature, coupled with its potential for relatively high-speed operation (maximum speed of operation approximately 50 kHz), makes it attractive for high-reliability applications.

  3. Testing of reliability - Analysis tools

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1989-01-01

An outline is presented of issues raised in verifying the accuracy of reliability analysis tools. State-of-the-art reliability analysis tools implement various decomposition, aggregation, and estimation techniques to compute the reliability of a diversity of complex fault-tolerant computer systems. However, no formal methodology has been formulated for validating the reliability estimates produced by these tools. The author presents three stages of testing that can be performed on most reliability analysis tools to effectively increase confidence in a tool. These testing stages were applied to the SURE (Semi-Markov Unreliability Range Evaluator) reliability analysis tool, and the results of the testing are discussed.

  4. Understanding the Elements of Operational Reliability: A Key for Achieving High Reliability

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    2010-01-01

    This viewgraph presentation reviews operational reliability and its role in achieving high reliability through design and process reliability. The topics include: 1) Reliability Engineering Major Areas and interfaces; 2) Design Reliability; 3) Process Reliability; and 4) Reliability Applications.

  5. Initial value sensitivity of the Chinese stock market and its relationship with the investment psychology

    NASA Astrophysics Data System (ADS)

    Ying, Shangjun; Li, Xiaojun; Zhong, Xiuqin

    2015-04-01

This paper discusses the initial value sensitivity (IVS) of the Chinese stock market, including single stocks and the Chinese A-share stock market as a whole, with respect to real markets and evolving models. The aim is to explore the relationship between the IVS of the Chinese A-share stock market and investment psychology, based on an evolving model of genetic cellular automata (GCA). We find: (1) the Chinese stock market is sensitively dependent on initial conditions; (2) the GCA model provides considerable reliability in complexity simulation (e.g., the IVS); (3) the IVS of the stock market is positively correlated with the imitation probability once the intensity of the imitation psychology reaches a certain threshold. The paper suggests that the government should seek to keep imitation psychology under a certain level; otherwise it may induce severe fluctuation in the market.

  6. Load Control System Reliability

    SciTech Connect

    Trudnowski, Daniel

    2015-04-03

This report summarizes the results of the Load Control System Reliability project (DOE Award DE-FC26-06NT42750). The original grant was awarded to Montana Tech in April 2006. Follow-on DOE awards and expansions to the project scope occurred in August 2007, January 2009, April 2011, and April 2013. In addition to the DOE monies, the project also included matching funds from the states of Montana and Wyoming. Project participants included Montana Tech; the University of Wyoming; Montana State University; NorthWestern Energy, Inc.; and MSE. Research focused on two areas: real-time power-system load control methodologies, and power-system measurement-based stability-assessment operation and control tools. The majority of effort was focused on area 2. Results from the research include: development of fundamental power-system dynamic concepts, control schemes, and signal-processing algorithms; many papers (including two prize papers) in leading journals and conferences, and leadership of IEEE activities; one patent; participation in major actual-system testing in the western North American power system; prototype power-system operation and control software installed and tested at three major North American control centers; and the incubation of a new commercial-grade operation and control software tool. Work under this grant certainly supported the DOE-OE goals in the area of “Real Time Grid Reliability Management.”

  7. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1990-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.
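The length-from-resistance check in this abstract reduces to a simple calculation: infer the serpentine conductor's effective length from its end-to-end resistance and a per-unit-length resistance, then flag the device when the excess over the design length passes a threshold. In the sketch below, the resistance values and the 5% threshold are hypothetical, not taken from the patent.

```python
def percent_excess_length(r_measured_ohm, r_per_mm_ohm, design_length_mm):
    """Percent by which the resistance-inferred length of the serpentine
    assay conductor exceeds its design length; high resistances at steps
    between regions of different height inflate this figure."""
    inferred_length_mm = r_measured_ohm / r_per_mm_ohm
    return 100.0 * (inferred_length_mm - design_length_mm) / design_length_mm

REJECT_THRESHOLD_PCT = 5.0  # hypothetical; in practice set by required reliability

excess = percent_excess_length(1.10, 0.01, 100.0)
print(excess)                          # about 10 percent over design length
print(excess > REJECT_THRESHOLD_PCT)   # True: this wafer would be discarded
```

The per-unit-length resistance would itself come from the two uniform test conductors described above, which calibrate the film on regions of each height before the serpentine measurement is interpreted.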

  8. Integrated circuit reliability testing

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor); Sayah, Hoshyar R. (Inventor)

    1988-01-01

    A technique is described for use in determining the reliability of microscopic conductors deposited on an uneven surface of an integrated circuit device. A wafer containing integrated circuit chips is formed with a test area having regions of different heights. At the time the conductors are formed on the chip areas of the wafer, an elongated serpentine assay conductor is deposited on the test area so the assay conductor extends over multiple steps between regions of different heights. Also, a first test conductor is deposited in the test area upon a uniform region of first height, and a second test conductor is deposited in the test area upon a uniform region of second height. The occurrence of high resistances at the steps between regions of different height is indicated by deriving the measured length of the serpentine conductor using the resistance measured between the ends of the serpentine conductor, and comparing that to the design length of the serpentine conductor. The percentage by which the measured length exceeds the design length, at which the integrated circuit will be discarded, depends on the required reliability of the integrated circuit.

  9. The Reliability of Neurons

    PubMed Central

    Bullock, Theodore Holmes

    1970-01-01

The prevalent probabilistic view is virtually untestable; it remains a plausible belief. The cases usually cited cannot be taken as evidence for it. Several grounds for this conclusion are developed. Three issues are distinguished in an attempt to clarify a murky debate: (a) the utility of probabilistic methods in data reduction, (b) the value of models that assume indeterminacy, and (c) the validity of the inference that the nervous system is largely indeterministic at the neuronal level. No exception is taken to the first two; the second is a private heuristic question. The third is the issue to which the assertion in the first two sentences is addressed. Of the two kinds of uncertainty, statistical mechanical (= practical unpredictability), as in a gas, and Heisenbergian indeterminacy, the first certainly exists; the second is moot at the neuronal level. It would contribute to discussion to recognize that neurons perform with a degree of reliability. Although unreliability is difficult to establish, to say nothing of measure, evidence that some neurons have a high degree of reliability, in both connections and activity, is increasing greatly. An example is given from sternarchine electric fish. PMID:5462670

  10. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and to answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made do not require the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this fact by analyzing the data of an experiment where photonic entangled two-photon states were generated and whose entanglement is verified with the use of an accessible nonlinear witness.

  11. Analysis of an evolving email network

    NASA Astrophysics Data System (ADS)

    Zhu, Chaopin; Kuh, Anthony; Wang, Juan; de Wilde, Philippe

    2006-10-01

In this paper we study an evolving email network model, first introduced (to the best of our knowledge) by Wang and De Wilde. The model is analyzed by formulating the network topology as a random process and studying the dynamics of the process. Our analytical results establish a number of steady-state properties of the email traffic between different nodes and of the aggregate networking behavior (i.e., degree distribution, clustering coefficient, average path length, and phase transition), and also confirm the empirical results obtained by Wang and De Wilde. Extensive simulations confirm the analytical results and further evaluate email traffic behavior at the link and network levels, phase transition phenomena, and the behavior of email traffic in a hierarchical network. The methods established here are also applicable to many other practical networks, including sensor networks and social networks.
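The aggregate metrics named in this abstract (degree distribution, clustering coefficient, average path length) can be illustrated on a toy undirected contact graph. This is a minimal sketch of the metrics only, not of the Wang-De Wilde model itself:

```python
# Compute degree, average clustering, and average path length on a small
# hand-built undirected graph (a triangle 0-1-2 with a pendant node 3).
from collections import deque

edges = {(0, 1), (1, 2), (0, 2), (2, 3)}
nodes = {u for e in edges for u in e}
adj = {u: set() for u in nodes}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

degree = {u: len(adj[u]) for u in nodes}

def local_clustering(u):
    """Fraction of a node's neighbor pairs that are themselves connected."""
    k = len(adj[u])
    if k < 2:
        return 0.0
    links = sum(1 for v in adj[u] for w in adj[u] if v < w and w in adj[v])
    return 2.0 * links / (k * (k - 1))

avg_clustering = sum(local_clustering(u) for u in nodes) / len(nodes)

def shortest_paths(src):
    """Breadth-first search distances from src."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

pair_dists = [d for u in nodes for v, d in shortest_paths(u).items() if v != u]
avg_path_length = sum(pair_dists) / len(pair_dists)
```

For this graph the hub (node 2) has degree 3, the average clustering is 7/12, and the average path length is 4/3.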

  12. Pulmonary Sporotrichosis: An Evolving Clinical Paradigm.

    PubMed

    Aung, Ar K; Spelman, Denis W; Thompson, Philip J

    2015-10-01

In recent decades, sporotrichosis, caused by thermally dimorphic fungi of the Sporothrix schenckii complex, has become an emerging infection in many parts of the world. Pulmonary infection with S. schenckii still remains relatively uncommon, possibly due to underrecognition. Pulmonary sporotrichosis presents with distinct clinical and radiological patterns in both immunocompetent and immunocompromised hosts and can often result in significant morbidity and mortality despite treatment. Current understanding regarding S. schenckii biology, epidemiology, immunopathology, clinical diagnostics, and treatment options has been evolving in recent years with increased availability of molecular sequencing techniques. However, this changing knowledge has not yet been fully translated into a better understanding of the clinical aspects of pulmonary sporotrichosis; as such, current management guidelines remain unsupported by high-level clinical evidence. This article examines recent advances in the knowledge of sporotrichosis and its application to the difficult challenges of managing pulmonary sporotrichosis.

  13. Evolving resistance among Gram-positive pathogens.

    PubMed

    Munita, Jose M; Bayer, Arnold S; Arias, Cesar A

    2015-09-15

    Antimicrobial therapy is a key component of modern medical practice and a cornerstone for the development of complex clinical interventions in critically ill patients. Unfortunately, the increasing problem of antimicrobial resistance is now recognized as a major public health threat jeopardizing the care of thousands of patients worldwide. Gram-positive pathogens exhibit an immense genetic repertoire to adapt and develop resistance to virtually all antimicrobials clinically available. As more molecules become available to treat resistant gram-positive infections, resistance emerges as an evolutionary response. Thus, antimicrobial resistance has to be envisaged as an evolving phenomenon that demands constant surveillance and continuous efforts to identify emerging mechanisms of resistance to optimize the use of antibiotics and create strategies to circumvent this problem. Here, we will provide a broad perspective on the clinical aspects of antibiotic resistance in relevant gram-positive pathogens with emphasis on the mechanistic strategies used by these organisms to avoid being killed by commonly used antimicrobial agents.

  14. The distances of highly evolved planetary nebulae

    NASA Astrophysics Data System (ADS)

    Phillips, J. P.

    2005-02-01

The central stars of highly evolved planetary nebulae (PNe) are expected to have closely similar absolute visual magnitudes MV. This enables us to determine approximate distances to these sources where their central star visual magnitudes and levels of extinction are known. We find that such an analysis implies values of D which are similar to those determined by Phillips; Cahn, Kaler & Stanghellini; Acker, and Daub. However, our distances are very much smaller than those of Zhang; Bensby & Lundstrom, and van de Steene & Zijlstra. The reasons for these differences are discussed, and can be traced to errors in the assumed relation between brightness temperature and radius. Finally, we determine that the binary companions of such stars can be no brighter than MV ~ 6 mag, implying a spectral type of K0 or later in the case of main-sequence stars.

  15. Synchronization in evolving snowdrift game model

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Wu, L.; Zhu, S. Q.

    2009-06-01

    The interaction between the evolution of the game and the underlying network structure with evolving snowdrift game model is investigated. The constructed network follows a power-law degree distribution typically showing scale-free feature. The topological features of average path length, clustering coefficient, degree-degree correlations and the dynamical feature of synchronizability are studied. The synchronizability of the constructed networks changes by the interaction. It will converge to a certain value when sufficient new nodes are added. It is found that initial payoffs of nodes greatly affect the synchronizability. When initial payoffs for players are equal, low common initial payoffs may lead to more heterogeneity of the network and good synchronizability. When initial payoffs follow certain distributions, better synchronizability is obtained compared to equal initial payoff. The result is also true for phase synchronization of nonidentical oscillators.
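The snowdrift game underlying this model has a standard payoff structure: with benefit b and cost c (b > c > 0), two cooperators split the cost, a lone cooperator bears it all, and mutual defection pays nothing. A minimal sketch with illustrative parameters (not taken from the paper):

```python
# Standard snowdrift payoffs; b and c are illustrative choices.
def snowdrift_payoff(me, other, b=1.0, c=0.6):
    """Payoff to `me` ('C' or 'D') playing against `other`."""
    if me == 'C' and other == 'C':
        return b - c / 2.0      # cooperators share the cost
    if me == 'C' and other == 'D':
        return b - c            # lone cooperator pays the full cost
    if me == 'D' and other == 'C':
        return b                # defector free-rides
    return 0.0                  # mutual defection

# Unlike the prisoner's dilemma, the best reply to a defector is to cooperate,
# which is what sustains mixed cooperator/defector populations in this game:
best_vs_defector = max(('C', 'D'), key=lambda s: snowdrift_payoff(s, 'D'))
```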

  16. Design Space Issues for Intrinsic Evolvable Hardware

    NASA Technical Reports Server (NTRS)

    Hereford, James; Gwaltney, David

    2004-01-01

This paper discusses the problem of increased programming time for intrinsic evolvable hardware (EHW) as the complexity of the circuit grows. We develop equations for the size of the population, n, and the number of generations required for the population to converge, ngen, based on L, the length of the programming string. We show that the processing time of the computer becomes negligible for intrinsic EHW since the selection/crossover/mutation steps are only done once per generation, suggesting there is room for use of more complex evolutionary algorithms in intrinsic EHW. Finally, we review the state of the practice and discuss the notion of a system design approach for intrinsic EHW.

  17. Evolving unipolar memristor spiking neural networks

    NASA Astrophysics Data System (ADS)

    Howard, David; Bull, Larry; De Lacy Costello, Ben

    2015-10-01

Neuromorphic computing - brain-like computing in hardware - typically requires myriad complementary metal oxide semiconductor spiking neurons interconnected by a dense mesh of nanoscale plastic synapses. Memristors are frequently cited as strong synapse candidates due to their statefulness and potential for low-power implementations. To date, plentiful research has focused on the bipolar memristor synapse, which is capable of incremental weight alterations and can provide adaptive self-organisation under a Hebbian learning scheme. In this paper, we consider the unipolar memristor synapse - a device capable of non-Hebbian switching between only two states (conductive and resistive) through application of a suitable input voltage - and discuss its suitability for neuromorphic systems. A self-adaptive evolutionary process is used to autonomously find highly fit network configurations. Experimentation on two robotics tasks shows that unipolar memristor networks evolve task-solving controllers faster than both bipolar memristor networks and networks containing constant non-plastic connections whilst performing at least comparably.

  18. Life cycle planning: An evolving concept

    SciTech Connect

    Moore, P.J.R.; Gorman, I.G.

    1994-12-31

Life-cycle planning is an evolving concept in the management of oil and gas projects. BHP Petroleum now interprets this idea to include all development planning from discovery and field appraisal to final abandonment, covering safety, environmental, technical, plant, regulatory, and staffing issues. This article describes, in the context of the Timor Sea, how, despite initial successes and continuing facilities upgrades, BHPP came to perceive that current operations could be the victim of early development successes, particularly in the areas of corrosion and maintenance. The search for analogies elsewhere led to the UK North Sea, including the experiences of Britoil and BP, both of which performed detailed Life of Field studies in the late 1980s. These materials have been used to construct a format and content for total life-cycle plans in general, and the social changes required to ensure their successful application in Timor Sea operations and deployment throughout Australia.

  19. Modelling of the Evolving Stable Boundary Layer

    NASA Astrophysics Data System (ADS)

    Sorbjan, Zbigniew

    2014-06-01

A single-column model of the evolving stable boundary layer (SBL) is tested for self-similar properties of the flow and effects of ambient forcing. The turbulence closure of the model is diagnostic, based on the K-theory approach, with a semi-empirical form of the mixing length, and empirical stability functions of the Richardson number. The model results, expressed in terms of local similarity scales, are universal functions, satisfied in the entire SBL. Based on the similarity expressions, a realizability condition is derived for the minimum allowable turbulent heat flux in the SBL. Numerical experiments show that the development of "horse-shoe" shaped, fixed-elevation hodographs in the interior of the SBL around sunrise is controlled by effects imposed by surface thermal forcing.

  20. Language as a coordination tool evolves slowly

    PubMed Central

    2016-01-01

    Social living ultimately depends on coordination between group members, and communication is necessary to make this possible. We suggest that this might have been the key selection pressure acting on the evolution of language in humans and use a behavioural coordination model to explore the impact of communication efficiency on social group coordination. We show that when language production is expensive but there is an individual benefit to the efficiency with which individuals coordinate their behaviour, the evolution of efficient communication is selected for. Contrary to some views of language evolution, the speed of evolution is necessarily slow because there is no advantage in some individuals evolving communication abilities that much exceed those of the community at large. However, once a threshold competence has been achieved, evolution of higher order language skills may indeed be precipitate. PMID:28083091

  1. Regulatory mechanisms link phenotypic plasticity to evolvability.

    PubMed

    van Gestel, Jordi; Weissing, Franz J

    2016-04-18

    Organisms have a remarkable capacity to respond to environmental change. They can either respond directly, by means of phenotypic plasticity, or they can slowly adapt through evolution. Yet, how phenotypic plasticity links to evolutionary adaptability is largely unknown. Current studies of plasticity tend to adopt a phenomenological reaction norm (RN) approach, which neglects the mechanisms underlying plasticity. Focusing on a concrete question - the optimal timing of bacterial sporulation - we here also consider a mechanistic approach, the evolution of a gene regulatory network (GRN) underlying plasticity. Using individual-based simulations, we compare the RN and GRN approach and find a number of striking differences. Most importantly, the GRN model results in a much higher diversity of responsive strategies than the RN model. We show that each of the evolved strategies is pre-adapted to a unique set of unseen environmental conditions. The regulatory mechanisms that control plasticity therefore critically link phenotypic plasticity to the adaptive potential of biological populations.

  2. Fatigue Reliability of Gas Turbine Engine Structures

    NASA Technical Reports Server (NTRS)

    Cruse, Thomas A.; Mahadevan, Sankaran; Tryon, Robert G.

    1997-01-01

The results of an investigation are described for fatigue reliability in engine structures. The description consists of two parts. Part 1 is for method development. Part 2 is a specific case study. In Part 1, the essential concepts and practical approaches to damage tolerance design in the gas turbine industry are summarized. These have evolved over the years in response to flight safety certification requirements. The effect of Non-Destructive Evaluation (NDE) methods on these methods is also reviewed. Assessment methods based on probabilistic fracture mechanics, with regard to both crack initiation and crack growth, are outlined. Limit state modeling techniques from structural reliability theory are shown to be appropriate for application to this problem, for both individual failure mode and system-level assessment. In Part 2, the results of a case study for the high pressure turbine of a turboprop engine are described. The response surface approach is used to construct a fatigue performance function. This performance function is used with the First Order Reliability Method (FORM) to determine the probability of failure and the sensitivity of the fatigue life to the engine parameters for the first stage disk rim of the two stage turbine. A hybrid combination of regression and Monte Carlo simulation is used to incorporate time-dependent random variables. System reliability is used to determine the system probability of failure, and the sensitivity of the system fatigue life to the engine parameters of the high pressure turbine. The variation in the primary hot gas and secondary cooling air, the uncertainty of the complex mission loading, and the scatter in the material data are considered.
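The probability-of-failure idea behind this kind of assessment can be sketched with plain Monte Carlo over a limit state g(X) = capacity - demand, where g <= 0 means failure. The paper pairs a response-surface performance function with FORM; the distributions and numbers below are purely illustrative:

```python
# Monte Carlo estimate of P[g <= 0] for a hypothetical limit state.
import random

random.seed(42)

def limit_state(capacity, demand):
    """g <= 0 is the failure event."""
    return capacity - demand

N = 100_000
failures = 0
for _ in range(N):
    capacity = random.gauss(10.0, 1.0)   # e.g. fatigue strength (illustrative)
    demand = random.gauss(6.0, 1.5)      # e.g. applied stress (illustrative)
    if limit_state(capacity, demand) <= 0:
        failures += 1

pf = failures / N   # probability-of-failure estimate, here roughly 1%
```

FORM replaces this sampling with a search for the most probable failure point, which is far cheaper when each limit-state evaluation requires a structural analysis.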

  3. Studying evolved stars with Herschel observations

    NASA Astrophysics Data System (ADS)

    da Silva Santos, João Manuel

    2016-07-01

A systematic inspection of the far-infrared (FIR) properties of evolved stars allows us not only to constrain physical models, but also to understand the chemical evolution that takes place at the end of their lives. In this work we intend to study the circumstellar envelopes (CSE) of a sample of stars in the THROES catalogue, from AGB/post-AGB stars to planetary nebulae, using photometry and spectroscopy provided by the PACS instrument on board the Herschel telescope. In the first part we are interested in obtaining an estimate of the size of the FIR emitting region and in sorting our targets into two classes: point-like and extended. Secondly, we focus on the molecular component of the envelope traced by carbon monoxide (CO) rotational lines. We conduct a line survey on a sample of evolved stars by identifying and measuring the flux of both 12CO and 13CO isotopologues in the PACS range, while looking at the overall properties of the sample. Lastly, we are interested in obtaining physical parameters of the CSE, namely gas temperature, mass, and mass-loss rate, for a sample of carbon stars. For that, we make use of the large wavelength coverage of PACS, which enables the simultaneous study of a large number of CO transitions, to perform a rotational diagram analysis. We report the detection of CO emission in a large number of stars from the catalogue, mostly classified as point-like targets with a few exceptions of planetary nebulae. High-J rotational transitions were detected in a number of targets, revealing the presence of a significant amount of hot gas (T ˜ 400-900 K) and high mass-loss rates. We conclude that Herschel/PACS is in a privileged position to detect a new population of warmer gas, typically missed in sub-mm/mm observations.
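The rotational diagram analysis mentioned above rests on a standard relation: in LTE, ln(N_u/g_u) falls linearly with the upper-level energy E_u/k, with slope -1/T_rot. A minimal sketch on synthetic data (not THROES measurements; the temperature and level energies are invented for illustration):

```python
# Fit a straight line to a synthetic rotational diagram and recover T_rot.
T_true = 500.0                        # K, assumed excitation temperature
E_over_k = [50, 150, 300, 500, 800]   # upper-level energies E_u/k in K
y = [10.0 - e / T_true for e in E_over_k]   # ln(N_u/g_u) up to a constant

# Ordinary least-squares slope; -1/slope is the rotational temperature.
n = len(E_over_k)
mx = sum(E_over_k) / n
my = sum(y) / n
slope = sum((x - mx) * (yi - my) for x, yi in zip(E_over_k, y)) / \
        sum((x - mx) ** 2 for x in E_over_k)
T_rot = -1.0 / slope
```

With noiseless points the fit returns the input temperature exactly; real PACS line fluxes would first be converted to column densities N_u before taking the logarithm.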

  4. Evolving surgical approaches in liver transplantation.

    PubMed

    Petrowsky, Henrik; Busuttil, Ronald W

    2009-02-01

The growing discrepancy between the need and the availability of donor livers has resulted in evolving surgical approaches in liver transplantation during the last two decades to expand the donor pool. One approach is to transplant partial grafts, obtained either from a living donor or by splitting a cadaveric donor liver. For both surgical methods, it is important to obtain a minimal viable graft volume to prevent small-for-size syndrome and graft failure. This minimal volume, expressed as graft-to-whole body ratio, must be between 0.8 and 1%. Living donor liver transplantation (LDLT) became the primary transplant option in many Asian countries and is increasingly performed as an adjunct transplant option in countries with low donation rates. Split liver transplantation (SLT) is a surgical method that creates two allografts from one deceased donor. The most widely used splitting technique is the division of the liver into a left lateral sectoral graft (segments 2 and 3) for a pediatric patient and a right trisegmental graft (segments 1 and 4 to 8) for an adult patient. Both LDLT and SLT are also important and established methods for the treatment of pediatric patients. Another evolving surgical approach is auxiliary liver transplantation, which describes transplanting a whole or partial graft with preservation of part of the native liver. This bridging technique is applied in patients with fulminant liver failure and should allow the regeneration of the injured liver with the potential to discontinue immunosuppression. Other methods such as xenotransplantation, as well as hepatocyte and stem cell transplantation, are promising approaches that are still in experimental phases.

  5. How evolved psychological mechanisms empower cultural group selection.

    PubMed

    Henrich, Joseph; Boyd, Robert

    2016-01-01

    Driven by intergroup competition, social norms, beliefs, and practices can evolve in ways that more effectively tap into a wide variety of evolved psychological mechanisms to foster group-beneficial behavior. The more powerful such evolved mechanisms are, the more effectively culture can potentially harness and manipulate them to generate greater phenotypic variation across groups, thereby fueling cultural group selection.

  6. Static and Evolving Norovirus Genotypes: Implications for Epidemiology and Immunity.

    PubMed

    Parra, Gabriel I; Squires, R Burke; Karangwa, Consolee K; Johnson, Jordan A; Lepore, Cara J; Sosnovtsev, Stanislav V; Green, Kim Y

    2017-01-01

    Noroviruses are major pathogens associated with acute gastroenteritis worldwide. Their RNA genomes are diverse, with two major genogroups (GI and GII) comprised of at least 28 genotypes associated with human disease. To elucidate mechanisms underlying norovirus diversity and evolution, we used a large-scale genomics approach to analyze human norovirus sequences. Comparison of over 2000 nearly full-length ORF2 sequences representing most of the known GI and GII genotypes infecting humans showed a limited number (≤5) of distinct intra-genotypic variants within each genotype, with the exception of GII.4. The non-GII.4 genotypes were comprised of one or more intra-genotypic variants, with each variant containing strains that differed by only a few residues over several decades (remaining "static") and that have co-circulated with no clear epidemiologic pattern. In contrast, the GII.4 genotype presented the largest number of variants (>10) that have evolved over time with a clear pattern of periodic variant replacement. To expand our understanding of these two patterns of diversification ("static" versus "evolving"), we analyzed using NGS the nearly full-length norovirus genome in healthy individuals infected with GII.4, GII.6 or GII.17 viruses in different outbreak settings. The GII.4 viruses accumulated mutations rapidly within and between hosts, while the GII.6 and GII.17 viruses remained relatively stable, consistent with their diversification patterns. Further analysis of genetic relationships and natural history patterns identified groupings of certain genotypes into larger related clusters designated here as "immunotypes". We propose that "immunotypes" and their evolutionary patterns influence the prevalence of a particular norovirus genotype in the human population.

  7. Reliability of Fault Tolerant Control Systems. Part 1

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva

    2001-01-01

This paper reports Part I of a two-part effort intended to delineate the relationship between reliability and fault tolerant control in a quantitative manner. Reliability analysis of fault-tolerant control systems is performed using Markov models. Reliability properties peculiar to fault-tolerant control systems are emphasized; as a consequence, coverage of failures through redundancy management can be severely limited. It is shown that in the early life of a system composed of highly reliable subsystems, the reliability of the overall system is affine with respect to coverage, and inadequate coverage induces dominant single point failures. The utility of some existing software tools for assessing the reliability of fault tolerant control systems is also discussed. Coverage modeling is attempted in Part II in a way that captures its dependence on the control performance and on the diagnostic resolution.
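The affine dependence of early-life unreliability on coverage can be illustrated with a toy duplex Markov model: two active units each fail at rate lam; a covered failure (probability c) degrades the system to simplex operation, while an uncovered one fails the system outright. The structure and rates here are assumptions for illustration, not the paper's models:

```python
# Integrate the 3-state duplex Markov chain with forward Euler and show
# that the resulting system unreliability is (very nearly) affine in c.
def duplex_unreliability(c, lam=1e-4, t=10.0, steps=10_000):
    p2, p1, pf = 1.0, 0.0, 0.0      # duplex up, simplex up, system failed
    dt = t / steps
    for _ in range(steps):
        d2 = -2 * lam * p2                       # leave duplex state
        d1 = 2 * lam * c * p2 - lam * p1         # covered failures arrive
        df = 2 * lam * (1 - c) * p2 + lam * p1   # uncovered + simplex failures
        p2 += d2 * dt
        p1 += d1 * dt
        pf += df * dt
    return pf

# Early in life, pf(c) is a straight line in c: the midpoint value sits on
# the chord between the endpoints, and better coverage lowers unreliability.
q0, q_half, q1 = (duplex_unreliability(c) for c in (0.0, 0.5, 1.0))
midpoint_gap = abs(q_half - (q0 + q1) / 2)
```

This is the single-point-failure effect described above: with lam*t small, pf is dominated by the 2*lam*t*(1-c) uncovered-failure term, so imperfect coverage swamps the benefit of redundancy.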

  8. Testing for PV Reliability (Presentation)

    SciTech Connect

    Kurtz, S.; Bansal, S.

    2014-09-01

    The DOE SUNSHOT workshop is seeking input from the community about PV reliability and how the DOE might address gaps in understanding. This presentation describes the types of testing that are needed for PV reliability and introduces a discussion to identify gaps in our understanding of PV reliability testing.

  9. Discrete Reliability Projection

    DTIC Science & Technology

    2014-12-01

Defense Handbook MIL-HDBK-189C, 2011; Hall, J. B., Methodology for Evaluating Reliability Growth Programs of Discrete Systems, Ph.D. thesis, University... In Hall's model (equation 2.13), m is the number of observed failure modes and d*i estimates di. Sample failure-mode data (Mode; Failures Ni; FEF d*i): 1; 1; 0.95. 2; 1; 0.70. 3; 1; 0.90. 4; 1; 0.90. 5; 4; 0.95. 6; 2; 0.70. 7; 1; 0.80. Using equations 2.1 and 2.2 we can calculate the failure

  10. Sustaining an International Partnership: An Evolving Collaboration

    ERIC Educational Resources Information Center

    Pierson, Melinda R.; Myck-Wayne, Janice; Stang, Kristin K.; Basinska, Anna

    2015-01-01

    Universities across the United States have an increasing interest in international education. Increasing global awareness through educational collaborations will promote greater cross-cultural understanding and build effective relationships with diverse communities. This paper documents one university's effort to build an effective international…

  11. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
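The modularization idea can be sketched with the simplest composition rules: evaluate per-module reliabilities separately, then combine them. REST/RML support much richer composition and failure mode effects simulation, which this toy (with hypothetical module reliabilities) does not attempt:

```python
# Series/parallel composition of independent module reliabilities.
def series(*rs):
    """System works only if every module works."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(*rs):
    """System works if at least one redundant module works."""
    out = 1.0
    for r in rs:
        out *= (1.0 - r)
    return 1.0 - out

# A sensor feeding a dual-redundant processor pair feeding an actuator
# (reliabilities are illustrative): modules are modeled independently,
# then combined into a total system reliability.
r_system = series(0.99, parallel(0.95, 0.95), 0.98)
```

The one-to-one mapping between components and modules is what makes a graphical model description natural, as the abstract notes.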

  12. Epidemic spreading on evolving signed networks

    NASA Astrophysics Data System (ADS)

    Saeedian, M.; Azimi-Tafreshi, N.; Jafari, G. R.; Kertesz, J.

    2017-02-01

Most studies of disease spreading consider the underlying social network as obtained without the contagion, though an epidemic influences people's willingness to contact others: a "friendly" contact may turn "unfriendly" to avoid infection. We study the susceptible-infected disease-spreading model on signed networks, in which each edge is associated with a positive or negative sign representing the friendly or unfriendly relation between its end nodes. In a signed network, according to Heider's theory, edge signs evolve such that finally a state of structural balance is achieved, corresponding to no frustration in physics terms. However, the danger of infection affects the evolution of its edge signs. To describe the coupled problem of the sign evolution and disease spreading, we generalize the notion of structural balance by taking into account the state of the nodes. We introduce an energy function and carry out Monte Carlo simulations on complete networks to test the energy landscape, where we find local minima corresponding to the so-called jammed states. We study the effect of the ratio of initial friendly to unfriendly connections on the propagation of disease. The steady state can be balanced or a jammed state such that a coexistence occurs between susceptible and infected nodes in the system.
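The classical (node-free) part of structural balance can be sketched directly: on a complete signed graph, a triad is balanced when the product of its three edge signs is positive, and a standard energy is minus the mean triad product. This toy illustrates only that bookkeeping, not the paper's generalized energy with node states:

```python
# Heider balance energy on a complete signed graph of four nodes.
from itertools import combinations

nodes = [0, 1, 2, 3]

def energy(sign):
    """Minus the mean sign product over all triads; -1.0 is full balance."""
    triads = list(combinations(nodes, 3))
    total = 0
    for i, j, k in triads:
        total += (sign[frozenset((i, j))] *
                  sign[frozenset((j, k))] *
                  sign[frozenset((i, k))])
    return -total / len(triads)

all_friendly = {frozenset(e): +1 for e in combinations(nodes, 2)}
energy_balanced = energy(all_friendly)      # global minimum: -1.0

frustrated = dict(all_friendly)
frustrated[frozenset((0, 1))] = -1          # one unfriendly edge
energy_frustrated = energy(frustrated)      # two triads become unbalanced
```

Sign-flip dynamics that only accept energy-lowering moves can get stuck above the minimum, which is the origin of the jammed states mentioned in the abstract.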

  13. Evolving role of MRI in Crohn's disease.

    PubMed

    Yacoub, Joseph H; Obara, Piotr; Oto, Aytekin

    2013-06-01

    MR enterography is playing an evolving role in the evaluation of small bowel Crohn's disease (CD). Standard MR enterography includes a combination of rapidly acquired T2 sequence, balanced steady-state acquisition, and contrast enhanced T1-weighted gradient echo sequence. The diagnostic performance of these sequences has been shown to be comparable, and in some respects superior, to other small bowel imaging modalities. The findings of CD on MR enterography have been well described in the literature. New and emerging techniques such as diffusion-weighted imaging (DWI), dynamic contrast enhanced MRI (DCE-MRI), cinematography, and magnetization transfer, may lead to improved accuracy in characterizing the disease. These advanced techniques can provide quantitative parameters that may prove to be useful in assessing disease activity, severity, and response to treatment. In the future, MR enterography may play an increasing role in management decisions for patients with small bowel CD; however, larger studies are needed to validate these emerging MRI parameters as imaging biomarkers.

  14. Evolving application of biomimetic nanostructured hydroxyapatite

    PubMed Central

    Roveri, Norberto; Iafisco, Michele

    2010-01-01

    By mimicking Nature, we can design and synthesize inorganic smart materials that are reactive to biological tissues. These smart materials can be utilized to design innovative third-generation biomaterials, which are able to not only optimize their interaction with biological tissues and environment, but also mimic biogenic materials in their functionalities. The biomedical applications involve increasing the biomimetic levels from chemical composition, structural organization, morphology, mechanical behavior, nanostructure, and bulk and surface chemical–physical properties until the surface becomes bioreactive and stimulates cellular materials. The chemical–physical characteristics of biogenic hydroxyapatites from bone and tooth have been described, in order to point out the elective sides, which are important to reproduce the design of a new biomimetic synthetic hydroxyapatite. This review outlines the evolving applications of biomimetic synthetic calcium phosphates, details the main characteristics of bone and tooth, where the calcium phosphates are present, and discusses the chemical–physical characteristics of biomimetic calcium phosphates, methods of synthesizing them, and some of their biomedical applications. PMID:24198477

  15. Women's oral health: the evolving science.

    PubMed

    Sinkford, Jeanne C; Valachovic, Richard W; Harrison, Sonja G

    2008-02-01

    The evidence base for women's oral health is emerging from legislative action, clinical research, and survey documentation. The Women's Health in the Dental School Curriculum study (1999) followed a similar study (1996) of medical school curricula. Both of these major efforts resulted from statutory mandates in the National Institutes of Health Revitalization Act of 1993 (updated October 2000). A major study of the Institute of Medicine (IOM) National Academy of Sciences in 2001 concluded that "the study of sex differences is evolving into a mature science." This IOM study documented the scientific basis for gender-related policy and research and challenged the dental research enterprise to conduct collaborative, cross-disciplinary research on gender-related issues in oral health, disease, and disparities. This report chronicles some of the factors that have and continue to influence concepts of women's oral health in dental education, research, and practice. Gender issues related to women's health are no longer restricted to reproductive issues but are being considered across the life span and include psychosocial factors that impact women's health and treatment outcomes.

  16. Speciation genetics: current status and evolving approaches

    PubMed Central

    Wolf, Jochen B. W.; Lindell, Johan; Backström, Niclas

    2010-01-01

    The view of species as entities subjected to natural selection and amenable to change put forth by Charles Darwin and Alfred Wallace laid the conceptual foundation for understanding speciation. Initially marred by a rudimental understanding of hereditary principles, evolutionists gained appreciation of the mechanistic underpinnings of speciation following the merger of Mendelian genetic principles with Darwinian evolution. Only recently have we entered an era where deciphering the molecular basis of speciation is within reach. Much focus has been devoted to the genetic basis of intrinsic postzygotic isolation in model organisms and several hybrid incompatibility genes have been successfully identified. However, concomitant with the recent technological advancements in genome analysis and a newfound interest in the role of ecology in the differentiation process, speciation genetic research is becoming increasingly open to non-model organisms. This development will expand speciation research beyond the traditional boundaries and unveil the genetic basis of speciation from manifold perspectives and at various stages of the splitting process. This review aims at providing an extensive overview of speciation genetics. Starting from key historical developments and core concepts of speciation genetics, we focus much of our attention on evolving approaches and introduce promising methodological approaches for future research venues. PMID:20439277

  17. Extreme insular dwarfism evolved in a mammoth.

    PubMed

    Herridge, Victoria L; Lister, Adrian M

    2012-08-22

    The insular dwarfism seen in Pleistocene elephants has come to epitomize the island rule; yet our understanding of this phenomenon is hampered by poor taxonomy. For Mediterranean dwarf elephants, where the most extreme cases of insular dwarfism are observed, a key systematic question remains unresolved: are all taxa phyletic dwarfs of a single mainland species Palaeoloxodon antiquus (straight-tusked elephant), or are some referable to Mammuthus (mammoths)? Ancient DNA and geochronological evidence have been used to support a Mammuthus origin for the Cretan 'Palaeoloxodon' creticus, but these studies have been shown to be flawed. On the basis of existing collections and recent field discoveries, we present new, morphological evidence for the taxonomic status of 'P'. creticus, and show that it is indeed a mammoth, most probably derived from Early Pleistocene Mammuthus meridionalis or possibly Late Pliocene Mammuthus rumanus. We also show that Mammuthus creticus is smaller than other known insular dwarf mammoths, and is similar in size to the smallest dwarf Palaeoloxodon species from Sicily and Malta, making it the smallest mammoth species known to have existed. These findings indicate that extreme insular dwarfism has evolved to a similar degree independently in two elephant lineages.

  18. Consensus in evolving networks of mobile agents

    NASA Astrophysics Data System (ADS)

    Baronchelli, Andrea; Díaz-Guilera, Albert

    2012-02-01

    Populations of mobile and communicating agents describe a vast array of technological and natural systems, ranging from sensor networks to animal groups. Here, we investigate how a group-level agreement may emerge in the continuously evolving networks defined by the local interactions of the moving individuals. We adopt a general scheme of motion in two dimensions and we let the individuals interact through the minimal naming game, a prototypical scheme to investigate social consensus. We distinguish different regimes of convergence determined by the emission range of the agents and by their mobility, and we identify the corresponding scaling behaviors of the consensus time. In the same way, we rationalize also the behavior of the maximum memory used during the convergence process, which determines the minimum cognitive/storage capacity needed by the individuals. Overall, we believe that the simple and general model presented in this talk can represent a helpful reference for a better understanding of the behavior of populations of mobile agents.
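The minimal naming game invoked in the abstract can be sketched in a mean-field form, where random speaker-hearer pairs interact instead of spatially moving agents (the mobility and emission-range aspects studied by the authors are deliberately omitted in this simplification):

```python
import random

def naming_game(n_agents=30, max_steps=200000, seed=1):
    """Mean-field naming game: random speaker/hearer pairs negotiate a
    shared word; returns the number of interactions until consensus."""
    rng = random.Random(seed)
    inventories = [set() for _ in range(n_agents)]
    next_word = 0
    for step in range(max_steps):
        speaker, hearer = rng.sample(range(n_agents), 2)
        if not inventories[speaker]:          # empty inventory: invent a word
            inventories[speaker].add(next_word)
            next_word += 1
        word = rng.choice(sorted(inventories[speaker]))
        if word in inventories[hearer]:       # success: both collapse to word
            inventories[speaker] = {word}
            inventories[hearer] = {word}
        else:                                 # failure: hearer learns the word
            inventories[hearer].add(word)
        if all(inv == {word} for inv in inventories):
            return step + 1                   # global consensus reached
    return None                               # no consensus within max_steps
```

Tracking the total inventory size over time would recover the "maximum memory" quantity the abstract mentions.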

  19. Metapopulation capacity of evolving fluvial landscapes

    NASA Astrophysics Data System (ADS)

    Bertuzzo, Enrico; Rodriguez-Iturbe, Ignacio; Rinaldo, Andrea

    2015-04-01

    The form of fluvial landscapes is known to attain stationary network configurations that settle in dynamically accessible minima of total energy dissipation by landscape-forming discharges. Recent studies have highlighted the role of the dendritic structure of river networks in controlling population dynamics of the species they host and large-scale biodiversity patterns. Here, we systematically investigate the relation between energy dissipation, the physical driver for the evolution of river networks, and the ecological dynamics of their embedded biota. To that end, we use the concept of metapopulation capacity, a measure to link landscape structures with the population dynamics they host. Technically, metapopulation capacity is the leading eigenvalue λM of an appropriate "landscape" matrix subsuming whether a given species is predicted to persist in the long run. λM can conveniently be used to rank different landscapes in terms of their capacity to support viable metapopulations. We study how λM changes in response to the evolving network configurations of spanning trees. Such sequence of configurations is theoretically known to relate network selection to general landscape evolution equations through imperfect searches for dynamically accessible states frustrated by the vagaries of Nature. Results show that the process shaping the metric and the topological properties of river networks, prescribed by physical constraints, leads to a progressive increase in the corresponding metapopulation capacity and therefore in the landscape capacity to support metapopulations—with implications for biodiversity in fluvial ecosystems.
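The metapopulation capacity λM described above is computable directly as a leading eigenvalue. A minimal sketch, assuming the standard Hanski-Ovaskainen landscape matrix M_ij = exp(-d_ij/α)·A_i·A_j with zero diagonal (patch coordinates, areas, and the dispersal scale α are illustrative inputs, not the paper's data):

```python
import numpy as np

def metapopulation_capacity(coords, areas, alpha=1.0):
    """Leading eigenvalue of the landscape matrix
    M_ij = exp(-d_ij / alpha) * A_i * A_j, with M_ii = 0."""
    coords = np.asarray(coords, dtype=float)
    areas = np.asarray(areas, dtype=float)
    # pairwise Euclidean distances between habitat patches
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    M = np.exp(-d / alpha) * np.outer(areas, areas)
    np.fill_diagonal(M, 0.0)
    # M is symmetric, so eigvalsh is appropriate and stable
    return float(np.max(np.linalg.eigvalsh(M)))
```

Ranking two candidate landscapes then reduces to comparing their λM values: tightly clustered patches yield a larger capacity than widely separated ones of equal area.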

  20. Tearing Mode Stability of Evolving Toroidal Equilibria

    NASA Astrophysics Data System (ADS)

    Pletzer, A.; McCune, D.; Manickam, J.; Jardin, S. C.

    2000-10-01

    There are a number of toroidal equilibrium (such as JSOLVER, ESC, EFIT, and VMEC) and transport codes (such as TRANSP, BALDUR, and TSC) in our community that utilize differing equilibrium representations. There are also many heating and current drive (LSC and TORRAY), and stability (PEST1-3, GATO, NOVA, MARS, DCON, M3D) codes that require this equilibrium information. In an effort to provide seamless compatibility between the codes that produce and need these equilibria, we have developed two Fortran 90 modules, MEQ and XPLASMA, that serve as common interfaces between these two classes of codes. XPLASMA provides a common equilibrium representation for the heating and current drive applications while MEQ provides common equilibrium and associated metric information needed by MHD stability codes. We illustrate the utility of this approach by presenting results of PEST-3 tearing stability calculations of an NSTX discharge performed on profiles provided by the TRANSP code. Using the MEQ module, the TRANSP equilibrium data are stored in a Fortran 90 derived type and passed to PEST3 as a subroutine argument. All calculations are performed on the fly, as the profiles evolve.

  1. Origins of stereoselectivity in evolved ketoreductases.

    PubMed

    Noey, Elizabeth L; Tibrewal, Nidhi; Jiménez-Osés, Gonzalo; Osuna, Sílvia; Park, Jiyong; Bond, Carly M; Cascio, Duilio; Liang, Jack; Zhang, Xiyun; Huisman, Gjalt W; Tang, Yi; Houk, Kendall N

    2015-12-22

    Mutants of Lactobacillus kefir short-chain alcohol dehydrogenase, used here as ketoreductases (KREDs), enantioselectively reduce the pharmaceutically relevant substrates 3-thiacyclopentanone and 3-oxacyclopentanone. These substrates differ by only the heteroatom (S or O) in the ring, but the KRED mutants reduce them with different enantioselectivities. Kinetic studies show that these enzymes are more efficient with 3-thiacyclopentanone than with 3-oxacyclopentanone. X-ray crystal structures of apo- and NADP(+)-bound selected mutants show that the substrate-binding loop conformational preferences are modified by these mutations. Quantum mechanical calculations and molecular dynamics (MD) simulations are used to investigate the mechanism of reduction by the enzyme. We have developed an MD-based method for studying the diastereomeric transition state complexes and rationalize different enantiomeric ratios. This method, which probes the stability of the catalytic arrangement within the theozyme, shows a correlation between the relative fractions of catalytically competent poses for the enantiomeric reductions and the experimental enantiomeric ratio. Some mutations, such as A94F and Y190F, induce conformational changes in the active site that enlarge the small binding pocket, facilitating accommodation of the larger S atom in this region and enhancing S-selectivity with 3-thiacyclopentanone. In contrast, in the E145S mutant and the final variant evolved for large-scale production of the intermediate for the antibiotic sulopenem, R-selectivity is promoted by shrinking the small binding pocket, thereby destabilizing the pro-S orientation.

  2. Origins of stereoselectivity in evolved ketoreductases

    PubMed Central

    Noey, Elizabeth L.; Tibrewal, Nidhi; Jiménez-Osés, Gonzalo; Osuna, Sílvia; Park, Jiyong; Bond, Carly M.; Cascio, Duilio; Liang, Jack; Zhang, Xiyun; Huisman, Gjalt W.; Tang, Yi; Houk, Kendall N.

    2015-01-01

    Mutants of Lactobacillus kefir short-chain alcohol dehydrogenase, used here as ketoreductases (KREDs), enantioselectively reduce the pharmaceutically relevant substrates 3-thiacyclopentanone and 3-oxacyclopentanone. These substrates differ by only the heteroatom (S or O) in the ring, but the KRED mutants reduce them with different enantioselectivities. Kinetic studies show that these enzymes are more efficient with 3-thiacyclopentanone than with 3-oxacyclopentanone. X-ray crystal structures of apo- and NADP+-bound selected mutants show that the substrate-binding loop conformational preferences are modified by these mutations. Quantum mechanical calculations and molecular dynamics (MD) simulations are used to investigate the mechanism of reduction by the enzyme. We have developed an MD-based method for studying the diastereomeric transition state complexes and rationalize different enantiomeric ratios. This method, which probes the stability of the catalytic arrangement within the theozyme, shows a correlation between the relative fractions of catalytically competent poses for the enantiomeric reductions and the experimental enantiomeric ratio. Some mutations, such as A94F and Y190F, induce conformational changes in the active site that enlarge the small binding pocket, facilitating accommodation of the larger S atom in this region and enhancing S-selectivity with 3-thiacyclopentanone. In contrast, in the E145S mutant and the final variant evolved for large-scale production of the intermediate for the antibiotic sulopenem, R-selectivity is promoted by shrinking the small binding pocket, thereby destabilizing the pro-S orientation. PMID:26644568

  3. Evolving paradigms in multifocal breast cancer.

    PubMed

    Salgado, Roberto; Aftimos, Philippe; Sotiriou, Christos; Desmedt, Christine

    2015-04-01

    The 7th edition of the TNM defines multifocal breast cancer as multiple simultaneous ipsilateral and synchronous breast cancer lesions, provided they are macroscopically distinct and measurable using current traditional pathological and clinical tools. According to the College of American Pathologists (CAP), the characterization of only the largest lesion is considered sufficient, unless the grade and/or histology are different between the lesions. Here, we review three potentially clinically relevant aspects of multifocal breast cancers: first, the importance of a different intrinsic breast cancer subtype of the various lesions; second, the emerging awareness of inter-lesion heterogeneity; and last but not least, the potential introduction of bias in clinical trials due to the unrecognized biological diversity of these cancers. Although the current strategy to assess the lesion with the largest diameter has clearly its advantages in terms of costs and feasibility, this recommendation may not be sustainable in time and might need to be adapted to be compliant with new evolving paradigms in breast cancer.

  4. Speciation genetics: current status and evolving approaches.

    PubMed

    Wolf, Jochen B W; Lindell, Johan; Backström, Niclas

    2010-06-12

    The view of species as entities subjected to natural selection and amenable to change put forth by Charles Darwin and Alfred Wallace laid the conceptual foundation for understanding speciation. Initially marred by a rudimental understanding of hereditary principles, evolutionists gained appreciation of the mechanistic underpinnings of speciation following the merger of Mendelian genetic principles with Darwinian evolution. Only recently have we entered an era where deciphering the molecular basis of speciation is within reach. Much focus has been devoted to the genetic basis of intrinsic postzygotic isolation in model organisms and several hybrid incompatibility genes have been successfully identified. However, concomitant with the recent technological advancements in genome analysis and a newfound interest in the role of ecology in the differentiation process, speciation genetic research is becoming increasingly open to non-model organisms. This development will expand speciation research beyond the traditional boundaries and unveil the genetic basis of speciation from manifold perspectives and at various stages of the splitting process. This review aims at providing an extensive overview of speciation genetics. Starting from key historical developments and core concepts of speciation genetics, we focus much of our attention on evolving approaches and introduce promising methodological approaches for future research venues.

  5. An evolving model of online bipartite networks

    NASA Astrophysics Data System (ADS)

    Zhang, Chu-Xu; Zhang, Zi-Ke; Liu, Chuang

    2013-12-01

    Understanding the structure and evolution of online bipartite networks is a significant task since they play a crucial role in various e-commerce services nowadays. Recently, various attempts have been made to propose different models, resulting in either power-law or exponential degree distributions. However, many empirical results show that the user degree distribution actually follows a shifted power-law distribution, the so-called Mandelbrot’s law, which cannot be fully described by previous models. In this paper, we propose an evolving model, considering two different user behaviors: random and preferential attachment. Extensive empirical results on two real bipartite networks, Delicious and CiteULike, show that the theoretical model can well characterize the structure of real networks for both user and object degree distributions. In addition, we introduce a structural parameter p, to demonstrate that the hybrid user behavior leads to the shifted power-law degree distribution, and the region of the power-law tail will increase as p increases. The proposed model may shed some light on understanding the underlying laws governing the structure of real online bipartite networks.
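The hybrid user behavior can be illustrated with a toy simulation: each new user-object edge picks its user endpoint preferentially with probability p, or uniformly at random otherwise. This is a simplified stand-in for the authors' model, not their exact update rule:

```python
import random

def evolve_bipartite(n_steps=5000, p=0.7, n_users=200, seed=42):
    """Toy hybrid-attachment model for user degrees: with probability p
    an edge endpoint is drawn proportionally to degree, else uniformly."""
    rng = random.Random(seed)
    degree = [0] * n_users
    stubs = []                          # one entry per edge endpoint, so
    for _ in range(n_steps):            # choosing from it is preferential
        if stubs and rng.random() < p:
            u = rng.choice(stubs)       # rich-get-richer attachment
        else:
            u = rng.randrange(n_users)  # uniform random attachment
        degree[u] += 1
        stubs.append(u)
    return degree
```

Sweeping p from 0 to 1 and histogramming the resulting degrees shows the tail broadening from near-exponential toward a heavy, shifted power-law-like form.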

  6. How does cognition evolve? Phylogenetic comparative psychology.

    PubMed

    MacLean, Evan L; Matthews, Luke J; Hare, Brian A; Nunn, Charles L; Anderson, Rindy C; Aureli, Filippo; Brannon, Elizabeth M; Call, Josep; Drea, Christine M; Emery, Nathan J; Haun, Daniel B M; Herrmann, Esther; Jacobs, Lucia F; Platt, Michael L; Rosati, Alexandra G; Sandel, Aaron A; Schroepfer, Kara K; Seed, Amanda M; Tan, Jingzhi; van Schaik, Carel P; Wobber, Victoria

    2012-03-01

    Now more than ever animal studies have the potential to test hypotheses regarding how cognition evolves. Comparative psychologists have developed new techniques to probe the cognitive mechanisms underlying animal behavior, and they have become increasingly skillful at adapting methodologies to test multiple species. Meanwhile, evolutionary biologists have generated quantitative approaches to investigate the phylogenetic distribution and function of phenotypic traits, including cognition. In particular, phylogenetic methods can quantitatively (1) test whether specific cognitive abilities are correlated with life history (e.g., lifespan), morphology (e.g., brain size), or socio-ecological variables (e.g., social system), (2) measure how strongly phylogenetic relatedness predicts the distribution of cognitive skills across species, and (3) estimate the ancestral state of a given cognitive trait using measures of cognitive performance from extant species. Phylogenetic methods can also be used to guide the selection of species comparisons that offer the strongest tests of a priori predictions of cognitive evolutionary hypotheses (i.e., phylogenetic targeting). Here, we explain how an integration of comparative psychology and evolutionary biology will answer a host of questions regarding the phylogenetic distribution and history of cognitive traits, as well as the evolutionary processes that drove their evolution.

  7. Fast evolving pair-instability supernovae

    DOE PAGES

    Kozyreva, Alexandra; Gilmer, Matthew; Hirschi, Raphael; ...

    2016-10-06

    With an increasing number of superluminous supernovae (SLSNe) discovered the question of their origin remains open and causes heated debates in the supernova community. Currently, there are three proposed mechanisms for SLSNe: (1) pair-instability supernovae (PISN), (2) magnetar-driven supernovae, and (3) models in which the supernova ejecta interacts with a circumstellar material ejected before the explosion. Based on current observations of SLSNe, the PISN origin has been disfavoured for a number of reasons. Many PISN models provide overly broad light curves and too reddened spectra, because of massive ejecta and a high amount of nickel. In the current study we re-examine PISN properties using progenitor models computed with the GENEC code. We calculate supernova explosions with FLASH and light curve evolution with the radiation hydrodynamics code STELLA. We find that high-mass models (200 M⊙ and 250 M⊙) at relatively high metallicity (Z=0.001) do not retain hydrogen in the outer layers and produce relatively fast evolving PISNe Type I and might be suitable to explain some SLSNe. We also investigate uncertainties in light curve modelling due to codes, opacities, the nickel-bubble effect and progenitor structure and composition.

  8. Fast evolving pair-instability supernovae

    SciTech Connect

    Kozyreva, Alexandra; Gilmer, Matthew; Hirschi, Raphael; Frohlich, Carla; Blinnikov, Sergey; Wollaeger, Ryan Thomas; Noebauer, Ulrich M.; van Rossum, Daniel R.; Heger, Alexander; Even, Wesley Paul; Waldman, Roni; Tolstov, Alexey; Chatzopoulos, Emmanouil; Sorokina, Elena

    2016-10-06

    With an increasing number of superluminous supernovae (SLSNe) discovered the question of their origin remains open and causes heated debates in the supernova community. Currently, there are three proposed mechanisms for SLSNe: (1) pair-instability supernovae (PISN), (2) magnetar-driven supernovae, and (3) models in which the supernova ejecta interacts with a circumstellar material ejected before the explosion. Based on current observations of SLSNe, the PISN origin has been disfavoured for a number of reasons. Many PISN models provide overly broad light curves and too reddened spectra, because of massive ejecta and a high amount of nickel. In the current study we re-examine PISN properties using progenitor models computed with the GENEC code. We calculate supernova explosions with FLASH and light curve evolution with the radiation hydrodynamics code STELLA. We find that high-mass models (200 M⊙ and 250 M⊙) at relatively high metallicity (Z=0.001) do not retain hydrogen in the outer layers and produce relatively fast evolving PISNe Type I and might be suitable to explain some SLSNe. We also investigate uncertainties in light curve modelling due to codes, opacities, the nickel-bubble effect and progenitor structure and composition.

  9. Extreme insular dwarfism evolved in a mammoth

    PubMed Central

    Herridge, Victoria L.; Lister, Adrian M.

    2012-01-01

    The insular dwarfism seen in Pleistocene elephants has come to epitomize the island rule; yet our understanding of this phenomenon is hampered by poor taxonomy. For Mediterranean dwarf elephants, where the most extreme cases of insular dwarfism are observed, a key systematic question remains unresolved: are all taxa phyletic dwarfs of a single mainland species Palaeoloxodon antiquus (straight-tusked elephant), or are some referable to Mammuthus (mammoths)? Ancient DNA and geochronological evidence have been used to support a Mammuthus origin for the Cretan ‘Palaeoloxodon’ creticus, but these studies have been shown to be flawed. On the basis of existing collections and recent field discoveries, we present new, morphological evidence for the taxonomic status of ‘P’. creticus, and show that it is indeed a mammoth, most probably derived from Early Pleistocene Mammuthus meridionalis or possibly Late Pliocene Mammuthus rumanus. We also show that Mammuthus creticus is smaller than other known insular dwarf mammoths, and is similar in size to the smallest dwarf Palaeoloxodon species from Sicily and Malta, making it the smallest mammoth species known to have existed. These findings indicate that extreme insular dwarfism has evolved to a similar degree independently in two elephant lineages. PMID:22572206

  10. Evolving dynamic web pages using web mining

    NASA Astrophysics Data System (ADS)

    Menon, Kartik; Dagli, Cihan H.

    2003-08-01

    The heterogeneity and the lack of structure that permeate much of the ever-expanding information sources on the WWW make it difficult for the user to properly and efficiently access different web pages. Different users have different needs from the same web page. It is necessary to train the system to understand the needs and demands of the users; in other words, there is a need for efficient and proper web mining. In this paper, issues and possible ways of training the system and providing a high level of organization for semi-structured data available on the web are discussed. Web pages can be evolved based on the history of query searches, browsing, links traversed and observation of user behavior such as bookmarking and time spent viewing. Fuzzy clustering techniques help in grouping natural users and groups; neural networks, association rules and web traversal patterns help in efficient sequential analysis based on previous searches and queries by the user. In this paper we analyze web server logs using the above-mentioned techniques to learn more about user interactions. Analyzing these web server logs helps to closely understand user behavior and web access patterns.

  11. Generative Representations for Evolving Families of Designs

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2003-01-01

    Since typical evolutionary design systems encode only a single artifact with each individual, each time the objective changes a new set of individuals must be evolved. When this objective varies in a way that can be parameterized, a more general method is to use a representation in which a single individual encodes an entire class of artifacts. In addition to saving time by preventing the need for multiple evolutionary runs, the evolution of parameter-controlled designs can create families of artifacts with the same style and a reuse of parts between members of the family. In this paper an evolutionary design system is described which uses a generative representation to encode families of designs. Because a generative representation is an algorithmic encoding of a design, its input parameters are a way to control aspects of the design it generates. By evaluating individuals multiple times with different input parameters the evolutionary design system creates individuals in which the input parameter controls specific aspects of a design. This system is demonstrated on two design substrates: neural-networks which solve the 3/5/7-parity problem and three-dimensional tables of varying heights.

  12. Lower mass limit of an evolving interstellar cloud and chemistry in an evolving oscillatory cloud

    NASA Technical Reports Server (NTRS)

    Tarafdar, S. P.

    1986-01-01

    Simultaneous solution of the equation of motion, the equation of state and the energy equation, including heating and cooling processes for the interstellar medium, gives for a collapsing cloud a lower mass limit which is significantly smaller than the Jeans mass for the same initial density. Clouds more massive than this limiting mass collapse, whereas clouds below the critical mass pass through a maximum central density, giving apparently similar clouds (i.e., same Av, size and central density) at two different phases of their evolution (i.e., with different lifetimes). Preliminary results of chemistry in such an evolving oscillatory cloud show significant differences in the abundances of some of the molecules in two physically similar clouds with different lifetimes. The problems of depletion and the short lifetimes of evolving clouds appear to be less severe in such an oscillatory cloud.

  13. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.

  14. Eosinophilic esophagitis: current understanding and evolving concepts

    PubMed Central

    Kweh, Barry; Thien, Francis

    2017-01-01

    Eosinophilic esophagitis (EoE) is now considered to represent a form of food allergy and this is demonstrated by a response to elimination diet in many patients. A critical additional factor may be an inherent impairment in epithelial barrier integrity, possibly worsened by reflux of gastric contents and improved with proton pump inhibitor (PPI) use. Key clinical challenges are posed by the absence of reliable allergy tests to guide elimination diet, and the subsequent need for invasive endoscopic assessment following empirical food challenge, meaning that corticosteroids will remain the mainstay of therapy for many. From a research standpoint, determining if impairments in barrier integrity are innate, and how PPIs address this deficit (which may be pH independent) are important questions that when answered may allow future therapeutic advancement. PMID:28154800

  15. Variant profiling of evolving prokaryotic populations

    PubMed Central

    Zojer, Markus; Schuster, Lisa N.; Schulz, Frederik; Pfundner, Alexander; Horn, Matthias

    2017-01-01

    Genomic heterogeneity of bacterial species is observed and studied in experimental evolution experiments and clinical diagnostics, and occurs as micro-diversity of natural habitats. The challenge for genome research is to accurately capture this heterogeneity with the currently used short sequencing reads. Recent advances in NGS technologies improved the speed and coverage and thus allowed for deep sequencing of bacterial populations. This facilitates the quantitative assessment of genomic heterogeneity, including low frequency alleles or haplotypes. However, false positive variant predictions due to sequencing errors and mapping artifacts of short reads need to be prevented. We therefore created VarCap, a workflow for the reliable prediction of different types of variants even at low frequencies. In order to predict SNPs, InDels and structural variations, we evaluated the sensitivity and accuracy of different software tools using synthetic read data. The results suggested that the best sensitivity could be reached by a union of different tools, however at the price of increased false positives. We identified possible reasons for false predictions and used this knowledge to improve the accuracy by post-filtering the predicted variants according to properties such as frequency, coverage, genomic environment/localization and co-localization with other variants. We observed that best precision was achieved by using an intersection of at least two tools per variant. This resulted in the reliable prediction of variants above a minimum relative abundance of 2%. VarCap is designed for being routinely used within experimental evolution experiments or for clinical diagnostics. The detected variants are reported as frequencies within a VCF file and as a graphical overview of the distribution of the different variant/allele/haplotype frequencies. The source code of VarCap is available at https://github.com/ma2o/VarCap. In order to provide this workflow to a broad community

  16. Variant profiling of evolving prokaryotic populations.

    PubMed

    Zojer, Markus; Schuster, Lisa N; Schulz, Frederik; Pfundner, Alexander; Horn, Matthias; Rattei, Thomas

    2017-01-01

    Genomic heterogeneity of bacterial species is observed and studied in experimental evolution experiments and clinical diagnostics, and occurs as micro-diversity of natural habitats. The challenge for genome research is to accurately capture this heterogeneity with the currently used short sequencing reads. Recent advances in NGS technologies improved the speed and coverage and thus allowed for deep sequencing of bacterial populations. This facilitates the quantitative assessment of genomic heterogeneity, including low frequency alleles or haplotypes. However, false positive variant predictions due to sequencing errors and mapping artifacts of short reads need to be prevented. We therefore created VarCap, a workflow for the reliable prediction of different types of variants even at low frequencies. In order to predict SNPs, InDels and structural variations, we evaluated the sensitivity and accuracy of different software tools using synthetic read data. The results suggested that the best sensitivity could be reached by a union of different tools, however at the price of increased false positives. We identified possible reasons for false predictions and used this knowledge to improve the accuracy by post-filtering the predicted variants according to properties such as frequency, coverage, genomic environment/localization and co-localization with other variants. We observed that best precision was achieved by using an intersection of at least two tools per variant. This resulted in the reliable prediction of variants above a minimum relative abundance of 2%. VarCap is designed for being routinely used within experimental evolution experiments or for clinical diagnostics. The detected variants are reported as frequencies within a VCF file and as a graphical overview of the distribution of the different variant/allele/haplotype frequencies. The source code of VarCap is available at https://github.com/ma2o/VarCap. In order to provide this workflow to a broad community
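The post-filtering rule the abstract describes (keep variants supported by an intersection of at least two callers, above a minimum relative abundance of 2%) can be sketched as follows. The input layout is a hypothetical simplification of real VCF records, not VarCap's actual interface:

```python
from collections import defaultdict

def consensus_variants(calls_by_tool, min_tools=2, min_freq=0.02):
    """Keep variants reported by at least `min_tools` callers whose mean
    reported allele frequency is at least `min_freq`.
    `calls_by_tool` maps tool name -> {(chrom, pos, ref, alt): frequency}."""
    support = defaultdict(list)
    for tool, calls in calls_by_tool.items():
        for variant, freq in calls.items():
            support[variant].append(freq)
    return {
        variant: sum(freqs) / len(freqs)
        for variant, freqs in support.items()
        if len(freqs) >= min_tools and sum(freqs) / len(freqs) >= min_freq
    }
```

In a fuller pipeline the same filter would also consider coverage, genomic environment and co-localization with other variants, as the abstract notes.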

  17. The Evolving Context for Science and Society

    NASA Astrophysics Data System (ADS)

    Leshner, Alan I.

    2012-01-01

    The relationship between science and the rest of society is critical both to the support it receives from the public and to the receptivity of the broader citizenry to science's explanations of the nature of the world and to its other outputs. Science's ultimate usefulness depends on a receptive public. For example, given that science and technology are embedded in virtually every issue of modern life, either as a cause or a cure, it is critical that the relationship be strong and that the role of science is well appreciated by society, or the impacts of scientific advances will fall short of their great potential. Unfortunately, a variety of problems have been undermining the science-society relationship for over a decade. Some problems emerge from within the scientific enterprise - like scientific misconduct or conflicts of interest - and tarnish or weaken its image and credibility. Other problems and stresses come from outside the enterprise. The most obvious external pressure is that the world economic situation is undermining the financial support of both the conduct and infrastructure of science. Other examples of external pressures include conflicts between what science is revealing and political or economic expediency - e.g., global climate change - or instances where scientific advances encroach upon core human values or beliefs - e.g., scientific understanding of the origins and evolution of the universe as compared to biblical accounts of creation. Significant efforts - some dramatically non-traditional for many in the scientific community - are needed to restore balance to the science-society relationship.

  18. Synthetic Model of the Oxygen-Evolving Center: Photosystem II under the Spotlight.

    PubMed

    Yu, Yang; Hu, Cheng; Liu, Xiaohong; Wang, Jiangyun

    2015-09-21

    The oxygen-evolving center (OEC) in photosystem II catalyzes a water splitting reaction. Great efforts have already been made to artificially synthesize the OEC, in order to elucidate the structure-function relationship and the mechanism of the reaction. Now, a new synthetic model provides the best mimic of the OEC yet. This recent study opens up the possibility of studying the mechanism of photosystem II, and photosynthesis in general, for applications in renewable energy and synthetic biology.

  19. Ethical Implications of Validity-vs.-Reliability Trade-Offs in Educational Research

    ERIC Educational Resources Information Center

    Fendler, Lynn

    2016-01-01

    In educational research that calls itself empirical, the relationship between validity and reliability is that of trade-off: the stronger the bases for validity, the weaker the bases for reliability (and vice versa). Validity and reliability are widely regarded as basic criteria for evaluating research; however, there are ethical implications of…

  20. BUBBLE DYNAMICS AT GAS-EVOLVING ELECTRODES

    SciTech Connect

    Sides, Paul J.

    1980-12-01

    Nucleation of bubbles, their growth by diffusion of dissolved gas to the bubble surface and by coalescence, and their detachment from the electrode are all very fast phenomena; furthermore, electrolytically generated bubbles range in size from ten to a few hundred microns; therefore, magnification and high-speed cinematography are required to observe bubbles and the phenomena of their growth on the electrode surface. Viewing the action from the front side (the surface on which the bubbles form) is complicated because the most important events occur close to the surface and are obscured by other bubbles passing between the camera and the electrode; therefore, oxygen was evolved on a transparent tin oxide "window" electrode and the events were viewed from the backside. The movies showed that coalescence of bubbles is very important for determining the size of bubbles and in the chain of transport processes; growth by diffusion and by coalescence proceeds in series and in parallel; coalescing bubbles cause significant fluid motion close to the electrode; bubbles can leave and reattach; and bubbles evolve in a cycle of growth by diffusion and different modes of coalescence. An analytical solution for the primary potential and current distribution around a spherical bubble in contact with a plane electrode is presented. The current density is zero at the contact point, reaches only one percent of its undisturbed value at 30 percent of the radius from that point, and goes through a shallow maximum two radii away. The solution obtained for spherical bubbles is shown to apply to the small bubbles of electrolytic processes. The incremental resistance in ohms caused by sparse arrays of bubbles is given by ΔR = 1.352·a·f/(k·S), where f is the void fraction of gas in the bubble layer, a is the bubble layer thickness, k is the conductivity of gas-free electrolyte, and S is the electrode area. A densely populated gas bubble layer on an electrode was modeled as a hexagonal array of
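The incremental-resistance formula quoted above is simple enough to evaluate directly; the numbers below are illustrative values chosen for the sketch, not measurements from the paper:

```python
def bubble_layer_resistance(f, a, k, S):
    """Incremental resistance (ohms) of a sparse bubble array,
    Delta_R = 1.352 * a * f / (k * S), where f is the gas void fraction,
    a the bubble layer thickness (m), k the conductivity of gas-free
    electrolyte (S/m), and S the electrode area (m^2)."""
    return 1.352 * a * f / (k * S)

# Illustrative values (not from the paper): a 0.1 mm layer with 20%
# void fraction on a 1 cm^2 electrode in a 50 S/m electrolyte.
delta_r = bubble_layer_resistance(f=0.2, a=1e-4, k=50.0, S=1e-4)
```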

  1. Evolving Recommendations on Prostate Cancer Screening.

    PubMed

    Brawley, Otis W; Thompson, Ian M; Grönberg, Henrik

    2016-01-01

    Results of a number of studies demonstrate that the serum prostate-specific antigen (PSA) in and of itself is an inadequate screening test. Today, one of the most pressing questions in prostate cancer medicine is how screening can be honed to identify those who have life-threatening disease and need aggressive treatment. A number of efforts are underway. One such effort is the assessment of men in the landmark Prostate Cancer Prevention Trial, which has led to a prostate cancer risk calculator (PCPTRC) that is available online. PCPTRC version 2.0 predicts the probability of the diagnosis of no cancer, low-grade cancer, or high-grade cancer when variables such as PSA, age, race, family history, and physical findings are input. Modern biomarker development promises to provide tests with fewer false positives and improved ability to find high-grade cancers. Stockholm III (STHLM3) is a prospective, population-based, paired, screen-positive prostate cancer diagnostic study assessing a combination of plasma protein biomarkers along with age, family history, previous biopsy, and prostate examination for prediction of prostate cancer. Multiparametric MRI incorporates anatomic and functional imaging to better characterize and predict the future behavior of tumors within the prostate. After a diagnosis of cancer, several genomic tests promise to better distinguish the cancers that need treatment from those that need observation. Although the new technologies are promising, there is an urgent need for evaluation of these new tests in high-quality, large population-based studies. Until these technologies are proven, most professional organizations have evolved toward a recommendation of informed or shared decision making, in which there is a discussion between the doctor and patient.

  2. Emergent spacetime in stochastically evolving dimensions

    NASA Astrophysics Data System (ADS)

    Afshordi, Niayesh; Stojkovic, Dejan

    2014-12-01

    Changing the dimensionality of spacetime at the smallest and largest distances has manifold theoretical advantages. If space is lower dimensional in the high energy regime, then there are no ultraviolet divergences in field theories, it is possible to quantize gravity, and the theory of matter plus gravity is free of divergences or renormalizable. If space is higher dimensional at cosmological scales, then some cosmological problems (including the cosmological constant problem) can be attacked from a completely new perspective. In this paper, we construct an explicit model of "evolving dimensions" in which the dimensions open up as the temperature of the universe drops. We adopt the string theory framework in which the dimensions are fields that live on the string worldsheet, and add temperature dependent mass terms for them. At the Big Bang, all the dimensions are very heavy and are not excited. As the universe cools down, dimensions open up one by one. Thus, the dimensionality of the space we live in depends on the energy or temperature that we are probing. In particular, we provide a kinematic Brandenberger-Vafa argument for how a discrete causal set, and eventually a continuum (3 + 1)-dimensional spacetime along with Einstein gravity, emerges in the infrared from the worldsheet action. The (3 + 1)-dimensional Planck mass and the string scale become directly related, without any compactification. Amongst other predictions, we argue that the LHC might be blind to new physics even if it comes at the TeV scale. In contrast, cosmic ray experiments, especially those that can register the very beginning of the shower, and collisions with high multiplicity and density of particles, might be sensitive to the dimensional cross-over.

  3. Evolutionary genomics of fast evolving tunicates.

    PubMed

    Berná, Luisa; Alvarez-Valin, Fernando

    2014-07-08

    Tunicates have been extensively studied because of their crucial phylogenetic location (the closest living relatives of vertebrates) and particular developmental plan. Recent genome efforts have disclosed that tunicates are also remarkable in their genome organization and molecular evolutionary patterns. Here, we review these latter aspects, comparing the similarities and specificities of two model species of the group: Oikopleura dioica and Ciona intestinalis. These species exhibit great genome plasticity, and Oikopleura in particular has undergone a process of extreme genome reduction and compaction that can be explained in part by gene loss, but is mostly due to other mechanisms such as shortening of intergenic distances and introns, and scarcity of mobile elements. In Ciona, genome reorganization was less severe, remaining more similar to the other chordates in several aspects. Rates and patterns of molecular evolution are also peculiar in tunicates, with Ciona being about 50% faster than vertebrates and Oikopleura three times faster. In fact, the latter species is considered the fastest evolving metazoan recorded so far. Two processes of increase in evolutionary rates have taken place in tunicates. One of them is more extreme, and basically restricted to genes encoding regulatory proteins (transcription regulators, chromatin remodeling proteins, and metabolic regulators), and the other one is less pronounced but affects the whole genome. Very likely, adaptive evolution has played a very significant role in the first, whereas the functional and/or evolutionary causes of the second are less clear and the evidence is not conclusive. The evidence supporting increased mutation and less efficient negative selection is presented and discussed.

  4. Evolutionary Genomics of Fast Evolving Tunicates

    PubMed Central

    Berná, Luisa; Alvarez-Valin, Fernando

    2014-01-01

    Tunicates have been extensively studied because of their crucial phylogenetic location (the closest living relatives of vertebrates) and particular developmental plan. Recent genome efforts have disclosed that tunicates are also remarkable in their genome organization and molecular evolutionary patterns. Here, we review these latter aspects, comparing the similarities and specificities of two model species of the group: Oikopleura dioica and Ciona intestinalis. These species exhibit great genome plasticity, and Oikopleura in particular has undergone a process of extreme genome reduction and compaction that can be explained in part by gene loss, but is mostly due to other mechanisms such as shortening of intergenic distances and introns, and scarcity of mobile elements. In Ciona, genome reorganization was less severe, remaining more similar to the other chordates in several aspects. Rates and patterns of molecular evolution are also peculiar in tunicates, with Ciona being about 50% faster than vertebrates and Oikopleura three times faster. In fact, the latter species is considered the fastest evolving metazoan recorded so far. Two processes of increase in evolutionary rates have taken place in tunicates. One of them is more extreme, and basically restricted to genes encoding regulatory proteins (transcription regulators, chromatin remodeling proteins, and metabolic regulators), and the other one is less pronounced but affects the whole genome. Very likely, adaptive evolution has played a very significant role in the first, whereas the functional and/or evolutionary causes of the second are less clear and the evidence is not conclusive. The evidence supporting increased mutation and less efficient negative selection is presented and discussed. PMID:25008364

  5. Sauropod dinosaurs evolved moderately sized genomes unrelated to body size.

    PubMed

    Organ, Chris L; Brusatte, Stephen L; Stein, Koen

    2009-12-22

    Sauropodomorph dinosaurs include the largest land animals to have ever lived, some reaching up to 10 times the mass of an African elephant. Despite their status defining the upper range for body size in land animals, it remains unknown whether sauropodomorphs evolved larger-sized genomes than non-avian theropods, their sister taxon, or whether a relationship exists between genome size and body size in dinosaurs, two questions critical for understanding broad patterns of genome evolution in dinosaurs. Here we report inferences of genome size for 10 sauropodomorph taxa. The estimates are derived from a Bayesian phylogenetic generalized least squares approach that generates posterior distributions of regression models relating genome size to osteocyte lacunae volume in extant tetrapods. We estimate that the average genome size of sauropodomorphs was 2.02 pg (range of species means: 1.77-2.21 pg), a value in the upper range of extant birds (mean = 1.42 pg, range: 0.97-2.16 pg) and near the average for extant non-avian reptiles (mean = 2.24 pg, range: 1.05-5.44 pg). The results suggest that the variation in size and architecture of genomes in extinct dinosaurs was lower than the variation found in mammals. A substantial difference in genome size separates the two major clades within dinosaurs, Ornithischia (large genomes) and Saurischia (moderate to small genomes). We find no relationship between body size and estimated genome size in extinct dinosaurs, which suggests that neutral forces did not dominate the evolution of genome size in this group.

  6. Sauropod dinosaurs evolved moderately sized genomes unrelated to body size

    PubMed Central

    Organ, Chris L.; Brusatte, Stephen L.; Stein, Koen

    2009-01-01

    Sauropodomorph dinosaurs include the largest land animals to have ever lived, some reaching up to 10 times the mass of an African elephant. Despite their status defining the upper range for body size in land animals, it remains unknown whether sauropodomorphs evolved larger-sized genomes than non-avian theropods, their sister taxon, or whether a relationship exists between genome size and body size in dinosaurs, two questions critical for understanding broad patterns of genome evolution in dinosaurs. Here we report inferences of genome size for 10 sauropodomorph taxa. The estimates are derived from a Bayesian phylogenetic generalized least squares approach that generates posterior distributions of regression models relating genome size to osteocyte lacunae volume in extant tetrapods. We estimate that the average genome size of sauropodomorphs was 2.02 pg (range of species means: 1.77–2.21 pg), a value in the upper range of extant birds (mean = 1.42 pg, range: 0.97–2.16 pg) and near the average for extant non-avian reptiles (mean = 2.24 pg, range: 1.05–5.44 pg). The results suggest that the variation in size and architecture of genomes in extinct dinosaurs was lower than the variation found in mammals. A substantial difference in genome size separates the two major clades within dinosaurs, Ornithischia (large genomes) and Saurischia (moderate to small genomes). We find no relationship between body size and estimated genome size in extinct dinosaurs, which suggests that neutral forces did not dominate the evolution of genome size in this group. PMID:19793755

  7. Reliability of Wireless Sensor Networks

    PubMed Central

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2014-01-01

    Wireless Sensor Networks (WSNs) consist of hundreds or thousands of sensor nodes with limited processing, storage, and battery capabilities. There are several strategies to reduce the power consumption of WSN nodes (thereby increasing the network lifetime) and to increase the reliability of the network (thereby improving the WSN Quality of Service). However, there is an inherent conflict between power consumption and reliability: an increase in reliability usually leads to an increase in power consumption. For example, routing algorithms can send the same packet through different paths (a multipath strategy), which is important for reliability but significantly increases the WSN power consumption. In this context, this paper proposes a model for evaluating the reliability of WSNs that considers the battery level as a key factor. Moreover, this model is based on routing algorithms used by WSNs. In order to evaluate the proposed models, three scenarios were considered to show the impact of power consumption on the reliability of WSNs. PMID:25157553
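The reliability/power conflict the abstract describes for multipath routing can be illustrated with a toy calculation; this is a sketch of the general trade-off, not the paper's battery-aware model, and all values are hypothetical:

```python
def multipath_tradeoff(p_path, energy_per_tx, k):
    """Sending the same packet along k independent paths: delivery
    probability grows as 1 - (1 - p)^k (a parallel system), while
    transmit energy grows linearly as k * energy_per_tx."""
    reliability = 1.0 - (1.0 - p_path) ** k
    energy = k * energy_per_tx
    return reliability, energy

r1, e1 = multipath_tradeoff(p_path=0.9, energy_per_tx=1.0, k=1)
r3, e3 = multipath_tradeoff(p_path=0.9, energy_per_tx=1.0, k=3)
# Triple redundancy lifts delivery probability from 0.9 to 0.999,
# but at three times the transmission energy.
```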

  8. Nuclear weapon reliability evaluation methodology

    SciTech Connect

    Wright, D.L.

    1993-06-01

    This document provides an overview of those activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon's stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of opportunities for review that occur throughout the entire process, which ensure a consistent approach and appropriate use of the data for reliability evaluation purposes.

  9. Predatory prokaryotes: predation and primary consumption evolved in bacteria

    NASA Technical Reports Server (NTRS)

    Guerrero, R.; Pedros-Alio, C.; Esteve, I.; Mas, J.; Chase, D.; Margulis, L.

    1986-01-01

    Two kinds of predatory bacteria have been observed and characterized by light and electron microscopy in samples from freshwater sulfurous lakes in northeastern Spain. The first bacterium, named Vampirococcus, is Gram-negative and ovoidal (0.6 micrometer wide). An anaerobic epibiont, it adheres to the surface of phototrophic bacteria (Chromatium spp.) by specific attachment structures and, as it grows and divides by fission, destroys its prey. An important in situ predatory role can be inferred for Vampirococcus from direct counts in natural samples. The second bacterium, named Daptobacter, is a Gram-negative, facultatively anaerobic straight rod (0.5 x 1.5 micrometers) with a single polar flagellum, which collides with, penetrates, and grows inside the cytoplasm of its prey (several genera of Chromatiaceae). Considering also the well-known case of Bdellovibrio, a Gram-negative, aerobic curved rod that penetrates and divides in the periplasmic space of many chemotrophic Gram-negative bacteria, there are three types of predatory prokaryotes presently known (epibiotic, cytoplasmic, and periplasmic). Thus, we conclude that antagonistic relationships such as primary consumption, predation, and scavenging had already evolved in microbial ecosystems prior to the appearance of eukaryotes. Furthermore, because they represent methods by which prokaryotes can penetrate other prokaryotes in the absence of phagocytosis, these associations can be considered preadaptations for the origin of intracellular organelles.

  10. Evolving Improvements to TRMM Ground Validation Rainfall Estimates

    NASA Technical Reports Server (NTRS)

    Robinson, M.; Kulie, M. S.; Marks, D. A.; Wolff, D. B.; Ferrier, B. S.; Amitai, E.; Silberstein, D. S.; Fisher, B. L.; Wang, J.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The primary function of the TRMM Ground Validation (GV) Program is to create GV rainfall products that provide basic validation of satellite-derived precipitation measurements for select primary sites. Since the successful 1997 launch of the TRMM satellite, GV rainfall estimates have demonstrated systematic improvements directly related to improved radar and rain gauge data, modified science techniques, and software revisions. Improved rainfall estimates have resulted in higher quality GV rainfall products and subsequently, much improved evaluation products for the satellite-based precipitation estimates from TRMM. This presentation will demonstrate how TRMM GV rainfall products created in a semi-automated, operational environment have evolved and improved through successive generations. Monthly rainfall maps and rainfall accumulation statistics for each primary site will be presented for each stage of GV product development. Contributions from individual product modifications involving radar reflectivity (Ze)-rain rate (R) relationship refinements, improvements in rain gauge bulk-adjustment and data quality control processes, and improved radar and gauge data will be discussed. Finally, it will be demonstrated that as GV rainfall products have improved, rainfall estimation comparisons between GV and satellite have converged, lending confidence to the satellite-derived precipitation measurements from TRMM.

  11. Fifty Years of Evolving Partnerships in Veterinary Medical Education.

    PubMed

    Kochevar, Deborah T

    2015-01-01

    The Association of American Veterinary Medical Colleges' (AAVMC's) role in the progression of academic veterinary medical education has been about building successful partnerships in the US and internationally. Membership in the association has evolved over the past 50 years, as have traditions of collaboration that strengthen veterinary medical education and the association. The AAVMC has become a source of information and a place for debate on educational trends, innovative pedagogy, and the value of a diverse learning environment. The AAVMC's relationship with the American Veterinary Medical Association Council on Education (AVMA COE), the accreditor of veterinary medical education recognized by the United States Department of Education (DOE), is highlighted here because of the key role that AAVMC members have played in the evolution of veterinary accreditation. The AAVMC has also been a partner in the expansion of veterinary medical education to include global health and One Health and in the engagement of international partners around shared educational opportunities and challenges. Recently, the association has reinforced its desire to be a truly international organization rather than an American organization with international members. To that end, strategic AAVMC initiatives aim to expand and connect the global community of veterinary educators to the benefit of students and the profession around the world. Tables in this article are intended to provide historical context, chronology, and an accessible way to view highlights.

  12. Complex Formation History of Highly Evolved Basaltic Shergottite, Zagami

    NASA Technical Reports Server (NTRS)

    Niihara, T.; Misawa, K.; Mikouchi, T.; Nyquist, L. E.; Park, J.; Hirata, D.

    2012-01-01

    Zagami, a basaltic shergottite, contains several kinds of lithologies, such as Normal Zagami, consisting of fine-grained (FG) and coarse-grained (CG) material, a Dark Mottled lithology (DML), and an olivine-rich late-stage melt pocket (DN). Treiman and Sutton concluded that Zagami (Normal Zagami) is a fractional crystallization product of a single magma. It has been suggested that there were two igneous stages (a deep magma chamber and a shallow magma chamber or surface lava flow) on the basis of the chemical zoning features of pyroxenes, which have homogeneous Mg-rich cores and FeO and CaO zoning at the rims. Nyquist et al. reported that FG has a different initial Sr isotopic ratio than CG and DML, and suggested the possibility of magma mixing on Mars. Here we report new results of petrology and mineralogy for DML and the olivine-rich lithology (we do not use DN here), the most evolved lithology in this rock, to understand the relationship among lithologies and reveal Zagami's formation history.

  13. Highly dynamically evolved intermediate-age open clusters

    NASA Astrophysics Data System (ADS)

    Piatti, Andrés E.; Dias, Wilton S.; Sampedro, Laura M.

    2017-04-01

    We present a comprehensive UBVRI and Washington CT1T2 photometric analysis of seven catalogued open clusters, namely: Ruprecht 3, 9, 37, 74, 150, ESO 324-15 and 436-2. The multiband photometric data sets, in combination with 2MASS photometry and Gaia astrometry for the brighter stars, were used to estimate their structural parameters and fundamental astrophysical properties. We found that Ruprecht 3 and ESO 436-2 do not show self-consistent evidence of being physical systems. The remaining objects studied are open clusters of intermediate age (9.0 ≤ log(t/yr) ≤ 9.6), of relatively small size (rcls ∼ 0.4-1.3 pc), placed between 0.6 and 2.9 kpc from the Sun. We analysed the relationships between core, half-mass, tidal and Jacobi radii, as well as half-mass relaxation times, to conclude that the studied clusters are in an evolved dynamical stage. The total cluster masses, obtained by summing those of the observed cluster stars, turned out to be ∼10-15 per cent of the masses of open clusters of similar age located closer than 2 kpc from the Sun. We found that cluster stars occupy volumes as large as those of tidally filled clusters.

  14. Hemicrania continua evolving from episodic paroxysmal hemicrania.

    PubMed

    Castellanos-Pinedo, F; Zurdo, M; Martínez-Acebes, E

    2006-09-01

    A 45-year-old woman, who had been diagnosed in our unit with episodic paroxysmal hemicrania, was seen 2 years later for ipsilateral hemicrania continua in remitting form. Both types of headache had a complete response to indomethacin and did not occur simultaneously. The patient had a previous history of episodic moderate headaches that met criteria for probable migraine without aura and also had a family history of headache. The clinical course in this case suggests a pathogenic relationship between both types of primary headache.

  15. Did the ctenophore nervous system evolve independently?

    PubMed

    Ryan, Joseph F

    2014-08-01

    Recent evidence supports the placement of ctenophores as the most distant relative to all other animals. This revised animal tree means that either the ancestor of all animals possessed neurons (and that sponges and placozoans apparently lost them) or that ctenophores developed them independently. Differentiating between these possibilities is important not only from a historical perspective, but also for the interpretation of a wide range of neurobiological results. In this short perspective paper, I review the evidence in support of each scenario and show that the relationship between the nervous system of ctenophores and other animals is an unsolved, yet tractable problem.

  16. A fourth generation reliability predictor

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Martensen, Anna L.

    1988-01-01

    A reliability/availability predictor computer program has been developed and is currently being beta-tested by over 30 US companies. The computer program is called the Hybrid Automated Reliability Predictor (HARP). HARP was developed to fill an important gap in reliability assessment capabilities. This gap was manifested through the use of its third-generation cousin, the Computer-Aided Reliability Estimation (CARE III) program, over a six-year development period and an additional three-year period during which CARE III has been in the public domain. The accumulated experience of the over 30 establishments now using CARE III was used in the development of the HARP program.

  17. Shoulder arthroplasty: evolving techniques and indications.

    PubMed

    Walch, Gilles; Boileau, Pascal; Noël, Eric

    2010-12-01

    The development of modern shoulder replacement surgery started over half a century ago with the pioneering work done by CS Neer. Several designs for shoulder prostheses are now available, allowing surgeons to select the best design for each situation. When the rotator cuff is intact, unconstrained prostheses produce reliable and reproducible results, with prosthesis survival rates of 97% after 10 years and 84% after 20 years. In patients with three- or four-part fractures of the proximal humerus, the outcome of shoulder arthroplasty depends largely on healing of the greater tuberosity, which is therefore a major treatment objective. Factors crucial to greater tuberosity union include selection of the optimal prosthesis design, flawless fixation of the tuberosities, and appropriate postoperative immobilization. The reverse shoulder prosthesis developed by Grammont has been recognized since 1991 as a valid option for patients with glenohumeral osteoarthritis. Ten-year prosthesis survival rates are 91% overall (including trauma and revisions) and 94% for glenohumeral osteoarthritis with head migration. These good results are generating interest in the reverse shoulder prosthesis as a treatment option in situations where unconstrained prostheses are unsatisfactory (primary glenohumeral osteoarthritis with marked glenoid cavity erosion; comminuted fractures in patients older than 75 years; post-traumatic osteoarthritis with severe tuberosity malunion or nonunion; massive irreparable rotator cuff tears with pseudoparalysis; failed rotator cuff repair; and proximal humerus tumor requiring resection of the rotator cuff insertions).

  18. The Evolvement of Automobile Steering System Based on TRIZ

    NASA Astrophysics Data System (ADS)

    Zhao, Xinjun; Zhang, Shuang

    Products and techniques pass through a process of birth, growth, maturity, and death, and then exit the stage, much like a biological evolution process. The development of products and techniques conforms to certain evolution rules. If people know and can apply these rules, they can design new kinds of products and forecast the development trends of those products. Enterprises can thereby grasp the future technical direction of their products and pursue product and technique innovation. Below, based on TRIZ theory, the mechanism evolvement, function evolvement and appearance evolvement of the automobile steering system are analyzed, and some new ideas about future automobile steering systems are put forward.

  19. The investigation of supply chain's reliability measure: a case study

    NASA Astrophysics Data System (ADS)

    Taghizadeh, Houshang; Hafezi, Ehsan

    2012-10-01

    In this paper, using the supply chain operations reference model, the reliability of the relationships present in a supply chain is investigated. For this purpose, in the first step, the chain under investigation is divided into several stages including first and second suppliers, initial and final customers, and the producing company. Based on the relationships formed between these stages, the supply chain system is then broken down into different subsystem parts. The relationships between the stages are based on the transportation of orders between them. Considering the location of each system element, which can take one of five forms, namely series, parallel, series/parallel, parallel/series, or combinations of these, we determine the structure of the relationships in the divided subsystems. According to reliability evaluation scales at the three levels of the supply chain, the reliability of each chain is then calculated. Finally, using the formulas for calculating reliability in combined systems, the reliability of each subsystem and ultimately of the whole system is investigated.
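The combined-system reliability formulas referred to above are the standard series and parallel rules; the sketch below applies them to a hypothetical two-supplier chain (the stage reliabilities are invented for illustration):

```python
from math import prod

def series(reliabilities):
    """Series system: every stage must function, R = product of R_i."""
    return prod(reliabilities)

def parallel(reliabilities):
    """Parallel system: at least one redundant element must function,
    R = 1 - product of (1 - R_i)."""
    return 1.0 - prod(1.0 - r for r in reliabilities)

# Hypothetical chain: two redundant first-tier suppliers feed the
# producing company, which ships through a single distributor.
suppliers = parallel([0.95, 0.90])       # = 1 - 0.05 * 0.10 = 0.995
chain = series([suppliers, 0.98, 0.97])  # producer and distributor in series
```

Mixed series/parallel and parallel/series structures are handled the same way, by composing these two functions to mirror the subsystem layout.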

  20. Impact of Device Scaling on Deep Sub-micron Transistor Reliability: A Study of Reliability Trends using SRAM

    NASA Technical Reports Server (NTRS)

    White, Mark; Huang, Bing; Qin, Jin; Gur, Zvi; Talmor, Michael; Chen, Yuan; Heidecker, Jason; Nguyen, Duc; Bernstein, Joseph

    2005-01-01

    As microelectronics are scaled into the deep sub-micron regime, users of advanced-technology CMOS, particularly in high-reliability applications, should reassess how scaling effects impact long-term reliability. An experiment-based reliability study of industrial-grade SRAMs, consisting of three different technology nodes, is proposed to substantiate current acceleration models for temperature and voltage life-stress relationships. This reliability study utilizes step-stress techniques to evaluate memory technologies (0.25 μm, 0.15 μm, and 0.13 μm) embedded in many of today's high-reliability space/aerospace applications. Two acceleration modeling approaches are presented to relate experimental FIT calculations to manufacturers' qualification data.
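A common form for such temperature/voltage life-stress acceleration is an Arrhenius term multiplied by an exponential voltage term; the sketch below uses this textbook form with illustrative parameter values (the activation energy, voltage exponent, and stress conditions are assumptions, not the values fitted in this study):

```python
from math import exp

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_use_k, t_stress_k, v_use, v_stress,
                        ea_ev=0.7, gamma=2.0):
    """Combined acceleration factor: an Arrhenius temperature term times
    an exponential voltage term. Ea (eV) and gamma (1/V) are illustrative."""
    af_temp = exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))
    af_volt = exp(gamma * (v_stress - v_use))
    return af_temp * af_volt

def fit_rate(failures, device_hours_at_stress, af):
    """Failure rate at use conditions in FIT (failures per 1e9
    device-hours), scaled down from accelerated stress-test data."""
    return failures / (device_hours_at_stress * af) * 1e9

# Hypothetical step-stress leg: 55 C use vs. 125 C / 1.5 V stress.
af = acceleration_factor(t_use_k=328.0, t_stress_k=398.0, v_use=1.2, v_stress=1.5)
```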

  1. Food Addiction: An Evolving Nonlinear Science

    PubMed Central

    Shriner, Richard; Gold, Mark

    2014-01-01

    The purpose of this review is to familiarize readers with the role that addiction plays in the formation and treatment of obesity, type 2 diabetes and disorders of eating. We will outline several useful models that integrate metabolism, addiction, and human relationship adaptations to eating. A special effort will be made to demonstrate how the use of simple and straightforward nonlinear models can and are being used to improve our knowledge and treatment of patients suffering from nutritional pathology. Moving forward, the reader should be able to incorporate some of the findings in this review into their own practice, research, teaching efforts or other interests in the fields of nutrition, diabetes, and/or bariatric (weight) management. PMID:25421535

  2. Could life have evolved in cometary nuclei

    NASA Technical Reports Server (NTRS)

    Bar-Nun, A.; Lazcano-Araujo, A.; Oro, J.

    1981-01-01

    The suggestion by Hoyle and Wickramasinghe (1978) that life might have originated in cometary nuclei rather than directly on the earth is discussed. Factors in the cometary environment including the conditions at perihelion passage leading to the ablation of cometary ices, ice temperatures, the absence of an atmosphere and discrete liquid and solid surfaces, weak cometary structure incapable of supporting a liquid core, and radiation are presented as arguments against biopoesis in comets. It is concluded that although the contribution of cometary and meteoritic matter was significant in shaping the earth environment, the view that life on earth originally arose in comets is untenable, and the proposition that the process of interplanetary infection still occurs is unlikely in view of the high specificity of host-parasite relationships.

  3. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. The design of fasteners involves variables related to fabrication and manufacturing, the material behavior of the fasteners and joined parts, the structural geometry of the joined components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify the reliability, and provides results of the analysis in terms of quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joining components. Based on the results, the paper also describes guidelines to improve reliability and verification testing.
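As a hedged illustration of the kind of uncertainty propagation this record describes (not the authors' actual methodology or data), a stress-strength interference model estimates joint reliability as the probability that strength exceeds load when both are random variables; every distribution parameter below is hypothetical:

```python
import random

def joint_reliability(n_samples=100_000, seed=1):
    """Monte Carlo estimate of P(strength > load) for a single fastened
    joint; the normal distributions and their parameters are illustrative
    placeholders, not values from the Stirling convertor study."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_samples):
        strength = rng.gauss(mu=60.0, sigma=4.0)  # hypothetical joint capacity, kN
        load = rng.gauss(mu=40.0, sigma=6.0)      # hypothetical mission load, kN
        if strength > load:
            survived += 1
    return survived / n_samples
```

With these illustrative parameters the mean margin is about 2.8 combined standard deviations, so the estimate lands near 0.997; sensitivity to a design variable can be probed by re-running with perturbed parameters.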

  4. Computer-Aided Reliability Estimation

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.; Stiffler, J. J.; Bryant, L. A.; Petersen, P. L.

    1986-01-01

CARE III (Computer-Aided Reliability Estimation, Third Generation) helps estimate the reliability of complex, redundant, fault-tolerant systems. The program is specifically designed for the evaluation of fault-tolerant avionics systems; however, CARE III is general enough for use in evaluating other systems as well.

  5. The Validity of Reliability Measures.

    ERIC Educational Resources Information Center

    Seddon, G. M.

    1988-01-01

    Demonstrates that some commonly used indices can be misleading in their quantification of reliability. The effects are most pronounced on gain or difference scores. Proposals are made to avoid sources of invalidity by using a procedure to assess reliability in terms of upper and lower limits for the true scores of each examinee. (Author/JDH)

  6. Reliability in CMOS IC processing

    NASA Technical Reports Server (NTRS)

    Shreeve, R.; Ferrier, S.; Hall, D.; Wang, J.

    1990-01-01

    Critical CMOS IC processing reliability monitors are defined in this paper. These monitors are divided into three categories: process qualifications, ongoing production workcell monitors, and ongoing reliability monitors. The key measures in each of these categories are identified and prioritized based on their importance.

  7. Essay Reliability: Form and Meaning.

    ERIC Educational Resources Information Center

    Shale, Doug

    This study is an attempt at a cohesive characterization of the concept of essay reliability. As such, it takes as a basic premise that previous and current practices in reporting reliability estimates for essay tests have certain shortcomings. The study provides an analysis of these shortcomings--partly to encourage a fuller understanding of the…

  8. Reliability-based design optimization using efficient global reliability analysis.

    SciTech Connect

    Bichon, Barron J.; Mahadevan, Sankaran; Eldred, Michael Scott

    2010-05-01

    Finding the optimal (lightest, least expensive, etc.) design for an engineered component that meets or exceeds a specified level of reliability is a problem of obvious interest across a wide spectrum of engineering fields. Various methods for this reliability-based design optimization problem have been proposed. Unfortunately, this problem is rarely solved in practice because, regardless of the method used, solving the problem is too expensive or the final solution is too inaccurate to ensure that the reliability constraint is actually satisfied. This is especially true for engineering applications involving expensive, implicit, and possibly nonlinear performance functions (such as large finite element models). The Efficient Global Reliability Analysis method was recently introduced to improve both the accuracy and efficiency of reliability analysis for this type of performance function. This paper explores how this new reliability analysis method can be used in a design optimization context to create a method of sufficient accuracy and efficiency to enable the use of reliability-based design optimization as a practical design tool.

  9. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  10. Software Reliability, Measurement, and Testing Software Reliability and Test Integration

    DTIC Science & Technology

    1992-04-01

process variables on software reliability. A guidebook was produced to help program managers control and manage software reliability and testing...Integrated Reliability Management System "IRMS" 17 1.6 Organization of Report 17 2.0 SURVEYS 20 2.1 Software Projects Survey 20 2.1.1 Candidate Projects...Systems 103 4.2.2.3 Compiler 103 4.2.2.4 Data Management and Analysis 103 4.2.3 Test/Support Tools 103 4.2.3.1 DEC Test Manager 103 4.2.3.2 SDDL 103

  11. PV Reliability Development Lessons from JPL's Flat Plate Solar Array Project

    NASA Technical Reports Server (NTRS)

    Ross, Ronald G., Jr.

    2013-01-01

Key reliability and engineering lessons learned from the 20-year history of the Jet Propulsion Laboratory's Flat-Plate Solar Array Project and thin-film module reliability research activities are presented and analyzed. Particular emphasis is placed on lessons applicable to evolving new module technologies and the organizations involved with these technologies. The user-specific demand for reliability is a strong function of the application, its location, and its expected duration. Lessons relative to effective means of specifying reliability are described, and commonly used test requirements are assessed from the standpoint of which are the most troublesome to pass and which correlate best with field experience. Module design lessons are also summarized, including the significance of the most frequently encountered failure mechanisms and the role of encapsulant and cell reliability in determining module reliability. Lessons pertaining to research, design, and test approaches include the historical role and usefulness of qualification tests and field tests.

  12. Adaptation of Escherichia coli to glucose promotes evolvability in lactose.

    PubMed

    Phillips, Kelly N; Castillo, Gerardo; Wünsche, Andrea; Cooper, Tim F

    2016-02-01

    The selective history of a population can influence its subsequent evolution, an effect known as historical contingency. We previously observed that five of six replicate populations that were evolved in a glucose-limited environment for 2000 generations, then switched to lactose for 1000 generations, had higher fitness increases in lactose than populations started directly from the ancestor. To test if selection in glucose systematically increased lactose evolvability, we started 12 replay populations--six from a population subsample and six from a single randomly selected clone--from each of the six glucose-evolved founder populations. These replay populations and 18 ancestral populations were evolved for 1000 generations in a lactose-limited environment. We found that replay populations were initially slightly less fit in lactose than the ancestor, but were more evolvable, in that they increased in fitness at a faster rate and to higher levels. This result indicates that evolution in the glucose environment resulted in genetic changes that increased the potential of genotypes to adapt to lactose. Genome sequencing identified four genes--iclR, nadR, spoT, and rbs--that were mutated in most glucose-evolved clones and are candidates for mediating increased evolvability. Our results demonstrate that short-term selective costs during selection in one environment can lead to changes in evolvability that confer longer term benefits.

  13. Loops and autonomy promote evolvability of ecosystem networks.

    PubMed

    Luo, Jianxi

    2014-09-29

    The structure of ecological networks, in particular food webs, determines their ability to evolve further, i.e. evolvability. The knowledge about how food web evolvability is determined by the structures of diverse ecological networks can guide human interventions purposefully to either promote or limit evolvability of ecosystems. However, the focus of prior food web studies was on stability and robustness; little is known regarding the impact of ecological network structures on their evolvability. To correlate ecosystem structure and evolvability, we adopt the NK model originally from evolutionary biology to generate and assess the ruggedness of fitness landscapes of a wide spectrum of model food webs with gradual variation in the amount of feeding loops and link density. The variation in network structures is controlled by linkage rewiring. Our results show that more feeding loops and lower trophic link density, i.e. higher autonomy of species, of food webs increase the potential for the ecosystem to generate heritable variations with improved fitness. Our findings allow the prediction of the evolvability of actual food webs according to their network structures, and provide guidance to enhancing or controlling the evolvability of specific ecosystems.
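The NK model the authors adopt can be sketched in a few lines. The version below is the standard formulation from evolutionary biology (each locus's fitness contribution depends on itself and K neighbors), with parameters chosen for illustration rather than taken from the paper; ruggedness is proxied by counting local optima:

```python
import itertools
import random

def nk_landscape(n, k, seed=0):
    """Random NK fitness landscape: locus i's contribution depends on
    itself and its K right-hand neighbors (circular genome)."""
    rng = random.Random(seed)
    tables = [{bits: rng.random()
               for bits in itertools.product((0, 1), repeat=k + 1)}
              for _ in range(n)]
    def fitness(genome):
        return sum(tables[i][tuple(genome[(i + j) % n] for j in range(k + 1))]
                   for i in range(n)) / n
    return fitness

def count_local_optima(n, fitness):
    """A genotype is a local optimum if no single-bit mutant is fitter;
    more local optima means a more rugged landscape."""
    count = 0
    for genome in itertools.product((0, 1), repeat=n):
        f = fitness(genome)
        if all(fitness(genome[:i] + (1 - genome[i],) + genome[i + 1:]) <= f
               for i in range(n)):
            count += 1
    return count
```

For K = 0 the landscape is additive and has a single peak; raising K toward N − 1 multiplies the local optima, i.e. increases ruggedness, which is the quantity the authors correlate with food web structure.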

  14. Evolving Expert Knowledge Bases: Applications of Crowdsourcing and Serious Gaming to Advance Knowledge Development for Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Floryan, Mark

    2013-01-01

    This dissertation presents a novel effort to develop ITS technologies that adapt by observing student behavior. In particular, we define an evolving expert knowledge base (EEKB) that structures a domain's information as a set of nodes and the relationships that exist between those nodes. The structure of this model is not the particularly novel…

  15. Impact of staffing parameters on operational reliability

    SciTech Connect

    Hahn, H.A.; Houghton, F.K.

    1993-02-01

This paper reports on a project related to human resource management of the Department of Energy's (DOE's) High-Level Waste (HLW) Tank program. Safety and reliability of waste tank operations is impacted by several issues, including not only the design of the tanks themselves, but also how operations and operational personnel are managed. As demonstrated by management assessments performed by the Tiger Teams, DOE believes that the effective use of human resources impacts environment, safety, and health concerns. For the purposes of the current paper, human resource management activities are identified as ``Staffing`` and include the process of developing the functional responsibilities and qualifications of technical and administrative personnel. This paper discusses the importance of staffing plans and management in the overall view of safety and reliability, describes the work activities and procedures associated with the project, and presents a review of the results of these activities, including a summary of the literature and a preliminary analysis of the data. We conclude that although identification of staffing issues and the development of staffing plans contributes to the overall reliability and safety of the HLW tanks, the relationship is not well understood and is in need of further development.

  17. Gene Essentiality Is a Quantitative Property Linked to Cellular Evolvability.

    PubMed

    Liu, Gaowen; Yong, Mei Yun Jacy; Yurieva, Marina; Srinivasan, Kandhadayar Gopalan; Liu, Jaron; Lim, John Soon Yew; Poidinger, Michael; Wright, Graham Daniel; Zolezzi, Francesca; Choi, Hyungwon; Pavelka, Norman; Rancati, Giulia

    2015-12-03

    Gene essentiality is typically determined by assessing the viability of the corresponding mutant cells, but this definition fails to account for the ability of cells to adaptively evolve to genetic perturbations. Here, we performed a stringent screen to assess the degree to which Saccharomyces cerevisiae cells can survive the deletion of ~1,000 individual "essential" genes and found that ~9% of these genetic perturbations could in fact be overcome by adaptive evolution. Our analyses uncovered a genome-wide gradient of gene essentiality, with certain essential cellular functions being more "evolvable" than others. Ploidy changes were prevalent among the evolved mutant strains, and aneuploidy of a specific chromosome was adaptive for a class of evolvable nucleoporin mutants. These data justify a quantitative redefinition of gene essentiality that incorporates both viability and evolvability of the corresponding mutant cells and will enable selection of therapeutic targets associated with lower risk of emergence of drug resistance.

  18. Evolving minds: Helping students with cognitive dissonance

    NASA Astrophysics Data System (ADS)

    Bramschreiber, Terry L.

Even 150 years after Charles Darwin published On the Origin of Species, public school teachers still find themselves dealing with student resistance to learning about biological evolution. Some teachers deal with this pressure by undermining, deemphasizing, or even omitting the topic in their science curriculum. Others face the challenge and deliver solid scientific instruction in evolutionary theory despite the conflicts that may arise. The latter were the topic of this study. I interviewed five teachers who had experience dealing with resistance to learning evolution in their school communities. Through these in-depth interviews, I examined the strategies these teachers use when facing resistance and how they help students deal with the cognitive dissonance that may be experienced when learning about evolution. I selected the qualitative method of educational criticism and connoisseurship to organize and categorize my data. From the interviews, the following findings emerged. Experienced teachers increased their confidence in teaching evolution by pursuing outside professional development, learning more not only about evolutionary theory but also about creationist arguments against evolution. These teachers front-load their curriculum, integrating the nature of science into their lessons to address misunderstandings about how science works. They also highlight the importance of learning evolutionary theory while assuring students that they have no agenda to indoctrinate them. Finally, these experienced teachers work hard to create an intellectually safe learning environment and to build trusting and respectful relationships with their students.

  19. Fundamental mechanisms of micromachine reliability

    SciTech Connect

    DE BOER,MAARTEN P.; SNIEGOWSKI,JEFFRY J.; KNAPP,JAMES A.; REDMOND,JAMES M.; MICHALSKE,TERRY A.; MAYER,THOMAS K.

    2000-01-01

Due to extreme surface-to-volume ratios, adhesion and friction are critical properties for reliability of Microelectromechanical Systems (MEMS), but are not well understood. In this LDRD the authors established test structures, metrology and numerical modeling to conduct studies on adhesion and friction in MEMS. They then concentrated on measuring the effect of environment on MEMS adhesion. Polycrystalline silicon (polysilicon) is the primary material of interest in MEMS because of its integrated circuit process compatibility, low stress, high strength and conformal deposition nature. A plethora of useful micromachined device concepts have been demonstrated using Sandia National Laboratories' sophisticated in-house capabilities. One drawback to polysilicon is that in air the surface oxidizes, is high energy and is hydrophilic (i.e., it wets easily). This can lead to catastrophic failure because surface forces can cause MEMS parts that are brought into contact to adhere rather than perform their intended function. A fundamental concern is how environmental constituents such as water will affect adhesion energies in MEMS. The authors first demonstrated an accurate method to measure adhesion as reported in Chapter 1. In Chapters 2 through 5, they then studied the effect of water on adhesion depending on the surface condition (hydrophilic or hydrophobic). As described in Chapter 2, they find that adhesion energy of hydrophilic MEMS surfaces is high and increases exponentially with relative humidity (RH). Surface roughness is the controlling mechanism for this relationship. Adhesion can be reduced by several orders of magnitude by silane coupling agents applied via solution processing. They decrease the surface energy and render the surface hydrophobic (i.e., it does not wet easily). However, only a molecular monolayer coats the surface. In Chapters 3-5 the authors map out the extent to which the monolayer reduces adhesion versus RH. They find that adhesion is independent of

  20. A reliable multicast for XTP

    NASA Technical Reports Server (NTRS)

    Dempsey, Bert J.; Weaver, Alfred C.

    1990-01-01

    Multicast services needed for current distributed applications on LAN's fall generally into one of three categories: datagram, semi-reliable, and reliable. Transport layer multicast datagrams represent unreliable service in which the transmitting context 'fires and forgets'. XTP executes these semantics when the MULTI and NOERR mode bits are both set. Distributing sensor data and other applications in which application-level error recovery strategies are appropriate benefit from the efficiency in multidestination delivery offered by datagram service. Semi-reliable service refers to multicasting in which the control algorithms of the transport layer--error, flow, and rate control--are used in transferring the multicast distribution to the set of receiving contexts, the multicast group. The multicast defined in XTP provides semi-reliable service. Since, under a semi-reliable service, joining a multicast group means listening on the group address and entails no coordination with other members, a semi-reliable facility can be used for communication between a client and a server group as well as true peer-to-peer group communication. Resource location in a LAN is an important application domain. The term 'semi-reliable' refers to the fact that group membership changes go undetected. No attempt is made to assess the current membership of the group at any time--before, during, or after--the data transfer.

  1. Calculating system reliability with SRFYDO

    SciTech Connect

    Morzinski, Jerome; Anderson - Cook, Christine M; Klamann, Richard M

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
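As a rough sketch of the Bayesian series-system idea (a much-simplified stand-in for SRFYDO's full model, which also handles age and usage covariates), pass/fail component test data can be turned into posterior draws of system reliability; the test counts below are invented for illustration:

```python
import random

def system_reliability_posterior(component_data, n_draws=20_000, seed=7):
    """Posterior draws of series-system reliability from pass/fail
    component tests, using independent Beta(1, 1) priors per component.
    This is an illustrative simplification, not SRFYDO's actual model."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        r = 1.0
        for successes, trials in component_data:
            # Conjugate update: Beta(1 + successes, 1 + failures)
            r *= rng.betavariate(1 + successes, 1 + trials - successes)
        draws.append(r)
    return draws
```

Summarizing the draws (median, quantiles) yields reliability estimates with uncertainty at the system level, analogous to the estimates-with-uncertainty output described in the record.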

  2. The 3-cycle weighted spectral distribution in evolving community-based networks

    NASA Astrophysics Data System (ADS)

    Jiao, Bo; Wu, Xiaoqun

    2017-03-01

One of the main organizing principles in real-world networks is that of network communities, where sets of nodes organize into densely linked clusters. Many of these community-based networks evolve over time, so size-independent metrics are needed to capture the connection relationships embedded in these clusters. One such metric is the average clustering coefficient, which represents the triangle relationships between all nodes of a network. However, the vast majority of network communities are composed of low-degree nodes, so other size-independent metrics are needed to subtly measure the triangle relationships between low-degree nodes. In this paper, we study the 3-cycle weighted spectral distribution (WSD), defined as the weighted sum of the normalized Laplacian spectral distribution with a scaling factor n, where n is the network size (i.e., the node number). Using diachronic community-based network models and real-world networks, we demonstrate that the ratio of the 3-cycle WSD to the network size is asymptotically independent of the network size and strictly represents the triangle relationships between low-degree nodes. Additionally, we find that the ratio is a good indicator of the average clustering coefficient in evolving community-based systems.
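A minimal sketch of the metric, under the common definition of the N-cycle WSD as the sum of (1 − λ)^N over normalized-Laplacian eigenvalues (the record's exact binning and weighting may differ):

```python
import numpy as np

def wsd3(adj):
    """3-cycle weighted spectral distribution: sum of (1 - lambda)^3 over
    the eigenvalues of the normalized Laplacian L = I - D^{-1/2} A D^{-1/2}.
    This equals the trace of the cubed normalized adjacency matrix, i.e. a
    triangle count down-weighted by the degrees of the nodes visited --
    which is why triangles among low-degree nodes dominate."""
    adj = np.asarray(adj, dtype=float)
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    norm_adj = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    lam = np.linalg.eigvalsh(np.eye(len(adj)) - norm_adj)
    return float(np.sum((1.0 - lam) ** 3))
```

A triangle graph gives 0.75 while a triangle-free path gives 0; the size-independent quantity studied in the record is this value divided by the network size n.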

  3. Enhancing the Principal-School Counselor Relationship: Toolkit

    ERIC Educational Resources Information Center

    College Board Advocacy & Policy Center, 2011

    2011-01-01

    The College Board, NASSP and ASCA believe that the principal-counselor relationship is a dynamic and organic relationship that evolves over time in response to the ever-changing needs of a school. The goal of an effective principal-counselor relationship is to use the strength of the relationship to collaboratively lead school reform efforts to…

4. (Centralized Reliability Data Organization (CREDO))

    SciTech Connect

    Haire, M J

    1987-04-21

    One of the primary goals of the Centralized Reliability Data Organization (CREDO) is to be an international focal point for the collection, analysis, and dissemination of liquid metal reactor (LMR) component reliability, availability, and maintainability (RAM) data. During FY-1985, the Department of Energy (DOE) entered into a Specific Memorandum of Agreement (SMA) with Japan's Power Reactor and Nuclear Fuel Development Corporation (PNC) regarding cooperative data exchange efforts. This agreement was CREDO's first step toward internationalization and represented an initial realization of the previously mentioned goal. DOE's interest in further internationalization of the CREDO system was the primary motivation for the traveler's attendance at the Reliability '87 conference.

  5. Integrated modular engine - Reliability assessment

    NASA Astrophysics Data System (ADS)

    Parsley, R. C.; Ward, T. B.

    1992-07-01

    A major driver in the increased interest in integrated modular engine configurations is the desire for ultra reliability for future rocket propulsion systems. The concept of configuring multiple sets of turbomachinery networked to multiple thrust chamber assemblies has been identified as an approach with potential to achieve significant reliability enhancement. This paper summarizes the results of a reliability study comparing networked systems vs. discrete engine installations, both with and without major module and engine redundancy. The study was conducted for gas generator, expander, and staged combustion cycles. The results are representative of either booster or upper-stage applications and are indicative of either plug or nonplug installation philosophies.

  6. Aerospace reliability applied to biomedicine.

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.; Vargo, D. J.

    1972-01-01

    An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.

7. NASA's Space Launch System: An Evolving Capability for Exploration

    NASA Technical Reports Server (NTRS)

    Creech, Stephen D.; Crumbly, Christopher M.; Robinson, Kimerly F.

    2016-01-01

    A foundational capability for international human deep-space exploration, NASA's Space Launch System (SLS) vehicle represents a new spaceflight infrastructure asset, creating opportunities for mission profiles and space systems that cannot currently be executed. While the primary purpose of SLS, which is making rapid progress towards initial launch readiness in two years, will be to support NASA's Journey to Mars, discussions are already well underway regarding other potential utilization of the vehicle's unique capabilities. In its initial Block 1 configuration, capable of launching 70 metric tons (t) to low Earth orbit (LEO), SLS is capable of propelling the Orion crew vehicle to cislunar space, while also delivering small CubeSat-class spacecraft to deep-space destinations. With the addition of a more powerful upper stage, the Block 1B configuration of SLS will be able to deliver 105 t to LEO and enable more ambitious human missions into the proving ground of space. This configuration offers opportunities for launching co-manifested payloads with the Orion crew vehicle, and a class of secondary payloads, larger than today's CubeSats. Further upgrades to the vehicle, including advanced boosters, will evolve its performance to 130 t in its Block 2 configuration. Both Block 1B and Block 2 also offer the capability to carry 8.4- or 10-m payload fairings, larger than any contemporary launch vehicle. With unmatched mass-lift capability, payload volume, and C3, SLS not only enables spacecraft or mission designs currently impossible with contemporary EELVs, it also offers enhancing benefits, such as reduced risk, operational costs and/or complexity, shorter transit time to destination or launching large systems either monolithically or in fewer components. This paper will discuss both the performance and capabilities of Space Launch System as it evolves, and the current state of SLS utilization planning.

  8. Statistical Significance and Reliability Analyses in Recent "Journal of Counseling & Development" Research Articles.

    ERIC Educational Resources Information Center

    Thompson, Bruce; Snyder, Patricia A.

    1998-01-01

    Investigates two aspects of research analyses in quantitative research studies reported in the 1996 issues of "Journal of Counseling & Development" (JCD). Acceptable methodological practice regarding significance testing and evaluation of score reliability has evolved considerably. Contemporary thinking on these issues is described; practice as…

  9. Static and Evolving Norovirus Genotypes: Implications for Epidemiology and Immunity

    PubMed Central

    Karangwa, Consolee K.; Sosnovtsev, Stanislav V.

    2017-01-01

Noroviruses are major pathogens associated with acute gastroenteritis worldwide. Their RNA genomes are diverse, with two major genogroups (GI and GII) comprising at least 28 genotypes associated with human disease. To elucidate mechanisms underlying norovirus diversity and evolution, we used a large-scale genomics approach to analyze human norovirus sequences. Comparison of over 2000 nearly full-length ORF2 sequences representing most of the known GI and GII genotypes infecting humans showed a limited number (≤5) of distinct intra-genotypic variants within each genotype, with the exception of GII.4. The non-GII.4 genotypes comprised one or more intra-genotypic variants, with each variant containing strains that differed by only a few residues over several decades (remaining “static”) and that have co-circulated with no clear epidemiologic pattern. In contrast, the GII.4 genotype presented the largest number of variants (>10) that have evolved over time with a clear pattern of periodic variant replacement. To expand our understanding of these two patterns of diversification (“static” versus “evolving”), we used next-generation sequencing (NGS) to analyze nearly full-length norovirus genomes from healthy individuals infected with GII.4, GII.6, or GII.17 viruses in different outbreak settings. The GII.4 viruses accumulated mutations rapidly within and between hosts, while the GII.6 and GII.17 viruses remained relatively stable, consistent with their diversification patterns. Further analysis of genetic relationships and natural history patterns identified groupings of certain genotypes into larger related clusters designated here as “immunotypes”. We propose that “immunotypes” and their evolutionary patterns influence the prevalence of a particular norovirus genotype in the human population. PMID:28103318

  10. Diversity Against Adversity: How Adaptive Immune System Evolves Potent Antibodies

    NASA Astrophysics Data System (ADS)

    Heo, Muyoung; Zeldovich, Konstantin B.; Shakhnovich, Eugene I.

    2011-07-01

    Adaptive immunity is an amazing mechanism, whereby new protein functions—affinity of antibodies (Immunoglobulins) to new antigens—evolve through mutation and selection in a matter of a few days. Despite numerous experimental studies, the fundamental physical principles underlying immune response are still poorly understood. In considerable departure from past approaches, here we propose a microscopic multiscale model of adaptive immune response, which consists of three essential players: the host cells, viruses, and B-cells in Germinal Centers (GC). Each moiety carries a genome, which encodes proteins whose stability and interactions are determined from their sequences using laws of Statistical Mechanics, providing an exact relationship between genomic sequences and the strength of interactions between pathogens and antibodies, and between antibodies and host proteins (autoimmunity). We find that evolution of potent antibodies (the process known as Affinity Maturation (AM)) is a delicate balancing act, which has to reconcile the conflicting requirements of protein stability, lack of autoimmunity, and high affinity of antibodies to incoming antigens. This becomes possible only when antibody-producing B cells elevate their mutation rates (a process known as Somatic Hypermutation (SHM)) to fall into a certain range—high enough to find potency-increasing mutations, but not so high as to destroy stable Immunoglobulins and/or already achieved affinity. Potent antibodies develop through clonal expansion of initial B cells expressing marginally potent antibodies, followed by their subsequent affinity maturation through mutation and selection. As a result, in each GC the population of mature potent Immunoglobulins is monoclonal, descended from a single cell in the initial (germline) pool. We developed a simple analytical theory, which provides further rationale for our findings. The model and theory reveal the molecular factors that determine the efficiency of affinity maturation.

  11. Reliability science and patient safety.

    PubMed

    Luria, Joseph W; Muething, Stephen E; Schoettker, Pamela J; Kotagal, Uma R

    2006-12-01

    Reliability is failure-free operation over time--the measurable capability of a process, procedure, or service to perform its intended function. Reliability science has the potential to help health care organizations reduce defects in care, increase the consistency with which care is delivered, and improve patient outcomes. Based on its principles, the Institute for Healthcare Improvement has developed a three-step model to prevent failures, mitigate the failures that occur, and redesign systems to reduce failures. Lessons may also be learned from complex organizations that have already adopted the principles of reliability science and operate with high rates of reliability. They share a preoccupation with failure, reluctance to simplify interpretations, sensitivity to operations, commitment to resilience, and underspecification of structures.

  12. How Reliable Is Laboratory Testing?

    MedlinePlus

    ... but is constantly monitored for reliability through comprehensive quality control and quality assurance procedures. Therefore, when your blood ...

  13. Reliability and Maintainability (RAM) Training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Packard, Michael H. (Editor)

    2000-01-01

    The theme of this manual is failure physics: the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost reliable products. In a broader sense the manual should do more. It should underscore the urgent need for mature attitudes toward reliability. Five of the chapters were originally presented as a classroom course to over 1000 Martin Marietta engineers and technicians. Another four chapters and three appendixes have been added. We begin with a view of reliability from the years 1940 to 2000. Chapter 2 starts the training material with a review of mathematics and a description of what elements contribute to product failures. The remaining chapters elucidate basic reliability theory and the disciplines that allow us to control and eliminate failures.

  14. Reliability Quantification of Advanced Stirling Convertor (ASC) Components

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward

    2010-01-01

    The Advanced Stirling Convertor (ASC) is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17-year mission, the ASC must provide power with desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints. Reliability demonstration must involve the application of analysis, system and component level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test data verification is a viable approach to assess the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationships among the design variables based on physics, mechanics, material behavior models, and the interaction of different components and their respective disciplines such as structures, materials, fluid, thermal, mechanical, electrical, etc. In addition, these models are based on the available test data, which can be updated, and the analysis refined, as more data and information become available. The failure mechanisms and causes of failure are included in the analysis, especially in light of new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on the fundamental physical behavior of components and their relationships with other components has demonstrated itself to be superior to conventional reliability approaches that rely on failure rates derived from similar equipment or simply on expert judgment.

  15. By-product information can stabilize the reliability of communication.

    PubMed

    Schaefer, H Martin; Ruxton, G D

    2012-12-01

    Although communication underpins many biological processes, its function and basic definition remain contentious. In particular, researchers have debated whether information should be an integral part of a definition of communication and how it remains reliable. So far the handicap principle, which assumes that signal costs stabilize reliable communication, has been the predominant paradigm in the study of animal communication. The role of by-product information produced by mechanisms other than the communicative interaction has been neglected in the debate on signal reliability. We argue, first, that by-product information is common and that it provides the starting point for ritualization as the process of the evolution of communication. Second, by-product information remains unchanged during ritualization and enforces reliable communication by restricting the options for manipulation and cheating. Third, this perspective changes the focus of research on communication from studying signal costs to studying the costs of cheating. It can thus explain the reliability of signalling in many communication systems that do not rely on handicaps. We emphasize that communication can often be informative but that the evolution of communication does not cause the evolution of information, because by-product information often predates and stimulates the evolution of communication. Communication is thus a consequence but not a cause of reliability. Communication is the interplay of inadvertent, informative traits and evolved traits that increase the stimulation and perception of perceivers. Viewing communication as a complex of inadvertent and derived traits facilitates understanding of the selective pressures shaping communication and those shaping information and its reliability. This viewpoint further contributes to resolving the current controversy on the role of information in communication.

  16. Accelerator Availability and Reliability Issues

    SciTech Connect

    Steve Suhring

    2003-05-01

    Maintaining reliable machine operations for existing machines as well as planning for future machines' operability present significant challenges to those responsible for system performance and improvement. Changes to machine requirements and beam specifications often reduce overall machine availability in an effort to meet user needs. Accelerator reliability issues from around the world will be presented, followed by a discussion of the major factors influencing machine availability.

  17. Reliability Validation and Improvement Framework

    DTIC Science & Technology

    2012-11-01

    ... discover and remove bugs using various test coverage metrics to determine test sufficiency. Failure-probability density function based on code metrics ... Coverage Metrics: Traditional reliability engineering has focused on fault density and reliability growth as key metrics. These are statistical ... abs_all.jsp?arnumber=781027 [Kwiatkowska 2010] Kwiatkowska, M., Norman, G., & Parker, D. “Advances and Challenges of Probabilistic Model Checking”

  18. Robust fusion with reliabilities weights

    NASA Astrophysics Data System (ADS)

    Grandin, Jean-Francois; Marques, Miguel

    2002-03-01

    Reliability is a measure of the degree of trust in a given measurement. We analyze and compare: ML (Classical Maximum Likelihood), MLE (Maximum Likelihood weighted by Entropy), MLR (Maximum Likelihood weighted by Reliability), MLRE (Maximum Likelihood weighted by Reliability and Entropy), DS (Credibility Plausibility), and DSR (DS weighted by reliabilities). The analysis is based on a model of a dynamical fusion process composed of three sensors, each of which has its own discriminatory capacity, reliability rate, unknown bias, and measurement noise. The knowledge of uncertainties is also severely corrupted, in order to analyze the robustness of the different fusion operators. Two sensor models are used: the first type of sensor is able to estimate the probability of each elementary hypothesis (probabilistic masses); the second type of sensor delivers masses on unions of elementary hypotheses (DS masses). In the second case probabilistic reasoning leads to sharing the mass abusively between elementary hypotheses. Compared to the classical ML or DS, which achieve just 50% correct classification in some experiments, DSR, MLE, MLR and MLRE reveal very good performance on all experiments (more than an 80% correct classification rate). The experiments were performed with large variations of the reliability coefficients for each sensor (from 0 to 1), and with large variations in the knowledge of these coefficients (from 0 to 0.8). All four operators reveal good robustness, but MLR proves to be uniformly dominant on all the experiments in the Bayesian case and achieves the best mean performance under incomplete a priori information.
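
    The reliability-weighted likelihood fusion described above can be sketched in a few lines. This is an illustrative reading of the MLR idea only; the weighting scheme, sensor probabilities, and reliability values below are assumptions, not the authors' implementation. Each sensor's likelihood vector is raised to the power of its reliability, so an unreliable sensor contributes a flatter, less decisive factor.

```python
def fuse_mlr(sensor_probs, reliabilities):
    """Reliability-weighted maximum-likelihood (MLR-style) fusion sketch.

    Each sensor's probability vector over the hypotheses is raised to the
    power of its reliability in [0, 1] before the likelihoods are multiplied,
    so an unreliable sensor (r near 0) contributes an almost-flat factor.
    Returns the normalized fused posterior.
    """
    n = len(sensor_probs[0])
    fused = [1.0] * n
    for probs, r in zip(sensor_probs, reliabilities):
        for i, p in enumerate(probs):
            fused[i] *= p ** r          # r = 0 -> sensor effectively ignored
    total = sum(fused)
    return [f / total for f in fused]

# Three sensors over two hypotheses; sensor 3 is confident but unreliable.
posterior = fuse_mlr(
    [[0.7, 0.3], [0.6, 0.4], [0.1, 0.9]],
    [0.9, 0.8, 0.1],
)
print(posterior)  # hypothesis 0 dominates despite sensor 3's confident vote
```

    With equal reliabilities of 1 this reduces to classical ML fusion, which is why down-weighting by reliability is a natural robustification.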

  19. MEMS reliability: coming of age

    NASA Astrophysics Data System (ADS)

    Douglass, Michael R.

    2008-02-01

    In today's high-volume semiconductor world, one could easily take reliability for granted. As the MOEMS/MEMS industry continues to establish itself as a viable alternative to conventional manufacturing in the macro world, reliability can be of high concern. Currently, there are several emerging market opportunities in which MOEMS/MEMS is gaining a foothold. Markets such as mobile media, consumer electronics, biomedical devices, and homeland security are all showing great interest in microfabricated products. At the same time, these markets are among the most demanding when it comes to reliability assurance. To be successful, each company developing a MOEMS/MEMS device must consider reliability on an equal footing with cost, performance and manufacturability. What can this maturing industry learn from the successful development of DLP technology, air bag accelerometers and inkjet printheads? This paper discusses some basic reliability principles which any MOEMS/MEMS device development must use. Examples from the commercially successful and highly reliable Digital Micromirror Device complement the discussion.

  20. Approximation of reliability of direct genomic breeding values

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Two methods to efficiently approximate theoretical genomic reliabilities are presented. The first method is based on the direct inverse of the left hand side (LHS) of mixed model equations. It uses the genomic relationship matrix for a small subset of individuals with the highest genomic relationshi...

  1. Reliability analysis of common hazardous waste treatment processes

    SciTech Connect

    Waters, Robert D.

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
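
    The safety-factor/reliability relationship the abstract refers to can be sketched with a minimal Monte Carlo model. The lognormal capacity model, coefficient of variation, and normalized demand below are illustrative assumptions, not the treatment-process models used in the paper.

```python
import math
import random

random.seed(42)

def reliability(safety_factor, cov=0.3, n=100_000):
    """Estimate P(capacity >= demand) by Monte Carlo.

    Process capacity is modeled as lognormal with mean equal to
    safety_factor times the demand (demand normalized to 1.0) and
    coefficient of variation `cov`. Reliability is the fraction of
    sampled capacities that meet the demand.
    """
    mu = math.log(safety_factor / math.sqrt(1 + cov ** 2))
    sigma = math.sqrt(math.log(1 + cov ** 2))
    hits = sum(1 for _ in range(n) if random.lognormvariate(mu, sigma) >= 1.0)
    return hits / n

for sf in (1.0, 1.5, 2.0, 3.0):
    print(f"safety factor {sf:.1f}: reliability ~ {reliability(sf):.3f}")
```

    The same loop structure applies to any of the five processes once a distribution for its performance is chosen; only the capacity model changes.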

  2. The Reliability of Content Analysis of Computer Conference Communication

    ERIC Educational Resources Information Center

    Rattleff, Pernille

    2007-01-01

    The focus of this article is the reliability of content analysis of students' computer conference communication. Content analysis is often used when researching the relationship between learning and the use of information and communications technology in educational settings. A number of studies where content analysis is used and classification…

  3. Evolving role of pharmaceutical physicians in the industry: Indian perspective.

    PubMed

    Patil, Anant; Rajadhyaksha, Viraj

    2012-01-01

    The Indian pharmaceutical industry, like any other industry, has undergone significant change in the last decade. The role of a Medical advisor has always been of paramount importance in the pharmaceutical companies in India. On account of the evolving medical science and the competitive environment, the medical advisor's role is also increasingly becoming critical. In India, with changes in regulatory rules, safety surveillance, and concept of medical liaisons, the role of the medical advisor is evolving continuously and is further likely to evolve in the coming years in important areas like health economics, public private partnerships, and strategic planning.

  4. Heterogeneous edge weights promote epidemic diffusion in weighted evolving networks

    NASA Astrophysics Data System (ADS)

    Duan, Wei; Song, Zhichao; Qiu, Xiaogang

    2016-08-01

    The impact that the heterogeneities of links’ weights have on epidemic diffusion in weighted networks has received much attention. Investigating how heterogeneous edge weights affect epidemic spread is helpful for disease control. In this paper, we study a Reed-Frost epidemic model in weighted evolving networks. Our results indicate that a higher heterogeneity of edge weights leads to higher epidemic prevalence and epidemic incidence at the early stage of epidemic diffusion in weighted evolving networks. In addition, weighted evolving scale-free networks come with a higher epidemic prevalence and epidemic incidence than unweighted scale-free networks.
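
    One generation of a Reed-Frost chain-binomial on a weighted graph can be sketched directly from the model's definition. The per-edge transmission rule, the toy network, and the parameter values below are illustrative assumptions, not the paper's simulation setup: a susceptible node escapes each infectious neighbor independently, with heavier edges transmitting more readily.

```python
import random

random.seed(1)

def reed_frost_step(graph, infected, p=0.2):
    """One generation of a Reed-Frost chain-binomial on a weighted graph.

    A susceptible node j escapes infection from each infectious neighbor i
    independently with probability (1 - p) ** w_ij, so heavier edges are
    more likely to transmit. `graph` maps node -> {neighbor: weight}.
    Returns the set of newly infected nodes.
    """
    new_infected = set()
    for j in graph:
        if j in infected:
            continue
        escape = 1.0
        for i, w in graph[j].items():
            if i in infected:
                escape *= (1 - p) ** w
        if random.random() > escape:
            new_infected.add(j)
    return new_infected

# Star network: one heavy edge (weight 3) versus light edges (weight 1).
g = {0: {1: 3, 2: 1, 3: 1},
     1: {0: 3}, 2: {0: 1}, 3: {0: 1}}
print(reed_frost_step(g, infected={0}))
```

    Iterating the step until no node is newly infected yields the final epidemic size; concentrating total weight on fewer edges in this rule is what drives the heterogeneity effect the abstract reports.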

  5. Evolving role of pharmaceutical physicians in the industry: Indian perspective

    PubMed Central

    Patil, Anant; Rajadhyaksha, Viraj

    2012-01-01

    The Indian pharmaceutical industry, like any other industry, has undergone significant change in the last decade. The role of a Medical advisor has always been of paramount importance in the pharmaceutical companies in India. On account of the evolving medical science and the competitive environment, the medical advisor's role is also increasingly becoming critical. In India, with changes in regulatory rules, safety surveillance, and concept of medical liaisons, the role of the medical advisor is evolving continuously and is further likely to evolve in the coming years in important areas like health economics, public private partnerships, and strategic planning. PMID:22347701

  6. Petrography, Geochemistry, and Pairing Relationships of Basaltic Lunar Meteorite Miller Range 13317

    NASA Astrophysics Data System (ADS)

    Zeigler, R. A.; Korotev, R. L.

    2016-08-01

    A petrographic and geochemical description of "new" lunar meteorite MIL 13317, an evolved lunar basaltic regolith breccia. The pairing relationships with previously described lunar meteorites are also explored.

  7. Reliability of plantar pressure platforms.

    PubMed

    Hafer, Jocelyn F; Lenhoff, Mark W; Song, Jinsup; Jordan, Joanne M; Hannan, Marian T; Hillstrom, Howard J

    2013-07-01

    Plantar pressure measurement is common practice in many research and clinical protocols. While the accuracy of some plantar pressure measuring devices and methods for ensuring consistency in data collection on plantar pressure measuring devices have been reported, the reliability of different devices when testing the same individuals is not known. This study calculated intra-mat, intra-manufacturer, and inter-manufacturer reliability of plantar pressure parameters as well as the number of plantar pressure trials needed to reach a stable estimate of the mean for an individual. Twenty-two healthy adults completed ten walking trials across each of two Novel emed-x(®) and two Tekscan MatScan(®) plantar pressure measuring devices in a single visit. Intraclass correlation (ICC) was used to describe the agreement between values measured by different devices. All intra-platform reliability correlations were greater than 0.70. All inter-emed-x(®) reliability correlations were greater than 0.70. Inter-MatScan(®) reliability correlations were greater than 0.70 in 31 and 52 of 56 parameters when looking at a 10-trial average and a 5-trial average, respectively. Inter-manufacturer reliability including all four devices was greater than 0.70 for 52 and 56 of 56 parameters when looking at a 10-trial average and a 5-trial average, respectively. All parameters reached a value within 90% of an unbiased estimate of the mean within five trials. Overall, reliability results are encouraging for investigators and clinicians who may have plantar pressure data sets that include data collected on different devices.
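
    The intraclass correlation used above to quantify inter-device agreement can be computed from a two-way ANOVA decomposition. The sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single measurement); whether this particular form matches the study's choice is an assumption, and the example data are invented.

```python
def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.

    `data` is a list of rows (subjects), each a list of measurements from
    the k devices/raters. Mean squares follow the standard two-way ANOVA
    decomposition: rows (subjects), columns (devices), and residual error.
    """
    n = len(data)          # subjects
    k = len(data[0])       # devices / raters
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented peak-pressure readings (kPa) for 5 subjects on 2 devices.
readings = [[210, 212], [180, 183], [250, 247], [195, 198], [230, 228]]
print(round(icc_2_1(readings), 3))
```

    Values above 0.70, the threshold cited in the abstract, indicate acceptable agreement between devices under this convention.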

  8. Evolutionary genetics: you are what you evolve to eat.

    PubMed

    Dworkin, Ian; Jones, Corbin D

    2015-04-20

    The evolution of host specialization can potentially limit future evolutionary opportunities. A new study now shows how Drosophila sechellia, specialized on the toxic Morinda fruit, has evolved new nutritional needs influencing its reproduction.

  9. 18 CFR 39.11 - Reliability reports.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Reliability reports. 39... RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.11 Reliability reports. (a) The Electric Reliability Organization shall...

  10. Evolvable hardware: genetic search in a physical realm

    NASA Astrophysics Data System (ADS)

    Raichman, Nadav; Segev, Ronen; Ben-Jacob, Eshel

    2003-08-01

    The application of evolution-inspired strategies to hardware design and circuit self-configuration leads to the concept of evolvable hardware (EHW). EHW refers to self-configuration of electronic hardware by evolutionary/genetic algorithms (EA and GA, respectively). Unconventional circuits, for which there are no textbook design guidelines, are particularly appealing for EHW. Here we applied an evolutionary algorithm on a configurable digital FPGA chip in order to evolve analog-behavior circuits. Though the configurable chip is explicitly built for digital designs, analog circuits were successfully evolved by allowing feedback routings and by disabling the general clock. The results were unconventional circuits that were well fitted both to the task for which the circuits were evolved and to the environment in which the evolution took place. We analyzed the morphotype (configuration) changes in circuit size and circuit operation through evolutionary time. The results showed that the evolved circuit structure had two distinct areas: an active area in which signal processing took place and a surrounding neutral area. The active area of the evolved circuits was small in size, but complex in structure. Results showed that the active area may grow during evolution, indicating that progress is achieved through the addition of units taken from the neutral area. Monitor views of the circuit outputs indicate that evolution proceeded through several distinct stages. This is in accordance with the fitness plots, which show a progressive climb in a stair-like manner. Comparative studies were also performed of evolution with various population sizes. Results showed that the smaller the evolved population, the faster the evolutionary process. This was attributed to the high degeneracy in gene variance within the large population, resulting in a futile search.
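
    The evolutionary loop behind such experiments can be sketched generically. The bit-string genomes, tournament selection, and mutation rate below are illustrative assumptions standing in for the FPGA configuration bitstreams and fitness evaluation actually used; the toy fitness function simply counts 1-bits.

```python
import random

random.seed(7)

def evolve(fitness, genome_len=32, pop_size=20, generations=200, mu=0.02):
    """Minimal genetic-algorithm loop of the kind used to evolve circuit
    configurations: bit-string genomes, size-2 tournament selection, and
    per-bit point mutation at rate `mu`. Returns the fittest final genome."""
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)            # tournament of two
            parent = a if fitness(a) >= fitness(b) else b
            # Flip each bit independently with probability mu.
            child = [bit ^ (random.random() < mu) for bit in parent]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve(sum)      # toy task: maximize the number of 1-bits
print(sum(best))
```

    In the hardware setting, evaluating `fitness` means downloading the bitstream to the chip and measuring the circuit's output, which is why small populations that need fewer evaluations per generation can converge faster.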

  11. Mass Loss and Dust Injection rates from Evolved Stars

    NASA Astrophysics Data System (ADS)

    Sargent, Benjamin A.

    2010-01-01

    The Spitzer Space Telescope is continuing to contribute greatly to our understanding of the mass return from evolved stars in the Magellanic Clouds (MCs). I first review a number of smaller early Spitzer studies of evolved stars in the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC). These studies often built upon earlier such studies using data from prior missions, like the Midcourse Space Experiment. I discuss various Spitzer spectroscopic studies that have investigated the dust compositions of evolved stars in the lower metallicity environments of the MCs. Also, I review studies of the MCs' massive evolved stars, which have been given somewhat less attention than other populations. Excitingly, using Spitzer data, for the first time the mass-loss from the diverse evolved star MC populations is being quantified. With the advent of the Surveying the Agents of a Galaxy's Evolution (SAGE; PI: M. Meixner) Spitzer Legacy program, tens of thousands of stars in the LMC have been classified as evolved stars using SAGE Spitzer data. I briefly review how evolved stars are classified (e.g., by using color-magnitude and color-color diagrams) using data from the SAGE surveys. Finally, I discuss work on radiative transfer (RT) modeling of evolved stars, which follows earlier work estimating their mass-loss using colors or emission in excess of stellar photosphere emission. This RT work starts by seeking acceptable dust properties for RT models of both SAGE Spectral Energy Distributions (SEDs) and SAGE-Spectroscopy (Spitzer Legacy program; PI: F. Kemper) spectra of asymptotic giant branch (AGB) stars. Afterwards, large grids of RT models are constructed to determine mass-loss rates for AGB stars and red supergiants in the SAGE samples of the LMC and, eventually, the SMC.

  12. Reliability model for planetary gear

    NASA Technical Reports Server (NTRS)

    Savage, M.; Paridon, C. A.; Coy, J. J.

    1982-01-01

    A reliability model is presented for planetary gear trains in which the ring gear is fixed, the sun gear is the input, and the planet arm is the output. The input and output shafts are coaxial and the input and output torques are assumed to be coaxial with these shafts. Thrust and side loading are neglected. This type of gear train is commonly used in main rotor transmissions for helicopters and in other applications which require high reductions in speed. The reliability model is based on the Weibull distribution of the individual reliabilities of the transmission components. The transmission's basic dynamic capacity is defined as the input torque which may be applied for one million input rotations of the sun gear. Load and life are related by a power law. The load-life exponent and basic dynamic capacity are developed as functions of the component capacities.
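
    The abstract's two ingredients, Weibull component reliabilities combined in series and a load-life power law, can be sketched as follows. The component parameters, capacity, and exponent values are invented for illustration, not taken from the paper.

```python
import math

def weibull_reliability(life, theta, beta):
    """Two-parameter Weibull survival function: R(L) = exp(-(L/theta)**beta)."""
    return math.exp(-((life / theta) ** beta))

def system_reliability(life, components):
    """Series system: the train survives a given life only if every
    component does, so component reliabilities multiply."""
    r = 1.0
    for theta, beta in components:
        r *= weibull_reliability(life, theta, beta)
    return r

def life_at_load(capacity, torque, p):
    """Load-life power law: life (in millions of input rotations) at input
    torque T, given basic dynamic capacity C (the torque sustainable for
    one million rotations) and load-life exponent p."""
    return (capacity / torque) ** p

# Invented parameters: (theta = characteristic life in 1e6 sun-gear
# rotations, beta = Weibull slope) for sun mesh, planet mesh, bearing.
train = [(9.0, 1.3), (12.0, 2.2), (15.0, 1.8)]
L = life_at_load(capacity=250.0, torque=125.0, p=3.0)   # half capacity -> 8x base life
print(system_reliability(L, train))
```

    The series product is the standard way Weibull component lives roll up into a transmission-level reliability, and the power law shows why derating the input torque extends life so strongly.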

  13. Electronics reliability and measurement technology

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S. (Editor)

    1987-01-01

    A summary is presented of the Electronics Reliability and Measurement Technology Workshop. The meeting examined the U.S. electronics industry with particular focus on reliability and state-of-the-art technology. A general consensus of the approximately 75 attendees was that "the U.S. electronics industries are facing a crisis that may threaten their existence". The workshop had specific objectives to discuss mechanisms to improve areas such as reliability, yield, and performance while reducing failure rates, delivery times, and cost. The findings of the workshop addressed various aspects of the industry from wafers to parts to assemblies. Key problem areas that were singled out for attention are identified, and action items necessary to accomplish their resolution are recommended.

  14. A Review of Score Reliability: Contemporary Thinking on Reliability Issues

    ERIC Educational Resources Information Center

    Rosen, Gerald A.

    2004-01-01

    Bruce Thompson's edited volume begins with a basic principle, one might call it a basic truth: "reliability is a property that applies to scores, and not immutably across all conceivable uses everywhere of a given measure" (p. 3). The author claims that this principle is little known and/or little understood. While that is an arguable point, the…

  15. Designing magnetic systems for reliability

    SciTech Connect

    Heitzenroeder, P.J.

    1991-01-01

    Designing magnetic systems is an iterative process in which the requirements are set, a design is developed, materials and manufacturing processes are defined, interrelationships with the various elements of the system are established, engineering analyses are performed, and fault modes and effects are studied. Reliability requires that all elements of the design process, from the seemingly most straightforward (such as utilities connection design and implementation) to the most sophisticated (such as advanced finite element analyses), receive a balanced and appropriate level of attention. D.B. Montgomery's study of magnet failures has shown that magnet failures tend not to occur in the most intensively engineered areas, but are associated with insulation, leads, and unanticipated conditions. TFTR, JET, JT-60, and PBX are all major tokamaks which have suffered loss of reliability due to water leaks. Similarly, the majority of causes of loss of magnet reliability at PPPL have not been in the sophisticated areas of the design but are due to difficulties associated with coolant connections, bus connections, and external structural connections. Looking toward the future, the major next devices such as BPX and ITER are more costly and complex than any of their predecessors and are pressing the bounds of operating levels, materials, and fabrication. Emphasis on reliability is a must as the fusion program enters a phase where there are fewer, but very costly, devices with the goal of reaching a reactor prototype stage in the next two or three decades. This paper reviews some of the magnet reliability issues which PPPL has faced over the years, the lessons learned from them, and the magnet design and fabrication practices which have been found to contribute to magnet reliability.

  16. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... approve the following Reliability Standards that were submitted to the Commission for approval by the North American Electric Reliability Corporation, the Commission-certified Electric...

  17. 76 FR 16277 - System Restoration Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ... Energy Regulatory Commission 18 CFR Part 40 System Restoration Reliability Standards AGENCY: Federal... Act, the Commission approves three Emergency Operations and Preparedness (EOP) Reliability Standards... Resource'' submitted to the Commission for approval by the North American Electric Reliability...

  18. 18 CFR 39.5 - Reliability Standards.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... competition. (d) An approved Reliability Standard or modification to a Reliability Standard shall take effect... will not defer to the Electric Reliability Organization or a Regional Entity with respect to the...

  19. Photovoltaic power system reliability considerations

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.

    1980-01-01

    An example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems is presented. This particular application is for a solar cell power system demonstration project designed to provide electric power requirements for remote villages. The techniques utilized involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of fail-safe and planned spare parts engineering philosophy.

  20. Metrological Reliability of Medical Devices

    NASA Astrophysics Data System (ADS)

    Costa Monteiro, E.; Leon, L. F.

    2015-02-01

    The prominent development of health technologies in the 20th century triggered demands for metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of biomeasurement results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with an analysis of their contributions to guaranteeing innovative health technologies' compliance with the main ethical pillars of Bioethics.

  1. Mechanically reliable scales and coatings

    SciTech Connect

    Tortorelli, P.F.; Alexander, K.B.

    1995-07-01

    As the first stage in examining the mechanical reliability of protective surface oxides, the behavior of alumina scales formed on iron-aluminum alloys during high-temperature cyclic oxidation was characterized in terms of damage and spallation tendencies. Scales were thermally grown on specimens of three iron-aluminum compositions using a series of exposures to air at 1000°C. Gravimetric data and microscopy revealed substantially better integrity and adhesion of the scales grown on an alloy containing zirconium. The use of polished (rather than just ground) specimens resulted in scales that were more suitable for subsequent characterization of mechanical reliability.

  2. Phenotypic effect of mutations in evolving populations of RNA molecules

    PubMed Central

    2010-01-01

    Background The secondary structure of folded RNA sequences is a good model to map phenotype onto genotype, as represented by the RNA sequence. Computational studies of the evolution of ensembles of RNA molecules towards target secondary structures yield valuable clues to the mechanisms behind adaptation of complex populations. The relationship between the space of sequences and structures, the organization of RNA ensembles at mutation-selection equilibrium, the time of adaptation as a function of the population parameters, the presence of collective effects in quasispecies, or the optimal mutation rates to promote adaptation all are issues that can be explored within this framework. Results We investigate the effect of microscopic mutations on the phenotype of RNA molecules during their in silico evolution and adaptation. We calculate the distribution of the effects of mutations on fitness, the relative fractions of beneficial and deleterious mutations and the corresponding selection coefficients for populations evolving under different mutation rates. Three different situations are explored: the mutation-selection equilibrium (optimized population) in three different fitness landscapes, the dynamics during adaptation towards a goal structure (adapting population), and the behavior under periodic population bottlenecks (perturbed population). Conclusions The ratio between the number of beneficial and deleterious mutations experienced by a population of RNA sequences increases with the value of the mutation rate μ at which evolution proceeds. In contrast, the selective value of mutations remains almost constant, independent of μ, indicating that adaptation occurs through an increase in the amount of beneficial mutations, with little variations in the average effect they have on fitness. Statistical analyses of the distribution of fitness effects reveal that small effects, either beneficial or deleterious, are well described by a Pareto distribution. These results
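    The summary statistics described above (the fractions of beneficial and deleterious mutations, and a Pareto fit to the small-effect tail of the distribution of fitness effects) can be sketched as follows; the functions and the synthetic sample are illustrative assumptions, not the paper's simulation code:

```python
# Illustrative sketch: summarize a distribution of fitness effects (DFE) and
# fit a Pareto shape parameter to the effect magnitudes by maximum likelihood.
import math
import random

def dfe_fractions(selection_coeffs):
    """Fractions of beneficial (s > 0) and deleterious (s < 0) mutations."""
    n = len(selection_coeffs)
    beneficial = sum(1 for s in selection_coeffs if s > 0) / n
    deleterious = sum(1 for s in selection_coeffs if s < 0) / n
    return beneficial, deleterious

def pareto_shape_mle(magnitudes, x_min=1.0):
    # MLE for the Pareto shape: alpha_hat = n / sum(log(x / x_min)), x >= x_min
    tail = [x for x in magnitudes if x >= x_min]
    return len(tail) / sum(math.log(x / x_min) for x in tail)

random.seed(0)
sample = [random.paretovariate(2.5) for _ in range(5000)]
print(f"estimated Pareto shape: {pareto_shape_mle(sample):.2f}")  # near 2.5
```

The maximum-likelihood estimator recovers the shape parameter of a known Pareto sample; applied to observed mutation effects, the same fit tests how well the small-effect tail matches a Pareto law.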

  3. Climate in Context - How partnerships evolve in regions

    NASA Astrophysics Data System (ADS)

    Parris, A. S.

    2014-12-01

    In 2015, NOAA's RISA program will celebrate its 20th year of exploration in the development of usable climate information. In the mid-1990s, a vision emerged to develop interdisciplinary research efforts at the regional scale for several important reasons. Recognizable climate patterns, such as the El Niño Southern Oscillation (ENSO), emerge at the regional level, where our understanding of observations and models coalesce. Critical resources for society, such as water supply, are managed in a context of regional systems and human populations. Multiple scales of governance (local, state, and federal) with complex institutional relationships can be examined across a region. Climate information (i.e., data, science, research, etc.) developed within these contexts has greater potential for use. All of this work rests on a foundation of iterative engagement between scientists and decision makers. Throughout these interactions, RISAs have navigated diverse politics, extreme events and disasters, socio-economic and ecological disruptions, and advances in both science and technology. Our understanding of information needs is evolving into a richer understanding of the complex institutional, legal, political, and cultural contexts within which people can use science to make informed decisions. The outcome of RISA work includes both cases where climate information was used in decisions and cases where capacity for using climate information and making climate-resilient decisions has increased over time. In addition to balancing supply and demand of scientific information, RISAs are engaged in a social process of reconciling climate information use with important drivers of society. Because partnerships are critical for sustained engagement, and because engagement is critically important to the use of science, the rapid development of new capacity in regionally-based science programs focused on providing climate decision support is both needed and challenging. New actors can bolster

  4. Bacterial type III secretion systems are ancient and evolved by multiple horizontal-transfer events.

    PubMed

    Gophna, Uri; Ron, Eliora Z; Graur, Dan

    2003-07-17

    Type III secretion systems (TTSS) are unique bacterial mechanisms that mediate elaborate interactions with their hosts. The fact that several of the TTSS proteins are closely related to flagellar export proteins has led to the suggestion that TTSS had evolved from flagella. Here we reconstruct the evolutionary history of four conserved type III secretion proteins and their phylogenetic relationships with flagellar paralogs. Our analysis indicates that the TTSS and the flagellar export mechanism share a common ancestor, but have evolved independently from one another. The suggestion that TTSS genes have evolved from genes encoding flagellar proteins is effectively refuted. A comparison of the species tree, as deduced from 16S rDNA sequences, to the protein phylogenetic trees has led to the identification of several major lateral transfer events involving clusters of TTSS genes. It is hypothesized that horizontal gene transfer has occurred much earlier and more frequently than previously inferred for TTSS genes and is, consequently, a major force shaping the evolution of species that harbor type III secretion systems.

  5. Reliability Analysis of Money Habitudes

    ERIC Educational Resources Information Center

    Delgadillo, Lucy M.; Bushman, Brittani S.

    2015-01-01

    Use of the Money Habitudes exercise has gained popularity among various financial professionals. This article reports on the reliability of this resource. A survey administered to young adults at a western state university was conducted, and each Habitude or "domain" was analyzed using Cronbach's alpha procedures. Results showed all six…

  6. Reliability and cost analysis methods

    NASA Technical Reports Server (NTRS)

    Suich, Ronald C.

    1991-01-01

    In the design phase of a system, how does a design engineer or manager choose between a subsystem with .990 reliability and a more costly subsystem with .995 reliability? When is the increased cost justified? High reliability is not necessarily an end in itself but may be desirable in order to reduce the expected cost due to subsystem failure. However, this may not be the wisest use of funds, since the expected cost due to subsystem failure is not the only cost involved: the subsystem itself may be very costly. We should not consider the cost of the subsystem or the expected cost due to subsystem failure separately; instead we should minimize their total, i.e., the cost of the subsystem plus the expected cost due to subsystem failure. This final report discusses the Combined Analysis of Reliability, Redundancy, and Cost (CARRAC) methods, which were developed under Grant Number NAG 3-1100 from the NASA Lewis Research Center. CARRAC methods and a CARRAC computer program employ five models which can be used to cover a wide range of problems. The models contain an option which can include repair of failed modules.
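    The decision rule in this abstract, choosing the subsystem that minimizes purchase cost plus expected failure cost rather than the one with the highest reliability, can be sketched numerically; all dollar figures below are hypothetical, not from the CARRAC models:

```python
# Compare two hypothetical subsystem options by total expected cost:
# total = purchase cost + (probability of failure) * (cost incurred on failure).
def total_expected_cost(reliability, subsystem_cost, failure_cost):
    return subsystem_cost + (1.0 - reliability) * failure_cost

FAILURE_COST = 2_000_000  # assumed cost incurred if the subsystem fails
options = {
    "baseline (R=0.990)": total_expected_cost(0.990, 50_000, FAILURE_COST),
    "upgraded (R=0.995)": total_expected_cost(0.995, 80_000, FAILURE_COST),
}
for name, cost in sorted(options.items(), key=lambda kv: kv[1]):
    print(f"{name}: total expected cost ${cost:,.0f}")
```

With these assumed numbers the cheaper, less reliable subsystem wins (70,000 vs 90,000 dollars); a larger failure cost flips the ranking, which is why the two costs must be minimized jointly rather than separately.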

  7. Interactive Maximum Reliability Cluster Analysis.

    ERIC Educational Resources Information Center

    Mays, Robert

    1978-01-01

    A FORTRAN program for clustering variables using the alpha coefficient of reliability is described. For batch operation, a rule for stopping the agglomerative procedure is available. The conversational version of the program allows the user to intervene in the process in order to test the final solution for sensitivity to changes. (Author/JKS)
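    The agglomerative idea (merge the pair of variable clusters whose union has the highest alpha, and stop when no merge clears a threshold) can be sketched in Python; this is an illustrative reimplementation with made-up data, not the FORTRAN program described:

```python
# Minimal sketch of agglomerative variable clustering by Cronbach's alpha,
# with a simple alpha-threshold standing in for the batch stopping rule.
import statistics

def cronbach_alpha(data, items):
    """data: variable name -> list of scores; items: set of variable names."""
    k = len(items)
    rows = len(data[next(iter(items))])
    item_var = sum(statistics.pvariance(data[v]) for v in items)
    totals = [sum(data[v][r] for v in items) for r in range(rows)]
    return (k / (k - 1)) * (1 - item_var / statistics.pvariance(totals))

def agglomerate(data, min_alpha=0.7):
    clusters = [{v} for v in data]
    while len(clusters) > 1:
        # Find the merge that yields the highest-alpha combined cluster.
        a, b, alpha = max(
            ((x, y, cronbach_alpha(data, x | y))
             for i, x in enumerate(clusters) for y in clusters[i + 1:]),
            key=lambda t: t[2])
        if alpha < min_alpha:
            break  # stopping rule: no merge reaches the threshold
        clusters = [c for c in clusters if c not in (a, b)] + [a | b]
    return clusters

data = {"x1": [1, 2, 3, 4, 5], "x2": [1, 2, 3, 4, 5], "x3": [3, 1, 4, 2, 5]}
print(agglomerate(data, min_alpha=0.9))  # x1 and x2 cluster; x3 stays apart
```

Raising or lowering `min_alpha` plays the role of the stopping rule: a high threshold yields many small, internally consistent clusters, a low one merges everything.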

  8. The Reliability of College Grades

    ERIC Educational Resources Information Center

    Beatty, Adam S.; Walmsley, Philip T.; Sackett, Paul R.; Kuncel, Nathan R.; Koch, Amanda J.

    2015-01-01

    Little is known about the reliability of college grades relative to how prominently they are used in educational research, and the results to date tend to be based on small sample studies or are decades old. This study uses two large databases (N > 800,000) from over 200 educational institutions spanning 13 years and finds that both first-year…

  9. Becoming a high reliability organization.

    PubMed

    Christianson, Marlys K; Sutcliffe, Kathleen M; Miller, Melissa A; Iwashyna, Theodore J

    2011-01-01

    Aircraft carriers, electrical power grids, and wildland firefighting, though seemingly different, are exemplars of high reliability organizations (HROs)--organizations that have the potential for catastrophic failure yet engage in nearly error-free performance. HROs commit to safety at the highest level and adopt a special approach to its pursuit. High reliability organizing has been studied and discussed for some time in other industries and is receiving increasing attention in health care, particularly in high-risk settings like the intensive care unit (ICU). The essence of high reliability organizing is a set of principles that enable organizations to focus attention on emergent problems and to deploy the right set of resources to address those problems. HROs behave in ways that sometimes seem counterintuitive--they do not try to hide failures but rather celebrate them as windows into the health of the system, they seek out problems, they avoid focusing on just one aspect of work and are able to see how all the parts of work fit together, they expect unexpected events and develop the capability to manage them, and they defer decision making to local frontline experts who are empowered to solve problems. Given the complexity of patient care in the ICU, the potential for medical error, and the particular sensitivity of critically ill patients to harm, high reliability organizing principles hold promise for improving ICU patient care.

  10. Wind turbine reliability database update.

    SciTech Connect

    Peters, Valerie A.; Hill, Roger Ray; Stinebaugh, Jennifer A.; Veers, Paul S.

    2009-03-01

    This report documents the status of the Sandia National Laboratories' Wind Plant Reliability Database. Included in this report are updates on the form and contents of the Database, which stems from a five-step process of data partnerships, data definition and transfer, data formatting and normalization, analysis, and reporting. Selected observations are also reported.

  11. Wanted: A Solid, Reliable PC

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2004-01-01

    This article discusses PC reliability, one of the most pressing issues regarding computers. Nearly a quarter century after the introduction of the first IBM PC and the outset of the personal computer revolution, PCs have largely become commodities, with little differentiating one brand from another in terms of capability and performance. Most of…

  12. The Humanities versus Interrater Reliability

    ERIC Educational Resources Information Center

    RiCharde, R. Stephen

    2008-01-01

    A persistent conflict between assessment professionals and faculty members in the humanities seems to focus inevitably on resistance to the concept of interrater reliability. While humanities faculty are often willing to engage in course-embedded assessment that uses some type of scoring rubric, when the demand for agreement in scoring is…

  13. Photovoltaic performance and reliability workshop

    SciTech Connect

    Kroposki, B

    1996-10-01

    This proceedings volume is a compilation of papers presented at the ninth PV Performance and Reliability Workshop, held at the Sheraton Denver West Hotel on September 4--6, 1996. This year's workshop included presentations from 25 speakers and had over 100 attendees. All of the presentations that were given are included in this proceedings. Topics of the papers included: defining service lifetime and developing models for PV module lifetime; examining and determining failure and degradation mechanisms in PV modules; combining IEEE/IEC/UL testing procedures; AC module performance and reliability testing; inverter reliability/qualification testing; standardization of utility interconnect requirements for PV systems; the need for activities that separate variables by testing individual components of PV systems (e.g., cells, modules, batteries, inverters, charge controllers) for individual reliability and then testing them in actual system configurations; further results from field experience with modules, inverters, batteries, and charge controllers in deployed PV systems; and system certification and standardized testing for stand-alone and grid-tied systems.

  14. The States and Higher Education: An Evolving Relationship at a Pivotal Moment

    ERIC Educational Resources Information Center

    Meotti, Michael P.

    2016-01-01

    The "proud-parent" attitude of states towards higher education between 1945 and 1970--due to the baby boom, the technological contributions that research universities had made to the war effort, and the GI Bill--began to cool in the late 1960s, when inflation and increasing demands from other state services such as Medicaid, prisons,…

  15. The evolving placenta: convergent evolution of variations in the endotheliochorial relationship.

    PubMed

    Enders, A C; Carter, A M

    2012-05-01

    Endotheliochorial placentas occur in orders from all four major clades of eutherian mammal. Species with this type of placenta include one of the smallest (pygmy shrew) and largest (African elephant) land mammals. The endotheliochorial placenta as a definitive form has an interhemal area consisting of maternal endothelium, interstitial lamina, trophoblast, individual or conjoint basal laminas, and fetal endothelium. We commonly think of such placentas as having hypertrophied maternal endothelium with abundant rough endoplasmic reticulum (rER), and as having hemophagous regions. Considering them as a whole, the trophoblast may be syncytial or cellular, fenestrated or nonfenestrated, and there may or may not be hemophagous regions. Variations also appear in the extent of hypertrophy of the maternal endothelium and in the abundance of rER in these cells. This combination of traits and a few other features produces many morphological variants. In addition to endotheliochorial as a definitive condition, a transitory endotheliochorial condition may appear in the course of forming a hemochorial placenta. In some emballonurid bats the early endotheliochorial placenta has two layers of trophoblast, but the definitive placenta lacks an outer syncytial trophoblast layer. In molossid bats a well-developed endotheliochorial placenta is present for a short time even after a definitive hemochorial placenta has developed in a different region. It is concluded that the endotheliochorial placenta is more widespread and diversified than originally thought, with the variant with cellular trophoblast in particular appearing in several species studied recently.

  16. Crossroads and Connections: An Evolving Relationship between NASA and the Navajo Nation

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Carron, A.

    2010-08-01

    Is working with Native Americans business as usual? We live in a project-based world that operates on three-to-five-year grants. A long term commitment can be next to impossible to keep, even if you have the best of intentions. Are there things one "must know" before approaching an indigenous population? How is it best to evaluate projects and programs involving Native Americans? In the NASA and the Navajo Nation project, which will turn five in January, 2010, we have compiled some key lessons learned that we hope will inform and encourage future partnerships between the space science education and Native American communities.

  17. Beyond PTSD: An Evolving Relationship Between Trauma Theory and Family Violence Research

    ERIC Educational Resources Information Center

    Becker-Blease, Kathryn A.; Freyd, Jennifer J.

    2005-01-01

    During the past 20 years, we have learned how similarly harmful are experiences of terror, violence, and abuse, whether they occur on the combat field or at home. The field of family violence has gained much from the field of traumatic stress, and collaborations between these two previously separate fields have yielded important new answers, as…

  18. Preliminary study of the reliability of imaging charge coupled devices

    NASA Technical Reports Server (NTRS)

    Beall, J. R.; Borenstein, M. D.; Homan, R. A.; Johnson, D. L.; Wilson, D. D.; Young, V. F.

    1978-01-01

    Imaging CCDs are capable of low light level response and high signal-to-noise ratios. In space applications they offer the user the ability to achieve extremely high resolution imaging with minimum circuitry in the photo sensor array. This work relates the CCD121H Fairchild device to the fundamentals of CCDs and the representative technologies. Several failure modes are described, construction is analyzed and test results are reported. In addition, the relationship of the device reliability to packaging principles is analyzed and test data presented. Finally, a test program is defined for more general reliability evaluation of CCDs.

  19. How Hierarchical Topics Evolve in Large Text Corpora.

    PubMed

    Cui, Weiwei; Liu, Shixia; Wu, Zhuofeng; Wei, Hao

    2014-12-01

    Using a sequence of topic trees to organize documents is a popular way to represent hierarchical and evolving topics in text corpora. However, following evolving topics in the context of topic trees remains difficult for users. To address this issue, we present an interactive visual text analysis approach to allow users to progressively explore and analyze the complex evolutionary patterns of hierarchical topics. The key idea behind our approach is to exploit a tree cut to approximate each tree and allow users to interactively modify the tree cuts based on their interests. In particular, we propose an incremental evolutionary tree cut algorithm with the goal of balancing 1) the fitness of each tree cut and the smoothness between adjacent tree cuts; 2) the historical and new information related to user interests. A time-based visualization is designed to illustrate the evolving topics over time. To preserve the mental map, we develop a stable layout algorithm. As a result, our approach can quickly guide users to progressively gain profound insights into evolving hierarchical topics. We evaluate the effectiveness of the proposed method on Amazon's Mechanical Turk and real-world news data. The results show that users are able to successfully analyze evolving topics in text data.
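    The core choice behind a tree cut can be sketched as a small dynamic program: at every node, either keep that node as the summary at its level or descend to its children, whichever scores higher. This is an illustrative sketch under assumed per-node fitness scores, not the authors' incremental algorithm, and it omits the smoothness term between adjacent tree cuts:

```python
# Minimal tree-cut selection: every root-to-leaf path crosses exactly one
# chosen node; choose the cut that maximizes total "fitness" (assumed given).
def best_cut(tree, fitness, root):
    """tree: node -> list of children; fitness: node -> float."""
    def solve(node):
        children = tree.get(node, [])
        if not children:
            return fitness[node], [node]
        score, cut = 0.0, []
        for child in children:
            s, c = solve(child)
            score += s
            cut += c
        if fitness[node] >= score:
            return fitness[node], [node]  # summarize at this node
        return score, cut                 # descend into the children
    return solve(root)[1]

tree = {"root": ["a", "b"], "a": ["a1", "a2"]}
fitness = {"root": 1.0, "a": 0.9, "b": 0.4, "a1": 0.5, "a2": 0.5}
print(best_cut(tree, fitness, "root"))  # ['a1', 'a2', 'b']
```

Interactively modifying the cut, as the paper describes, amounts to letting the user override this keep-or-descend decision at nodes of interest.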

  20. Attack resilience of the evolving scientific collaboration network.

    PubMed

    Liu, Xiao Fan; Xu, Xiao-Ke; Small, Michael; Tse, Chi K

    2011-01-01

    Stationary complex networks have been extensively studied in the last ten years. However, many natural systems are known to be continuously evolving at the local ("microscopic") level. Understanding the response to targeted attacks of an evolving network may shed light both on how to design robust systems and on how to find effective attack strategies. In this paper we study empirically the response to targeted attacks of the scientific collaboration networks. First we show that the scientific collaboration network is a complex system that evolves intensively at the local level--fewer than 20% of scientific collaborations last more than one year. Then, we investigate the impact of the sudden death of eminent scientists on the evolution of the collaboration networks of their former collaborators. We observe in particular that the sudden death, which is equivalent to the removal of the center of the egocentric network of the eminent scientist, does not affect the topological evolution of the residual network. Nonetheless, removal of the eminent hub node is exactly the strategy one would adopt for an effective targeted attack on a stationary network. Hence, we use this evolving collaboration network as an experimental model for attack on an evolving complex network. We find that such attacks are ineffectual, and infer that the scientific collaboration network is the trace of knowledge propagation on a larger underlying social network. The redundancy of the underlying structure in fact acts as a protection mechanism against such network attacks.
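    The classic targeted-attack procedure referenced above (repeatedly remove the most-connected node and track the size of the largest connected component) can be sketched on a toy graph; the graphs below are hypothetical, not the paper's collaboration data:

```python
# Degree-targeted attack on a static undirected network: remove the current
# highest-degree node at each step and record the largest-component size.
from collections import deque

def largest_component(adj):
    """adj: node -> set of neighbors. Returns the largest component's size."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:                      # breadth-first search
            node = queue.popleft()
            size += 1
            for nb in adj[node]:
                if nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
        best = max(best, size)
    return best

def targeted_attack(adj, removals):
    adj = {n: set(nbs) for n, nbs in adj.items()}  # work on a copy
    sizes = [largest_component(adj)]
    for _ in range(removals):
        hub = max(adj, key=lambda n: len(adj[n]))  # highest-degree node
        for nb in adj[hub]:
            adj[nb].discard(hub)
        del adj[hub]
        sizes.append(largest_component(adj))
    return sizes

star = {"h": {"a", "b", "c"}, "a": {"h"}, "b": {"h"}, "c": {"h"}}
print(targeted_attack(star, 1))  # [4, 1]: removing the hub shatters the star
```

On a stationary network this strategy is devastating for hub-dominated topologies, which is what makes the paper's finding, that the evolving collaboration network absorbs such removals, notable.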

  1. Reliability prediction for a class of highly reliable digital systems

    NASA Technical Reports Server (NTRS)

    White, Allan L.

    1992-01-01

    Three theorems which show that the reliability of a popular class of systems can be computed using small and simple models are presented. This class consists of systems that are assemblages of subsystems where each subsystem is a majority-voting threeplex plus spares or majority-voting fourplex plus spares. The theorems are error bounds for model reduction and simplification. The error bounds are given in terms of readily available system parameters. The three theorems have been applied to a system that has been used as an example that generates extremely large reliability models; the system considered is one version of AIPS (Advanced Information Processing System) for IAPSA (Integrated Airframe Propulsion System Architecture).

  2. Reliability prediction for a class of highly reliable digital systems

    NASA Astrophysics Data System (ADS)

    White, Allan L.

    Three theorems which show that the reliability of a popular class of systems can be computed using small and simple models are presented. This class consists of systems that are assemblages of subsystems where each subsystem is a majority-voting threeplex plus spares or majority-voting fourplex plus spares. The theorems are error bounds for model reduction and simplification. The error bounds are given in terms of readily available system parameters. The three theorems have been applied to a system that has been used as an example that generates extremely large reliability models; the system considered is one version of AIPS (Advanced Information Processing System) for IAPSA (Integrated Airframe Propulsion System Architecture).

  3. Field-evolved insect resistance to Bt crops: definition, theory, and data.

    PubMed

    Tabashnik, Bruce E; Van Rensburg, J B J; Carrière, Yves

    2009-12-01

    Transgenic crops producing Bacillus thuringiensis (Bt) toxins for insect pest control have been successful, but their efficacy is reduced when pests evolve resistance. Here we review the definition of field-evolved resistance, the relationship between resistance and field control problems, the theory underlying strategies for delaying resistance, and resistance monitoring methods. We also analyze resistance monitoring data from five continents reported in 41 studies that evaluate responses of field populations of 11 lepidopteran pests to four Bt toxins produced by Bt corn and cotton. After more than a decade since initial commercialization of Bt crops, most target pest populations remain susceptible, whereas field-evolved resistance has been documented in some populations of three noctuid moth species: Spodoptera frugiperda (J. E. Smith) to Cry1F in Bt corn in Puerto Rico, Busseola fusca (Fuller) to Cry1Ab in Bt corn in South Africa, and Helicoverpa zea (Boddie) to Cry1Ac and Cry2Ab in Bt cotton in the southeastern United States. Field outcomes are consistent with predictions from theory, suggesting that factors delaying resistance include recessive inheritance of resistance, abundant refuges of non-Bt host plants, and two-toxin Bt crops deployed separately from one-toxin Bt crops. The insights gained from systematic analyses of resistance monitoring data may help to enhance the durability of transgenic insecticidal crops. We recommend continued use of the longstanding definition of resistance cited here and encourage discussions about which regulatory actions, if any, should be triggered by specific data on the magnitude, distribution, and impact of field-evolved resistance.

  4. Sex chromosomes evolved from independent ancestral linkage groups in winged insects.

    PubMed

    Pease, James B; Hahn, Matthew W

    2012-06-01

    The evolution of a pair of chromosomes that differ in appearance between males and females (heteromorphic sex chromosomes) has occurred repeatedly across plants and animals. Recent work has shown that the male heterogametic (XY) and female heterogametic (ZW) sex chromosomes evolved independently from different pairs of homomorphic autosomes in the common ancestor of birds and mammals but also that X and Z chromosomes share many convergent molecular features. However, little is known about how often heteromorphic sex chromosomes have either evolved convergently from different autosomes or in parallel from the same pair of autosomes and how universal patterns of molecular evolution on sex chromosomes really are. Among winged insects with sequenced genomes, there are male heterogametic species in both the Diptera (e.g., Drosophila melanogaster) and the Coleoptera (Tribolium castaneum), female heterogametic species in the Lepidoptera (Bombyx mori), and haplodiploid species in the Hymenoptera (e.g., Nasonia vitripennis). By determining orthologous relationships among genes on the X and Z chromosomes of insects with sequenced genomes, we are able to show that these chromosomes are not homologous to one another but are homologous to autosomes in each of the other species. These results strongly imply that heteromorphic sex chromosomes have evolved independently from different pairs of ancestral chromosomes in each of the insect orders studied. We also find that the convergently evolved X chromosomes of Diptera and Coleoptera share genomic features with each other and with vertebrate X chromosomes, including excess gene movement from the X to the autosomes. However, other patterns of molecular evolution--such as increased codon bias, decreased gene density, and the paucity of male-biased genes on the X--differ among the insect X and Z chromosomes. Our results provide evidence for both differences and nearly universal similarities in patterns of evolution among

  5. How is cyber threat evolving and what do organisations need to consider?

    PubMed

    Borrett, Martin; Carter, Roger; Wespi, Andreas

    Organisations and members of the public are becoming accustomed to the increasing velocity, frequency and variety of cyber-attacks that they have been facing over the last few years. In response to this challenge, it is important to explore what can be done to offer commercial and private users a reliable and functioning environment. This paper discusses how cyber threats might evolve in the future and seeks to explore these threats more fully. Attention is paid to the changing nature of cyber-attackers and their motivations and what this means for organisations. Finally, useful and actionable steps are provided, which practitioners can use to understand how they can start to address the future challenges of cyber security.

  6. Evolving public health approaches to the global challenge of foodborne infections.

    PubMed

    Tauxe, R V; Doyle, M P; Kuchenmüller, T; Schlundt, J; Stein, C E

    2010-05-30

    The landscape of foodborne infections is in flux. New pathogens emerge, established pathogens may acquire new characteristics and appear in unexpected food vehicles, while many existing problems remain unsolved. Consumers want more fresh foods year round, populations age and migrate, and the technologies and trade practices that produce foods change. Protecting the public health and minimizing the burden of foodborne illness mean expecting the unexpected, and being prepared to understand it when it occurs, so that prevention can be improved. Public health surveillance is also constantly evolving, as new diseases emerge and are judged worthy of notification, as new diagnostic tests change the ease and specificity of routine diagnosis and as social interest in particular issues waxes and wanes. Accurate health information, including reliable estimates of the burden of foodborne disease, can improve foodborne disease prevention, foster global health security, promote economic growth and development and strengthen evidence-based policy making.

  7. Differences in Reliability of Reproductive History Recall among Women in North Africa

    ERIC Educational Resources Information Center

    Soliman, Amr; Allen, Katharine; Lo, An-Chi; Banerjee, Mousumi; Hablas, Ahmed; Benider, Abdellatif; Benchekroun, Nadya; Samir, Salwa; Omar, Hoda G.; Merajver, Sofia; Mullan, Patricia

    2009-01-01

    Breast cancer is the most common cancer among women in North Africa. Women in this region have unique reproductive profiles. It is essential to obtain reliable information on reproductive histories to help better understand the relationship between reproductive health and breast cancer. We tested the reliability of a reproductive history-based…

  8. A Model for Estimating the Reliability and Validity of Criterion-Referenced Measures.

    ERIC Educational Resources Information Center

    Edmonston, Leon P.; Randall, Robert S.

    A decision model designed to determine the reliability and validity of criterion referenced measures (CRMs) is presented. General procedures which pertain to the model are discussed as to: Measures of relationship, Reliability, Validity (content, criterion-oriented, and construct validation), and Item Analysis. The decision model is presented in…

  9. Evolving Systems: An Outcome of Fondest Hopes and Wildest Dreams

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Balas, Mark J.

    2012-01-01

    New theory is presented for evolving systems, which are autonomously controlled subsystems that self-assemble into a new evolved system with a higher purpose. Evolving systems of aerospace structures often require additional control when assembling to maintain stability during the entire evolution process. This is the concept of Adaptive Key Component Control that operates through one specific component to maintain stability during the evolution. In addition, this control must often overcome persistent disturbances that occur while the evolution is in progress. Theoretical results will be presented for Adaptive Key Component control for persistent disturbance rejection. An illustrative example will demonstrate the Adaptive Key Component controller on a system composed of rigid body and flexible body modes.

  10. Hybridization Reveals the Evolving Genomic Architecture of Speciation

    PubMed Central

    Kronforst, Marcus R.; Hansen, Matthew E.B.; Crawford, Nicholas G.; Gallant, Jason R.; Zhang, Wei; Kulathinal, Rob J.; Kapan, Durrell D.; Mullen, Sean P.

    2014-01-01

    The rate at which genomes diverge during speciation is unknown, as are the physical dynamics of the process. Here, we compare full genome sequences of 32 butterflies, representing five species from a hybridizing Heliconius butterfly community, to examine genome-wide patterns of introgression and infer how divergence evolves during the speciation process. Our analyses reveal that initial divergence is restricted to a small fraction of the genome, largely clustered around known wing-patterning genes. Over time, divergence evolves rapidly, due primarily to the origin of new divergent regions. Furthermore, divergent genomic regions display signatures of both selection and adaptive introgression, demonstrating the link between microevolutionary processes acting within species and the origin of species across macroevolutionary timescales. Our results provide a uniquely comprehensive portrait of the evolving species boundary due to the role that hybridization plays in reducing the background accumulation of divergence at neutral sites. PMID:24183670

  11. Synthesis of Evolving Cells for Reconfigurable Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Padayachee, J.; Bright, G.

    2014-07-01

    The concept of Reconfigurable Manufacturing Systems (RMSs) was formulated due to the global necessity for production systems that are able to economically evolve according to changes in markets and products. Technologies and design methods are under development to enable RMSs to exhibit transformable system layouts, reconfigurable processes, cells and machines. Existing factory design methods and software have not yet advanced to include reconfigurable manufacturing concepts. This paper presents the underlying group technology framework for the design of manufacturing cells that are able to evolve according to a changing product mix by mechanisms of reconfiguration. The framework is based on a Norton-Bass forecast and time-variant BOM models. An adaptation of legacy group technology methods is presented for the synthesis of evolving cells, and two optimization problems are presented within this context.

  12. Cooperative coevolution: an architecture for evolving coadapted subcomponents.

    PubMed

    Potter, M A; De Jong, K A

    2000-01-01

    To successfully apply evolutionary algorithms to the solution of increasingly complex problems, we must develop effective techniques for evolving solutions in the form of interacting coadapted subcomponents. One of the major difficulties is finding computational extensions to our current evolutionary paradigms that will enable such subcomponents to "emerge" rather than being hand designed. In this paper, we describe an architecture for evolving such subcomponents as a collection of cooperating species. Given a simple string-matching task, we show that evolutionary pressure to increase the overall fitness of the ecosystem can provide the needed stimulus for the emergence of an appropriate number of interdependent subcomponents that cover multiple niches, evolve to an appropriate level of generality, and adapt as the number and roles of their fellow subcomponents change over time. We then explore these issues within the context of a more complicated domain through a case study involving the evolution of artificial neural networks.

  13. Evolving Lorentzian wormholes supported by phantom matter and cosmological constant

    SciTech Connect

    Cataldo, Mauricio; Campo, Sergio del; Minning, Paul; Salgado, Patricio

    2009-01-15

    In this paper we study the possibility of sustaining an evolving wormhole via exotic matter made of phantom energy in the presence of a cosmological constant. We derive analytical evolving wormhole geometries by supposing that the radial tension of the phantom matter, which is the negative of the radial pressure, and the pressure measured in the tangential directions have barotropic equations of state with constant state parameters. In this case the presence of a cosmological constant ensures accelerated expansion of the wormhole configurations. More specifically, for a positive cosmological constant we have wormholes which expand forever, and for a negative cosmological constant we have wormholes which expand to a maximum value and then recollapse. At spatial infinity the energy density and the pressures of the anisotropic phantom matter threading the wormholes vanish; thus these evolving wormholes are asymptotically vacuum Λ-Friedmann models with open, closed, or flat topologies.

  14. Highly reliable multisensor array (MSA) smart transducers

    NASA Astrophysics Data System (ADS)

    Perotti, José; Lucena, Angel; Mackey, Paul; Mata, Carlos; Immer, Christopher

    2006-05-01

    Many developments in the field of multisensor array (MSA) transducers have taken place in the last few years. Advancements in fabrication technology, such as Micro-Electro-Mechanical Systems (MEMS) and nanotechnology, have made implementation of MSA devices a reality. NASA Kennedy Space Center (KSC) has been developing this type of technology because of the increases in safety, reliability, and performance and the reduction in operational and maintenance costs that can be achieved with these devices. To demonstrate the MSA technology benefits, KSC quantified the relationship between the number of sensors (N) and the associated improvement in sensor life and reliability. A software algorithm was developed to monitor and assess the health of each element and the overall MSA. Furthermore, the software algorithm implemented criteria on how these elements would contribute to the MSA-calculated output to ensure required performance. The hypothesis was that a greater number of statistically independent sensor elements would provide a measurable increase in measurement reliability. A computer simulation was created to answer this question. An array of N sensors underwent random failures in the simulation and a life extension factor (LEF equals the percentage of the life of a single sensor) was calculated by the program. When LEF was plotted as a function of N, a quasi-exponential behavior was detected with marginal improvement above N = 30. The hypothesis and follow-on simulation results were then corroborated experimentally. An array composed of eight independent pressure sensors was fabricated. To accelerate sensor life cycle and failure and to simulate degradation over time, the MSA was exposed to an environmental temperature of 125°C. Every 24 hours, the experiment's environmental temperature was returned to ambient temperature (27°C), and the outputs of all the MSA sensor elements were measured. Once per week, the MSA calibration was verified at five different
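
    The life-extension simulation described in this abstract can be sketched with a toy Monte Carlo model. Everything below is an illustrative assumption rather than a detail from the KSC study: lifetimes are drawn i.i.d. from an exponential distribution, and the array is assumed to keep working while at least one element survives.

```python
import random

def life_extension_factor(n_sensors, trials=2000, seed=0):
    # LEF: mean lifetime of the array, assuming it keeps delivering a
    # measurement while at least one element still works, divided by the
    # mean lifetime of a single sensor. Exponential i.i.d. lifetimes are
    # an illustrative assumption, not a detail taken from the KSC study.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += max(rng.expovariate(1.0) for _ in range(n_sensors))
    return total / trials  # single-sensor mean life is 1.0 by construction

lefs = {n: life_extension_factor(n) for n in (1, 8, 30, 60)}
```

    Plotting LEF against N under these assumptions reproduces the qualitative saturation the study reports: the curve climbs roughly like the harmonic number of N, so the gain from adding sensors beyond a few tens is marginal.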

  15. Perturbation propagation in random and evolved Boolean networks

    NASA Astrophysics Data System (ADS)

    Fretter, Christoph; Szejka, Agnes; Drossel, Barbara

    2009-03-01

    In this paper, we investigate the propagation of perturbations in Boolean networks by evaluating the Derrida plot and its modifications. We show that even small random Boolean networks agree well with the predictions of the annealed approximation, but nonrandom networks show a very different behaviour. We focus on networks that were evolved for high dynamical robustness. The most important conclusion is that the simple distinction between frozen, critical and chaotic networks is no longer useful, since such evolved networks can display the properties of all three types of networks. Furthermore, we evaluate a simplified empirical network and show how its specific state space properties are reflected in the modified Derrida plots.
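
    A Derrida plot of the kind evaluated in this paper can be sketched for a plain random Boolean network. The network size, the connectivity k = 2, and the sample counts below are arbitrary illustrative choices, and the construction gives the random ensemble only, not the evolved networks the authors study.

```python
import random

def random_boolean_network(n, k, seed=0):
    # A plain random Boolean network: each node reads k randomly chosen
    # inputs through a random truth table of 2**k bits.
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]

    def step(state):
        out = []
        for i in range(n):
            idx = 0
            for j in inputs[i]:
                idx = (idx << 1) | state[j]
            out.append(tables[i][idx])
        return tuple(out)

    return step

def derrida_point(step, n, h, samples=500, seed=1):
    # One point of the Derrida plot: the mean Hamming distance after one
    # synchronous update, starting from state pairs that differ in h bits.
    rng = random.Random(seed)
    total = 0
    for _ in range(samples):
        s = tuple(rng.randint(0, 1) for _ in range(n))
        flipped = set(rng.sample(range(n), h))
        t = tuple(b ^ 1 if i in flipped else b for i, b in enumerate(s))
        total += sum(x != y for x, y in zip(step(s), step(t)))
    return total / samples

step = random_boolean_network(n=100, k=2)
d1 = derrida_point(step, 100, 1)    # expected near 1 for critical k = 2
d10 = derrida_point(step, 100, 10)
```

    For k = 2 with unbiased random functions the annealed approximation predicts a slope of one at the origin (the critical case), so the one-bit point should sit near 1 while the curve bends below the diagonal at larger distances; the evolved networks in the paper are precisely those for which this simple picture breaks down.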

  16. Interaction-free evolving states of a bipartite system

    NASA Astrophysics Data System (ADS)

    Napoli, A.; Guccione, M.; Messina, A.; Chruściński, D.

    2014-06-01

    We show that two interacting physical systems may admit entangled pure or nonseparable mixed states evolving in time as if the mutual interaction Hamiltonian were absent. In this paper we define these interaction-free evolving (IFE) states and characterize their existence for a generic binary system described by a time-independent Hamiltonian. A comparison between the IFE subspace and the decoherence-free subspace is reported. The set of all pure IFE states is explicitly constructed for a nonhomogeneous spin-star-system model.

  17. The cartography of pain: the evolving contribution of pain maps.

    PubMed

    Schott, Geoffrey D

    2010-09-01

    Pain maps are nowadays widely used in clinical practice. This article aims to critically review the fundamental principles that underlie the mapping of pain, to analyse the evolving iconography of pain maps and their sometimes straightforward and sometimes contentious nature when used in the clinic, and to draw attention to some more recent developments in mapping pain. It is concluded that these maps are intriguing and evolving cartographic tools which can be used for depicting not only the spatial features but also the interpretative or perceptual components and accompaniments of pain.

  18. Active Printed Materials for Complex Self-Evolving Deformations

    PubMed Central

    Raviv, Dan; Zhao, Wei; McKnelly, Carrie; Papadopoulou, Athina; Kadambi, Achuta; Shi, Boxin; Hirsch, Shai; Dikovsky, Daniel; Zyracki, Michael; Olguin, Carlos; Raskar, Ramesh; Tibbits, Skylar

    2014-01-01

    We propose a new design of complex self-evolving structures that vary over time due to environmental interaction. In conventional 3D printing systems, materials are meant to be stable rather than active and fabricated models are designed and printed as static objects. Here, we introduce a novel approach for simulating and fabricating self-evolving structures that transform into a predetermined shape, changing property and function after fabrication. The new locally coordinated bending primitives combine into a single system, allowing for a global deformation which can stretch, fold and bend given environmental stimulus. PMID:25522053

  19. Oxygen evolving complex in photosystem II: better than excellent.

    PubMed

    Najafpour, Mohammad Mahdi; Govindjee

    2011-09-28

    The Oxygen Evolving Complex in photosystem II, which is responsible for the oxidation of water to oxygen in plants, algae and cyanobacteria, contains a cluster of one calcium and four manganese atoms. This cluster serves as a model for the splitting of water by energy obtained from sunlight. The recently published data on the mechanism and the structure of photosystem II provide a detailed architecture of the oxygen-evolving complex and the surrounding amino acids. Biomimetically, we expect to learn some strategies from this natural system to synthesize an efficient catalyst for water oxidation, which is necessary for artificial photosynthesis.

  20. Active printed materials for complex self-evolving deformations.

    PubMed

    Raviv, Dan; Zhao, Wei; McKnelly, Carrie; Papadopoulou, Athina; Kadambi, Achuta; Shi, Boxin; Hirsch, Shai; Dikovsky, Daniel; Zyracki, Michael; Olguin, Carlos; Raskar, Ramesh; Tibbits, Skylar

    2014-12-18

    We propose a new design of complex self-evolving structures that vary over time due to environmental interaction. In conventional 3D printing systems, materials are meant to be stable rather than active and fabricated models are designed and printed as static objects. Here, we introduce a novel approach for simulating and fabricating self-evolving structures that transform into a predetermined shape, changing property and function after fabrication. The new locally coordinated bending primitives combine into a single system, allowing for a global deformation which can stretch, fold and bend given environmental stimulus.

  1. Seyfert's Sextet (HCG 79): An Evolved Stephan's Quintet?

    NASA Astrophysics Data System (ADS)

    Durbala, A.; Sulentic, J.; Rosado, M.; Del Olmo, A.; Perea, J.; Plana, H.

    Scanning Fabry-Perot interferometers MOS/SIS (3.6m CFHT)+PUMA (2.1m OAN-SPM, México) and the long-slit spectrograph ALFOSC (2.5m NOT, La Palma) were used to measure the kinematics of gas and stars in Seyfert's Sextet (HCG 79). We interpret it as a highly evolved group that formed from sequential acquisition of mostly late-type galaxies that are now slowly coalescing and undergoing strong secular evolution. We find evidence for possible feedback as revealed by accretion and minor merger events in two of the most evolved members.

  2. 78 FR 38311 - Reliability Technical Conference Agenda

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-26

    ... Energy Regulatory Commission Reliability Technical Conference Agenda Reliability Technical Docket No. AD13-6-000 Conference. North American Electric Docket No. RC11-6-004 Reliability Corporation. North American Electric Docket No. RR13-2-000 Reliability Corporation. Not consolidated. As announced in...

  3. 75 FR 71625 - System Restoration Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ... Energy Regulatory Commission 18 CFR Part 40 System Restoration Reliability Standards November 18, 2010... to approve Reliability Standards EOP-001-1 (Emergency Operations Planning), EOP- 005-2 (System... Commission by the North American Electric Reliability Corporation, the Electric Reliability Organization...

  4. Neural Networks, Reliability and Data Analysis

    DTIC Science & Technology

    1993-01-01

    Neural network technology has been surveyed with the intent of determining the feasibility and impact neural networks may have in the area of...automated reliability tools. Data analysis capabilities of neural networks appear to be very applicable to reliability science due to similar mathematical...tendencies in data.... Neural networks, Reliability, Data analysis, Automated reliability tools, Automated intelligent information processing, Statistical neural network.

  5. JMP Applications in Photovoltaic Reliability (Presentation)

    SciTech Connect

    Jordan, D.; Gotwalt, C.

    2011-09-01

    The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted into power and how this performance develops over time. The accurate knowledge of power decline over time, also known as the degradation rate, is essential and important to all stakeholders: utility companies, integrators, investors, and scientists alike. Outdoor testing plays a vital part in quantifying degradation rates of different technologies in various climates. Due to seasonal changes, however, several complete cycles (typically 3-5 years) have traditionally been needed to obtain reasonably accurate degradation rates. In a rapidly evolving industry such a time span is often unacceptable, and the need exists to determine degradation rates more accurately in a shorter period of time. Advanced time series modeling such as ARIMA (Autoregressive Integrated Moving Average) modeling can be utilized to decrease the required time span and is compared with some non-linear modeling. In addition, it will be demonstrated how the JMP 9 map feature was used to reveal important technological trends by climate.
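
    The seasonal confound this abstract describes is easy to reproduce with synthetic data. The sketch below uses simple year-over-year differencing to cancel the annual cycle, which is a much cruder device than the ARIMA models discussed in the paper; the degradation rate, cycle amplitude, and noise level are all made-up illustrative values.

```python
import math
import random

def monthly_performance(months, rate_per_year=-0.008, noise=0.004, seed=0):
    # Synthetic PV performance index: unit baseline, an annual sinusoidal
    # cycle, a linear degradation trend, and Gaussian measurement noise.
    # An illustrative stand-in for outdoor data, not a real dataset.
    rng = random.Random(seed)
    return [1.0 + rate_per_year * m / 12.0
            + 0.05 * math.sin(2 * math.pi * m / 12.0)
            + rng.gauss(0.0, noise)
            for m in range(months)]

def degradation_rate(series):
    # Comparing each month with the same month twelve samples later
    # cancels a deterministic annual cycle, leaving the yearly decline.
    diffs = [b - a for a, b in zip(series, series[12:])]
    return sum(diffs) / len(diffs)

series = monthly_performance(36)  # three years of monthly data
rate = degradation_rate(series)   # close to the injected -0.8%/year
```

    Year-over-year differencing is the simplest way to remove a fixed seasonal cycle; ARIMA-type models generalize it by also capturing autocorrelated noise, which is what lets a stable rate estimate emerge from a shorter record.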

  6. Gearbox Reliability Collaborative Bearing Calibration

    SciTech Connect

    van Dam, J.

    2011-10-01

    NREL has initiated the Gearbox Reliability Collaborative (GRC) to investigate the root cause of the low wind turbine gearbox reliability. The GRC follows a multi-pronged approach based on a collaborative of manufacturers, owners, researchers and consultants. The project combines analysis, field testing, dynamometer testing, condition monitoring, and the development and population of a gearbox failure database. At the core of the project are two 750kW gearboxes that have been redesigned and rebuilt so that they are representative of the multi-megawatt gearbox topology currently used in the industry. These gearboxes are heavily instrumented and are tested in the field and on the dynamometer. This report discusses the bearing calibrations of the gearboxes.

  7. Three approaches to reliability analysis

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.

    1989-01-01

    It is noted that current reliability analysis tools differ not only in their solution techniques, but also in their approach to model abstraction. The analyst must be satisfied with the constraints that are intrinsic to any combination of solution technique and model abstraction. To get a better idea of the nature of these constraints, three reliability analysis tools (HARP, ASSIST/SURE, and CAME) were used to model portions of the Integrated Airframe/Propulsion Control System architecture. When presented with the example problem, all three tools failed to produce correct results. In all cases, either the tool or the model had to be modified. It is suggested that most of the difficulty is rooted in the large model size and long computational times which are characteristic of Markov model solutions.

  8. Kepler Reliability and Occurrence Rates

    NASA Astrophysics Data System (ADS)

    Bryson, Steve

    2016-10-01

    The Kepler mission has produced tables of exoplanet candidates (``KOI table''), as well as tables of transit detections (``TCE table''), hosted at the Exoplanet Archive (http://exoplanetarchive.ipac.caltech.edu). Transit detections in the TCE table that are plausibly due to a transiting object are selected for inclusion in the KOI table. KOI table entries that have not been identified as false positives (FPs) or false alarms (FAs) are classified as planet candidates (PCs, Mullally et al. 2015). A subset of PCs have been confirmed as planetary transits with greater than 99% probability, but most PCs have <99% probability of being true planets. The fraction of PCs that are true transiting planets is the PC reliability rate. The overall PC population is believed to have a reliability rate >90% (Morton & Johnson 2011).

  9. Bioantioxidants: the systems reliability standpoint.

    PubMed

    Koltover, V K

    2009-01-01

    The antioxidant power of the so-called antioxidants is negligible because their rate constants and concentrations are too small to compete with the specialized defense enzymes, like superoxide dismutase (SOD), for the reactive oxygen species. In this short review, we present a number of experimental data of our group, along with the relevant literature data, to show that in-vivo antioxidants increase the systems reliability in other ways. For example, butylated hydroxytoluene can prevent production of O2•− in mitochondria, whereas flavonoids can induce expression of the antioxidant enzymes SOD and catalase. We suggest that the timely introduction of antioxidants can provide beneficial physiological effects through prophylactic reliability maintenance against reactive forms of oxygen via the hormonal system.

  10. Assessment of NDE Reliability Data

    NASA Technical Reports Server (NTRS)

    Yee, B. G. W.; Chang, F. H.; Covchman, J. C.; Lemon, G. H.; Packman, P. F.

    1976-01-01

    Twenty sets of relevant Nondestructive Evaluation (NDE) reliability data have been identified, collected, compiled, and categorized. A criterion for the selection of data for statistical analysis has been formulated. A model to grade the quality and validity of the data sets has been developed. Data input formats, which record the pertinent parameters of the defect/specimen and inspection procedures, have been formulated for each NDE method. A comprehensive computer program has been written to calculate the probability of flaw detection at several confidence levels by the binomial distribution. This program also selects the desired data sets for pooling and tests the statistical pooling criteria before calculating the composite detection reliability. Probability of detection curves at 95 and 50 percent confidence levels have been plotted for individual sets of relevant data as well as for several sets of merged data with common sets of NDE parameters.
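
    The binomial probability-of-detection calculation at a given confidence level can be sketched as a one-sided Clopper-Pearson lower bound. The function below is a generic textbook construction, not the program written for the study, and the bisection tolerance is an arbitrary choice.

```python
from math import comb

def binom_tail(n, s, p):
    # P(X >= s) for X ~ Binomial(n, p): the chance of seeing at least
    # s detections in n inspections when the true POD is p.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(s, n + 1))

def pod_lower_bound(n, s, confidence=0.95):
    # One-sided Clopper-Pearson lower bound on the probability of
    # detection: the POD at which observing >= s hits in n trials has
    # probability exactly 1 - confidence. Found by bisection, since
    # binom_tail is increasing in p.
    if s == 0:
        return 0.0
    lo, hi = 0.0, 1.0
    for _ in range(60):  # 60 halvings of [0, 1] is ample precision
        mid = (lo + hi) / 2
        if binom_tail(n, s, mid) < 1 - confidence:
            lo = mid
        else:
            hi = mid
    return lo

pod_90_95 = pod_lower_bound(29, 29)  # the classic "29 of 29" demonstration
```

    With 29 detections in 29 trials this yields a lower bound of about 0.90 at 95% confidence, the familiar "90/95" POD demonstration point; pooled (merged) data sets simply enter the same formula as a larger n.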

  11. Ensuring reliability in expansion schemes.

    PubMed

    Kamal-Uddin, Abu Sayed; Williams, Donald Leigh

    2005-01-01

    Existing electricity power supplies must serve, or be adapted to serve, the expansion of hospital buildings. With the existing power supply assets of many hospitals being up to 20 years old, assessing the security and reliability of the power system must be given appropriate priority to avoid unplanned outages due to overloads and equipment failures. It is imperative that adequate contingency is planned for essential and non-essential electricity circuits. This article describes the methodology undertaken, and the subsequent recommendations that were made, when evaluating the security and reliability of electricity power supplies to a number of major London hospitals. The methodology described aligns with the latest issue of NHS Estates HTM 2011 'Primary Electrical Infrastructure Emergency Electrical Services Design Guidance' (to which ERA Technology has contributed).

  12. Nonelectronic Parts Reliability Data 1991

    DTIC Science & Technology

    1991-05-01

    design trade-off decisions involving factors such as cost, weight, power, performance, etc. A parts stress type prediction is typically refined to...provide a quantitative means of estimating the relative cost-benefit of these and other system level trade-off considerations. Predictions which are...

  13. Reliability of Naval Radar Systems

    DTIC Science & Technology

    1978-09-01

    the driver uses as much prime power to produce a few watts as the output tube does to produce several kilowatts. Large klystrons, such as the ones... [garbled specification-table fragments: transmitter items; antenna site; antenna gain (dB): 41; polarization: horizontal; power resistors, servo amplifiers, encoder, klystrons] ...which will be developed are the differences in mission, power levels, stress, and costs allowed for the development and assurance of reliability.

  14. Reliability Research for Photovoltaic Modules

    NASA Technical Reports Server (NTRS)

    Ross, Ronald J., Jr.

    1986-01-01

    Report describes research approach used to improve reliability of photovoltaic modules. Aimed at raising useful module lifetime to 20 to 30 years. Development of cost-effective solutions to the module-lifetime problem requires compromises between degradation rates, failure rates, and lifetimes on the one hand, and the costs of initial manufacture, maintenance, and lost energy on the other. Life-cycle costing integrates these disparate economic terms, allowing cost effectiveness to be quantified and different design alternatives to be compared.

  15. Storage Reliability of Reserve Batteries

    DTIC Science & Technology

    2007-11-02

    batteries – Environmental concerns, lack of business – Non-availability of some critical materials • Lithium Oxyhalides are systems of choice – Good...exhibit good corrosion resistance to neutral electrolytes (LiAlCl4 in thionyl chloride and sulfuryl chloride) • Using AlCl3 creates a much more corrosive...Storage Reliability of Reserve Batteries Jeff Swank and Allan Goldberg Army Research Laboratory Adelphi, MD 301-394-3116 jswank@arl.army.mil

  16. Reliability Prediction for Aerospace Electronics

    DTIC Science & Technology

    2015-04-20

    methodology can be extended to include radiation effects, frequency, and even packaging and solder joint effects to give a complete system...assume that there is no failure analysis (FA) of the devices after the HTOL test, or that the manufacturer will not report FA results to the...effects, frequency and even packaging and solder joint effects to give a complete system reliability evaluation framework. This matrix gives a very

  17. Mission reliability model programmers guide

    NASA Astrophysics Data System (ADS)

    Medina, Joseph M.; Simonson, Jonathan H.; Veatch, Michael H.

    1986-12-01

    The Mission Reliability Model (MIREM) has been developed to evaluate the reliability and sustained operating capability of advanced electronic circuits during the early stages of development. MIREM is applicable to integrated systems that achieve fault tolerance through dynamic fault detection, fault isolation, and reconfiguration. The model can also be valuable in evaluating designs that employ only dedicated or hard-wired redundancy. The most distinctive feature of MIREM is its ability to accurately reflect the impact of reconfigurable, competing functions on system reliability. The user defines the resources necessary to support a required function, e.g., Global Positioning System (GPS), and the model will compute the probability of losing that functional capability over a certain operating time. A critical failure occurs when there are not sufficient working resources to support a specified function. As an analytic model, MIREM determines a value for Mean Time Between Critical Failure, Mission Completion Success Probability, and Failure Resiliency. The MIREM Programmer's Guide addresses the model's program structure, function of routines, interdependence of subprograms and common blocks, and file usage. The information needed to port the model to other computer systems is also provided.

  18. LAMPF reliability history and program

    SciTech Connect

    van Dyck, O.

    1994-09-01

    Many years of service of the 800-MeV LAMPF H+/H− linac offers the opportunity to evaluate the long-term reliability characteristics of a high-power machine, which with up to 800-kW beam power available is as close to an ADTT machine as exists in the world today. Records from the last 15 years of operation were analyzed for trends and areas of deteriorating reliability or disproportionate downtime and used to support engineering judgment on facility refurbishment to regain beam availability. This round of analysis has helped define a further level of detail and automation to be implemented in availability recording. Interesting features which emerge from the history include a clear measurement of the lower availability in the first operating cycle following extended maintenance periods, and a consistent picture of the highest availability to be expected in extended operating periods with the facility as used and maintained. The results provide a starting point for informed discussion of reliability goals.

  19. Questioning reliability assessments of health information on social media

    PubMed Central

    Dalmer, Nicole K.

    2017-01-01

    This narrative review examines assessments of the reliability of online health information retrieved through social media to ascertain whether health information accessed or disseminated through social media should be evaluated differently than other online health information. Several medical, library and information science, and interdisciplinary databases were searched using terms relating to social media, reliability, and health information. While social media’s increasing role in health information consumption is recognized, studies are dominated by investigations of traditional (i.e., non-social media) sites. To more richly assess constructions of reliability when using social media for health information, future research must focus on health consumers’ unique contexts, virtual relationships, and degrees of trust within their social networks. PMID:28096748

  20. Reliability testing procedure for MEMS IMUs applied to vibrating environments.

    PubMed

    De Pasquale, Giorgio; Somà, Aurelio

    2010-01-01

    The diffusion of micro electro-mechanical systems (MEMS) technology applied to navigation systems is rapidly increasing, but currently, there is a lack of knowledge about the reliability of this typology of devices, representing a serious limitation to their use in aerospace vehicles and other fields with medium and high requirements. In this paper, a reliability testing procedure for inertial sensors and inertial measurement units (IMU) based on MEMS for applications in vibrating environments is presented. The sensing performances were evaluated in terms of signal accuracy, systematic errors, and accidental errors; the actual working conditions were simulated by means of an accelerated dynamic excitation. A commercial MEMS-based IMU was analyzed to validate the proposed procedure. The main weaknesses of the system have been localized by providing important information about the relationship between the reliability levels of the system and individual components.

  1. Questioning reliability assessments of health information on social media.

    PubMed

    Dalmer, Nicole K

    2017-01-01

    This narrative review examines assessments of the reliability of online health information retrieved through social media to ascertain whether health information accessed or disseminated through social media should be evaluated differently than other online health information. Several medical, library and information science, and interdisciplinary databases were searched using terms relating to social media, reliability, and health information. While social media's increasing role in health information consumption is recognized, studies are dominated by investigations of traditional (i.e., non-social media) sites. To more richly assess constructions of reliability when using social media for health information, future research must focus on health consumers' unique contexts, virtual relationships, and degrees of trust within their social networks.

  2. Methods to improve reliability of video-recorded behavioral data.

    PubMed

    Haidet, Kim Kopenhaver; Tate, Judith; Divirgilio-Thomas, Dana; Kolanowski, Ann; Happ, Mary Beth

    2009-08-01

    Behavioral observation is a fundamental component of nursing practice and a primary source of clinical research data. The use of video technology in behavioral research offers important advantages to nurse scientists in assessing complex behaviors and relationships between behaviors. The appeal of using this method should be balanced, however, by an informed approach to reliability issues. In this article, we focus on factors that influence reliability, such as the use of sensitizing sessions to minimize participant reactivity and the importance of training protocols for video coders. In addition, we discuss data quality, the selection and use of observational tools, calculating reliability coefficients, and coding considerations for special populations based on our collective experiences across three different populations and settings.

  3. Mechanical system reliability for long life space systems

    NASA Technical Reports Server (NTRS)

    Kowal, Michael T.

    1994-01-01

    The creation of a compendium of mechanical limit states was undertaken in order to provide a reference base for the application of first-order reliability methods to mechanical systems in the context of the development of a system level design methodology. The compendium was conceived as a reference source specific to the problem of developing the noted design methodology, and not an exhaustive or exclusive compilation of mechanical limit states. The compendium is not intended to be a handbook of mechanical limit states for general use. The compendium provides a diverse set of limit-state relationships for use in demonstrating the application of probabilistic reliability methods to mechanical systems. The compendium is to be used in the reliability analysis of moderately complex mechanical systems.

  4. Reliability Testing Procedure for MEMS IMUs Applied to Vibrating Environments

    PubMed Central

    De Pasquale, Giorgio; Somà, Aurelio

    2010-01-01

    The diffusion of micro electro-mechanical systems (MEMS) technology applied to navigation systems is rapidly increasing, but currently, there is a lack of knowledge about the reliability of this typology of devices, representing a serious limitation to their use in aerospace vehicles and other fields with medium and high requirements. In this paper, a reliability testing procedure for inertial sensors and inertial measurement units (IMU) based on MEMS for applications in vibrating environments is presented. The sensing performances were evaluated in terms of signal accuracy, systematic errors, and accidental errors; the actual working conditions were simulated by means of an accelerated dynamic excitation. A commercial MEMS-based IMU was analyzed to validate the proposed procedure. The main weaknesses of the system have been localized by providing important information about the relationship between the reliability levels of the system and individual components. PMID:22315550

  5. Research at the Crossroads: How Intellectual Initiatives across Disciplines Evolve

    ERIC Educational Resources Information Center

    Frost, Susan H.; Jean, Paul M.; Teodorescu, Daniel; Brown, Amy B.

    2004-01-01

    How do intellectual initiatives across disciplines evolve? This qualitative case study of 11 interdisciplinary research initiatives at Emory University identifies key factors in their development: the passionate commitments of scholarly leaders, the presence of strong collegial networks, access to timely and multiple resources, flexible practices,…

  6. Fast Algorithms for Mining Co-evolving Time Series

    DTIC Science & Technology

    2011-09-01

    …resolution methods: Fourier and wavelets… Time series forecasting… categorical data. Our work is based on two key properties of those co-evolving time series: dynamics and correlation. Dynamics captures the temporal… applications. A survey on time series methods: there is a lot of work on time series analysis, on indexing, dimensionality reduction, and forecasting.

  7. Evolving fuzzy rules for relaxed-criteria negotiation.

    PubMed

    Sim, Kwang Mong

    2008-12-01

    In the literature on automated negotiation, very few negotiation agents are designed with the flexibility to slightly relax their negotiation criteria to reach a consensus more rapidly and with more certainty. Furthermore, these relaxed-criteria negotiation agents were not equipped with the ability to enhance their performance by learning and evolving their relaxed-criteria negotiation rules. The impetus of this work is designing market-driven negotiation agents (MDAs) that not only have the flexibility of relaxing bargaining criteria using fuzzy rules, but can also evolve their structures by learning new relaxed-criteria fuzzy rules to improve their negotiation outcomes as they participate in negotiations in more e-markets. To this end, an evolutionary algorithm for adapting and evolving relaxed-criteria fuzzy rules was developed. Implementing the idea in a testbed, two kinds of experiments for evaluating and comparing EvEMDAs (MDAs with relaxed-criteria rules that are evolved using the evolutionary algorithm) and EMDAs (MDAs with relaxed-criteria rules that are manually constructed) were carried out through stochastic simulations. Empirical results show that: 1) EvEMDAs generally outperformed EMDAs in different types of e-markets and 2) the negotiation outcomes of EvEMDAs generally improved as they negotiated in more e-markets.
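    A minimal sketch of the kind of evolutionary loop such an algorithm could use; the names and parameters below are illustrative, not the paper's EvEMDA design. Each individual is a vector of relaxation thresholds in [0, 1], mutated and selected elitistically against a stand-in fitness for negotiation outcome:

```python
import random

def evolve(fitness, n_params=3, pop_size=20, gens=40, seed=1):
    """Minimal (mu+lambda)-style evolutionary loop: mutate every
    individual with Gaussian noise, then keep the best pop_size."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(gens):
        children = [[min(1.0, max(0.0, g + rng.gauss(0, 0.1))) for g in p]
                    for p in pop]
        pop = sorted(pop + children, key=fitness, reverse=True)[:pop_size]
    return pop[0]

# toy fitness: prefer thresholds near 0.5 (a stand-in for negotiation payoff)
best = evolve(lambda p: -sum((g - 0.5) ** 2 for g in p))
```

    In the paper's setting the fitness would instead be measured from simulated negotiation outcomes across e-markets.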

  8. The Evolving Significance of Race: Living, Learning, and Teaching

    ERIC Educational Resources Information Center

    Hughes, Sherick A., Ed.; Berry, Theodorea Regina, Ed.

    2012-01-01

    Individuals are living, learning, and teaching by questioning how to address race in a society that consistently prefers to see itself as colorblind, a society claiming to seek a "post-racial" existence. This edited volume offers evidence of the evolving significance of race from a diverse group of male and female contributors…

  9. Hip Hop Is Now: An Evolving Youth Culture

    ERIC Educational Resources Information Center

    Taylor, Carl; Taylor, Virgil

    2007-01-01

    Emerging from Rap music, Hip Hop has become a lifestyle to many modern youth around the world. Embodying both creativity and controversy, Hip Hop mirrors the values, violence, and hypocrisy of modern culture. The authors dispel some of the simplistic views that surround this evolving youth movement embraced by millions of young people who are…

  10. Tensions inherent in the evolving role of the infection preventionist

    PubMed Central

    Conway, Laurie J.; Raveis, Victoria H.; Pogorzelska-Maziarz, Monika; Uchida, May; Stone, Patricia W.; Larson, Elaine L.

    2014-01-01

    Background The role of infection preventionists (IPs) is expanding in response to demands for quality and transparency in health care. Practice analyses and survey research have demonstrated that IPs spend a majority of their time on surveillance and are increasingly responsible for prevention activities and management; however, deeper qualitative aspects of the IP role have rarely been explored. Methods We conducted a qualitative content analysis of in-depth interviews with 19 IPs at hospitals throughout the United States to describe the current IP role, specifically the ways that IPs effect improvements and the facilitators and barriers they face. Results The narratives document that the IP role is evolving in response to recent changes in the health care landscape and reveal that this progression is associated with friction and uncertainty. Tensions inherent in the evolving role of the IP emerged from the interviews as 4 broad themes: (1) expanding responsibilities outstrip resources, (2) shifting role boundaries create uncertainty, (3) evolving mechanisms of influence involve trade-offs, and (4) the stress of constant change is compounded by chronic recurring challenges. Conclusion Advances in implementation science, data standardization, and training in leadership skills are needed to support IPs in their evolving role. PMID:23880116

  11. Today's control systems evolved from early pioneers' dreams

    SciTech Connect

    Smith, D.J.

    1996-04-01

    In the last 100 years, power plant controls have evolved from manual operation and simple instruments to automatic state-of-the-art computerized control systems using smart instruments. This article traces the evolution of controls. The topics of the article include early control systems, developments in the early 20th century, Bailey controls, and developments in the late 20th century.

  12. Evolving Strategies for Cancer and Autoimmunity: Back to the Future

    PubMed Central

    Lane, Peter J. L.; McConnell, Fiona M.; Anderson, Graham; Nawaf, Maher G.; Gaspal, Fabrina M.; Withers, David R.

    2014-01-01

    Although current thinking has focused on genetic variation between individuals and environmental influences as underpinning susceptibility to both autoimmunity and cancer, an alternative view is that human susceptibility to these diseases is a consequence of the way the immune system evolved. It is important to remember that the immunological genes that we inherit and the systems that they control were shaped by the drive for reproductive success rather than for individual survival. It is our view that human susceptibility to autoimmunity and cancer is the evolutionarily acceptable side effect of the immune adaptations that evolved in early placental mammals to accommodate a fundamental change in reproductive strategy. Studies of immune function in mammals show that high affinity antibodies and CD4 memory, along with its regulation, co-evolved with placentation. By dissection of the immunologically active genes and proteins that evolved to regulate this step change in the mammalian immune system, clues have emerged that may reveal ways of de-tuning both effector and regulatory arms of the immune system to abrogate autoimmune responses whilst preserving protection against infection. Paradoxically, it appears that such a detuned and deregulated immune system is much better equipped to mount anti-tumor immune responses against cancers. PMID:24782861

  13. Coevolution Drives the Emergence of Complex Traits and Promotes Evolvability

    PubMed Central

    Zaman, Luis; Meyer, Justin R.; Devangam, Suhas; Bryson, David M.; Lenski, Richard E.; Ofria, Charles

    2014-01-01

    The evolution of complex organismal traits is obvious as a historical fact, but the underlying causes—including the role of natural selection—are contested. Gould argued that a random walk from a necessarily simple beginning would produce the appearance of increasing complexity over time. Others contend that selection, including coevolutionary arms races, can systematically push organisms toward more complex traits. Methodological challenges have largely precluded experimental tests of these hypotheses. Using the Avida platform for digital evolution, we show that coevolution of hosts and parasites greatly increases organismal complexity relative to that otherwise achieved. As parasites evolve to counter the rise of resistant hosts, parasite populations retain a genetic record of past coevolutionary states. As a consequence, hosts differentially escape by performing progressively more complex functions. We show that coevolution's unique feedback between host and parasite frequencies is a key process in the evolution of complexity. Strikingly, the hosts evolve genomes that are also more phenotypically evolvable, similar to the phenomenon of contingency loci observed in bacterial pathogens. Because coevolution is ubiquitous in nature, our results support a general model whereby antagonistic interactions and natural selection together favor both increased complexity and evolvability. PMID:25514332

  14. A Conceptual Framework for Evolving, Recommender Online Learning Systems

    ERIC Educational Resources Information Center

    Peiris, K. Dharini Amitha; Gallupe, R. Brent

    2012-01-01

    A comprehensive conceptual framework is developed and described for evolving recommender-driven online learning systems (ROLS). This framework describes how such systems can support students, course authors, course instructors, systems administrators, and policy makers in developing and using these ROLS. The design science information systems…

  15. The Evolving Status of Photojournalism Education. ERIC Digest.

    ERIC Educational Resources Information Center

    Cookman, Claude

    Noting that new technologies are resulting in extensive changes in the field of photojournalism, both as it is practiced and taught, this Digest reviews this rapidly evolving field of education and professional practice. It discusses what digital photography is; the history of digital photography; how digital photography has changed…

  16. [Cardiac computed tomography: new applications of an evolving technique].

    PubMed

    Martín, María; Corros, Cecilia; Calvo, Juan; Mesa, Alicia; García-Campos, Ana; Rodríguez, María Luisa; Barreiro, Manuel; Rozado, José; Colunga, Santiago; de la Hera, Jesús M; Morís, César; Luyando, Luis H

    2015-01-01

    During recent years we have witnessed an increasing development of imaging techniques applied in cardiology. Among them, cardiac computed tomography is an emerging and evolving technique. With the current possibility of very-low-radiation studies, its applications have expanded beyond coronary angiography. In the present article we review the technical developments of cardiac computed tomography and its new applications.

  17. Multivariate Epi-splines and Evolving Function Identification Problems

    DTIC Science & Technology

    2015-04-15

    Johannes O. Royset, Roger J-B Wets (Operations Research Department)… fitting, and estimation. The paper develops piecewise polynomial functions, called epi-splines, that approximate any lsc (lower semicontinuous) function to an arbitrary level of accuracy. Epi-splines provide the foundation for the solution of a rich class of function identification problems that incorporate general…

  18. Optimists' Creed: Brave New Cyberlearning, Evolving Utopias (Circa 2041)

    ERIC Educational Resources Information Center

    Burleson, Winslow; Lewis, Armanda

    2016-01-01

    This essay imagines the role that artificial intelligence innovations play in the integrated living, learning and research environments of 2041. Here, in 2041, in the context of increasingly complex wicked challenges, whose solutions by their very nature continue to evade even the most capable experts, society and technology have co-evolved to…

  19. Towards Evolving Electronic Circuits for Autonomous Space Applications

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Haith, Gary L.; Colombano, Silvano P.; Stassinopoulos, Dimitris

    2000-01-01

    The relatively new field of Evolvable Hardware studies how simulated evolution can reconfigure, adapt, and design hardware structures in an automated manner. Space applications, especially those requiring autonomy, are potential beneficiaries of evolvable hardware. For example, robotic drilling from a mobile platform requires high-bandwidth controller circuits that are difficult to design. In this paper, we present automated design techniques based on evolutionary search that could potentially be used in such applications. First, we present a method of automatically generating analog circuit designs using evolutionary search and a circuit construction language. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. Using a parallel genetic algorithm, we present experimental results for five design tasks. Second, we investigate the use of coevolution in automated circuit design. We examine fitness evaluation by comparing the effectiveness of four fitness schedules. The results indicate that solution quality is highest with static and co-evolving fitness schedules as compared to the other two dynamic schedules. We discuss these results and offer two possible explanations for the observed behavior: retention of useful information, and alignment of problem difficulty with circuit proficiency.

  20. The Evolving Military Learner Population: A Review of the Literature

    ERIC Educational Resources Information Center

    Ford, Kate; Vignare, Karen

    2015-01-01

    This literature review examines the evolving online military learner population with emphasis on current generation military learners, who are most frequently Post-9/11 veterans. The review synthesizes recent scholarly and grey literature on military learner demographics and attributes, college experiences, and academic outcomes against a backdrop…

  1. Programmed life span in the context of evolvability.

    PubMed

    Mitteldorf, Joshua; Martins, André C R

    2014-09-01

    Population turnover is necessary for progressive evolution. In the context of a niche with fixed carrying capacity, aging contributes to the rate of population turnover. Theoretically, a population in which death is programmed on a fixed schedule can evolve more rapidly than one in which population turnover is left to a random death rate. Could aging evolve on this basis? Quantitative realization of this idea is problematic, since the short-term individual fitness cost is likely to eliminate any hypothetical gene for programmed death before the long-term benefit can be realized. In 2011, one of us proposed the first quantitative model based on this mechanism that robustly evolves a finite, programmed life span. That model was based on a viscous population in a rapidly changing environment. Here, we strip this model to its essence and eliminate the assumption of environmental change. We conclude that there is no obvious way in which this model is unrealistic, and that it may indeed capture an important principle of nature's workings. We suggest aging may be understood within the context of the emerging science of evolvability.

  2. The Evolving Role of the Head of Department

    ERIC Educational Resources Information Center

    Kerry, Trevor

    2005-01-01

    This paper examines three concepts relating to the role of heads of department (HoDs) in secondary schools: boundary management; the roles of subject leadership and departmental functioning as HoD activities; and the place of HoDs in evolving school hierarchies. To throw light on the last an empirical study is reported that explores hierarchies in…

  3. A View from Above: The Evolving Sociological Landscape

    ERIC Educational Resources Information Center

    Moody, James; Light, Ryan

    2006-01-01

    How has sociology evolved over the last 40 years? In this paper, we examine networks built on thousands of sociology-relevant papers to map sociology's position in the wider social sciences and identify changes in the most prominent research fronts in the discipline. We find first that sociology seems to have traded centrality in the field of…

  4. Just My Imagination: Beauty premium and the evolved mental model.

    PubMed

    Oda, Ryo

    2017-01-01

    Imagination, an important feature of the human mind, may be at the root of the beauty premium. The evolved human capacity for simulating the real world, developed as an adaptation to a complex social environment, may offer the key to understanding this and many other aspects of human behavior.

  5. Do Infants Possess an Evolved Spider-Detection Mechanism?

    ERIC Educational Resources Information Center

    Rakison, David H.; Derringer, Jaime

    2008-01-01

    Previous studies with various non-human animals have revealed that they possess an evolved predator recognition mechanism that specifies the appearance of recurring threats. We used the preferential looking and habituation paradigms in three experiments to investigate whether 5-month-old human infants have a perceptual template for spiders that…

  6. Evolving Nature of Sexual Orientation and Gender Identity

    ERIC Educational Resources Information Center

    Jourian, T. J.

    2015-01-01

    This chapter discusses the historical and evolving terminology, constructs, and ideologies that inform the language used by those who are lesbian, gay, bisexual, and same-gender loving, who may identify as queer, as well as those who are members of trans* communities from multiple and intersectional perspectives.

  7. Strategic Planning for Policy Development--An Evolving Model.

    ERIC Educational Resources Information Center

    Verstegen, Deborah A.; Wagoner, Jennings L., Jr.

    1989-01-01

    Strategic planning, a necessary alternative to logical incrementalism in turbulent environments, will let educators move from a reactive to a proactive posture. This article briefly reviews strategic planning literature, focuses on environmental scanning, and describes an evolving model developed for the chief state school officers of a four-state…

  8. HIF2 and endocrine neoplasia: an evolving story.

    PubMed

    Maher, Eamonn R

    2013-06-01

    In this issue of Endocrine-Related Cancer, Toledo et al. report the identification of activating mutations in the HIF2 (EPAS1) transcription factor in a subset of sporadic pheochromocytomas and paragangliomas. These findings add significantly to an evolving and complex story of the role of hypoxic gene response pathways in human endocrine neoplasia.

  9. Developing Collective Learning Extension for Rapidly Evolving Information System Courses

    ERIC Educational Resources Information Center

    Agarwal, Nitin; Ahmed, Faysal

    2017-01-01

    Due to rapidly evolving Information System (IS) technologies, instructors find themselves stuck in a constant game of catching up. At the same time, students find their skills obsolete almost as soon as they graduate. As part of IS curriculum and education, we need to put more emphasis on teaching students "how to learn" while keeping…

  10. An Evolved Wavelet Library Based on Genetic Algorithm

    PubMed Central

    Vaithiyanathan, D.; Seshasayanan, R.; Kunaraj, K.; Keerthiga, J.

    2014-01-01

    As the size of the images being captured increases, there is a need for a robust image compression algorithm that satisfies the bandwidth limitation of the transmission channels and preserves the image resolution without considerable loss in image quality. Many conventional image compression algorithms use the wavelet transform, which can significantly reduce the number of bits needed to represent a pixel; the process of quantization and thresholding further increases the compression. In this paper the authors evolve two sets of wavelet filter coefficients using a genetic algorithm (GA), one for the whole image except the edge areas and the other for the portions near the edges (i.e., global and local filters). Images are initially separated into several groups based on their frequency content, edges, and textures, and the wavelet filter coefficients are evolved separately for each group. As there is a possibility of the GA settling in a local maximum, the authors introduce a new shuffling operator to prevent this effect. The GA used to evolve filter coefficients primarily focuses on maximizing the peak signal-to-noise ratio (PSNR). The filter coefficients evolved by the proposed method outperform the existing methods by a 0.31 dB improvement in the average PSNR and a 0.39 dB improvement in the maximum PSNR. PMID:25405225
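    Since the GA's fitness is driven by PSNR, here is a hedged sketch of the metric itself; the function and pixel values are illustrative, not the paper's implementation:

```python
from math import log10

def psnr(original, compressed, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between two equal-length
    pixel sequences: 10*log10(max_val^2 / MSE)."""
    mse = sum((o - c) ** 2 for o, c in zip(original, compressed)) / len(original)
    if mse == 0:
        return float("inf")  # identical images: infinite PSNR
    return 10.0 * log10(max_val ** 2 / mse)

# a 1-off error on every pixel gives MSE = 1, so PSNR = 20*log10(255)
value = psnr([10, 20, 30, 40], [11, 21, 31, 41])
```

    The reported 0.31 dB average gain is measured on this logarithmic scale, so small dB differences correspond to noticeable MSE reductions.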

  11. Rapidly evolving hypopituitarism in a boy with multiple autoimmune disorders.

    PubMed

    Jevalikar, Ganesh; Wong, Sze Choong; Zacharin, Margaret

    2013-09-01

    A 10-year-old boy with acute onset cranial diabetes insipidus and multiple autoimmune disorders had evolving panhypopituitarism, thought to be due to autoimmune hypophysitis. Over 18 months, a dramatic clinical course with progressive hypopituitarism and development of type 1 diabetes mellitus was evident. Serial brain imaging showed changes suggestive of germinoma.

  12. Does evolving the future preclude learning from it?

    PubMed

    Dowrick, Peter W

    2014-08-01

    Despite its considerable length, this article proposes a theory of human behavioral science that eschews half the evidence. There is irony in the title "Evolving the Future" when the featured examples of intentional change represent procedures that build slowly on the past. Has an opportunity been missed, or is an evolutionary perspective simply incompatible with learning from the future?

  13. Evolving Approaches to Educating Children from Nomadic Communities

    ERIC Educational Resources Information Center

    Dyer, Caroline

    2016-01-01

    Evolving policies have increasingly aimed to include nomadic groups in EFA, but an overemphasis on mobility has distracted policy makers from going beyond access logistics to consider learning needs within nomads' contemporary livelihoods and cultural values. Notable global trends are the growth and institutionalization of forms of Alternative…

  14. Evolvable mathematical models: A new artificial Intelligence paradigm

    NASA Astrophysics Data System (ADS)

    Grouchy, Paul

    We develop a novel Artificial Intelligence paradigm to autonomously generate artificial agents as mathematical models of behaviour. Agent/environment inputs are mapped to agent outputs via equation trees which are evolved in a manner similar to Symbolic Regression in Genetic Programming. Equations are composed of only the four basic mathematical operators (addition, subtraction, multiplication, and division), as well as input and output variables and constants. From these operations, equations can be constructed that approximate any analytic function. These Evolvable Mathematical Models (EMMs) are tested and compared to their Artificial Neural Network (ANN) counterparts on two benchmarking tasks: the double-pole balancing without velocity information benchmark and the challenging discrete Double-T Maze experiments with homing. The results from these experiments show that EMMs are capable of solving tasks typically solved by ANNs, and that they have the ability to produce agents that demonstrate learning behaviours. To further explore the capabilities of EMMs, as well as to investigate the evolutionary origins of communication, we develop NoiseWorld, an Artificial Life simulation in which interagent communication emerges and evolves from initially noncommunicating EMM-based agents. Agents develop the capability to transmit their x and y position information over a one-dimensional channel via a complex, dialogue-based communication scheme. These evolved communication schemes are analyzed and their evolutionary trajectories examined, yielding significant insight into the emergence and subsequent evolution of cooperative communication. Evolved agents from NoiseWorld are successfully transferred onto physical robots, demonstrating the transferability of EMM-based AIs from simulation into physical reality.

  15. Reliability Estimation and Failure Analysis of Multilayer Ceramic Chip Capacitors

    NASA Astrophysics Data System (ADS)

    Yang, Seok Jun; Kim, Jin Woo; Ryu, Dong Su; Kim, Myung Soo; Jang, Joong Soon

    This paper presents the failure analysis and reliability estimation of a multilayer ceramic chip capacitor (MLCC). For failed samples from an automobile engine control unit, failure analysis was performed to identify the root cause of failure, showing that migration and avalanche breakdown were the dominant failure mechanisms. Next, an accelerated life test was designed to estimate the life of the MLCC, assuming a Weibull lifetime distribution and the life-stress relationship proposed by Prokopowicz and Vaskas. The life-stress relationship and the acceleration factor are estimated by analyzing the accelerated life test data.
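    A sketch of the Prokopowicz-Vaskas life-stress relationship as it is commonly written for MLCC accelerated life tests: a voltage power law multiplied by an Arrhenius temperature term. The exponent n and activation energy Ea below are illustrative placeholders, not the paper's fitted values:

```python
from math import exp

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def pv_acceleration_factor(v_use, v_test, t_use_k, t_test_k, n=3.0, ea=1.1):
    """Prokopowicz-Vaskas acceleration factor between use and test
    conditions: (V_test/V_use)^n * exp((Ea/k) * (1/T_use - 1/T_test)).
    n and ea are illustrative; real values are fitted from test data."""
    voltage_term = (v_test / v_use) ** n
    temp_term = exp((ea / BOLTZMANN_EV) * (1.0 / t_use_k - 1.0 / t_test_k))
    return voltage_term * temp_term

# doubling voltage and raising 45 C use to 125 C test (temperatures in kelvin)
af = pv_acceleration_factor(v_use=5.0, v_test=10.0, t_use_k=318.0, t_test_k=398.0)
```

    The acceleration factor converts test hours to equivalent field hours, which is how a short voltage/temperature-stressed test can bound a multi-year field life.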

  16. Developing Architectures and Technologies for an Evolvable NASA Space Communication Infrastructure

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Hayden, Jeffrey

    2004-01-01

    Space communications architecture concepts play a key role in the development and deployment of NASA's future exploration and science missions. Once a mission is deployed, the communication link to the user needs to provide maximum information delivery and flexibility to handle the expected large and complex data sets and to enable direct interaction with the spacecraft and experiments. In human and robotic missions, communication systems need to offer maximum reliability with robust two-way links for software uploads and virtual interactions. Identifying the capabilities to cost effectively meet the demanding space communication needs of 21st century missions, proper formulation of the requirements for these missions, and identifying the early technology developments that will be needed can only be resolved with architecture design. This paper will describe the development of evolvable space communication architecture models and the technologies needed to support Earth sensor web and collaborative observation formation missions; robotic scientific missions for detailed investigation of planets, moons, and small bodies in the solar system; human missions for exploration of the Moon, Mars, Ganymede, Callisto, and asteroids; human settlements in space, on the Moon, and on Mars; and great in-space observatories for observing other star systems and the universe. The resulting architectures will enable the reliable, multipoint, high data rate capabilities needed on demand to provide continuous, maximum coverage of areas of concentrated activities, such as in the vicinity of outposts in-space, on the Moon or on Mars.

  17. Applicability and Limitations of Reliability Allocation Methods

    NASA Technical Reports Server (NTRS)

    Cruz, Jose A.

    2016-01-01

    The reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system to attain the specified system reliability. For large systems, allocation is often performed at different stages of system design, usually beginning at the conceptual stage. As the system design develops and more information about components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers; applying them without understanding those limitations and assumptions can produce unrealistic results. This report addresses weighting-factor and optimal reliability allocation techniques, and identifies the applicability and limitations of each.
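    A hedged sketch of one weighting-factor method, an ARINC-style apportionment for a series system; the failure rates below are illustrative, not drawn from the report:

```python
def arinc_allocate(predicted_rates, system_rate_req):
    """ARINC-style weighting-factor allocation for a series system:
    apportion the required system failure rate among subsystems in
    proportion to their predicted failure rates."""
    total = sum(predicted_rates)
    weights = [r / total for r in predicted_rates]   # weights sum to 1
    return [w * system_rate_req for w in weights]

# predicted subsystem rates (failures/hr) and a required system rate of 1e-4
alloc = arinc_allocate([4e-5, 8e-5, 8e-5], 1e-4)
# allocations sum to the system requirement and preserve the 1:2:2 ratio
```

    This is the kind of simple apportionment that works early in design; the report's caution applies here too, since the method silently assumes a series structure and exponential (constant-rate) failures.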

  18. 76 FR 58101 - Electric Reliability Organization Interpretation of Transmission Operations Reliability Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-20

    …of Reliability Standard TOP-001-1, Requirement R8, which pertains to the restoration of real and… Requirement R8 in Commission-approved NERC Reliability Standard TOP-001-1, Reliability Responsibilities and… 107 Reliability Standards filed by NERC, including Reliability Standard TOP-001-1.

  19. Decision theory in structural reliability

    NASA Technical Reports Server (NTRS)

    Thomas, J. M.; Hanagud, S.; Hawk, J. D.

    1975-01-01

    Some fundamentals of reliability analysis as applicable to aerospace structures are reviewed, and the concept of a test option is introduced. A decision methodology, based on statistical decision theory, is developed for determining the most cost-effective design factor and method of testing for a given structural assembly. The method is applied to several Saturn V and Space Shuttle structural assemblies as examples. It is observed that the cost and weight features of the design have a significant effect on the optimum decision.
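    The decision methodology weighs the cost of a test option against the expected cost of structural failure. A toy sketch of that comparison, under assumed illustrative numbers rather than the Saturn V or Shuttle data:

```python
def best_option(options):
    """Pick the design/test option with minimum expected cost, where
    expected cost = upfront cost + P(failure) * cost of failure.
    Option tuples are illustrative: (name, upfront_cost, p_fail, fail_cost)."""
    def expected_cost(opt):
        _, upfront, p_fail, fail_cost = opt
        return upfront + p_fail * fail_cost
    return min(options, key=expected_cost)

# hypothetical options: skipping the proof test is cheaper up front but
# leaves a higher failure probability against a large failure cost
options = [
    ("no test, design factor 1.4", 0.0, 1e-3, 5e7),
    ("proof test, design factor 1.2", 2e4, 1e-4, 5e7),
]
choice = best_option(options)
```

    With these numbers the proof test wins because its cost is small relative to the reduction in expected failure cost; flip the failure cost down an order of magnitude and the untested design would win instead, which is the trade-off the paper's decision methodology formalizes.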

  20. A highly reliable LAN protocol

    NASA Astrophysics Data System (ADS)

    Weaver, A. C.

    1986-10-01

    As a research project for NASA's Langley Research Center, a variation on the military standard for avionics buses was developed to increase fault tolerance. The resulting protocol, called implicit token passing (ITP), replaces an explicit token with brief 'soundoff' messages from all nodes participating on the LAN. ITP features high throughput and bounded message delay, and achieves high reliability through tolerance of failed nodes and automatic resynchronization when failed nodes are revived. The protocol is ideally suited for a bus topology and fiber optic media.